Rutherford Weekly

March 16, 2023

Rutherford Weekly - Shelby NC


Are robots coming to steal your time, money and maybe even your heart? Recent technology advances have created new AI-related security threats, and while they don't look like what you've seen in the movies, you might be at risk.

Artificial intelligence has taken mind-boggling leaps forward in the past few months, with tools like ChatGPT making headlines around the globe. AI can create useful content, but it can also generate danger in a whole new way. We're not talking about robot armies marching in lockstep to annihilate mankind. Technology isn't out to get us, but some humans are, and ChatGPT is a shiny new tool in some criminals' arsenals. In this article, we'll explore how scammers and cybercriminals use ChatGPT to do harm, then provide tips on protecting yourself.

What is ChatGPT?

ChatGPT is a conversational AI model developed by OpenAI that can mimic human language and generate coherent, natural responses to text-based input. It's designed to learn from massive amounts of online data, soaking up what's available on the web and then recycling information and phrasing to respond to questions. So far, it's been used for various legitimate purposes, like chatbots and language translation. However, scammers and cybercriminals have also taken advantage of AI's capabilities for malicious purposes.

Potential risks to look out for

Scammers use ChatGPT to make their malware threats, phishing attempts, and fake profiles more convincing and interactive.
By generating well-written copy and fast, believable responses to victims' messages, scammers using ChatGPT can create the illusion of a real person on the other end of the conversation. This can make it harder for victims to realize they're being scammed. Beware of these potential scams when using artificial intelligence.

• Phishing scams. Scammers use ChatGPT to create believable phishing messages that mimic legitimate organizations like banks, social media companies, or government agencies. The scammer sends an unsolicited message to the victim via email or messaging app, then prompts them to click a link or provide personal information. The message can appear legitimate since ChatGPT can generate compelling text modeled after content from the actual bank, government agency, or other organization. However, clicking the link or providing personal information can lead to identity theft or financial loss.

• Impersonation scams. Scammers can use AI-generated content to impersonate people like a boss, co-worker, or family member and convince the victim to provide sensitive information or transfer money. The scammer generates an articulate, free-flowing conversation that appears to come from the person being impersonated, making it tough for the victim to detect the fraud.

• Malware and viruses. Cybercriminals can use ChatGPT to spread malware and viruses. AI allows them to produce a conversation that appears to come from a legitimate source, like a friend or colleague, then prompts the victim to click on a link or download a file. Once the victim does, their device can become infected with malware or viruses.

• ChatGPT romance scams. Are you sure it's a human causing your heart to flutter during online interactions? Romance scams are a type of fraud in which scammers create fake profiles on dating websites or social media platforms to establish romantic relationships with their victims.
They then use this relationship to gain the victims' trust and eventually convince them to send money or provide personal information.

• Purchasing scams. Bad actors are using ChatGPT to trick people into buying fake goods. They create a conversation that appears to come from a legitimate seller, then convince the victim to purchase products or services, usually through a digital funds transfer. However, the goods are either fake or do not exist, so the money the victim pays can be a permanent loss.

How to protect yourself from cybercriminals using ChatGPT

• Be cautious of unsolicited messages. Use extreme caution if you receive a message from someone you don't know. Don't click on included links or provide requested personal information. If the message appears to come from a legitimate organization, such as a bank or government agency, contact that organization directly to confirm the message's authenticity.

• Verify the identity of the person you're chatting with. If you're chatting with someone online, especially someone asking for personal information or money, verify their identity. Ask for contact information, like a business email address or personal phone number, and confirm it's legitimate. If possible, communicate through a secure, verified messaging platform rather than an unsecured one.

• Scrutinize text. AI-written text often uses the same words repeatedly. It also tends toward short sentences with unimaginative language and no idioms or contractions, and content may include implausible statements. If something seems off, trust your gut.

• Use two-factor authentication for your online accounts. Two-factor authentication adds an extra layer of security by requiring an additional code to log in, usually sent to your phone. This makes it more difficult for scammers to access your accounts even if they manage to steal your password.

• Use a password manager to generate and store strong passwords.
Don't use your dog's name and your birth date, and avoid using the same password for everything. We know that gets complicated, but password managers help: a password manager generates and stores strong, unique passwords for your online accounts, making it more difficult for scammers to gain access.

• Be cautious when downloading files or clicking on links. Use care, especially if files or links come from an unknown source. Scammers can use ChatGPT to generate convincing messages that appear to come from a legitimate source, such as a friend or colleague. Verify the source before clicking any links or downloading any files.

• Use caution when talking to strangers. To avoid falling victim to a scam, be cautious when engaging with people online, especially those who seem too good to be true. Be wary of anyone asking for money or personal information, especially if you've never met them. Watch out for anyone who refuses to video chat or meet in person; this could be a sign that they are not who they claim to be. If you suspect you may be a victim, stop communicating with the scammer immediately and report the incident to the authorities.

• Educate yourself on the latest scams and fraud tactics. Learn to recognize scammers' schemes so you can protect yourself. Look up the latest scams and read reports on how others have been targeted using BBB Scam Tracker.
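For readers curious about what a password manager actually does when it "generates a strong password," here is a minimal sketch using only Python's standard-library secrets module. The function name generate_password is our own illustration, not the API of any particular password manager:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits, and punctuation.

    secrets uses a cryptographically secure random source, unlike the
    guessable choices (pet names, birth dates) scammers count on.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a distinct password per account, so one stolen password
# cannot unlock every account. The output differs on every run.
print(generate_password())
```

The point of the sketch is the design choice: passwords are drawn from a large random alphabet rather than from personal facts, and a fresh one is made for each account, which is exactly the chore a password manager automates for you.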
BBB Tip: Here's how to protect yourself using ChatGPT

Article Provided By: Juliana O'Rork
