You're in the middle of work when you suddenly get an urgent call from your mom. She claims she's in danger and needs you to send her money. Everything about her voice, from speech patterns to inflection and cadence, sounds exactly like her, and you have no doubt it's really her on the line. After all, there's no way any AI or impersonator could clone someone's voice that perfectly. Right?
If this has already happened to you, or you've heard about it happening to others, you're not alone. As technology rapidly advances, cybercriminals are getting more creative with how they use AI tools to win victims' trust. Unfortunately, these scams will only get more eerie and personal from here on out. Here's how to spot an AI voice scam and keep yourself and your loved ones safe.
Is AI Cloning Our Voices?
AI tools have taken the world by storm. Almost everywhere you go and every website you visit, there's an AI attendant or chatbot ready to serve you. For some people, AI models like ChatGPT have even replaced therapists, friends, and family. Teens, in particular, have found these robot companions helpful and supportive, especially in times of need.
But people don't always stick to typed messages; many turn to voice chats, too. And while it's hard to say how safe your personal information is when you use AI models this way, it's not impossible for a sample of your voice to be recorded. Worse yet, that's not the only way scammers can get one. All it takes for AI to clone your voice is a few seconds of speech; from that sample, a model can mimic your tone, patterns, and cadence with near-perfect accuracy. If you regularly post on social media platforms such as TikTok and Instagram, there's little to stop anyone from lifting a clip of your voice and copying it. Scammers can also use "wrong number" calls to their advantage, recording the few words you say before you hang up.
You might think you can immediately spot the difference between an AI and a real person, but think twice—it's not as easy as you assume. Take, for example, the two audio samples from McAfee. Play both recordings, make your guess, then click on the silhouettes. You may be surprised to find which one is the AI-generated voice.
How Can You Stay Safe?
If AI can sound identical to our loved ones, how can we avoid being tricked? For one, remember the tactics scammers usually employ, such as using fear as a motivator, making urgent requests, or dangling too-good-to-be-true offers. As soon as you recognize any of these, be on your guard.
As convincing as a phone call might be, there are two methods that experts, including the Federal Trade Commission, often suggest:
- Create a safe word. This is a secret word, phrase, or expression that only you and your loved ones know. Agree on it in advance, practice asking for it so it feels natural, and remind each other of it from time to time. Whenever you get a suspicious call, ask for this phrase.
- Have someone else dial the person's number from their own phone. Cybercriminals can not only clone your loved one's voice but also spoof their number, making it impossible to tell at a glance whether the call is a scam or a real emergency. Calling the person directly lets you quickly verify whether it's really your loved one on the line or a fake.
Above all, be careful about what you choose to post online. Social media may be everywhere in our current era, and for many of us it's the main way we stay connected with friends and family, but as technology and AI grow more advanced, we have to stay vigilant about what we share.


