
AI voice scams are on the rise – here’s how to stay safe, according to security experts


  • AI voice-clone scams are on the rise, according to security experts
  • Voice-enabled AI models can be used to imitate loved ones
  • Experts recommend agreeing a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities and celebrities, but also friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk for phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human caller and the AI model. With a small amount of code, the models can be programmed to run phone scams automatically, pressuring victims into disclosing sensitive information.
