AI Impersonation Scams Are Skyrocketing in 2025: How to Stay Safe

By David
18.09.25 08:23 AM
AI impersonation scams have dramatically increased in 2025, growing by 148%. These scams leverage sophisticated AI technologies such as voice cloning and deepfake videos to convincingly mimic trusted individuals, tricking victims through calls, video meetings, messages, and emails. Criminals typically source only a few seconds of audio from public platforms like voicemails, interviews, or social media clips to replicate speech patterns.

Anyone can be targeted via multiple communication channels. The FBI has warned about AI-generated calls that impersonate politicians to spread misinformation. In the corporate world, scammers have conducted deepfake video meetings posing as executives to fraudulently authorize wire transfers. Scammers often collect images and videos from LinkedIn and social media to create convincing digital impersonations.

To stay safe, experts recommend slowing down and independently verifying any urgent or unusual request. Taking just nine seconds to pause can prevent a rushed decision. If you receive a suspicious call or video, hang up immediately and contact the person using a verified phone number. Enabling multi-factor authentication (MFA) adds an extra layer of security, making unauthorized access far more difficult. Watch for subtle red flags in deepfake videos and AI-generated voices, such as unnatural movements or inconsistent background noise.

Stay vigilant. Don’t take voice or video messages at face value. Verify identities independently and implement strong security practices to protect yourself and your organization from AI-based scams.

For more information and related articles, visit the original post at TechRadar.
