Published On: Jul 11, 2025

Tech Tips: How to Stay Safe from AI Impersonation

As cybercrime grows more sophisticated, AI impersonation is rapidly becoming a dangerous tool in the hands of criminals aiming to infiltrate organizations and governments. In a concerning new spin on the old cyber scam of pretending to be someone’s colleague, boss, or associate, criminals can now clone a voice from videos, clips, or calls, potentially bypassing voice-recognition security software or even fooling family members.

AI-generated deepfakes, cloned voices, and realistic fake messages are on the rise, and it’s crucial to know how to protect yourself and your loved ones. Here’s how you can stay safe:

1. Be Skeptical of Unusual Requests—Even from Familiar People

AI impersonation scams often rely on urgency and emotional manipulation. If you receive a message, call, or video that seems off—even if it looks or sounds like someone you know—pause and verify before acting.

Common red flags:

  • Sudden requests for money or sensitive information.

  • Odd or inconsistent language or typos in ‘official’ correspondence.

  • Communication outside of usual channels (e.g., a strange text instead of a call, or an email from an unfamiliar address).

2. Use a “Safe Word” for Family and Friends

One of the most effective ways to counter AI impersonation is to create a private code word or phrase that only you and trusted individuals know. If someone contacts you claiming to be a loved one in distress, ask for the safe word before engaging further.

3. Double-Check Through a Secondary Channel

If you get a suspicious message or call:

  • Call the person back using a verified number you have on file for them.

  • Ask questions only the real person would know.

  • Avoid responding directly to the suspicious message, and don’t click any links sent via text or email.

Never rely solely on texts, emails, or social media for confirmation. If you can’t meet the person you suspect is being impersonated face to face, try to at least get verbal confirmation through a channel you trust.

4. Limit What You Share Publicly Online

The more data scammers have, the easier it is for them to train AI tools to mimic your voice, appearance, or writing style. Once they’ve created a convincing artificial version of you, they may use it to approach your family members or colleagues.

To minimize your risk:

  • Set social media accounts to private.

  • Avoid posting detailed personal info (like birthdays, family member names, pet names, or favorite vacation spots).

  • Refrain from sharing long video or audio clips publicly unless it’s necessary.

5. Verify Video and Audio Before Trusting

Advanced AI tools can create deepfake videos or voice clones that look and sound incredibly real. Before acting on any message that feels unusual:

  • Check for inconsistencies in lip-sync, lighting, or background. AI-generated videos often show unnatural eye movement, repetitive head motions, or limbs and fingers that warp or blend into the background.

  • Run a reverse image search on any suspicious images (a rough do-it-yourself comparison is sketched after this list).

  • Use deepfake detection tools if available.
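
Reverse image searches are usually done through web services such as Google Images or TinEye. For readers who want a rough, do-it-yourself check, the sketch below compares a suspicious image against a photo you know to be genuine using perceptual hashing with the Pillow and imagehash Python libraries (both assumed to be installed; the file names are purely illustrative). It is a quick way to see whether two images appear to come from the same source picture, not a substitute for a proper reverse image search or a dedicated deepfake detector.

```python
# Rough check: compare a suspicious image against a known genuine photo
# using perceptual hashing. Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Hypothetical file names, for illustration only.
genuine = imagehash.phash(Image.open("known_genuine_photo.jpg"))
suspect = imagehash.phash(Image.open("suspicious_image.jpg"))

# Hamming distance between the two perceptual hashes:
# 0 means visually identical; small values mean very similar;
# large values mean the images differ substantially.
distance = genuine - suspect
print(f"Perceptual hash distance: {distance}")

if distance <= 8:
    print("Images are visually very similar (likely the same source photo).")
else:
    print("Images differ noticeably - treat the suspicious one with caution.")
```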

6. Enable Two-Factor Authentication (2FA)

Even if an AI impersonator tricks someone into revealing personal information, 2FA can act as a second line of defense.

Make sure that:

  • All critical accounts (banking, email, social media) have 2FA enabled.

  • You use an authenticator app instead of SMS when possible (the sketch after this list illustrates how these one-time codes work).

  • You never give anyone a code from your 2FA app. No Microsoft employee, or any other legitimate caller, will ever ask for it.
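
For readers curious why authenticator apps are preferred, the short sketch below uses the pyotp Python library (assumed to be installed) to illustrate time-based one-time passwords (TOTP), the standard behind most authenticator apps: codes are computed locally from a shared secret and the current time, and expire after roughly 30 seconds. It is a simplified illustration, not a production implementation.

```python
# Minimal illustration of time-based one-time passwords (TOTP), the scheme
# used by most authenticator apps. Requires: pip install pyotp
import pyotp

# When you enrol in 2FA, the service generates a shared secret (usually
# shown as a QR code) that only you and the service hold.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)  # 6-digit codes, 30-second window by default

# Your authenticator app computes the current code locally from the secret
# and the current time - nothing is sent over SMS, so it can't be
# intercepted in transit the way a text message can.
current_code = totp.now()
print(f"Current one-time code: {current_code}")

# The service verifies the code against the same secret and time window.
print("Code accepted:", totp.verify(current_code))
```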

7. Stay Educated and Keep Others Informed

Many victims of AI scams are caught off guard simply because they weren’t aware this kind of fraud was possible. Talk to family members—especially elderly relatives or teens—about the risks and how to respond.

Share news articles, videos, or even examples of AI-generated scams to help them recognize the threat. The first line of protection against these scams is simply knowing what to look for and staying informed about how this technology is evolving in the hands of people who would misuse it.

For further information on how to safeguard your company against these scams, and to learn how 10-100 Consultancy can offer you peace of mind against other rising cyber risks, give our experienced Sales Team a call today, or email sales@10-100.com!