New form of AI scam: Cybercriminals are cloning your voice to cheat you. How to identify a deepfake call and protect yourself

Deepfake call: Last month, a 43-year-old marketing professional in Bengaluru received a panicked call from his “daughter”. She said she was in hospital and needed ₹50,000 immediately. The voice sounded completely real: the same tone, the same manner of speaking, the same way of saying “Appa”. He transferred the money without thinking. In reality, his daughter was sitting in class at her college.

This was not a movie plot but an AI-generated deepfake call, a technique that can copy anyone’s voice almost exactly.

Today, such voice scams are rising rapidly in big Indian cities like Bengaluru, Mumbai, and Delhi. They target not only the elderly but also educated professionals, students, and even startup CEOs.

How does an AI voice scam work?

Scammers no longer need to hack your phone or steal your SIM. All they need is a 30-second clip of your voice, which they can easily pick up from an Instagram Reel, a YouTube video, or a WhatsApp forward.

With AI tools such as ElevenLabs, Descript, or open-source voice cloning software, they can then reproduce your voice in any language. That cloned voice is fed into a script built around fake stories: a medical emergency, a police threat, a bank loan, or a kidnapping. The caller ID can also be spoofed, so the person receiving the call believes it is coming from someone close.

What the data shows

According to a 2025 report by the Indian Cyber Crime Coordination Centre (I4C), more than 2,800 deepfake call scam cases were reported between January and May, and their number has risen by 200% in metro cities.

Most cases fell into one of these patterns:

  • A fake call from a “family member” in distress
  • Threats in the name of a bank or the police
  • Requests for personal data from a fake employer

Bengaluru topped the list, followed by Mumbai, Hyderabad, and Delhi NCR.

Who is being targeted?

It is wrong to assume that only the elderly fall victim. These days professionals, students, YouTubers, and even startup founders are falling prey, because their voices are already on the internet: in LinkedIn interviews, Instagram Reels, podcast clips, and more.

A startup CEO from Hyderabad was about to pay up after receiving a voice note from a “vendor”, but the scam was caught at the last minute on a video call.

Why is this more dangerous in India?

India’s linguistic diversity and strong family ties make such scams more dangerous here. AI can imitate tone and accent not only in English but also in languages like Hindi, Tamil, Marathi, and Bengali, and people in India tend to treat a familiar voice as proof of identity. If a voice sounds like a “son”, a “boss”, or a “bank manager”, most people trust it without a second thought.

How to identify a deepfake call?

  • There is no background noise; the voice sounds unnaturally clean (a rough software check for this is sketched after the list).
  • The conversation loops: ask something unexpected and the caller repeats the same scripted lines.
  • The caller gets stuck on personal questions, such as a birth date or an inside family detail.
  • Ask for a video call; a scammer will usually hang up immediately.
  • Prefer well-known apps such as WhatsApp or Telegram, where a caller’s identity is easier to verify.
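The first sign above, audio that is too clean, is the one tell you can even probe roughly in software. The sketch below is a minimal illustration, not a real deepfake detector: it estimates a recording’s noise floor from its quietest moments, on the assumption that genuine phone audio almost always carries some ambient noise. The file name “suspicious_call.wav” and the -70 dBFS threshold are invented for this example, and the code assumes a 16-bit mono WAV file.

# Toy heuristic, for illustration only: flag recordings whose noise floor
# is implausibly low. Assumes a 16-bit mono WAV; threshold is a made-up example.
import wave
import numpy as np

def noise_floor_dbfs(path, frame_ms=50):
    # Read the whole file as 16-bit PCM samples normalised to [-1, 1].
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        samples = np.frombuffer(wf.readframes(wf.getnframes()), dtype=np.int16)
    samples = samples.astype(np.float64) / 32768.0
    # Split into short frames and measure each frame's loudness (RMS).
    frame_len = max(1, int(rate * frame_ms / 1000))
    n = len(samples) // frame_len
    frames = samples[: n * frame_len].reshape(n, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1)) + 1e-12  # avoid log(0)
    # The quietest frame approximates the background noise floor.
    return 20 * np.log10(rms.min())

floor = noise_floor_dbfs("suspicious_call.wav")  # hypothetical recording
print(f"Estimated noise floor: {floor:.1f} dBFS")
if floor < -70:  # arbitrary example threshold, not a reliable detector
    print("Suspiciously clean audio; verify the caller another way.")

Treat a result like this as one weak hint among many; the behavioural checks above (looping scripts, refusal of video calls) remain far more dependable for an ordinary user.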

What can you do now?

  • Install a caller ID app like Truecaller or Hiya.
  • Follow Cyber Dost (the government’s cyber awareness platform).
  • Limit how much of your voice you post publicly, such as long Instagram Reels, podcasts, etc.
  • Record suspicious calls (where legal).
  • Report the fraud immediately: call 1930 or file a complaint on cybercrime.gov.in.
