Nashik police warn of rising AI-voice scams as fraudsters clone voices to demand urgent money

Cybercriminals in Nashik are using AI-generated cloned voices to impersonate relatives and pressure victims into urgent UPI transfers; police urge strict verification.

Nashik, November 29, 2025 – A new wave of cyber fraud has reached the city, with criminals using artificial intelligence to clone voices and trick people into sending money. According to police, several residents have reported receiving emergency calls that sound exactly like a relative or close friend. The caller usually claims to be in trouble and demands an immediate UPI transfer. The voice is convincing enough that many fall for the trap before realising it was never their family member on the line.

This method relies on AI tools that can mimic someone's voice using just a few seconds of audio. Scammers often collect samples from social media videos, WhatsApp recordings or any clip publicly available online. Once these samples are fed into voice-cloning software, the system can generate new sentences in the same voice, tone and style. The result is a realistic-sounding call that makes victims panic and respond without verifying.

Police in Nashik say the number of such complaints is rising. In almost every case, the scammer uses urgency as the main weapon, claiming an accident, medical emergency or legal trouble. Victims are told not to call back and to transfer money immediately. Because the voice feels familiar, many obey without cross-checking, only to discover later that they were deceived by AI-generated audio.

Cyber experts warn that such scams are becoming more common across India. They say voice cloning has moved from experimental technology to an easily accessible tool, making it attractive for fraudsters looking for quick gains. Once money is transferred through UPI, it is often moved across multiple accounts within minutes, leaving little chance for recovery.

Nashik police have urged residents to stay alert, stressing that no matter how real the call sounds, verification is essential. The recommended approach is simple: disconnect the call and ring the actual person on their known number. If they are safe and unaware of any emergency, the call was fake. Officers have repeatedly reminded the public that emotional pressure is the biggest red flag.

Authorities also cautioned people not to click on payment links sent during such calls, not to share OTPs or PINs, and not to reveal personal banking details under any circumstances. They advised against reacting instantly to unexpected SOS calls, especially if the caller insists on secrecy or demands immediate money transfer.

Some residents who reported these incidents said they were shocked at how accurate the cloned voice sounded. Many initially believed the calls to be genuine because the speech patterns matched perfectly. This, police say, is the precise reason the public must adopt a habit of double-checking before making any financial move.

The rise of this scam has triggered discussions on how vulnerable people are becoming as AI tools get more sophisticated. Officials say the best defence is awareness and calm decision-making. Panic, they explained, is exactly what scammers rely on.

Police have asked victims to report incidents promptly through the national helpline 1930 or the cybercrime.gov.in portal so that patterns can be tracked and potential links between cases identified. They also plan to run awareness drives in the coming weeks to help citizens understand the risks associated with AI-generated content.

As the technology continues to evolve, experts believe voice scams may become even harder to detect. For now, verification and caution remain the most reliable ways to stay safe.

