Scammers are using AI to clone kids’ voices and fool parents in India


Audio deepfakes can be created with a real clip of the target’s voice. - LIANHE ZAOBAO

NEW DELHI: When a father in India received a call from an unknown overseas number in January, he did not expect he would be the latest to fall for an elaborate fraud scheme involving artificial intelligence (AI).

The scammer, who claimed he was a police officer, told Himanshu Shekhar Singh that his 18-year-old son had been caught with a gang of rapists and needed Rs30,000 (S$486) so he could clear his name, Indian media reported on Feb 12.

Singh told The Indian Express of the Jan 8 incident: “The next minute, I heard a voice saying, ‘Papa please pay him, they are real policemen, please save me’.

“I could not doubt even for a second that he was not my boy. The style of speaking, crying... everything was the same.”

Though suspicious, he feared the caller might be a kidnapper, so he made an initial payment of Rs10,000 (S$162).

He then decided to look for his son himself. The teenager was later found, unharmed, taking a test at an education centre.

This was one of three prominent cases to rock New Delhi and its National Capital Region in recent weeks as scammers tap into AI to develop convincing fake voice recordings of children to trick their parents into transferring money.


In a similar case, a mother from Noida, in Uttar Pradesh, received a scam call where the scammers used technology to mimic her son’s voice.

Fortunately, her son was studying in front of her when the scammers called.

The mother, a journalist who was not named in the report, said: “It is a huge concern that cyber criminals are targeting children now. From where they are getting details of kids and their parents?... This must be thoroughly investigated with utmost priority.”

“Such cases are not very frequent, but recently there has been an uptick in cases of ‘cloning’. We are trying to understand how exactly cyber criminals are creating cloned voices to dupe people,” said Delhi police officer Manish Kumar Mishra.

Such voice cloning cases have happened in other parts of the world.

In May 2023, police in a region of Inner Mongolia were alerted to a case where a scammer used face-swopping technology to impersonate a victim’s friend during a video call.

Believing that his friend needed to pay a deposit to complete a bidding process, the victim transferred 4.3 million yuan (S$805,000) to the scammer.

He realised he had been duped only after the friend said he knew nothing of the situation.

In Hong Kong, a multinational company was cheated of HK$200 million (S$34 million) after an employee attended a video conference call with deepfake recreations of the company’s Britain-based chief financial officer and other employees.

The “fake” colleagues had ordered the employee to transfer the sum to separate accounts, and the victim complied. - The Straits Times/ANN

