Scammers are using AI to clone kids’ voices and fool parents in India


Audio deepfakes can be created with a real clip of the target’s voice. - LIANHE ZAOBAO

NEW DELHI: When a father in India received a call from an unknown overseas number in January, he did not expect to become the latest victim of an elaborate fraud scheme involving artificial intelligence (AI).

The scammer, who claimed he was a police officer, told Himanshu Shekhar Singh that his 18-year-old son had been caught with a gang of rapists and needed Rs30,000 (S$486) so he could clear his name, Indian media reported on Feb 12.

Singh told The Indian Express of the Jan 8 incident: “The next minute, I heard a voice saying, ‘Papa please pay him, they are real policemen, please save me’.

“I could not doubt even for a second that he was not my boy. The style of speaking, crying... everything was the same.”

While he was suspicious, he feared the caller was a kidnapper, so he made an initial payment of Rs10,000 (S$162).

He then decided to look for his son himself. The teenager was later found unharmed, taking a test at an education centre.

This was one of three prominent cases to rock New Delhi and its National Capital Region in recent weeks as scammers tap into AI to develop convincing fake voice recordings of children to trick their parents into transferring money.


In a similar case, a mother from Noida, in Uttar Pradesh, received a scam call where the scammers used technology to mimic her son’s voice.

Fortunately, her son was studying in front of her when the scammers called.

The mother, a journalist who was not named in the report, said: “It is a huge concern that cyber criminals are targeting children now. From where they are getting details of kids and their parents?... This must be thoroughly investigated with utmost priority.”

“Such cases are not very frequent, but recently there has been an uptick in cases of ‘cloning’. We are trying to understand how exactly cyber criminals are creating cloned voices to dupe people,” said Delhi police officer Manish Kumar Mishra.

Such voice cloning cases have happened in other parts of the world.

In May 2023, police in a region of Inner Mongolia were alerted to a case where a scammer used face-swopping technology to impersonate a victim’s friend during a video call.

Believing that his friend needed to pay a deposit to complete a bidding process, the victim transferred 4.3 million yuan (S$805,000) to the scammer.

He realised he had been duped only after the friend said he knew nothing of the situation.

In Hong Kong, a multinational company was scammed out of HK$200 million (S$34 million) after an employee attended a video conference call with deepfake recreations of the company’s Britain-based chief financial officer and other employees.

The “fake” colleagues had ordered the employee to transfer the sum to separate accounts, and the victim complied. - The Straits Times/ANN
