A Massachusetts man is facing a “revenge porn” charge after authorities said he posted a fake, explicit photo of an acquaintance to several of his social media accounts.
The person had no idea Jason Kehoe repeatedly shared the image, which appeared to show them nude, until a detective notified them, the West Springfield Police Department said in a Jan 2 news release. The detective learned of the photo on Jan 1, according to police.
Kehoe digitally edited a real photo of the individual and “convincingly made it appear as if the victim was photographed in a state of nudity”, an investigation revealed, according to police.
When he shared the photo online, he captioned it with “intentional derogatory and personally directed comments”, police said.
Detectives arrested Kehoe at his West Springfield home on a charge of distributing visual material to harass, police said. West Springfield is about a 90-mile drive southwest of Boston.
Information regarding Kehoe’s legal representation wasn’t immediately available.
Authorities said “the victim had never had any form of romantic or substantial relationship with Kehoe, and only knew him as an acquaintance”.
After the detective notified the person of the seemingly real, explicit image, they showed the detective their “original unaltered online photo-image”, police said.
Under Massachusetts state law as of September, it is illegal to digitally alter a photo of a person so that it realistically appears to show them partially or fully nude, and to distribute that image.
In 2023, a New York man was sentenced to six months in prison in connection with creating “deepfake” pornography from old social media photos of more than a dozen women, taken when they were underage, according to authorities, McClatchy News previously reported.
Patrick Carey, of Long Island, altered their photos from when they were in middle school and high school to make the pictures appear explicit, then posted the images to a pornographic website, according to the Nassau County District Attorney’s Office.
In the past two years, reports of artificial intelligence being used to create nude images of children as a means to sexually exploit them have been on the rise, the National Center for Missing and Exploited Children reported Dec 13.
The organisation said that while “generative artificial intelligence (GAI) has revolutionised how we interact with technology”, it also “introduces significant risks, particularly to children”.
The West Springfield Police Department didn’t specify how Kehoe may have digitally altered the photo of his acquaintance.
McClatchy News contacted police for more information Jan 3 and was awaiting a response. – The Charlotte Observer/Tribune News Service