More needed to protect Hong Kong victims whose intimate photos were posted online without consent, NGO says


RainLily says that from April 2021 to March 2023 it received 646 requests for assistance relating to image-based sexual violence. Half of the images the NGO dealt with were distributed by people the victims knew, including intimate partners. — SCMP

An anti-sexual violence crisis centre has appealed to Hong Kong’s privacy watchdog to give greater protection to people who have struggled to get intimate photographs removed from the internet.

RainLily on Tuesday said it had received 646 requests for help between April 2021 and March 2023 related to image-based sexual violence.

It helped people demand that online platforms take down 1,342 non-consensual intimate images. Almost 1,200 of these images, or 89%, were eventually taken down.

The NGO said many people wanted to remove individual pictures that showed facial features.

Jacey Kan (left), the senior advocacy officer at RainLily, and Doris Chong, the organisation’s executive director, have appealed to Hong Kong’s privacy watchdog to better protect people who have problems removing intimate images from the internet. Photo: RainLily

But under the Personal Data (Privacy) Ordinance as it stands, the distribution of intimate images without other kinds of personal information is not a breach of the law.

“The Office of the Privacy Commissioner for Personal Data should further review at what level intimate images can be defined as personal data, and assist those experiencing non-consensual distribution of intimate images in a more proactive manner to live up to the public’s expectations,” Jacey Kan Man-Ki, a senior advocacy officer at the group, said.

The commissioner’s office said that, generally, if the identity of the individual concerned could be ascertained from an intimate image, the image might constitute personal data under the ordinance.

Deputy executive director Doris Chong said RainLily had noticed an increase in requests for help to take down intimate pictures online.

“Despite the introduction of specific criminal offences in October 2021, RainLily’s frontline experience suggests that those facing non-consensual distribution and threatened distribution of intimate images are still hesitant to report such incidents,” she explained.

Kan added that the NGO’s service of asking online platforms to remove images could help prevent victims’ identities from being exposed.

Statistics show that most victims were female (71.3%), while around a fifth (22.2%) were male.

Half of the images the NGO dealt with were distributed by people the victims knew, including intimate partners (27.5%) and online acquaintances (15.8%). But one-third of the victims could not identify the person who posted their pictures.

Kan said most of the victims were afraid the pictures would be discovered by their loved ones.

“They worried their families, friends, and colleagues could find those images online. Even if the contents were removed, they would still fear the images being uploaded again and even being body-shamed online, which traumatised the victims,” she explained.

Of the 1,342 intimate photographs, 42% were found on pornographic sites, a quarter (25%) on social media platforms and 22.7% on search engines.

The remainder were found on content farms (5.3%) and image-hosting websites (5%).

RainLily, however, said it faced a variety of obstacles in helping victims remove images.

The NGO said content posted on pornographic sites was spread on an “exponential scale” compared to other platforms, and 25 sites never responded to the NGO and took no removal actions.

Most of the sites only prohibited material that violated the owner’s copyright.

But the copyright in non-consensual intimate images usually belongs to the “photographer”, making it difficult to have such content completely removed.

The NGO also pointed out that channels on Telegram often posted links to content farm websites. Posts on content farms included non-consensual pictures, the victims’ personal information and a fabricated story surrounding the image.

The same content was spread across subgroups of these channels, each with hundreds to thousands of subscribers.

“Although these platforms have policies and reporting channels, they do not respond to and deal with removal requests effectively,” Kan said.

The criminal offences covering image-based sexual violence came into effect in October 2021, but RainLily said prosecution and conviction rates remained low.

It added that the legislation had limitations: there was no legal obligation to remove the distributed content when a perpetrator was charged.

The images would only be removed when legal proceedings started, which was too late to stop the pictures from circulating online. – South China Morning Post
