‘Cyberbullying laws need more bite’


As social media use becomes widespread among Malaysians, more people are falling victim to cyberbullying. — Filepic

ON HER wedding day, Suzana, 32 (not her real name), posted a photo of herself and her husband in their bridal outfits on social media.

What followed was a torrent of hurtful comments, several of which questioned what her husband saw in her.

“It ruined what was supposed to be the happiest day of my life.

“I did not expect to face online harassment on my wedding day. So I decided to deactivate my account after that,” she said.

Wide use of smartphones and messaging apps has led to an increase in cyberbullying. — Filepic

More people like Suzana are finding themselves victims of cyberbullying, which the Oxford dictionary defines as the use of electronic communication to bully a person, typically by sending messages of an intimidating or threatening nature.

It can take place on social media, messaging and gaming platforms, as well as through communications on mobile phones.

There is no legal definition of cyberbullying in Malaysia, which makes dealing with the issue difficult.

Supt Junainah M. Kasbolah heads the police’s Classified Crime Investigation Unit under Bukit Aman Prosecution/Law Section, which specialises in investigations related to the 3R issues – race, religion and royalty.

She said the authorities currently rely on Section 233 of the Communications and Multimedia Act 1998 (Act 588) in dealing with cyber-related cases.

Harme: Sites often fail to grasp the connotation that certain words and phrases might carry.

The Act covers obscene, indecent, false, menacing or hurtful content.

“We have to look at the content, be it in the form of a video, text or message.

“We will then determine if the content is offensive before further action can be taken,” she told StarMetro.

Under Section 233, the offender can be fined not more than RM50,000 or imprisoned for up to one year, or both.

The offender can also be fined RM1,000 for each day that the offence is continued after conviction.

At present, the Malaysian Communications and Multimedia Commission (MCMC) does not have a specific category for cyberbullying, said its Network Security Division head Harme Mohamed.

As such, each cyber-related complaint is dealt with on a case-by-case basis.

This also means there is a lack of data on the number of cyberbullying cases in Malaysia.

“Each case that we receive is treated as a regular complaint.

“We may classify the issue as sexual harassment or threat based on the nature of the content, such as the spread of personal photos,” he said.

Individuals finding themselves at the receiving end of online harassment are encouraged to make a police report.

Supt Junainah said victims of online abuse must obtain the URL (uniform resource locator) of the offensive content when lodging a report.

“A screenshot alone is not sufficient. Without the link, it will be difficult to find the post or comment in question.

“This is because the offensive remark could have been made some time ago and was only discovered recently,” she said.

Supt Junainah said such information would be needed for the police to request a profiling report from MCMC.

“MCMC will produce a time-stamp of the content. If it has been removed, this information will be reflected in the report.

“This process usually takes between 24 hours and five days,” she said, adding that the Act empowered Malaysian authorities to request the removal of certain content.

But Harme said MCMC often had to deal with pushback from social media platforms.

“These platforms operate globally. Their community standards may not align with Malaysian values and laws.

“We can ask them to remove certain content, but it will take time as they have their own internal processes,” he said.

In recent months, the government has faced pressure from stakeholders to impose stronger regulations on social media companies.

There were calls to compel messaging app Telegram to cooperate over the spread of illicit content.

On June 19, Communications and Digital Minister Fahmi Fadzil announced that Telegram had agreed to work with the authorities.

On June 23, MCMC announced it would take legal action against Meta – the parent company of Facebook, WhatsApp and Instagram – for its lack of cooperation.

The commission said Meta had failed to remove a significant volume of harmful content across its platforms despite numerous requests.

Inadequate oversight

Last month, a social media user, who only gave his name as Amir, reported a social media post which he felt carried racist undertones.

Several days later, he received a notification saying no action had been taken as no violation of community standards was found.

“I think they could not understand the colloquial language in which the post was made,” he told StarMetro.

Harme said tech platforms used algorithms to filter harmful content, aside from having local moderators.

“But some content does manage to escape the filter. These sites often fail to grasp the connotation that certain words and phrases might carry,” he said.

Harme gave the example of posts circulated during the 15th General Election that referenced May 13, 1969, and carried images of keris and other weapons.

“To these social media platforms, these are just images of objects. But to Malaysians, they carry a certain insinuation,” he said.

Supt Junainah: The authorities rely on Section 233 of the Communications and Multimedia Act 1998.

Supt Junainah said her 3R unit received a large number of complaints related to social media use in 2021 and 2022.

“In 2021, we recorded more than 200 complaints. In 2022, there was a slight drop to around 190 cases.

“The coronavirus lockdowns implemented during these two years might have contributed to the trend,” she added.

Supt Junainah said the Act needed to be updated to give the authorities more power when carrying out investigations on cyber cases.

At present, offences under the Act are non-seizable, which means investigations require a warrant.

“The police also need an ‘order to investigate’ from the Deputy Public Prosecutor before initiating a probe,” she said.

Supt Junainah also stressed that a stronger penalty was needed to deter future offenders.

Harme said Malaysia should consider working with its Asean neighbours to formulate a shared framework that tech platforms had to comply with.

“Our country’s population of 32 million constitutes a small segment of the digital platforms’ market, which means we have limited influence.

“But collectively, as an Asean bloc, we will have a better chance of effecting change and getting these platforms to listen,” he pointed out.

Worldwide concern

The US-based non-profit Anti-Defamation League (ADL) reported that more than half of American adults said they experienced online hate or harassment.

In the executive summary of its report “Online Hate and Harassment: The American Experience 2023”, ADL said online hate and harassment rose sharply over the past year for teenagers aged 13 to 17 as well as adults.

“Among adults, 52% reported being harassed online, the highest number we have seen in four years, up from 40% in 2022.

“Both adults and teenagers also reported being harassed within the past 12 months, up from 23% in 2022 to 33% in 2023 for adults, and 36% to 51% for teenagers,” the report stated.

The report also showed that 54% of online hate took place on Facebook.

Many countries have taken steps to curb the problem of cyberbullying.

In Singapore, the Protection from Harassment Act 2014 provides legal recourse to victims.

Under this law, they may apply for a protection order to stop the harasser and others from republishing the distressing content.

In Germany, the Network Enforcement Act 2018 requires social media platforms to promptly remove illegal content.

Such content ranges from insulting public officials to threats of violence.

France passed a law in 2020 requiring tech platforms to remove certain content within 24 hours.

The regulation calls for such sites to remove hateful comments based on race, religion, sexuality, gender and disability.

Australia’s Online Safety Act 2021 compels tech platforms to remove content deemed bullying within 24 hours.

Under this enhanced rule, adults who are being bullied can report incidents to Australia’s eSafety Commissioner.

The improved Act also shortened the removal timeframe of bullying content involving children, from 48 hours to 24 hours.
