PETALING JAYA: Every day, an average of about 850 TikTok videos in Malaysia are taken down because they have elements of harassment and bullying.
A total of about 78,000 videos in Malaysia with such content were removed by TikTok during the first three months of this year, based on data from the social media giant.
However, such numbers are likely just the tip of the iceberg when it comes to painting the full picture of cyberbullying, say mental health experts.
They suggest platforms take more steps to protect users, including speeding up action against online bullies, strengthening user authentication to prevent bullies from creating new accounts, and reintroducing breakers to newsfeeds.
At the same time, more should be done to educate the public on how to maintain their mental health online.
Cyberbullying was recently tossed into the spotlight after the death of social media influencer A. Rajeswary, better known as Esha.
On July 5, she was found dead after becoming a victim of harassment and online threats on TikTok.
Under the safety and civility policy in its community guidelines, TikTok does not allow harassment or bullying.
Clips with such elements are removed from the platform, alongside other types of breaches.
Beyond the reported numbers, Malaysian Mental Health Association (MMHA) president Datuk Dr Andrew Mohanraj said there were likely more cases of social media users who have been emboldened to bully and harass others.
“It is a challenge to go after every case of online bullying and harassment.
“But with the availability of artificial intelligence, together with human moderators, it should not be too difficult to identify and mitigate harmful content,” said the consultant psychiatrist.
Clinical psychologist Sanghamitra Gupta said taking down videos that seek to harass and bully others was a start, but many other cases still go under the radar.
On the average of 850 bullying videos taken down each day, she said such videos were first flagged or reported as harassment, then reviewed, and finally removed.
“By the time they are taken down, sometimes the damage is already done.
“TikTok has millions of users in Malaysia. By comparison to that, 850 videos being taken down is a rather small number to what I believe is actually a much larger problem,” said Sanghamitra, who is a psychology lecturer at HELP University.
It was reported that TikTok had 28.68 million users aged 18 years and older in Malaysia as of early this year, based on the platform's parent company ByteDance's advertising resources.
In its community guidelines, TikTok said it wanted to ensure that anyone can share their voice without the fear of being degraded or bullied.
“We do not allow harassing, degrading, or bullying statements or behaviour.
“This includes responding to such acts with retaliatory harassment,” it said on its website.
Following Rajeswary's case, Minister in the Prime Minister’s Department (Law and Institutional Reform) Datuk Seri Azalina Othman Said announced that the government was considering amending the Penal Code to introduce specific provisions for the offence of cyberbullying.
Platforms are also set to be regulated – all social media and Internet messaging services with at least eight million registered users in Malaysia must apply for a Class Licence, a requirement that takes effect on Jan 1.
Videos removed on the rise
In Malaysia, TikTok removed an overall total of 1.25 million videos from January to March this year that had flouted various policies under its community guidelines.
This is more than double the 570,804 videos removed in the same period last year.
There are six categories of policies in TikTok's community guidelines – integrity and authenticity, mental and behavioural health, privacy and security, regulated goods and commercial activities, safety and civility, and sensitive and mature themes.
From January to March this year, 28% of the videos removed in Malaysia from the platform involved sensitive or mature themes that may be considered offensive.
The second biggest category was safety and civility, which includes cyberbullying.
Meanwhile, other platforms like YouTube, Facebook and Instagram have also seen an uptick in enforcement activity worldwide against content with bullying and harassment components.
For YouTube, the number of videos removed for harassment and cyberbullying globally had spiked in the past three years.
From 149,802 in January to March 2021, the volume of videos nearly tripled to 448,313 in the same period this year.
“We don’t allow content that targets someone with prolonged insults or slurs based on their physical traits or protected group status, like age, disability, ethnicity, gender, sexual orientation, or race,” the video-sharing platform said in a statement on its website.
Similarly, there has also been a surge in action taken by Facebook and Instagram against content that went against their standards for bullying and harassment.
Such content includes posts, photos, videos and comments.
“Taking action could include removing a piece of content from Facebook or Instagram.
“It could also include covering photos or videos that may be disturbing to some audiences with a warning, or disabling accounts,” said Meta Platforms Inc, the company that owns and operates both social media platforms, on its website.
On Facebook, some 7.9 million pieces of content were found to have breached its standards in the first three months of this year – up from 6.9 million in the same period last year.
Instagram also showed a steady increase in enforcement activity against pieces of content that went against its guidelines, from around 6.6 million in January to March last year, to 10.3 million in the same period this year, a 56% increase.
However, Meta said its “content-actioned” numbers were only part of the story.
“It doesn't reflect how long it took to detect a violation or how many times users saw that violation while it was on Facebook or Instagram,” the company said on its online transparency page.
Stating that bullying and harassment were not tolerated, Meta added that it recognised that bullying can be especially harmful to minors and as such, its policies provided heightened protections for them.
Platforms can do more
While social media platforms are enforcing their guidelines and standards, experts believe more can be done to better protect users.
This includes quicker action in shutting down online bullies, better user authentication, and having breakers for feeds on social media.
Malaysian Society of Clinical Psychology president Joel Low said better authentication would help prevent online bullies from signing up for new accounts to harass others.
“Re-introducing breakers to social media feeds would be good as well.
“In the past, Facebook did not have infinite feeds, meaning that at some point you would reach the end of the posts you were viewing and you had to opt in for more.
“Such breakers were a great way to remind you that you were doomscrolling (spending a lot of time reading negative news online) and break you away from social media,” he said.
Low also called for the liberation of the social media algorithms that govern the content being pushed to users.
“Right now, we’re being fed content that they think we want, and that can sometimes mean pushing us right up against our bullies or the realm in which we’re bullied.
“Liberation would allow for more options for us to engage with different content and people,” said Low, who is The Mind Psychological Services and Training director.
Dr Mohanraj said the time had come for the government to demand platforms be transparent about their content moderation policies and algorithmic changes.
“We need to establish accountability measures for content that incites rage and division,” he urged.
He also suggested that the Malaysian Communications and Multimedia Commission (MCMC), as a regulatory body, develop ethical standards for social media usage.
“Transparency about how algorithms work and the criteria they use to promote content can help users understand why we see certain posts and videos.
“Platforms have the obligation to provide us with more control over our content feeds, allowing us to customise our experience and reduce exposure to content that consistently makes us angry.
“Ultimately, social media companies cannot run away from their moral responsibility to consider the psychological and societal impacts of the design of their platforms,” Dr Mohanraj said.
Lauding the proposed tighter laws on cyberbullying, Sanghamitra said perpetrators must know that online bullying or harassment can have serious consequences.
“This is especially when minors are involved,” she said.
But above all, Sanghamitra urged bullying victims to inform the authorities and their family members about their ordeal and to seek mental health support.
“This is because the victim tends to internalise and take the mean comments personally – even leading to self-harm and suicide,” she said.
Those suffering from mental health issues or contemplating suicide can reach out to the following support services: Mental Health Psychosocial Support Service (03-2935 9935 or 014-322 3392); Talian Kasih (15999 or 019-261 5999 on WhatsApp); Jakim’s Family, Social and Community Care Centre (011-1959 8214 on WhatsApp); and Befrienders Kuala Lumpur (03-7627 2929, www.befrienders.org.my/centre-in-malaysia, or sam@befrienders.org.my).