With growing calls from experts and the Malaysian government for stricter social media moderation, it’s easy to overlook the human toll exacted on those tasked with sifting through countless posts containing harmful and extremist content.
Many bemoan the state of content moderation across social media platforms, even as reports highlight the psychological trauma that exposure to such content can inflict on moderators.
One such report from The Guardian last year points out that moderators typically have to sift through content with explicit depictions of murder, torture, and child sexual abuse material (referred to as CSAM), among others, putting them under immense mental strain.
This rings true for Anne, a 26-year-old based out of KL who worked as a content moderator with a video-sharing platform for over two years before moving to a different industry.
While she found the job meaningful, describing it as feeling like being “the Internet police”, she also personally struggled with the emotional toll of constantly confronting disturbing content.
“The most challenging type of content for me to moderate was definitely anything related to child exploitation. It’s deeply unsettling and hard to process,” she says.
During her time in the field, she was placed in a team focused on hate speech, adult content, and violent extremist material, but this did not completely shield her from frequent exposure to material such as CSAM.
“It stays in your mind,” she says. “When moderating, I have to listen to the videos to check for policy violations too, which means I don’t just watch the content but also have to hear people’s screams and struggles during violent incidents.”
Another moderator, 25-year-old Hana from Seremban, says that she is constantly exposed to all types of content violations during her moderation work, “ranging from graphic violence to suicide and self-harm, animal abuse and also adult sexual content.”
Like Anne, Hana requested anonymity for this interview.
Trigger warning
Anne says that the challenges she faced did not end when she left her role as a content moderator. She continues to experience flashbacks of the disturbing content she encountered and says the experience has left her more paranoid.
Vinorra Shaker, Asia Pacific University of Technology and Innovation (APU) School of Psychology associate head, says that such frequent exposure over a long period of time can have severe implications on an individual’s mental health.
“This is because the brain’s stress response system can become overwhelmed by constant exposure to traumatic content. The amygdala, which is responsible for processing emotions, can become hyperactive, leading to heightened anxiety and fear.
“This can lead to symptoms such as flashbacks, nightmares, difficulty concentrating, and avoidance of triggers.
“Additionally, the constant exposure to distressing content can contribute to feelings of hopelessness, helplessness, and worthlessness, increasing the risk of depression,” she says.
She likened the human brain to a sponge, absorbing information from the surrounding world much as a sponge absorbs water.
“In psychology, this process is known as neuroplasticity, which is the brain’s ability to change and adapt. The brain can reorganise its structure in response to new experiences, similar to how a sponge can change shape.
“This adaptability allows the brain to learn and grow throughout life, much like a sponge can absorb different types of liquids. When you frequently see or hear about violence or disturbing events, your brain soaks up those images, sounds and experiences.
“This can have a significant impact on your mental health, both in the short and long term,” she says.
These effects can range from increased anxiety and stress, difficulty sleeping, mood swings, and reduced empathy to post-traumatic stress disorder (PTSD), desensitisation, and even aggressive behaviour.
The effects of such frequent exposure to harmful content can build up over the course of many years and become significantly more severe than those caused by short-term exposure.
“Over time, repeated exposure can desensitise individuals to violence, normalise harmful behaviours, and contribute to a distorted worldview.
“It can make harmful things seem normal or even acceptable. It’s like your brain gets used to seeing bad stuff, and it stops being as shocked by it, but it doesn’t mean it stops being affected by them,” she says.
She does note that the impact of repeated exposure to extreme or harmful content can vary greatly from person to person and can be mitigated depending on personal history, coping mechanisms, and the presence of support systems.
“Certain types of content are more likely to cause trauma. For example, content that is personal, graphic, or realistic can be particularly harmful.
“This includes things like real-life videos of violence, news reports about traumatic events, or personal accounts of abuse,” she says, further adding that the “individual’s personal experiences may also add to the trauma they experience from consuming certain types of content”.
“For example, a person who has experienced childhood abuse may be particularly traumatised by watching media depictions of abuse. The content might trigger painful memories and emotions, leading to intense distress and anxiety.
“Pre-existing mental health conditions, such as depression or anxiety, can make individuals more vulnerable to the negative effects of content consumption.”
For Hana, who says she is most deeply affected by content involving animal abuse, the impact is particularly personal.
“For me, it would have to be animal abuse. I’m a dog lover, and whenever I see such content or even animal slaughter, it gets to me,” she says.
However, she has developed ways to cope. “I manage to recuperate and continue moderating after shedding a few tears and reminding myself that these are recordings of past events, beyond my control.”
Hana emphasises the need for self-care to mitigate the negative effects of this sort of work.
“It’s important to know yourself and attend to your needs if something affects you,” she says.
Her other coping strategies include physical activities like yoga, gym workouts, meditation, and breathing exercises.
Support systems
For those in the content moderation line, Vinorra stresses the need for support systems outside the workplace to navigate the emotional challenge of their role and ensure their mental well-being and resilience.
The presence of strong support systems, for instance, friends and family, can combat the feelings of isolation and loneliness that may be exacerbated by the nature of the job.
“A content moderator’s support system outside of work is extremely crucial in mitigating the psychological effects of their job.
“Having friends and family to talk to provides a safe space for moderators to express their feelings, share experiences, and process the emotional weight of their work.
“Engaging with people outside the job can help moderators gain perspective, reminding them of the broader context of life beyond the content they encounter daily,” she points out, adding that hobbies and recreational activities can serve as outlets to relieve stress and disconnect while also promoting healthy coping strategies to manage the emotional toll.
Vinorra believes that social media companies have a major part to play when it comes to supporting the mental health of their moderation staff, which should come in the form of training and psychological support.
“Proper training can significantly help reduce the emotional impact of working as a content moderator.
“Given the nature of the job, which often involves exposure to distressing or harmful content, effective training can equip moderators with tools and strategies to manage their mental health and emotional resilience.
“If moderators feel isolated or unsupported, the emotional burden can grow heavier,” she says.
Examples of this include emotional resilience training and trauma-informed training, along with regular debriefing sessions and scenario-based training.
“Organisations that provide strong mental health support, training, and regular debriefing can help mitigate the negative effects of the job, making it more sustainable.
“A positive work culture, with open communication and recognition of the challenges moderators face, can significantly impact job satisfaction and longevity,” she says.
In Anne’s experience, while companies do provide mental health support along with things like a blurring tool to obscure graphic videos and black-and-white filters to reduce the impact of disturbing visuals like blood, coping with the mental fallout of the role remains a challenge.
“While some support exists, it often feels insufficient given the nature of the work,” she says, noting that despite the wellness sessions and counselling offered by the company, the emotional toll persists.
A human touch
While some may hope for the introduction of artificial intelligence (AI) to lighten the load of extreme content that moderators have to deal with, the technology has garnered mixed results so far.
An October report from Forbes raised concerns over the increasing reliance on AI for content moderation on social media platform X (formerly Twitter), questioning whether such systems are capable of managing sensitive and nuanced issues.
Reuters reported the same month that short-form video-sharing platform TikTok had laid off hundreds of employees as part of its shift towards leveraging AI in content moderation.
From Vinorra’s perspective, AI on its own is not a perfect solution, but it can alleviate some of the psychological burden that moderators face.
“AI can certainly help reduce the psychological burden on human moderators by handling large volumes of content quickly and identifying obvious violations.
“Automated systems can flag inappropriate material, allowing human moderators to focus on more nuanced cases that require human judgment and empathy.
“This could lead to a more manageable workload for moderators, potentially reducing emotional strain,” she says.
Despite this, issues with context, nuance and more subtle human expressions remain, which could lead to errors or misjudgement.
“Sensitive content often requires a human touch to interpret intent and context accurately, making human moderators essential for effective content moderation.
“Additionally, the emotional and psychological complexities of certain content – such as graphic violence or hate speech – benefit from human empathy and understanding,” she says.
She believes AI is in a position to reduce the overall amount of harmful content that moderators encounter, a view shared by human moderators like Hana, who hopes that AI will take on a more complementary role in their work.
While AI could potentially handle more obvious violations, reducing the need for human review, it still falls short of having a human touch.
“The fact remains that artificial intelligence is not really on par with the human mind to assess all complexities within something like a video, thus, we will still need (human) content moderators for this very reason,” Hana says.