PETALING JAYA: Social media and online messaging platforms should provide tools and settings for young users.
“This includes options for parents to limit screen time and restrict access to harmful content, content not appropriate for their age and potentially harmful interactions,” said the Malaysian Communications and Multimedia Commission’s (MCMC) new Code of Conduct, which was published yesterday.
It said service providers should also put in place effective age verification measures to prevent children from accessing harmful content.
Such harmful content includes child sexual abuse material, non-consensual sharing of intimate content, content used to conduct scams and content used to bully others.
In the code of conduct, which sets out best practices for these platforms, the MCMC said sole reliance on self-declaration of age was inadequate, as this measure could be easily circumvented.
The MCMC said the platforms should also provide tools and settings that empower child users to protect themselves from harmful content.
Apart from that, the code said users should be able to use the platform safely, safeguarded against exposure to harmful content in a manner that respects users’ rights to freedom of expression and privacy under Malaysian laws.
“Therefore, service providers should set clear and robust systems, mechanisms and procedures for timely identification, assessment and removal of harmful content.
“They should also ensure that a dedicated local content moderation team is always in place within Malaysia, equipped with training and support to understand Malaysia’s local nuances, context and sensitivities,” it said.
It added that platforms should provide prompt and effective responses in both standard and crisis situations.
The platforms should also prevent and eliminate child sexual abuse material by immediately reporting such content, upon identification, to the relevant law enforcement agencies, including the MCMC.
“They should also ensure its removal within 24 hours from the time of said reporting, or dealing with said harmful content in such a manner as directed by the law enforcement agency,” it said.
Platforms are also expected to implement policies to address users who create or distribute harmful content, such as warning, reprimanding, restricting, suspending or terminating user accounts that repeatedly breach community guidelines or break Malaysian laws.
The code of conduct highlighted that service providers must be held accountable to ensure a safer online environment.
To do this, such online platforms should, among other measures, submit half-yearly online safety reports to the MCMC, including the results of assessments of systemic risks on the platform.