Singapore’s Infocomm Media Development Authority (IMDA) made its first moves under updated regulations covering online safety, ordering Meta Platforms to remove a Facebook page involving child sexual exploitation material (CSEM) and directing ISPs to block access to an associated website.
IMDA stated Meta Platforms complied within 24 hours. It explained action was taken after local police alerted it to a Facebook page forming part of an online network enabling the sharing of illegal material.
The regulator subsequently found a Facebook group carrying similar posts, with links to a website hosting the content.
IMDA stated tackling the threat of harmful online content is a “global issue which requires a whole-of-society effort”.
The Singapore government strengthened its regulatory framework in 2022 by passing the Online Safety Act, which requires social media services to quickly block access to harmful content.
IMDA stated social media services “have a responsibility to ensure online safety for all users, particularly children”.
“We recognise that the industry has taken active steps in recent years to combat harmful online content on social media and urge social media services to remain vigilant in detecting and preventing the dissemination of harmful online content,” it added.
Earlier this week, Meta Platforms reportedly established a taskforce to probe claims its Instagram service hosts and distributes CSEM.