Singapore Can Order Social Media Platforms To Take Down "Egregious" Content, Under New Bill
The Singapore Parliament, on Wednesday, passed an Online Safety Bill that would hold social media platforms accountable for any harm that may come to local users.

The legislation, which received unanimous support from MPs, could take effect as early as 2023, The Straits Times reports.
Under it, the Infocomm Media Development Authority (IMDA) has the power to order social media platforms, such as Facebook, Instagram, YouTube and TikTok, to take down content it deems “egregious”. This covers posts that advocate or support suicide, self-harm, child sexual exploitation and terrorism, along with any materials that pose a risk to public health or incite racial or religious tensions.
Social media platforms can face fines of up to S$1 million and risk being blocked from the country if they fail to comply. Internet service providers like Singtel, StarHub and M1 could also be slapped with fines of up to S$500,000 if they don't block the platforms in question.
The Bill was debated by 16 MPs from both sides of the House in the lead-up to its passing. A number of MPs, however, felt that its reach should be broader, covering more forms of harmful content, including material that depicts animal abuse or promotes unrealistic beauty standards.

Another concern raised was that the Bill could compromise democratic freedoms, since the IMDA would essentially have the power to decide what content is and isn't allowed on these platforms. Communications and Information Minister Josephine Teo asserted that this would not be the case.
"I would also like to remind members of the overarching purpose of the Bill – that is, to provide a safe environment and conditions that protects online users, while respecting freedom of speech and expression,” she said.
She added that the IMDA will stipulate a specific timeline for disabling access to content, depending on the severity of the harm it could cause. In line with this, social media platforms are also expected to act on user reports in a timely manner.
The new Bill also comes with a draft Code of Practice that will be imposed on regulated social media platforms to create safeguards for users, especially those under 18. These include tools that allow parents to manage their children's safety on these platforms and mechanisms for reporting unwanted interactions, among others. The code could be rolled out as early as 2023.
While most social media platforms already require users to be at least 13 years old to sign up, the age verification process can be easy to circumvent. For this reason, Teo said personal data can be used to implement age-appropriate policies. She also noted, however, that there is currently no international consensus on standards for effective and reliable age verification by social media platforms that Singapore can use as a reference.
"Instead, we will continue to closely monitor and extensively consult on the latest developments in age verification technology, taking into account data protection safeguards, and consider viable regulatory options,” she said.