Will Online Safety Bill make online harms worse?

Wednesday, 15 November 2023

 


The Government claims that some individuals experience significant harm caused by online content. Some, if not all, critics of the Government’s Online Safety Bill will agree. The divergence is on whether this legislation is the solution. I outline below why the solution proffered by this Bill is ineffective. It would also make things worse for those who suffer the harm, and indeed for all who use the services provided by the global tech companies.



What is different about social media?

Governments have been dealing with electronic media for over eight decades. Now they are trying to regulate social media. It is important to understand what distinguishes social media from the old media.

Unlike radio and television broadcasting or newspapers, social media has millions of potential publishers. This makes it difficult to hold content creators accountable; in some cases, it is not even possible to identify them. And unlike in the old media, neither the publishers nor the platforms that facilitate their activities need be located in a specific country. This poses a serious problem for enforcing government decisions.

Viral dissemination is truly a novel feature. What is published on a social media platform can keep circulating even after the original is deleted. The rapidity of viral dissemination is the major cause of harm from defamatory, inflammatory, and similar content on social media. The fastest possible takedown of the offending messages, irrespective of the location of the message creator, is the response that would minimise the harm. Even better would be technological defences that prevent the post from going up in the first place.
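For readers who want to see why hours matter, the sketch below (in Python) uses a toy branching model in which every copy of a post gains views each hour and is reshared at a fixed rate until a takedown stops new activity. The reshare rate and views-per-copy figures are illustrative assumptions, not platform data.

    # Toy branching model of viral spread. The reshare rate and
    # views-per-copy figures are illustrative assumptions, not platform data.
    def cumulative_views(hours, reshare_rate=1.2, views_per_copy=200):
        copies, total = 1.0, 0.0
        for _ in range(hours):
            total += copies * views_per_copy  # views accrued this hour
            copies *= reshare_rate            # resharing multiplies the copies
        return int(total)

    print(cumulative_views(2))   # 440 views if taken down within two hours
    print(cumulative_views(48))  # ~6.3 million views if it runs for two days

No real post follows this exact curve, but the exponential shape is the point: a response measured in days rather than minutes cannot contain the harm.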



The ineffectiveness of the Bill

According to the drafters of the Bill, harm can be avoided or alleviated by a five-member Commission determining that a message qualifies as a prohibited statement (ex facie having the elements of the offences set out in the Bill) and issuing a notice to the originator of the message to take measures to prevent its circulation. If this is not done within 24 hours, the Commission will issue a takedown notice to the platform on which the message was posted or to the internet service provider. Again, the takedown is expected to occur within 24 hours.

The Supreme Court did not look kindly upon the procedure as originally proposed, because it gave the content generator no opportunity to defend himself or herself, thereby violating a core principle of natural justice. In its determination on whether the Online Safety Bill was consistent with the Constitution, the Supreme Court did not, as had been hoped, hold the entirety of section 26 to be an intrusion by an instrument of the executive into matters reserved for the judiciary. It did, however, require the Commission to give the party who generated the content a hearing.

 



Even with the original design that allowed for the issuance of ex parte takedown notices, a takedown would take three or four working days, at least. Now, with the changes mandated by the Supreme Court, it would take even longer, during which time the offending message would be seen by an increasingly large audience and possibly be shared and retweeted. If the filing of a complaint or investigation by the Commission were to be publicised, the Streisand effect is likely to kick in, making even more people aware of the offending message and ensuring it goes viral.
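A back-of-the-envelope tally shows where the multi-day estimate comes from. Only the two 24-hour windows come from the Bill itself; the durations I have assigned to complaint intake and to the hearing are assumptions for illustration.

    # Tallying the Bill's takedown timeline. The two 24-hour windows are
    # specified in the Bill; the other durations are assumed for illustration.
    steps_hours = {
        "complaint intake and Commission determination": 24,  # assumption
        "notice to the content originator": 24,               # per the Bill
        "hearing for the content generator": 24,              # assumption, post-Supreme Court
        "takedown notice to the platform or ISP": 24,         # per the Bill
    }
    total = sum(steps_hours.values())
    print(f"{total} hours, i.e. {total // 24} days before any takedown")
    # Output: 96 hours, i.e. 4 days before any takedown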

Neither the original solution nor the solution as modified by the Supreme Court determination addresses the phenomenon of viral dissemination. The notices are focused on the content originator. Even if he or she were to obey the notice, nothing would happen to the copies of the “prohibited statement” that would have spread across the internet. If the content creator is outside Sri Lankan jurisdiction, the Commission cannot even ensure that person follows its edicts. It would then have to compel the platform or the ISP to block the content. Those capable of the most effective response would be the platform companies. But they too are located outside Sri Lanka and are unlikely to respond with alacrity to coercion. Even the cooperation that is currently extended is likely to taper off.



What will work?

The solution is quick takedown, not only of the original post but also of the virally disseminated copies. The takedown should occur irrespective of the location of the original publisher and of the subsequent disseminators. Better still would be preventing the offending message from being published at all.

Only platform companies such as Meta, YouTube, and TikTok are capable of doing this. Governments cannot. The fact that no nude images are visible on Facebook is proof of the efficacy of the algorithmic response. The platform companies are private entities not obligated to give both sides a hearing. They can act with lightning speed to take down, or otherwise reduce the number of people who will see, a message that falls outside the standards embedded in their algorithms and procedures. For them, jurisdiction is immaterial.
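For a sense of how the platforms act at this layer, the sketch below shows the simplest form of pre-publication filtering: matching uploads against a blocklist of hashes of known-bad content, the idea behind systems such as PhotoDNA. The hash function and the blocklist entry are simplifying assumptions; production systems use perceptual hashes that survive cropping and re-encoding.

    import hashlib

    # Hashes of items the platform has already judged to violate its standards.
    # (This entry is the SHA-256 of b"test", included purely for demonstration.)
    BLOCKED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def allow_upload(content: bytes) -> bool:
        """Return False if the upload matches a known-bad hash."""
        return hashlib.sha256(content).hexdigest() not in BLOCKED_HASHES

    print(allow_upload(b"test"))   # False: blocked before it is ever published
    print(allow_upload(b"hello"))  # True: no match, the post goes through

Because the check runs at upload time, it stops every re-upload of a known item regardless of who posts it or from where, which is precisely the capability no national regulator possesses.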

If what the Government wants is to respond effectively to the problems caused to Sri Lankan society and its inhabitants, it should work collaboratively with the platform companies. Making laws without consulting them, or even taking the trouble to understand the key features of social media, is counterproductive. Creating regulatory bodies that issue orders with short time frames backed up by large penalties is likely to shut off the opportunities for collaboration. If the costs of complying with the directions of a regulator from a small and insignificant market are excessive, the platform companies are likely to withdraw their services from Sri Lanka, causing significant harm to users. Such outcomes will not endear the responsible politicians to the active youth demographic that is likely to be decisive in the coming elections.

The Budget Debate makes it unlikely that there will be space on the Order Paper for the Online Safety Bill to be debated and adopted. This should be seen as an opportunity to gracefully withdraw this counterproductive piece of legislation and to initiate extensive consultations with the platform companies on how to collaborate in finding effective solutions to the problems our citizens experience through the misuse of social media.
