A year ago, Sri Lanka hurriedly passed legislation that fundamentally undermined human rights in ways unprecedented in the country’s history. The Online Safety Act (OSA), rammed through parliament without meaningful consultation, represents one of the most sweeping threats to freedom of expression, privacy and democratic discourse in any jurisdiction globally, ever. Its passage was marked by parliamentary ignorance: even opposition MPs were unaware of its substantive features and threats less than 24 hours before the Bill was taken up for debate. Couched in the language of protecting women and children, the law instead created a framework for expanded censorship, the evisceration of privacy, mass surveillance and authoritarian control that affects everyone from high net worth individuals and business leaders to trade unionists and academics. The Global Network Initiative’s (GNI) Content Regulation and Human Rights framework, which analyses over twenty governmental initiatives worldwide addressing user-generated content, offers a practical template to transform this profoundly problematic law into one that effectively addresses legitimate online harms while protecting fundamental freedoms. With the new NPP government signalling openness to reform, there is a critical window of opportunity to reshape the OSA using established human rights principles.
The OSA undermined fundamental human rights, including and especially the freedom of expression, in ways unprecedented in Sri Lanka’s political, legal, policymaking and regulatory history. Less than 24 hours before the Online Safety Bill was debated in parliament, nearly all leading members of the opposition knew very little about its substantive features – and these were features, not, as many suggest, flaws – aimed at expanding censorship, mass surveillance, the erosion of privacy and authoritarianism. The quality of debate in parliament was shockingly bad. MPs were entirely ignorant of online harm fundamentals and just wanted a law – defended as one that would protect women and children – to guard impunity against the challenges posed by investigative journalism, civic media’s critical gaze, political critique, biting satire, grounded criticism and public accountability.
Ignorance around what the OSA enables, including how it can be weaponised by a pliant judiciary, defines its appreciation to date. Over 2024, I met with hundreds of individuals from across Sri Lanka to speak about the law’s exceptional threat to lives and livelihoods. From high net worth individuals, business leaders and owners of some of the largest media platforms to trade unionists from the North, academics and lawyers, no one – without exception – grasped how bad the law was. Leading economists didn’t realise that all their calculations around Foreign Direct Investment (FDI), bilateral ventures, multilateral agreements and industry-led growth would be near impossible with the OSA in play because of, among other considerations, the manner in which it completely undermined the privacy protections that foreign registered entities and governments must uphold under their own domestic legislation. No one I spoke with grasped how the OSA is irreconcilably at odds with Sri Lanka’s Personal Data Protection Act (PDPA) in ways so egregious that it will require the Supreme Court’s intervention to determine how compliance with privacy protection frameworks can be ensured at the same time as the OSA demands they be stripped from every person on earth. This isn’t hyperbole. The OSA is written so that it applies to everyone, anywhere in the world, and extends to everything ever uploaded to the internet at any point in their lives, and henceforth in perpetuity.
Although there’s never been a law like this in Sri Lanka or in the history of online regulation anywhere in the world, the judiciary was also not up to the task of rejecting it. A comparison between the Supreme Court judgement on the Online Safety Bill, which saw a historically unprecedented number of cases taken against it, and the judgement on the Telecoms Bill in 2024 highlights how justices on the bench had no grasp of online harms or the design of fit-for-purpose legislation to address them. As lawyer and activist Ambika Satkunanathan noted, even if the Supreme Court’s amendments had been implemented in full, they were still inadequate to address the fundamental flaws of the Online Safety Bill.
The OSA affected everyone in Sri Lanka, whatever they did. For example, the extent to which it impacted advertisers and marketers was significant yet unknown to them, with one leading agency telling me after a briefing on the law that it was the most serious threat to the country’s industry in its history. Leading lawyers I spoke with didn’t appreciate how the law’s deliberately loose provisions and phrasing could be weaponised against them, especially in the defence of clients who had held power to account or exposed wrongdoing in the political class. Diplomats didn’t grasp how bilateral relations would be impacted, including trade, business and industry portfolios, leaving aside human rights, reconciliation and political issues. Over 60 members from over 40 of the country’s oldest and largest trade unions didn’t recognise, until a workshop on the OSA was held, how the law could and would be used against their strategic plans, privileged communications, organisation and activism. Archivists, including at the National Archives, didn’t grasp the implications of the OSA for the integrity of existing material and future collections, including around socio-political moments and movements of significance like 2022’s aragalaya and the resulting digital artefacts. Not a single opposition member of parliament understood the potential of the OSA to undermine, disrupt and even completely halt campaigns, communications and political activities, especially leading up to and during electoral moments, undermining electoral integrity.
Tellingly, those most aware of how the OSA would be weaponised were Tamil activists from the North and East. Just one or two weeks after the Act was passed, a meeting with them clearly highlighted how it had already resulted in chilling effects in the community, including around social media content and commentary on legitimate Tamil grievances, the Tamil national question, accountability for war crimes, memorialisation and the search for justice over enforced disappearances. This grounded fear is borne out by the fact that the police, as recently as late 2024, wanted to see the full implementation of the OSA, including the establishment of the Orwellian Online Safety Commission (OSC). This is the same police that wanted millions to install an app that can potentially and persistently track and trace everyone who has it on their phones – a reminder that even under the NPP government there endures an active, pervasive surveillance apparatus, one historically and asymmetrically directed at and weaponised against Tamils and minorities.
In fact, the OSA is so bad that it generated a historically unparalleled number of statements robustly critiquing it from the Asia Internet Coalition (AIC), which represents the biggest companies in the world dealing with software development, social media, AI and cloud computing. Because of this pushback, Ranil Wickremesinghe’s government and the law’s champion, Tiran Alles, were forced to amend it. However, the amendments in August 2024 made some aspects worse and served, in the main, only to indemnify the industry. No one else benefitted. For example, the amendments completely failed to clarify the process of determining what constitutes a “false statement” or how to judge intent. The law’s fundamental incompatibility with the PDPA also remained unresolved, exacerbating compliance challenges.
In the most recent and profoundly worrying development around the weaponisation of the OSA, January saw a leading mainstream media platform use the law against another, smaller online media platform. This precedent-setting instrumentalisation of the law sets the stage for its heightened abuse to target social media and online civic media initiatives that platform critical journalism implicating mainstream media’s connections with, and capitulation to, corrupt individuals and criminals. From the first application of the law in 2024 to this case, and in every instance in between, there has not been a single occasion on which women or children have been protected.
Even more depressingly, compounding the government’s violence in introducing this law is the enduring inability of so many in Sri Lanka’s civil society to grasp the basics of online harm regulation. As recently as last month, a letter signed by leading human rights and media freedom NGOs in Sri Lanka, along with international counterparts such as Amnesty, PEN International, Human Rights Watch, IFEX and the Committee to Protect Journalists, incredibly noted that “…the law should be replaced with new cybersecurity legislation aimed at addressing genuine online harms, such as harassment and fraud”. This is complete and utter nonsense, ironically mirroring what some MPs in the Wickremesinghe government stated as a justification for the OSA in early 2024, conflating, through sheer ignorance or strategic cunning, cybersecurity threats with online hate and harms. Others reported in the media as having played a role in bringing about the August 2024 amendments have no demonstrable experience or expertise in online harm regulation. This risks giving those in government greater licence to say they have consulted broadly, only to ramrod through what they want realised with the help of useful idiots.
The NPP government’s telegraphing of how social media was integral to its electoral successes, and its signalling in late 2024 that it intended to amend the law, preceded a mid-January announcement that “…it will not implement the Online Safety Act…in its original form but intends to introduce amendments following public consultations”. The same article quoted Deputy Minister of Digital Economy Eranga Weeraratne stating that “the Government plans to make the necessary changes to minimise the Act’s adverse effects while preserving its effectiveness in addressing online misconduct”, and that “The Act will be implemented after making the necessary modifications. We will not enforce it in its current form. There are many aspects that need to be revised. However, we will use this legislation to prevent the misuse of freedom”. Although a pause in the application of the OSA is good news, and wouldn’t even have been remotely considered under Ranil Wickremesinghe’s government, MP Weeraratne’s comments raise far more questions than they answer. Flagging this, a letter from the Collective for Social Media Declaration (CSMD) addressed to President Anura Kumara Dissanayake on 29 January stressed that “there is significant concern about the lack of public disclosure regarding the proposed changes. While the Ministry of Mass Media has indicated that stakeholder discussions are planned, there remains no clarity on who is drafting these amendments, what specific changes are being considered, and crucially, whether normative human rights standards are being incorporated.”
Building on an earlier presentation on how human rights could be placed at the core of online harm regulation, drawing on recommendations by the former UN Special Rapporteur David Kaye and Chatham House in the UK, the current political moment calls for concrete ideas on how the NPP government can address the significant flaws in the OSA. To this end, the Global Network Initiative’s (GNI) Content Regulation and Human Rights framework offers a comprehensive, human rights-centred approach to reforming the deeply problematic Online Safety Act. The GNI document, which analyses over twenty governmental initiatives worldwide addressing user-generated content, proposes specific recommendations organised around four fundamental pillars: legality, legitimacy, necessity and privacy. These recommendations provide clear guidance on how content regulation can be designed and implemented while respecting human rights principles and avoiding unintended consequences. The document’s emphasis on precise legal definitions, transparent governance, proportionate measures and strong privacy protections directly addresses many of the OSA’s current shortcomings, including its vague terminology, overbroad scope, insufficient procedural safeguards and problematic privacy implications. By applying GNI’s framework, which is grounded in international human rights law and informed by diverse stakeholder perspectives, Sri Lanka has an opportunity to transform the OSA from a potential instrument of censorship into a balanced regulatory framework that effectively addresses legitimate online harms while protecting fundamental freedoms.
Any reform of the OSA as it stands requires a comprehensive approach that balances legitimate concerns about online harms with fundamental rights protections. The recommendations below, drawing from GNI’s framework and grounded in Sri Lanka’s socio-political realities, provide a roadmap to realise a more effective, and rights-respecting instrument to address online harms.
Key recommendations to improve legality include:
- The OSA should be revised through an open, participatory process involving diverse stakeholders including civil society organisations, technology companies of various sizes, legal experts and human rights defenders. This process should include rigorous impact assessments examining effects on freedom of expression, privacy and business operations.
- The Online Safety Commission’s (OSC) powers should be more precisely defined with robust oversight mechanisms. While the Amendment requires Constitutional Council approval for some actions, additional checks and balances are needed. This could include regular parliamentary oversight, mandatory public reporting and independent auditing of the Commission’s decisions. Clear procedural rules should govern how the Commission exercises its discretionary powers.
- The definitions of prohibited content need significant refinement. Rather than broad categories like “false statements,” the law should precisely define specific types of harmful content with clear examples and exclusions. For instance, instead of generally prohibiting false statements that threaten national security, the law should specify what constitutes a genuine national security threat and require demonstration of actual or imminent harm.
Key recommendations to improve public legitimacy of the law include:
- All clauses in the OSA should be revised to ensure that content restrictions align with legitimate purposes under Article 19(3) of the ICCPR. This means removing vague provisions about “prohibited statements” and focusing on clearly defined categories of content where restrictions serve specific legitimate aims like protecting public order or the rights of others.
- The law should be fundamentally restructured around international human rights law principles and the GNI framework. This means explicitly incorporating the “tri-partite test” of legality, legitimacy and necessity/proportionality into the law’s framework.
- Explicitly protect controversial or offensive speech that does not cause demonstrable harm. This could include adding a provision clarifying that content cannot be restricted solely because it causes discomfort or offence to certain audiences. The Amendment’s exclusion of “metaphor made in good faith” is a start but wholly insufficient.
- Content regulation should be platform-neutral, applying the same standards to both online and offline expression. Where different procedures are necessary for digital content, these should be clearly justified based on technical necessity rather than creating a more restrictive environment for online speech.
Ensuring necessity and proportionality of the law could be strengthened by:
- Establishing different obligations for different types of services based on their role in the technology stack and capacity to address specific harms. For instance, infrastructure providers should have far more limited obligations than social media platforms, and smaller platforms should have modified requirements compared to large ones.
- The 24-hour content removal requirement should be replaced with a more nuanced system. Only certain categories of clearly illegal content (like child sexual abuse material) should require immediate action. Other content should have graduated timelines based on complexity and context, with proper appeals mechanisms before permanent removal.
- The law should explicitly encourage varied approaches to content moderation beyond removal, including labelling, downranking and fact-checking. Platforms should be given flexibility to experiment with different approaches while maintaining transparency about their methods.
- A proper appeals and remedy system should be established. This should include mandatory notice to users about content restrictions, clear appeal procedures with reasonable timeframes, and independent review mechanisms for significant cases. The law should require platforms to provide detailed, regular transparency reports aimed at non-experts and consumers about content moderation decisions.
Strengthening privacy protections could involve:
- Coordinating and collaborating with the Data Protection Office established under the PDPA and ensuring that the laws dealing with online harms do not impinge on privacy protections for all citizens.
- Investigation powers (especially under Section 33) should be revised to include stronger privacy safeguards. This could include requiring judicial warrants for accessing user data, setting clear standards for what data can be accessed and how it can be used and establishing oversight mechanisms to prevent abuse.
- The law should explicitly protect end-to-end encryption and anonymous speech. Instead of requiring platforms to enable content tracing, which could compromise security, alternative mechanisms should be developed to address illegal content while preserving privacy.
- Clear due process requirements should be established for government access to user data. This should include demonstrating necessity and proportionality, providing user notification (except in clearly defined emergency circumstances), and establishing data minimisation and deletion requirements.
The implementation and review of any law established to regulate online harms would benefit from:
- Mandatory periodic review provisions to assess its effectiveness and impact on rights. This should involve collecting empirical data about content removals, appeals and effects on different communities with requirements to modify provisions that are shown to be ineffective or causing unintended harm.
- A multi-stakeholder oversight body, independent of government, to monitor implementation and recommend improvements. This could be modelled after the RTI Commission. The body would include subject-domain experts and representatives from civil society, industry, academia, the legal community and government, with the power to conduct investigations and make binding recommendations.
- Clear metrics for success beyond simple, top-level content removal statistics. These could include measures of user satisfaction, appeal outcomes, effects on legitimate speech and effectiveness in addressing genuine harms.
The GNI framework provides Deputy Minister Weeraratne and the NPP government with a proven, rights-respecting template to achieve their stated aims of reforming the OSA while maintaining its effectiveness against online misconduct. The framework’s emphasis on open, participatory lawmaking aligns perfectly with the Deputy Minister’s commitment to stakeholder and public consultation, while its recommendations around precise definitions, clear limiting criteria and judicial determination can help “prevent the misuse of freedom” without creating a chilling effect on legitimate expression. GNI’s approach to establishing differentiated obligations based on service type and capacity, coupled with its emphasis on transparency, due process and remedy, offers practical guidance for implementing “necessary modifications” that would minimise adverse effects while preserving regulatory effectiveness.
Finally, and critically, GNI’s focus on empirical support and careful deliberation to ensure laws are proportionate provides a methodology for the government to demonstrate that any restrictions on expression are truly necessary and appropriate, helping achieve the delicate balance between addressing genuine online harms and protecting fundamental rights that the Deputy Minister seeks to establish.
One can only hope someone from the NPP government is listening and open to learning.