
Amending the Online Safety Act and the Centrality of Human Rights Principles

Photo courtesy of Amnesty International

Cabinet Spokesman Minister Vijitha Herath’s position on the Online Safety Act (OSA) at the press briefing held on November 6 essentially re-affirms what was noted in Anura Kumara Dissanayake’s manifesto in relation to the Act. I was sorely disappointed to read the NPP’s position on the OSA when the manifesto was first released and wrote “That AKD/NPP find it beneficial to have the OSA in our statute books speaks volumes, & contradicts other promises in their 2024 presidential election manifesto to secure/strengthen the Freedom of Expression. Thing is, OSA is wholly antithetical to first principles, & rights, incl. privacy, association, & peaceful assembly. What I had hoped would be key point in manifesto to repeal the OSA, will now not eventuate. It’s a pity the party, and its leader are so willingly ignorant of what OSA represents, or, worse, secretly like to have this law to use as they see fit, with cosmetic amendments.”

However, the news report on Herath’s statement adds more nuance to what was in the manifesto. What he noted suggests the government is interested in a process of inquiry to determine new amendments, and is also open to an entirely new Bill. My position, repeatedly stressed, is that the OSA as it stands is utterly unworkable, untenable and undemocratic, and demands repeal.

On the other hand, I appreciate that for the new government this position may be seen as obstinately opposed or hostile to what they are interested in realising, on the face of it, in good faith. I also appreciate – better than most, given doctoral study and research on online harms since 2014 – that some who oppose the president and government are not above engineering strategic, significant offline unrest by weaponising online rumours, in the same manner that social media was instrumentalised by Sinhala-Buddhist chauvinists and the Bodu Bala Sena (BBS) in 2012, 2014, early 2018 and notably after the Easter Sunday terrorism in 2019. Given this history and realpolitik considerations, the argument to completely reject and repeal the OSA may be perceived as taking away a law that some in government may feel is a necessary instrument, even with limited application, to stop the spread of incendiary disinformation.

How best can one reconcile the new government’s avowed interest in meaningful amendments and a principled critique of the law as it now stands?

What the drafting of the OSA, its tragicomically hurried passage in parliament and even the amendments in August never remotely embraced was a human rights-based approach to defining and dealing with online harms. Given the inheritance and possible encumbrance of the OSA, I believe it’s an opportune time to consider how a human rights framework may help the government address the worst of the OSA.

What’s out there?

To elucidate what a human rights-led approach should incorporate and inspire, I have leaned heavily on the seminal writing and output of the former UN Special Rapporteur on freedom of opinion and expression, David Kaye. Additionally, I have relied on publications from the World Economic Forum (WEF), the British parliament and Chatham House.

A New Constitution for Content Moderation and The Risks of Internet Regulation by Kaye, along with The Right Way to Regulate Digital Harms by Kaye and Jason Pielemeier, are foundational frameworks for approaching online regulation by giving primacy to human rights. Kaye was Special Rapporteur from 2014 to 2020. In May 2016, his report to the UN Human Rights Council included a prescient section on regulation in the digital age, which I have also used in this article.

In March 2019, the Select Committee on Communications of the UK’s House of Lords published Regulating in a Digital World, which proposed ten foundational principles that should undergird the regulation of online content: parity, accountability, transparency, openness, ethical design, privacy, recognition of childhood, respect for human rights and equality, education and awareness raising, and democratic accountability, proportionality and an evidence-based approach. Many of these points overlap with Kaye’s repeated stress on human rights fundamentals.

In August 2023, WEF published a Toolkit for Digital Safety Design Interventions and Innovations: Typology of Online Harms. The report served “…as a foundation to build a common terminology and shared understanding of the diverse range of risks that arise online, including in the production, distribution and consumption of content.”

Additionally, and although I’ve not used them in this article, the Asia Internet Coalition’s statements and the human rights-based critique of the bill by the Global Network Initiative (GNI) are also vital reading.

A detailed study of mine on the current OSA, including how the amendments in August 2024 ironically made aspects of it worse (while addressing some of the problems in the original law), is in the public domain. This article’s stress on human rights as the basis for any future amendments or a new bill doesn’t take away from what’s noted in my earlier critique of the law as it stands. To reiterate, in an ideal world, the OSA should be repealed. But given things as they are, what should the government consider in amending it?

Incorporating human rights principles proposed by Kaye and Chatham House

Fundamental rights framework: Any amendments should explicitly incorporate international human rights law (IHRL) as the foundational framework rather than focusing solely on content restrictions. The amendments passed in August 2024 began this process by removing some problematic sections in the original law (like religious outrage) but should go further by positively affirming freedom of expression, privacy and association rights as guiding principles for implementation.

Necessity and proportionality tests: Amendments and a new bill should adopt the “tri-partite test” repeatedly emphasised by David Kaye: any restrictions on online expression must meet the criteria of legality, legitimacy and necessity/proportionality. The amendments should incorporate specific requirements for the Online Safety Commission (OSC) to demonstrate that any content removal orders or other restrictions are clearly prescribed by law, pursue legitimate aims and are necessary and proportionate to achieve those aims.

Procedural safeguards: Following Chatham House’s recommendations, the OSA should strengthen procedural protections.

Precision in definitions: A new bill or any proposed amendment should avoid vague terminology that enables censorship, including through pressured, partisan, parochial and expedient judicial interpretation. While the amendments in August improved on some vague, problematic definitions (like “false statements”), other terms remain ill-defined and broad. Amendments should precisely and restrictively define what constitutes “prohibited content” following human rights standards rather than expedient, arbitrary, subjective or partisan criteria.

Multi-stakeholder governance: As both Kaye and Chatham House emphasise, the design, review and application of online safety regulation must involve diverse, multiple stakeholders. Amendments should be based and built on formal consultation mechanisms with, inter alia, civil society, subject/domain experts, researchers, platform/intermediary representatives, user advocacy groups and Special Procedures of the UN Human Rights Council.

Platform accountability framework: Rather than focusing solely on arbitrary content removal, amendments should establish a comprehensive framework for platform accountability.

Protection of privileged and private communications: Following Chatham House’s concerns about privacy, amendments must clearly protect encrypted communications and private messaging. The amendments have begun to address this by limiting some surveillance powers but could strengthen privacy protections further given the unprecedented and significant issues with the law as it stands.

Positive obligations: Amendments must include positive obligations for promoting free expression online, not just restrictions. These could include digital literacy programmes, support for diverse and local language content, protection of journalist and human rights defender communications and the promotion of platform diversity and competition.

Rights-respecting enforcement: The amendments should ensure that enforcement mechanisms respect human rights.

Future proofing: Technology changes rapidly and in ways no one can accurately foretell (generative AI being a good example). Following the emphasis by both Kaye and Chatham House on rapid, iterative technological change, any amendments should be flexible enough to remain relevant as technologies emerge and evolve.

Applying the ten fundamental principles in the UK House of Lords report

  1. Parity principle: The current law could be improved by more consistently applying the “parity principle” – ensuring equivalent outcomes online and offline. While the OSA addresses a wide and vaguely defined set of online harms, aligning them with their offline equivalents would make it stronger. The amendments passed in August removed sections on religious outrage and mutiny, which suggests an attempt to achieve better parity with offline rights. However, the law could more explicitly state that enforcement should seek equivalent outcomes rather than identical mechanisms, recognising that online and offline environments require different implementation approaches.
  2. Accountability: The OSA establishes accountability through the yet-to-be-established and highly problematic Online Safety Commission (OSC), but accountability should be redefined by adding more accessible, fit-for-purpose redress and remedial mechanisms. While the Act provides for court proceedings, it completely lacks provisions for quick, low-cost dispute resolution systems, including mechanisms that don’t burden or require the courts. The amendments improve accountability by requiring written reasons when the OSC chairperson refuses meetings but could go further by mandating regular public reporting on enforcement actions and their outcomes. Transparency begets better outcomes, which in turn establish public trust in the law.
  3. Transparency: Aligned with the previous point, the OSA could benefit from stronger transparency requirements, particularly regarding decision making. While it requires some disclosure of information, it lacks specific provisions about algorithmic transparency or clear explanations of content moderation decisions. The amendments’ emphasis on a Code of Practice (almost entirely lifted from New Zealand) provides an opportunity to embed transparency requirements but these could be more explicitly mandated in the primary legislation.
  4. Openness: The OSA could better balance openness with protection. While it aims to protect against harms, it could more explicitly safeguard the freedom of expression. The amendments improve this by removing some restrictive provisions but the law could include positive obligations to preserve first principles and internet openness alongside meaningful safety measures.
  5. Ethical design: This principle is wholly under-appreciated and under-developed in the OSA. While the Act addresses harmful content, it pays little attention to design standards, especially around language and product localisation, that could prevent harms from occurring and going viral. Given Sri Lanka’s tragic experience of social media weaponisation, the legislation should be amended to require “safety by design” principles in product development and mandate ethical impact assessments for new features or services.
  6. Privacy: The OSA’s privacy provisions require significant strengthening to be compatible with the country’s Personal Data Protection Act (PDPA). While it addresses private information disclosure, it could better integrate with the PDPA’s frameworks and include stronger provisions about data minimisation and purpose limitation. The amendments’ addition of protections against the sharing of non-consensual intimate imagery (NCII) is positive but should be part of a more comprehensive privacy framework.
  7. Recognition of childhood: While the OSA includes strong provisions on child protection, it could better recognise children’s positive rights online. The Act focuses primarily on protecting children from harm (which is what former government MPs used to sell the bill to the public) but has no substantive content on how to realise this. A better bill or future amendments should promote children’s access to beneficial online resources and services designed with their needs in mind.
  8. Respect for human rights and equality: Central to the thesis of this article, and reflected in the UK parliament’s report, the OSA should more explicitly incorporate human rights standards. While it affords some protections almost as afterthoughts, it should better reference international human rights frameworks and include specific provisions for protecting vulnerable groups and ensuring accessibility. The amendments improve this somewhat by refining enforcement mechanisms but more explicit human rights safeguards must be added.
  9. Education and awareness raising: The OSA as it stands lacks strong provisions on digital literacy and awareness raising. While it allows for some public education, it should mandate more comprehensive programmes and require service providers as well as intermediaries to contribute to the country’s embryonic digital literacy efforts. The Code of Practice given greater stress in the amendments introduced in August recommends education requirements that should be fleshed out and more explicitly noted in future revisions.
  10. Democratic accountability, proportionality and evidence-based approach: The amendments to the OSA improve this aspect by requiring public consultation on Codes of Practice and parliamentary oversight. But given the significant issues with the law as it stands, these are inadequate measures overridden by other clauses in the OSA. Amendments or a new bill should specifically stress the need for grounded, evidence-based decision making and regular review of regulatory measures. They could also mandate impact assessments for major regulatory decisions.

Application of WEF’s typology of online harms

  1. Categorisation of harms: The OSA could benefit from adopting the WEF’s clear categorisation of online harms into distinct types: threats to personal and community safety, harm to health and well-being, hate and discrimination, violation of dignity, invasion of privacy, and deception and manipulation. While the Act as it stands addresses some of these areas, reorganising its structure around these categories would make it more comprehensive, cohesive and coherent.
  2. Content, contact and conduct framework: The law should be strengthened by explicitly adopting the WEF’s three-dimensional framework of content, contact and conduct risks. Currently, the OSA focuses heavily on content-based harms but could better address contact risks (like grooming) and conduct risks (like technology facilitated abuse). This would provide a more holistic approach to online safety regulation.
  3. Harm production, distribution and consumption: Regulations should incorporate the WEF’s understanding of harm occurring at three stages: production, distribution and consumption of content. While the OSA as it stands addresses distribution and some aspects of consumption, it could better articulate how different types of harm manifest at each stage. This would help create more targeted interventions and enforcement mechanisms.
  4. Technology facilitated abuse: The law would benefit from including more specific provisions around Technology Facilitated Abuse (TFA) and Technology Facilitated Gender Based Violence (TFGBV) as defined in the WEF typology. A recent report, It’s Everyone’s Problem: Mainstreaming Responses to Technology-Facilitated Gender-Based Violence, is essential reading in this regard. While the OSA addresses harassment and privacy violations, it could more explicitly recognise how technology can enable, assist or amplify various forms of abuse, especially gender-based violence. Like harms against children, the “protection of women” was a point repeatedly noted by former government MPs but has no substantive expression whatsoever in the law itself as it stands.
  5. Developmental considerations: As with the UK House of Lords report, the WEF’s emphasis on developmentally inappropriate content and the recognition of childhood could help strengthen the OSA’s provisions regarding child protection. While the Act has some child protection measures, it could better address the spectrum of age-inappropriate content and its potential impacts on development.
  6. Algorithmic harms: The OSA could incorporate provisions addressing algorithmic discrimination and automated decision making, which are highlighted in the WEF typology but not explicitly covered in the current Act (despite some vague and confusing references to coordinated inauthentic harms). This is particularly important as artificial intelligence and automated systems become more prevalent, given national frameworks already in place.
  7. Health and wellbeing focus: Future amendments and a new bill will benefit from adopting the WEF’s detailed approach to health and wellbeing harms, particularly regarding content promoting suicide, self-harm and disordered eating. While the OSA addresses some aspects of harmful content, it should be far more specific about these health-related harms.
  8. Deception and manipulation: The OSA’s extremely problematic treatment of “false” statements can be addressed by incorporating the WEF’s nuanced approach to deception and manipulation, stressing clear distinctions between disinformation, misinformation and deceptive synthetic media. This would help create more targeted responses to different types of false or misleading content.
  9. Rights-based framework: The WEF’s explicit grounding in international human rights frameworks must guide the drafting of amendments or a new bill, balancing safety with fundamental rights. This will help ensure that safety measures don’t unduly restrict legitimate expression and participation online.
  10. Future technology considerations: While the WEF typology acknowledges that it doesn’t specifically address emerging technologies like the metaverse or Web3, its framework for understanding harms could help the OSA remain relevant as new technologies emerge, and existing technologies evolve at a rapid pace. Amendments or a new bill will benefit from incorporating similar flexibility to address future technological developments.

Concluding thoughts

If what Vijitha Herath noted is to be believed, there’s a limited but meaningful window of opportunity to support the new government in the drafting of amendments to the OSA and ideally a new regulatory framework for online harms based on human rights principles.

In fact, this is a discussion that’s never happened before in Sri Lanka and is fundamental to digital rights more broadly. No country has got this perfectly right but contemporary legislation and policymaking on online harms provide ample examples and templates for what should and should not be done in Sri Lanka. I hope the likes of President Anura Kumara Dissanayake, Prime Minister Harini Amarasuriya, Vijitha Herath and others in government grasp this and help more fully realise our democratic potential.
