Powers of Online Safety Act tested in Federal Court case

eSafety Commissioner v X Corp [2024] FCA 499

Summary 

The high-profile dispute between the Office of the eSafety Commissioner (‘eSafety’) and X Corp (formerly known as Twitter) has tested key powers of Australia’s Online Safety Act and stimulated spirited debate on the interplay between online safety laws and the right to freedom of expression. eSafety sought enforcement of a removal notice pertaining to a bundle of content showing the high-profile stabbing in Sydney of Bishop Mar Mari Emmanuel. The Federal Court refused to extend an ex parte interim injunction against X Corp, and held that geo-blocking is a reasonable step for removing content pursuant to a removal notice under s 109 of the Online Safety Act. The judgment suggests Parliament should clarify the meaning of ‘all reasonable steps’ in the context of the Online Safety Act.

Facts

The Office of the eSafety Commissioner (eSafety) issued X Corp, an American corporation, with a removal notice under s 109 of the Online Safety Act 2021 (Cth) (Act). The removal notice concerned 65 links to user-generated posts on the X platform. Section 109 of the Act sets out the process for removal notices and contains three key elements. First, the Commissioner must be satisfied the material is class 1 material. Second, the material must be accessible by end-users in Australia. Third, once issued with a removal notice, the service provider must take ‘all reasonable steps’ to ensure removal of the material.

The 65 links all contained video footage of the stabbing of Bishop Mar Mari Emmanuel on 15 April 2024. eSafety determined this content was ‘class 1 material’. Class 1 material is defined by s 106 of the Act as material that either carries an ‘RC’ rating from the Classification Board or, if not classified, is ‘likely to be classified as RC’ (s 106(1)(b)(iii)). eSafety had designated the material as ‘likely to be classified as RC’, noting its depiction of ‘crime, cruelty and real violence in such a way that it offends against the standards of morality, decency and propriety generally accepted by reasonable adults’ ([10]).

eSafety issued X Corp with the removal notice on 16 April 2024, requiring X Corp to take ‘all reasonable steps’ to ensure removal of the material. X Corp ‘geo-blocked’ the material, meaning that end-users could not access it from an Australian IP address. X Corp did not remove the material. eSafety then applied for an injunction under s 121(2) of the Regulatory Powers (Standard Provisions) Act 2014 (Cth). The injunction was granted ex parte on 22 April and extended on 24 April. At an interlocutory hearing on 10 May, eSafety asked the Court to declare that X Corp had failed to comply with the removal notice and sought one of four remedies:

a) Removing the material;

b) Restricting discoverability of the material, so that only the author, and no other end-user, could view it;

c) Hiding the material, by applying a ‘notice’ (a digital label) that obscured the material and could not be dismissed;

d) Restricting discoverability of the material, so that it would not appear in search results or any feed on X.

Decision

His Honour rejected eSafety’s claim that geo-blocking was not a reasonable step in the context of the Online Safety Act. He also rejected X Corp’s submissions on validity and reasonableness, though his obiter comments may have bearing on future online safety disputes. These points are discussed below.

Powers of the eSafety Commissioner to designate class 1 material 

X Corp submitted that the removal notice itself was invalid. His Honour concluded that eSafety did not need to provide an explicit and detailed reasoning process for the class 1 designation ([37]). However, His Honour noted that, should there be an onus of proof, the Commissioner may be obliged to set out this reasoning in more detail, even if that involves ‘the courageous step of calling evidence from the decision-maker’ ([32]). To avoid delay in future legal challenges to removal notices, eSafety and other relevant online regulators may need to prepare a full account of their reasoning for designating class 1 material.

Reasonableness of class 1 removal notices 

X Corp also submitted that eSafety’s decision to issue a class 1 removal notice was unreasonable and therefore vitiated by jurisdictional error ([35]). The Court noted that this assessment would need to proceed through the merits review process at the Administrative Appeals Tribunal, ‘where those merits can be carefully weighed with the benefit of detailed submissions and potentially expert evidence’ ([35]). Merits review is vital for ensuring the accountability of government decision-makers in Australia, but X Corp’s approach reveals a risk that it could be weaponised by recalcitrant litigants as a delay tactic. Notably, while these processes run their course, the harmful content stays online.

Interpretation of ‘all reasonable steps’ 

At the core of the dispute was what activity constituted taking ‘all reasonable steps to ensure the removal of the material from the service’. X Corp argued that geo-blocking, such that end-users physically located in Australia could not view the material, was sufficient. eSafety argued it was not, and sought further remedies (summarised above, ranging from total removal to hiding the material or restricting its discoverability).

His Honour reasoned that a notice under s 109 compelling a platform to make material inaccessible to all users everywhere in the world was both unreasonable ([48]) and fundamentally at odds with the ‘comity of nations’ ([49]). His Honour added that, even if he had reached a different view on the comity of nations point, the balance of convenience would still not favour extending the injunction ([56]).

On reasonableness, His Honour drew a key distinction between what X Corp might reasonably do in applying its own policies and what s 109 can compel. It would be altogether reasonable for X Corp to decide, of its own accord, to remove the offending content ([46]). The test under s 109, however, is different, given that the section ‘imposes its requirements regardless of the wishes of providers and of individual users’ ([46]). His Honour concluded that the Commissioner, in exercising powers under s 109, could not decide what users of social media services throughout the world are allowed to see on those services ([50]).

Commentary

The preliminary judgment and the documents filed throughout the litigation to date provide a rich source of information on current vulnerabilities in the Online Safety Act. These should be studied closely, especially given the statutory review of the legislation under way at the time of writing.

The limitations of the ‘reasonable steps’ test 

The judgment applies a logic to online content that may challenge future enforcement of the Online Safety Act. His Honour accepted eSafety’s decision that the content was class 1, meaning it falls into the most severe category contemplated by the Act. Other content in this category includes, for example, material that shows or encourages child sexual abuse (see Office of the eSafety Commissioner, Online Content Scheme: Regulatory Guidance (2021)). There are clear and established policy reasons why class 1 content should be removed from online services rather than merely geo-blocked, and the willingness of other digital platforms to remove such material, either of their own accord or when asked informally by eSafety, indicates how uncontroversial class 1 removal practices are across the industry.

His Honour held that it would be reasonable for X Corp to remove the content itself but unreasonable for eSafety to compel removal through s 109 ([46]). Additionally, His Honour interpreted ‘reasonable steps’ in s 109 as demonstrably more modest than the action eSafety attempted. The potential effect of this precedent is that eSafety’s future attempts to enforce removal notices under s 109 may be similarly rejected. One reform pathway is to clarify the meaning of ‘reasonable steps’ in the Act to avoid future ambiguity and litigation risk. A further, recommended option would be to amend the Act to include a clear and overarching duty for platforms to manage a range of defined risks to Australians across their underlying systems.

The long tail of s 230  

While not a particularly explicit thread in the dispute, the spectre of s 230 of the infamous Communications Decency Act lurked in several key documents, including powerful expert affidavits. Consideration of the ‘comity of nations’ principle required an analysis of the enforceability of the injunction in the U.S. Experts for both X Corp and the eSafety Commissioner agreed that an injunction would offend U.S. public policy by way of the First Amendment. However, the experts diverged notably on s 230. Kumar, for X Corp, argued the injunction would offend s 230, whereas Matz, for eSafety, took a different view, citing the Solicitor-General in Gonzalez to argue that s 230 could not ‘block’ the eSafety Commissioner from enforcing against X Corp (Affidavit of Mr Joshua Matz, 5 May 2024, [35]).

Irrespective of the conclusions on s 230, the public policy impact is evident. X Corp’s status as a U.S. corporation encourages legal analysis on enforceability to defer to the legal relationship between X Corp and the U.S. government, which has traditionally immunised digital platforms from liability over their systems, including content moderation systems. (The Matz affidavit does challenge the orthodox view of the First Amendment and s 230, arguing, for instance, that ‘the First amendment protects rather than prohibits the authority of X Corp to engage in content moderation decisions’ ([42]).) One solution for tackling this circuitous route could be to compel U.S.-headquartered digital platforms to establish and operate locally as Australian companies, as has become routine practice in other multinational sectors.

Secondary points of note 

His Honour noted two areas that may warrant further consideration by policymakers, particularly in the context of the Online Safety Act’s review. These were: 

a) How s 111’s requirement to comply with a removal notice ‘to the extent that the person is capable of doing so’ relates to the ‘all reasonable steps’ requirement in s 109. While not the subject of argument in this matter, His Honour noted that how these qualifications interact ‘may need to be explored at some stage’ ([14]).

b) Clarification of Parliament’s intention to empower the Commissioner to issue removal notices with the effect contended for, i.e., an explicit statement of what counts as ‘reasonable steps’ in the context of the Online Safety Act. As noted above, this judgment suggests that the cause of online safety in Australia would benefit from clearer expressions of parliamentary intent and more prescriptive industry obligations.

Prepared by Alice Dawkins of Reset.Tech Australia