The alarming unintended consequences of the Online Safety Bill

The Online Safety Bill promises to improve and promote the online safety of Australians. But not for all Australians. The broad-brush approach of the bill goes far beyond what is necessary to achieve its objective, and the Government has ignored community concerns over its many shortcomings.

Alongside pathways of redress for those experiencing online bullying, abuse and non-consensual sharing of intimate images, the bill contains a handful of provisions that are likely to lead to unintended negative consequences – repressing freedom of expression, undermining encryption, and exacerbating harm for vulnerable groups it is supposed to protect.

Unfortunately, emotive rhetoric about children’s safety online can too easily be used to dismiss valid criticism of the powers contained in the bill, as well as the lack of safeguards which would ensure accountability. Children’s safety is important but it should not be politicised to pass overbroad legislation without checks and balances.

The bill is made up of several schemes, some of which are appropriate ways to handle online harm. The cyber-bullying, cyber-abuse, and image-based abuse schemes provide powers to assist the removal of online material that is harmful to children or seriously harmful to adults, as well as intimate images shared without consent. Notably, these powers are reactive: they respond to complaints made by those harmed.

However, the bill also introduces “basic online safety expectations”, allowing the eSafety Commissioner to introduce industry standards and technical requirements. The bill also includes an online content scheme and an abhorrent violent material blocking scheme, providing take-down and website blocking powers for content deemed offensive or violent.

These latter powers are generally proactive, meaning the eSafety Commissioner will have a mandate to search the internet at their discretion for content that is covered by the bill. This opens the opportunity for the eSafety Commissioner to essentially surveil the internet and arbitrate what content Australians are able to access.

While the powers are broad, so too is the scope of the content and services to which they apply. The Commissioner can order the removal of content categorised as Class 1 or Class 2 material, which, at its lowest, corresponds to anything deemed R18+ or above in the National Classification Code. This captures any sexual content, violent or not, or content that is “unsuitable for a minor to see”.

These provisions apply to social media platforms as well as “relevant electronic” and “designated internet” services, including email, SMS, instant messaging, online gaming, or any service that allows users to access material using the internet.

Taken together, the bill covers sexual content that many Australians consensually engage with, including in their personal correspondence. While this is clearly an overstep, much of the debate has focused on the harm caused to children when they access material considered to be “offensive,” but little consideration appears to have been given to the potential harm of not allowing access. For instance, many LGBTQ+ youth rely on the internet and pornography to counter the lack of inclusive sex education.

The bill would also likely cover content that could be used for political accountability, such as footage of human rights abuses or misuse of violence by police. The current public interest exemption does not go far enough to ensure that Australians will not be prevented from seeing material holding those in power accountable.

Civil society groups have expressed concerns that the powers of the eSafety Commissioner could be used to repress freedom of expression and censor sex workers. The current Commissioner, Julie Inman-Grant, dismissed these concerns as “ill-founded”, assuring that the sex industry is “not her concern.” Yet the explanatory memorandum contradicts this, describing the intention to develop a “comprehensive roadmap” for the regulation of pornography. Looking internationally, we have already seen swathes of sex workers and support, safety and advocacy groups de-platformed as a result of comparable US legislation.

Over-compliance and blanket censorship are among the likely consequences of this poorly drafted bill. The measures may incentivise platforms such as Facebook to pre-emptively remove content rather than risk penalty.

Another issue is how the powers may extend to encrypted services. The bill as currently drafted leaves the door open to giving the Commissioner access to communications, including some that are currently end-to-end encrypted. Inman-Grant has even argued against end-to-end encryption in general, claiming that it facilitates online child sexual abuse.

Such claims advocate a regressive surveillance agenda at the expense of our digital security. Robust encryption is essential for the digital security of individuals and governments alike. If we’re genuinely concerned about child safety online, consider how encryption protects them from predatory users tapping into their webcam or their communications with friends and family. Without amendment, the bill could be used to compel providers to weaken encryption – a sure-fire way to undermine online safety for all of us. Children included.

The bill’s consultation process has been rushed, suggesting the government is not meaningfully engaging with public concerns. A total of 376 submissions were made to the public consultation, yet only 10 days later the bill was tabled in parliament with no meaningful amendments. Then, despite only three working days’ notice, another 135 submissions were made to the Senate inquiry – again resulting in no meaningful recommendations or amendments.

The proposed law gives an alarming amount of discretionary power to an administrative official to determine what adult Australians can access online. The lack of meaningful consultation suggests that the government is not interested in having nuanced debate about complex issues of personal autonomy, individual responsibility and online harm reduction.

If the Morrison government continues to ignore public concerns, including the call for transparency and accountability, then perhaps these consequences are not so unintended after all.

Sam is a campaign officer at Digital Rights Watch working at the intersection of feminism, human rights and technology. As former Program Director for Code Like a Girl, Sam is dedicated to the ethics of technology in all its forms – from gender equity in the tech industry to upholding privacy in an increasingly surveillance-obsessed world.

Comments

3 responses to “The alarming unintended consequences of the Online Safety Bill”

  1. Patrick M P Donnelly

    The usual fear mongering can get through many changes in laws, all chipping away at what rights are left.

    Sheep following a goat.

  2. Andrew

    If it’s true that “Inman-Grant has even argued against end-to-end encryption in general” (which I have no reason to doubt), then it shows an appalling lack of understanding of encryption. End-to-end encryption is enormously widely used. It is precisely what keeps everyone’s banking passwords private when using internet banking, and access credentials private when logging onto any website. Such ignorance would be just sad, if the consequences of acting in that ignorance weren’t so dire.

  3. David Havyatt

    I made a submission to the Senate inquiry noting the other issue, which is that the ALRC recommended a thorough review of the classification standards. Classification standards exist both to restrict access to content and, at lower levels, to provide guidance on what the content contains ahead of potential exposure.

    But it isn’t the content owner that should be responsible for restricting access. What they should be responsible for is classifying (under a self-regulatory scheme like TV) or arranging for the classification (as is the case with movies) – or something with a bit of both. The issue is the need to mark the content in a standard way so that the content control systems can do their work. Ideally this could be embedded in the DNS as a global solution. But as a local solution, just mandate that restricted content pages must first land on an ‘age verification’ page that would work manually (click here) but also contain standard code to interface with control software operated by the ISP (if the lessee of the service is under 18, or is over 18 and requests it) or the security software on the access device.

    Finally, pornography has been blamed for the increase in sexual violence. This is something that can be better addressed through better discretion in classification rather than blanket bans. After all, there is plenty of pornography that depicts normal sexual encounters involving enthusiastic consent. That is, sex doesn’t have to be learned only through fumbles in the back seat of a car, or skills only acquired through having multiple sexual partners.