Big tech regulation needed to stop disinformation spreading like wildfire in Australia’s democracy
The Human Rights Law Centre is calling on the Albanese Government to appoint an independent regulator to hold social media platforms accountable, after a new report by Reset.Tech Australia found that Australia's voluntary regulatory framework for big tech companies is failing.
Australia’s approach of letting big tech companies regulate themselves has allowed misinformation and disinformation to spread like wildfire through our democracy, causing increased discrimination, societal polarisation, and distortion of public debate.
Big tech’s disregard for community standards has been epitomised by X's dispute with the eSafety Commissioner over violent content stemming from recent incidents in Sydney, in which the platform cited adherence to its own rules while sidestepping Australian law.
The Human Rights Law Centre warned that, too often, powerful people weaponise a distorted interpretation of the right to free speech to avoid accountability for the harm their speech causes.
Any independent regulator should be empowered to establish and enforce standards, like those in place in the EU, that protect freedom of expression while also ensuring that platforms are held accountable for any serious harm their products cause.
David Mejia-Canales, Senior Lawyer at the Human Rights Law Centre, said:
“Technology should bring our communities together, not divide us for profit. But the lax, self-regulation of big tech platforms is allowing them to wreak havoc on our democracy. Self-regulation and co-regulation have failed in other industries. Big tech is no different.”
“We urge the Albanese Government to appoint an independent, well-resourced regulator of big tech platforms so they operate transparently and are held accountable for any damaging impacts on people, society, or our democratic processes that their products may cause.”
“Any regulation of the big tech platforms must itself be human rights compliant and protect the freedoms of expression and speech. Instead of concentrating on content moderation or removals, regulation should prioritise transparency and accountability from big tech platforms. Currently, these platforms are profiting from spreading misinformation and disinformation, which only harms our society and our democracy.”
Media contact:
Thomas Feng
Acting Engagement Director
Human Rights Law Centre
0431 285 275
thomas.feng@hrlc.org.au