National action against 'deepfake'…

The Alannah & Madeline Foundation welcomes the news that the Senate has passed the Australian Government's bill to combat non-consensual ‘deepfake pornography’. This is a powerful recognition of the severity of the problem, the harm it causes, and the need to provide victimised individuals with legal avenues for redress.

The recent evolution of generative artificial intelligence – including the proliferation of ‘undressing apps’ – has enabled a new form of abuse to spread with shocking ease and tragic consequences.

Even before 2022, the Australian community was experiencing deep concern about ‘image-based abuse’ – a phrase that refers to someone sharing or threatening to share an intimate picture or video without the consent of the person depicted. In 2022-23, the eSafety Commissioner received over 9,000 reports about image-based abuse, with the problem having grown rapidly since 2021. A spike in ‘sextortion’ scams targeting young people exacerbated the problem last year, with blackmailers using intimate pictures and videos to threaten their victims.

One can only imagine how much more common and extreme such practices could become now that AI technologies have enabled the creation of synthetic sexual imagery of any individual on demand. For example, the social network analysis company Graphika identified over 24 million unique global visitors to 34 ‘undressing’ websites in September 2023 alone, plus a huge rise in spam promotion of the apps on social media platforms. The technology had moved ‘from niche pornography discussion forums to a scaled and monetised online business’.

This raises huge concerns for the wellbeing of children and young people. The eSafety Commissioner has already begun to receive reports of ‘deepfake’ abuse to her complaints schemes.

Distressingly, some serious cases have occurred in Australian schools, where students used generative AI to create image-based abuse involving their peers or teachers. The impacts on the individuals affected, and on the wider school communities, have been severe.

The Alannah & Madeline Foundation welcomes the government’s decisive action. We also applaud the advocacy of the eSafety Commissioner, in particular, on this serious matter.

At the same time, we would caution that criminal legislation alone will not fix the problem. It is important to retain age-appropriate approaches to sentencing and to provide expert support for victims to recover and heal. And it is vital to keep investing in preventative measures. These should include regulation of what digital platforms may do with children’s personal information, including their imagery; high-quality digital citizenship education; and appropriate, privacy-preserving measures to keep pornography away from children.

The Australian Government has committed to create a Children’s Online Privacy Code and has funded an age assurance pilot to prevent children’s access to pornography. These two positive steps have great potential to prevent further harm.

If you have been affected by any of the issues discussed here, please consider seeking help. Image-based abuse and cyberbullying, including ‘deepfakes’, can be reported to the eSafety Commissioner, who has powers to get the material taken down. Support is also available from Dolly’s Dream Support Line, Kids Helpline, and Lifeline. In an emergency, always call triple zero.

Learn more about our Digital Rights Advocacy work here.