Raising expectations for online safety

As a community, we talk often about the responsibilities of children, parents and teachers to know the risks of the digital world and make safer choices online. But what about the responsibility of the digital industry to make their own services safer for children?   

Recently, the Alannah & Madeline Foundation took part in a consultation by the Australian Government on how to improve the Basic Online Safety Expectations (BOSE) Determination. The BOSE Determination is an important document. It sets out the Government’s expectations about the minimum steps the digital industry will take to help keep Australians safe online.     

Now the Government is testing some new proposals to make their safety expectations for industry more relevant to a rapidly changing digital world.   

One proposed change is especially exciting to us: a new expectation that industry will take steps to make ‘the best interests of the child’ the primary consideration in the design and operation of any service that is accessible to children.

What does this mean?

If this proposal were accepted, social media platforms and other online services would have to stop and reflect on how their products affect children’s rights. Children have the right to be protected from damaging things like sexual abuse and invasion of their privacy, to take part in positive things like recreation and cultural life, and to access things they need for their development, like education. The digital industry would have to show they have made decisions with these issues front of mind.

The Foundation would welcome this change. However, we believe the Government must take further steps to make it work, including giving industry clearer direction on how to uphold the best interests of the child and requiring industry to report publicly on what they are doing in this area.

This includes artificial intelligence (AI)

The Government has also proposed to update the BOSE Determination to address individuals’ safety in relation to generative AI. Given how quickly and radically AI has evolved, this is vital. But we would like to see more attention paid to what the growth of AI means for children. One serious example is the connection between AI and child sexual exploitation and abuse. AI can be developed and used in ways that enable and escalate the horrific abuse of children – or in ways that help to prevent, detect and remove abusive material. Industry should have a duty to help safeguard against these potential harms and to strengthen its monitoring and removal tools.

Addressing ‘contact’ risks to children

In addition, the Government has proposed to introduce safety expectations for content recommender systems. These are the systems that suggest new content (e.g. videos, posts), often based on an individual’s previous activity or the activity of similar accounts. Recommender systems have been found to ‘nudge’ children towards concerning content – for example, a child who ‘likes’ posts about physical fitness may be offered extreme weight-loss content. This needs to be addressed. But recommender systems don’t just influence the content we see; they also connect us with new online friends and followers. Current systems don’t always include important safety provisions, such as checking that the ages of these ‘friends’ are appropriate. Safety measures should address these ‘contact’ risks to children, as well as risky content.

Making digital service providers more accountable

Finally, we welcome the Government’s push to use the BOSE Determination to encourage digital service providers to direct more money into systems, staff, tools and processes to support online safety. We only wish more pressure could be brought to bear to make this happen. Laws and regulations are only as good as the resources in place to bring them to life.  

 Gaining access to children’s personal information has helped digital service providers to build extraordinary wealth. It is time they made a more proportionate investment in building a digital environment where children can thrive. 

The Alannah & Madeline Foundation will continue to advocate for the right of all children and young people to be safe in all places where they live, learn and play – including in online spaces.