It’s not just on parents: Foundation…

The Alannah & Madeline Foundation backs the eSafety Commissioner’s recent request that the eight most popular social media and messaging platforms release data on the number of Australian children using their platforms, and on what protections they have in place to safeguard their youngest users from potential harms.

The request comes at a time of heightened community concern and debate about the potential negative impacts of social media and online harms on Australian children.

The Commissioner’s request covers data on Australian children’s use of popular platforms such as Instagram, Snapchat and TikTok, and on what age assurance measures – if any – these tech platforms have in place to enforce their own age limits, such as Instagram’s requirement that users be at least 13 years old.

Having access to this data will enable the eSafety Commissioner to hold tech platforms accountable to community expectations and to make transparent the steps they take to keep their users safe online.

Sarah Davies AM, CEO of the Alannah & Madeline Foundation, said that children and young people continue to be at risk of online harms because the tech platforms they are using are not designed to keep them safe.

“Part of the problem is that social media platforms operate on a commercial model that seeks to maximise the handling of individuals' personal data. These platforms are designed to grab and hold their users' attention and to encourage as much interaction as possible. In practice, this leads to platforms that put children at risk because they are low privacy, distracting and highly compelling, feeding users content that is potentially unsafe and making them vulnerable to unsolicited contact.”

Recent research shows that two-thirds of 14–17 year olds have viewed potentially harmful content online in the past year, including content depicting drug use, self-harm and violence (1).

Even when parental supervision is high, children still face risks (2). Of Australian teens whose parents take steps to restrict their tech use, 28% have still seen ‘gore’ content online, 27% have seen drug content, 22% have seen eating disorder content, and 14% have seen suicide content.

We know that parents care deeply about their children’s safety online, but they often feel overwhelmed and struggle to manage the risks that arise. Only half of Australian parents feel in control of their children’s data privacy online (3), with most parents of teens saying their children understand tech better than they do (4).

“When it comes to keeping children safe online, we need a multi-pronged approach and a broader safety net – it cannot be the sole responsibility of parents and carers to protect their children from online harms. Tech platforms must take responsibility, and we are calling on them to take immediate steps to address the underlying causal factors that make their platforms inherently unsafe for children,” explained Ms Davies.

“And it must go beyond simple age assurance because research (5) shows that half of children aged 3-12 use at least one social media app or site – this tells us that children are finding ways around current measures to determine age,” she added.

Significant changes are needed to make the digital environment a safe place for children. That’s why the Foundation is calling for a Children’s Online Privacy Code to restrict what digital platforms are allowed to do with children’s personal information. It is digital platforms’ hunger for people’s data and attention that drives many of the safety risks.  

“What we need is age-appropriate safety by design that prioritises the best interests of children over commercial interests. It should be standard practice that all tech, games and apps default to the highest privacy and safety settings for children under 18 years, backed by enforceable regulation that ensures their data privacy and protection from the outset,” Ms Davies said.

This preventative approach would minimise potential harms, giving parents and carers peace of mind and gifting children an internet where they are free to live, learn and play safely.

Multi-pronged approach to keep children safe online

Keeping children and young people safe online takes collective effort – from government, tech companies, and everyone in the community.

  • We believe it’s government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.
  • It’s tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights or expose children to harm.
  • And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.


The most urgent issues that need to be addressed

To address the underlying causal factors, we would mandate:

  1. Default Privacy & Safety Settings: all digital products and services must automatically provide the highest level of privacy and safety settings for users under 18 years, by default.
  2. Data: data is the currency for many tech services and products, but we must ban the commercial harvesting, collection, scraping, sale and exchange of children’s data.
  3. Privacy: prohibit behavioural, demographic and biometric tracking and profiling of children, and with it all profiled commercial advertising to children under 18.
  4. Recommender Systems (the algorithm): ban the use of recommender systems – software that suggests products, services, content or contacts to users based on their preferences, behaviour, and similar patterns seen in other users (see the sketch after this list).
  5. Reporting Mechanisms: require child-friendly, age-appropriate reporting mechanisms and immediate access to expert help on all platforms.
  6. Independent Regulation: remove self- and co-regulation by the tech industry and establish a fully independent regulator with the power and resources to enforce compliance.
  7. Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children.
  8. Public Data Access: ensure public access to data and analytics for regulatory, academic, and research purposes.
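
To make concrete what point 4 describes, below is a minimal sketch of one common family of recommender system – a user-based collaborative filter. It is purely illustrative: the engagement data, similarity measure and function names are assumptions chosen for this example, not any platform’s actual algorithm.

    # Toy user-based collaborative filter: suggests items a user hasn't seen,
    # weighted by how similar other users' behaviour is to theirs.
    # Illustrative only -- all data and scoring choices here are assumptions.
    from collections import defaultdict

    # Hypothetical engagement scores: user -> {item: strength of interaction}
    interactions = {
        "user_a": {"video1": 5, "video2": 3, "video3": 4},
        "user_b": {"video1": 4, "video3": 5, "video4": 2},
        "user_c": {"video2": 4, "video4": 5},
    }

    def cosine_similarity(a, b):
        """Similarity between two users' interaction profiles."""
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        dot = sum(a[i] * b[i] for i in shared)
        norm_a = sum(v * v for v in a.values()) ** 0.5
        norm_b = sum(v * v for v in b.values()) ** 0.5
        return dot / (norm_a * norm_b)

    def recommend(user, k=2):
        """Rank unseen items by the engagement of similar users."""
        scores = defaultdict(float)
        for other, items in interactions.items():
            if other == user:
                continue
            sim = cosine_similarity(interactions[user], items)
            for item, rating in items.items():
                if item not in interactions[user]:
                    # Similar users' behaviour, not the content itself,
                    # drives the suggestion.
                    scores[item] += sim * rating
        return sorted(scores, key=scores.get, reverse=True)[:k]

    print(recommend("user_a"))  # -> ['video4']

Even this toy version shows the dynamic the Foundation is concerned about: what a user is shown is determined by the behaviour of similar users and optimised for engagement, with no regard to whether the content is safe or appropriate for a child.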

It is also vital that we continue developing and delivering digital literacy and data privacy education in schools and the broader community, and we must always consult with children and young people.

[1] https://www.esafety.gov.au/research/mind-gap

[2] https://www.esafety.gov.au/research/mind-gap

[3] https://www.oaic.gov.au/__data/assets/pdf_file/0025/74482/OAIC-Australian-Community-Attitudes-to-Privacy-Survey-2023.pdf

[4] https://www.accce.gov.au/sites/default/files/2021-02/ACCCE_Research-Report_OCE.pdf

[5] https://www.ofcom.org.uk/sitea...

 _______________________________________________________________________________

For further information or interview requests, please contact:  
Simone Redman-Jones - Media & Communications Manager, Alannah & Madeline Foundation 
0499 202 001 or [email protected]