Be smart online with eSmart. Australia's trusted provider of online-safety education.
As an organisation dedicated to keeping children and young people free from violence and trauma wherever they live, learn and play, we welcome the recent discourse in the media and wider community around the issue of raising the minimum age for social media use from 13 to 16 years of age.
However, if our many years of supporting and educating parents, carers and teachers about children’s online safety tell us anything, it's that the question around social media and age of access is the wrong question to be asking!
It sounds counterintuitive, but stay with us as our CEO, Sarah Davies, unpacks the issues and the questions we should be asking – and what you can do to help empower the children in your life to be safer online.
Raising the age alone is a red herring – and diverts attention away from the real issues at hand.
The real issue is the underlying design of social media: the algorithms, recommender systems and data harvesting that can expose children and young people to inappropriate and harmful content, misinformation, predatory behaviour and other serious harms such as extortion.
Raising the age of social media access won’t address the root causes of the issues and potential harms associated with its use. To quote social media darling and empress of today’s zeitgeist, Taylor Swift, “Band-aids don’t fix bullet holes”.
Even if the age requirement were raised, it would not fundamentally change the experiences or the design features that drive the harms. The platforms would remain unchanged, still posing risks to all users regardless of their age.
Let’s be real for a minute: we know that raising the age limit would not prevent children under 16 from accessing social media. We need to remind ourselves that this generation are digital natives, and that the online world is their world. Young users will find ways to circumvent age restrictions, potentially leading to even less oversight and support from trusted adults.
While raising the age limit might initially appease worried parents and grandparents, this relief would likely be short-lived: 'out of sight' is not the same as 'out of mind'. It could be compared to a quick sugar fix, providing temporary satisfaction without addressing the underlying hunger.
And then there are the dangers associated with driving social media behaviour "under the radar." Young users who lie about their age to gain access may be less likely to report problems, seek help, or even acknowledge their use of social media – for fear of getting into trouble.
This clandestine behaviour could exacerbate existing issues and make it harder to address them effectively. Worse, young users will be the unwitting recipients of recommender systems pushing content and contacts aimed at 16 years plus: the age the system ‘thinks’ the user is.
Regardless of the age of entry for social media, recommender systems and algorithms will continue data scraping and behaviour profiling, collecting information about your child every time they tap, type or click on the profiles and pages they visit. Based on this information, they then introduce suggested contacts and targeted content: pushed products, potentially destructive ‘health information’, violent, sexualised and misogynistic images, and many other forms of inappropriate content.
In a nutshell, instead of focusing on age limits, we should concentrate on requiring age-appropriate safety by design that prioritises the best interests of children over commercial interests.
It should be standard practice that all tech, games and apps come loaded with default settings for children under 18 years at the highest privacy and safety settings, backed by enforceable regulation that ensures their data privacy and protection from the outset.
This preventative approach would minimise potential harms and gift children an internet where they are free to live, learn and play safely online.
Keeping children and young people safe online takes collective effort – from government, tech companies, and everyone in the community.
We believe it’s government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.
It’s tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights or expose children to harm while using their services and platforms.
And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.
So what are the right questions and what change is needed to empower children and young people to be safe online?
If we had a magic wand, this is what we would mandate:
It is also vital that we continue developing and delivering digital literacy and data privacy education in schools and the broader community, and we must always consult with children and young people.
So, if raising the age of access for social media isn’t the answer, how can parents and carers support children to be safe online?
By creating a balanced approach that combines education, communication, and practical safeguards, parents and carers can help children navigate the digital world safely and responsibly.
We understand this is easier said than done. There is so much varying information, it can be overwhelming and hard to keep up.
Parents and carers can start by focusing on a few key areas:
And the most important tool in parents' digital toolkit – open communication. Conversations are our superpower. It is critical that children and young people are encouraged to share any concerns or suspicious experiences they encounter without fear of judgement.
Everything we do at the Foundation is centred around advocating for the rights of children and young people to be safe in all places where they live, learn and play – including online.
We are guided by the UN Convention on the Rights of the Child, and specifically the 2021 General Comment on children’s rights in relation to the digital environment.
Everything we experience and learn from working directly with children, families, schools and teachers in our programs informs our digital rights advocacy work for optimal safety standards to be built into all digital spaces and devices.
Over the last 12 months, we have advocated for children's rights online in relation to issues including:
With many years of practical experience delivering programs and providing resources to support children and young people, we have strong recommendations for ways to keep children and young people safe at home, at school, and at play.
We regularly make submissions to government and other decision-makers, based on our experience, expertise and research to ensure children’s rights are upheld. You can read our latest policy submissions here.
Whilst the concerns around social media use and its potential negative impacts are warranted, the jury is still out: the effects of social media on young people’s mental health and development remain unclear.
Some studies indicate negative impacts on psychological wellbeing, while others suggest positive effects on peer relationships. For example, a 2023 UK study found little evidence of negative outcomes, whereas a 2021 study indicated that moderate social media use might benefit peer relationships. This illustrates the complexity of the evidence: positive and negative effects coexist.
What we need is more research – globally and locally, and in particular insights that represent and reflect the experiences of Australian children and families.
For more information, resources and tips check out these excellent resources:
You can also join the conversation on Facebook, Instagram and LinkedIn.