By Sonia Livingstone | November 14, 2023

Does regulating the internet to protect children mean checking the age of all users? Sonia Livingstone discusses the importance of adopting a child rights-based approach and highlights the political and policy challenge of recognizing who is a child online.

One in three children uses the internet, and one in three internet users is a child. Yet technology companies claim they cannot determine who is a child online. This is a practical but also a political challenge – does society want to mandate that companies assess the age of all their users? Or is that too great a risk to the privacy and freedom of expression of both adults and children?

Why children need protection online

Evidence of problems abounds: a plethora of studies makes headlines in media reports and is cited in government inquiries and third sector advocacy. Although some sources are contested and their implications too often overstated, the case for a robust and evidence-based approach to protecting children online is widely accepted – hence the spate of new legislation and policy being introduced in the UK, US and internationally.

The research network EU Kids Online conducted a pan-European survey of children’s online access, skills, opportunities, risks and safety mediation. It found, for example, that only a quarter of 9-16-year-olds always feel safe online, while 10 percent never feel safe. Depending on the country, up to half of all children said something had upset them in the past year – double the figure in the previous survey. What upsets them? There is evidence that children face all 4 Cs of online risk: content, contact, conduct and contract. The most common risk is exposure to hate speech, and the biggest increase is in exposure to self-harm content. Crucially, the risks of online engagement fall unevenly: the European ySKILLS project finds that more vulnerable young people – including those who are discriminated against or in poorer health – are at greater risk of harm online.

A child rights-based approach

An authoritative statement by the UN Committee on the Rights of the Child is its General Comment 25, which sets out how the UN Convention on the Rights of the Child should be implemented in relation to the digital environment. A child rights framework requires a holistic approach that balances the rights to protection, provision and participation, and that centers children’s age and maturity (their ‘evolving capacities’) and best interests – a complex assessment that puts children’s rights before profit, requires consulting children and demands that decisions be made transparent.

While General Comment 25 is directed at states, the technology sector also has clear responsibilities for online child protection. So, alongside new legislation, “by design” approaches, including safety by design, are increasingly being demanded of companies whose digital products and services affect children in one way or another. And there is much companies can do – for example, EU Kids Online research shows that children do not trust the platforms or cannot find out how to get help: after a negative experience, only 14 percent changed their privacy settings, and only 12 percent reported the problem online. Meanwhile, fewer than a third of parents use parental controls – because they don’t know how parental controls work or whether they are effective, and because they fear negative effects on children’s privacy, independence and online opportunities.


But society cannot seek to protect children at the cost of curtailing their civil rights and liberties, nor through policies that, however unintentionally, encourage companies to increasingly exclude children from useful digital services. Getting the policy framework right means adopting a children’s rights strategy, which the UK and Europe have long committed to but have not sufficiently implemented. While we wait, children – once the intrepid explorers of the digital age – are becoming overly cautious, worrying about the risks and, evidence shows, missing out on many online opportunities as a result.


Sonia Livingstone discusses children’s rights in a digital world in a video available on the LSE Player.

Identifying who is a child

But if companies don’t know which users are children, how can they be tasked with providing age-appropriate services and protection? In the EU Commission-funded euCONSENT project, my role was to explore the consequences of age declaration and age verification for children’s rights. In the UK, the Information Commissioner’s Office is actively investigating these issues.

One alternative is to expect tech companies to design their services as publicly accessible, child-friendly, broadly civil spaces – like a public park – with exceptions that require users to prove they’re adults (as when buying alcohol in a store). Another option is to expect tech companies to treat all users in an age-appropriate way (i.e. find out everyone’s age and provide tailored services accordingly – although there is much debate about what age-appropriate means and how to implement it, given that children vary enormously not only by age but in many other ways). While there are challenges with both approaches, policy innovation is essential if we are to move beyond the status quo, in which children online are treated as adults.

To realize children’s rights in a digital age, should society demand that companies redesign their services as needed, or redesign the process of accessing those services? In both the UK’s Online Safety Act and Europe’s Digital Services Act, success will depend on the effective and responsible implementation of risk assessments.


In the euCONSENT project, we argued that a child rights approach to age verification must not only protect children’s right to be protected from digital content and services that may harm them, but also their right to privacy and freedom of expression (including exploring their identity or seeking confidential help without parental consent), their right to prompt and effective child-friendly remedies, and their right to non-discrimination. This means that children must be able to access digital services along with everyone else, even if they lack official identification, live in alternative care or have a disability, and regardless of their skin color. Currently, many age assurance systems do not respect the full range of children’s rights.

Let’s be practical

Despite the important ongoing arguments about the potential costs to privacy, expression and inclusion of age assurance technologies, in practice users are already being age-assessed. Google says it has age-estimated all users logged into its services, based on a wealth of data collected over time – what their friends look like, the websites they visit and anything else it knows about them – although Google’s strategy here is not very transparent. Meanwhile, Instagram is one of a growing number of platforms, Roblox among them, that apply age-estimation technology to all of their users.

In the Digital Futures Commission, with the 5Rights Foundation, we have proposed a model of Child Rights by Design. It provides a toolkit for designers and developers of digital products, and it has been developed with them – and with children. Based on the UNCRC and General Comment 25, it sets out 11 principles: age-appropriate service is one, privacy another and safety, of course, a third. The other eight are equally important for a comprehensive approach – equity and diversity; best interests; consultation with children; business responsibility; children’s participation; well-being; fullest development; and agency in a commercial world.

If big technology companies embedded Child Rights by Design, the task of policy makers, teachers and parents would be greatly simplified.


This post is based on a speech given by the author at the European Commission’s stakeholder event on the Digital Services Act and at the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) public hearing on the online safety of minors.

All articles published on this blog represent the views of the author(s), and not the position of LSE British Politics and Policy, or the London School of Economics and Political Science.

Image credit: Shutterstock and Michael Jeffery via Unsplash


