The report, Digital Education and Media Literacy in Schools, examined how young people across the UK are navigating a digital environment that is expanding in “speed, scale and complexity”.
Speaking at the House of Lords on the report’s publication, Lord Clement-Jones CBE said that “media literacy has never been more urgent,” calling for “radical coordinated action” to improve safe digital inclusion.
As the Digital Poverty Alliance (DPA) prepares to celebrate its 25th anniversary next year, its CEO Elizabeth Anderson told Capacity that more work needs to be done.
“Ending digital poverty is not only about access or connectivity. It is about ensuring everyone can engage with the digital world critically, confidently and safely,” Anderson said in the report.
“Media literacy is central to that mission and fundamental to the informed, equitable society we aim to build.”
Education, equality, access
The DPA report’s findings are striking. Although 2025 has seen plenty of milestones for digital inclusion, with the Digital Inclusion Action Plan setting the tone, one in five children in the UK is still unable to access digital classrooms to complete schoolwork.
Only 23% of parents reported that their child had received any media literacy education in school, even as 63% identified understanding online harms as the most effective way to support children’s safety.
Notably, the report finds that 98% of young people access health information online, yet only 2% of this type of information on TikTok is accurate. It charts how the shift towards social media as an information source has driven a rise in online disinformation and eroded young people’s ability to trust what they are seeing.
Anderson calls for technology and social media companies to work with government to enact positive change.
Speaking exclusively with Capacity at the House of Lords, she said: “There is a moral imperative that it is the right thing to do. We have nearly half of parents saying that their child has encountered misleading information online, or children are self-reporting encountering upsetting material. I don’t think anyone in a technology company wants that to be happening.”
She added: “Technology companies have the opportunity to really take control of this, and I think that stretches across from making sure that children have the correct kind of device that they need, that they have the correct knowledge and understanding to be able to use it safely, and also very specifically with social media, that they are doing everything they can to put genuine protections in place.”
A significant aspect of this safety is the algorithms companies use to recommend content online. In addition to 90% of those surveyed by the DPA not knowing what a large language model (LLM) is, 43% did not know that algorithms dictate what we see online.
“We need to be in a position where parents are better able to curate that and where people can ‘correct’ the algorithm and take that control back and look after children and vulnerable adults,” Anderson told Capacity.
Is the onus on big tech?
The DPA report makes a list of recommendations for technology companies, including disclosing how recommendation algorithms function and how they work to amplify content.
Other recommendations include mandatory labelling or watermarking for AI-generated content to help users distinguish between real and synthetic media, in addition to strengthening systems to detect and remove AI-generated misinformation.
Anderson told us the DPA is ready and waiting to facilitate these conversations between government and enterprise.
“We know there’s a lot of people in the technology sector who have the will to see some change, and we want them to play a leading role in that, rather than falling behind,” she said.
With the world becoming increasingly digital, the UK government is looking at making central services more AI-driven – both in how people access them and in how decisions are made.
“If young people don’t understand AI, they are going to be disadvantaged,” Anderson explained. “This is because they can’t necessarily recognise AI-generated content. That is worrying on every level.”
She added: “So many jobs are going to rely on AI that, if you don’t have the skills to be able to navigate that, you’re going to just be left behind and lost – it’s those people who will be digitally excluded and most at risk of being left behind in this AI-driven future.”
In the wake of the 2025 Autumn Budget, there have been conversations over how government and business can work together to facilitate innovation and digital inclusion. The DPA wants to help bridge this gap.
“It’s about bringing people together,” Anderson tells us. “The absolute raison d’être for the DPA is to bring together business, government and the rest of civil society. If we look at this in the context of the budget, there’s lots of big shiny conversations about what AI can do to change the world and there’s very little in the way of digital inclusion and how education and social mobility play a part in that.
“These will be our focuses as we head into our 25th anniversary next year.”
Related stories
Digital Poverty Alliance CEO: ‘Digital inequality remains a serious societal challenge’
Autumn Budget: ‘If you build here, Britain will back you,’ UK Chancellor says
UK security minister unveils ‘business-first’ plan to boost cyber resilience
Image credit: Digital Poverty Alliance

techoraco: Our responsibility to young people
Capacity is a techoraco brand. techoraco’s Talent in Digital Infrastructure programme exists to give young talent an educational and inspiring introduction to digital infrastructure and to introduce potential new hires to the industry.
Learn about the Capacity Europe 2025 programme HERE.

Capacity Europe 2026
The 24th anniversary edition of Capacity Europe will bring together 3,500+ decision-makers from the global connectivity and digital infrastructure community.