The Data Is In: AI in Schools Is Outpacing Protection for Our Kids
A new national study reveals that as schools rush to embrace AI, the risks to students — deepfakes, data breaches, harmful relationships with chatbots — are rising right alongside it. Here’s what every parent and educator needs to know.
I’ve spent 14 years inside the digital systems that are now sitting in our children’s classrooms — building campaigns for Disney, Netflix, Amazon, P&G. I know how these tools are designed, what they’re optimized for, and what happens when they’re deployed at scale without guardrails. So when a rigorous national study lands in my inbox, I read it carefully.
The Center for Democracy and Technology (CDT) just released its 2025 Hand in Hand: Schools’ Embrace of AI Connected to Increased Risks to Students report, and the findings are a wake-up call that goes beyond headlines. This wasn’t a think-piece: it was a nationally representative survey of more than 2,800 students, teachers, and parents covering the 2024–2025 school year. Let me break down what the data actually says, and what it means for our kids.
AI Is Now Standard in Schools — Ready or Not
We are past the tipping point. This isn’t a pilot program or an opt-in experiment. AI tools are embedded into daily school life at a level that would have seemed extraordinary just three years ago.
The CDT report calls this “the highest levels of AI usage among students and teachers” since the organization began tracking this data. These are not cautious numbers. These are the numbers of a system that moved fast — and the risks are compounding just as quickly.
More AI = More Risk. That’s Not an Opinion — It’s What the Data Shows
Here is the part that every school board member, principal, and parent needs to sit with: the report found a direct, measurable correlation between how much AI a school uses and how frequently harmful incidents occur. This is not coincidence. This is a pattern.
Data Breaches
28% of teachers in high-AI-use schools reported a large-scale data breach — compared to 18% in low-use schools. Your child’s information is more vulnerable the more AI their school uses.
Deepfakes in Schools
61% of students in high-AI-use environments had heard of deepfakes at their school. In low-use schools? Just 16%. AI is becoming a tool for sexual harassment and bullying.
AI System Failures
23% of all teachers reported an AI system failure in the last school year. 1 in 10 reported AI failing to treat students fairly or damaging community trust.
Emotional Dependency
42% of students have used AI for mental health support. Nearly 1 in 5 used AI to form what they described as a romantic relationship. These interactions are going largely unmonitored.
“As many hype up the possibilities for AI to transform education, we cannot let the negative impact on students get lost in the shuffle.”
— Elizabeth Laird, Director of Equity in Civic Technology, CDT

The Guidance Gap Is the Real Crisis
Here’s what stands out to me as a digital literacy educator: the problem isn’t that AI exists in schools. The problem is that our kids are navigating it without the tools, language, or support to do so safely.
Consider what the data reveals about the guidance our students are actually receiving:
What Students Aren’t Being Taught
- Only 40% of students received any guidance on deepfake non-consensual intimate imagery (NCII)
- Just 11% were told where to report it if it happened to them
- Only 21% of teachers say their school shared policies on how to address deepfake NCII
- Just 14% of teachers and students received guidance on what to do when an AI tool fails or behaves problematically
- Only 38% of students received guidance on how to tell if content was generated by AI or a real person — yet 71% said that guidance would be helpful
- Only 4 in 10 teachers report their school provided training on data privacy policies and procedures
That gap between what students need and what they’re receiving — that’s exactly where BAM Digital Media operates. That’s what drives every workshop, every curriculum module, every parent session we design.
The Paradox That Should Concern Every Parent
One of the most important findings in this report is what I call the familiarity paradox: students who use AI the most are also the most anxious about it. Not less anxious. More.
Among students in high-AI-use schools: 56% worry an AI tool will treat them unfairly. 51% believe AI exposes them to extreme or radical content. 48% feel less connected to their teacher because of AI use — yet 52% of all students said they’d prefer to work with AI over a person.
That tension? That’s not apathy. That’s a generation trying to make sense of something powerful that no one has taught them to navigate. They are figuring it out alone — and the stakes are too high for that to continue.
What This Means for Schools Right Now
Schools did not create the AI industry, but they are now responsible for managing its impact on children. And the CDT data makes clear that protections have not kept pace with adoption.
This is why digital literacy education cannot remain an elective, an assembly topic, or a once-a-year internet safety lesson. Our students are engaging with AI daily — using it for homework, emotional support, creative projects, and social connection. They need structured, developmentally appropriate frameworks for how to think about what these tools are doing and why.
They need to understand how algorithms shape what they see. They need language for recognizing manipulation — from deepfakes to emotional dependency tactics in AI companions. They need adults in their lives who aren’t overwhelmed by the technology but aren’t dismissing it either.
That’s what conscious engagement looks like. That’s what BAM Digital Media builds.
Ready to Bring Real AI Literacy to Your School?
BAM Digital Media offers workshops, curriculum modules, and professional development for educators — built by someone who spent 14 years inside the systems your students are navigating. NYC DOE-approved vendor. Serving schools across the tri-state area and beyond.
Let’s Talk → contactus@bamdigitalmedia.info

Source: Center for Democracy and Technology, Hand in Hand: Schools’ Embrace of AI Connected to Increased Risks to Students (2025). Based on nationally representative surveys of 1,030 high school students, 806 middle and high school teachers, and 1,018 parents conducted June–August 2025. Full report available at cdt.org. Additional coverage via GovTech, NPR, and the Benton Institute for Broadband & Society.
