What Every EdTech Company Needs to Know About COPPA 2.0 in 2026
If you build software that children use, whether it's a classroom learning platform, an AI tutoring tool, or a reading app parents download on a whim, the regulatory ground beneath your feet shifted dramatically in the last twelve months. The FTC's modernized COPPA rule, widely called "COPPA 2.0," isn't a minor update. It's a structural overhaul that redefines what compliance looks like for EdTech in 2026 and beyond.
I've spent the past several years working at the intersection of children's media, ethical AI, and healthcare. In that time, I've watched dozens of well-intentioned EdTech companies get blindsided by regulatory changes they should have seen coming. Here's what you need to understand, and what to do about it.
What Actually Changed
The original COPPA, enacted in 1998, was built for a world of desktop websites and simple registration forms. COPPA 2.0 recognizes that children's digital lives now include AI-powered recommendation engines, biometric data collection, persistent behavioral tracking, and immersive environments that didn't exist when the original law was written.
The key changes that matter for EdTech:
- Expanded definition of personal information. Biometric identifiers, geolocation data, and AI-inferred behavioral profiles now explicitly fall under COPPA's umbrella. If your AI tutoring system builds a learning profile based on a child's interaction patterns, that's personal information.
- Stricter consent mechanisms. The "school consent" exception that many EdTech companies relied on has been significantly narrowed. Schools can still consent on behalf of parents, but only for data strictly necessary for the educational service, not for product improvement, analytics, or AI training.
- Data minimization requirements. You can no longer collect data "just in case." Every data point must be justified by a specific, documented educational purpose. Retention limits are now explicit.
- Mandatory data security standards. The FTC now requires specific technical safeguards, not just "reasonable" security. Encryption, access controls, and breach notification timelines are spelled out.
- Third-party SDK accountability. If an analytics or advertising SDK in your app collects data from children, you're liable, even if you didn't know it was happening.
Why EdTech Is Uniquely Exposed
Most consumer apps can, in theory, age-gate their way out of COPPA compliance. EdTech can't. Your users are children by design. That means every product decision is a compliance decision.
The uncomfortable truth is that many EdTech products were built on the implicit assumption that educational purpose justified expansive data collection. COPPA 2.0 rejects that assumption. The fact that a child is learning multiplication doesn't give you the right to track their eye movements, build behavioral profiles, or feed their interaction data into your machine learning pipeline.
This hits AI-powered EdTech especially hard. If your product uses machine learning to personalize the learning experience through adaptive difficulty, content recommendations, and engagement optimization, you need to audit every data flow. The model training pipeline is now a compliance surface.
The "School Consent" Trap
For years, many EdTech companies operated under a comfortable arrangement: schools signed agreements consenting to data collection on behalf of parents, and companies used that consent as a blanket authorization. COPPA 2.0 closes this loophole.
Schools can still provide consent, but only for data collection that is "directly related to the provision of the educational service as described in the school's contract." That phrase, "as described in the school's contract," is doing enormous work. If your contract with the school says "math tutoring platform" and you're collecting data to improve your recommendation algorithm, you're outside the scope of consent.
This means EdTech companies need to rebuild their consent architecture from the ground up. Direct parental consent mechanisms that actually work, that aren't just a checkbox buried in Terms of Service, are no longer optional for most use cases.
What to Do Now: A Practical Roadmap
1. Audit Your Data Flows
Map every piece of data your product collects, from every source: direct input, behavioral tracking, device sensors, third-party SDKs. For each data point, document the specific educational purpose it serves. If you can't articulate one, you probably shouldn't be collecting it.
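In practice, a data-flow audit works best when the inventory is machine-readable so unjustified collection can be flagged automatically. Here's a minimal sketch; the field names, source categories, and `DataPoint` type are illustrative assumptions, not a regulatory taxonomy:

```python
from dataclasses import dataclass

@dataclass
class DataPoint:
    name: str                 # e.g. "quiz_answers", "precise_geolocation"
    source: str               # "direct_input" | "behavioral" | "sensor" | "third_party_sdk"
    educational_purpose: str  # documented justification; empty string means none
    retention_days: int       # explicit retention limit

def flag_unjustified(inventory: list[DataPoint]) -> list[str]:
    """Return the names of data points with no documented educational purpose."""
    return [d.name for d in inventory if not d.educational_purpose.strip()]

inventory = [
    DataPoint("quiz_answers", "direct_input", "grade and adapt lesson difficulty", 365),
    DataPoint("precise_geolocation", "sensor", "", 0),
]
print(flag_unjustified(inventory))  # → ['precise_geolocation']
```

Anything the function flags is a candidate for removal: if nobody can write the purpose down, the product probably shouldn't collect it.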
2. Rebuild Consent
If you rely on school consent, review every school contract against the new standard. Build direct parental consent flows for any data collection that falls outside the narrow educational-service exception. Make consent granular. Parents should understand exactly what they're agreeing to.
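One way to make consent granular is to record it per purpose rather than as a single blanket flag, and to hard-code the rule that school consent only covers contract-scoped purposes. A sketch under assumed names (the purpose strings and the `SCHOOL_CONSENTABLE` set are placeholders you'd derive from your actual school contracts):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    child_id: str
    purpose: str     # one specific purpose, e.g. "adaptive_difficulty"
    granted_by: str  # "parent" or "school"
    granted_at: datetime

# Narrow, contract-scoped purposes a school may consent to (illustrative).
SCHOOL_CONSENTABLE = {"core_instruction", "progress_reporting"}

def is_authorized(records: list[ConsentRecord], child_id: str, purpose: str) -> bool:
    """A purpose is authorized only if a matching record exists, and school
    consent only reaches purposes inside the contract-scoped set."""
    for r in records:
        if r.child_id != child_id or r.purpose != purpose:
            continue
        if r.granted_by == "parent":
            return True
        if r.granted_by == "school" and purpose in SCHOOL_CONSENTABLE:
            return True
    return False
```

The design choice that matters: a school-granted record for something like model training simply never authorizes it, no matter what the record says, which mirrors the narrowed exception.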
3. Audit Your AI Pipeline
If you use machine learning, trace the training data back to its source. Were children's interaction data used to train models? Under what consent? Can you demonstrate that model training serves a specific educational purpose? These are the questions the FTC will ask.
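A tractable starting point is a provenance check over the training set itself: every example must trace back to a consent record that covers model training specifically. This is a sketch, not a complete lineage system; the `"model_training"` purpose string and the `has_consent` lookup are assumptions standing in for your own consent store:

```python
def audit_training_set(examples: list[dict], has_consent) -> list[dict]:
    """Return training examples with no consent covering model training.

    `examples` are dicts with a "child_id" key; `has_consent(child_id, purpose)`
    is an assumed lookup into your consent store.
    """
    return [ex for ex in examples if not has_consent(ex["child_id"], "model_training")]

# Usage with a stub consent store:
consented = {"c1"}
flagged = audit_training_set(
    [{"child_id": "c1"}, {"child_id": "c2"}],
    lambda cid, purpose: cid in consented,
)
# flagged holds the example for "c2", which has no model-training consent
```

If the flagged list is non-empty, those examples need to come out of the pipeline before the next training run, and you should be able to show that they did.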
4. Purge and Minimize
Implement automated data retention limits. Delete data you don't need. The new rules don't just require you to stop collecting unnecessary data. They require you to get rid of data you've already collected without proper justification.
5. Get Expert Guidance
This is not a "have your lawyer review the privacy policy" situation. COPPA 2.0 compliance requires understanding both the technical architecture of your product and the regulatory framework. You need people who speak both languages. That's exactly the kind of cross-disciplinary advisory work I do through Mindful Media.
The Bigger Picture
COPPA 2.0 isn't happening in isolation. It's part of a massive wave of regulatory and legal action targeting how technology companies interact with children. State-level children's privacy laws in California, Connecticut, Texas, and elsewhere add additional layers. The EU's Digital Services Act imposes its own requirements. And the litigation wave against social media companies is establishing case law that will inevitably shape how courts interpret EdTech obligations.
The companies that treat COPPA 2.0 as a one-time compliance checkbox will find themselves playing catch-up for years. The companies that use this moment to build genuinely ethical design practices into their product DNA will have a durable competitive advantage.
Because here's what I've learned working with hospitals, school districts, and children's media companies: when you design products that truly prioritize children's wellbeing, compliance isn't a burden; it's a byproduct.
The future of EdTech isn't about collecting more data. It's about building products so well-designed that they don't need to.
Need Help Navigating COPPA 2.0?
I help EdTech companies, law firms, and organizations build children's digital products that are compliant, ethical, and effective.
Work With Mindful Media →

Stay in the Loop
Get my weekly take on children's media, ethical AI, and what's coming next.