When Defaults Decide
What “Adolescence” Teaches Us About Self-Regulation, Hope, and Harm
Too many people believe this series is just about murder and misogyny. It is about so much more than that.
First, there is the art of the series. The behind-the-scenes reel for Adolescence is ostensibly about craft—four one-hour, single-shot episodes; a camera that moves like a train you can’t stop. But its subtext is governance. A boy is arrested. A 13-year-old girl weaponizes symbols on social media. A father needs his son to decode what adults missed. Grieving parents ask, “Could this be us?” The show refuses easy villains and easy answers—and that’s exactly why it’s so useful. It reveals the structural truth we keep dodging: technology companies cannot, and will not, self-regulate in the public’s best interest—especially where children are concerned. Well-meaning adults routinely substitute hope for strategy, outsourcing judgment to platform defaults that were never designed for child safety.
But I want to put a different spin on things. I watched this series through the knowing lens of those who built the platforms and their governance policies: people who understand the business incentives deeply and who have been in the room when executives were presented with the research needed to make decisions aligned with societal health and well-being. Incentives drive decisions, not ethics. Policies existed that required parental consent for accounts belonging to children under 13; yet the business was incentivized to maximize overall engagement. History shows these companies will wait for a lawsuit before formally adopting safety as a corporate pillar. So the choice of where these defaults should lie falls to the rest of us.
The Doorless Train: Why Self-Regulation Fails by Design
The production notes describe a single camera moving through space, with no cuts or resets. That’s the business model: a continuous shot of engagement. In this system, safety is an interrupt, and interrupts are expensive. Self-regulation predictably fails because:
Incentives misalign. Engagement is revenue. Safety is costly. Fewer sessions, fewer impressions, fewer data points—platforms don’t “win” when kids sign off.
Information asymmetry is extreme. Parents and schools can’t see what the product manager sees. Risk is hidden in UX seams (autoplay, DMs by default, “people you may know,” infinite scroll).
Externalities are off-balance sheet. Anxiety, bullying, sleeplessness, self-harm contagion—the costs land on families, educators, and public health systems, not on cap tables.
Safety is framed as a feature, not a boundary condition. Companies ship “parental controls” and “family centers” like accessories, then blame misuse when harms occur.
A line from the reel lands like a thesis: “Don’t put this in the extraordinary. Make this feel like it could happen to you.” That’s governance. If it can happen and the incentives reward the conditions for it, then relying on corporate virtue is malpractice.
Hope as a Strategy (and How Defaults Exploit It)
Why do careful, loving adults still hand a loaded feed to a 13-year-old? Because the system is designed to convert uncertainty into trust through defaults:
Default visibility (“public unless changed”) signals normalcy.
Default connectivity (DMs open, recommendations on) signals social proof.
Default continuity (autoplay, streaks, push alerts) signals urgency.
Parents read these signals as safety vetting (“if it were harmful, they wouldn’t ship it like this”). Schools do the same, mistaking platform toolkits for policy. We drift into governance by optimism, the soft belief that good kids, good intentions, and a Big Brand equal “probably fine.”
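To make the machinery concrete, here is a minimal sketch in TypeScript (all names hypothetical, not any real platform's API) of how those three defaults typically ship. Nothing in it looks sinister; each field simply starts at the value that produces more sessions, and that starting value is the signal adults misread as vetting.

```typescript
// A hypothetical settings shape; field names are illustrative only.
interface AccountDefaults {
  profileVisibility: "public" | "private";   // default visibility
  directMessages: "open" | "contactsOnly";   // default connectivity
  friendRecommendations: boolean;            // "people you may know"
  autoplay: boolean;                         // default continuity
  pushAlerts: boolean;
}

// Engagement-first defaults: every field starts at the value that
// maximizes sessions, impressions, and data points.
const shippedDefaults: AccountDefaults = {
  profileVisibility: "public",    // signals normalcy
  directMessages: "open",         // signals social proof
  friendRecommendations: true,
  autoplay: true,                 // signals urgency
  pushAlerts: true,
};
```

A 13-year-old who never opens the settings menu lives inside shippedDefaults indefinitely. That is governance by optimism rendered as configuration.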
Adolescence punctures that delusion. A father must learn a new alphabet to understand his child’s world. Another family learns, too late, that quiet defaults can compound into catastrophic outcomes. This isn’t a story about monstrous people; it’s a story about ambient harm in systems that reward speed, reach, and secrecy.
Governance Means Changing the System, Not Scolding the Users
If we accept that self-regulation can’t carry the load, we need structures that make safety the floor, not an afterthought. Think of this as moving from values-inspired statements to values-embedded design.
Minimum Viable Guardrails for Platforms
Safety by Default. For all minors: private by default, DMs off by default, geolocation off, recommendations limited to vetted contexts, no late-night push alerts.
Hard Friction on Risky Flows. Rate-limit resharing, disable forwarding for flagged topics, add cool-downs on conflict spikes, throttle virality for youth cohorts. (This guardrail and the one above are sketched in code after this list.)
Age Assurance with Dignity. Privacy-preserving verification that doesn’t convert kids into data exhaust—paired with strict penalties for evasion tooling.
Independent Safety Cases. Annual, public “safety case” reports (like aviation) with third-party audits: incident rates, response times, design changes, and regression testing on harms.
Data Minimization for Minors. No ad targeting or behavioral profiling; delete sensitive signals by default; sunset youth data on graduation from minor status.
School Mode at the Switch. An institution-level control that can lock accounts to curriculum-safe settings (or suspend them) during school hours.
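For anyone who finds these demands abstract, here is a minimal sketch of the first two guardrails, continuing the hypothetical TypeScript from above: safe defaults gated on minor status, plus a reshare cap as hard friction. The specific values are placeholders, not recommendations.

```typescript
// Hypothetical settings shape for the guardrails; names are illustrative.
interface MinorSafeSettings {
  profileVisibility: "public" | "private";
  directMessages: "open" | "off";
  geolocation: boolean;
  recommendations: "full" | "vettedOnly";
  lateNightPushAlerts: boolean;
}

// Guardrail 1 (Safety by Default): minors never start from
// engagement-first values. Safety is the floor, not a settings menu.
function defaultsFor(isMinor: boolean): MinorSafeSettings {
  return isMinor
    ? {
        profileVisibility: "private",
        directMessages: "off",
        geolocation: false,
        recommendations: "vettedOnly",
        lateNightPushAlerts: false,
      }
    : {
        profileVisibility: "public",
        directMessages: "open",
        geolocation: true,
        recommendations: "full",
        lateNightPushAlerts: true,
      };
}

// Guardrail 2 (Hard Friction): cap hourly reshares for youth cohorts so
// virality is throttled by design rather than by moderator heroics.
const YOUTH_RESHARE_CAP_PER_HOUR = 5; // placeholder value
const HOUR_MS = 60 * 60 * 1000;

function canReshare(isMinor: boolean, reshareTimesMs: number[], nowMs: number): boolean {
  if (!isMinor) return true;
  const recentReshares = reshareTimesMs.filter((t) => nowMs - t < HOUR_MS);
  return recentReshares.length < YOUTH_RESHARE_CAP_PER_HOUR;
}
```

Note where the interrupt lives: inside the product, where the incentive to quietly remove it is strongest. That is exactly why these defaults must be a regulated floor rather than a corporate courtesy.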
Policy That Bites (Without Breaking the Internet)
Duty of Care for Youth Products. Hold companies legally accountable for foreseeable youth harms resulting from modifiable design choices.
Dark-Pattern Prohibitions. Ban engagement-maximizing mechanics in products accessible to minors (streaks, infinite scroll, autoplay).
Platform Safety Audits as a License to Operate. Tie market access to regular independent audits and transparent remediation plans.
Real Penalties, Not PR. Fines indexed to revenue; executive liability for repeated failures; mandated product recalls for egregiously unsafe defaults.
Practical Governance for Families & Schools (That Actually Works)
Co-Use Over Surveillance. Sit with kids in their digital spaces weekly; decode together; practice reporting/blocking in real time.
Written Media Agreements. Times, places, and purposes; revisited each term. Devices out of bedrooms. Overnight charging in a common space.
Curfewed Defaults. Night mode that actually shuts down feeds and DMs for minors (see the sketch after this list).
Courage Breaks. Teach a 10-second pause before posting or resharing—an intentional interrupt to counter design-induced urgency.
Neighborhood Watch, Not Moral Panic. A small, trusted adult circle that shares patterns (new slang, new scams) without shaming kids into silence.
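As a sketch of what "actually shuts down" could mean (hypothetical names and hours again, in the same TypeScript vein): the curfew is enforced at the request layer, so feeds and DMs refuse to load rather than politely dimming.

```typescript
// Curfew window in the device's local hour (0-23); the specific hours
// are placeholders, not a recommendation.
const CURFEW_START_HOUR = 21; // 9 PM
const CURFEW_END_HOUR = 7;    // 7 AM

// True when a minor's feed and DM requests should be refused outright.
function isCurfewed(isMinor: boolean, localHour: number): boolean {
  if (!isMinor) return false;
  // The window wraps past midnight, so check both sides of it.
  return localHour >= CURFEW_START_HOUR || localHour < CURFEW_END_HOUR;
}
```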
What the Show Gets Right About Responsibility
One line from the behind-the-scenes reel stood out for me: "You can't blame the parents." Governance agrees and goes further. You also can't delegate safety to parents in a system architected against them. Responsibility must be tiered:
Platforms are responsible for safe defaults and proving they work under adversarial conditions.
Policymakers are responsible for aligning consequences with incentives.
Schools are responsible for establishing culture and setting boundaries within their jurisdiction.
Families are responsible for presence and practice—not expertise in every new slang or exploit.
None of these absolves the others. That’s the point. Harm is networked; accountability must be, too.
From “Extraordinary” to Everyday
The show's co-creators, Jack Thorne and Stephen Graham, and its director, Philip Barantini, insist they "didn't want to make any particular character extraordinary." Governance should do the same. We don't design cars assuming perfect drivers; we build seatbelts, crumple zones, and speed limits, then test them to make sure they work. Children's online lives deserve at least that level of engineering seriousness.
If you work in tech, stop treating youth safety as a settings menu or a marketing campaign. Put your best designers and infra engineers on harm reduction, and publish your safety case like your uptime. If you’re a policymaker, align penalties with scale and move audits out of the lobbying shadow. If you’re a parent or educator, replace hope with presence plus boundaries—the two things defaults can’t do for you.
The camera is moving. The train is already in motion. Governance is the only way to build a door.