
Age blockers can’t save kids from porn

As a parent, I was radicalized on the topic of pornography in 2023 by a long X post by the novelist (and UnHerd contributor) Jordan Castro. It began: “I started watching porn when I was 11.” Castro wrote that he stumbled on a porn video by accident at that age, and though he was too young to experience real sexual pleasure, he found the porn “thrilling and addictive” all the same. He believes that “the intense and hardcore nature of it fried my brain almost instantly.”

Castro watched porn “almost daily” as a teen and in his 20s, and overlaid porn scenarios onto all his sexual experiences. When he finally quit, he noticed that he “had more energy and ambition, significantly less social anxiety; I felt more tenderly toward my now-wife; I wasn’t as tired in the morning; my thinking was clearer.” He also found, during the quitting process, that his impulses to view porn arose not when he was “horny,” but when he was “bored or stressed.”

This, and the statistics on teenage porn use — the average American boy is exposed at age 13 — are enough to horrify any parent and to produce the strong conviction that something should be done. “It’s becoming a bigger topic,” says Gail Dines, the CEO of Culture Reframed, an organization of scholars addressing the harms of pornography to youth. “This has become a public-health issue.”

Today, the something being tried is age-verification laws, enacted by 20 states, though until recently most had been blocked by legal challenges. In June, the US Supreme Court upheld a Texas law, one of the nation’s most aggressive, which means the remaining laws will likely take effect as well. Utah Sen. Mike Lee has also proposed national legislation along the same lines. Our strong convictions, however, might be our downfall when it comes to preventing kids from watching porn, and the parent who really wants something done would do well to pause and think it through.

Legally, the debate on age verification has centered on the First Amendment. The latest decision, Free Speech Coalition Inc. v. Paxton, held that the government’s interest in protecting children from porn outweighs adults’ right to unfettered access to this form of speech. The decision lowered previous standards of scrutiny because, as time has shown, alternatives to age verification, such as content-filtering software, haven’t worked. None of this is wrong, per se, but it’s a narrow approach by definition. Judges respond to existing law; lawmakers respond to constituent demand; neither group is necessarily thinking globally about the whole picture.

Since being galvanized by Castro’s post, I have been exploring the available parental-control options to protect my teenage son from pornography, and I’ve been struck by the disconnect between the solutions and basic realism regarding human behavior.

There are two flavors of child-protection: one involves greater surveillance than the other. In the first model, the parent can enable “family sharing,” which allows them to personally monitor their child’s devices, or they can use an app, which monitors the devices for them using AI. (All of my research has been done on Apple devices.) This requires a parent to be comfortable with total child surveillance — I am not — and to have a proactive, tech-savvy personality.

The less-intensive approach is to set up a blacklist of adult content, either on your home router or on your child’s devices. The router option is fairly secure, but it only catches devices on the home WiFi, and requires a high degree of confidence with technology (starting with knowing which box is the router). The option on the device itself is an easier setup but is inconvenient to manage and not very secure. I, for example, locked my son’s laptop using an admin account, and then had to unlock it every time he needed to do daily things like downloading a math app, which always occurred when I was cooking dinner. Then I lost the password and my enraged son went online and figured out how to delete the admin account.

None of these methods is designed for easy use by a wide range of people. To be clear: all children should be protected from pornography, not just those who live in states with age-verification laws, or those who have tech-savvy parents. One problem is that most of the solutions, in the fine tradition of vapid safety-ism, are designed to do much more than combat porn. Years ago, my brief experiment with setting up a child account on Apple for my daughter ended when it prevented her from using Instagram or watching YouTube. These may be commendable goals, but they bundle a necessity that almost everyone agrees on — no porn for kids — with more ambiguous cases. Each additional restriction is a new hassle, and a new incentive for the parent to just give up.

The numbers speak to the success of this approach — or the lack thereof. Apple doesn’t release figures on how many users opt for family sharing, but one suspects that’s because the number is low. A May 2025 study from the Family Online Safety Institute found that parental-control adoption “remains low,” at less than 50 percent, and is inconsistent across devices, with higher adoption on tablets. And online-safety apps cover a negligible share of children. Bark, one of the more popular, has only 7.5 million users, mostly in the US.


Given all this, government-mandated age verification might seem like the solution to concerned parents. Louisiana state Rep. Laurie Schlegel, the champion of the first such law in the country, told me that, like me, she first became interested in the topic after hearing personal testimony from a young person negatively impacted by porn — in her case a 2021 appearance by Billie Eilish on the Howard Stern Show. Eilish told Stern that she started watching porn at age 11, which gave her nightmares and had damaging effects on her later sex life.

Schlegel had been observing her state’s usage of LA Wallet, a digital-ID app, and suspected that Louisiana had the capacity to anonymously implement age verification. “The technology is very much available where you can protect someone’s privacy,” Schlegel says. “All [LA Wallet] does is send a signal to the website. It’s not your ID. It just says Hey, this user is over 18.” Louisiana’s law, like the Lone Star State’s, relies for enforcement on the government’s ability to sue companies not in compliance.

But this, too, fails to take realistic human usage into account. The rub, as opponents have pointed out, is that most people who have just searched “anal,” “gang bang,” or whatever really don’t want to provide their identification, even if it’s supposedly anonymous and secure. Comparisons to age verification for online gambling fall flat. Paid porn users already provide their credit-card information, but paid users are an unknown and smaller percentage of total users. Some paid content is also à la carte, meaning those users were drawn in by free content and then convinced to upgrade while their defenses were down.

PornHub has argued that age requirements will result in users navigating to other sites that the law hasn’t yet caught up with. This is self-interested but also plausible: between ravenous user demand and porn-supplier self-interest, the laws could easily provide powerful incentives for the already shady porn market to move away from porn supplied by large, centralized, easy-to-sue companies into less policeable structures and jurisdictions. It would be nice to think these laws will work, but we’re guessing.

The reply of the frightened parent is: who cares, let’s try it. But if the laws break up the big, easily accessible aggregators, we’ve set ourselves back another generation of children addicted to porn while we figure that out. And we’ve also lost the potentially potent blacklisting tool, which works as well as it does because the companies are well-known and stable. As I look at my family’s situation, I see porn not as a shadowy globe-spanning operation that should be conquered lawsuit by lawsuit, but as a problem that enters our home on specific devices, each one from a massive company with enormous technological capabilities and no realistic ability to disband and become fly-by-night. Our lawmakers should make them responsible.

I’ve programmed my home router, which is rented from Spectrum, to use OpenDNS, a DNS service that blocks requests to blacklisted adult sites. OpenDNS is one of many services that can block a high percentage of porn sites (estimates run up to 98%), which is already a huge improvement, especially since it catches the big ones kids are more likely to know. One imagines that with more resources thrown behind it, the blacklist could be even more comprehensive. (OpenDNS is owned by Cisco, which did not reply to my requests for comment.)
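The mechanics of a DNS-level blacklist are simple enough to sketch. When a device asks the filtered resolver for a site’s address, the resolver checks the requested hostname, and every parent domain of it, against its list before answering, which is why a subdomain of a blocked site is blocked too. The snippet below is a toy illustration of that matching logic, not OpenDNS’s actual code, and the domain names in it are made-up placeholders:

```python
# Toy sketch of DNS-blacklist matching. The list entries below are
# hypothetical placeholders, not real blocked domains.
BLOCKED_DOMAINS = {"example-adult-site.test", "another-blocked.test"}

def is_blocked(hostname: str) -> bool:
    """True if the hostname or any parent domain is on the blacklist."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check the name itself and each parent: a.b.c.test -> b.c.test -> c.test
    for i in range(len(labels) - 1):
        if ".".join(labels[i:]) in BLOCKED_DOMAINS:
            return True
    return False

print(is_blocked("cdn.example-adult-site.test"))  # True: subdomains inherit the block
print(is_blocked("example.org"))                  # False: not on the list
```

A real resolver would answer a blocked lookup with the address of a block page instead of the site’s true address; the principle, checking every lookup against a known list before resolving it, is the same.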

So why can’t the option of a porn-locked router be offered at the point of sale? Many parents might choose this if they had the easy option. And it’s much less onerous for the adult porn-user to just choose an adult-content router in the first place, rather than supplying an ID every time they want to watch adult content. Routers can also be programmed to porn-block some devices in the home and not others, so the adults can flourish, while the children are safe. Custom-programming your own router is too much for the average person, but it could easily be done by the ISP’s tech support, either on the phone or at installation — if companies were required to offer this service by law.
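Per-device blocking on a router comes down to a lookup table: each device on the network is identified by its hardware (MAC) address, and the router sends that device’s DNS queries either to a filtering resolver or to an ordinary one. Here is a minimal sketch of that idea, assuming OpenDNS’s public resolvers as the upstreams; the MAC addresses are invented placeholders, and real router firmware implements this differently:

```python
# Sketch of per-device DNS policy on a home router.
# The resolver IPs are OpenDNS's public addresses; the MACs are placeholders.
FILTERED_RESOLVER = "208.67.222.123"  # OpenDNS FamilyShield (adult content pre-blocked)
OPEN_RESOLVER = "208.67.222.222"      # OpenDNS standard (no filtering by default)

DEVICE_POLICY = {
    "aa:bb:cc:dd:ee:01": "filtered",    # child's laptop (placeholder MAC)
    "aa:bb:cc:dd:ee:02": "unfiltered",  # parent's phone (placeholder MAC)
}

def resolver_for(mac: str) -> str:
    """Choose a DNS upstream; unknown devices default to filtered."""
    policy = DEVICE_POLICY.get(mac.lower(), "filtered")
    return FILTERED_RESOLVER if policy == "filtered" else OPEN_RESOLVER

print(resolver_for("AA:BB:CC:DD:EE:02"))  # parent's phone routes to the open resolver
```

Defaulting unknown devices to the filtered resolver is the safer choice for a family network: a new phone or console is protected until a parent explicitly exempts it.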

Our computers, phones and game consoles could also, I suspect, come with adult-content blocking, to be enabled or not at the point of purchase. An AI-fueled app like Bark costs $14 a month, which to some extent reflects the computing cost of incessantly scanning for adult content but could probably cost less if it were being powered by Apple. (Bark does much more than this, which is fatal mission creep for my purposes, but this is the fine print.) Apple and similar firms could create a widget like Bark, opted into or out of at point of sale. Or Apple could invent something better if it were legally incentivized to do so, and if its engineers were reined in from stupid bells and whistles. In addition, all internet browsers could be legally required to come locked (via the existing and powerful blacklist technology), with ID verification required to unlock them.

It’s unconscionable that the tech companies themselves haven’t already taken the lead on this. Our wealthiest and most powerful companies pump unrestricted access to hardcore porn to our children, scot-free. A conspiracy-minded person might suspect that the tech companies want it this way, because the rumors about technology uptake being mostly fueled by porn consumption are true, and they want to addict us young. A less apocalyptic theory is simply that the bros designing this stuff don’t really have their hearts in it and have no sense of the average, tech-phobic American family experience.

Blacklisting isn’t perfect, and a clever child could probably get around many of these features. But that’s why limiting the restrictions to pornography is so essential: it gives clever children no reason to circumvent them unless they’re truly determined to reach actual porn sites. Don’t block popular video games. Leave Instagram and TikTok alone. Don’t pointlessly require parental approval for the math app. Don’t make life so inconvenient for the porn users that they find new and unregulated delivery methods. And save little Jordan Castro.
