Parenting has always been marked by worry and guilt, but in the age of social media, parents are increasingly faced with a distinctly acute kind of helplessness. Their children are the unwitting subjects of a remarkable experiment in human social forms, building habits and relationships in an unruly environment designed primarily to maximize intense engagement in the service of advertisers.
It’s not that social media has no redeeming value, but overall it’s not a place for kids. If Instagram and TikTok were physical spaces in your neighborhood, you’d probably never even let your teenager go there alone. Parents should have the same say over their children’s presence in these virtual spaces.
You may have the vague impression that doing so would be impossible, but it is not. There is a plausible, legitimate and effective tool available to our society to help parents guard against the risks of social media: we should raise the age requirement for social media use and give it some real teeth.
It might surprise most Americans to learn that there already is an age requirement. The Children’s Online Privacy Protection Act, enacted in 1998, prohibits US companies from collecting personal information from children under 13 without parental consent, or from collecting more personal information than is necessary to operate a service directed at children under 13. In practice, this means that children under 13 cannot have social media accounts, since the platforms’ business models all depend on the collection of personal data. Accordingly, the major social media companies technically require users to be at least 13 years old.
But this rule is systematically ignored. Nearly 40% of American children between the ages of 8 and 12 use social media, according to a recent survey by Common Sense Media. Platforms generally just ask users to certify for themselves that they are old enough, and they have no incentive to make lying hard. On the contrary, as a 2020 internal Facebook memo leaked to The Wall Street Journal made clear, the social media giant is particularly keen to appeal to “tweens,” whom it sees as “a valuable but untapped audience.”
Quantifying the dangers involved has been a challenge for researchers, and there are certainly those who say the risks are overstated. But there is evidence that social media exposure causes serious harm to tweens and older children, and the platform companies’ own research suggests as much. Internal documents from Facebook – now known as Meta – regarding teenage use of its Instagram platform point to real concerns. “We make body image issues worse for one in three teen girls,” the researchers noted in a leaked slide. The documents also highlighted potential links between regular social media use and depression, self-harm and, to some extent, even suicide.
TikTok, which is also hugely popular with tweens and teens, has – alongside other social media platforms – been linked to problems ranging from body image issues and muscle dysmorphia to Tourette-like tics, sexual exploitation and various deadly stunts. Older problems like bullying, harassment and conspiracism are also often amplified and exacerbated as the platforms mediate children’s social lives.
Social media also has benefits for young people. They can find connection and support, discover new things and whet their curiosity. In response to critical reports about its own research, Facebook noted that it had found that, by some metrics, Instagram “helps many teens struggling with some of the toughest issues they face.”
Restricting access to the platforms would entail real costs. But, as Jonathan Haidt of New York University put it, “the preponderance of evidence currently available is disturbing enough to warrant action.” Some teenage social media users see the problem too. As one of the leaked Meta slides puts it, “Young people are acutely aware that Instagram can be bad for their mental health, yet are compelled to spend time on the app for fear of missing out on cultural and social trends.”
This balance of pressures must change. And as journalist and historian Christine Rosen has noted, preaching “media literacy” and monitoring screen time won’t be enough.
Policymakers can help. By raising the Children’s Online Privacy Protection Act’s minimum age from 13 to 18 (while preserving the ability of parents to verifiably approve an exception for their children, as the law already allows), and by providing for effective age verification and meaningful penalties for platforms, Congress could give parents a powerful tool to fend off pressure to use social media.
Reliable age verification is possible. For example, as the policy analyst Chris Griswold has proposed, the Social Security Administration (which knows exactly how old you are) “could offer a service whereby an American could type his Social Security number into a secure federal website and receive an anonymized code via email or text message,” much like the two-factor authentication methods commonly used by banks and retailers. With this code, platforms could confirm your age without obtaining any other personal information about you.
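The mechanism Griswold describes amounts to a short-lived, signed token that asserts an age threshold without revealing identity. A minimal sketch of that idea follows; the service, key names and token format here are all hypothetical (no such federal service exists, and a real system would likely use public-key signatures rather than a shared secret):

```python
import base64
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical signing key held by the verification service; in a real
# deployment, platforms would verify against the service's public key.
SERVICE_KEY = secrets.token_bytes(32)

def issue_age_token(is_over_18: bool) -> str:
    """Issued by the verification service after checking a birth date.
    The token carries only an age assertion, a nonce and an expiry --
    no name, number or other identity."""
    claim = {
        "over_18": is_over_18,
        "nonce": secrets.token_hex(8),   # makes each token unique
        "exp": int(time.time()) + 600,   # short-lived: 10 minutes
    }
    body = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    sig = hmac.new(SERVICE_KEY, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def platform_checks_token(token: str) -> bool:
    """Run by the platform: verifies the signature and expiry, learning
    nothing about the user beyond the age assertion itself."""
    body, sig = token.split(".")
    expected = hmac.new(SERVICE_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    claim = json.loads(base64.urlsafe_b64decode(body))
    return bool(claim["over_18"]) and claim["exp"] > time.time()

token = issue_age_token(True)
print(platform_checks_token(token))  # True for a valid adult token
```

The design choice worth noticing is that the platform never sees the Social Security number at all; it sees only a yes-or-no claim it can verify cryptographically, which is what makes the scheme compatible with a privacy statute.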
Some teenagers would find ways to cheat, and the age requirement would be porous at the margins. But the appeal of the platforms is a function of network effects – everyone wants to be on because everyone is on. The age requirement would only have to be passably effective to be transformative: as it took hold, it would become less true that everyone is on, and the pressure to join would weaken accordingly.
Real age verification would also make it possible to more effectively restrict access to online pornography, a vast, dehumanizing scourge that our society has inexplicably decided to pretend it can do nothing about. And free speech concerns, whatever their merits, surely do not apply to children.
It may seem strange to address the challenge of children’s use of social media through online privacy protections, but this route offers distinct advantages. The Children’s Online Privacy Protection Act already exists as a legal mechanism. Its framework also allows parents to opt in on behalf of their children if they wish. That can be a cumbersome process, but parents who strongly believe their kids should be on social media could take it on.
This approach would also address a fundamental problem with the social media platforms. Their business model – in which personal information and user attention are the essence of the product the companies sell to advertisers – goes a long way toward explaining why the platforms are designed in ways that encourage addiction, aggression, bullying, conspiracism and other antisocial behavior. If the companies want to create a kid-friendly version of social media, they will have to design platforms that do not monetize user data and engagement in this way – and therefore do not create these incentives – and then let parents see what they think.
Empowering parents is really key to this approach. It was a mistake to leave children and teenagers on the platforms in the first place. But we are not powerless to correct this error.
Yuval Levin, opinion columnist, is editor of National Affairs and director of social, cultural and constitutional studies at the American Enterprise Institute. He is the author of “A Time to Build: From Family and Community to Congress and the Campus, How Recommitting to Our Institutions Can Revive the American Dream”.