Anti-porn bills in 8 states could force device makers to censor sexual material

Eight states are weighing anti-pornography bills that would require phone and tablet makers like Apple and Samsung to automatically enable filters that censor nude and sexually explicit content.
The only way to disable the filters, according to bills introduced this year, would be with a passcode, and providing that passcode to a child would be prohibited for anyone other than a parent.
Specifically, the bills state that phone filters must prevent children from downloading sexually explicit content through mobile data networks, manufacturer-owned and controlled apps, and wired or wireless Internet networks.
Many device manufacturers already offer adult content filters, though they are not commonly enabled by default. Many phone makers, for example, make it easy for parents to enable browser filters that prevent children from visiting websites known to host pornography.
In recent years, some phone makers have added sophisticated filters that use artificial intelligence to censor individual images on certain apps.
One of these anti-pornography bills was passed in Utah in 2021, but it can only go into effect if five more states pass similar laws, a provision included to prevent Big Tech companies from singling out the state after the law's adoption. This year, lawmakers in Florida, South Carolina, Maryland, Tennessee, Iowa, Idaho, Texas and Montana are all considering versions of the bill, with the Montana and Idaho versions the furthest along in the legislative process.
In interviews with NBC News, the authors of the original bill — representatives from the National Center on Sexual Exploitation and Protect Young Eyes, two advocacy organizations focused on child safety — said the original intent of the model bill was to require device manufacturers to automatically enable adult content filters for web browsers, not for other apps. Those filters were already on phones, but not enabled by default, in 2019 when the bill was first drafted.
But Chris McKenna, founder and CEO of Protect Young Eyes, acknowledged that the legislation could also end up applying to other device-level filters created in recent years that some might consider more invasive.
In 2021, Apple introduced filters on devices that can scan messages for nudity, blurring any images of suspected nudity for people with filters enabled. The filter, which can be enabled for children by an adult administrator, also offers to connect users to parents or help resources.
Most of the state bills under consideration would make device makers subject to criminal and civil penalties if they failed to automatically activate “industry standard” filters. The bills do not define what this standard is or whether email filters are included.
McKenna said the “intent is meant to point to browsers and the [search] engines with filters already in place.” But, he noted, “you wouldn’t find me upset if they chose to enable this for iMessage.”
Benjamin Bull, the general counsel for the National Center on Sexual Exploitation, said that when he drafted the language of the original model bill, it was designed to narrowly address the issue of child access to Internet pornography in a way that avoids potential legal challenges.
After it was written, Bull said, NCOSE circulated it to various interested parties across the country, and it eventually found a home in Utah.
“We gave it to some constituents in Utah who took it to their lawmakers, and the lawmakers liked it,” Bull said.
Since the bill passed in Utah, Bull and McKenna said, various interested parties have contacted them with the goal of getting the bill passed in their own states.
“I mean, almost daily, constituents, legislators. ‘What can we do? We are desperate. Do you have a model bill? Can you help us?’ And we said, ‘actually, we do,’” Bull said.
Erin Walker, director of public policy for Project STAND, a Montana child safety organization, said she learned about the bill through a presentation McKenna gave to a child safety coalition. She said she contacted McKenna, who helped her introduce it to Montana lawmakers.
The bill, she noted, was one of a series of Montana laws targeting pornography.
“In 2017, we passed HB 247, which establishes that showing sexually explicit material to a child constitutes sexual abuse. And then in 2019, we passed a resolution declaring pornography a public health danger in the state of Montana,” she said.
The bill is also part of a wave of laws across the country aimed at regulating Big Tech.
“I think it’s just that Big Tech doesn’t want to be regulated,” Walker said. “We need to convince lawmakers that there is an appropriate amount of regulation in every industry.”
Proponents of the bills say signing them into law would require little extra effort from tech companies, arguing that new filters would not need to be built and that more onerous measures, such as age verification, would not be necessary under the way most of the bills are written.
But state-to-state language differences have created questions for watchdogs about what hurdles manufacturers might have to jump to meet each state’s requirements.
The wording of the Montana bill, for example, seems to suggest that age verification would be required for manufacturers to avoid potential liability or lawsuits. According to the bill, a manufacturer is liable if “the manufacturer knowingly or recklessly provides the access code to a minor.”
Samir Jain, vice president of policy at the Center for Democracy and Technology, said the inclusion of such language raises issues regarding user privacy and data protection. In theory, manufacturers could be forced to collect age data from customers via government IDs or other forms of identification.
“There’s no restriction as such on how providers can then use that data for other purposes. So even the kind of age verification aspect of that I think creates both burdens and raises privacy issues,” Jain said.
Jain also noted that filter bills create concerns about free speech.
“I think we have to recognize that filters like these, with current technology, are far from perfect. They can’t distinguish, you know, for example, nudity that’s lustful or sexual in nature from nudity for artistic or other purposes, which the bills seek to at least exempt from regulation,” Jain said. “So any requirements placed on filters will necessarily mean that those filters will pick up many images and other content that even the authors of the bills would say shouldn’t be restricted.”
Jain said he believes that given the nuances of what is acceptable for children of different ages, filters should be deliberately set and applied by parents themselves.
“What’s appropriate or useful for a teenager versus a 6-year-old is quite different,” he said. “That’s why I think providing different kinds of tools and capabilities that can then be adapted, depending on the circumstances, makes a lot more sense than some kind of crude mandatory filtering.”
NBC News