The biggest internet companies frequently tout their commitment to child safety. Google talks about “giving kids and teens a safer experience online,” Amazon says it takes its “responsibility for the protection of children extremely seriously,” and TikTok promises “age-appropriate privacy settings and controls” for young people.
But the companies’ leading trade groups are sending a different message: They’re trying to kill legislation in multiple states designed to make kids safer online.
One strategy is litigation, with the tech-funded group NetChoice suing to block a landmark California child safety law before it takes effect in 2024. Another is lobbying, with tech companies and their trade groups scrambling to stop a slew of other states from advancing child safety bills, many of them based on the California model and another law passed in Utah. These bills could force online companies to make costly changes to their platforms.
A Tech Transparency Project (TTP) review sheds light on the tech industry’s campaign, which includes providing support to NetChoice’s California suit, fanning fears of litigation in other states, and arguing the child safety legislation is bad for families and small businesses.
The tactics give some insight into Big Tech’s evolving game plan when it comes to fighting regulatory threats, as courts and state legislatures take the lead in setting tech policy in the absence (so far) of any federal action on the issue.
Legislative State of Play
The California Age-Appropriate Design Code Act —signed by California Gov. Gavin Newsom (D) and set to take effect in July 2024—will require companies that provide online products or services that are “likely to be accessed” by kids under 18 to adopt a variety of safety measures. It was modeled on a U.K. law that came into force in 2020.
The California law would, among other things, require tech platforms to turn on the highest privacy settings by default for kids; proactively consider how the design of products could endanger minors; alert minors when their location is being monitored; and prohibit the use of “dark patterns” that trick minors into giving up personal information.
NetChoice sued over the law in December 2022, saying it violates the First Amendment "by telling sites how to manage constitutionally protected speech." As the legal challenge plays out, lawmakers in at least nine other states have pushed bills that replicate the California measure or contain similar elements.
Here’s a rundown of some of the key legislative activity:
- A group of Minnesota House Democrats introduced their own Age-Appropriate Design Code Act in February 2023, which has drawn opposition from the tech-backed Computer & Communications Industry Association (CCIA) and the Chamber of Progress. The bill was later added to an omnibus commerce bill approved by the Minnesota House. Facebook whistleblower Frances Haugen is among those advocating for the measure.
- Opponents of the Minnesota bill include the Star Tribune newspaper, whose CEO Steve Grove previously served as director of Google News Lab. (The lab is part of Google’s efforts to fund and train local news organizations, efforts that TTP has previously reported tend to track with regulatory threats to the company.) The Star Tribune has argued that the Minnesota bill could “reduce the civic discourse in our state” and “easily bankrupt” the newspaper, and Grove has reportedly been personally lobbying lawmakers on the issue.
- Another California-style bill in Maryland passed the House of Delegates but failed to advance further before the state’s legislative session adjourned. A March hearing featured opposing testimony from NetChoice and three other tech-funded groups, the Chamber of Progress, TechNet, and the State Privacy and Security Coalition.
- A bipartisan trio of New Mexico state senators introduced a design code bill, based on the California measure, on Feb. 2. The bill passed the state Senate Tax, Business and Transportation Committee on an 8-0 vote on Feb. 24. The day before the committee vote, CCIA weighed in with a letter opposing the bill.
- In New York, lawmakers brought back elements of a California-style measure that died during the last session, reintroducing the language in a pair of new bills. Amazon and TikTok hired firms to lobby on the Senate version of the bill.1 TikTok, TechNet, and Tech:NYC—another organization funded by Big Tech—lobbied directly and via outside firms on the bill that died last year.2
- In January 2023, Connecticut state Rep. Gregory Haddad (D) introduced an age-appropriate design code bill, which CCIA and TechNet quickly opposed. A Chamber of Progress representative testified against the measure at a Feb. 28 hearing of the Connecticut General Assembly’s Joint Committee on General Law.
- New Jersey is also considering a pair of bills modeled on California’s design code law. Google and Meta are among the entities that have registered to lobby on the State Assembly version of the bill, which was introduced in December 2022. (An Assembly committee passed a separate bill that would prohibit social media companies from using practices or features that cause children to become addicted to their platforms.)
- Lawmakers in Nevada, Oregon, and Texas have also introduced bills with elements of California’s law, though a status update on the Nevada measure states that “no further action allowed.”
- Back in California, lawmakers have floated new measures that would bar social media from targeting minors with content that is harmful to their physical or mental health, and prohibit platforms from using a design or algorithm that facilitates drug use, suicide, or eating disorders in children.
In Utah, meanwhile, Gov. Spencer Cox (R) signed legislation that goes much further than California’s, requiring social media companies to verify the age of users; requiring parental consent for users under 18 to open a social media account; giving parents access to minor accounts; and blocking minors from accessing accounts from 10:30pm to 6:30am unless authorized by a parent.
Another bill signed by Cox would prohibit social media companies from using a design or feature that causes a minor to become addicted to the platform, and create a so-called private right of action allowing users to sue a social media company over “any addiction, financial, physical, or emotional harm” they suffered as a result of using the service.
A number of states are considering or advancing Utah-like legislation that gives parents control over their children’s social media:
- In Arkansas, Gov. Sarah Huckabee Sanders (R) signed a bill that includes age verification and parental consent requirements for social media. A proposal included in Ohio Gov. Mike DeWine’s (R) 2023-24 executive budget includes similar provisions.
- The Texas House recently passed a bill that would require social media platforms to obtain parental permission for a minor to open a social media account, allow parents to request data from their kids’ social media accounts, and prohibit platforms from promoting content that could cause physical or emotional harm to children. A Meta representative testified against the measure.
- Louisiana state Sen. Patrick McMath (R) also introduced a measure that would require social media companies to verify users’ ages and obtain parental consent before providing an account to a minor, defined as kids under 16. In addition, the bill would give parents access to minor accounts, prohibit minors from direct messaging on social media accounts, block minors from accessing social media between 10:30pm and 6:30am, and establish a “private right of action” for users to sue social media companies. A Wisconsin legislator, state Rep. David Steffen (R), is proposing a similar idea.
- Others in this category: an Iowa House measure that would require social media platforms to obtain parental consent before collecting, using, or disclosing personal information about kids under 18, and a California bill that would allow parents to monitor their children’s social media accounts via third-party software.
In letters and testimony to state lawmakers, Big Tech-funded groups deploy many of the same talking points. They say the bills’ age verification or estimation requirements would force websites to collect more data on children and other users, violating their privacy. They also warn the bills would impede the ability of children and teens who are racial minorities, LGBT, or part of other marginalized groups to find supportive communities and resources online.
But the tech groups frequently insert another theme into their message to lawmakers: suggesting that NetChoice’s litigation in California creates legal uncertainty around the issue of age-appropriate tech regulations—and that states should hold off for that reason.
Testifying at a Feb. 28 hearing on the Connecticut children’s bill, Jess Miers, legal advocacy counsel for the Chamber of Progress, urged lawmakers to “steer away from adopting similar legislation like California’s Age-Appropriate Design Code, which is currently facing a constitutional challenge.” CCIA, in letters to Minnesota and New Mexico lawmakers, used similar language, warning lawmakers that the bills they are considering raise “constitutional concerns.”
In a March 2 letter to Utah Gov. Cox urging him to veto the state’s children online safety bills, NetChoice Vice President and General Counsel Carl Szabo noted his group’s lawsuit in California, adding, “To avoid unnecessary First Amendment litigation, the legislature should at least wait until this lawsuit is resolved to advance HB 311.” Implicit in his message was a warning that NetChoice—or another tech group—could sue over the measure. NetChoice recently launched a “litigation center” for the tech industry to coordinate lawsuits and amicus briefs.
A representative of TechNet, testifying in opposition to Maryland children’s safety legislation on March 8, noted that “the California Age-Appropriate Design Code is currently under litigation for various violations,” adding that “the best way to keep young people safe online is to promote education of safe internet practices.” She did not mention that her fellow witness on the panel—NetChoice’s Szabo—represents the group that is suing California.
Miers, of the Chamber of Progress, issued another litigation warning to Minnesota over that state’s child safety bill. In an April 28 blog post, she wrote that “it seems likely that [Minnesota] AG Keith Ellison will face an expensive legal challenge on his home turf.”
Lobbying in Disguise
Szabo’s testimony at a Maryland legislative hearing illustrates another tactic that Big Tech is using with the children’s safety bills: hiding its agenda behind more sympathetic figures.
During a March 8 appearance before the Maryland Senate Finance Committee, Szabo failed to identify himself as the vice president and general counsel of NetChoice, instead describing himself as a “lifelong Maryland resident” and “parent” whose wife is a child therapist. He told lawmakers he was “nervous” about the bill because it will “really harm my family, this will really harm my kids’ ability to be online, really harm my neighbors.”
When Maryland state Sen. Benjamin Kramer (D), a committee member and one of the sponsors of the measure, called out Szabo for not identifying himself as a representative of Big Tech, Szabo insisted, “I don’t work for Big Tech” and described NetChoice as a “small business” with 11 employees and revenue that is “not that big.” That’s a dubious characterization, given that NetChoice—whose members include Amazon, Google, Meta, and TikTok—had annual revenue of more than $14 million in 2021, according to ProPublica’s Nonprofit Explorer.
Szabo’s reference to small business is part of a pattern in tech industry lobbying. Groups linked to the biggest internet companies often hold up small businesses as the real victims of tech regulation, even rolling out hand-picked small business owners to parrot Big Tech’s talking points to lawmakers and regulators. This creates a more sympathetic set of characters to push the industry’s agenda while making it seem as though Big Tech’s positions—aimed at protecting the industry’s bottom line—have grassroots support.
CCIA has adopted this tactic in multiple letters sent to lawmakers in states considering child safety bills:
Ambiguous and inconsistent regulation at the state level would undermine this business certainty and deter new entrants, harming competition and consumers. This particularly applies to new small businesses that tend to operate with more limited resources and could be constrained by costs associated with compliance. While larger companies may be able to more easily absorb such costs, it could disproportionately prevent new smaller start-ups from entering the market.
Many of the trade groups active in lobbying against the state child safety bills are also rushing to support NetChoice in its litigation to block the California law.
The groups include the Chamber of Progress, which is backed by the major tech companies and headed by a former Google policy executive; the Computer & Communications Industry Association, which also has a Big Tech-heavy member roster; and the Chamber of Commerce, which has received funding from Amazon, Google, and Facebook. All submitted amicus briefs supporting NetChoice’s arguments in the case.
NetChoice also got support from Eric Goldman, a law professor at Santa Clara University. Goldman is a faculty member at the university’s Markkula Center for Applied Ethics, which received funding from Google as part of a cy pres legal settlement involving Google Buzz. He’s taken positions friendly to Big Tech on matters ranging from anti-sex trafficking legislation to the recent Supreme Court case that threatened liability protections for online platforms.
On its website, NetChoice boasts other declarations of support from employees of IMDB and Goodreads, which are both owned by Amazon, as well as Mike Masnick, editor of the Techdirt blog and founder of the Copia Institute, a think tank that counts Google as a corporate supporter. The IMDB and Goodreads employee declarations don’t mention the services are owned by Amazon.
1 https://reports.ethics.ny.gov/publicquery/ — search for bill number “S3281” and year “2023.”
2 https://reports.ethics.ny.gov/publicquery/ — search for bill number “S9563” and year “2022.”