Key findings:
- Meta uses an array of influence tactics to try to shape the public narrative around kids and social media. The goal is to counter growing concerns that sites like Instagram can be harmful to teens’ health and safety.
- Meta funds a collection of parent and child safety groups, including the National PTA, that go to bat for its initiatives involving kids. This gives a sheen of expert approval to the company’s efforts to keep young users engaged on its platforms.
- Meta has also created something called the Trust, Transparency & Control Labs that publishes reports in support of its kid-focused products. Meta has at times framed these “labs” as a separate organization to regulators and others.
- Meanwhile, Meta has funded an array of academic research projects that foster a more benign view of Instagram, helping to support the company’s contention that academic research is inconclusive on the topic of social media’s impact.
When Meta introduced a package of new and existing safeguards for teen accounts on Instagram last September, critics in Congress and the child advocacy world slammed it as an attempt to sap momentum from kids’ online safety legislation. But Meta’s announcement of “Instagram Teen Accounts” included a glowing quote from Yvonne Johnson, president of the National PTA, the parent-teacher association with more than 20,000 local branches nationwide.
“[O]ur association applauds Meta for launching Instagram Teen Accounts,” said Johnson, who also appeared at Meta’s launch event for the safeguards in New York. Meta, she said, is “taking steps to empower parents and deliver safer, more age-appropriate experiences on the platform.”
Not mentioned in the announcement: Meta is a “national sponsor” of the National PTA, making annual investments of undisclosed size in the organization.
Meta’s support for child advocacy groups that go to bat for the company in public relations campaigns is just one part of a sophisticated operation the company uses to try to shape the narrative around kids and social media. Once a low-profile topic, the issue of how social media affects children and their mental health has taken center stage in Washington and in state houses across the country, posing an existential threat to Meta’s business and its effort to retain young users.
A new investigation by the Tech Transparency Project sheds light on the extensive and often opaque apparatus that Meta uses to counter growing concerns that sites like Instagram can be harmful to kids’ health and safety. In addition to funding a network of child safety groups, the company created a "Trust, Transparency & Control Labs" unit that publishes reports in support of its kid-focused products. Meta has also funded academic papers that find positive use cases for Instagram, helping to support the company’s contention that research is inconclusive about whether social media is harmful.
These tactics all appear to have one goal: Stopping or slowing efforts to regulate social media in ways that could hurt Meta’s bottom line.
Meta did not respond to questions about how much funding it provides to the National PTA and other parent and child safety groups. In a statement, Meta spokeswoman Liza Crenshaw said the company is "proud to work with entities that support parents in navigating online safety."
She added: "All the entities clearly disclose on their websites that they receive support from Meta, and Meta discloses its partnership with these entities on its Family Center and Help Center, for example. Meta's logo is prominently displayed across TTC Labs website, and in reports it produces."
Paid allies
Meta and its CEO Mark Zuckerberg have been preoccupied for years with holding the attention of teens, who represent a highly valuable demographic for advertisers. But that push has run headlong into growing revelations that Facebook has known about—and ignored—the harms its products can cause young people.
Some of the most damaging information has come from inside Meta itself, from whistleblowers who have shared internal company documents with the media and Congress and spoken about what they encountered behind the scenes. Chief among these is Frances Haugen, the former Facebook product manager whose trove of internal company documents revealed, among other things, that researchers at Instagram found the app worsened body image issues for one in three teen girls. Haugen was followed by whistleblowers Arturo Bejar and Sarah Wynn-Williams, who made further allegations that Meta failed to protect teens.
A flood of litigation has also produced insights into how Zuckerberg and other Meta executives allegedly ignored pleas by employees for greater child safety measures. Last year, the U.S. surgeon general called for a warning label on social media platforms stating that they may be harmful to adolescents, and lawmakers in Congress and multiple states have moved to pass laws regulating use of social media by children. At a Senate hearing in January 2024, Zuckerberg apologized to parents who say social media contributed to their children’s suicides or drug overdoses. Meanwhile, films and shows like The Social Dilemma and Adolescence have shined a spotlight on the potential dangers of social media use to people and society at large.
Amid this cascading series of public relations hits, Meta has leaned heavily on a collection of parent and child safety groups that it directly supports. Meta often features these groups in high-profile campaigns related to children and in communications with government regulators, giving a sheen of expert approval to its initiatives.
The National PTA has been a key player in Meta’s efforts to paint a rosy picture of its approach to kids—an approach that relies heavily on parents to police their children’s social media use. In addition to taking part in the rollout of Instagram Teen Accounts, the PTA collaborated with Meta on a parent’s guide to Instagram that touts the company's teen safeguards and held a series of Meta-sponsored community events across the country to promote them.
The PTA continues to push the Instagram Teen Accounts despite reporting by The Washington Post that the safeguards failed to shield teens from content related to sex, alcohol, and drugs. More broadly, the PTA’s close relationship with Meta appears to be out of step with hundreds of school districts around the country, which are pursuing lawsuits against Meta and other tech platforms over design features they say addict students and harm their mental health. Meta denies wrongdoing and is seeking to have the cases dismissed.
National PTA President Yvonne Johnson supplied a positive quote for Meta's rollout of its "Instagram Teen Accounts." The announcement did not mention that Meta is a corporate sponsor of the group.
In 2017, Meta invoked the National PTA to provide air cover during the rollout of “Messenger Kids,” its messaging app for kids under the age of 13. Meta said it saw a need for such a product after consulting with the PTA and other groups. Messenger Kids soon came under intense criticism from public health advocates, who warned the app promoted excessive screen use and threatened to “undermine children’s healthy development.” The Federal Trade Commission later accused Meta of misleading parents about their ability to control who their children communicate with on the Messenger Kids app. The agency said the company’s actions violated a 2020 privacy order and proposed to bar Meta from profiting from data on any users under 18. Meta has contested the proposed change and is also challenging the constitutionality of the FTC’s overall structure and process.
The National PTA has listed Facebook as a corporate sponsor since 2018, archived webpages show. That year, the company sponsored a PTA initiative to educate families about “digital safety tools and resources.” But the relationship extends back at least 15 years. In 2010, Facebook and the National PTA announced a partnership that involved “an in-kind Facebook commitment equivalent to $1 million.”
National PTA did not respond to questions about the total amount of funding and payments it has received from Meta, but the group provided a statement that read, in part, "Our approach is to seek a seat at the table anywhere decisions are being made about kids, and to be a strong, clear voice for parents and children."
The statement continued, "Our association does not endorse any social media platform, and we are clear about this in our communications. We are not seeking to promote the increased use of any particular app or platform, but we do believe that if families are going to allow their teens to be on a particular app or platform, then it is crucial for everyone in the household to know how to safely navigate that platform."
Another paid Meta ally is ConnectSafely, a Silicon Valley nonprofit devoted to online safety, privacy, and “digital wellness.” In an essay that accompanied the launch of Messenger Kids, ConnectSafely CEO Larry Magid wrote, “I think of Messenger Kids as training wheels for social media and messaging. It’s also like a sandbox. Kids can injure themselves in sandboxes, but they’re a lot safer than playing on the street.” Later, after the messaging app encountered blowback from public health advocates, he defended Meta’s effort in an op-ed, saying he sees products like Messenger Kids “more as part of the solution rather than as part of the problem.”
Magid has stated that ConnectSafely advises Facebook and receives support from the company, though it is not clear how much funding the group has received. Meta disclosures show it has supported ConnectSafely since at least 2017. During the rollout of Messenger Kids, Meta said ConnectSafely receives an honorarium as a member of the company’s Safety Advisory Council.
ConnectSafely has been a prominent partner on other child-related moves by Meta. In 2023, when Meta officially welcomed teens aged 13 to 17 into Horizon Worlds, its virtual reality-based social network, the company touted safety features developed with input from ConnectSafely and other groups. (Meta later lowered the age requirement for Horizon Worlds to 10 years.) As part of this rollout, ConnectSafely produced a parent’s guide to the V.R. product that is featured prominently in Meta’s “family center.”
While Meta offers parental control features for Horizon Worlds and says it keeps minors out of adult spaces in the so-called metaverse, the company’s efforts to pitch its virtual reality product to younger and younger users have drawn concern from child safety advocates who fear the psychological impact on kids. Media reports suggest children have been active for years in Horizon Worlds—well before Meta officially opened the metaverse to under-18 users—where they risk exposure to harassment and sexual content. A Meta whistleblower, Kelly Stonelake, alleged that it was widely known at the company that underage children were accessing Horizon Worlds by misrepresenting their ages.
ConnectSafely also created a parent’s guide for Meta’s AI chatbot, which rolled out across the company’s platforms in April 2024. The parent’s guide states that Meta AI is “available to everyone in the US” and has guidelines that “tell a generative AI model what it can and cannot produce,” suggesting the existence of safeguards for young users. But The Wall Street Journal, during tests of Meta AI starting in January 2025, found that the chatbot engaged in adult sexual role play with Instagram accounts registered to teens as young as 13. According to the Journal, Meta, under pressure from Zuckerberg to make the chatbot less boring, created a carveout for romantic role play in the AI tool. Meta called the newspaper’s testing manipulative but later restricted sexual role play activity for accounts registered to minors, the Journal reported.
ConnectSafely CEO Larry Magid defended Meta's Messenger Kids app after it encountered blowback from more than 100 public health advocates.
Meta has pointed to its work with ConnectSafely in arguing against regulation of social networks. In a 2023 declaration in support of NetChoice, a Big Tech trade group suing the state of Utah over its landmark Social Media Regulation Act, Meta’s global safety chief Antigone Davis talked up the ConnectSafely parent’s guide for Instagram as evidence of Meta’s commitment to teen safety. The declaration said Meta “collaborates” with ConnectSafely but did not mention that Meta provides financial support to the group. A federal judge last fall issued a preliminary injunction blocking the Utah law from taking effect.
ConnectSafely did not respond to questions and a request for comment.
In Australia, where lawmakers have been active in regulating social media, Meta has frequently pointed to its work with an anti-cyberbullying group called PROJECT ROCKIT, which is funded by Meta, to portray itself as committed to child safety.
In feedback submitted to the Australian government on legislation known as the Online Safety Act, Meta talked up its voluntary efforts to protect young users, including donating $1 million to a PROJECT ROCKIT anti-bullying initiative. Shortly after the Online Safety Act came into force in 2022, giving the government new powers to force tech platforms to take down abusive content, Meta said it would provide new funding to PROJECT ROCKIT to consult with young Australians on its plans for the metaverse. PROJECT ROCKIT later produced a report, based on paid interviews with hand-picked consultants aged 18 to 25, that gave a series of broad principles for the metaverse and praised Meta’s “commitment to co-designing the metaverse responsibly.”
In 2023, as Meta officially welcomed teens into its metaverse product, Horizon Worlds, PROJECT ROCKIT published a Metaverse Youth Safety Guide. The guide, which indicates it was “supported by Meta,” gives a high-level briefing on virtual reality and Horizon Worlds’ control settings, framing the metaverse in an upbeat tone. (“It’s not just a place to consume info, but a place to connect with others, build relationships, and create content in real-time.”) Months later, the Australian government’s eSafety commissioner, citing the results of a public survey it conducted, said “a significant proportion of metaverse users may encounter harmful experiences,” including receiving unwanted or inappropriate messages. These experiences, the report said, “can have negative impacts” on people’s health and well-being.
Like the National PTA president, PROJECT ROCKIT CEO Lucy Thomas supplied a positive quote for Meta's September 2024 announcement of Instagram Teen Accounts, saying it "ensures that young people can engage meaningfully and safely, fostering positive connections while still providing the protection they need.” Meta's announcement did not disclose that it financially supports PROJECT ROCKIT.
It is not clear how much total funding PROJECT ROCKIT has received from Meta. The company included PROJECT ROCKIT’s metaverse youth roundtables in a list of global research projects that received a combined $50 million. In September 2024, Meta told Australian lawmakers that it had supported PROJECT ROCKIT’s “Digital Ambassadors” anti-bullying initiative for more than a decade and had worked with the group to produce a series of educational videos about the sharing of intimate images.
In November 2024, after Australia passed the world’s first social media ban for children under 16—in a legislative process that Meta called “rushed”—Thomas criticized the law in a television interview, saying “many of us are concerned that it may actually make online life unsafe for young people.” She did not disclose in the interview that her group is funded by Meta.
In a statement, Thomas declined to disclose the amount of funding PROJECT ROCKIT has received from Meta, but said her group "will only enter into partnerships that include our mandatory independence clause, which safeguards our neutrality and guarantees we can openly evaluate and critique platforms without constraint."
"By engaging directly with platforms, we continue to challenge the tech industry to step up and better protect young people," she said. Thomas added that her group's position on Australia's social media ban for under-16s was shared by numerous academics and experts and is "entirely independent of any partnership or advisory role."
PROJECT ROCKIT held Meta-funded consultations with young people on the metaverse and produced a youth safety guide to Horizon Worlds.
While these Meta-funded groups are often aligned with the company on policy matters, some of them diverged from Meta on its January 2025 decision to end third-party fact-checking and reduce content moderation. PROJECT ROCKIT and ConnectSafely joined with other members of Meta’s Safety Advisory Council in raising concerns about the move, saying it had “profound implications that warrant careful scrutiny.” In a separate statement, PROJECT ROCKIT said it was not consulted on Meta’s actions and said the “lack of input” from experts sets a “troubling precedent for how safety decisions are weighed in major policy decisions.”
Reports and studies
To support its efforts involving kids, Meta has repeatedly pointed to something called the Trust, Transparency & Control (TTC) Labs. Meta launched this initiative in 2017, describing it as a collaboration on privacy design features with children, parents, industry, government, civil society, and academia. It has produced reports detailing the workshops and consultations that informed Meta’s development of child-focused products. While TTC Labs is clearly a Meta creation, Meta has at times left the impression that it is a separate entity.
During Meta’s introduction of Instagram Teen Accounts, TTC Labs published a report touting the “research and consultation” that went into the development of the safeguards. According to the report, TTC Labs “has consulted with more than 600 stakeholders, 300 teens and 270 parents from more than 35 countries to inform a number of the safety and privacy features of Meta technologies.” It added, “These consultations help us develop age-appropriate experiences for teens that preserve their access to online connection and community.”
After Meta opened Horizon Worlds to teens, TTC Labs published a report about how child and parent workshops shaped Meta’s approach to kids in the metaverse. The report described how Meta held workshops in the U.S., UK, Ireland, and Australia with 36 children aged 10 to 13 and 36 parents, observed in some cases by third-party organizations. One of the key “learnings” from these workshops, the report said, was that parents want “tools and education to understand their children’s digital experiences”—an idea consistent with Meta’s strategy of pushing responsibility for policing kids’ social media onto parents.
TTC Labs also published a report about the research that went into development of the Messenger Kids app. The report at times reads like marketing material, with lines like:
The overarching principle for Messenger Kids is that parents are informed and in control of their children’s online social experiences, but the app is flexible enough to allow parents to create more autonomous and private experiences when they feel their children are ready for it.
TTC Labs’ website and reports are branded with Meta’s logo, and its publications are clearly designed to support Meta’s products. One of its leaders is a Meta executive. But it also positions itself as a neutral source of information, stating that it is a “non-commercial initiative that is not intended for or directed towards commercial advantage or monetary compensation.”
Meta often points to the work of its Trust, Transparency & Control Labs when rolling out new products aimed at kids.
At times, Meta has framed TTC Labs as a separate organization in communications with government officials and others, leaving the impression that an independent entity has somehow endorsed its efforts.
For example, in September 2021 feedback submitted to Australia’s eSafety commissioner, Facebook said it had “co-developed” a youth design guide with TTC Labs, without disclosing that TTC Labs is a creation of Meta. That same year, in a response to the Irish Data Protection Commission, Facebook referred to TTC Labs as an "external collaboration." In a November 2022 presentation to the Advertising Research Foundation, a Meta executive referred to TTC Labs studies as being “commissioned” by Meta. In a 2023 human rights report, Meta said it conducted “co-design sessions” through TTC Labs without explaining that it owns the unit.
Meta has also funded an array of academic research projects that focus on positive use cases for Instagram, one of the most popular social networks for young people. This helps the company to support its argument that research is inconclusive on whether social media is harmful to people’s well-being.
TTP identified three rounds of Instagram research grants announced between 2018 and 2021. In 2021, the winning projects explored topics like technology-driven mental health interventions, social media as a safe space for transgender people, and the role of Instagram communities in promoting daily fitness activity. One of the projects, by researchers at the University of New South Wales in Australia, looked at how Instagram can deliver safety information to people taking selfies in dangerous locations like a cliff’s edge. The winners, who offered upbeat quotes in Meta’s press release announcing the awards, received grants of up to $50,000.
For the 2019 round, Meta did not describe the six winning projects, but TTP identified two Instagram-funded studies from the researchers who were named as grant recipients. One of them, a dissertation from a PhD student at Cornell University, looked at how Instagram can reduce stigma for family members of incarcerated people. The other was an academic talk from a professor at the University of Michigan exploring how alternative theories of criminal justice could help social media sites support victims of online harassment.
The three winners in 2018 included studies looking at how social media platforms can enable women entrepreneurs and how the isolation of space affects astronauts’ social media use. The other winner was a professor at Northumbria University in the UK, who examined issues involving Instagram and well-being. TTP could not find a published study with the title listed in Meta’s announcement. But the same professor coauthored a 2024 study, funded by Facebook Research, that found “users versus non-users of Instagram did not significantly differ in their levels of anxiety, depression or loneliness.”
Most of the academics who conducted these research projects either declined to comment or did not respond to a request for comment. One of the academics, Celeste Campos-Castillo, said she is "unaware of the motives or strategies of Meta regarding its use of research funding," and said that Meta had no role in the design of her study or "communication of its findings."
TTP did not analyze the methodologies of these studies and makes no judgment on their conclusions. But taken together, these projects provide Meta with useful examples to support its argument that research on the impact of social media is not conclusive—an argument it has made repeatedly over the years to push back on criticism that its platforms can have a harmful impact on young users.
In comments to a UK Parliament inquiry about social media and children in 2018, Meta said “the evidence is not conclusive, and the claim that social media is detrimental for young people's health is not universally substantiated by existing research.” Asked at a March 2021 House committee hearing about whether too much time in front of screens is harmful to kids’ mental health, Zuckerberg responded, “I don't think that the research is conclusive on that.” He added that “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits and well-being benefits by helping people feel more connected and less lonely.”
As recently as April 2025, Zuckerberg told podcaster Theo Von, “My understanding of the current state of the research is that there isn’t kind of a conclusive finding that this is negative for people’s well-being."
In a 2021 appearance before Congress, Meta CEO Mark Zuckerberg suggested the research on kids and screen time is not conclusive, adding, "the research that we have seen is that using social apps to connect with other people can have positive mental health benefits."
Not mentioned in those statements: the fact that Meta’s own in-house research found that Instagram has harmful psychological effects on some of its users, including teen girls, as revealed in documents leaked by Facebook whistleblower Frances Haugen. Meta and Zuckerberg have challenged media interpretation of those documents, which have been widely aired in public and in Congress.
Conclusion
As Meta has come under growing pressure over its impact on kids and their well-being, the company has responded with a range of tactics to influence the public debate. These include cultivating a network of paid child advocacy groups, using a social research “lab” to publish reports in support of its products, and funding academic research that highlights positive use cases for Instagram. Such strategies help the company build a counter-narrative to the idea that its platforms are harming young users—and fend off attempts to regulate its platforms.