For the first time in over two decades, Congress may approve major legislation focused on protecting children online. This comes amid increasing state action on children’s online safety, including at least 50 enacted bills on children and social media that tend to impose proscriptive rules rather than respect the rights of minors.
After Pride celebrations throughout the month of June, this policy environment threatens to curb access to resources and online communities for LGBTQ+ youth. One prominent legislative proposal, the Kids Online Safety Act (KOSA), could harm the very population of LGBTQ+ youth that supporters argue the bill protects. The Senate passed the bill last year as part of a broader package that included an update to the Children’s Online Privacy Protection Act (COPPA 2.0), and with then-President Joe Biden’s support, it seemed likely to pass until House Speaker Mike Johnson (R-La.) stalled progress in the House.
The bill was reintroduced in May with renegotiated text that responds to critiques from civil rights and LGBTQ+ advocacy groups by clarifying that state attorneys general and the Federal Trade Commission (FTC) do not have the power to bring lawsuits over content or speech. X, Microsoft, and Apple have endorsed the bill alongside various advocacy groups.
Despite these revisions, one of the bill’s sponsors, Sen. Marsha Blackburn (R-Tenn.), told an interviewer that KOSA could “[protect] minor children from the transgender in this culture” and give parents the right to sue social media platforms that exposed children to “transgender content.” Her legislative director later said that “KOSA will not — nor was it designed to — target or censor any individual or community.” This shift in focus, from protecting minors from addictive content to shielding them from material steeped in ideological differences, is a more recent injection into the broader debate.
In response to these concerns, lawmakers amended the original text to clarify that it does not preempt Section 230 of the Communications Decency Act, which affords liability protections to platforms that host user-generated content. Additionally, an earlier version of KOSA defined potential harms by reference to any condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM), which includes gender dysphoria. Among other changes, the bill’s authors also removed the word “grooming,” a term that has historically appeared in anti-human-trafficking bills but has also been used to attack LGBTQ+ spaces online.
However, LGBTQ+ advocates still oppose the bill, worrying that it opens the door for the FTC to censor posts on LGBTQ+ identity or expression or to sue platforms that surface these posts in search results. At least one LGBTQ+ advocacy organization, GLAAD, which was previously “neutral” on the bill, has changed its position to oppose the text in light of the new FTC leadership. The broad categories of harm—which include “compulsive usage,” “gambling,” and “sexual exploitation”—could be misinterpreted by enforcement agencies to cover LGBTQ+ content. Additionally, while the bill does not explicitly regulate content, it covers broadly worded “design features” that encourage kids to stay on the platform, including any “personalized design feature,” which refers to any partially or fully automated system based on users’ personal data. While search bars are exempted under the bill to ensure teens can access accurate search results, this provision is contradicted by other language in the bill, including oversight of algorithmic ranking systems. Censoring this potentially sensitive information prevents children from seeking help and information in situations where they may need harm-reduction advice on eating disorders, mental health struggles, drug abuse, or other struggles predominantly affecting LGBTQ+ youth. Such exclusion of the positive and prideful aspects of LGBTQ+ content comes close to infringing on free speech and expression.
Unintended consequences of online safety laws
This is not the first time Congress has legislated on children’s safety online. In 1998, Congress passed the Children’s Online Privacy Protection Act (COPPA). While COPPA gives the FTC a tool to protect children online, it remains unclear whether the law has achieved its objectives. Research from the University of California, Berkeley, found that about 57% of child-directed apps studied were potentially violating COPPA. In an enforcement action, the FTC charged Amazon with violating COPPA by retaining children’s data collected through Alexa. Such violations are common because COPPA’s language is broad and ambiguous and its financial penalties are largely negligible for large companies. New proposals, like KOSA, do not seem to solve these problems.
Additionally, many children can easily circumvent protections by lying about their age or using virtual private networks (VPNs). In fact, searches for VPNs doubled in the days after Utah required age-verification measures on pornography sites. Research found that millions of underage users still signed up for Facebook, despite COPPA requirements against young users joining the site. New rules cannot resolve these longstanding issues without introducing further privacy concerns through age-verification requirements.
Two other laws that moderated online content—the Allow States and Victims to Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act, commonly known as FOSTA-SESTA—could offer lessons for upcoming social media legislation. While the bills sought to combat a similarly egregious harm to children online, sex trafficking, their restrictions on speech disproportionately impacted marginalized communities, especially LGBTQ+ sex workers.
After the bills were enacted, companies cracked down on online spaces that included discussions of sexuality, whether sex trafficking was common there or not. Reddit closed subreddits where sex workers discussed safety measures. Craigslist removed its personal ads, a space that gave LGBTQ+ users an opportunity to connect and meet others in the community. Tumblr banned “adult content,” a policy that ended up censoring art, LGBTQ+ educational content, and a whole host of photos mistakenly caught in the filter, including boxes, tires, and socks.
One of the most pernicious impacts of these laws is not necessarily the government censorship itself, but the “self-censorship” that users and platforms engage in to avoid any scrutiny. Sex workers have tried to evade moderation policies by intentionally misspelling words or limiting the information they share with the community. Platforms that once provided safety measures for sex workers closed their discussion boards, and companies changed their terms of service to limit adult content creators’ reach. Platforms have also “shadow banned” users who use terms like “queer” or “trans.” In addition to Meta’s restrictions on hashtags, TikTok representatives acknowledged that the company restricts LGBTQ+-related hashtags in certain countries to comply with local laws. This conflation of gender and sexual identity discussed in informational contexts under a broad umbrella of “sexual content” demonstrates how platforms may interpret laws like KOSA.
This offers a preview of the potential adverse impacts of future child safety legislation, such as the elimination of safe spaces for children, who will likely continue to access the internet regardless.
States are going further to restrict LGBTQ+ information
Meanwhile, states are going further than the proposed federal bills. Florida, Arkansas, Texas, Nebraska, Louisiana, and others have all passed bills aimed at protecting children online that are more extreme than the proposals at the federal level. Even socially liberal states like California have passed bills such as the Age-Appropriate Design Code, which LGBTQ+ advocates worry could harm LGBTQ+ children by banning discussions of LGBTQ+ public events or books that address gender and sexuality at early ages.
Many state bills aim to give parents greater control over their children’s internet access, but they often assume that parents both know what is best and will set appropriate limits. They also presume that children are “out” to their families, even though over half of LGBTQ+ youth experience some form of parental rejection at home. LGBTQ+ youth are 120% more likely to experience homelessness, and 20% experience homelessness before the age of 18.
Some of the more conservative states, like Texas, Oklahoma, and Alabama, are already looking at ways to restrict LGBTQ+ youth access to informational resources online. But introducing bills at the federal level in the current anti-LGBTQ+ climate could embolden local lawmakers to go even further or to weaponize enforcement mechanisms.
The LGBTQ+ community faces unprecedented threats
These legislative efforts come at a time when the LGBTQ+ community is facing unprecedented legislative attacks. President Donald Trump and the Republican-majority Congress have made critical remarks against LGBTQ+ Americans, and executive branch agencies are removing references to transgender people from several government websites. Over 470 anti-LGBTQ+ state bills aimed at curbing LGBTQ+ rights remain under consideration in 2025. As legislative and enforcement bodies have targeted LGBTQ+ populations, new proposals aimed at protecting children and teens online may negatively impact the LGBTQ+ youth they’re intended to protect.
Online resources can significantly improve the mental health and well-being of LGBTQ+ youth, especially those who lack access to targeted support or safety in their offline environments. While 44% of LGBTQ+ youth feel “very safe” online, only 9% report the same feeling for in-person spaces. On the internet, LGBTQ+ youth can find community with peers, support for difficult situations, and access to accurate health information. LGBTQ+ youth are five times more likely than their non-LGBTQ+ peers to search for information about sexuality online, and they are also more likely to seek out health or medical information—rates that are even higher among transgender youth. Additionally, LGBTQ+ youth spend an average of 45 minutes more online per day than non-LGBTQ+ youth. The internet serves as a valuable resource, with nearly three-fourths of LGBTQ+ youth reporting they are more honest online, and over half of closeted youth using it to safely connect with their peers.
In the past year, 39% of LGBTQ+ youth seriously contemplated suicide, according to The Trevor Project; that figure jumps to 46% for transgender and nonbinary youth. Online access to mental health and supportive resources is not just important for youth well-being but can also be lifesaving for children’s physical safety.
Despite this, policymakers and social media platforms have threatened to curb access to these same resources that are protecting LGBTQ+ youth health and well-being. A leaked internal budget document from the Department of Health and Human Services (HHS) reveals plans to cut funding for LGBTQ+ youth resources on the national suicide hotline by October. The service for LGBTQ+ youth has responded to nearly 1.3 million calls and messages since its launch in 2022.
For months in late 2024, Meta blocked teen accounts from accessing search results that included LGBTQ+-related hashtags, including #gay, #lesbian, #trans, and #queer, flagging the results as “sexually explicit content.” While these restrictions were reversed after journalists reached out to the company, the algorithmic suppression of LGBTQ+ content online can make it more difficult for LGBTQ+ youth to find these resources and community online. These restrictions can seem especially targeted as algorithms block LGBTQ+ content while still allowing graphic and violent content to appear on teen accounts without similar restrictions.
What could an inclusive children’s online safety law look like?
As we reflect on the conclusion of Pride Month across the U.S., the difficult question remains: How can lawmakers protect children online without endangering identity-affirming spaces for LGBTQ+ youth? Solutions such as creating opt-in digital environments designed with youth safety in mind—rather than relying on blunt content filters—and enacting stronger protections against addictive game mechanics could help give children and teens greater agency over their online safety.
Many concerns around these bills hinge on the potential for censorship through content-related restrictions (even around the promotion or demotion of certain content by algorithms, rather than the content itself). Lawmakers should take care to narrow definitions of harm to avoid unintended consequences for online content.
Ultimately, any policy that regulates content must anticipate how it might be interpreted by courts, platforms, and enforcement agencies. Risk-averse application of ambiguous language can lead to de facto censorship and withdrawal of services, as seen when Planned Parenthood reduced offerings in states where abortion remained legal, citing legal ambiguity and litigation costs.
Instead of writing around LGBTQ+ rights, lawmakers should proactively include explicit protections for communities that may be harmed by new legislation. They should clearly state the aim of these bills—such as being “content neutral”—to prevent courts from misinterpreting their original intent. The challenges LGBTQ+ youth face are often shared by other vulnerable young people seeking support, not condemnation, online.
If the goal is to create a safer internet for all young people, including LGBTQ+ youth, then policies must protect—not erase—the spaces where they learn, connect, and express themselves.