
Youth Online Safety: What The House's 19-Bill Hearing Means for the Road Ahead


This week’s House Energy and Commerce Subcommittee hearing marked a crucial moment in federal youth online safety policymaking. Nineteen bills, spanning algorithmic design, content moderation, age verification, parental monitoring, data privacy, AI governance, messaging tools, and more, were taken up within a single session.


While the legislative slate was expansive, the hearing was also defined by disagreement over the best path forward. Members agreed that not acting on these issues was no longer an option, but beyond a shared desire to improve safety, there was little agreement on any specific approach. This institutional uncertainty matters deeply for marginalized communities, especially LGBTQ+ youth, who have historically depended on online spaces for identity development, safety, and access to affirming content.


A House markup is expected in the near future, as are Senate efforts on youth online safety that both align and conflict with the House package. Below is a brief summary of the bills discussed, with a focus on their impact on marginalized communities, including the LGBTQ+ community.


---


Familiar Faces (KOSA, COPPA 2.0, Sammy’s Law)


Much of the hearing focused on three bills that advocates know well: KOSA, COPPA 2.0, and Sammy’s Law. 


The biggest change to this revised House discussion draft of the Kids Online Safety Act, or KOSA (H.R. TBD), is the removal of its core “duty of care” standard. Instead of imposing a legal obligation on platforms to prevent and mitigate specified harms, the draft leans more heavily on transparency reports, safety audits, and “reasonable” policies to address harmful content and design features. Sponsors argued that these changes are meant to make the bill more resistant to First Amendment challenges. However, many members, including previous supporters of KOSA, held that this change makes the bill “weak” and “ineffectual.” Senator Marsha Blackburn, sponsor of the Senate iteration, stated plainly that the proposed changes “would do nothing to protect children online.”


As the Senate and House wrestle with KOSA, the questions that framed earlier debates have not disappeared: how broadly will harm be defined, who will define it, and critically, will broad language and vague definitions incentivize platforms to suppress content in a political environment where LGBTQ+ expression is already being framed as inherently risky?



The updated Children and Teens’ Online Privacy Protection Act, or COPPA 2.0 (H.R. 6291), also came under scrutiny. The House version reinstates a tiered knowledge standard, applying an “actual knowledge” standard to most companies while holding large social media platforms to an “actual knowledge or willful disregard” standard. Most notably, this draft adopts a sweeping preemption clause preventing state-level action on provisions relating to the Act. This preemption was a point of contention between Republicans and Democrats, with the latter concerned about overriding the many existing state efforts, especially those more protective than federal law.


The Let Parents Choose Protection Act, or Sammy’s Law (H.R. 2657), was another focal point of the day. Broadly, Sammy’s Law would require social media platforms to build and maintain real-time APIs allowing third-party “safety software” to plug directly into a teen’s account. Through that connection, a parent or guardian could grant the monitoring tool access to read messages, view posts, track interactions, and flag a wide range of “harm indicators,” from drugs and suicide to bullying, fraud, and sexual content.


Supporters framed the bill as a targeted response to very real dangers like drug and self-harm content, arguing that it gives parents a tool they urgently need. But members and advocates raised serious concerns about surveillance, data security, and the absence of safeguards for youth in unsafe households. LGBTQ+ youth are disproportionately represented among young people who face rejection, punishment, or abuse at home. For them, private or semi-private online spaces can be lifelines. Legislation that normalizes continuous third-party monitoring, especially without any meaningful protections for minors who may be endangered by such visibility, risks chilling help-seeking behavior and exposing vulnerable youth to real-world harm.



---


App Stores as Gatekeepers (App Store Accountability Act, Parents Over Platforms Act)


Several bills up for discussion sought to shift the onus for preventing youth harm to app stores or operating systems. The App Store Accountability Act (H.R. 3149) would require large app stores to determine each user’s age category, link everyone under 18 to a verified parent or guardian, and secure parental consent before minors can download apps or make in-app purchases. App stores would then have to pass an “age signal” and consent status down to app developers, who would have to tailor their services accordingly.


The Parents Over Platforms Act (H.R. TBD) follows a similar logic but relies on a “commercially reasonable” age-assurance standard. It would require app distribution platforms to generate and share age signals and give parents tools to block minors from acquiring or using specific apps or features. Developers of covered applications would have to honor those signals, differentiate experiences for adults and minors, and avoid personalized advertising to minors. App stores and operating system providers would be covered by a liability shield under this legislation, provided they have made good-faith efforts to comply. Both bills rely on FTC enforcement and federal preemption to set a single national baseline.


For LGBTQ+ youth, moving the choke point to the very top of the ecosystem carries real risks. Requiring app stores to verify age and collect parental consent can introduce profound privacy harms, necessitating sensitive data collection or exposing a young person’s LGBTQ+ status to unsupportive family members and malicious actors.



---


Addressing Messaging Features (Safe Messaging for Kids Act and Safer GAMING Act)


Two of Tuesday’s bills focused specifically on how minors communicate online. The Safe Messaging for Kids Act of 2025 (H.R. 6257) would ban ephemeral messaging, defined as messages that disappear after viewing or within a short period, for users under 17. Platforms would additionally be required to offer extensive parental controls for direct messages, including blocking direct message features from unknown contacts by default, requiring parental notification and approval, and allowing parents to disable messaging entirely. App stores would also need to display warnings when a minor downloads an app with any messaging functionality.


The Safer Guarding of Adolescents from Malicious Interactions on Network Games Act, or the Safer GAMING Act (H.R. TBD), extends a similar approach to online games. It would require game providers to offer safeguards that limit minors’ communications with other users and to set the most restrictive privacy options as the default for anyone under 18, with only parents able to relax those settings.


These proposals rest on the assumption that less private communication and more parental oversight automatically make young people safer. For many LGBTQ+ teens, the reality is more nuanced. Messaging tools, whether in social apps or game chats, are often where queer and transgender youth find their first affirming friends, mentors, or support networks, particularly when offline life is hostile. Systems that require parental sign-off for every new contact, combined with bans on ephemeral messages, risk cutting off those lifelines and exposing deeply personal conversations to family scrutiny. The bills’ failure to distinguish between the tools appropriate for a 12-year-old and those appropriate for a 17-year-old also creates risks for LGBTQ+ teens, whose privacy and access needs change as they grow.


---


Youth Data Protections (SPY Kids Act, Don’t Sell Kids’ Data Act)


Another pair of bills targets the larger data ecosystem behind widely used platforms. The Stop Profiling Youth and Kids (SPY) Act, or SPY Kids Act (H.R. 6273), would bar engagement-driven platforms from conducting “market or product-focused research” on children under 13 and require verifiable parental consent before conducting such research on teenagers (ages 13 to 16). The bill leans heavily on FTC enforcement, paired with extensive federal preemption of overlapping state laws.


Meanwhile, the Don’t Sell Kids’ Data Act (H.R. 6292) focuses on data brokers. The legislation would prohibit brokers from collecting, using, or selling personal data about any youth under 17 when they know or reasonably should know that the individual is a minor. The bill includes strong deletion rights, dual enforcement by the FTC and state attorneys general, and a private right of action that would allow individuals to sue over violations.


During the hearing, Ranking Member Frank Pallone, sponsor of the Don’t Sell Kids’ Data Act, highlighted an important broader structural gap: even if Congress secures strong protections for minors’ data, the United States still lacks comprehensive privacy protections for all. This omission matters, and we’re grateful it’s part of the conversation. Youth-specific rules do little for LGBTQ+ adults, for young people as they age out of coverage, or for closeted users whose identities are inferred from data profiles rather than self-disclosed. Still, it remains true that minimizing data brokerage involving minors could meaningfully reduce risks like the weaponization of sensitive identity information against LGBTQ+ youth.


---


AI and Algorithmic Design (SAFE BOTs Act, Algorithmic Choice and Transparency Act)


Two proposals address AI systems and recommendation engines. The Safeguarding Adolescents From Exploitative Bots Act, or the SAFE BOTs Act (H.R. TBD), focuses on AI chatbots that interact with minors. Broadly, it would require providers to clearly disclose when users are interacting with an AI, prohibit bots from claiming to be licensed professionals, display crisis hotlines when users discuss self-harm, and adopt policies for how bots handle sexual content, gambling, and illegal drugs in conversations with minors. 


Meanwhile, the Algorithmic Choice and Transparency Act (H.R. 6253) purports to provide young users with safer recommendation systems. Platforms would need to explain how and when they use personalized algorithms, provide a non-personalized feed as the default for minors, and offer options to limit or adjust recommendation categories. States would be largely preempted from adopting different algorithm-choice or disclosure standards.

For LGBTQ+ youth, these proposals can be double-edged. Personalized recommendations are often how queer and trans content reaches those who need it most. At the same time, guardrails on chatbot behavior and honest disclosures about AI’s limitations are essential to avoid situations where young people treat a chatbot as a substitute for human mental health care or community support. 


As with the rest of the legislative package, the challenge here lies in how to curb genuinely harmful design practices without starving vulnerable users of trustworthy, affirming information.


---


Extreme Approaches (RESET Act, SCREEN Act)


Two bills stand out as the most far-reaching attempts to re-engineer minors’ access to the internet. The Reducing Exploitative Social Media Exposure for Teens Act, or RESET Act (H.R. TBD), would broadly prohibit anyone under 16 from maintaining an account on a covered platform. Companies would be required to identify known minor accounts, notify those users, terminate their accounts, delete their data, and prevent new accounts from being re-created by users under 16. Enforcement would flow through the FTC, and the bill includes federal preemption that would block states from crafting more tailored approaches.


The Shielding Children’s Retinas from Egregious Exposure on the Net Act, or SCREEN Act (H.R. 1623), focuses on content access. It would require commercial websites that host material deemed harmful to minors (under obscenity-style criteria that reference “patently offensive” sexual content and material lacking “serious value”) to implement age-verification technology and block minors entirely. While framed as an anti-pornography bill, standards like these have historically been used to suppress, among other topics, LGBTQ+ sexual health information, coming-out resources, and queer literature. Because compliance depends on determining who is a minor, the bill also effectively pushes platforms toward invasive age-verification systems.


Both bills drew concern at the hearing as examples of aggressive approaches that could face serious constitutional challenges. They also reveal a conceptual shift: in some corners of Congress, “protecting kids” is now being equated with excluding them from large portions of the internet altogether.


---


Study and Public Education Bills


Finally, a significant cluster of bills under discussion would not directly regulate platform behavior, but instead expand research, reporting, or public education around youth online experiences. The No Fentanyl on Social Media Act (H.R. 6259) directs the FTC, HHS, FDA, and DEA to study how minors access fentanyl through social media and propose policy responses. The Safe Social Media Act (H.R. 6290) calls for a broad study of how minors use social media, how platforms collect and use their data, and how those practices intersect with mental-health outcomes. 


The Promoting a Safe Internet for Minors Act (H.R. 6289) would position the FTC as the lead agency for a nationwide online-safety education campaign focused on social media use. The AI Warnings and Resources for Education Act, or the AWARE Act (H.R. 5360), would task the FTC with creating educational materials about the use of AI chatbots, while the Assessing Safety Tools for Parents and Minors Act (H.R. TBD) would require the FTC to study and report on how well current platform settings and tools (parental controls, age labels, and safety settings) are performing. The Kids Internet Safety Partnership Act, or KISPA (H.R. TBD), would create a new entity within the Department of Commerce to identify online risks and benefits for minors and publish a best-practices playbook.


If these efforts move forward, it will be critical that they are grounded in evidence, shaped with meaningful input from affected communities, and, above all else, focused on empowering youth rather than fueling moral panic.



Where This Leaves LGBTQ+ Youth


These 19 bills run the gamut: from research and education, to design mandates, to app store gatekeeping, to extreme bans on youth accounts or access to content. They also demonstrate how shared concerns about mental health, exploitation, data abuse, and manipulation are being approached through very different and sometimes conflicting legal strategies.


For LGBTQ+ youth, the stakes could not be higher. Research and lived experience consistently show that queer and transgender individuals use digital platforms to explore their identities, find community, navigate crises, and access resources that are often unavailable offline. As these bills move forward, it will be essential for Congress to center the experiences of marginalized communities in order to build frameworks that advance safety without sacrificing privacy, autonomy, or dignity. 


Civil society organizations, including LGBT Tech, stand ready to help lawmakers navigate this complex landscape.
