
Youth Online Safety: App Stores As Gatekeepers?


As Congress looks for ways to reduce online harms for children and teens, discussion often centers on access to social media sites. However, some lawmakers now want to start that conversation not with the social media platforms themselves, but with the app stores and operating systems that distribute them.


This debate came into focus at last week’s House Energy & Commerce Subcommittee hearing, where members examined two very different app-store proposals: the App Store Accountability Act (H.R. 3149) and the Parents Over Platforms Act (H.R. TBD). Both would push app stores and operating systems to play a central role in policing how minors access apps. In the process, both would fundamentally affect youth and adult privacy, reshape the role of parents in their children’s digital access, and potentially restrict access to sensitive resources. As explained below, however, the two bills diverge on how that gatekeeping should and would play out.


---


The App Store Accountability Act (Centralized Control & Hard Age Gates)

The App Store Accountability Act (H.R. 3149) would place app stores in a direct custodial role over minors’ digital lives. Under the bill, large app stores would be required to determine each user’s age category, link anyone under 18 to a verified parent or guardian, and require parental consent before a minor could download an app or make in-app purchases. App stores would also have to transmit an “age signal” and consent status to every app developer, indicating whether a given user is a minor or an adult; developers would then be expected to tailor their apps accordingly. This model largely shifts responsibility for age verification from individual apps to app stores, and has attracted explicit support from some major platforms, including Meta and Grindr.

For LGBTQ+ youth, the trade-offs are significant. Any system that requires app stores to verify age and subject minors’ accounts to a parental veto risks forcing young people to choose between privacy and access. A teenager in an unsupportive household could see their ability to download health resources, crisis-support tools, or LGBTQ+ community apps effectively gated or blocked by a parent who disapproves of their identity. Indeed, the mere requirement that every app be parent-approved would likely chill engagement from questioning youth or youth in unsupportive households. Depending on how age assurance is implemented, these requirements may also demand the collection of sensitive identity data, creating new risks of outing or data misuse.


---

The Parents Over Platforms Act (Shared Age Signals & Commercially Reasonable Assurance)

The Parents Over Platforms Act (H.R. TBD) takes a related but distinct approach. Like the App Store Accountability Act, it starts with the premise that app stores and operating systems are best-positioned to improve youth protections. But rather than requiring a hefty consent process for every app, this Act relies on “commercially reasonable” age-assurance methods and shared signals.

Under this legislation, app-distribution platforms would be responsible for generating age signals and, with user consent, sharing those signals with app developers. Developers of covered applications would be required to honor these signals, differentiate experiences for adults and minors, and avoid personalized advertising for minors. In exchange, app stores and operating system providers would receive a liability shield so long as they make good-faith efforts to comply.

The Parents Over Platforms Act looks to treat app stores as infrastructure that can standardize age signals and parental tools without micromanaging every download in the process. A more flexible, commercially reasonable standard could, in theory, allow for age-assurance methods that protect anonymity and minimize data collection. However, current commercial age-assurance methods leave open serious questions about whether they can actually protect privacy and anonymity to a satisfactory degree. The Act also leaves more room for developers and advocates to push for youth-centric designs within the constraints of a shared signaling system. That said, many of the same questions apply: Which age-assurance methods will be considered commercially reasonable? How easily can a minor in an unsafe home access critical resources if a parent chooses to block categories of apps? And how will marginalized youth be represented in this governance?

---

Where Congress lands on this issue will have real consequences for LGBTQ+ youth and adult users alike. As these bills potentially move forward, it will be essential to ensure that any app-store obligations are grounded in strong privacy protections, minimally invasive age-assurance methods, and meaningful consultation with the communities whose lives are most affected by what gets approved or blocked at the app-store level.


