America Wants to ID You Before You Download an App

Redacto
15 min read

Categories: Age Verification, Digital Rights, Privacy and Security, Tech Policy

Quick Story Summary
  • The App Store Accountability Act (H.R. 3149) would require app stores like Apple and Google to verify the age of every user before they can download or access apps.
  • App stores would have to collect age data, categorize users into age brackets, and share that age signal with app developers so apps can restrict access for minors.
  • Supporters say centralized age verification at the app store level would give parents more control and make it harder for children to access harmful apps.
  • Privacy and digital rights groups warn the system would require collecting sensitive personal data—such as government IDs or biometrics—from millions of users.
  • Critics argue the bill could create security risks, expand data sharing with developers, and restrict access to digital spaces for people without acceptable ID.
  • The proposal is advancing in Congress alongside a Senate companion bill (S.1586) and would likely face immediate constitutional challenges if enacted.
  • H.R. 3149: the federal app store bill
  • Age verification: required for all users
  • S.1586: the Senate companion bill

The App Store Accountability Act (ASAA) is a piece of federal legislation currently advancing through the U.S. House of Representatives. It would require app store operators to verify the age of every user before they can download apps, regardless of what those apps contain. Supporters say it is a necessary step to protect children online. Critics argue it creates serious privacy, security, and free speech risks for everyone.

What Is the App Store Accountability Act?

The App Store Accountability Act (H.R. 3149) would require app store operators, primarily Apple and Google, to verify the age of every person who downloads or accesses apps through their platforms. The goal is to give parents more control over what apps their children can access, and to prevent minors from downloading apps that may expose them to harmful content.

Under the bill, app stores would be required to collect age information from all users, categorize them into age brackets, and share that information with app developers. Apps deemed unsuitable for minors would be restricted accordingly, and parents would need to provide consent before their children could download certain categories of apps.
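The bracketing step the bill envisions can be pictured with a short sketch. This is purely illustrative: the bracket boundaries and labels below are hypothetical stand-ins, not taken from the bill text.

```python
# Hypothetical sketch of the age-bracket signal an app store might compute.
# Bracket boundaries and labels are illustrative, not from the bill.

from datetime import date

AGE_BRACKETS = [
    (0, 12, "child"),
    (13, 15, "young_teen"),
    (16, 17, "older_teen"),
    (18, 200, "adult"),
]

def age_bracket(birth_date: date, today: date) -> str:
    """Map a verified birth date to a coarse age-category label."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    for low, high, label in AGE_BRACKETS:
        if low <= age <= high:
            return label
    raise ValueError("age out of range")

# The store would share only the label, not the birth date, with developers.
signal = age_bracket(date(2010, 6, 1), today=date(2026, 3, 5))
print(signal)  # young_teen (the user is 15 on that date)
```

The privacy question the critics raise is about everything upstream of this function: how the birth date gets verified and stored in the first place, and how widely the resulting label is then shared.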

The bill is part of a broader wave of youth online safety legislation being pushed through Congress and state legislatures across the United States. On March 5, 2026, the House Energy and Commerce Committee took up a wide slate of youth safety bills for markup, including the ASAA alongside other measures targeting social media, gaming, and AI chatbot companions.

Who Is Behind It and Why?

The ASAA has strong support among lawmakers who argue that app stores have for too long operated as unregulated gateways to content that can harm young people. Supporters point to the explosion of social media addiction, exposure to violent or sexual content, and the ease with which children can access apps designed for adult audiences.

Proponents argue that placing the gatekeeping responsibility on app store operators, rather than individual apps or websites, creates a single point of control that is easier to enforce and harder to circumvent. The logic is that if Apple and Google both require age verification, the system becomes harder to route around.

The bill has also been embraced by some parent advocacy groups who feel that existing parental control tools are too fragmented and difficult to use. From their perspective, a mandatory, standardized age signal at the app store level is a straightforward and long-overdue reform.

Who Is Opposing It and Why?

Opposition to the ASAA is broad and cuts across the political spectrum. Privacy advocates, digital rights organizations, legal scholars, and even some tech companies have raised serious concerns.

The Open Technology Institute at New America, one of the most prominent voices against the bill, has described the ASAA’s approach as comparable to requiring every person shopping at a grocery store to provide ID upon entering, regardless of whether they intend to buy chips, fruit, or alcohol. The analogy gets at the core problem: the bill does not target age-restricted content specifically. It targets everyone.

The privacy concerns are not abstract. Age verification at scale means collecting sensitive personal data, potentially including government IDs, credit card numbers, or biometric information, from hundreds of millions of people. That data then has to be stored, transmitted, and shared with app developers, creating multiple points of potential exposure.

Critics also raise free speech concerns. Because the ASAA requires age verification for access to all apps, not just those offering adult or legally restricted content, it could effectively prevent people without acceptable forms of ID from accessing digital spaces they have every right to use. That includes undocumented immigrants, young people in households without typical ID documents, and others on the margins.

The Privacy and Security Risks Are Real

One of the most important arguments against the ASAA is that age verification systems have a documented track record of failure. In July 2025, hackers exposed thousands of selfies and government ID photos from a dating app (Tea) that had used them for identity verification. A few months later, Discord disclosed that tens of thousands of users may have had their government ID photos compromised after submitting them as part of an age-gating process.

These are not edge cases. They are predictable consequences of collecting sensitive identity data at scale. The more widely that data is collected, and the more entities it is shared with, the more likely it is to be breached.

The ASAA’s architecture makes this problem worse, not better. Under the bill, app stores would be required to collect granular age category information and share it with every app developer, regardless of whether a particular app actually needs it. There is no meaningful data minimization built into the core of the design.

Advocates have long argued that any age verification system, if it must exist at all, should use privacy-preserving techniques that allow users to prove their age without revealing their identity. Think of it as the digital equivalent of showing your age on your face rather than handing over a passport. The technology to do this is advancing, but experts say it is not yet mature enough to be deployed at the scale the ASAA would require.
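To make that idea concrete, here is a minimal, deliberately simplified sketch of an identity-free age attestation: a trusted verifier signs a bare "over 18" claim, and a relying app checks the signature without ever seeing a name, ID number, or birth date. This illustrates only the shape of such a scheme, not any real protocol; production systems would use public-key signatures or zero-knowledge proofs rather than the shared HMAC key used here.

```python
# Illustrative sketch only: a trusted verifier issues a signed "over 18"
# attestation that carries no identifying information. Real privacy-preserving
# schemes (e.g. zero-knowledge proofs) are far more sophisticated.

import hashlib
import hmac
import json
import secrets

VERIFIER_KEY = secrets.token_bytes(32)  # held by the verifier

def issue_attestation(is_over_18: bool) -> dict:
    """Sign a bare age claim: nothing in the token identifies the user."""
    claim = json.dumps({"over_18": is_over_18, "nonce": secrets.token_hex(8)})
    tag = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def check_attestation(token: dict) -> bool:
    """A relying app verifies the signature, learning only the age claim."""
    expected = hmac.new(VERIFIER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["tag"]) and \
        json.loads(token["claim"])["over_18"]

token = issue_attestation(True)
print(check_attestation(token))  # True
```

Because HMAC is symmetric, the relying app in this toy version would need the verifier's key; real designs avoid that by letting anyone check a public-key signature or a zero-knowledge proof. The point of the sketch is what the token omits: no ID scan, no selfie, no birth date ever leaves the verifier.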

The Constitutional Questions

The legal challenges to the ASAA are significant and already playing out in courtrooms across the country.

A federal judge in Texas has already blocked that state’s version of the App Store Accountability Act on First Amendment grounds. Judge Pitman’s ruling found that requiring users to verify their age in order to download general-interest apps was akin to a law requiring every bookstore to verify the age of every customer at the door. The categories of speech being restricted, the court found, were excessively broad.

Supporters of the federal ASAA have argued that the Supreme Court’s ruling in Free Speech Coalition v. Paxton, which permitted age verification for websites hosting sexually explicit content, provides constitutional cover for the bill. Critics strongly dispute this reading. The Supreme Court’s ruling was narrow and explicitly tied to sexually explicit material. Applying the same logic to all app downloads across every category of content is a significant and untested legal leap. (Harvard Law Review)

Similar litigation is advancing against Utah’s version of the law, suggesting that the constitutional battle over app store age verification is far from settled.

What Is Happening in the UK?

The United States is not the only country grappling with these questions. In the United Kingdom, Apple has already begun implementing its own age verification measures, driven in part by the Online Safety Act, which received royal assent in 2023 and has been rolling out in stages since. Apple has updated its App Store age verification tools globally to comply with the growing web of child safety laws, including those in the UK. You can read our prior coverage of UK Online Safety Laws here.

Under the Online Safety Act, platforms are required to ensure that children cannot access legal but age-inappropriate content. In practice, this has prompted Apple to introduce stricter age rating systems in the UK App Store, as well as parental control tools that operate at the account level. Family Sharing in the UK now comes with more prominent prompts for parents to set age restrictions, and Apple has made it harder for minors to bypass those restrictions.

However, the rollout of age verification requirements under the UK’s Online Safety Act has not been without controversy. Critics have pointed to examples of news websites, journalistic content, and general interest pages being swept up in age-gating measures that were originally designed for explicit content. The concern is that once you build the infrastructure for age verification, the temptation to expand its scope becomes difficult to resist.

Apple’s approach in the UK has generally been praised as more privacy-conscious than some alternatives, because it operates primarily at the account level rather than requiring individual users to submit government IDs directly to the App Store. But the UK experience is still being watched closely as a data point for what mandatory age verification looks like in practice, and what kinds of collateral restrictions it tends to produce.

Is There a Better Way?

Critics of the ASAA are not universally opposed to protecting children online. Most argue that the goal is right but the mechanism is wrong.

The Parents Over Platform Act (POPA) has been held up as a less restrictive alternative. Rather than requiring strict age verification from everyone, POPA would require app stores to generate an age signal based on a user’s self-declared age. Critically, that signal would only be shared with apps that actually offer different experiences to young people or that are labeled for adults. And sharing would only happen when an account holder or their parent has opted in.
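That narrower sharing rule reduces to a simple gate, sketched below with hypothetical field names (nothing here is drawn from the bill's actual text):

```python
# Hypothetical sketch of POPA's narrower sharing rule: an age signal is
# released only to apps that tailor experiences by age or carry an adult
# label, and only when the account holder (or parent) has opted in.
# Field names are illustrative, not from the bill text.

def should_share_age_signal(app: dict, account: dict) -> bool:
    app_needs_signal = app.get("offers_age_tailored_experience") or \
        app.get("labeled_adult_only")
    return bool(app_needs_signal) and bool(account.get("opted_in_to_sharing"))

game = {"offers_age_tailored_experience": True}
news = {}  # a general-interest app gets no signal, regardless of opt-in
print(should_share_age_signal(game, {"opted_in_to_sharing": True}))  # True
print(should_share_age_signal(news, {"opted_in_to_sharing": True}))  # False
```

Contrast this with the ASAA's design described above, where the equivalent function would return True for every app and every account: that is the difference critics mean when they call POPA more proportionate.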

This approach has real limitations and would need improvement to be truly effective. But it points in a more proportionate direction: targeted interventions at content that genuinely warrants age restrictions, rather than a blanket checkpoint for all digital activity.

The core principle most experts agree on is that age verification requirements should be applied in the least restrictive manner possible, using privacy-preserving techniques, only to apps and content that actually require age gating, and only for users who are actively seeking to access that content.

What Happens Next?

The ASAA is moving through Congress at a time when there is genuine bipartisan appetite for doing something about children’s safety online. That political momentum is real, and it makes the bill’s passage more likely than it might otherwise be. The Senate companion bill (S.1586), sponsored by Senator Mike Lee, is advancing in parallel, increasing the chances of a federal version becoming law.

At the same time, the legal challenges facing state-level versions of the law suggest that even if a federal version passes, it will face immediate court battles. The constitutional questions around overbroad age verification are live and contested, and the courts have not yet provided a definitive answer.

For users, the practical stakes are high. If the ASAA becomes law in its current form, downloading an app in the United States could soon require submitting personal identifying information to Apple or Google, which would then share an age category with every developer whose app you download. Whether that trade-off is worth it for the child safety benefits the bill promises to deliver is a question that lawmakers, courts, and the public are only just beginning to seriously debate.

Why This Matters for Your Data

Most people do not think carefully about how much personal data they share simply by using an app store. The ASAA would dramatically increase the amount of sensitive data flowing through that channel, and the risks that come with it.

Understanding what data is held about you, where it lives, and who has access to it has never been more important. If you want to take control of your personal information, Redact makes it easier to manage and remove your digital footprint across platforms. In a world where age verification at scale means more of your sensitive data is being collected, stored, and shared than ever before, knowing what you can do to protect yourself is not optional. It is essential.

Download Redact.dev and take back control of your online presence.

Redact also supports a massive range of major social media and productivity platforms, like Instagram, Twitter, Facebook, Discord, Reddit, and more.