Testimony from:
Josh Withrow, Fellow, Tech & Innovation Policy, R Street Institute

In OPPOSITION to House Bill 3405, the “App Store Accountability Act”

April 29, 2025

House Judiciary Committee, Artificial Intelligence, Cybersecurity, and Special Laws Subcommittee

Chairman Moore and members of the committee,

My name is Josh Withrow, and I am a resident fellow with the Technology and Innovation Policy team at the R Street Institute, a nonprofit, nonpartisan public policy organization. Our mission at RSI is to engage in research and outreach to promote free markets and limited, effective government in many areas, including the technology and innovation sector. It is for this reason that we have an interest in H 3405.

We support empowering parents to help guide their children’s use of technology, and we believe parental involvement is the best and most effective way to keep young people safe online. The intent of H 3405 may be to improve parents’ ability to control the online content accessible to their children, but we believe it creates a superfluous government mandate that duplicates protections already readily available to them. In doing so, it unwittingly erects barriers to accessing speech and content for all mobile device users and creates novel data privacy and security problems. While we share the goal of protecting kids and teens from harmful content and other dangers online, we believe that mandatory age verification at the app store level poses several practical and constitutional concerns.[1] These issues outweigh whatever limited good this proposal may achieve.

Increasingly, the major online platform owners are investing heavily to make their parental control tools at the device, browser, and platform levels more accessible and effective.[2] In addition, there has long been a robust market for third-party software that grants parents even more granular control over their children’s mobile device screen time and online access.[3] Given the widespread availability of better private solutions to online safety, we believe that educational efforts aimed at providing both kids and parents with the knowledge of how to more safely navigate the digital world would be a better approach. States such as Tennessee have passed legislation that directs public schools to include education about online safety in their curriculum.[4]

H 3405 requires that any app store on a smartphone or tablet verify a user’s age when an account is created. The app stores are held responsible for determining which of four categories a user falls into: under 13 years of age, between 13 and 16, 17 years old, or over 18 years of age.[5] Even the most privacy-protective, readily available commercial age estimation technologies have error rates high enough that merely estimating whether someone is over 18 is an imperfect science.[6] To “verify” – as opposed to estimate or self-certify – which of these narrow age ranges someone fits into would most likely require documentary identification such as a state ID.

What counts as a “commercially reasonable method of verification” is left to be defined by the Department of Consumer Affairs, but requiring the app stores to “verify” rather than “estimate” their users’ ages will, in general, incentivize the use of more intrusive forms of age verification. This would entail collecting considerably more personally identifying information from mobile device users than the companies currently collect.

Obtaining verifiable parental consent without requiring intrusive identity verification and the collection of further sensitive personal data is even harder.[7] Platforms are left with the obligation to figure out whether someone really is the parent or legal guardian of a minor account holder, which at best compromises the online anonymity or pseudonymity of the parent. It is even more difficult for platforms to discern parental status for children in non-traditional family situations, such as those with divorced parents or with legal guardians who are not relatives. And for children in dysfunctional family situations, a parental consent requirement may deny them the potentially crucial outlet of social media apps altogether, all the way to the age of 18.

The additional data that app stores will have to collect from consumers in order to comply with H 3405 creates a tempting new trove of information for hackers. Even with a data minimization requirement, companies are put in the odd spot of having to delete the information they collect to verify ages while also being able to prove, if brought to trial, that they complied with the verification requirements.[8] Just as problematic, if they use a third-party service to conduct the age estimation or verification, those services are not immune to hackers either. As if to emphasize this point, one of the biggest services used by some large social media platforms to verify user age and identity recently suffered a major data breach.[9] Similar data security concerns were one component of why California’s Age Appropriate Design Code has been enjoined by the courts, with the district court finding that the law was “actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.”[10]

The fact that every mobile device user would have to undergo age verification to use an app store also almost certainly dooms this bill on constitutional grounds. Previous attempts to enact broad age-gating restrictions for online services have been found to violate the First Amendment. In the 1990s, the core provisions of the Communications Decency Act were struck down, with the U.S. Supreme Court finding that the law’s “burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the Act’s legitimate purposes.”[11] The ubiquity of parental control tools, and of guidance on how to use them, means that this bill’s mandates would almost certainly fail this least-restrictive-means test.[12]

Similarly, requiring parental consent for minors to access lawful, non-obscene content was found unconstitutional in Brown v. Entertainment Merchants Association. Writing for the majority, Justice Antonin Scalia noted “our doubts that punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority.”[13] In essence, even minors have limited First Amendment rights to access non-obscene speech and online content – rights that may be curtailed by their parents but not by the government.[14] Yet H 3405 would require parental consent for a minor to download and use any app – from social media to a calculator – regardless of the risk it poses to minors.

The constitutional issues inherent in age-verification and parental consent requirements such as those in H 3405 have caused similar proposals in several states to be enjoined by courts, likely on the path to being struck down altogether.[15] Lawmakers would be better off focusing on ways to improve online literacy, both for parents and their children, and encouraging parents to exercise the substantial power they already have to control what content and interactions their kids can access online.

Thank you for your time,

Josh Withrow
Fellow, Technology & Innovation Policy
R Street Institute
jwithrow@rstreet.org 


[1] See Shoshana Weissmann and Josh Withrow, “No, conscripting the app stores doesn’t solve the problems with age verification,” R Street Institute, Jan. 29, 2025. https://www.rstreet.org/commentary/no-conscripting-the-app-stores-doesnt-solve-the-problems-with-age-verification/.

[2] See “Helping Protect Kids Online,” Apple.com, Feb. 2025. https://developer.apple.com/support/downloads/Helping-Protect-Kids-Online-2025.pdf; and “Leading Technology Companies and Foundations Back New Initiative to Provide Free, Open-Source Tools for a Safer Internet in the AI Era,” PR Newswire, Feb. 10, 2025. https://www.prnewswire.com/news-releases/leading-technology-companies-and-foundations-back-new-initiative-to-provide-free-open-source-tools-for-a-safer-internet-in-the-ai-era-302371243.html.

[3] “Children Online Safety Tools,” Competitive Enterprise Institute, last accessed Feb. 16, 2025. https://cei.org/children-online-safety-tools/.

[4] HB 0825, Tennessee General Assembly, 2025 Legislative Session. https://wapp.capitol.tn.gov/apps/BillInfo/default.aspx?BillNumber=HB0825&GA=114.

[5] H 3405, South Carolina State House, 2025 Legislative Session, last accessed Apr. 28, 2025. https://www.scstatehouse.gov/billsearch.php?billnumbers=3405&session126&summary=B.

[6] On error rates for the best age estimation technologies, see: Kayee Hanaoka, et al., “Face Analysis Technology Evaluation: Age Estimation and Verification,” NIST Internal Report 8525, May 2024. https://nvlpubs.nist.gov/nistpubs/ir/2024/NIST.IR.8525.pdf.

[7] “The State of Play: Is Verifiable Parental Consent Fit for Purpose?” Future of Privacy Forum, June 2023. https://fpf.org/verifiable-parental-consent-the-state-of-play/.

[8] Shoshana Weissmann, “Age verification legislation discourages data minimization even when legislators don’t intend that,” R Street Institute, May 24, 2023. https://www.rstreet.org/commentary/age-verification-legislation-discourages-data-minimization-even-when-legislators-dont-intend-that/.

[9] Jason Kelley, “Hack of Age Verification Company Shows Privacy Danger of Social Media Laws,” Electronic Frontier Foundation, June 26, 2024. https://www.eff.org/deeplinks/2024/06/hack-age-verification-company-shows-privacy-danger-social-media-laws.

[10] Adrian Moore and Eric Goldman, “California’s Online Age-Verification Law is Unconstitutional,” Reason, Nov. 28, 2023. https://reason.org/commentary/californias-online-age-verification-law-is-unconstitutional/.

[11] Reno v. ACLU, 521 U.S. 844 (1997), U.S. Supreme Court, June 26, 1997. https://supreme.justia.com/cases/federal/us/521/844.

[12] For a quick step-by-step walkthrough of how to enable parental controls on any commonly owned mobile device, see: “Parental Controls,” Internet Matters. https://www.internetmatters.org/parental-controls/.

[13] Brown et al. v. Entertainment Merchants Assn. et al., 564 U.S. 786 (2011). U.S. Supreme Court, June 27, 2011. https://supreme.justia.com/cases/federal/us/564/786.

[14] Jennifer Huddleston, “Courts Should Affirm First Amendment Rights of Youths in the Digital Age: A Case for 21st-Century Tinker,” Cato Institute, Mar. 28, 2024. https://www.cato.org/briefing-paper/courts-should-affirm-first-amendment-rights-youths-digital-age-case-21st-century#.

[15] See, e.g.: NetChoice LLC v. David Yost, U.S. District Court for the Southern District of Ohio, Eastern Division, 2:24-cv-00047. https://netchoice.org/wp-content/uploads/2024/01/2024.01.09-ECF-27-ORDER-Granting-TRO.pdf; NetChoice LLC v. Lynn Fitch, U.S. District Court for the Southern District of Mississippi, Southern Division, 1:24-cv-170-HSO-BWR. https://netchoice.org/wp-content/uploads/2024/07/NetChoice-v-Fitch-District-Court-Preliminary-Injuction-Ruling-July-1-2024.pdf; and NetChoice v. Sean Reyes, U.S. District Court for the District of Utah, 2:23-cv-00911-RJS-CMR and 2:24-cv-00031-RJS-CMR. https://netchoice.org/wp-content/uploads/2024/09/NetChoice-v-Reyes-2024.09.10-ECF-86-ORDER-Granting-PI.pdf.