Creating and managing a positive digital environment for children has become a priority for parents, lawmakers, and technology companies. However, as proposals progress from concept to implementation, we must ensure that our approaches address parents’ concerns without creating new risks through the extensive collection of minors’ data.

Several legislative proposals currently seek to shift the responsibility of verifying a minor’s age from app creators and content platforms to the manufacturers of hardware devices like smartphones. However, this approach misplaces accountability within the technology stack, targeting the wrong layer of the ecosystem. To understand why this distinction matters, it’s essential to break down the key components of the technology stack:
• Hardware refers to physical devices such as smartphones, laptops, and gaming consoles, manufactured by companies like Apple, Samsung, and Dell.
• Software comprises the programs and operating systems that run on these devices, such as iOS on an iPhone or Windows on a PC.
• Content providers, such as app developers and media platforms, create and distribute the digital experiences we engage with, such as social media apps, video streaming services, and online games.
The fundamental issue with these legislative proposals is that they conflate device manufacturers, who build the hardware, with app creators and content providers, who control user interactions, content, and data policies. Holding hardware manufacturers responsible for app-based data management misaligns regulation with technological reality, potentially leading to ineffective and impractical solutions.
The key difference is who controls what. Hardware companies (like Apple, Samsung, and Sony) build the devices and the essential operating system software. They provide the tool but not the specific content you see on it. Content providers (like YouTube, TikTok, Instagram, or news websites) control what appears on the screen. The smartphone’s manufacturer builds the device, but the apps consumers choose (YouTube, games, etc.) supply the content.
Creating regulations that require hardware companies to manage age restrictions (age gating) on content places responsibility at the wrong layer of the technology stack. The hardware serves as a delivery mechanism that displays the content sent by an app or website the consumer chooses to engage with. A phone or console cannot distinguish whether a video or website is “for adults” or “for kids” without guidance from the content source. Attempts to enforce filtering at the device level encounter serious practical problems because the device isn’t aware of the context of every piece of content – that knowledge resides with the content provider.
Hardware companies also lack a reliable method for determining a user’s age or identity. Your device doesn’t automatically know if you are 12, 18, or 40. At best, it might contain a user profile with a birth date entered at setup (which a child could easily misrepresent) or rely on guesswork such as facial recognition, which raises privacy concerns of its own. Accurate age verification typically requires checking official IDs or similar sensitive information, which a phone or laptop cannot do on its own. Conducting age checks at the device or operating system level could also invade user privacy and would likely fail if a device is shared among multiple users, for instance, a family iPad. A device cannot easily identify who is holding it at any given moment.
Legislation like Utah’s App Store Accountability Act raises serious concerns about privacy and data security. It requires app stores to share users’ age information with every developer without parental consent or clear guidelines on how this sensitive data can be used or who collects it. This approach compromises user privacy and introduces significant risks, including the potential for malicious actors to exploit or sell children’s personal information.
A key issue with this model is that age-gating should be the responsibility of individual apps, not device manufacturers or app stores. Many apps do not require age verification because they are not designed for specific age groups, meaning they should not be compelled to collect unnecessary personal data.
As we develop frameworks for online child safety, solutions must genuinely protect children without introducing new privacy risks. Effective safeguards require age-gating at the content level, where apps and platforms determine appropriate access based on the content itself rather than at the device level, where broad data sharing can create new vulnerabilities. In conjunction with content-based age verification, parental controls provide a more secure and responsible approach—empowering families while ensuring that children’s personal data remains protected.
The post Children’s Online Safety Should Rely on Content Providers, Not Device Manufacturers appeared first on American Enterprise Institute – AEI.