The European Union to Sharply Tighten Regulation on the Use of Children's Data

  • 2025-08-09
  • Krete Paal

The European Commission’s latest guidelines introduce additional restrictions for developers of online platforms and services that are used by minors under the age of 18. Under these new rules, minors are treated as a distinct risk group, requiring an ethical approach and enhanced protection from the dangers of the digital world.
Krete Paal, CEO of the data protection startup GDPR Register, explained that the European Commission, the United Kingdom, and the United States are introducing new rules that recognize minors as a separate risk group. Their personal data is now considered not just sensitive but “ethically charged.” For developers of digital products, this shift means they must ensure greater transparency when processing minors’ data, offer fair design choices, and implement heightened security standards.
According to Paal, if an online platform or digital product such as a game or educational environment can be used by children, developers are required to conduct a child rights-focused risk assessment. This includes evaluating the impact of the content on minors and, if needed, adjusting their business models and privacy settings. Service providers must also be prepared to demonstrate that they have made a genuine effort to protect children’s rights.
In light of the new rules, developers must ensure that a child’s account is private by default. Location data must not be accessible by default, and cameras, microphones, and content downloads must all be turned off or restricted unless the user actively opts in. These privacy settings must not be preselected by the developer but should be based on the user’s informed decision.
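To make the "private by default" principle concrete, the sketch below shows one way a developer might encode such defaults in application code. It is purely illustrative: the interface, field names, and default values are assumptions for this example, not settings spelled out in the guidelines.

```typescript
// Illustrative sketch only: the interface and field names are hypothetical,
// not taken from any regulation or specific SDK.

/** Privacy-relevant settings for a single account. */
interface PrivacySettings {
  profileVisibility: "private" | "friends" | "public";
  shareLocation: boolean;
  cameraAccess: boolean;
  microphoneAccess: boolean;
  allowContentDownloads: boolean;
}

/** Defaults applied when the account holder is a minor: everything private or off. */
const MINOR_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",
  shareLocation: false,
  cameraAccess: false,
  microphoneAccess: false,
  allowContentDownloads: false,
};

/**
 * Returns the settings for a new account. For minors, the protective defaults
 * are used; any relaxation must later come from an explicit, informed opt-in
 * by the user, never from a value preselected by the developer.
 */
function initialSettings(isMinor: boolean): PrivacySettings {
  if (isMinor) {
    return { ...MINOR_DEFAULTS };
  }
  // Adult defaults remain a product decision; shown here only for contrast.
  return {
    profileVisibility: "friends",
    shareLocation: false,
    cameraAccess: true,
    microphoneAccess: true,
    allowContentDownloads: true,
  };
}
```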
Paal emphasized that the additional restrictions are entirely justified, as children often do not understand the implications of data collection. The younger the child, the more likely it is that they lack the capacity to assess the extent of personalization or recognize manipulative content. The distinction between personalization and exploitation will become increasingly important for digital product developers.
Many modern digital platforms use recommendation systems to increase user engagement and screen time. While adults may recognize manipulative tactics to some extent, children usually do not perceive algorithmic influence in the same way. The new regulations aim to reduce excessive digital consumption among minors, remove gambling-like features, and make recommendation systems more transparent.
Developers will no longer be allowed to enable features such as automatic twenty-four-seven notifications, infinite scrolling, autoplay in video games, or body-altering filters by default. These must become opt-in features based on informed choice, not automatic functionality.
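As a rough illustration of what "opt-in, not automatic" could look like in practice, the sketch below gates engagement features for minors behind explicit consent. The feature names and the consent store are assumptions made for this example only.

```typescript
// Illustrative sketch only: feature names and the consent store are hypothetical.

/** Engagement features that must be opt-in for minors rather than enabled by default. */
type EngagementFeature =
  | "pushNotifications"
  | "infiniteScroll"
  | "autoplay"
  | "appearanceFilters";

/** Records explicit, informed opt-ins made by the user. */
const optIns = new Set<EngagementFeature>();

/** Called only from a settings screen, after the user makes an informed choice. */
function recordOptIn(feature: EngagementFeature): void {
  optIns.add(feature);
}

/**
 * For a minor, a feature is active only if they explicitly opted in;
 * nothing is preselected or switched on automatically.
 */
function isFeatureEnabled(feature: EngagementFeature, isMinor: boolean): boolean {
  if (isMinor) {
    return optIns.has(feature);
  }
  return true; // Adult defaults are a separate product decision.
}
```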
Age verification will also become significantly stricter. Simply asking for a user's age will no longer be sufficient. The new rules require age verification to be reliable and difficult to bypass, and to offer users multiple verification options.
The European Union plans to launch the EU Digital Identity Wallet in 2026, which could be used for verifying age. Other methods currently being tested include facial recognition, mobile network verification, and linking to bank accounts. Paal noted that developers will need to solve the challenge of protecting children without collecting excessive data, as finding the right balance will be central to future regulation.
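The sketch below illustrates one way a platform might support several verification options while keeping only a yes/no outcome, in line with the data-minimisation concern Paal describes. The method names and provider interface are assumptions; no EU Digital Identity Wallet API is implied.

```typescript
// Illustrative sketch only: the method names and provider interface are assumptions.

/** Verification channels mentioned in the article; each would be backed by an external provider. */
type VerificationMethod =
  | "digitalIdentityWallet"
  | "mobileNetwork"
  | "bankAccount"
  | "facialEstimation";

interface AgeVerificationResult {
  method: VerificationMethod;
  /** Only the fact that the age threshold was met is retained, not the birth date itself. */
  isOverThreshold: boolean;
}

/** A pluggable provider: platforms are expected to offer users several options. */
interface AgeVerificationProvider {
  method: VerificationMethod;
  verify(userToken: string, thresholdYears: number): Promise<AgeVerificationResult>;
}

/**
 * Runs the user's chosen provider and keeps only a boolean outcome,
 * so the platform avoids collecting excessive personal data.
 */
async function verifyAge(
  provider: AgeVerificationProvider,
  userToken: string,
  thresholdYears: number
): Promise<boolean> {
  const result = await provider.verify(userToken, thresholdYears);
  return result.isOverThreshold;
}
```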
The data protection expert also emphasized that trust in the digital world can no longer be taken for granted. When data collection concerns children, the issue becomes more than just a legal risk; it becomes a question of broader public trust. If a platform neglects children's rights, its business model, reputation, and brand may be at serious risk. Conversely, developers who can deliver a transparent, ethical, and safe digital experience for young users are already at an advantage, as the awareness and vigilance of parents and educators continue to grow.
The new rules will take effect in 2025 across multiple jurisdictions. In the European Union they will be enforced through the Digital Services Act, in the United Kingdom through the Online Safety Act, and in the United States under the Children’s Online Privacy Protection Act. Compliance will be mandatory for all online product developers operating in these markets.
The GDPR Register (https://www.gdprregister.eu/), developed by an Estonian startup in collaboration with IT experts, simplifies the process of complying with the GDPR. It helps organizations efficiently manage the requirements, processes, and documentation associated with data protection regulations.