What’s Next for Age Assurance Laws in Europe?
Alerts
May 5, 2026

Developments in law, regulatory guidance, and enforcement practice across Europe are leading to meaningful changes in how online services are offered to minors. A steady stream of announcements in recent months makes clear that this area will continue to develop at pace, requiring providers of online services to keep their approach to age assurance under regular review.

Maturing Regulatory Regimes

At a high level, age assurance requirements in Europe are driven by a combination of privacy and online safety laws, supplemented by sector-specific rules in certain countries. Although the core legal frameworks underpinning these regimes have been in force for some time, it is increasingly regulatory guidance, supervisory engagement, and enforcement activity that are driving how these requirements are interpreted in practice:

  • Privacy and consent requirements. Where a company relies on consent as its legal basis for processing minors’ data under the General Data Protection Regulation (GDPR), it must take reasonable steps to ensure the consent is provided by a parent or person with parental authority. In practice, this may require age verification. Regulatory guidance, such as that issued by the Irish Data Protection Commission (IDPC) and the UK Information Commissioner’s Office (ICO), stresses that age assurance methods must be effective for their intended purpose and proportionate to the risks present in a service. So far in 2026, the ICO has issued two fines related to parental consent, and the regulator has also issued an open letter calling on companies to strengthen their age-assurance practices.
  • Online safety and platform regulation. Providers of “online platforms” are required by the Digital Services Act (DSA) to take steps to protect the privacy, safety and security of minors, and providers subject to the UK’s Online Safety Act (OSA) can be required to implement age assurance if certain types of harmful content can be accessed on their services. Guidance issued by the European Commission and Ofcom under both the DSA and OSA largely echoes the approach of privacy regulators, stressing the need for a proportionate, risk-based approach, with providers of high-risk services required to meet more prescriptive standards. Both regulators have active ongoing investigations into services’ compliance with their age assurance duties.
  • Sector and product-specific restrictions. Most countries have national legislation in place that prohibits minors from accessing certain types of high-risk content, such as pornography and online gambling.

The Emergence of the “Social Media Ban”

Inspired by Australia’s “social media ban,” which came into force at the end of 2025, EU lawmakers are now calling for the introduction of a framework that would require a digital minimum age of 16 for access to social media, video sharing, and AI companions, with an exception for 13- to 16-year-olds where parental consent is obtained. This comes alongside a number of other developments for online platforms, including the proposed introduction of a Digital Fairness Act, which would tackle online consumer protection issues, including by requiring mandatory age assurance for digital products accessible to minors that contain certain commercial practices.

At a national level, a number of European governments are considering implementing restrictions that may have a similar effect. For instance:

  • In France, the government is pursuing an accelerated legislative procedure to enact a new bill before September 2026 that would require a minimum age of 15 for access to certain social media services, with exceptions for encyclopedias, educational and scientific content, and open-source software development and sharing platforms.
  • In the UK, the government is currently consulting on a range of proposals to strengthen the country’s online safety regime, including a minimum age for social media services and a potential change to the age of digital consent (currently set at 13 in the UK).
  • The Spanish Government has recently approved the draft Organic Law on the protection of minors in digital environments, which is now being examined by Parliament. The draft law raises the minimum age for consent in digital services (including social media) from 14 to 16. In parallel, the Government has also announced a separate review of the law governing the right to honor and self-image, which is still at an early drafting stage. This law focuses on protecting personal rights, such as image and voice. The draft specifies that teens over 16 can consent to the use of their image, while those under 16 need parental consent unless they have a sufficient degree of maturity. Consent required under this law is not the same as consent under data protection law; it is instead a commercial consent grounded in an individual’s right to their own image.
  • In Norway, following a public consultation, the government recently announced that it will introduce a new law that would prohibit social media platforms from offering their services to minors under the age of 16, and will make platform providers responsible for age verification. Services such as video games, platforms used for communication related to school or extracurricular activities, and services used for buying and selling goods and services, or to provide housing or job advertisements would not be part of the prohibition.
  • In Austria, the government has committed to publishing a draft law by June 2026 that would establish a minimum age of 14 for access to social media services, along with mandatory age-verification requirements for platforms.

Meeting Age Assurance Requirements

Both the DSA and OSA are technology neutral, and while Ofcom and the European Commission have each issued guidance addressing some common methodologies, most companies have discretion over the most appropriate methodology to implement, in view of the nature of their service and regulatory exposure. For many companies, this may involve adopting a range of measures, including age estimation tools, alongside more robust age verification mechanisms such as document checks or digital identity solutions. The effectiveness of these measures should be assessed on an ongoing basis, taking into account evolving user behavior, known risks, and technological developments.

Most current approaches to age assurance rely on a combination of commercial solutions and bespoke in-house tools. This will likely evolve over time, as a push across jurisdictions—in particular, the EU—toward digital identity frameworks creates opportunities for more standardized and interoperable approaches to age verification. As an early example, in April 2026, the European Commission announced the launch of a new age verification app, which has been designed to work on any device and is fully open source.

Considerations When Implementing Age Assurance

Companies should consider the following when implementing an age assurance strategy:

  1. Identify applicable regulatory frameworks. Expectations to implement age assurance can arise under the GDPR (including through instruments such as the ICO’s Age Appropriate Design Code or IDPC’s Fundamentals), as well as under the OSA (for user-to-user services) and the DSA (for online platforms). Mapping which regimes apply to a service is a key first step, as the applicable standard for age assurance will vary.
  2. Select and design an appropriate methodology. In practice, this will often involve taking a risk-based approach to determine the level of age assurance required and implementing a combination of measures. Companies should also consider how age assurance interacts with the user journey (e.g., whether it should be required only when accessing certain features or elements of a service).
  3. Carry out a DPIA. Data Protection Impact Assessment (DPIA) requirements continue to apply even where age assurance is mandated by law. Whether you are relying on a third-party solution or building one in-house, carrying out a DPIA is likely to be necessary to identify any risks associated with the processing.

Wilson Sonsini has extensive experience with both data protection and platform regulation. We help clients design practical strategies that meet DSA and GDPR requirements, reduce compliance risks, and support business goals in the EU. If you have any questions, please contact Cédric Burton, Laura De Boel, Yann Padova, Nikolaos Theodorakis, Tom Evans, Marie Catherine Ducharme, or any member of the Data, Privacy, and Cybersecurity practice.

Contributors

  • Cédric Burton
  • Yann Padova
  • Tom Evans
  • Marie Catherine Ducharme
  • Michaela Novakova
Copyright © 2026 Wilson Sonsini Goodrich & Rosati. All Rights Reserved.