Online platforms hosting pornography face an increasing compliance burden as regulators demand they safeguard children from age-inappropriate content. Lexology PRO breaks down what in-scope businesses should know.

Regulators around the world are increasingly scrutinising how adult platforms mitigate the risks their services pose to children.
This includes requiring websites hosting pornographic content to prevent children from accessing their services and taking robust action against platforms that fail to comply.
On 27 May 2025, the European Commission announced that it is investigating the adult websites Pornhub, Stripchat, XNXX and XVideos for suspected breaches of the EU Digital Services Act 2022 (DSA 2022). The Commission’s preliminary investigation found that these platforms had not taken appropriate measures to ensure a “high level of privacy, safety and security for minors.”
Earlier this month, the UK Office of Communications (Ofcom) also launched investigations into two adult websites as part of its “age assurance enforcement programme” under the UK Online Safety Act 2023 (OSA 2023).
Companies breaching child safeguarding duties face some of the highest penalties available under DSA 2022 and OSA 2023 – up to 6% and 10% of global annual turnover, respectively.
Other jurisdictions, including Australia and some US states, have also imposed enhanced age verification requirements on platforms hosting pornography. Lexology PRO explores the nuances of these regulatory approaches.
Platforms face stricter safeguarding duties
UK
OSA 2023 imposes specific child safeguarding duties on both platforms that publish their own pornographic content, including certain Generative AI tools (Part 5 services), and user-to-user services where pornographic content may be shared (Part 3 services).
All services that allow pornography will need to have “highly effective age assurance” measures in place. For Part 5 services, the deadline to implement these measures was January 2025, while Part 3 services have until 1 July 2025.
Ofcom has issued in-depth guidance on its expectations for adult platforms’ age assurance measures. The chosen method should be “technically accurate, robust, reliable and fair.”
EU
DSA 2022 requires designated very large online platforms – those with more than 45 million monthly active EU users – to take steps to protect the rights and wellbeing of children. This includes preventing minors from accessing pornographic content on their services using age verification tools.
Pornhub, XNXX, Stripchat and XVideos are the porn hosting websites currently designated under DSA 2022.
The Commission published its draft “guidance on protection of minors online” under DSA 2022 on 13 May 2025. It recommends that platforms accessible to children use age verification to protect minors from various forms of age-inappropriate content, including gambling and pornography.
The Commission is developing an age-verification app as an interim solution until the EU Digital Identity Wallet becomes available by the end of 2026. Both the app and the wallet will enable online service providers to confirm that a user is an adult, while protecting their privacy.
Italy
The Italian regulator AGCOM approved new age verification rules on 12 May 2025. They apply to all video-sharing platforms and websites hosting pornographic content available in Italy.
In-scope platforms have six months to implement age verification measures to prevent children from accessing online pornography. AGCOM has the power to block access to any platform that fails to comply.
The regulator has set out a series of mandatory principles for age verification systems, including proportionality, to balance the need for age verification with users’ privacy, as well as cybersecurity, ease of use, accuracy and accessibility.
Platforms must enlist an independent third-party provider to carry out verification using a "double anonymity" method, under which the verifier does not learn which service requested the age check and the platform does not receive the user's identity data.
France
All platforms that host pornographic content in France must comply with Arcom’s technical standard for age verification systems (published in French only), effective April 2025. The standard requires all platforms that host pornography to introduce age checks that meet strict technical requirements.
Platforms’ landing pages must not display any pornographic content until the user's age has been verified, and verification must be carried out each time a user accesses the website.
The French regulator places a strong emphasis on ensuring that age verification methods protect users’ privacy. Platforms must enlist an independent third party to provide verification services, enabling double anonymity; the platform itself must not have access to users’ age verification data.
Germany
In Germany, the Interstate Treaty on the Protection of Minors in the Media and the Youth Protection Act require age assurance for online content that poses a “real potential for danger or impairment” to children, including pornographic content.
According to regulatory guidance, age verification systems should follow a two-step procedure: first, a one-time identification of the user, for example by comparing their likeness with an official identity document; second, authentication at each access to ensure that only the verified owner of that identification is granted access to the age-restricted content.
US
To date, 24 US states – including Texas, Missouri, Florida, Utah and Georgia – have introduced some form of age verification requirement for adult platforms.
Some state laws stipulate a specific age verification method companies must use, such as requiring users to submit government-issued ID or using a method based on transaction data.
At federal level, President Donald Trump signed the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (Take It Down Act) on 19 May 2025. The Act requires online platforms to remove content within 48 hours of receiving requests from victims of image-based sexual abuse. Platforms must develop a notice and takedown process for non-consensual images and deepfakes by 19 May 2026.
Australia
The Australian online safety regulator, eSafety, is developing industry codes of practice for in-scope online platforms under the Online Safety Act 2021. These include guidance on protecting children from "class 1C" and "class 2" material, covering online pornography and other “high-impact” adult material such as gambling.
The draft codes addressing this material make certain platform types, such as social media, responsible for implementing age assurance and access control measures to prevent children accessing class 1C and class 2 material, “where technically feasible and reasonably practicable”.
Australia has also passed a law requiring social media platforms to take reasonable steps to prevent users under 16 from creating accounts. It takes effect in December 2025.
What this means for businesses
The growing number of age-verification requirements across different jurisdictions significantly increases the compliance burden and risk of enforcement for companies that host online pornographic content.
Some platforms are resisting their new obligations. Pornhub has blocked access to its services in various US states that have passed laws requiring it to verify the age of users.
Nevertheless, as regulators double down on children’s online safety, it seems likely that adult content providers will need to introduce some form of age verification to continue operating legally.
Platforms that host pornography should be aware that “click away” pop-ups asking users if they are over 18 are unlikely to be sufficient to ensure compliance with new age verification rules. Platforms will need to develop methods of age assurance that are robust, accurate, not easily circumvented and that comply with privacy requirements. In some jurisdictions, they will need to enlist the services of an independent third party.
What’s more, while some of the laws target platforms that primarily host pornography, in many jurisdictions – including Australia, the UK and EU – social media platforms that allow users to upload their own content may also fall within scope. Unless such platforms explicitly prevent users from uploading pornography or other adult content, they should have access restrictions in place to ensure children are not exposed to it.