While most MENA countries lack binding online child protection requirements, businesses should start implementing age verification and content moderation now to stay ahead of tightening regulations.

Regulators in the MENA region are gradually introducing laws and codes of practice aimed at improving online safety for minors, as internet usage among children rises.
Minimum age requirements, user verification, and mechanisms to remove harmful content are emerging regulatory expectations for online platforms across MENA. But legal protections remain uneven and enforcement mechanisms are often limited, leaving platforms walking a regulatory tightrope.
“Some governments in the Middle East and North Africa have made limited progress in digital child protection,” Hanaa Al-Ramli, a consultant on internet safety and culture in MENA, told Lexology PRO. “Many Arab countries still lack comprehensive and specialised legislative frameworks that focus on shielding children from all forms of online exploitation and abuse.”
Proactive compliance with international standards like the UK's Online Safety Act can therefore reduce liability and prepare platforms for the region's evolving legal landscape.
Lexology PRO explores the region’s approaches towards child online safety and key tips for businesses.
How are lawmakers and regulators in MENA approaching online safety?
Morocco is drafting a new law to regulate digital content creation, with specific provisions to tackle the exploitation of children online, Minister of Youth, Culture, and Communication Mohamed Mehdi Bensaid announced in parliament earlier this summer. Currently, only the penal code governs such issues, leaving a legal gap in online protection.
In the UAE, Federal Decree-Law No. 34 of 2021 criminalises cyber abuse, including online harassment of minors. The law applies to local and foreign service providers but does not mandate specific content moderation or age verification protocols. In addition, the Dubai Data Law provides for the data protection and privacy of all individuals, including children.
Back in 2016, the UAE established Federal Law No. 3 of 2016 on Child Rights, known as “Wadeema’s Law”, which protects children from all forms of exploitation, including those that may occur online. Under the law, internet service providers must alert the relevant authorities if children are exposed to harm. The UAE also operates a Child Digital Safety platform.
Saudi Arabia established a National Framework for Child Online Safety in 2023, which introduced guidelines for platforms to protect minors and promote awareness campaigns. The framework is a five-year national plan for child online safety in the Kingdom, aligning with international standards. Enforcement mechanisms remain nascent, with voluntary compliance emphasised.
Tunisia launched a new reporting portal to allow individuals to safely and anonymously report instances of child sexual abuse material found online. This initiative aims to enhance online safety for children.
Egypt’s Data Protection Act (2020) explicitly classifies children’s personal data as sensitive and mandates parental or guardian consent for processing such data. Cybercrime Law No. 175 of 2018 criminalises online behaviours such as child pornography, cyberbullying, and harassment, establishing legal consequences for these offences.
Jordan’s updated Cybercrime Law, effective since September 2023, criminalises various online abuses but contains no child-specific measures. Jordan’s Penal Code was amended in 2020 to address online child sexual exploitation.
On Safer Internet Day 2025, Save the Children Jordan called for legislative updates to better tackle emerging digital threats.
How do MENA rules compare to elsewhere?
Globally, many jurisdictions have developed comprehensive protections for minors online. The UK’s Online Safety Act 2023 mandates age verification measures, content moderation, and robust reporting mechanisms for harmful material. The EU Digital Services Act 2022 (DSA) requires that any online platform accessible to minors implement appropriate and proportionate measures to ensure a high level of privacy, safety, and security for child users.
By contrast, across most of the MENA region there is no mandatory age verification and no clear platform responsibility for content moderation, while national hotlines and cross-border investigative cooperation are largely absent.
“There is an urgent need for laws obligating platforms to periodically review and improve their services, and to clearly assume responsibility for protecting minors,” Al-Ramli said.
“In many countries, there are no easy-to-use reporting portals, national hotlines, or cross-border cooperation to investigate and prosecute digital crimes against children, nor sufficient support and protection for victims.”
“Laws often lack binding requirements for digital platforms to assess, address, and document all risks facing children, and to set clear limits on stranger contact, bullying, or dangerous challenges.”
What are the key tips for businesses?
Countries in MENA have adopted a mix of general cybercrime, data protection, and emerging child-specific regulations. As compliance with guidelines is largely voluntary at this stage, businesses could look ahead and proactively implement measures in line with international standards.
Review services against child protection risks
Platforms should assess whether their services are accessible to children and identify potential exposure to harmful content. Benchmarking against international laws, such as the UK’s Online Safety Act or the EU’s Digital Services Act, can guide voluntary compliance and risk mitigation strategies.
Implement age verification and parental consent measures
Even in jurisdictions without explicit legal mandates, companies should adopt robust age assurance systems to protect children and reduce liability.
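As an illustration only, the gate such a system enforces can be sketched in a few lines. Everything here is a hypothetical assumption, not a requirement of any MENA law: the minimum-age threshold, the field names, and the idea that parental consent is captured through a separate verified flow all vary by jurisdiction and product.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical minimum-age threshold; actual thresholds differ across
# jurisdictions (e.g. data protection consent ages vary in MENA and the EU).
MIN_AGE = 13

@dataclass
class SignupRequest:
    birth_date: date
    parental_consent: bool  # assumed to come from a separate verified-consent flow

def age_on(birth_date: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_register(req: SignupRequest, today: date) -> bool:
    """Allow registration if the user meets the minimum age,
    or is younger but has verified parental consent."""
    age = age_on(req.birth_date, today)
    return age >= MIN_AGE or req.parental_consent
```

In practice the birth date itself would come from an age-assurance provider rather than self-declaration; the sketch only shows where that signal plugs into the registration decision.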
Establish reporting and content moderation mechanisms
Platforms should provide accessible channels for users to report abuse or harmful content and implement rapid response systems for content removal. Cooperation with local authorities can enhance protection and compliance.
Embed continuous risk assessments
As Al-Ramli notes, platforms should periodically review and improve their services to address emerging risks. This includes auditing content, monitoring interactions between users, and updating safety features.
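The periodic-review habit described above can be operationalised as a simple risk register with a review cadence. The 90-day cycle and field names below are hypothetical assumptions for illustration, not a regulatory requirement.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical quarterly review cycle; an appropriate cadence depends on
# the platform's risk profile and any applicable guidance.
REVIEW_INTERVAL = timedelta(days=90)

@dataclass
class RiskItem:
    name: str              # e.g. "stranger contact in direct messages"
    last_reviewed: date
    mitigations: list[str]

def due_for_review(item: RiskItem, today: date) -> bool:
    """Flag a risk whose last assessment is older than the review interval."""
    return today - item.last_reviewed >= REVIEW_INTERVAL

def overdue(register: list[RiskItem], today: date) -> list[RiskItem]:
    """Return all register entries that need reassessment."""
    return [r for r in register if due_for_review(r, today)]
```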
Educate and engage users and parents
Platforms can supplement regulatory compliance with educational initiatives for children and parents, creating a safer digital environment and building trust.