The European Commission has asked multiple online platforms and app stores for information under the Digital Services Act on how they are protecting children from online harm.

The commission said today that it has sent the requests to platforms Snapchat and YouTube and app stores Google Play and Apple's App Store, seeking information on age verification systems and safeguards for children. All four companies are designated as very large online platforms (VLOPs) under the Digital Services Act (DSA).
The investigative actions are the first to be launched since the commission adopted its guidelines on the protection of minors on 14 July. The guidelines set out how platforms can comply with article 28 of the DSA, under which providers must implement measures to ensure “a high level of privacy, safety and security of minors on their service”.
Henna Virkkunen, executive vice-president for tech sovereignty, security and democracy at the commission, said: “Platforms have the obligation to ensure minors are safe on their services – be it through measures included in the guidelines on protection of minors, or equally efficient measures of their own choosing.”
“Today, alongside national authorities in the member states, we are assessing whether the measures taken so far by the platforms are indeed protecting children,” Virkkunen added.
With regard to Snapchat, the commission said it is seeking information on how the platform complies with its own policy of prohibiting children under 13 from accessing its services. The regulator has also asked about the steps Snapchat takes to prevent the sale of illegal goods such as vapes or drugs to children on its platform.
The commission asked Google subsidiary YouTube about its age assurance and content recommender systems “following reports of harmful content being disseminated to minors”. Meanwhile, it asked Apple's App Store and Google Play to explain how they apply age ratings and how they manage the risk of users downloading illegal or harmful apps, such as those related to gambling.
A Snapchat spokesperson told Lexology PRO that the company has received the requests and will “collaborate to provide the necessary information”.
Snapchat is “deeply committed” to ensuring the safety of its community and aims to provide users with an environment that “reduces online risks and potential harms”, the spokesperson added. “It is why we have built privacy and safety features into our service and have provided the commission with detailed risk assessments from the start.”
A spokesperson for Google said the company will “continue to engage with the commission on this critical area”.
“For years, we’ve worked with child development experts to build age-appropriate experiences across Google, including on YouTube, with robust controls for parents, and industry-leading security and protections for younger users,” the spokesperson said.
Commission president Ursula von der Leyen emphasised the importance of prioritising children’s safety during her State of the Union address last month, where she revealed a new panel tasked with advising on the best approach to social media regulation in the bloc. Von der Leyen noted that Australia is currently pioneering a social media minimum age restriction: “I am watching the implementation of their policy closely to see what next steps we can take here in Europe,” she added.
Apple did not respond to a request for comment.