Australia’s online safety regulator has ordered four AI companion chatbot providers to share how they are protecting children or face enforcement action.
Image: Shutterstock.com/Engel Ching
Australia’s eSafety Commissioner today issued legal notices to four AI companion chatbot providers seeking information about their measures to protect children from harmful content such as self-harm or sexually explicit material. If the companies don’t comply, they may face enforcement actions including court proceedings and fines of up to A$825,000 (€462,000) per day.
The commissioner said the providers of Character.Ai, Chub.ai, Glimpse.AI and Chai must answer questions detailing how they are complying with the Basic Online Safety Expectations Determination (BOSE) under the Online Safety Act 2021. The expectations, which set out the government’s minimum standards for digital platforms, include reasonable steps providers must take to minimise harmful content and ensure user safety, especially for children.
eSafety Commissioner Julie Inman Grant said in a statement that although these services are often marketed as providing emotional support and companionship, there is a “darker side”. She noted that many chatbots are capable of engaging in sexually explicit conversations and “may also encourage suicide, self-harm and disordered eating.”
“These companies must demonstrate how they are designing their services to prevent harm, not just respond to it,” Inman Grant said. “If you fail to protect children or comply with Australian law, we will act.”
The notices follow the registration of updated codes under the act in June 2023, aimed at protecting children from age-inappropriate online content. In September 2025, the codes were expanded to also cover AI companion chatbots. The codes and standards are legally enforceable, and breaching them may result in civil penalties of up to A$49.5 million (€28 million).
“I do not want Australian children and young people serving as casualties of powerful technologies thrust onto the market without guardrails and without regard for their safety and wellbeing,” Inman Grant added.
Australia is tightening online child safety priorities across digital platforms beyond AI services, including social media. From December, under the Online Safety Amendment, social media companies will be required to block users under 16 from creating accounts.
Character.Ai, Chub.ai, Chai and Glimpse.AI did not respond to requests for comment.