Age Verification Australia: Sites Block Access as Deadline Looms
Age verification requirements in Australia are triggering immediate site blocks as new codes take effect, with several adult websites restricting Australian registrations on Friday (ET). The Australian online safety regulator is poised to require platforms to verify users' ages before granting access to pornography, extremely violent material and self-harm content from Monday (ET). The move aims to keep children away from potentially harmful material while sparking warnings about privacy and over-blocking.
Age Verification Australia: Platforms begin blocking
In the run-up to enforcement, a number of adult websites displayed notices on Friday (ET) telling Australian users they were not accepting new account registrations from Australian IP addresses. Under the codes set out by the regulator, platforms hosting pornography, extremely violent content or self-harm material must implement new checks from Monday (ET).
The national online safety regulator has warned that platforms failing to comply could face fines of up to $49.5m per breach, a penalty designed to push rapid uptake of the new verification requirements. Some companies have indicated they will restrict access early, with one saying its video-sharing platforms will limit access before the March 9th (ET) deadline.
Immediate reactions
Julie Inman Grant, the eSafety commissioner, welcomed the introduction of the codes and framed them as bringing familiar real-world safeguards into online spaces where children spend time. She said, “We don’t allow children to walk into bars or bottle shops, adult stores or casinos, but when it comes to online spaces where they are spending a lot of their time, there are no such safeguards,” and added that the codes change that for Australian kids.
John Livingstone, head of digital policy at Unicef Australia, supported the move on child-protection grounds, warning that early accidental exposure to pornography, violence, self-harm and eating-disorder content can have lasting impacts on a child’s healthy development.
At the same time, the sex worker advocacy group Scarlet Alliance cautioned the measures could have a chilling effect on platforms willing to host advertising for their services and warned of potential over-filtering of lawful sexual health information.
Quick context and regulatory mechanics
The new codes require age verification measures across a range of online offerings, not just adult websites, encompassing app stores and AI companion chatbots as well as platforms carrying extremely violent or self-harm material. The regulator has positioned the rules as bringing commonsense protections to online environments for children.
What’s next
Watch for further access restrictions and compliance moves in the coming days, with additional platforms likely to announce changes before the March 9th (ET) enforcement deadline. The regulator’s fine framework makes enforcement a central next step: platforms that do not implement the new checks risk financial penalties, while advocacy groups and child-safety organisations continue to press competing concerns over privacy, access to health information and the scope of automated filtering. Expect fast follow-up statements from companies and further public comment from the eSafety commissioner and child protection advocates as implementation unfolds.
Timestamp: initial checks documented on Friday (ET); rules take effect from Monday (ET); deadline cited as March 9th (ET).