Australia has fined X Australia over child sex abuse material concerns. How severe is the issue – and what happens now?
Australia’s eSafety Commissioner, Julie Inman Grant, has found X (formerly Twitter) seriously non-compliant with a transparency notice on child sex abuse material.
- The commissioner has issued X with an infringement notice for A$610,500.
- The commissioner first issued transparency notices to Google, X (then Twitter), Twitch, TikTok and Discord in February under the Online Safety Act 2021.
How severe is the issue?
- A recent study provided the first quantitative analysis of child sex abuse material on the public sites of the most popular social media platforms.
- The researchers’ findings highlighted Instagram and X (then Twitter) are particularly prolific platforms for advertising the sale of self-generated child sex abuse material.
- As for X, they found the platform even allowed the public posting of known, automatically identifiable child sex abuse material.
Why does X have this content?
- All major platforms - including X - have policies that ban child sex abuse material from their public services.
- Most sites also explicitly prohibit related activities such as posting this content in private chats, and the sexualisation or grooming of children.
- Platforms should scrutinise content shared voluntarily by minors, and ideally also weed out any AI-generated child sex abuse material.
Does the fine go far enough?
- For instance, last year US federal regulators imposed a US$150 million (A$236.3 million) fine on X to settle claims it had misleadingly used email addresses and phone numbers for targeted advertising.
- This year, Ireland’s privacy regulator slapped Meta, Facebook’s parent company, with a €1.2 billion (almost A$2 billion) fine for mishandling user information.
- The latest fine of A$610,500, though small in comparison, is a blow to X’s reputation given its declining revenue and dwindling advertiser trust due to poor content moderation and the reinstatement of previously banned accounts.
What happens now?
- If X doesn’t comply, eSafety can initiate civil penalty proceedings and take the matter to court.
- Depending on the court’s decision, the cumulative fine could escalate to A$780,000 per day, retroactive to the initial non-compliance in March.
- To move past this, X will need to make a 180-degree turn in its approach to moderating content, especially content that harms and exploits minors.