
Elon Musk is mad he’s been ordered to remove Sydney church stabbing videos from X. He’d be more furious if he saw our other laws

Retrieved on: 
Tuesday, April 23, 2024

Australia’s eSafety Commissioner has ordered social media platform “X” (formerly known as Twitter) to remove graphic videos of the stabbing of Bishop Mar Mari Emmanuel in Sydney last week from the site.

Key Points: 
  • Australia’s eSafety Commissioner has ordered social media platform “X” (formerly known as Twitter) to remove graphic videos of the stabbing of Bishop Mar Mari Emmanuel in Sydney last week from the site.
  • In response to this order, X’s owner, Elon Musk, has branded the commissioner the “Australian censorship commissar”.
  • Read more:
    Why is the Sydney church stabbing an act of terrorism, but the Bondi tragedy isn't?

Prompt political fallout

  • Labor minister Tanya Plibersek referred to Musk as an “egotistical billionaire”.
  • Of course, such damning remarks directed towards a much-maligned website and its equally controversial owner are to be expected.

What do federal laws say?

  • The power the eSafety Commissioner exercised under Part 9 of the Online Safety Act 2021 was to issue a “removal notice”.
  • The removal notice requires a social media platform to take down material that would be refused classification under the Classification Act.
  • While it’s these laws being applied in the case against X, there are other laws that can come into play.
  • It is a variation of this bill, reflecting the substantial range of views on the draft, that now has bipartisan support.

What else could be done?


The gruesome images in the Wakeley videos might remind some of the Christchurch massacre. In the aftermath of that attack, Telstra, Optus and Vodafone (now part of TPG) cut access to sites such as 4Chan that were disseminating video of the attack. They did so without any prompting from the eSafety Commissioner or from law enforcement agencies.

  • She would need to be satisfied the material depicts abhorrent violent conduct, and that its availability online is likely to cause significant harm to the Australian community.
  • This means the commissioner could give a blocking notice to telcos which would have to block X for as long as the abhorrent material is available on the X platform.
  • This would be a breach of the terrorism prohibitions under the federal Criminal Code.


Rob Nicholls receives funding from the Australian Research Council for the International Digital Policy Observatory.

Elon Musk vs Australia: global content take-down orders can harm the internet if adopted widely

Retrieved on: 
Tuesday, April 23, 2024

Do Australian courts have the right to decide what foreign citizens, located overseas, view online on a foreign-owned platform?

Key Points: 
  • Do Australian courts have the right to decide what foreign citizens, located overseas, view online on a foreign-owned platform?
  • Read more:
    Elon Musk is mad he's been ordered to remove Sydney church stabbing videos from X.

Do global take-down orders work?

  • There can be no doubt that a global take-down order can be justified in some instances.
  • Child abuse material and so-called revenge porn are clear examples of content that should be removed with global effect.
  • After all, international law imposes limitations on what demands Australian law can place on foreigners acting outside Australia.

An unusually poor ‘test case’ for free speech

  • But for the broader Australian public, this must appear like an odd occasion to fight for free speech.
  • There can sometimes be real tension between free speech and the suppression of violent imagery.
  • After all, not even the staunchest free speech advocates would be able to credibly object to all censorship.

The path forward

  • Global take-down orders are justifiable in some situations, but cannot be the default position for all content that violates some law somewhere in the world.
  • If we had to comply with all content laws worldwide, the internet would no longer be as valuable as it is today.
  • Read more:
    Regulating content won't make the internet safer - we have to change the business models


Dan Jerker B. Svantesson does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Grattan on Friday: Ethnic tensions will complicate the Albanese government’s multicultural policy reform

Retrieved on: 
Thursday, April 18, 2024

“In 2024, threats to our way of life have surpassed terrorism as Australia’s principal security concern,” he said.

Key Points: 
  • “In 2024, threats to our way of life have surpassed terrorism as Australia’s principal security concern,” he said.
  • Tensions, especially in western Sydney, are much elevated because of the Middle East conflict.
  • And the Wakeley attack came just two days after the Bondi Junction shopping centre stabbings, which killed six people.
  • While that atrocity did not fall under the definition of “terrorism”, inevitably the two incidents were conflated by an alarmed public.
  • The challenge for political leaders is not just dealing with the immediate, increasing threats to cohesion, but also with longer-term policy.
  • Andrew Jakubowicz, emeritus professor of sociology at the University of Technology Sydney, highlights the three separate elements of multiculturalism.


“Settlement policy, which deals with arrival, survival and orientation, and the emergence of bonding within the group and finding employment, housing and education
“Multicultural policy, which ensures that institutions in society identify and respond to needs over the life course and in changing life circumstances, and
“Community Relations policy, which includes building skills in intercultural relations, engagement with the power hierarchies of society and the inclusion of diversity into the fabric of decision-making in society - from politics to education to health to the arts.”

  • The Albanese government last year commissioned an independent review of the present multicultural framework.
  • Although the review is not due for release until mid-year, the May budget is likely to see some initiatives.


Michelle Grattan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

FPF Publishes New Report: A Conversation on Privacy, Safety, and Security in Australia: Themes and Takeaways

Retrieved on: 
Tuesday, January 2, 2024

Today, FPF publishes a report summarizing broad themes and takeaways gleaned from this discussion, “A Conversation on Privacy, Safety, and Security in Australia: Themes and Takeaways.”

Key Points: 
  • Today, FPF publishes a report summarizing broad themes and takeaways gleaned from this discussion, “A Conversation on Privacy, Safety, and Security in Australia: Themes and Takeaways.”
    Australia’s Online Safety Act of 2021 (“Online Safety Act”) mandates the development of industry codes or standards to provide appropriate community safeguards with respect to certain online content, including child sexual exploitation material, pro-terror material, crime and violence material, and drug-related material.
  • Through September 2023, the eSafety Commissioner has registered six industry codes that cover: Social Media Services, App Distribution Services, Hosting Services, Internet Carriage Services, Equipment, and Internet Search Engine Services.
  • A draft of the industry standards was published on November 20, 2023, and is open for public comment until December 21, 2023.
  • For the purposes of the FPF meeting, participants were asked to assume the existence of industry standards that satisfy the Online Safety Act’s statutory requirements.

Digital platform regulators release working papers on algorithms and AI

Retrieved on: 
Tuesday, January 2, 2024

23 November 2023

Key Points: 
  • 23 November 2023
    The Digital Platform Regulators Forum (DP-REG) has published working papers on algorithms and the large language models (LLMs) used in generative artificial intelligence (AI) to mark the launch of its website.
  • Each member contributed to the working papers, reflecting DP-REG’s purpose to promote a streamlined and cohesive approach to the regulation of digital platform technologies in Australia.
  • The papers support DP-REG’s 2023–24 strategic priorities, which include a focus on understanding the impact of algorithms and evaluating the benefits, risks and harms of generative AI.
  • Working Paper 1: Literature summary – Harms and risks of algorithms considers the harms and risks posed by some commonly used types of algorithms to end-users and society.

Australia has fined X Australia over child sex abuse material concerns. How severe is the issue – and what happens now?

Retrieved on: 
Tuesday, October 17, 2023

Australia’s eSafety Commissioner, Julie Inman Grant, has found X (formerly Twitter) guilty of serious non-compliance with a transparency notice on child sex abuse material.

Key Points: 
  • Australia’s eSafety Commissioner, Julie Inman Grant, has found X (formerly Twitter) guilty of serious non-compliance with a transparency notice on child sex abuse material.
  • The commissioner has issued X with an infringement notice for A$610,500.
  • The commissioner first issued transparency notices to Google, X (then Twitter), Twitch, TikTok and Discord in February under the Online Safety Act 2021.

How severe is the issue?

    • It was the first quantitative analysis of child sex abuse material on the public sites of the most popular social media platforms.
    • The researchers’ findings highlighted that Instagram and X (then Twitter) are particularly prolific platforms for advertising the sale of self-generated child sex abuse material.
    • As for X, they found the platform even allowed the public posting of known, automatically identifiable child sex abuse material.

Why does X have this content?

    • All major platforms - including X - have policies that ban child sex abuse material from their public services.
    • Most sites also explicitly prohibit related activities such as posting this content in private chats, and the sexualisation or grooming of children.
    • They should scrutinise content shared voluntarily by minors, and ideally should also weed out any AI-generated child sex abuse material.

Does the fine go far enough?

    • For instance, last year US federal regulators imposed a US$150 million (A$236.3 million) fine on X to settle claims it had misleadingly used email addresses and phone numbers for targeted advertising.
    • This year, Ireland’s privacy regulator slapped Meta, Facebook’s parent company, with a €1.2 billion (almost A$2 billion) fine for mishandling user information.
    • The latest fine of A$610,500, though small in comparison, is a blow to X’s reputation given its declining revenue and dwindling advertiser trust due to poor content moderation and the reinstating of banned accounts.

What happens now?

    • If X doesn’t pay, eSafety can initiate civil penalty proceedings and bring the matter to court.
    • Depending on the court’s decision, the cumulative fine could escalate to A$780,000 per day, retroactive to the initial non-compliance in March.
    • To get out of this situation, X will need to make a 180-degree turn in its approach to moderating content – especially content that harms and exploits minors.

Digital platform regulators make joint submission on AI

Retrieved on: 
Tuesday, September 26, 2023

11 September 2023

Key Points: 
  • 11 September 2023
    In a joint submission to the Department of Industry, Science and Resources consultation on the Safe and responsible AI in Australia discussion paper, members of the Digital Platform Regulators Forum (DP-REG) have outlined the opportunities and challenges presented by rapid advances in artificial intelligence (AI).
  • In its submission, DP-REG highlighted the potential impacts of AI in relation to each member’s existing regulatory framework.
  • The submission also flags that coordination between DP-REG members and other arms of government to leverage complementary strengths and expertise will remain crucial to Australia’s response to AI.
  • Through DP-REG, members engage in ongoing collaboration, information sharing and coordination on digital platform regulation.

Digital Platform Regulators Forum communique

Retrieved on: 
Wednesday, July 5, 2023

4 July 2023

Key Points: 
  • 4 July 2023
    Digital Platform Regulators Forum puts generative AI on agenda
    The heads of the four members of the Digital Platform Regulators Forum (DP-REG) met on 20 June 2023 to review the forum’s progress over 2022–23 and to discuss strategic priorities for the year ahead.
  • Through the forum, all members continue to share information and work together to tackle issues across their traditional lines of responsibility.
  • The forum remains committed to working together to promote proportionate, cohesive, well-designed, and efficiently implemented digital platform regulation.
  • This communique is jointly released by the ACCC, ACMA, eSafety and OAIC.

Banks put family violence perpetrators on notice. Stop using accounts to commit abuse or risk being 'debanked'

Retrieved on: 
Tuesday, July 4, 2023

It happened when she was shopping for groceries with her kids, or refuelling the car.

Key Points: 
  • It happened when she was shopping for groceries with her kids, or refuelling the car.
  • That’s when she would discover her partner had cancelled the card or lowered the limit so she couldn’t buy essentials.
  • Ella* (not her real name) is one of about 1.6 million Australian women and 745,000 men who have experienced economic or financial abuse.

The highly disruptive tactics used by abusers

    • Perpetrators use a range of tactics, some of which are inadvertently enabled by bank products and services.
    • However, it may be possible to eliminate or reduce the need for these interventions with improved product design to prevent and disrupt abusers.

Taking action against perpetrators

    • It outlines steps banks can take to prevent their products being used as a weapon in domestic and family violence.
    • In banking, this means spelling out the bank’s rules and its expectations of customer behaviour in its terms and conditions.
    • These rules are the foundation of the contractual relationship with the customer and are relied on where there is a dispute.

Banks taking the lead

    • They will be the first Australian banks to signal to millions of bank customers they have a choice: abuse other customers and potentially lose access to their bank account, or behave with respect.
    • Implementation will be complex and the banks will need to proceed with caution.

Consequences for abusers who fail to stop

    • In this instance, there is the option of “de-banking” the customer, which is not only a major inconvenience but also denies them access to an essential service.
    • It is instructive to examine the collective approach the banks have already taken to disrupt technology-facilitated abuse through payment descriptions.
    • It could also be informed by the Council of Financial Regulators’ de-banking policy recommendations on transparency and fairness measures.

Getting the public on board

    • Airlines make it clear jokes about terrorism are not okay, and patrons are ejected from sporting events for violence.
    • The widespread adoption of financial abuse terms and conditions and broad public communication will send a strong message to everyone with a bank account that financial abuse is unacceptable and has consequences.

School phone bans seem obvious but could make it harder for kids to use tech in healthy ways

Retrieved on: 
Wednesday, April 26, 2023

School phone bans may seem like the answer to reining in young people’s technology use. But if we ban phones and bury our heads in the sand about this issue, when and how do our kids learn to have a healthy relationship with technology in a world becoming more tech-focused by the day?

Key Points: 


  • School phone bans may seem like the answer to reining in young people’s technology use. But if we ban phones and bury our heads in the sand about this issue, when and how do our kids learn to have a healthy relationship with technology in a world becoming more tech-focused by the day?

Existing bans in Australian schools

    • Victoria has banned mobile phones in both primary and secondary schools since term 1 of 2020.
    • South Australia is transitioning to a ban in all public high schools by term 3 of 2023.
    • New South Wales will ban them in public high schools in October, as part of a flagship election policy from the incoming Minns government.

Talk of a national approach

    • Last week federal Education Minister Jason Clare called for a national approach, saying he will meet with state and territory counterparts in the middle of 2023 to discuss and encourage this.
    • “I think the time has come for a national approach to the banning or the restriction on the use of mobile phones by students in schools.”

Students need to be included in this

    • not make the decision on our own; talk to parents, talk to principals, talk to teachers about what’s the best approach to take.
    • Scenes of school phone bans gone wrong are all over TikTok, with footage of Australian students breaking open pouches often purchased by schools to lock phones away.

It is easy to see why bans are popular

    • Banning mobile phones is popular with some parents, as it seems like the obvious answer to young people’s problematic technology use.
    • But this popularity is in part underpinned by uncertainty about how to control children’s technology use.
    • Parents often resort to confiscating phones at home when they don’t know how to control children’s use of technology.

What are adults doing?

    • However rather than pointing the finger at the kids, let’s consider what’s happening with the adult population and mobile phones.
    • Adults use their phones all the time, including in places where they should not.
    • As adults we find it very difficult to cope with mobile phone bans.

Where’s the evidence this will work?

    • One 2022 Spanish study did attempt to show that bans had led to better academic results.
    • But a careful reading of the study shows students were still permitted to use phones in schools as a learning tool for educational purposes.
    • Policies need to be made using evidence, and right now we don’t really have any.