May 19 - 25, 2024: Issue 626

 

Australian Government’s Joint Select Committee into Social Media now established: opinions and proposals welcomed

Even blanked themselves for a while during the 2021 'news spat': screenshot from Thursday February 18, 2021

The Hon Michelle Rowland MP, Minister for Communications, announced on May 10, 2024 that the Albanese Government will move to establish a Joint Parliamentary Select Committee into the influence and impacts of social media on Australian society.

Social media is part of everyday life in Australia, and social media companies play a role in determining what content Australians are exposed to online.

Their decisions in recent months – particularly Meta’s decision to withdraw from paying for news in Australia – demonstrate the negative impacts these companies can have on our society, the Government stated in a release.

'Social media has a civic responsibility to its Australian users – and our society more broadly.'

The Government is committed to making social media companies more transparent and accountable to the Australian public, and the Joint Committee will enable Parliament to undertake this task.

Among other matters, the Government expects the Committee will examine and report on:

  1. The decision of Meta to abandon deals under the News Media Bargaining Code;
  2. The important role of Australian journalism, news and public interest media in countering mis- and disinformation on digital platforms;
  3. The algorithms, recommender systems and corporate decision making of digital platforms in influencing what Australians see, and the impacts of this on mental health; and
  4. Other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material.

The Government will consult across the Parliament on the final Terms of Reference ahead of an expected Parliamentary referral next week.

Minister for Communications, the Hon Michelle Rowland MP, stated then:

“Social media is how millions of Australians connect, access news and run small businesses. 

“These social media companies have enormous reach and control over what Australians see with little to no scrutiny.

“In our democracy, it is imperative that Australians have access to quality public interest journalism, including on social media. Unilateral decisions to undermine news hurt us all.

“Social media companies have social responsibilities. They need to be more accountable and transparent.

“Parliament needs to understand how social media companies dial up and down the content that supports healthy democracies, as well as the anti-social content that undermines public safety. 

“Establishing this inquiry will provide opportunity and resources for parliamentarians to closely scrutinise these companies and make recommendations on how we can make these platforms more accountable for their decisions”.

Assistant Treasurer and Minister for Financial Services, the Hon Stephen Jones MP said:

“Social media is a great way for people to connect; it’s become a part of everyday life for many people.

“But users are also exposed to harm in an environment where it can be difficult to distinguish fact from fiction.

“The social media giants seem more determined to wipe trusted news sources from their platforms than scammers and other criminals. This will open the floodgates for misinformation and disinformation.

“We have a clear message for the platforms. Be better. Do better.

“The committee will put big tech under the microscope to help create a safer online environment”.

On Thursday May 16, during a speech to the House of Representatives, the Hon Michelle Rowland MP said:

“Social media has a civic responsibility to its Australian users and our society more broadly. Social media is part of everyday life in Australia, and social media companies play a role in determining what content Australian adults and children are exposed to online. Meta's recent decision to withdraw from paying for news in Australia demonstrates the negative impact these companies can have on Australian news businesses and our democracy. The Albanese government is committed to making social media companies more transparent and accountable to the Australian public, and the joint committee we seek to establish today will enable the entire parliament to undertake this task.

These companies have enormous control over what Australian citizens and consumers see and hear online. Their corporate decision-making impacts the sustainability of Australian public interest journalism and news media. Their business models incentivise maximising attention and screen time to drive profits, often at the expense of public interest objectives such as quality and accurate information and the best interests of children. We know algorithms and recommendation systems dial certain content up and down, often putting on repeat dangerous material, like misogynistic material that reinforces stereotypes counter to the interests of society. The spread of harmful or illegal content, like scams, age-restricted content, and child sexual abuse and violent extremist material, causes harm. The lack of action on misinformation and disinformation sows division, undermines trust and tears at the fabric of society.

The committee will also examine age assurance on social media, something the government has confirmed will be part of the trial funded in this week's budget. This is an important opportunity to scrutinise developments given its widespread interest to Australians. We want Australians to participate openly and safely in society with the same standards and expectations reflected online as well as offline. Social media platforms have immense power and influence, and parliament has a stake in ensuring this is deployed in accordance with our public interest objectives as a nation. Elevating issues in this way is when the parliament is at its best, working together with a common purpose. With the right incentives, social media can contribute more to the welfare of society, enhancing access to trusted sources of news and information and expanding participation in civic life. I call on the House to support this referral.”

Although Meta's decision to withdraw from paying for news in Australia really only impacts the larger news organisations that were benefiting, to the tune of millions added to their other millions, from the arrangement set up under the previous Coalition government, when this was originally mooted it also impacted small 'digital community news noticeboards' such as this one. Social media platforms are used by PON staff between Issues when the local community needs to know about roads closed due to flooding, evacuations due to the same (the BOM was blanked out too during this spat), or to share other items of interest. Despite a wide readership, this is still only a little local digital noticeboard that also runs reports that impact or would be of interest to the local community; although it generates thousands of dollars for local volunteer organisations each year, it is mostly ignored by anyone other than the hundreds of thousands who visit each week, even if not all of those are locals.

However, the inappropriate material that appears unprompted in social media feeds needs to be addressed. It certainly is not addressed by these companies when pointed out to them, despite all these platforms publishing policies stating that they do remove such content.

Residents of our area increasingly report that social media platforms run by big tech companies are failing to follow their own policies on prohibiting or taking down reported sexually explicit material (which has apparently popped up in people's feeds or run straight after videos they were watching on surfing or other completely unrelated subjects), hate speech, credible threats or direct attacks on an individual or group, and content that contains self-harm or excessive violence.

Facebook (Meta) and Twitter are stated to be the highest offenders. 


According to those who have made complaints, reports of these breaches are time and again met with a 'does not breach our standards' reply. The instances have become so frequent that people now state these platforms have become unsafe for most users, including their children. Although social media platforms require users to be 13 years or older to have an account, a 13-year-old is still a child, and many children below this age do access social media platforms.

These platforms are used by predators to access our children and young adults in a way they could not previously. The recent addresses by Reece P Kershaw APM, Commissioner of the Australian Federal Police, and Mike Burgess, ASIO Director-General, to the National Press Club of Australia should be compulsory reading or viewing for parents who are not aware that thousands of Australian children are being abused every year by strangers.

Atop this, what began as a great way to connect with others has become renowned for causing division and putting people at risk of harm; every other week PON runs a report on the Youth page pointing out that the latest TikTok craze or trend is 100% false and will most definitely cause physical, emotional or psychological harm.

Critics state this is intentional, citing the deliberate breaches of multiple laws in multiple countries that would confirm as much.

A 2021 study by David Lauer, 'Facebook’s ethical failures are not accidental; they are part of the business model', available through the National Library of Medicine, opens with:

Facebook’s stated mission is “to give people the power to build community and bring the world closer together.” But a deeper look at their business model suggests that it is far more profitable to drive us apart. By creating “filter bubbles”—social media algorithms designed to increase engagement and, consequently, create echo chambers where the most inflammatory content achieves the greatest visibility—Facebook profits from the proliferation of extremism, bullying, hate speech, disinformation, conspiracy theory, and rhetorical violence. Facebook’s problem is not a technology problem. It is a business model problem. This is why solutions based in technology have failed to stem the tide of problematic content. If Facebook employed a business model focused on efficiently providing accurate information and diverse views, rather than addicting users to highly engaging content within an echo chamber, the algorithmic outcomes would be very different.

What follows is an analysis which, the study states, demonstrates that Facebook:

  • Elevates disinformation campaigns and conspiracy theories from the extremist fringes into the mainstream, fostering, among other effects, the resurgent anti-vaccination movement, broad-based questioning of basic public health measures in response to COVID-19, and the proliferation of the Big Lie of 2020—that the presidential election was stolen through voter fraud;
  • Empowers bullies of every size, from cyber-bullying in schools, to dictators who use the platform to spread disinformation, censor their critics, perpetuate violence, and instigate genocide;
  • Defrauds both advertisers and newsrooms, systematically and globally, with falsified video engagement and user activity statistics;
  • Reflects an apparent political agenda espoused by a small core of corporate leaders, who actively impede or overrule the adoption of good governance;
  • Brandishes its monopolistic power to preserve a social media landscape absent [of] meaningful regulatory oversight, privacy protections, safety measures, or corporate citizenship; and
  • Disrupts intellectual and civil discourse, at scale and by design.
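
The dynamic Lauer describes is, at bottom, an objective-function problem: rank a feed purely by predicted engagement and content that provokes strong reactions gets boosted automatically, with no editor ever deciding to promote it. The minimal Python sketch below is purely illustrative - the posts, weights and field names are all invented for this example and are not drawn from any platform's actual system - but it shows how the ranking objective alone produces the effect:

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int
    is_inflammatory: bool  # stand-in for content that provokes strong reactions

def engagement_score(post: Post) -> float:
    """Rank purely by predicted engagement: likes, comments and shares."""
    score = post.likes + 3 * post.comments + 5 * post.shares
    # Provocative content reliably draws more reactions, so a pure
    # engagement objective ends up amplifying it without "intending" to.
    return score * (1.5 if post.is_inflammatory else 1.0)

posts = [
    Post("Local fundraiser this weekend", likes=120, shares=10, comments=15, is_inflammatory=False),
    Post("Outrage-bait conspiracy claim", likes=90, shares=40, comments=80, is_inflammatory=True),
    Post("Council road closure notice", likes=60, shares=25, comments=5, is_inflammatory=False),
]

# Sort the feed by the engagement objective, highest first.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.text}")

Run it and the outrage-bait post lands at the top of the feed; change the objective to reward accuracy or diversity of sources instead and the ordering changes, which is exactly Lauer's point that this is a business model choice rather than a technology problem.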

The calls for more regulation and for these platforms to work within Australian laws prompted Julie Inman Grant, Australia’s eSafety Commissioner, to issue legal notices to Google, Meta, Twitter/X, WhatsApp, Telegram and Reddit requiring each company to report on steps they are taking to protect Australians from terrorist and violent extremist material and activity.  

The spread of this material and its role in online radicalisation remains a concern both in Australia and internationally, with the 2019 terrorist attacks in Christchurch, NZ and Halle, Germany, and more recently in Buffalo, NY, underscoring how social media and other online services can be exploited by violent extremists, leading to radicalisation and threats to public safety.

The Joint Select Committee on Social Media and Australian Society for the 47th Parliament, appointed by resolution of the Senate on 15 May 2024 and resolution of the House of Representatives on 16 May 2024, is now established and will also undertake a long-overdue examination of the impact of social media on our children.

The committee invites individuals and organisations to send in their opinions and proposals in writing (submissions).

Find out more about how to make a submission on the committee's webpage at: www.aph.gov.au/Parliamentary_Business/Committees/Joint/Social_Media/SocialMedia


Terms of Reference

The Joint Select Committee on Social Media and Australian Society for the 47th Parliament was appointed by resolution of the Senate on 15 May 2024 and resolution of the House of Representatives on 16 May 2024, to inquire into and report on the influence and impacts of social media on Australian society, with particular reference to:

(a) the use of age verification to protect Australian children from social media;

(b) the decision of Meta to abandon deals under the News Media Bargaining Code;

(c) the important role of Australian journalism, news and public interest media in countering mis- and disinformation on digital platforms;

(d) the algorithms, recommender systems and corporate decision making of digital platforms in influencing what Australians see, and the impacts of this on mental health;

(e) other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material; and

(f) any related matters.

That the committee present an interim report on or before 15 August 2024, and its final report on or before 18 November 2024.

The resolution establishing the Committee is available in the Journals of the Senate No. 110 – 15 May 2024.

Resources

If your life is in danger or a person is highly distressed, feeling unsafe and you think they are a risk to you or themselves, dial 000 for immediate assistance.

If you or someone you know needs support, there are resources available on the ACCCE website at accce.gov.au

ThinkUKnow

ThinkUKnow Australia has been run by the Australian Federal Police for more than 10 years. Resources and information are available at: www.thinkuknow.org.au

The AFP is urging parents to lock down their privacy settings on social media accounts and limit the information they share about their children online, including pictures of students in their school uniforms.

The eSafety Guide
Find out how to protect your personal information and report harmful content on common social media, games, apps and sites. Entries are for information only and are not reviews or endorsements by eSafety.
At: www.esafety.gov.au/key-topics/esafety-guide

Support services in New South Wales

Child Protection Helpline
Is a telephone service offering assistance if you have concerns for a child in NSW.
13 21 11 (Operates 24/7)
facs.nsw.gov.au/families/Protecting-kids/reporting-child-at-risk

NSW State Government Mental Health Line
Is a free telephone service that offers professional help, advice and referrals to local mental health services.
1800 011 511 (Operates 24/7)
health.nsw.gov.au/mentalhealth/Pages/default.aspx

Parentline NSW
Is a free telephone counselling and support service for parents and carers with children aged 0 to 18.
1300 1300 52 (Operates 9am-9pm, M-F and 4pm-9pm, S-S)
parentline.org.au 

Support for children and young people

Kids Helpline
Offers a free, confidential telephone and online counselling service for young people aged 5-25 years.
1800 55 1800 (Operates 24/7)
kidshelpline.com.au

Top tips for parents and carers

  • Keep your child's personal information including full name and age private
  • Ensure the background of photos or videos doesn't give away your address or location (and don't post your location or 'check in')
  • Avoid posting photos in school uniform
  • Only share images of your children with people you know and trust - lock your social media accounts so others cannot access them
  • For community accounts, consider having a closed group with approved members and ensure you have strong privacy settings in place.


Investigating social media harm is a good idea, but parliament is about to see how complicated it is to fix

Rob Nicholls, University of Sydney and Terry Flew, University of Sydney

Barely a day has gone by this month without politicians or commentators talking about online harms.

There have been multiple high-profile examples spurring on the conversation. There was the circulation of videos of Bishop Mar Mari Emmanuel being stabbed in the Sydney church attack. The normalisation of violent content online has also been central to the discussion of the domestic violence crisis.

Then, of course, there were the expressions of disdain for the Australian legal system by X (formerly Twitter) owner Elon Musk.

Inevitably, there are calls to “do something” and broad public appetite for changes in regulations. A new parliamentary committee will explore what that change should look like, but will have to contend with a range of legal, practical and ethical obstacles along the way.

Ten busy days

On May 1 and May 10, the government made two major announcements.

The first was a Commonwealth response to some of the online harms identified by National Cabinet. At the May 1 meeting, the Commonwealth promised to deliver new measures to address violent online pornography and misogynistic content targeting children and young people. This included promised new legislation to ban deepfake pornography and funding for a pilot project on age-assurance technologies.

The second was an announcement establishing a Joint Parliamentary Select Committee to look into the influence and impacts of social media on Australian society. The government wants the committee to examine and report on four major issues:

  1. the decision of Meta to abandon deals under the News Media and Digital Platforms Bargaining Code

  2. the important role of Australian journalism, news and public-interest media in countering misinformation and disinformation on digital platforms

  3. the algorithms, systems and corporate decision-making of digital platforms in influencing what Australians see, and the impacts of this on mental health

  4. other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material.

However, the final terms of reference will be drafted after consultation with both the Senate crossbench and the opposition, so they may change a bit.

Why would they do this?

Asking the committee to review the Meta decision is an odd move.

In practice, Financial Services Minister Stephen Jones can “designate” Meta without a referral to the parliament. That is, the minister can decide all of the obligations of the News Media Bargaining Code apply to Meta.

However, a sounding by the committee may help to ensure Meta keeps concentrating on the issue. It also provides the opportunity to restate the underlying principles behind the code and the parlous state of much of the Australian news media.

In relation to harmful or illegal content disseminated over social media, there is already a review of the Online Safety Act underway. The terms of reference seem to ask the committee to provide input into the review.

The issue of misinformation and disinformation has also been the subject of review. The government released a draft of a proposed bill to combat misinformation and disinformation in June 2023. It would give the Australian Communications and Media Authority (ACMA) power to enforce an industry code, or to make one if the industry cannot.

That draft was criticised by the opposition at the time. However, there have been shifts since then and the committee might be a vehicle for the introduction of an amended version of the bill.

An age-old issue

Online age verification is a simple idea that is hard to implement unless there are significant consequences for service providers that do not comply.

Work in this area by the UK’s communications regulator, Ofcom, and the UK Information Commissioner’s Office is often cited as leading practice. However, the commissioner’s website notes “age assurance is a complex area with technology developing rapidly”.

[Image: a group of children in a classroom using smartphones. Measures to limit children’s access to social media will be investigated by the committee. Shutterstock]

One approach is for the minor to identify themselves to a platform by uploading a video or sending a photograph of their ID. This is entirely contrary to the eSafety Commissioner’s messaging on online safety. The Commissioner advises parents to make sure children do not share images or videos of themselves and to never share their ID.

In practice, the most effective age identification for minors requires parents to intervene. This can be done by using software to limit access or by supervising screentime. If children and teenagers can get around the rules simply by borrowing a device from a school friend, age verification might not do much.

As the International Association of Privacy Professionals has found, age verification and data protection are far harder than they look. It is particularly difficult if the age barrier is not one already in place – such as the adult rights that those over the age of 18 possess – but rather a seemingly arbitrary point in the mid-teens. Other than online, the most important age to verify is 18 for things such as alcohol sales and credit. It is also the age at which contracts can be enforced.

Countries vs companies

One issue that is often raised about social media platforms is how Australia can deal with a global business.

Here, the European approach in the Digital Markets Act provides some ideas. The act defines companies with a strong market position as “gatekeepers” and sets out rules they must follow. Under the act, important data must be shared as directed by the user to make the internet fairer and to ensure different sites and software can communicate with each other. It also calls for algorithms to be made more transparent, though these rules are a bit more limited.

In doing so, it limits the power of gatekeeper companies, including Alphabet (Google), Amazon, Apple, ByteDance (TikTok), Meta and Microsoft.

Obviously, Australia can’t harness the collective power of a group of nations in the same way the European Union does, but that doesn’t preclude some of the measures from being useful here.

There is considerable public support for governments to “do something” about online content and social media access, but there are both legal and practical obstacles to imposing new laws.

There is also the difficulty of getting political consensus on such measures, as seen with the debate surrounding the misinformation bill.

But it’s clear that in Australia, both citizens and governments have been losing patience with letting tech companies regulate themselves and shift responsibility to parents.

Rob Nicholls, Senior Research Associate, University of Sydney, and Terry Flew, Professor of Digital Communications and Culture, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.