The Hon Michelle Rowland MP, Minister for Communications, announced on May 10 2024 the Albanese Government will move to establish a Joint Parliamentary Select Committee into the influence and impacts of social media on Australian society.
Social media is part of everyday life in Australia, and social media companies play a role in determining what content Australians are exposed to online.
Their decisions in recent months – particularly Meta's decision to withdraw from paying for news in Australia – demonstrate the negative impacts these companies can have on our society, the Government stated in a release.
'Social media has a civic responsibility to its Australian users – and our society more broadly.'
The Government is committed to making social media companies more transparent and accountable to the Australian public, and the Joint Committee will enable Parliament to undertake this task.
Among other matters, the Government expects the Committee will examine and report on:
- The decision of Meta to abandon deals under the News Media Bargaining Code;
- The important role of Australian journalism, news and public interest media in countering mis and disinformation on digital platforms;
- The algorithms, recommender systems and corporate decision making of digital platforms in influencing what Australians see, and the impacts of this on mental health; and
- Other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material.
The Government will consult across the Parliament on the final Terms of Reference ahead of an expected Parliamentary referral next week.
Minister for Communications, the Hon Michelle Rowland MP stated then:
“Social media is how millions of Australians connect, access news and run small businesses.
“These social media companies have enormous reach and control over what Australians see with little to no scrutiny.
“In our democracy, it is imperative that Australians have access to quality public interest journalism, including on social media. Unilateral decisions to undermine news hurt us all.
“Social media companies have social responsibilities. They need to be more accountable and transparent.
“Parliament needs to understand how social media companies dial up and down the content that supports healthy democracies, as well as the anti-social content that undermines public safety.
“Establishing this inquiry will provide opportunity and resources for parliamentarians to closely scrutinise these companies and make recommendations on how we can make these platforms more accountable for their decisions”.
Assistant Treasurer and Minister for Financial Services, the Hon Stephen Jones MP said:
“Social media is a great way for people to connect, it’s become a part of everyday life for many people.
“But users are also exposed to harm in an environment where it can be difficult to distinguish fact from fiction.
“The social media giants seem more determined to wipe trusted news sources from their platforms than scammers and other criminals. This will open the floodgates for misinformation and disinformation.
“We have a clear message for the platforms. Be better. Do better.
“The committee will put big tech under the microscope to help create a safer online environment”.
On Thursday May 16, during a speech to the House of Representatives, the Hon Michelle Rowland MP said:
''Social media has a civic responsibility to its Australian users and our society more broadly. Social media is part of everyday life in Australia, and social media companies play a role in determining what content Australian adults and children are exposed to online. Meta's recent decision to withdraw from paying for news in Australia demonstrates the negative impact these companies can have on Australian news businesses and our democracy. The Albanese government is committed to making social media companies more transparent and accountable to the Australian public, and the joint committee we seek to establish today will enable the entire parliament to undertake this task.
These companies have enormous control over what Australian citizens and consumers see and hear online. Their corporate decision-making impacts the sustainability of Australian public interest journalism and news media. Their business models incentivise maximising attention and screen time to drive profits, often at the expense of public interest objectives such as quality and accurate information and the best interests of children. We know algorithms and recommendation systems dial certain content up and down, often putting on repeat dangerous material, like misogynistic material that reinforces stereotypes counter to the interests of society. The spread of harmful or illegal content, like scams, age-restricted content, and child sexual abuse and violent extremist material, causes harm. The lack of action on misinformation and disinformation sows division, undermines trust and tears at the fabric of society.
The committee will also examine age assurance on social media, something the government has confirmed will be part of the trial funded in this week's budget. This is an important opportunity to scrutinise developments given its widespread interest to Australians. We want Australians to participate openly and safely in society with the same standards and expectations reflected online as well as offline. Social media platforms have immense power and influence, and parliament has a stake in ensuring this is deployed in accordance with our public interest objectives as a nation. Elevating issues in this way is when the parliament is at its best, working together with a common purpose. With the right incentives, social media can contribute more to the welfare of society, enhancing access to trusted sources of news and information and expanding participation in civic life. I call on the House to support this referral.''
Although Meta's decision to withdraw from paying for news in Australia really only impacts the larger news organisations that were benefiting to the tune of millions, to add to their other millions, under the scheme set up by the previous Coalition government, when this was originally mooted it also impacted small 'digital community news noticeboards' such as this one. Social media platforms are used by PON staff between Issues when the local community needs to know about roads closed due to flooding, evacuations due to the same (the BOM was blanked out too during this spat), or to share other items of interest. Despite a wide readership, this is still only a little local digital noticeboard that also runs reports that impact or would be of interest to the local community. Although it generates thousands of dollars for local volunteer organisations each year, it is mostly ignored by anyone other than the hundreds of thousands who visit each week, even if not all of those are locals.
However, the inappropriate materials that appear on a social media feed unprompted need to be addressed. The companies certainly do not address them when they are pointed out, despite all these platforms publishing policies stating that they remove such content.
Residents of our area increasingly report that social media platforms run by big tech companies are failing to follow their own policies in prohibiting or taking down reported sexually explicit materials (which have apparently popped up in people's feeds or run straight after videos of surfing or other completely unrelated subjects), hate speech, credible threats or direct attacks on an individual or group, and content that contains self-harm or excessive violence.
Facebook (Meta) and Twitter are reported to be the worst offenders.
Their current stated rules and policies are published on their respective websites.
According to those who have made complaints, reports of these breaches are time and again met with a 'does not breach our standards' reply. The instances have become so frequent that people now state these platforms have become unsafe for most users, including their children. Although social media platforms require people to be 13 years or older to have an account, a 13-year-old is still a child, and many children below this age do access social media platforms.
These platforms have been targeted by predators to access our children and young adults in ways they could not previously. The recent addresses by Reece P Kershaw APM, Commissioner of the Australian Federal Police, and Mike Burgess, ASIO Director-General, to the National Press Club of Australia should be compulsory reading or viewing for parents who are not aware that thousands of Australian children are being abused every year by strangers.
On top of this, what began as a great way to connect with others has become renowned for causing division and putting people at risk of harm. Every other week PON runs a report on the Youth page pointing out that some latest TikTok craze or trend will most definitely cause young people physical, emotional or psychological harm, or that it is 100% false.
Critics state this is intentional, pointing to deliberate breaches of multiple laws in multiple countries as confirmation.
A 2021 study, 'Facebook’s ethical failures are not accidental; they are part of the business model' by David Lauer, available through the National Library of Medicine, opens with:
Facebook’s stated mission is “to give people the power to build community and bring the world closer together.” But a deeper look at their business model suggests that it is far more profitable to drive us apart. By creating “filter bubbles”—social media algorithms designed to increase engagement and, consequently, create echo chambers where the most inflammatory content achieves the greatest visibility—Facebook profits from the proliferation of extremism, bullying, hate speech, disinformation, conspiracy theory, and rhetorical violence. Facebook’s problem is not a technology problem. It is a business model problem. This is why solutions based in technology have failed to stem the tide of problematic content. If Facebook employed a business model focused on efficiently providing accurate information and diverse views, rather than addicting users to highly engaging content within an echo chamber, the algorithmic outcomes would be very different.
The study goes on to state that Facebook:
- Elevates disinformation campaigns and conspiracy theories from the extremist fringes into the mainstream, fostering, among other effects, the resurgent anti-vaccination movement, broad-based questioning of basic public health measures in response to COVID-19, and the proliferation of the Big Lie of 2020—that the presidential election was stolen through voter fraud;
- Empowers bullies of every size, from cyber-bullying in schools, to dictators who use the platform to spread disinformation, censor their critics, perpetuate violence, and instigate genocide;
- Defrauds both advertisers and newsrooms, systematically and globally, with falsified video engagement and user activity statistics;
- Reflects an apparent political agenda espoused by a small core of corporate leaders, who actively impede or overrule the adoption of good governance;
- Brandishes its monopolistic power to preserve a social media landscape absent [of] meaningful regulatory oversight, privacy protections, safety measures, or corporate citizenship; and
- Disrupts intellectual and civil discourse, at scale and by design.
The calls for more regulation and for these platforms to work within Australian laws prompted Julie Inman Grant, Australia’s eSafety Commissioner, to issue legal notices to Google, Meta, Twitter/X, WhatsApp, Telegram and Reddit requiring each company to report on steps they are taking to protect Australians from terrorist and violent extremist material and activity.
The spread of this material and its role in online radicalisation remains a concern both in Australia and internationally, with the 2019 terrorist attacks in Christchurch, NZ and Halle, Germany, and more recently in Buffalo, NY, underscoring how social media and other online services can be exploited by violent extremists, leading to radicalisation and threats to public safety.
The Joint Select Committee on Social Media and Australian Society for the 47th Parliament, appointed by resolution of the Senate on 15 May 2024 and resolution of the House of Representatives on 16 May 2024, is now established and will also undertake a long-overdue examination of the impact of social media on our children.
The committee invites individuals and organisations to send in their opinions and proposals in writing (submissions).
Find out more about 'how to' on their webpage at: www.aph.gov.au/Parliamentary_Business/Committees/Joint/Social_Media/SocialMedia
Terms of Reference
The Joint Select Committee on Social Media and Australian Society for the 47th Parliament was appointed by resolution of the Senate on 15 May 2024 and resolution of the House of Representatives on 16 May 2024, to inquire into and report on the influence and impacts of social media on Australian society, with particular reference to:
(a) the use of age verification to protect Australian children from social media;
(b) the decision of Meta to abandon deals under the News Media Bargaining Code;
(c) the important role of Australian journalism, news and public interest media in countering mis and disinformation on digital platforms;
(d) the algorithms, recommender systems and corporate decision making of digital platforms in influencing what Australians see, and the impacts of this on mental health;
(e) other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material; and
(f) any related matters.
That the committee present an interim report on or before 15 August 2024, and its final report on or before 18 November 2024.
The resolution establishing the Committee is available in the Journals of the Senate No. 110 – 15 May 2024.
Resources
If your life is in danger, or a person is highly distressed, feeling unsafe, and you think they are a risk to themselves or others, dial 000 for immediate assistance.
If you or someone you know needs support, there are resources available on the ACCCE website at accce.gov.au
ThinkUKnow
ThinkUKnow Australia has been run by the Australian Federal Police for more than 10 years. Resources and information are available at: www.thinkuknow.org.au
The AFP is urging parents to lock down their privacy settings on social media accounts and limit the information they share about their children online, including posting pictures of students in their school uniforms.
The eSafety Guide
Find out how to protect your personal information and report harmful content on common social media, games, apps and sites. Entries are for information only and are not reviews or endorsements by eSafety.
At: www.esafety.gov.au/key-topics/esafety-guide
Support services in New South Wales
Child Protection Helpline
A telephone service offering assistance if you have concerns for a child in NSW.
13 21 11 (Operates 24/7)
facs.nsw.gov.au/families/Protecting-kids/reporting-child-at-risk
NSW State Government Mental Health Line
A free telephone service that offers professional help, advice and referrals to local mental health services.
1800 011 511 (Operates 24/7)
health.nsw.gov.au/mentalhealth/Pages/default.aspx
Parentline NSW
A free telephone counselling and support service for parents and carers of children aged 0 to 18.
1300 1300 52 (Operates 9am-9pm, M-F and 4pm-9pm, S-S)
parentline.org.au
Support for children and young people
Kids Helpline
Offers a free, confidential telephone and online counselling service for young people aged 5-25 years.
1800 55 1800 (Operates 24/7)
kidshelpline.com.au
Top tips for parents and carers
- Keep your child's personal information including full name and age private
- Ensure the background of photos or videos doesn't give away your address or location, (and don't post your location or 'check in')
- Avoid posting photos in school uniform
- Only share images of your children with people you know and trust - lock your social media accounts so others cannot access them
- For community accounts, consider having a closed group with approved members and ensure you have strong privacy settings in place.