September 29 - October 27, 2024: Issue 635

 

Social Media's adverse impacts on communities - young people: October 2024 Post-Summit Update

The NSW and SA Governments partnered to deliver a first-of-its-kind, two-day, two-state summit focussed on exploring and addressing the impacts of social media.

The NSW Government held sessions at ICC Sydney on Thursday, October 10 and the Government of South Australia hosted sessions at the Adelaide Convention Centre on Friday, October 11.

The summit brought together experts, policymakers, academics, young people, and community voices to discuss the positive and negative impacts of social media on people’s lives and how government can best support digital wellbeing. 

The summit will inform the design and delivery of a range of policies, programs and resources to address the challenges posed by social media.

As reported earlier this month, the NSW Government released the findings of a major statewide survey revealing widespread community concerns about the impact of social media on children, young people, and broader society.

Visit: Social Media Summit Update: Survey highlights growing concerns 

While there is widespread public support for age restrictions on social media, some experts argue there are positives for young people using social media, and that a ban won't make it safer. A survey by youth mental health service ReachOut, released on Monday, found 73% of young people were accessing mental health support through social media.

ReachOut’s research report, Mind Over Media: Supporting youth mental health in the digital age, aims to provide insights into how digital platforms can facilitate accurate, safe and relevant information and features to support mental health and wellbeing. 

“This research report underscores the importance of collaborative efforts in policy-making to ensure that digital spaces contribute positively to youth mental health and wellbeing, making a compelling case for the integration of safety-focused designs on social media platforms,” ReachOut stated.

The report states:

  • 73% of young people regularly use social media to search for mental health information, or have done so in the past.
  • 35% of young people with a probable serious mental illness search for mental health information on social media once a week or more.
  • More than 50% of young people facing mental health challenges use social media as a substitute for professional support.

Some of the key research findings included:

  • Young people want to be able to easily discern the credibility of mental health information they come across on digital platforms, including social media.
  • Young people see online platforms as instrumental in raising awareness of mental health challenges, through quality information and shared accounts of lived experience.
  • Young people want more safe and supportive spaces to share their mental health journey.
  • Young people want better tools for combatting misinformation on digital platforms and for managing excessive use of social media.

An open letter signed by 100 academics, released Tuesday and published by the Australian Child Rights Taskforce, expresses concerns regarding proposals to ban children under 16 from social media platforms.

“We understand the risks that social media has for children and young people, and these are well documented,” the document states.

“Addressing those risks requires a careful and evidence-based response that acknowledges the role that the digital world plays in contemporary childhood. The online world is a place where children and young people access information, build social and technical skills, connect with family and friends, learn about the world around them and relax and play. These opportunities are important for children, advancing children’s rights and strengthening development and the transition to adulthood.

Any restrictions in the digital world must therefore be designed with care and we are concerned that a ‘ban’ is too blunt an instrument to address risks effectively.”

Key focus areas of the Social Media Summit were:

  • Impacts of social media on children and young people's wellbeing
  • Online safety
  • Social media's role in disinformation and misinformation
  • Addressing online hate and extremism
  • How social media is changing the way government delivers services

Below are some of the addresses given at the 2024 Social Media Summit. The full live stream of Day One, held in Sydney, is embedded below. Day Two of the summit may be accessed on the South Australian Government's website.

To stay up-to-date with developments, visit the NSW Government's Social Media Summit webpage.

____________________________________________

Premier of NSW, Chris Minns
Opening Address to the Social Media Summit 

International Convention Centre, Sydney
Thursday, October 10, 2024

Thank you so much Matty and thank you to Yvonne Weldon for that Acknowledgement of Country.

What a beautiful and generous gesture for our Indigenous leaders to welcome us to this land.

It's such an important part of the opening of summits like this, and that practice will continue in New South Wales under our government. We think it's fundamentally important and intrinsic to who we are as a people in this state and this country. Thank you so much, Yvonne.

I'd like to acknowledge the Gadigal people on whose land we meet today, and can we just get a round of applause for all of the young people in the room today.

Many of them are at university in the closing stages of their degrees, but a lot of them are in the midst of school holidays, so it's a massive effort to put aside the school holidays and be here with all of us, and we genuinely appreciate it.

I want to thank you all for coming today, and for the many people that came across the state, and even across the country, to take part in what is a unique, cross state summit between New South Wales and South Australia.

We're not used to having policy discussions between two states that don't involve a border dispute, or us demanding more money from the Commonwealth government.

Or swapping jokes about Victoria. Which, we're happy to do a little bit later.

But this is an important issue that is fundamental to our two states, and I'd like to see more of it: the Commonwealth and the states working together on issues that are affecting the people of New South Wales, and in this case, South Australia, but genuinely all of us.

I'd also like to acknowledge the work that's been done by South Australia and the Premier, Peter Malinauskas, who is going to speak in a moment: firstly, for instituting what was a brave and nation-leading mobile phone ban across his state.

South Australia was the first in the country to do so. And secondly, for commissioning the independent legal examination by Justice Robert French into restricting access to social media for children, and for really getting the ball rolling when it comes to reform in the social media space.

It's been nation leading, an important conversation, and really been driven by the Premier of South Australia and it's fantastic that he's here with us today.

Delegates, a year ago this state followed the decision of the South Australian government and restricted the use of mobile phones in public schools across the state.

But of all the announcements we made at the last election, and I think the Deputy Premier would agree, with hundreds of commitments and billions of dollars worth of election commitments and projects at stake, not many other issues generated more controversy than what was colloquially called 'the phone ban.'

Now, we had our supporters, lots of parents and some teachers.

But there was also a backlash, mostly from students themselves, who genuinely didn't know, as they expressed to me and Prue, how they were going to get through the day to day without their phones being in their laps.

Now, I think it would be easy, speaking today, to ridicule those kids, but bear in mind that adults have exactly the same addictions in their lives.

And if we put ourselves in the shoes of children, young people, students, if we think about the life they're currently living, that fear is completely understandable.

And it's not just kids. It's all of us.

We've all got these devices in our pockets, all day, whispering in our ears that we need to log on, we need to check our notifications, we need to watch this video, we can't stop scrolling because we don't know what the next image is that we'll see.

So I can understand where that reluctance came from.

The fortunate thing about the policy that was instituted by the government is that it was, in effect, a natural experiment.

We had the same group of people, in the same setting, before and after the phone restrictions came in, so we could test the impact it was having in each individual school.

And delegates, the impact has been massive.

Since the ban, teachers have reported a reduction in behavioural issues, fewer suspensions, higher attendance rates, less bullying in the school room and the school yard.

We've seen more concentration in our classrooms and stronger connections outside of it.

In fact, the Deputy Premier and Minister for Education and I recently met students at East Hills, who told me that they'd grown much closer as a result of the change, that their friendships had strengthened, and that phones had been left off the playground.

We've also seen a welcome return to active play at lunchtime, with more sport, more exercise, more physical activity.

In just one example, teachers at a school in Blacktown have reported an extra fifty minutes of learning time a day, with fewer confrontations between students and teachers, and between students themselves, and a reduction in online bullying.

So, we believe the evidence is overwhelming. The changes to phones in schools have been a huge success.

And from that success I think we can learn a few lessons as we roll into this summit and confront some of the changes that we want to see in the country.

Firstly, that we do have some control over the incursion of technology into our lives.

That it's not all a one-way street of more technology, forever and ever, with no restrictions or regulations on the other side.

Secondly, that we can, in a democracy, take the best of technology, while preserving space for other things - important things - like concentrating on the task at hand, the mental health of people in our community and, fundamentally, human connection. Physical, human connection between people.

We cannot, and we should not, outsource these questions to what are effectively, unelected billionaires living in Silicon Valley.

And thirdly, we've got the right to be suspicious of the impact social media is having on our society and our young people in particular. Our first responsibility should be to do no harm.

A healthy scepticism doesn't make you backward or nostalgic or some kind of modern luddite.

If you're concerned about your kids and the impact these sites are having on their mental health, or their body image, or their sense of personal confidence in the world, well we think that you're right to be concerned.

And we want to get the information on the table to make the best decisions, not just from governments, but for communities and families.

Studies have found that kids who spend more than three hours a day on social media are twice as likely to experience poor mental health outcomes, and that includes depression and anxiety.

And many people are worried about the ability to concentrate - about your own ability to concentrate, for that matter. That's adults as well.

To finish a book like we used to, or to stay concentrated on the work that's at hand.

Now, these are reasons to take action, because, by any measure, our ability to focus as a society has fallen as social media use has risen and become ubiquitous.

Research bears this out. The OECD's worldwide Programme for International Student Assessment (PISA) found in 2022 that the use of phones had an impact on classroom learning.

Across the OECD, 65% of students reported being distracted by using digital devices in at least some of the maths classes they were taking.

The report found that digital distraction has a strong association with learning outcomes, as students who reported being distracted by other students using these devices scored 15 points lower in some of those annual standardised tests.

To illustrate how much of a drain that 15 points is, it's the equivalent of three-quarters of a year of learning, in maths or any subject in education. A massive deficiency.

And if you're worried about our civil society and our democracy, or the state of our public debate; if you're watching a relative fall down what can be an extreme rabbit hole in one of the dark corners of the internet, then I think many people in this room share these concerns and are up for practical ways of making a difference.

We know these algorithms are encouraging extremist views, because that's how they keep you clicking on the site for longer periods of time.

Delegates, over the next two days, you will hear from some of the world's most credentialed experts on these questions, from Australia and around the world.

Soon you will hear from Dr Jean Twenge, who has come here from the United States. She is Professor of Psychology at San Diego State University and the author of more than 190 scientific articles and publications on the link between adolescent mental health and social media.

A bit later on, we'll have a second keynote address from Frances Haugen, a courageous whistleblower from Meta who revealed how these sites are knowingly damaging young minds while allowing misinformation to run rampant through the corridors of the internet.

And throughout the day and in Adelaide tomorrow, we'll hear from people on the frontline. From parents, local experts and most importantly from young people themselves.

Who are negotiating these questions in their own lives without any historical frame of reference.

As has been repeatedly pointed out, it's almost a global, unregulated experiment on young people.

Delegates, throughout these discussions, it's important, I think, that we maintain a practical focus.

We need to understand our problem but perhaps more importantly, we need the tools that can effectively respond to the problems.

As we've shown with the changes to phones in schools, it is possible, as long as we're targeted and determined, as well as realistic.

This summit is not about turning back the clock, or recreating the world as it was before smart phones and social media.

Even if we wanted to, we couldn't.

But the truth is, we don't want to.

Technology and the internet have driven progress. They have unleashed new forms of creativity. They have connected us in new and wonderful ways and they have helped people grow and start new businesses across the country and the world, linking up communities that previously would never have run into one another.

What we're here to do is make sure this technology is working for us, rather than us working for the technology.

It's a mighty task, one of the great questions I think of the twenty first century.

But the only way we can begin to answer it is to start right here.

To clearly describe the kind of digital lives we want children to have.

And to develop the kind of interventions we can make, for a healthier, happier, and ultimately better and more productive society.

This is something we've all been working out in our own way, as parents, as citizens, as voters, as individuals on the internet.

But as individuals, there's only so much we can do.

As a parent, you're up against some of the biggest companies that have ever existed in the global economy.

Which is why we believe we should confront these challenges as a community.

We're not just consumers of technology.

We're citizens in a democracy.

It's up to us to decide the kind of country that we want to live in and the lives we want for our children and the next generation.

And that's what this summit is about.

It's a huge responsibility but we firmly believe, both Peter and myself, that if we assemble the best information, if we can formulate and articulate the right policies, then we can set a new course and take back some control.

And most importantly, give our kids the best chance for a full and happy life.

________________________________________________

Raise the age to 16
Mark Speakman
Leader of the NSW Opposition

Friday October 11, 2024

With the rise of social media fuelling increasing levels of cyberbullying, anxiety, depression and even suicide among teenagers, the NSW Opposition remains committed to supporting raising the minimum age for social media use from 13 to 16. 

In August, the NSW Opposition endorsed the 36 Months campaign, which spotlighted the critical dangers social media poses to children during the most vulnerable stage of their psychological development, between the ages of 13 and 16. 

The Legislative Assembly carried the motion of the Leader of the NSW Opposition Mark Speakman to support raising the legal age of access to social media from 13 to 16 and to call on the NSW and Federal Governments to work together to implement this much-needed change by July 2025. 

Mr Speakman said our children are facing a mental health crisis driven by the unchecked influence of social media. 

“Parents feel powerless, watching their kids struggle under the pressure of a digital world they’re not ready for. By raising the social media age to 16, we’re giving families back control and protecting our young people when they need it most,” Mr Speakman said. 

New data from the eSafety Commissioner revealed that 84% of children aged 8 to 12 have accessed social media platforms. Alarmingly, 1.3 million Australian children under 13 are currently using social media, exposing them to risks such as cyberbullying, online predators and harmful content that severely impacts their mental health. 

Peter Dutton, Leader of the Federal Opposition, has also committed to raising the national minimum age for social media use to 16 within his first 100 days in office, ensuring that this critical change is implemented nationwide. 

Parents, community groups and mental health experts have repeatedly called for stronger safeguards to protect young people. 

We’re not just talking about numbers—these are real lives, real families, and real tragedies. It’s time to stand up to the tech giants and put our children’s futures first. 

The NSW Opposition is committed to standing alongside parents and families across the state, ensuring that governments act now to safeguard the mental health and well-being of young Australians.

________________________________________________

Australian Security Intelligence Organisation (ASIO) Director-General's Social Media Summit address
Director-General of Security, Mike Burgess AM

Friday October 11, 2024

I’d like to start by acknowledging the traditional custodians of the land on which we gather here, the Kaurna people, and pay my respects to elders past, present and emerging. 

I would also like to extend my respects to First Nations people here today and streaming online. 

Premiers, ministers, judges, distinguished guests. 

If I might also say at this time, to Chloe, Brendan, Denzel and India: thank you for sharing your voices this morning; you were incredibly powerful. 

I would have to admit that at the age of 20 or even younger, I would not have had the courage to stand on a stage like this, so it is a real credit to you. 

But most of all, it is important right now that we hear from younger Australians, so thank you for adding your voices to this important debate. 

And last but not least, I would like to acknowledge our law enforcement partners here today, thank you for your service.

Social media is both a goldmine – and a cesspit, simultaneously social and antisocial.

It creates communities – and divides them.

A rich source of information – and disinformation.

An effective way to engage with family and friends – and an effective way for scammers, spies and extremists to engage with you.

Social media does all these things, at speed and scale. 

This makes it so attractive – and is why it is so difficult for agencies like mine.

Based on what I see, the internet is the world’s most potent incubator of extremism.

And social media is the world’s most potent accelerator of that extremism.

The digital platforms are not generating new security threats, but almost certainly they are amplifying, accelerating and aggravating existing ones.

I am particularly concerned about the implications for young people, the most enthusiastic participants in the digital ecosystem.

This conference is addressing a range of social media challenges. I’ll restrict my comments to ASIO’s lane – the national security implications. 

And I’ll leave potential solutions to the experts – legislation to the legislators, regulation to the regulators and policy to the policy-makers. 

However, I will make one general point upfront: no form of technology, no corner of the internet, should be above the rule of law. Social media cannot be without a social licence.

Australia’s security environment is complex, challenging and changing.

Since 2022 ASIO has assessed that espionage and foreign interference are this country’s principal security concerns.

This year we added politically motivated violence to that list, when we raised the national terrorism threat level to PROBABLE.

We assess that more Australians are being radicalised, and being radicalised more quickly.

More Australians are embracing a more diverse range of extreme ideologies, and more Australians are willing to use violence to advance their cause.

Political differences, political debates and political protests are an essential part of a healthy democracy. 

Unfortunately, here and overseas, we are seeing spikes in political polarisation, intolerance, uncivil debate and unpeaceful protests.

Anti-authority beliefs are growing; trust in institutions is eroding; provocative and inflammatory behaviours are being normalised.

There are connections between inflamed language and inflamed community tensions, just as there are connections between extreme content online and extreme behaviours in the physical world. 

The riots in the United Kingdom are a case in point – social media inflamed a toxic combination of grievance, anger and misinformation, with bloody real-world consequences.

These trends increased during COVID, gained further momentum after the terrorist attacks in Israel, and accelerated during Israel’s military response.

Individuals are embracing anti-authority ideologies, conspiracy theories and diverse grievances. Some are combining multiple beliefs to create new hybrid ideologies. 

Many of these individuals will not necessarily espouse violent views but may still see violence as a legitimate way to effect political or societal change.

All this raises the temperature of the threat environment, creating a security climate where violence is more permissible.

Social media is not the sole driver of these trends, but ASIO considers it a significant driver.

Social media allows extremist ideologies, conspiracies, dis- and misinformation to be shared at an unprecedented scale and speed. 

Social media helps extremists find and build relationships with other extremists more quickly, easily and securely.

Social media exposes vulnerable individuals to extremist content and extremist communities.

Social media creates echo chambers of grievance, polarisation and reinforcement, paving a pathway to potential radicalisation.

At the height of ISIS and al-Qa‘ida, individuals would often be radicalised in person, over an extended period of time.

Now, individuals can be self-radicalised, and the process can take days and weeks rather than months and years.

At the height of ISIS and al-Qa‘ida, individuals would often be influenced by family members or associates who held extremist views.

Today, the most likely perpetrator of a terrorist attack is a lone-actor, from a family previously unconnected to extremism.

At the height of ISIS and al-Qa‘ida, extremism tended to be concentrated in major cities.

Now, extremism is much more diffuse – and much more diverse.

Again, the internet and social media are not solely responsible for these shifts, but I believe they are significant contributors.

It is important to note that there is no single pathway to extremism, just as there is no single profile of a person likely to take that path. 

Some people are radicalised in the real world. Others are radicalised online. For many individuals, it is a combination of both.

What is clear, though, is that social media and the internet more broadly are making extremist content more accessible, more digestible and more impactful than ever before.

The Christchurch massacre is just one example. The perpetrator used the internet to research and refine his ideology, and social media to livestream his rampage.

Almost all of ASIO’s counter-terrorism investigations involve an online element.

In some cases, the internet or social media can act as the initial avenue of exposure to violent extremism. 

In others, they allow engagement with other people holding extremist views. In some, they accelerate existing extremist ideologies, particularly where online narratives and content play into existing grievances. 

We’ve seen one case where the alleged perpetrator explicitly acknowledged that the availability of online extremist content drove them – and I quote – “over the edge”.

The internet and social media are obviously closely connected and often inter-dependent.

Extremist groups use social media as both radicalisation and recruitment platforms. 

They share toxic ideologies, swap tips and tactics in videos and post propaganda. 

They share content from the internet on social media, and use social media as a gateway to dark parts of the internet – places like a Telegram chat room known as Terrorgram.

Nationalist and racist violent extremists – including Australians – are using Terrorgram to communicate with offshore extremists and each other, discussing how to provoke a race war in this country.

Terrorgram and platforms like it give users a sense of community, normalise violence, create echo chambers that reinforce extreme beliefs, and bestow anonymity that encourages further extremism. 

Critically, they also offer protection through encryption. End to end encryption is a feature of multiple social media platforms, and it makes ASIO’s and law enforcement’s ability to investigate much more difficult, expensive and time-consuming. 

I recognise privacy is important, but it is not absolute. As I said earlier, technology should not be above the rule of law.

While the internet and social media are closely connected, there are key distinctions.

The internet requires you to search for and find extremist content. Social media is designed to make it easier for the content to find you.

The algorithms that drive social media also drive extremism and radicalisation.

The social media business model depends on engagement – the deeper and longer the better – and as a result, it rewards extreme content.

The more outrageous, controversial and polarising the content is, the more engagement it will get – more hits, more views, more shares. 

To the algorithm, clicks and shares are critical, accuracy and efficacy are not.

We need more research to conclusively connect this dynamic with the more angry, alienated, divided and intolerant security environment I described, but it cannot be a coincidence.

In the violent extremist context, if someone engages with extremist material, the algorithm will recommend more extremist material – and potentially more extreme extremist material. Studies suggest users can go from mainstream content to extreme content in just a few clicks. 

It is like travelling around the world with no borders.

Let me give you a real world example of the online journey. Users of a very popular social media platform can visit a page administered by individuals who subscribe to a radical, hard-line religious interpretation. 

The page espouses strong views but does not advocate violence, hate speech or terrorism. 

Once you visit the page, the algorithm recommends similar pages it thinks you will like. If you go to one of the recommended pages, you’re encouraged to join an English language Telegram network. 

The Telegram network is explicitly pro-ISIL, containing propaganda videos, messages of support for Australian terrorists, statements justifying, and I quote, “spilling the blood”, and letters purportedly written by Australians convicted of terrorism offences. Two clicks is all it takes to go from the radical to the violently extreme, guided by an algorithm that thinks it is being helpful.

The algorithm’s ‘engagement imperative’ is so powerful, it can screen out contrary content or opinions that could provide balance and moderate radicalisation. 

This can be even more pronounced when people gain views entirely online, without any offline influences from their family, other networks or community to provide counter-balance. 

Researchers talk about individuals going down a rabbit hole. Based on the violent content I’ve seen, it’s more like a hell hole, especially when the user is a child.

One of the studies I referred to a moment ago found that users as young as 13 can easily access incel material on a social media app. Incel is short for ‘involuntary celibate’ and refers to men who espouse a violent misogynist ideology. Incels have committed terrorist acts overseas.

According to the research, if a user spends just ten minutes looking at incel material, the algorithm will start recommending more and more violent misogynist propaganda, including posts glorifying incel terrorists. 

Not everyone viewing extremist material or engaging with extremists will be radicalised, but a combination of factors makes young people particularly vulnerable:

  • Minors spend significant amounts of time on the internet and social media, increasing the risk of exposure to harmful material.
  • By definition, adolescents are going through a tough time of physical and emotional change. Often, they are seeking an identity and seeking a community, especially if they are socially isolated.
  • Online environments provide an avenue for first approaches by extremists, including through seemingly innocuous platforms such as Discord.
  • The anonymity of the online ecosystem allows young people to engage with adults much more easily and equally than they do in the physical world.
  • A growing prevalence of mental health issues in young people is a further complication.

All these factors make young people particularly susceptible to radicalisation if they are exposed to extreme content and extreme communities.

During COVID, when young people were socially isolated and spending even more time online, teenagers represented around fifty per cent of our priority counter-terrorism caseload. It’s why I talked about it at the time. Those numbers fell when COVID restrictions were lifted, but we are now seeing a disturbing resurgence. Around twenty per cent of our priority cases involve minors.

All of Australia’s most recent cases of alleged terrorism, or events that are still being investigated as potential acts of terrorism, were allegedly perpetrated by young people.

The oldest, 21; the youngest, 14.

The internet was a factor in every single one of these incidents, albeit to different degrees and in different ways. 

As a nation, we need to reflect on why some young teenagers are hanging Nazi flags and portraits of the Christchurch killer on their bedroom walls, why others are sharing beheading videos in the schoolyard and, more concerningly, why there are young Australians willing to kill in the name of their beliefs.

When ASIO and law enforcement are dealing with this problem, it is usually too late.

The community can play a pivotal role in identifying signs a teenager isn’t just going through adolescence, but is heading towards radicalisation. 

Radicalisation can occur with limited or no external indicators when it occurs entirely online. 

Now this is the hard part: the changes that indicate radicalisation require more research.

As parents, we can’t assume a child’s online hours are only spent on safe apps or age-appropriate apps.

I’m certain no responsible adult would allow their child out after dark, aimlessly exploring alleyways or hanging out with adult strangers.

In one generation, we have allowed our children full access to the alleyways, content and people that they would not be able to access in the physical world.

The internet and social media are great for homework and hobbies, but from where I stand, our vulnerable children are at risk.

Families, carers and communities should notice and ask questions if young people spend an inordinate or inappropriate amount of time online, or are accessing and sharing inappropriate material. 

Children often start with moderately objectionable material, which then becomes worse and worse—identifying it early can be critical.

I’m often asked about the impact artificial intelligence will have on these trends.

The simple answer is this: 

if the internet is the world’s greatest incubator of extremism and social media is the world’s biggest accelerator, AI will augment the incubation and accelerate the acceleration.

Of course, the full answer is a little bit more complicated.

Artificial Intelligence is HOT: equal parts Hype, Opportunity and Threat.

‘Hype’ because there’s a yawning chasm between current reality and what’s being claimed by tech-evangelists and marketing gurus. A lot of what they call AI isn’t.

‘Opportunity’ because there is no doubt, stripping away the hype, AI will deliver dividends to every part of society.

‘Threat’ because the productivity benefits of AI also extend to those who could use it to threaten Australia’s security.

ASIO assesses that artificial intelligence will allow a step change in the threat environment.

AI is likely to make radicalisation easier and faster. 

We are already aware of extremists experimenting with AI, and it is likely they will try to use it to improve their recruitment campaigns, including through social media.

We also anticipate artificial intelligence will increase the volume of espionage.

It will empower much more bespoke, personalised social engineering.

It will also allow foreign intelligence services to conduct more prolific and more credible disinformation campaigns.

ASIO is tracking and monitoring all the likelihoods and their implications for Australia’s security.

Of course, the most obvious and possibly best defence against adversary use of AI is AI itself.

A company capable of building such powerful social media algorithms (and these are rich, smart companies) should also be capable of harnessing AI to better identify, moderate and remove extreme material, especially when it is being fed to our young.

None of the challenges I’ve laid out today have simple solutions.

ASIO is certainly not the answer, nor do we want to be.

Any proposal to regulate social media must be balanced against free speech, free choice and the free market.

And we need more research to confirm what I’m describing involves causation not correlation. 

I would like to acknowledge the researchers on the panel today and the work here in South Australia, where Adelaide and Flinders Universities are researching the influence of violent extremist ideologies on young people in online communities.

To assist with this effort, ASIO is leading work by the Five Eyes security and police services to produce an unclassified paper on minors and terrorism. We hope our insights will be used by experts in sociology, psychology and other disciplines in the search for solutions.

We can only counter these threats with a whole-of-government, whole-of-society, whole-of-nation response. 

Even that won’t be enough, given the challenge, the technology and the tech companies are all beyond our borders.

A summit like this is a great start. 

It is a timely reminder that security and safety are shared responsibilities, and as adults, we all have a responsibility to protect our children.

Thank you.

________________________________________________

Social Media Summit address
The Hon Michelle Rowland MP, Minister for Communications

Friday, October 11 2024

Good afternoon, 

Thank you, Premier Peter Malinauskas for inviting me to speak on behalf of the Prime Minister, the Honourable Anthony Albanese. 

It is wonderful to be in Adelaide for this joint Summit focussed on a very important discussion taking place nationally, and around the world. 

I acknowledge the Traditional Owners – the Kaurna people – and pay respect to Elders past and present. I extend this to First Nations people attending. 

Thank you to New South Wales Premier Chris Minns for hosting Day One of the Social Media Summit in Sydney. 

And thanks to you – the experts, academics, policy makers and young people – who have come together to share your insights and experiences in this space. 

A space that has evolved exponentially over decades. 

Australia’s first Minister for Communications was known as the Postmaster General. 

Established at Federation, the Minister’s responsibilities were the provision of postal and telegraphic services throughout Australia. 

It wasn’t until 1975 that the Department’s name changed to reflect the rise of electronic media. 

Fast forward to today, and the internet continues to undergo significant change; as do the challenges faced by governments and regulators. 

We are now raising the second generation of digital natives. 

Social media is ubiquitous and a normal part of life for many young people. 

It can be a source of entertainment, education and connection with the world - and each other. 

But we are also seeing social harms affecting young people. 

And it is for this reason that we are here today.  

The Albanese Government understands parents and communities are concerned about the harmful impacts of social media and want action. 

Social media has a social responsibility. We know they can – and should – do better to address harms on their platforms. 

Governments around the world are grappling with this. 

No government, no regulator and no law can protect every child from every threat, every day. 

But we must work together to support our children to be happy, healthy and safe. 

The number one priority of the Albanese Government is the safety of Australians, including online. 

Australia is a world-leader when it comes to online safety, and I want to acknowledge the terrific work of our eSafety Commissioner, Julie Inman Grant. 

Online safety has traditionally been an area of bipartisanship in Australia, and that has served us well. 

Our Government is taking action on a number of fronts. 

Today, I will set out the Commonwealth’s approach to legislating a national minimum age for social media access – our latest effort to address online safety. 

This is significant reform. 

And we will work with State and Territory governments, regulators, experts, industry and the community. 

Today, I will cover three things:

  1. The pragmatic approach we are taking to social media age limits;
  2. The design principles that will underpin our reforms;
  3. And, finally, how this aligns to our whole-of-government approach to improving online safety.

As a mother of two young daughters, I understand that parents worry about the amount of time their children spend on social media. 

Research released by eSafety yesterday explored children’s use of online services, including social media, in 2024. 

The Social Media Pulse Survey found a significant number of children aged 8-12 are spending time on digital platforms. 

84 per cent reported using at least one online service, including social media or messaging services, since the start of this year. 

While the proportion of overall users increased with age, a significant majority – three quarters – have accessed an online service by 8 years old. 

More than two-thirds of children aged 12 have their own accounts. 

As parents, we also worry our children may unintentionally access harmful, distressing and age-inappropriate content on their feeds. 

We know that almost two-thirds of 14 to 17 year-olds have viewed extremely harmful content online including drug abuse, suicide or self-harm, as well as violent and gory material. 

A quarter have been exposed to content promoting unsafe eating habits. 

This is unacceptable and must be addressed. 

As Communications Minister, I have been engaging with a wide range of stakeholders in this space - and I have learned a lot. 

Young people tell me social media allows them to connect and feel socially included. 

It can be an entry point to health and mental health support, a creative outlet, or a platform for legitimate children’s programming. 

But young people also understand the need for protection. 

Survey data released by the Minns Government in the lead-up to the Summit highlighted widespread community concern. 87 per cent of survey respondents said they support age limits for social media. 

The national conversation has seen a range of ages proposed. We welcome this input. 

Let me also take the opportunity to acknowledge the extensive work of former High Court Chief Justice Robert French. 

Our age assurance trial is evaluating technologies that could be effective in age-limiting access to social media platforms from 13 up to 16 years. 

And preventing people under 18 from accessing online pornography. 

The trial includes targeted stakeholder consultation and consumer-focussed research looking into attitudes towards different technologies, and issues of privacy, security and accessibility. 

The Albanese Government has also brought forward the independent review of Australia’s Online Safety Act by a year. 

This critical and comprehensive body of work is looking at how to ensure our regulatory settings keep pace with emerging online harms and are fit for purpose. 

I look forward to receiving the final report in coming weeks. 

The Albanese Government has asked the States and Territories for their views on what the age for social media access should be, including evidence from a youth development perspective. 

The Prime Minister wrote to the Premiers and Chief Ministers last week seeking views on this, and a range of related matters, including:

  • Community appetite on the role of parental consent as a factor in age limits and permissions;
  • Grandfathering arrangements for existing account holders;
  • The need for a safety net or exemption for support services like mental health and education;
  • And what state-based supports they have available for children – particularly those who are vulnerable or isolated - to connect and access services away from social media. 

No solution will be perfect, and consensus on the ‘right’ age is unlikely. 

Young people are digitally savvy and will find ways to circumvent controls. 

But we can’t let the ‘perfect’ be the enemy of good – we need to make progress to ensure our safeguards keep improving. 

This is about protecting young people, not punishing or isolating them or their parents. 

It is letting parents know that we are in their corner when it comes to supporting their children’s health and wellbeing. 

I am conscious of the pressure on parents in trying to oversee when and how their children use social media. 

Establishing an age limit for social media will help signal a set of normative values that support parents, teachers, and society more broadly.

For this reason, a key design principle of the Commonwealth’s legislative approach is to place the onus on platforms, not parents or young people. 

Penalties for users will not feature in our legislative design. 

Instead, it will be incumbent on the platforms to demonstrate they are taking reasonable steps to ensure fundamental protections are in place at the source. 

Our approach will ensure the eSafety regulator provides oversight and enforcement. 

We are also considering an exemption framework to accommodate access for social media services that demonstrate a low risk of harm to children. 

The aim of an exemption is to create positive incentives for digital platforms to develop age-appropriate versions of their apps, and embed safe and healthy experiences by design.   

We are conscious of the harmful features in the design of platforms that drive addictive behaviours. 

This is why we will set parameters to guide platforms in designing social media that allows connections, but not harms, to flourish. 

We will set a 12-month implementation timeframe to provide industry and the regulator time to implement systems and processes. 

And we will review these measures to ensure they are effective and delivering the outcomes Australians want.

Our strategic objective is clear: social media must exercise a social responsibility. 

This is the approach we are taking across government. 

As Communications Minister, I am working to curb seriously harmful misinformation and disinformation from being spread at speed and at scale on social media, an issue I know was raised by young people at the Summit yesterday. 

Efforts to improve online safety for all Australians are being taken across the Albanese Government.  

The Minister for Industry and Science is supporting businesses and organisations to safely and responsibly use and innovate with AI. 

The Attorney-General has criminalised the non-consensual sharing of deep-fake material and he is seeking to criminalise ‘doxxing’ – that is, when a victim’s identity, private information or personal details are shared without consent. 

The Minister for Social Services, Amanda Rishworth, and I are making dating apps safer through a world-leading voluntary code developed by industry to better protect their users. 

I am progressing Classification Scheme reforms to address violent and misogynistic adult content that reinforces unacceptable attitudes towards women. 

And, finally, I amended the Basic Online Safety Expectations determination to ensure the best interest of the child is a primary consideration in service design.

These changes also go to the systems that power content delivered by algorithms that influence what Australians see.  

The Albanese Labor Government is a reformist government. 

We are not afraid to tackle difficult reforms or hold big tech to account. 

Platforms are not above the laws of this land. 

In legislating a minimum age to access social media, we are laying the challenge at the front door of social media companies to do better. 

We will work with you: the experts, academics, industry, premiers, parents and young people to progress these important reforms. 

And support young Australians to be safe and to thrive, now and into the future. 

Thank you.

The government’s social media ban for kids will exempt ‘low-risk’ platforms. What does that mean?

Lisa M. Given, RMIT University

In a speech to the New South Wales and South Australian government social media summit today, Federal Minister for Communications Michelle Rowland announced more details of how the federal government’s proposed social media ban would actually work.

The government first announced the ban last month, shortly after SA said it would ban children under 14 from social media. But experts have heavily criticised the idea, and this week more than 120 experts from Australia and overseas wrote an open letter to Prime Minister Anthony Albanese and state and territory premiers urging a rethink.

Despite this, the government appears to be ploughing ahead with the proposed ban. The details Rowland announced today do not meaningfully address many of the criticisms made over the past few weeks.

In fact, they actually raise new problems.

What are the details of the social media ban?

In her speech, Rowland said the government will amend the Online Safety Act to “place the onus on platforms, not parents or young people” to enforce the proposed social media ban.

The changes will be implemented over 12 months to give industry and the regulator time to implement key processes.

The government says it “will set parameters to guide platforms in designing social media that allows connections, but not harms, to flourish”. These parameters could address some of the “addictive” features of these platforms, for instance by limiting potential harms by prioritising content feeds from accounts people follow, or making age-appropriate versions of their apps.

The government is also considering an “exemption framework to accommodate access for social media services that demonstrate a low risk of harm to children”.

The problem with “low risk”

But allowing young people to access social media platforms that have a demonstrated “low risk of harm” is fraught with issues.

Risk is difficult to define – especially when it comes to social media.

As I explained earlier this year around potential harms of artificial intelligence, risk “sits on a spectrum and is not absolute”. Risk cannot be determined simply by considering a social media platform itself, or by knowing the age of the person using it. What’s risky for one person may not be risky for someone else.

How, then, will the government determine which social media platforms have a “low risk of harm”?

Simply focusing on technical changes to social media platform design in determining what constitutes “low risk” will not address key areas of potential harm. This may give parents a false sense of security when it comes to the “low-risk” solutions technology companies offer.

Let’s assume for a moment that Meta’s new “teen-friendly” Instagram accounts qualify as having a “low risk of harm” and young people would still be allowed to use them.

The teen version of Instagram will be set to private by default and have stronger content restrictions in place than regular accounts. It will also allow parents to see the categories of content children are accessing, and the accounts they follow, but will still require parental oversight.

But this doesn’t solve the risk problem.

There will still be harmful content on social media. And young people will still be exposed to it when they are old enough to have an unrestricted account, potentially without the support and guidance they need to safely engage with it. If children don’t gain necessary skills for navigating social media at an early age, potential harms may be deferred, rather than addressed and safely negotiated with parental support.

A better approach

The harmful content on social media platforms doesn’t just pose a risk to young people. It poses a risk to everybody – adults included. For this reason, the government’s heavy focus on encouraging platforms to demonstrate a “low risk of harm” only to young people seems a little misguided.

A better approach would be to strive to ensure social media platforms are safe for all users, regardless of their age. Ensuring platforms have mechanisms for users to report potentially harmful content – and for platforms to remove inappropriate content – is crucial for keeping people safe.

Platforms should also ensure users can block accounts, such as when a person is being bullied or harassed, with consequences for account holders found to engage in such harmful behaviour.

It is important that government requirements for “low-risk” accounts include these and other mechanisms to identify and limit harmful content at source. Tough penalties for tech companies that fail to comply with legislation are also needed.

The federal government could also provide extra resources for parents and children, to help them to navigate social media content safely.

A report released this week by the New South Wales government showed 91% of parents with children aged 5–17 believe “more should be done to teach young people and their parents about the possible harms of social media”.

The SA government appears to be heeding this message. Today it also announced a plan for more social media education in schools.

Providing more proactive support like this, rather than pursuing social media bans, would go a long way to protecting young Australians while also ensuring they have access to helpful and supportive social media content.

Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

This article is republished from The Conversation under a Creative Commons license. Read the original article.