September 29 - October 27, 2024: Issue 635

 

Social Media Summit Update: Survey highlights growing concerns 

The Minns Labor Government released the findings of a major statewide survey on October 4, 2024, revealing widespread community concerns about the impact of social media on children, young people, and broader society.

The survey, conducted between 11 August and 15 September this year, engaged more than 21,000 participants from across the state through the ‘Have Your Say’ platform.

See prior Pittwater Online News Notice: Community encouraged to have their say ahead of Social Media Summit - August 2024

This is the largest response to a ‘Have Your Say’ public consultation to date.

It found that 87 per cent of respondents support implementing age restrictions for social media use, with 16 the most commonly suggested minimum age.

This sentiment was particularly strong among parents, with 91 per cent of those with children aged 5-17 advocating for age limits.

The survey reveals growing concern over the time young people spend on social media, with those aged 16-17 averaging over three hours per day. Use begins as early as ages 10-12, when 70 per cent of children are already on social media, and it increases steadily with age.

Additionally, 35 per cent of parents of 13-15 year olds, especially in single-parent households, report that social media has a ‘negative’ or ‘very negative’ impact on their child’s life.

Parents cited concerns that excessive screen time leads to problems such as addiction, exposure to inappropriate content and neglect of their children's daily responsibilities.

Young people themselves echoed some of this sentiment, with 66 per cent of 16-17 year olds expressing that social media distracts them from essential tasks such as schoolwork and family obligations.

The survey highlights a clear correlation between time spent on social media and negative outcomes reported by parents. Parents whose children use social media for more than four hours a day were more than twice as likely to note negative impacts on their child's life, compared to parents whose children spend less than an hour a day online. 

The most frequently mentioned issues include ‘cyber security risks’, ‘exposure to harmful content’, and ‘concerns over how social media usage is affecting children's behaviour’. Notably, parents of younger children who lack rules or safety measures often cited uncertainty about what to implement, signalling a need for more guidance and solutions.

The insights gathered from the survey will play a crucial role in shaping discussions at the upcoming Social Media Summit.

Jointly hosted by the NSW Government and the Government of South Australia, the first-of-its-kind two-day, two-state event will bring together experts, policymakers, young people, and community voices to discuss strategies to combat the negative impacts of social media and foster a more positive digital future.

The summit will start at Sydney’s International Convention Centre on Thursday, 10 October, and continue the following day at the Adelaide Convention Centre.

Further details about the summit, including full event schedules and online live streaming information, will be released soon.

To stay updated on the summit, visit www.nsw.gov.au/socialmediasummit.

To read the full report, view the survey findings report (PDF, 16.77 MB).

Premier of NSW Chris Minns said:

“The community has spoken, and the message is clear. Parents are concerned about how social media is impacting the lives of young people.

“The huge response to this survey sends a powerful message about the extent of community concern.

“Parents and children are rightly concerned about this giant global unregulated experiment on young people.

“The feedback we’ve received will guide discussions at the Social Media Summit and help the government as we respond to the harmful aspects of social media.”


Snapshot results: pages from the NSW Government's survey findings report

Related - Published October 4, 2024

Is big tech harming society? To find out, we need research – but it’s being manipulated by big tech itself

Timothy Graham, Queensland University of Technology

For almost a decade, researchers have been gathering evidence that the social media platform Facebook disproportionately amplifies low-quality content and misinformation.

So it was something of a surprise when in 2023 the journal Science published a study that found Facebook’s algorithms were not major drivers of misinformation during the 2020 United States election.

This study was funded by Facebook’s parent company, Meta. Several Meta employees were also part of the authorship team. It attracted extensive media coverage. It was also celebrated by Meta’s president of global affairs, Nick Clegg, who said it showed the company’s algorithms have “no detectable impact on polarisation, political attitudes or beliefs”.

But the findings have recently been thrown into doubt by a team of researchers led by Chhandak Bagchi from the University of Massachusetts Amherst. In an eLetter also published in Science, they argue the results were likely due to Facebook tinkering with the algorithm while the study was being conducted.

In a response eLetter, the authors of the original study acknowledge their results “might have been different” if Facebook had changed its algorithm in a different way. But they insist their results still hold true.

The whole debacle highlights the problems caused by big tech funding and facilitating research into its own products. It also highlights the crucial need for greater independent oversight of social media platforms.

Merchants of doubt

Big tech has started investing heavily in academic research into its products. It has also been investing heavily in universities more generally. For example, Meta and its chief Mark Zuckerberg have collectively donated hundreds of millions of dollars to more than 100 colleges and universities across the United States.

This is similar to what big tobacco once did.

In the mid-1950s, cigarette companies launched a coordinated campaign to manufacture doubt about the growing body of evidence linking smoking with a number of serious health issues, such as cancer. It was not about explicitly falsifying or manipulating research, but about selectively funding studies and drawing attention to inconclusive results.

This helped foster a narrative that there was no definitive proof smoking causes cancer. In turn, this enabled tobacco companies to keep up a public image of responsibility and “goodwill” well into the 1990s.

Vintage magazines with tobacco advertising from the 1960s: big tobacco ran a campaign to manufacture doubt about the health effects of smoking. Ralf Liebhold/Shutterstock

A positive spin

The Meta-funded study published in Science in 2023 claimed Facebook’s news feed algorithm reduced user exposure to untrustworthy news content. The authors said “Meta did not have the right to prepublication approval”, but acknowledged that the Facebook Open Research and Transparency team “provided substantial support in executing the overall project”.

The study used an experimental design where participants – Facebook users – were randomly allocated into a control group or treatment group.

The control group continued to use Facebook’s algorithmic news feed, while the treatment group was given a news feed with content presented in reverse chronological order. The study sought to compare the effects of these two types of news feeds on users’ exposure to potentially false and misleading information from untrustworthy news sources.

The experiment was robust and well designed. But during the short time it was conducted, Meta changed its news feed algorithm to boost more reliable news content. In doing so, it changed the control condition of the experiment.

The reduction in exposure to misinformation reported in the original study was likely due to these algorithmic changes. But the changes were temporary: a few months later, in March 2021, Meta reverted the news feed algorithm to the original.
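
To see why a mid-study change to the control condition matters, consider a toy simulation of the experiment's logic. This is a minimal sketch, and every exposure rate in it is a hypothetical illustration rather than a figure from the study. It simply shows that if the control feed (the algorithmic one) is made safer partway through the measurement window, the measured gap between the algorithmic and chronological feeds shrinks:

import random

random.seed(42)

# Toy model: daily chance that a feed shows a user untrustworthy content.
# All rates are hypothetical illustrations, not figures from the study.
CHRONOLOGICAL = 0.05        # treatment feed, unchanged throughout
ALGORITHMIC_BEFORE = 0.15   # control feed before the platform's tweak
ALGORITHMIC_AFTER = 0.06    # control feed once reliable news is boosted

def mean_exposure(prob, users=10_000):
    # Fraction of simulated users who encounter untrustworthy content.
    return sum(random.random() < prob for _ in range(users)) / users

# Effect estimate if the control condition had stayed fixed:
stable_gap = mean_exposure(ALGORITHMIC_BEFORE) - mean_exposure(CHRONOLOGICAL)

# Effect estimate when the control feed changes halfway through,
# so half the measurement window runs on the boosted algorithm:
contaminated = (mean_exposure(ALGORITHMIC_BEFORE)
                + mean_exposure(ALGORITHMIC_AFTER)) / 2
shifted_gap = contaminated - mean_exposure(CHRONOLOGICAL)

print(f"gap with a stable control feed: {stable_gap:.3f}")
print(f"gap with a mid-study change:    {shifted_gap:.3f}")  # smaller

The particular numbers are beside the point; the direction of the bias is what matters. Any mid-experiment change that moves the control feed closer to the treatment feed makes the algorithm's measured contribution to misinformation exposure look smaller.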

In a statement to Science about the controversy, Meta said it made the changes clear to researchers at the time, and that it stands by Clegg’s statements about the findings in the paper.

Unprecedented power

In downplaying the role of algorithmic content curation for issues such as misinformation and political polarisation, the study became a beacon for sowing doubt and uncertainty about the harmful influence of social media algorithms.

To be clear, I am not suggesting the researchers who conducted the original 2023 study misled the public. The real problem is that social media companies not only control researchers’ access to data, but can also manipulate their systems in a way that affects the findings of the studies they fund.

What’s more, social media companies have the power to promote certain studies on the very platform the studies are about. In turn, this helps shape public opinion. It can create a scenario where scepticism and doubt about the impacts of algorithms can become normalised – or where people simply start to tune out.

This kind of power is unprecedented. Even big tobacco could not control the public’s perception of itself so directly.

All of this underscores why platforms should be mandated to provide both large-scale data access and real-time updates about changes to their algorithmic systems.

When platforms control access to the “product”, they also control the science around its impacts. Ultimately, these self-research funding models allow platforms to put profit before people – and divert attention away from the need for more transparency and independent oversight.

Timothy Graham, Associate Professor in Digital Media, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.