Kiwis are more concerned than ever about extreme, unregulated online content

  • 83% of Kiwis surveyed are concerned about harmful content on social media
  • There is concern that not enough is being done to regulate online platforms
  • Extreme content leads to more extremism
  • A new content regulation framework will be presented to the cabinet in 2023

New Zealanders are increasingly concerned about graphic and violent content they see online and feel more needs to be done to regulate the internet, amid a rising global wave of misinformation and disinformation that is fuelling more frequent acts of terrorism and extremism.

This is according to a nationally representative sample of 1,201 New Zealanders surveyed for a new report from the Classification Office on screen and online viewing behaviour, released Wednesday.

According to respondents, better education, information and support are needed to protect vulnerable children from extreme content.

“We see [the extreme content] physically manifest,” Acting Chief Censor Rupert Ablett-Hampson said in an interview, referring to the illegal 23-day occupation of Parliament earlier this year and a mass shooting in Buffalo, New York, in May. The killer in Buffalo cited the Christchurch mosque terrorist as a source of inspiration for his own actions.

Kiwis feel that not enough is being done to protect children from extreme content online. (File photo)

READ MORE:
* The internet has not gotten any safer three years after the Christchurch terror attack
* Kiwis more vulnerable to online extremism in lockdown
* Jacinda Ardern’s ‘Christchurch Call’ has progressed, but is it worth much more than the paper it’s written on?

The report found that 83% of Kiwis surveyed were concerned about harmful or inappropriate content on social media and other websites, including content that encouraged racism, sexism, misogyny, hatred or extremism. It found that 33% had seen content that directly promotes or encourages violence against others based on race, culture, religion, sexuality or gender.

One in five of those surveyed had seen content that encouraged self-harm, suicide or eating disorders. Two in five said it was distressing to see harmful and offensive content online.

There has been widespread concern about young people engaging with harmful content online, and that not enough is being done to protect them, despite age ratings, content warnings and parental controls.

The Kiwis surveyed overwhelmingly believed that harmful online content affected children’s emotional wellbeing, mental health and attitudes toward sex, relationships, violence and suicide.

The new report found widespread concern over the regulation of online platforms.

Most also believed there should be stricter regulation, technical solutions, education, and accountability from tech and social media companies for harmful content hosted and consumed on their platforms.

Few of those surveyed felt confident in their ability to protect themselves and their families online, and only a minority felt the current regulatory system worked well to protect children.

“This is something New Zealanders are concerned about. And they’re concerned about their children,” Ablett-Hampson said.

The report comes after the government ordered a review of media and online content regulations following the Christchurch attacks and the parallel rise of fast-changing online platforms.

The government will consider proposals for a new regulatory framework to limit harmful content next year.

Acting Chief Censor Rupert Ablett-Hampson says action must be taken now.

Regulating decentralized, ever-changing platforms in a global online world has been a pressing challenge for all countries, including Aotearoa, Ablett-Hampson said.

While the Classification Office could classify content as objectionable, that was a fairly high bar, and a lot of harmful content online didn’t necessarily meet the criteria.

“We rely on many social media platforms for self-regulation. Frankly, a lot of harmful material comes through. … What happens if you possibly end up with a metaverse?”

While some platforms had been incredibly difficult to regulate, had become less responsive, and increasingly claimed they were not responsible for the content they hosted, those weren’t good reasons not to make regulatory efforts, Ablett-Hampson said.

“The fact that we don’t have confidence in social media regulation right now suggests that we won’t have confidence in what the future looks like… We need to take action.”
