TikTok stars launch “Only Nans” campaign to encourage young users to report anything that would offend their nan

TikTok stars have launched an Ofcom-backed campaign called “Only Nans” to encourage young people to report harmful content that would offend their grandmothers.

Lewis Leigh gained millions of followers on the video-sharing app after posting sweet footage of himself dancing with his elderly grandmother, Phyllis.

He has now launched a campaign encouraging people to report content that would offend their nan, because “Nan is the best judge out there”.

Lewis became popular on the app during lockdown and wanted to launch the campaign with Ofcom after scrolling past questionable content himself.

He said: “Nan always gives the best advice. So the next time you’re scrolling your phone and come across something you’re not sure about, ask yourself, ‘What would my nan think of this?’

“If it’s a ‘no’ from Nan, you should probably consider reporting it.”

Ofcom, the communications regulator, found that 67% of teenagers and young adults aged 13 to 24 have encountered at least one piece of potentially harmful content online.

After gaining popularity on TikTok during the pandemic, Lewis launched the campaign with Ofcom alongside his grandmother Phyllis (pictured). He is encouraging other young people to report online content they consider potentially harmful.

In his TikTok post (@lewisleighh), Lewis wrote: “I love social media, but there can be some harmful content out there! We are all guilty of scrolling past it, but the only way to get rid of it is to report it! So I teamed up with @Ofcom to play Only Nans, and called in a big gun to help me … my nanny Phyllis! #advertisement”

However, according to the survey, only 17% went on to report it, with more than 20% saying they did not think reporting anything would make a difference.

The survey also found that 12% of respondents did not know what to do, or who to tell, when they saw harmful content.

The most common harmful content young people encountered was misinformation, scams, and offensive language.

The campaign comes as the Online Safety Bill makes its way through Parliament; the legislation will give Ofcom powers over social media platforms that fail in their duty of care.

Ofcom will be able to fine offending companies up to £18 million, or 10% of their revenue.

Social media companies will not only have to remove illegal content such as images of child abuse, but will also be expected to tackle “hate crime” content, even where it would be protected offline by freedom of expression laws.

News publishers have been campaigning for a complete exemption from the Online Safety Bill since the white paper was published three years ago.

They are concerned that the latest version of the bill does not appear to reflect parliamentarians’ recommendations for amendments to protect press freedom.

A joint parliamentary committee that scrutinised the bill said it should include a ban on tech companies blocking news content unless it breaks criminal law.

Social media bosses could face prison under the latest version of the law if they fail to cooperate with the regulator to protect vulnerable people online.

According to the Ofcom survey, only 17% of people who see harmful content go on to report it, and more than 20% think reporting it would make no difference.

The campaign comes after Molly Russell (pictured) took her own life in 2017, aged 14, after seeing thousands of posts about suicide and self-harm on Instagram. Since her death, her father, Ian, has been campaigning for stricter online safety laws.

An earlier version of the Online Safety Bill, published last year, stated that tech companies could face huge fines, potentially running into billions of pounds, if they did not comply with their duty of care.

Ministers stopped short of holding bosses personally liable for their companies’ failures, but senior managers can now be charged with breaching the duty of care.

The legislation has been dubbed the “Nick Clegg Act” because the former deputy prime minister is now vice-president for global affairs and communications at Facebook.

Children’s charities and worried families have long campaigned for social media companies to be prosecuted for failing to crack down on self-harm content.

That followed the death of a teenage girl who took her own life after seeing thousands of posts about suicide and self-harm online.

Molly Russell, 14, took her own life in 2017 after scrolling through graphic images on Instagram. Her father, Ian Russell, told MPs that when he asked the companies to remove the content, he had “frustratingly limited success”.

Online safety campaigners said tech companies only seem to take action when news stories break or when the government changes regulations.

Mr Russell said the platforms’ “corporate culture” needs to change so they respond to harmful content “proactively” rather than “reactively”.

Ian Russell told MPs that when he asked the companies to remove harmful content, online safety campaigners had “frustratingly limited success”. He wants the social media giants to be “proactive” in removing posts from their sites rather than “reactive” in their approach.

Giving evidence last year to MPs on the joint committee scrutinising the draft Online Safety Bill, he said such content is especially distressing for the family and friends of those who have died by suicide.

“Platforms only seem to respond when stories break in a particularly public way, or perhaps when regulations change … So it is our view, and increasingly so, that the corporate culture of these platforms needs to change.

“They need to be proactive, not reactive, and after all, they have the resources and the skills to do this.

“But too often it is done as an afterthought, so they have to be true to their word that they take online safety seriously and want to make their platforms safer.”

TikTok’s latest transparency report reveals that 85.8 million pieces of content were removed in the last three months of 2021.

Of those, 5% were removed as a result of user reports; Instagram reported 43.8 million content removals over the same period.

Behavioural psychologist Jo Hemmings told The Times that young people who do not report potentially harmful content risk leaving “potentially serious problems” unchallenged.
