The Government has been urged to stop rogue algorithms that “push hate” online, including Holocaust denial and anti-Semitic conspiracy theories.
The Antisemitism Policy Trust has highlighted how “inappropriate” systems can spread malicious content and disinformation through search engines.
In an interview with the PA news agency, the Trust’s chief executive, Danny Stone, welcomed the Online Safety Bill as a first step.
But he said: “I have always believed that the bill should focus on the systems behind social media and internet platforms, rather than the content.
“The focus shouldn’t be on an individual item being posted, but on how that item is then shared with 50 million people. Are people being directed to it, rather than it just appearing online?”
Mr Stone, who was made an MBE in 2017 for services to combating hate crime, cited several examples of simple search terms producing anti-Semitic content among the top results on search engines.
He said: “The search companies are inciting people to perform malicious searches, and there is nothing in the bill that will address this.
“Right now, search engines need to address and remove illegal content and pirated content, but they don’t have to do anything about legal but harmful material.”
Popular home voice assistants such as Alexa, Siri and Hey Google should also be covered by the legislation, he suggested.
“I’m very concerned about search in general and what the safeguards are, what the risk assessments are, for the systems behind search engines.
“I think something needs to be done in the bill to address that,” he said.
He suggested amendments could be made to the bill to ensure that technology companies are at least required to carry out risk assessments of the algorithms used to produce search results.
Currently, he said, the Online Safety Bill provides a “triple shield” for user-to-user platforms, ensuring that illegal material is dealt with by companies and that terms and conditions are applied consistently.
As part of the plans, individual users would be able to choose whether or not to filter “harmful but legal” content, such as abuse based on gender, race or religion.
Mr Stone insisted that protection against malicious content should be the default setting rather than an opt-in feature.
He said: “I feel very strongly that unless people are actively seeking it, anti-Semitism, Holocaust denial, possibly suicidal ideation and the encouragement of eating disorders should not be presented to people.
“That’s one of the battles to come — whether the so-called switch will be on or off.”
Mr Stone said much of the debate has focused on the big well-known social media companies such as Twitter and Facebook.
But he highlighted the growing proliferation of toxic culture on lesser-known networks often favoured by extremists, such as the perpetrator of last year’s Buffalo massacre in the United States.
The Trust, which works to educate and empower MPs, policymakers and opinion leaders to tackle anti-Semitism, has already raised a number of issues with MPs and peers.
Mr Stone said he will be looking closely at the Government’s proposals and the forthcoming Lords debate on the Online Safety Bill on Wednesday.