Now Google is accused of profiting from antisemitic remarks and hate speech as people PAY to have their extremist opinions shown prominently on YouTube in the latest scandal to rock the firm

Google is receiving significant backlash for potentially profiting from antisemitism on YouTube, which it owns. 
Users have been exploiting the site's 'Super Chat' feature, which lets them pay a small fee to have their comment displayed prominently alongside videos.
These comments are meant to abide by the company's terms of service, which prohibit hate speech.
An investigation into the feature found a host of extremist and antisemitic remarks on content from right-wing vloggers and conspiracy theorists. 
Some called the horrific terrorist attack in New Zealand, in which 50 people lost their lives, an Israeli 'false flag' operation, while others said Brexit required a violent solution. 
Comments can be promoted for sums of up to £500, which buys a period of time at the top of the 'comment ticker' displayed on screen alongside the content.  
This money goes to the producer of the video, but Google takes a 30 per cent cut of the takings. 
According to The Times, one of the videos featured a discussion about whether the alleged perpetrator of the Christchurch attack was a Mossad agent.  
Another $5 post said: 'What's your take on the fact that the murderer did not once mention Zionist influence in America or Europe?' 
One user paid $2 to promote a comment claiming there is a 'Lubavitch Synagogue in Christchurch'. 
In another use of the feature to spread extremist views, a British viewer paid £10 to post: 'VIOLENCE IS THE ONLY SOLUTION. I AM WILLING TO DIE FOR MY NATION.' 
Amanda Bowman, vice-president of the Board of Deputies of British Jews, said: 'There is no excuse for accepting money from people who are promoting their own racism.' 
YouTube has since removed the video calling for violence and discussing 'super Jews', but the other comments remain in place. 
A YouTube spokesperson told MailOnline: 'We do not allow videos or comments that incite hatred on YouTube and work hard to remove content that violates our policies quickly, using a combination of human flagging and review and smart detection technology. 
'We're making progress in our fight to prevent the abuse of our services, including hiring more people and investing in advanced machine learning technology. 
'We know there's always more to do here and we're committed to getting better.'

WHAT'S THE CONTROVERSY OVER YOUTUBE'S CONTENT?

YouTube has been subject to various controversies since its creation in 2005. 
It has become one of Google's fastest-growing operations in terms of sales by simplifying the process of distributing video online but putting in place few limits on content.
However, parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service. 
They have contended that Google must do more to banish and restrict access to inappropriate videos, whether it be propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned. 
Child exploitation and inappropriate content
By the end of last year YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June.
In March last year, a disturbing Peppa Pig fake, found by journalist Laura June, showed a dentist with a huge syringe pulling out the character's teeth as she screams in distress.
Mrs June only realised the violent nature of the video as her three-year-old daughter watched it beside her.
Hundreds of these disturbing videos were found on YouTube by BBC Trending back in March.
All of these videos are easily accessed by children through YouTube's search results or recommended videos. 
YouTube has been getting more stringent about deleting videos. One example is the wildly popular Toy Freaks YouTube channel, featuring a single dad and his two daughters, which was deleted last year.
Although it's unclear what exact policy the channel violated, the videos showed the girls in unusual situations that often involved gross-out food play and simulated vomiting.
The channel invented the 'bad baby' genre, and some videos showed the girls pretending to urinate on each other or fishing pacifiers out of the toilet.
Adverts being shown next to inappropriate videos
There has been widespread criticism that adverts are being shown on some clips depicting child exploitation.
YouTube has now tightened its rules on who qualifies for posting money-making ads.
Previously, channels with 10,000 total views qualified for the YouTube Partner Program, which allows creators to collect some income from the adverts placed before their videos.
But YouTube's parent company Google has announced that from February 20, channels will need 1,000 subscribers and 4,000 hours of watch time over the previous 12 months, regardless of total views, to qualify.
This is the biggest change to advertising rules on the site since its inception - and is another attempt to prevent the platform being 'co-opted by bad actors' after persistent complaints from advertisers over the past twelve months.
In November last year Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies all pulled advertising from YouTube.
An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.
One video of a pre-teenage girl in a nightie drew 6.5 million views.
Issues with system for flagging inappropriate videos
Another investigation in November found YouTube's system for reporting sexual comments had serious faults.
As a result, volunteer moderators have revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.
Users fill in an online form to report accounts they find inappropriate.
Part of this process involves sending links to the specific videos or comments they are referring to.
Investigators identified 28 comments that obviously violated YouTube's guidelines.
According to the BBC, some include the phone numbers of adults, or requests for videos to satisfy sexual fetishes.
The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube.