U.S. tech giant Google will increase the number of employees engaged in detecting extremist content on its video-sharing service YouTube, as well as other material that violates the service's rules, YouTube CEO Susan Wojcicki said.
In June, Google announced additional measures to counter the spread of extremist content on YouTube. The company said it planned to widen its use of technology to identify extremist- and terrorist-related videos, bring more experts into its programme for flagging problematic videos, toughen the rules on content that does not clearly violate YouTube's policies, and expand its role in the struggle against radical movements.
“Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.
“We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018,” Wojcicki wrote in a post on YouTube’s official blog.
Wojcicki added that the company was also combating abusive comments and cooperating with a number of child safety groups, such as the National Center for Missing and Exploited Children, to counter predatory behavior.
According to Wojcicki, YouTube has also worked to expand its “network of academics, industry groups and subject matter experts,” who help the company’s staff respond better to developments in the world and online.