YouTube: Google will hire more than 10,000 moderators in 2018 to identify abusive content
More human moderation and stricter monetization rules: Susan Wojcicki, CEO of YouTube, has promised measures to combat the ongoing abuses on the video platform. But the vagueness surrounding the figure of 10,000 moderators across Google in 2018 raises questions about the actual scale of the announced effort.
"More people to review more content": that is the strategy chosen by Susan Wojcicki, CEO of YouTube, to combat the many abuses rife on the Google platform, from videos exploiting children and disturbing cartoon parodies to pedophile comments and terrorist or hateful propaganda.
In a blog post published on Monday, December 4, she promised a strengthening of the company's teams: "In 2018, we want to go further and raise to 10,000 the number of people working across Google to address content that might violate our policies." While the figure of 10,000 moderators is symbolic, the real scale of YouTube's effort remains difficult to assess, because the platform does not disclose its current staffing. Moreover, the announced deployment covers Google as a whole, which raises the question of how many of those people will actually be dedicated to YouTube.

YouTube promises regular reports

Susan Wojcicki also promised to publish a regular report informing the public of the latest actions taken on flagged content and deleted videos, for greater transparency. The YouTube CEO further assured that monetizable videos will be selected more rigorously, hoping to regain the confidence of advertisers who have suspended their campaigns to protest the association of their ads with dubious content.

The controversy over disturbing videos involving minors notably prompted volunteer moderators to denounce YouTube's lack of support; they say they have been working for more than a year with a defective reporting tool. YouTube is habitually evasive about the makeup of such teams. This recruitment campaign would represent a 25% increase over current staffing, according to several sources cited by BuzzFeed. A YouTube representative said that these people will all be employed by the company, noting that Alphabet, Google's holding company, has 78,000 employees worldwide.

At the end of November, YouTube came under broad criticism over its poor control of content made for and featuring children. 270 user accounts and 150,000 problematic videos were removed from its pages.
The service also removed the ability to display advertisements on more than 2 million pieces of content published by more than 50,000 channels "pretending to offer content for children", including many disturbing hijackings of cartoons. At the end of November, several major brands such as Mars and Lidl announced the temporary suspension of their campaigns after finding that their ads had been shown alongside disturbing videos depicting the exploitation of children.

Efficient but fallible algorithms

Moderation is a long-haul fight for YouTube. Despite the measures taken in recent years to take down content violating its terms of use, many degrading, inflammatory or violent videos still slip through the cracks. YouTube has notably developed algorithms to locate such content more quickly. For the time being, these systems cannot do without human review, according to Susan Wojcicki. "It is essential for us to use human moderators to remove inappropriate content and to train our machine learning systems," she stresses.

In mid-August, as part of its fight against terrorist content, YouTube automatically deleted several videos documenting war crimes committed in Syria, even though they had been uploaded by investigative sites and researchers. A company representative indicated at the time that these analytical technologies were still "being improved". "Given the volume of videos on our site, we can sometimes make a mistake," he said.

In use since June, this technology has greatly sped up the analysis of videos. The same work would have required 180,000 people working 40 hours per week, notes Susan Wojcicki. In October, 83% of the "violent extremist" videos deleted had been flagged by algorithms; a month later, that rate had risen to 98%.
Tech news, December 5, 2017