By AI Trends Staff

YouTube needs to employ AI to help process the 300 hours of video uploaded to the platform every minute by its users. This processing includes removing videos deemed inappropriate by YouTube’s standards.

Some 8.3 million videos were removed from YouTube in the first quarter, 76 percent of them identified and flagged automatically by AI, according to an account in Forbes. Of those, more than 70 percent were never viewed by users. While the AI system can review far more content than humans, full-time human specialists work alongside the AI, which of course is not foolproof.

YouTube’s “number one priority” is to prevent harmful content from seeing the light of day on the platform, said Cecile Frot-Coutaz, head of the EMEA region for YouTube, based in London. AI and machine learning have advanced the company’s ability to identify objectionable content: before AI, eight percent of banned videos were removed before reaching 10 views; now more than 50 percent are.

An important metric used by YouTube’s algorithms is video “watch time,” which is valued by advertisers. However, the metric tends to amplify videos with outlandish content: the more people watch a video, the more highly it is recommended, notes Guillaume Chaslot, a former Google employee and founder of AlgoTransparency, a firm that encourages greater transparency in algorithms. Chaslot gave an address at the recent DisinfoLab Conference.
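The feedback loop Chaslot describes can be illustrated with a toy model. The Python sketch below is not YouTube’s actual system; the data, scoring and numbers are invented for illustration. It shows how ranking candidates purely by estimated watch time means a video that holds attention longer keeps climbing in the recommendations, which gets it shown more, which raises its estimate further.

```python
# Toy illustration (not YouTube's system) of a watch-time feedback loop:
# videos that get watched longer are scored higher and recommended more.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    total_watch_minutes: float = 0.0
    impressions: int = 1  # start at 1 to avoid division by zero

    @property
    def expected_watch_time(self) -> float:
        # Naive estimate: average minutes watched per time the video is shown.
        return self.total_watch_minutes / self.impressions

def recommend(videos, k=3):
    # Rank candidates by expected watch time alone; no quality or safety signal.
    return sorted(videos, key=lambda v: v.expected_watch_time, reverse=True)[:k]

def simulate_session(videos):
    for video in recommend(videos):
        video.impressions += 1
        # In this toy model, outlandish content holds attention longer,
        # so its expected watch time, and thus its rank, keeps climbing.
        minutes_watched = 12.0 if "shocking" in video.title.lower() else 4.0
        video.total_watch_minutes += minutes_watched

catalog = [
    Video("Calm gardening tips"),
    Video("SHOCKING conspiracy exposed"),
    Video("City council recap"),
]
for _ in range(5):
    simulate_session(catalog)
print([v.title for v in recommend(catalog)])  # the "shocking" video ranks first
```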

Chaslot is not a fan of the YouTube recommendation engine. “If the AI is well-tuned, it can help you get what you want. But the problem is that the AI isn’t built to help you get what you want — it’s built to get you addicted to YouTube. Recommendations were designed to waste your time,” he said at the conference, according to an account from TheNextWeb.

AlgoTransparency built a program to analyze what videos YouTube is recommending each day. The website states, “The algorithm is responsible for more than 700,000,000 hours of watch time every day, and it is not fully understood, even by the people who built it.” From its base of 1,000 tracked channels, it shows how many channels recommended each of the top videos on YouTube each day.
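The counting idea behind that daily snapshot can be sketched simply. The following Python is a hypothetical illustration of the approach as described, not AlgoTransparency’s actual code; the data format, channel IDs and video IDs are invented.

```python
# Hypothetical sketch: for each video, tally how many tracked channels had it
# appear in their recommendations that day, then surface the most-recommended.
from collections import Counter

def top_recommended(recommendations_by_channel, n=10):
    """recommendations_by_channel maps a channel ID to the set of video IDs
    that appeared in its recommendations for the day."""
    counts = Counter()
    for recommended_videos in recommendations_by_channel.values():
        # Count each channel at most once per video.
        counts.update(set(recommended_videos))
    return counts.most_common(n)

# Toy snapshot with three tracked channels
daily_snapshot = {
    "channel_a": {"vid1", "vid2"},
    "channel_b": {"vid2", "vid3"},
    "channel_c": {"vid2"},
}
print(top_recommended(daily_snapshot))  # vid2 is recommended from the most channels
```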

In general, Chaslot has seen that the closer a video gets to the edge of what is acceptable under YouTube’s policy, the more engagement the video will get. “We’ve got to recognize that YouTube recommendations are toxic and pervert civic discussion,” he said. “Right now the incentive is to create this type of borderline content that’s very engaging, but not forbidden.”

Google, which bought YouTube in 2006, and YouTube have challenged Chaslot’s methodology. He said his requests to work with YouTube to probe the flaws have gone unanswered. Until Google becomes more transparent about how it recommends videos, the engine will not be better understood.

Going over the edge of propriety too many times on its homepage spurred YouTube recently to deploy a “trashy video classifier” for the homepage, according to an account in Bloomberg. Early experience has shown the fix is helping keep many inappropriate clips off the homepage. “Watch time” on YouTube’s homepage has grown 10x in the past three years, Google marketers said recently.
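Bloomberg’s description suggests a standard pattern: score each homepage candidate with a binary classifier and drop anything above a threshold before ranking. The sketch below illustrates only that general pattern; the model, scoring function, feature and threshold are placeholders, since YouTube’s actual classifier is not public.

```python
# Generic sketch of a pre-ranking filter: drop candidates a "trashy" classifier
# scores above a threshold. The scoring function here is a dummy stand-in.
from typing import Callable, Iterable

def filter_homepage_candidates(
    candidates: Iterable[dict],
    trashy_score: Callable[[dict], float],
    threshold: float = 0.8,
) -> list[dict]:
    """Keep only candidates the classifier scores below the threshold."""
    return [video for video in candidates if trashy_score(video) < threshold]

# Hypothetical stand-in for a trained model's probability output
def dummy_score(video: dict) -> float:
    return 0.95 if "clickbait" in video["title"].lower() else 0.1

candidates = [
    {"title": "Morning news briefing"},
    {"title": "CLICKBAIT miracle cure!!!"},
]
print(filter_homepage_candidates(candidates, dummy_score))  # keeps only the news briefing
```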

Read the source posts in Forbes, TheNextWeb and Bloomberg.
