YouTube users watch about a billion hours of content per day. In the time it takes you to read this sentence, roughly 50 hours of content will have been uploaded.
According to Pex.com founder Rasty Turek, it would take around 28,539 years to watch every video on YouTube, not including the thousands of hours of content uploaded in the meantime. This raises the question: how does YouTube manage all of that content, and more importantly, how does it find the videos that should not be there?
Only recently has the question of content curation and review been asked by the public, in the wake of YouTube’s biggest creator-based scandal: Logan Paul’s ‘suicide forest’ video. But this was just the tip of a very large iceberg, the wider problem of content review in general, and it seems that YouTube has a delicate task on its hands.
Not only must YouTube somehow ensure that no uploaded videos violate its Terms of Service and Community Guidelines, but it must also ensure that all monetised videos are matched with appropriate adverts to keep advertisers happy.
In December 2017, YouTube CEO Susan Wojcicki said in an online statement: “Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.”
Wojcicki went on to say that there are plans to increase “the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”
It is essential to appreciate the scale of the problem. Roughly 300,000 videos get uploaded to YouTube every day, so it’s pretty safe to assume that the company stopped individually reviewing everything uploaded to its site not long after launch, if in fact that ever was the case.
Since then, YouTube has been developing and relying upon algorithms to find videos in violation of its Terms of Service and Community Guidelines, which are then reviewed manually. This approach, of course, has had its flaws, which allowed terror-related videos not only to be uploaded but also to have adverts shown on them.
Had it not been for the nature of the Logan Paul video, it is difficult to tell whether we would even be having this debate. The video highlighted the fact that there were flaws within the company’s safeguards, which allowed a video containing footage of a dead body to be released in the first place, let alone by a high-level personality that featured in YouTube’s annual ’Rewind’ video.
These flaws in the content review process cannot be resolved simply by increasing the number of staff reviewing content, because there is far too much content to review manually. So, why is YouTube taking content management so seriously all of a sudden?
Well, that’s partially down to the fact that some of YouTube’s biggest advertising clients were withdrawing their ad spend or boycotting YouTube entirely. According to the Financial Times a month before Wojcicki’s statement: “Diageo, Mars, Hewlett-Packard, Deutsche Bank and Mondelez were among brands to pull advertising from YouTube and its owner Google after campaigns appeared alongside videos featuring children and sexualised comments.”
YouTube addressed these issues in a blog post, which said: “Since we started using machine learning to flag violent and extremist content in June, the technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess.”
Now, the reason this was so important for YouTube was simple: advertisers are the backbone of its business model, and without them, the platform would shrivel and die.
In a bid to solve this, YouTube overhauled its content review software and added a new ‘demonetisation’ feature, which removes adverts from videos that the software deems to be in violation of YouTube’s Terms of Service or Community Guidelines.
This has been heavily criticised by YouTube creators, who claimed that the algorithm was repeatedly demonetising content that was not actually in violation of any of the rules.
The most notable example of this was Casey Neistat’s video raising money for victims of the Las Vegas shooting tragedy, which YouTube demonetised as not being suitable for most advertisers, even though in the video he stated that he would be donating all AdSense money to charity.
YouTube said that the video was demonetised because it is company policy not to allow adverts on videos about tragedies. Yet it still allowed adverts on other content related to the Las Vegas shooting uploaded by endorsed channels backed by TV networks, such as Jimmy Kimmel Live, which have the ability to place their own partners’ advertisements on their content.
So, what does this mean for middle-sized YouTube channels and how are they faring in the wake of the biggest change to YouTube’s monetisation feature since it was introduced?
Jake from JDZ Media, a growing YouTube channel that has been championing urban music, told Artefact how their channel has been impacted by these new rules: “Quite a lot of videos got demonetised. We were able to get a lot of them back, although it was a long process appealing each one separately, and some still remain demonetised, which did cause a loss in revenue,” he said. “I think it is more difficult to earn money from YouTube in general now as advertisers seem to be paying less. I think this will affect all channels though, not just Grime music.”
Preferential treatment of large companies seems to be an increasing part of YouTube’s business practices, with schemes such as Google Preferred, ensuring that advertisers will “get access to among the top five per cent of content on YouTube” as well as “reach the highly coveted 10-34-year-old audience”.
Google describes this as aggregating YouTube’s “top content…into easy-to-buy packages for brand advertisers”. Adweek wrote in 2014 that YouTube had “an even more premium tier to the platform, letting brands pay extra to run ads against only the top one per cent of YouTube videos.”
This is all explained in videos by Philip DeFranco and Casey Neistat respectively.
YouTube’s reaction to all this answers a question that many YouTubers have been speculating about: who is more important to the brand, creators or advertisers? It seems the advertisers won that battle, with YouTube bending over backwards to ensure that all of its biggest advertisers remain and continue to advertise on the platform.
In the wake of the Logan Paul scandal, advertisers are very concerned about where their adverts are being shown on the platform, so Artefact asked Jake how he thought YouTube should manage content to ensure that channels like JDZ are not affected: “It is understandable that advertisers would not want to be associated with content like that,” he told us, adding that YouTube should change the rules so that “specific partners are allowed advertising, and remove partnership from anyone whose content is unacceptable.”
Leading on from the topic of what content is deemed acceptable, we asked Jake if JDZ had any ‘in-house’ rules on what content they would consider inappropriate, and if they have ever had a video of theirs removed as a result of its content.
“There are no specific rules on what is inappropriate, but each track is judged first to see if it’s suitable for our audience. As far as I can remember no videos have ever been taken down due to the content. The main reason for videos being removed is by request from the artist when they decide they no longer want their video to be seen.”
Jake also mentioned that the changes to YouTube have led them to look into alternative platforms to host their channel, stating that: “we are planning to use multiple platforms and this is already in progress.”
Artefact also spoke to Enea Tanku, co-founder of another urban music platform called Link Up TV, who have amassed over a billion views on their channel since they started uploading content to YouTube in 2008. We put the same questions to Enea to see if they were facing the same complications.
“About 30 per cent of our videos get automatically demonetised,” Enea told Artefact, explaining that it depends on the title of the video and its content, but much of their content gets caught by YouTube’s “blanket rules”, which do not differentiate between any of the videos the algorithm has flagged as potentially in breach of YouTube’s Community Guidelines.

Enea also suggested that YouTube might improve its demonetisation appeals process by allowing the review of multiple videos at once, as Link Up TV sometimes uploads “five to ten videos a day” and the current process requires a new appeal for every video a channel wishes to have reviewed.
In regard to advertising, Enea told Artefact that due to the nature of the channel, they are unable to get higher-tier advertising such as the Google Preferred scheme. Enea went on to say that it is those only just starting out on YouTube that will be hit the hardest by these changes.
On January 16, 2018, Google announced yet another revision to YouTube’s monetisation feature: the changes make it harder to become eligible for the YouTube Partner Program, a less exclusive version of the Google Preferred program. Channels must be partners before they can access certain features, such as “end screens and cards that link to associated websites, crowdfunding, or merchandise sites,” according to the Google support page.
The announcement also said: “Previously, channels had to reach 10,000 total views to be eligible for the YouTube Partner Program (YPP),” and that “It’s been clear over the last few months that we need the right requirements and better signals to identify the channels that have earned the right to run ads.” Now, “new channels will need to have 1,000 subscribers and 4,000 hours of watch time within the past 12 months to be eligible for ads.”
So what does that mean for the original question of content management? For starters, it will now be much harder to reach the top echelon of Google’s channel preference structure, but also harder for new YouTubers to make a living from the platform.
This will reduce the amount of content that must be manually reviewed, since fewer channels will now be able to host adverts. These changes will hopefully reduce the likelihood of one of YouTube’s most famous stars uploading a grotesquely inappropriate video without YouTube noticing, as well as make it harder for inappropriate videos to host adverts in general.
In terms of reducing and preventing extremist or violent content more generally, YouTube must rely on the improvement of what it calls ‘machine learning’: systems that learn from every mistake they make, reducing the chance of the same mistake happening again. With so much content to learn from, some predict that these algorithms and systems will eventually become efficient enough to require no human review at all.
[pullquote]“The total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”[/pullquote]

A huge advocate in the battle against violent and extremist content online is marketing services executive Eric Feinberg, owner of GIPEC (Global Intellectual Property Enforcement Centre), which has developed systems for tracking down instances where adverts are being shown on violent and extremist content, particularly videos linked to or promoting terror groups.
GIPEC say on their website that one of the services they provide is “Peace of mind that someone is looking out for the public’s best interest and our client’s specific interests.”
GIPEC owns a technology patent described as a ‘computerised system and method for detecting fraudulent or malicious enterprises’, which GIPEC is very interested in licensing to Google.
This would further aid Google in their efforts to manage and reduce the distribution of harmful content. Of course, this would help the consumers of online content on the platform but arguably would benefit YouTube more with their efforts to keep advertisers happy.
It would seem that there are in fact two battles being fought. The first is the battle to protect the advertisers and their investments in YouTube. The second is developing systems advanced enough to sift through an endless torrent of content and prevent the platform from playing host to countless unsavoury videos.
While it is a shame that these improvements have been driven by the protection of profitable advertising rather than the protection of consumers, the results are still largely the same in the long run. One area YouTube must look into is the effect these new rules are having on its core rule-abiding community.
Demonetisation has already begun affecting small and medium-sized channels, and the people who will be hit the worst are those who have built not only lives around YouTube, but also businesses.
This is why so many channels, Link Up TV and JDZ Media included, are looking for ways to branch out from YouTube and lower their reliance on the platform’s advertising revenue.
YouTube must, however, remember that it was the community of content producers and consumers that saw the company rise to such dominant heights. It is that same community that has advertisers queuing round the block.
Featured image by Link Up TV