Social media platforms, along with the independent content creators who rely on them, have faced tough times since the conclusion of the 2016 US presidential election.
This is because, aside from being one of the most heated contests in recent memory, the 2016 election was also fraught with problems, including misinformation spread over social media and through outlets like YouTube.
But the 2016 election isn’t the only reason content creators are under fire.
Chasing scarce advertising revenue and hoping to establish itself as a healthy, viable unit within the Alphabet empire, YouTube is under enormous pressure to ensure that advertiser dollars do not go toward supporting violent, racist, or hateful content.
And, as you can imagine, anyone who occupies the grey area outside of overt “fake news” – or even a content creator who takes a line different from the traditional media – is now in the crosshairs of Google’s updated YouTube algorithm.
That is having a direct impact on the ad revenue channels see on YouTube, and many creators are reporting steep declines in both income and views.
Yet it isn’t hard to see why YouTube would want to marginalize this kind of content – at least from a monetization standpoint.
That’s because advertisers often don’t want their brand associated with conspiracy theories, racist content, or otherwise questionable material.
It’s really not hard to understand why, and it is probably one of the most value-neutral judgments out there: these companies want to make money, and they do that by reaching as wide an audience as possible with their ads.
Not to mention that a lot of the “conspiracy theory” content on YouTube has actually led to more problems than it has solved.
Take, for example, the endless harassment of the parents of the children killed at Sandy Hook by conspiracist Alex Jones.
Jones, convinced that the US government staged the Sandy Hook massacre, has labeled the dead children fake and their parents crisis actors, prompting some of his less-than-stable followers to attack and harass these grieving families in public and on the Internet.
They call themselves “truthers” but they are anything but that.
Now that YouTube and other platforms are taking a hardline stance against them, these content creators are claiming that the company is limiting their freedom of speech.
But few are addressing YouTube’s freedom to make money and, beyond that, a company’s freedom to choose when and where to advertise its products.
Indeed, in what can only be called “entitlement” to advertiser dollars, many content creators whose videos cover everything from birther conspiracies to claims that Ruth Bader Ginsburg is being kept alive with illegal drugs believe that they not only deserve prominence in search rankings but that YouTube should direct users to their videos even when what they offer is of questionable value.
Looking for a documentary about the ancient Egyptians? Something from the BBC should be placed on par with a video by a random YouTuber claiming the ancient Egyptians were actually aliens. Oh, and both should show the Clorox ad before they play. If you think this is the way most people would want to search YouTube, or how most companies would want their ads shown, you’re mistaken. That’s just the way it is.
That doesn’t mean you can’t find questionable content on YouTube. Quite the contrary: Not only can you find a lot of questionable stuff on YouTube, but the service might even recommend it to you.
So, if you can find it on YouTube, what is hampering creators from getting out their messages, no matter how controversial?
It seems that money truly does make the world go around, and it also deflates many conspiracy channels’ arguments that YouTube is silencing them. On the contrary: YouTube is simply not paying them.
In discussing the changes, YouTube said, “We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”
Further, YouTube has gone out of its way to make sure that conspiracy-theory videos are not only accommodated but also labeled and challenged for what they are. An initiative with Wikipedia to label videos presenting non-mainstream information was launched just last year, in fact.
Mercer University professor Whitney Phillips had this to say to CNN of the company’s efforts: “Finding ways to counter conspiracy theories and media manipulation efforts is critical and I applaud YouTube’s acknowledgment of the problem…The push to address the issue is a good one. Whether or not it will work is a totally different question, with many more variables than we currently are able to assess.”
But even these attempts carry a different kind of danger. Because they are crowd-sourced, many Wikipedia entries contain inaccurate or misleading information, so using Wikipedia to verify the truth is more complicated than it first appears.
Professor emeritus Jacob Cohen of Brandeis University had this to say: “Wikipedia articles are group sourced and in my experience many of them are either ignorant of the subject at hand or for ideological reasons, subtly supportive.”
Barnard psychology professor Rob Brotherton compared YouTube’s rather passive approach with Facebook’s much more active quest to identify conspiracy theories and misleading content.
“[Facebook’s] approach invites speculation about smear-campaigns against people who see themselves as merely questioning mainstream narratives, and the fact-checkers can always be accused of being biased…Taking a relatively light touch like [YouTube] might be a good way to reach people on the fence.”
One thing is certain: the problem isn’t going away any time soon, and neither is YouTube’s attention to it.
YouTube is increasingly trying to position itself as a streaming service for music and media as well as a home for content creators. That multifaceted approach to its business means it is in it for the long haul, and it will likely keep its ear to the ground to make sure it offers what advertisers want.