YouTube Will Determine What ‘Conspiracy’ Is and Stop Recommending Such Videos

While the evolution of Google’s YouTube from a free expression platform into something entirely different has been underway for a while, it just took another step in a very short-sighted and restrictive direction. NBC News reports:

YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos… [Former YouTube engineer Guillaume] Chaslot said that YouTube’s fix to its recommendations AI will have to include getting people to videos with truthful information and overhauling the current system it uses to recommend videos.

There’s a lot to unpack here, so let’s get started. First, it appears YouTube has announced the creation of a new bucket when it comes to content uploaded to the site. It’s no longer just videos consistent with company guidelines and those that aren’t; there’s now a category for “conspiracy or medically inaccurate videos.” This is a massive responsibility, one for which neither YouTube nor anyone else seems fit to serve as judge and jury. In other words, YouTube is saying it’s comfortable deciding what is “conspiracy” and what isn’t. Which brings up a really important question.

“Conspiracy,” and the covering up of conspiracies, is a fundamental part of the human experience, and always has been. It demonstrates extreme hubris for a tech giant to claim it can differentiate between a legitimate conspiracy worth exploring and an illegitimate one. One person’s righteous investigation is another’s conspiracy theory, with Russiagate serving as an obvious contemporary example.

Going back to the early 21st century, we witnessed a major conspiracy to start a war in Iraq based on lies; lies which were endlessly and uncritically repeated throughout the mass media. Even worse, General Wesley Clark described a far larger conspiracy to start multiple additional wars in the aftermath of 9/11. That conspiracy has continued to move forward in the years since, under both Republican and Democratic administrations.

It’s pretty clear what will end up happening as a result of this tweak to YouTube’s recommendation AI. The “conspiracies” of your average person will be pushed aside and demoted, while government and mass media lies will remain unaffected. Google will assume mass media and government are honest, so government- and billionaire-approved propaganda will be increasingly promoted, while the perspectives of regular citizens are pushed further to the margins. YouTube is no longer simply a platform, but rather a self-proclaimed arbiter of what is ridiculous conspiracy and what is truth.

While YouTube says videos it deems conspiracy will still be available via search, it’s not a stretch to imagine that this is just the first step, and that before you know it certain categories will be banned from the site entirely. Either way, I think there’s a silver lining to all of this.

As I outlined in a recent post, U.S. tech giants, particularly Facebook, Google and Amazon, aren’t simply private companies. They appear more akin to quasi-government entities that increasingly view themselves as instrumental gatekeepers for a discredited status quo. Moreover, their primary business models consist of mass surveillance and violating our privacy.

Ultimately, I think the increasingly nefarious and desperate behavior of these tech giants will lead to their demise. More and more of us have looked under the hood and seen the seedy, privacy-destroying nature of these entities. We’ve also seen what it’s like to have genuine free expression on the internet, and we don’t want to turn the web into another version of cable news, with Facebook, Google and Amazon becoming the new CBS, NBC and ABC. If we do, then the entire promise of the internet will have turned out to be a giant waste.

But I don’t think that’s going to happen. I think most of us have had a taste of what’s possible, and agree that free speech and expression on the internet, the good, the bad and the ugly, is better than an internet censored by tech companies and their billionaire executives, who will always be biased toward the status quo point of view. It’s still not clear which platforms will emerge to replace the tech giants, but it seems fairly clear to me that their best days are over, and their decline cannot come a moment too soon.

If you liked this article and enjoy my work, consider becoming a monthly Patron, or visit our Support Page to show your appreciation for independent content creators.


10 thoughts on “YouTube Will Determine What ‘Conspiracy’ Is and Stop Recommending Such Videos”

  1. Yes Michael, we know how valuable it is for an author who wants exposure to be “banned in Boston” or “publicly denounced by a Bishop”.
    Of course, the other annoying consequence of censorship is the vast body of mindless “puritans” who fiercely defend the only narrative they ever hear or want to hear. From Copernicus to today, change has always been formulated by the discerning and eventually blindly followed by the mindless.

  2. Absolutely agree that big tech has become quasi-government and hugely corrupt. Also hope that people keep waking up to this and look for alternatives.

    Unfortunately such an awakening does not spell the end of these companies, since we’re in an era where corporate profits are optional and don’t necessarily impact stock prices. They can also rake in money from .gov in their collaborations, so they’re not going away anytime soon. They’ll just make PR efforts to excuse away misdeeds.

    • There will always be new alternatives that arise as a result of large behemoths forgetting their roots, and therefore missing the upstarts until it’s too late.

      YouTube is already having problems with musical artists who are telling it to bugger off due to its crappy royalties.

      If that sounds familiar, it should.


  3. I’m going to go against the grain here, and say that “not recommending” is a far cry from de-platforming.

    I’m also going to risk being labelled a troll and say that it’s OK to “not recommend” obviously stupid/dangerous videos on conspiracy theories like 1) the moon landing was fake, 2) the earth is flat, 3) climate change is a hoax, 4) vaccines are dangerous (brought to you by a “vlogger”).

    Bring back websites anyway – we should never have let FB and YouTube become the web.

    In general, I’m all on board the liberty blitzkrieg blitz, but I can see a moderate view between our media thought overlords and LB.

    • I’d like to think I can critically unpack all four of your potentially “stupid/dangerous” videos above and judge each on the merits of its own content. Actually, the blanket assumption that all four of those should not be recommended sounds dangerously like censorship. I recognize that this is not the same as deplatforming, but I think at its extreme it serves to create echo chambers, where those who feel a certain way seek out information supporting their side, while those who’ve never looked into the subject matter critically are funneled into officially sanctioned information. It broadens the gap between the two sides.

    • Big tech pushing back against conspiracy theorists will only make them stronger. The underlying problem is that our society has done a remarkably poor job of providing economic opportunity to regular people (causing disengagement and disillusionment), as well as of maintaining a trustworthy media and government. People claiming everything is a hoax are a symptom, not the greater problem.

      The constant “divide and conquer” narratives driving wedges between us haven’t helped matters either. There is a well-deserved credibility gap between elites and everyone else. I honestly doubt much can be done to restore faith now; it seems too far gone.

      Besides all that, who gets to decide what is stupid and/or dangerous? For example, I think our (US) government has been lying about the Syrian conflict all along. What category does that fall under?

    • If the AI, by analyzing my view history, decides that I am likely to be interested in a flat-earth video, then fine – I want it in my recommended videos list, in spite of you considering it «obviously stupid/dangerous».

      It seems like you do not understand how this stuff works: it is not as if YouTube is «recommending» these videos. It’s a fully automated process. Or it was, until YouTube started to actively mess with it in order to promote its own view of the world.

    • Very valid point, Gbell, to distinguish between “not recommend” and “not allow”. Of course, if someone is too afraid or too stupid to venture out of the common domain, then for them it amounts to the same thing.
      Just remember that Copernicus and Galileo were both “forbidden”, and yet without them your 2) could be regarded as accepted fact.

  4. I’m all for “going back to websites”. I hate that every time I want to sign up for something, it’s “sign up through Facebook, Google, Instagram, etc.” Let me just sign up without going through big tech.

    It sounds like YouTube has been tainted by the pervasive, judgmental, Puritanical genetic structure of America as a whole (it runs deep).

  5. Other search aggregators like Duckduckgo.com will form, so as long as YouTube keeps those videos searchable, we’ll still get to see videos of James Corbett and the rest.

    I find myself using Duckduckgo for real research and search results on topics I care about, then use Google for marketing research to see what the “corporate types” are viewing.

