The great thing about YouTube is that it made it easy for absolutely anyone to upload and share video with the world. The terrible thing about YouTube is that it made it easy for absolutely anyone to upload and share video with the world. The platform is crawling with sketchy content that promotes toxic ideas and conspiracy theories, and it has only recently begun to address the problem. A new report from Bloomberg claims that YouTube spent years pretending the problem didn't exist for one reason: clicks.

It doesn't take long to find videos on YouTube claiming that mass shootings are hoaxes staged with "crisis actors" or that vaccines cause autism. Engineers had reportedly fretted openly for years about the proliferation of such content and its prominence in recommendations, but CEO Susan Wojcicki and her team brushed off those concerns. One engineer pitched flagging incendiary videos so they wouldn't show up in recommendations. Others proposed projects to track the popularity of conspiracy theories and alt-right extremism. In every case, YouTube executives allegedly declined to proceed in order to keep engagement numbers up.

Even YouTube's behind-the-scenes attempts to fix things could have made matters worse. Insiders recount an effort from 2016-2017 called Project Bean that would have stopped paying creators based on ad views and instead distributed money based on engagement. However, because conspiracy-minded channels tend to rack up outsized engagement, that change would have funneled cash into the coffers of channels like Infowars, which YouTube only booted several months ago. Google CEO Sundar Pichai shot down Project Bean before it could be implemented.

Even with the issue now getting some attention, YouTube has yet to articulate a coherent strategy for addressing it. Its attempts to attach authoritative information to shady videos have been hit-or-miss, and the company won't detail the changes it has reportedly made to its recommendation algorithm. YouTube's platform may simply be too large to manage effectively.