Perhaps the single most toxic and disgusting part of the internet is YouTube's comments sections, which are frequently filled with outright ignorant, vile, and reprehensible remarks. If the internet had a "worst of" section, YouTube comments would be it. Google has been making efforts to improve the quality of discussion on its platform (likely a fool's errand), and today the company announced its latest attempt to bring us better discourse: a reminder on Android to reconsider posting your comment if it appears to contain disrespectful content.
The news comes courtesy of an announcement on the YouTube community help forums, which describes the change. In short, if a comment posted from the YouTube Android app might contain offensive material based on a quick analysis, you'll get a popup asking if you're absolutely sure you want to post it. Following that moment of reflection, the hope is that the vitriolic peanut gallery might recognize its mistake and amend its ways — if not permanently, then at least on a case-by-case basis as a result of the reminder. Note that you can still choose to post your comment anyway, whether the offensive content was detected in error or you simply elect to be a douchebag on the internet.
If you run into a false positive, you can tap "let us know" to leave feedback, and note that comments that don't meet community guidelines may still be removed with or without this interstitial reminder. The system that detects inappropriate comment content is also slowly learning and may improve with time and feedback to better pick up on context and cultural details.
As usual, certain corners of the internet have taken this new prompt, which can be ignored and clicked past, as somehow constituting censorship and an infringement on free speech — despite the fact that it does nothing to prevent users from posting comments.
Previously, Google tried making comments on Android a bit harder to get to by hiding them behind a new button — presumably, both to hide potentially offensive content and to force a little more effort into the process to reduce low-effort contributions. The company also recently canceled this year's Rewind and revealed that it will start pushing ads more aggressively outside the YouTube Partner Program.
Today's change to promote better quality comments is part of a greater effort to make YouTube a more inclusive platform and promote diversity by reducing harmful and hateful content. The company claims it has increased hate speech comment removals by 46x since early 2019. In the last quarter alone, it terminated over 1.8 million channels for policy violations, 54,000 of those for hate speech. It also plans to evaluate how its decisions impact creators via a new voluntary survey rolling out next year, which will tie details like race, ethnicity, and gender to creators to examine how current policies affect search, discovery, and monetization, and to better track things like hate speech and harassment.