As someone who loves YouTube, watches an unhealthy amount of YouTube, and even willingly uses YouTube Music as her primary music service, I cannot stay silent any longer: YouTube needs to get off the couch and get things done.
YouTube has a laundry list of problems and broken features as long as my arm that need fixing, and that’s on top of the FUBAR situation that is YouTube’s moderation and monetization policy — or more specifically, its demonetization policy. 2020 was a huge year for misinformation on media platforms from Twitter to YouTube and beyond, and while Twitter and a few other platforms have at least earnestly tried to fight misinformation, YouTube’s approach has been a lot like Facebook’s: it says it’s taking action, but it’s barely scratching the surface of what’s needed.
That’s why YouTube’s 2021 resolution needs to be to get its shit together and stop resting on its laurels as the largest video platform on the planet.
Moderation, monetization, and the need for a more moral AI in YouTube
2020 gave us a start-to-finish fiasco of a presidential election, a pandemic whose response has been stymied by misinformation and outright conspiracies, and continued attacks on marginalized communities by extremists emboldened and radicalized on websites like — you guessed it — YouTube. This is a major problem not just for YouTube but for society as a whole.
Too little, too late is YouTube’s modus operandi so far.
For example, let’s take the election: YouTube tried to steer users towards official coverage of the election around November 3, but for weeks and weeks after the election, YouTube refused to pull videos spreading falsehoods and outright conspiracies about the election. In fact, YouTube didn’t ban uploading this content until after Safe Harbor Day in mid-December, and it’s still not taking down any of this known misleading content from before Safe Harbor Day, just adding a fact-checking blurb to the top of the search results. By contrast, Twitter started marking misleading or contested tweets about the election even before the polls closed.
YouTube’s got a few reasons for not removing these videos, and no, “The First Amendment” is not one of them. These videos make money for YouTube, even if the uploader isn’t making money from them. The more videos YouTube has to serve up to people who search for this bullshit, the more hours those people will spend watching YouTube and generating ad money, making comments, and sharing these videos to draw in more engagement and money.
One could argue that this is tainted money, since YouTube is making it off of radicalizing its users and sowing discord, but this is the 21st century, and websites will make money wherever and however they can.
AI is the future, and YouTube’s AI needs a big injection of ethics.
The better argument to be made — both for YouTube and its users — is that YouTube’s AI needs to be trained more ethically, so that it guides people away from misleading content and back towards factual content and, y’know, reality. Ethical AI is something Google has been having issues with over the last few months, after ousting one of its best AI researchers. YouTube needs at least as much ethical AI attention as Google Search, if not more, because so far, YouTube’s algorithms have shown very, very, very little regard for ethical content.
Outside the need for better AI, YouTube also needs a more robust and more human moderation system. Yes, there are too many videos uploaded every second to YouTube for a human to judge them all, and automatic flagging/removal/suspensions are going to be a thing. Still, YouTube needs to build up some moderation teams that handle specialized content so that generic rules stop hurting the marginalized communities they are supposed to protect.
Case in point: LGBTQ creators are more prone to having videos demonetized when discussing issues around their community, and while the videos are usually re-monetized after the review process, by then they’ve missed practically all of the revenue a video makes in its first days and weeks, the very window during which YouTube’s AI had it flagged. YouTube should create moderation teams specifically for the content types that run into these issues most frequently, such as LGBTQ content, BIPOC content, political content, and gaming content.
Build teams that are well-versed in this content and can apply YouTube’s policies more quickly and more consistently. Not only could this make it easier to train YouTube’s algorithms over time, but focusing moderation teams on specific content types rather than all content could also help cut down on the high rate of turnover among moderators. Such teams would also make it easier for YouTube to react when major events hit one content type, like this year’s election and the mass disinformation campaign that accompanied it.
Half-measures for accessibility and COPPA need to come full circle
It is now harder for people with hearing impairments to watch YouTube than it was a year ago. YouTube removed community captions, a feature that let viewers contribute subtitle tracks for creators who didn’t or couldn’t caption videos themselves. Now, unless a creator goes out of their way to make a subtitle track, you’re stuck with the built-in AI-generated captions.
Why break what works, YouTube? Why?!
While AI-based captioning like Android’s Live Caption and YouTube’s auto-generated captions can help fill some of the gaps, it still doesn’t handle non-standard words and acronyms, strong accents, or overlapping voices well. YouTube took away community captions due to a “lack of use,” which is the same nonsense answer Google has used to kill dozens of perfectly functional services and features in the last three years, including Google Cloud Print — which dies New Year’s Eve — and Google Play Music, which was replaced by a still-buggy YouTube Music that will get its own section in a moment.
YouTube needs to improve the accessibility of its videos, and it needs to offer a robust toolbox to both creators and viewers to help make an accessible YouTube a reality.
Another area that needs a more thought-out implementation is how YouTube handles kids’ content, because what it rolled out a year ago is an absolute disgrace.
YouTube’s lax approach to kids’ content earned it a multimillion-dollar fine from the FTC. In response, YouTube announced “improvements” to kids’ content and data protection. These changes included disabling comments, the mini-player, and other features on millions of videos.
The problem: not only did this make it harder for creators to get engagement for their channels and content, but it straight-up broke YouTube Music for families, since it prevented you from liking these songs and adding them to playlists.
There are no easy answers when it comes to protecting kids online and offering them a safe, COPPA-compliant experience, but I know that YouTube can do better than this, and it should. First and foremost, it can keep these restrictions from breaking YouTube Music. Disabling the mini-player, likes, and playlist additions for kid-friendly music in YouTube Music makes absolutely no sense, especially when you can’t even see comments or content suggestions when you tap a kids’ song in YouTube Music the way you would in the main YouTube app.
It just drives parents back to Spotify and Apple Music, where everything just works properly.
YouTube Music needs to fix the basic features that have been broken for years.
YouTube Music has had a consistency problem since it first launched in 2014 — yes, YouTube Music existed long before that 2018 “launch” — and a variety of issues that have needed fixing since day one have never been touched.
2021 is the year YouTube Music has to get its shit together, or it’s going to die a slow, painful death like Google Play Music did, and that shit starts with fixing casting.
Casting should’ve been fixed back in 2015.
Casting in YouTube Music is flat-out broken. If you cast a large playlist or a song-based mix, the player doesn’t keep pulling in new songs once it gets to the bottom of the 15 or so songs loaded in the queue. While casting, you cannot shuffle or repeat songs — YouTube support says this is a “Premium-only feature,” but no, it doesn’t appear for Premium users, either.
Google owns and oversees the Chromecast ecosystem, and YouTube was one of its launch-day services. That casting is still this breathtakingly screwed up in 2020 is an embarrassment for the entire company, and it raises the question of why anyone would buy a Cast speaker like the Nest Audio when YouTube Music works worse over Cast than it does over Bluetooth.
Are you trying to make it harder to make playlists, YouTube??
We finally have the ability to sort our library three ways — A-Z, Z-A, Recently added — but we still can’t sort a playlist that way, save the current queue as a playlist, set custom art for a playlist, or even select multiple songs at once to add to a playlist. If you migrated from Google Play Music to YouTube Music this year, your library management just got about five times worse.
And let’s not forget that while YouTube Music got the music locker service this year, there is no way to edit the song info or the album art for anything you upload, so you better be damn sure you got everything right before you add it to your library.
YouTube Music showed great improvement this year and received plenty of good new features, like the Just For You mixes and Year in Review. That said, none of them can make up for the fact that YouTube Music still absolutely refuses to get the basics of a music service working and stable, even two years after its relaunch.
Downloads are another basic necessity that YouTube Music has continued to blunder: songs need to be re-downloaded at seemingly random times, there’s no “cache while streaming” feature like its predecessor Google Play Music had, and Smart Downloads often grabs playlists the user has never even seen rather than the playlists and albums in their library. If I can’t count on having my music when the network goes down, why pay for YouTube Music at all?!
Again, I use YouTube Music as my primary music app and have for the last two years. It’s where my music and my heart are, but it enrages me that I still have to use Bluetooth rather than casting just to repeat a song. It frustrates me that I can’t sort a master mix playlist to make sure I don’t have duplicate songs. It utterly infuriates me that Google decided to destroy a perfectly functional service for one that can’t even show me how many times I’ve listened to a song. This app cannot rise above being an “honorable mention” in Best Music Apps until it straightens up and flies right.
So for all our sakes, YouTube, get your shit together.