Online video streaming has been one of the most revolutionary additions to the Internet, and the one platform that best embodies this revolution – to the point of being synonymous with it – is YouTube. However, the platform still surfaces undesirable content you could do without, and the age-old question of why certain videos get recommended to you remains puzzling. Thankfully, the company has just introduced new features designed to address both of those problems.
YouTube recently announced new features for the platform that will bring users greater transparency and control. The aim, of course, is to mitigate harmful content by empowering users to control what they see.
In a blog post, YouTube manager Essam El-Dardiry wrote, “One thing we’ve consistently heard from you is that you want more control over what videos appear on your homepage and in Up Next suggestions. So we’re doing more to put you in the driver’s seat.”
Among the most impactful new features are an information box underneath recommended videos explaining why they appear on your homepage, and a shortcut that quickly hides specific channels to improve your overall viewing experience. The former has already been introduced on iOS and is expected to roll out to Android and desktop in the coming days, while the latter is available on both Android and iOS, with a desktop rollout yet to arrive.
The features show just how aware YouTube has become of its users’ desire for control over content. We have always wondered, for instance, why certain videos inexplicably pop up on our homepage, and we will finally have an explanation, one that can also help us discover new channels we might like or weed out those we don’t.
“Our goal is to explain why these videos surface on your homepage in order to help you find videos from new channels you might like,” said El-Dardiry. “Although we try our best to suggest videos you’ll enjoy, we don’t always get it right, so we are giving you more controls for when we don’t.”
It is certainly a relief to see YouTube taking the management of content on its platform seriously. There have been far too many incidents in the past where its recommendation algorithm was believed to be spreading terrorist propaganda or content completely inappropriate for certain viewers. These changes are bound to pave the way for a safer and ultimately more useful YouTube.