Over the past 10 months, together with 37,380 YouTube users, Mozilla conducted the largest-ever crowdsourced investigation into harmful recommendations on YouTube using a tool that we built called RegretsReporter.

What we learned was really troubling.

YouTube’s algorithm is suggesting that people watch misinformation, scams, and violent and graphic content — even disturbing and sexual “children’s” content was flagged in our investigation. We found evidence of the algorithm recommending videos that violate the platform’s own content policies. We also found that the problem is particularly severe in non-English-speaking countries.

The good news: these problems are fixable — but YouTube’s leaders have to care enough to fix them. We’ve made clear recommendations about what YouTube should do to keep people safe on its platform, and we need your help to pressure them to act.

Will you share this video on social media to spread the word?

If enough people see this video, we can raise awareness of the dangers of YouTube’s algorithm and let the company know that it will not get away with inaction.

Thank you,
Brandi, on behalf of the Mozilla team