But of course demonetization is linked to decreased viewership and to the opaque algorithm punishing those videos; enough data has been uncovered to confirm that. And the larger story – that YouTube recommends not what it thinks is best, but what makes it look good and earns it the most money – is made obvious by the preposterously rigged US Trending chart.

It's worth saying, as an engineer and someone who has built a high-volume API that ranks and suggests things, that when your boss asks "Does X do Y?" the only time you have a quick and simple answer is when (for instance) the recommendation and filtering system doesn't even know about X. It can't do anything with it; it doesn't have that kind of information.

I strongly suspect this is what led to YouTube's response – their ad server has data their recommendation system doesn't, which is certainly how I'd design it. And that information includes demonetization data.
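To make that concrete, here's a minimal sketch of that kind of isolation, with made-up record types – an illustration of the design pattern, not YouTube's actual schema. Notice how "does the recommender use demonetization?" gets an honest "no", while the content flags that *caused* the demonetization still flow through:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AdServerRecord:
    """What the ad server sees. Monetization state lives only here."""
    video_id: str
    content_flags: frozenset   # e.g. frozenset({"mature", "violence"})
    monetized: bool

@dataclass(frozen=True)
class RecommenderRecord:
    """What the recommender sees. There is no monetization field at all."""
    video_id: str
    content_flags: frozenset   # the same flags flow here too
    predicted_watch_time: float

def to_recommender_record(ad: AdServerRecord,
                          predicted_watch_time: float) -> RecommenderRecord:
    # The monetization bit is dropped at the system boundary, so
    # "does the recommender use demonetization?" is an honest "no".
    # But the flags that caused the demonetization pass straight through.
    return RecommenderRecord(
        video_id=ad.video_id,
        content_flags=ad.content_flags,
        predicted_watch_time=predicted_watch_time,
    )
```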

Of course, your video might suddenly be flagged as inappropriate for children. This causes it to be demonetized – the average advertiser isn't crazy about appearing alongside mature content – but also, obviously, causes the recommendation algorithm to stop recommending it to kids. So, fewer views. That's the simplest case, and the one mentioned in YouTube's corporate statements as explaining the statistical link. It's just correlation, because some events cause both demonetization and lower views.
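Here's a toy sketch of that fan-out, again with invented field names: a single review decision independently triggers demonetization on the ad side and audience filtering on the recommendation side, with neither system ever reading the other's output.

```python
def apply_mature_flag(video: dict) -> dict:
    """One content-review decision fans out to two independent systems."""
    video = dict(video)
    video["flags"] = video["flags"] | {"mature"}

    # Ad side: most advertisers opt out of mature content.
    video["monetized"] = False

    # Recommendation side: the same flag removes minors from the audience.
    video["eligible_audiences"] = video["eligible_audiences"] - {"under_18"}

    return video

before = {"flags": frozenset(), "monetized": True,
          "eligible_audiences": frozenset({"under_18", "adults"})}
after = apply_mature_flag(before)
# after: demonetized AND a smaller audience -- correlated outcomes,
# with no causal arrow between the two effects.
```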

But I think that simple analysis (although good for explaining the system) understates the correlation. I am an adult, but I do not like videos with graphic depictions of violence. I'm sure their recommendation system would notice this. So a video's being marked as violent doesn't just filter it out for kids; it pushes the video so far down my recommendations that I never see it.
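A hypothetical scoring rule makes the mechanism visible. The multiplicative form and the numbers here are invented, but any personalization signal learned from watch history would behave roughly like this:

```python
def personalized_score(base_score: float,
                       video_tags: set,
                       tag_affinity: dict) -> float:
    """Downweight a video by this user's learned affinity for its tags."""
    score = base_score
    for tag in video_tags:
        # tag_affinity maps tag -> multiplier learned from watch history,
        # e.g. 0.05 for a user who never finishes violent videos.
        score *= tag_affinity.get(tag, 1.0)
    return score

# An adult who avoids violence: the video is eligible in principle,
# but ranks so low it never surfaces in their feed.
print(personalized_score(0.9, {"violence"}, {"violence": 0.05}))  # -> ~0.045
```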

These correlations get more and more subtle – any propensity or bias tied to the information a video gets marked with will be taken into account. Even very tiny effects – viewers being slightly less likely to watch another video right after one that mentions eating disorders – add up to a substantial impact on those videos at scale. And since a lot of the marking happens in the course of content review for advertisers, the correlation would be extremely strong.
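A toy simulation – every number invented – shows why the measured correlation comes out so strong. The same review flag drives both demonetization and suppressed reach, and from the outside the result looks exactly like "demonetization kills views":

```python
import random

random.seed(0)
samples = []
for _ in range(10_000):
    flagged = random.random() < 0.2        # content review marks the video
    demonetized = flagged                  # flag -> advertisers pull out
    views = random.gauss(100.0, 10.0)
    if flagged:
        views *= 0.4                       # flag -> recommender buries it
    samples.append((demonetized, views))

def mean(xs):
    return sum(xs) / len(xs)

demon = mean([v for d, v in samples if d])
clean = mean([v for d, v in samples if not d])
print(f"avg views: demonetized {demon:.0f} vs monetized {clean:.0f}")
# Prints roughly "40 vs 100" -- a huge gap, with no causal link between
# demonetization and views anywhere in the generating process.
```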

Someone who makes their living off of YouTube sees only a strong correlation and hears the denial, and, well, that sucks. We need to take more time to understand the complexity of things, and to take pains to be empathetic with the people who hold a lot of power in their hands. But because not everyone is an engineer, there's a reverse duty: to use empathy in explanations. Just tell people, "We apply audience filters at the same time we demonetize, because the triggers are so similar that the two effectively go together." That's not the whole truth – the statistical-inference effects are both interesting and (I can predict with almost zero doubt) happening as well. But if you don't phrase the answer as confirming their fears, they might not hear it at all.