One of the dirty secrets of modern big tech - in this case those companies running the large social media sites - is how much low-paid human work goes into providing you the “experience” they offer. And how damaging some of this work is to the people who do it.

Today’s Guardian reports on the plight of over 140 Facebook moderators. These are the people - yes, it’s not all algorithmic - who are instructed to review posts flagged as potentially breaching Facebook’s policies on acceptable content and to decide whether to keep them up or remove them from display.

It’s making some of them very ill.

More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

Then there’s generalised anxiety disorder and major depressive disorder to add to the mix.

Funnily enough, spending hours a day looking at the worst content humans manage to produce isn’t doing them any good.

The images and videos including necrophilia, bestiality and self-harm caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

It’s in the news because some of these moderators are bringing a lawsuit against Meta. After all:

In any other industry, if we discovered 100% of safety workers were being diagnosed with an illness caused by their work, the people responsible would be forced to resign and face the legal consequences for mass violations of people’s rights.

But instead we continue onwards, exposing our fellow humans to dangerous work so that we can organise insurrections and commit genocides in between the cat pictures and all the rest.