The paper is titled "Vicarious Trauma and Occupational Wellbeing in Digital Content Review Roles: A Mixed-Methods Longitudinal Study." It is 28 pages. It was funded by a nonprofit. It will be cited in the next round of congressional testimony, quoted in a TechCrunch article, and then nothing will change.

The findings: moderators experience high rates of anxiety, depression, PTSD symptoms, and "moral injury" — the particular kind of psychological damage that comes from being asked to do something that conflicts with your core values, repeatedly, under time pressure, for wages that do not reflect the nature of the work.

The platforms' response, historically, has been to offer Employee Assistance Program hotlines and, occasionally, a free subscription to a meditation app.

"We are deeply committed to the wellbeing of our content review teams and continue to invest in mental health resources," said a statement from a platform that did not participate in the study and whose moderators are largely employed through third-party contractors who are technically not their problem.

What the Moderators Actually Said

Seventy-three percent of respondents said they had considered leaving the field. Sixty-one percent said they had never been asked by their employer how they were doing in a way that felt genuine. Forty-four percent said the hardest part was not the content itself, but explaining to people outside the industry what they do all day.

The paper recommends mandatory psychological support, caseload limits, rotation schedules, and better pay. The recommendations section is two pages. The acknowledgments section thanks the moderators who participated "for their time and courage."

They deserved more than acknowledgment. They always do.