Ben Ditto: CLOUDBURST Trauma Bank

Words: Ben Ditto
Artwork: Aliina Kauranne

The future is weird, according to Ben Ditto’s column CLOUDBURST. In this latest instalment, he explores a fictional, futuristic world in which therapy gets an AI makeover. You’re already familiar with speaking to your therapist over Zoom, and you’ve used the mental health apps that promise to massage the knots and kinks out of your mind, but this takes the digitization of trauma to the next level.

It’s a crisp December day, a light covering of snow on the ground in the suburbs of Berlin. I moved here a couple of years ago for the quality of life and social security; Northern Europe has been good at embracing AI-led social planning; there are no two ways about that. I’m between clients: the last one was remote and the next will be face-to-face. I take appointments in my front room, where there is good natural light and the neighborhood is quiet.

The nature of my work has changed so much…kind of a 180-degree turn. I trained as a clinical psychotherapist; my mother passed from addiction-related complications when I was young, and I guess I was searching for answers. What made her the way she was? How did that shape me? What was my role in her life, and what is her ongoing role in mine? The draw of therapy was meeting people and hearing my own thoughts, fears, hopes and neuroses reflected back at me. It is, of course, a great way to get to know oneself.

But then something happened: a seismic shift in my vocation. Large Language Models were piloted as an accessible alternative to human talking therapy. Then avatars and spatial computing really took off, and next-level LLMs incorporated neural-symbolic reasoning, which made them far more powerful. Instead of testing what the “client” told them, they would do their own research and build their own a priori knowledge of the client.

This, combined with facial recognition and motion tracking cross-referenced with an enormous database of monitored human behaviors, made them infinitely better at reading human emotions, wants and needs than any human therapist. We did NOT see that coming. I was one of those people who thought my job was the safest of all, because how could you possibly automate empathy, or a dynamic therapeutic relationship? But it turns out that you can. And it is far cheaper and more effective than anything the health service could provide.

The first pandemic of the 21st century got everyone used to remote working, Zoom, “intimacy” through a screen. The more astute recognized something subtle: you can give and receive therapy remarkably well through a screen. If the first pandemic was the era of flatscreen human connection, the second pandemic adrenalized that and added a third dimension. Spatial computing and 6G really came into their own. We therapists were caught in a perfect storm and found ourselves very much surplus to requirements. It’s not that AI-powered avatar therapy was a good proxy for the real thing; it was far, far better.

There was something they still needed from us, though, which led to my career recalibration: you can’t have true empathy without a trauma history. And you can’t train an avatar without humans, because creating trauma is a tangible harm and therefore outside AI ethical permission sets. This ethical dilemma, and others like it, created space for humans in the equation. The AIs needed our trauma histories because they don’t have permission to create their own: doing so would quickly lead to system collapse.

So that’s what I do: I am a trauma farmer. I get paid by tech companies to interview people about the nuances of their past, from dramatic headline events like sexual assault or physical beatings to the more subtle traumas of juvenile loneliness, not being heard, not being understood, fear of abandonment based on real or imagined factors. Petty jealousies, sibling rivalries, the body horrors of puberty. These have to come from real people to be worthwhile. 

You can’t train an army of service AIs on fiction or a small dataset because that’s not how people form meaningful relationships with therapists. It has to have the texture and fabric of real experience.

There is a large market for trauma histories. Think about how Getty Images used to sell the rights to images. And just like images, some are of far higher quality than others. I like to think of myself as part of the Magnum Photos of trauma licensing: the stories I draw from my clients will be used to help a great many people. They have depth and weight, and they will be used to train the most empathetic AIs.

I no longer have a human supervisor; the company provides me with a digital one. I miss the real thing. When the day comes that they no longer need me, I will change career, until we are all looked after by fully automated, AI-powered luxury communism. But I will be lonely.