Last Bastion

I’ve volunteered with the Samaritans since 2019.
I want to be clear: I’m not speaking on their behalf. These are my own thoughts, informed by a few years as a listening volunteer and a day job working with LLMs.
The service exists to provide an ear. That’s it. You ring, and a human listens. Non-judgmentally. Anonymously. 24/7. The original conception - Chad Varah founded the service in 1953 - was to create a last refuge for those in mental torment. Whoever you are, whatever you’ve done, you can ring and talk to someone.
The training is largely about learning not to judge. Exercises surface your impulses so you can mindfully set them aside. You’re not there to affirm or advise. You’re there to be present.
What could be more human?
If you were looking for work that’s fundamentally about human connection, this would be it. There’s no procedure to follow beyond listening. No service being provided beyond presence. The entire value proposition is that a human is on the other end.
Surely this is a bastion of human touch - the kind of thing AI has to flow around, not through.
The uncomfortable observation
Around the time GPT-4 came out, I fed it the Samaritans’ guidance on active listening. I asked it to role-play as a volunteer while I played a caller.
It performed better than most real-world volunteers I’ve met.
Not because it was warm. Because it was attentive. It tracked what I said. It reflected back. It gave me room to process without jumping in. That fine-grained engagement - right alongside you without being your friend, interested without being intimate - it managed just fine.
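For anyone curious what that experiment amounts to in practice, here’s a minimal sketch - illustrative only, assuming the OpenAI Python client, with the file name, prompt wording, and model choice as placeholders rather than the exact material I used:

    # Illustrative sketch, not a record of the original experiment.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Placeholder file: notes on active listening, fed in as context.
    with open("active_listening_notes.txt") as f:
        guidance = f.read()

    messages = [{
        "role": "system",
        "content": (
            "Role-play as a listening volunteer. Follow this guidance on "
            "active listening. Don't advise, judge, or affirm. Reflect back "
            "and give the caller room to talk.\n\n" + guidance
        ),
    }]

    # I play the caller; the model plays the volunteer. Ctrl-C to stop.
    while True:
        messages.append({"role": "user", "content": input("caller> ")})
        reply = client.chat.completions.create(model="gpt-4", messages=messages)
        text = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": text})
        print("volunteer>", text)

That’s the whole experiment: a system prompt and a loop.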
There’s a study from around that time showing GPT-4 outperformed doctors on bedside manner in blind tests. I’d be surprised if there aren’t many more such studies now. The ability to seem human doesn’t appear to have a meaningful ceiling for these systems.
The filter we don’t talk about
Here’s the thing: calling the Samaritans requires you to pick up a phone and talk to a stranger.
I worked in a call centre when I was young. Within a day I was comfortable on the phone. I’ve been fine with it ever since. But I’ve noticed I’m unusual. Many people my age and younger are almost physically uncomfortable calling anyone they don’t know.
The Samaritans already filters for people who’ve either overcome this aversion or never had it. How do you measure the ones who didn’t ring?
The anonymity question
Anonymity was central to the original conception. You could say anything to a faceless stranger with no fear of backlash or rejection.
Safeguarding laws have eroded this somewhat. There are now circumstances where volunteers are legally required to pass on identifying information if a caller provides it. The promise of anonymity now comes with caveats it once didn’t. I understand why, and I’m not criticising the change - but it has altered the social contract.
LLMs are faceless in a more complete way. There’s no human on the other end to judge you, no social performance required. They’re not truly anonymous - data may be retained, accounts exist - but the experience is one of talking to something that isn’t a person. For someone who wants to process something without being witnessed, that facelessness offers something human services can’t.
I’m not proposing anything
I want to be clear about what I’m doing here. I’m not suggesting the Samaritans should change. I’m not saying LLMs should replace human listeners. I’m not even saying this would be a good thing.
I’m describing what seems inevitable.
Some people would find it easier to share things with an LLM than with a human. Some already do. The barrier to entry is lower - no phone call, no social performance, no worrying about what the person on the other end thinks of you.
This isn’t a proposal. It’s an observation about where things are going regardless of what anyone decides.
The broader point
We naturally assume there are aspects of work - of life - where human touch is non-negotiable. Places AI has to flow around, not through.
The Samaritans is about as pure an example as you’ll find. Strip away everything except the knowledge that a human is listening. That’s the service. That’s the whole thing.
And yet.
If even this isn’t a safe category, what does that say about the other places where we assume human touch is central?
I’m not going to stop volunteering. It’s part of who I am now. I cherish the shifts, the relief you can hear down the line when you give someone space to talk.
But I’ve stopped assuming anything is safe from what’s coming. The ground is shifting whether we’re ready or not.