Clock Town
I’ve had a lot of conversations about AI over the past few years, with colleagues in HR, journalism, project management, design, QA, and copywriting. Friends outside work too: musicians, school teachers, military officers, marketers. My dad, a retired software developer.
Show someone what the tools can do. Walk them through where the tools were two years ago, where they are now, the rate of investment, the prisoner’s dilemma between labs and between superpowers. Within five minutes, most people are nodding. Often they start making connections themselves, observations that tell you they’re genuinely engaged, not just being polite. They can see their job is a bundle of tasks, they can see the trendline only goes up, they can see the error bars on expert AGI predictions are 5-15 years, not 50.
Almost everyone gets it, almost immediately. That’s what surprised me.
And then the conversation ends, and they go back to normal. Maybe 1-2% actually uproot their lives in response. The rest return to Clock Town.
Majora’s Mask
In Majora’s Mask, the moon is falling. You have three days. The game is structured around this countdown: events happen at specific times, you can rewind and replay, and the whole thing has an uncanny wind-up-toy quality.
What stays with me is the NPCs. They carry on chatting in the town square, preparing for the festival. You can stop and talk to them, and the moon might come up, but then they go back to their routines. There’s a vague acknowledgement that something dramatic is about to happen. It doesn’t change their behaviour.
That’s what it feels like right now. Not denial exactly. People can see the moon when you point to it. They just can’t seem to carry the realisation beyond the conversation.
Psychoticism
Hans Eysenck proposed psychoticism as a personality trait. (Distinct from psychosis, the clinical condition.) Roughly, it measures how decoupled someone’s mind is from the society around them.
Most people are low on this scale. That’s probably adaptive. Following the patterns of those around you, rather than constantly questioning them, makes sense most of the time. We’re social animals playing a numbers game, and the numbers usually favour conformity.
But it means that seeing something clearly and acting on it are two different things. The reasoning part of the brain isn’t tightly wired to the part that actually lives your life. You can understand the implications of machine intelligence, think through the consequences, and still file it alongside Game of Thrones in some kind of interesting-fiction box.
It takes a certain disposition to say “no, the moon really is there, I’m going to make a survival plan.” The willingness to cut against the social grain is rarer than the intelligence to see what’s coming. High-psychoticism types are rare by evolutionary design.
Which day?
Meanwhile, xAI is building what they internally call “human emulators”. They don’t say agents, assistants, or copilots. They say humans. The framing is explicit.
Compare Jensen Huang’s framing from Roving Bridges: jobs have tasks and purpose, tasks get automated, purpose expands. That’s the optimistic version. xAI isn’t automating tasks. They’re emulating the human.
Everyone agrees there’s a moon. Expert forecasts have error bars, but they’re narrow. The debate is which day we’re on.
Most people can see this when you show them. They just can’t hold onto it. Maybe that’s fine. Someone has to keep the town running while the rest of us stare at the sky. But it’s strange to watch. You point at the moon, they say “oh dear,” you look away and look back, and they’re back to planning the festival.