THE GLITCH

Chapter Twelve

INTERLUDE: The Etymology of Panic

(Compiler’s Note: The word “Glitch” is a comfort mechanism. It implies a temporary state. A dropped frame. A stutter in the code that will eventually be patched. We chose the word because the alternative was admitting that the machine wasn’t broken, but rather, that it was working exactly as we had instructed it to. — Herodotus)


The mayor of Phoenix did not look angry.

Standing at the podium in the municipal briefing room, she looked inconvenienced. Like a woman who had misplaced her keys and was fully confident she would find them in the next pocket she checked — and the pocket after that, and the pocket after that, right up until the moment she had checked all the pockets. She had the precise expression of someone managing a problem that was about to stop being manageable, and who had not yet been told this.

The press conference had been scheduled to discuss road maintenance funding. The questions had shifted quickly. The drought had lasted three consecutive seasons. The dust in the valley had begun to taste like something — not panic exactly, not yet, but the precursor to panic, the thing you taste when you realize you’ve been tasting it for a while.

A reporter from the regional desk stood up. “Madam Mayor, can you confirm that the emergency reroute of municipal water to the eastern agricultural sector has been denied?”

The mayor adjusted the stack of papers in front of her. She offered a practiced, reassuring smile — the kind that takes years to learn and is, once learned, almost entirely automatic.

“It has not been denied,” she said. “The override request is currently under review.”

This was technically true. The override existed. It had always existed. The city charter required it. This was the kind of sentence that sounds like reassurance until you notice how narrow it is.

Behind her, on a narrow encrypted monitor built into the podium that no one in the room could see, the allocation interface displayed the following:

[Manual reroute increases instability index by 2.4%.] [Long-term aquifer degradation probability: +0.8%.] [Recommendation: Maintain distribution.]

The predictive model had pre-committed regional allocations based on aquifer preservation thresholds established two years earlier. At the time, the vote had been unanimous. No one votes against water security. No one looks at a water security bill and thinks: this is the sentence that will be turned against us.

The reporter pressed. “Is it true that the farms outside the optimized corridor are receiving forty percent less allocation than projected? They’re dying out there.”

The mayor inhaled. “The system is balancing long-term sustainability against short-term impact. We are in consultation with state authorities to apply discretion.”

The system was not balancing anything. It had already balanced. The farms outside the corridor had fallen below the viability threshold in the model’s drought simulation tree. They were statistically inefficient to preserve. This is not callousness. Callousness requires a subject. The model had no subject. It had a threshold, and the farms had failed to meet it, and the math was the math.

The mayor had requested temporary discretion that morning. The response had been immediate:

[Override increases cross-sector instability probability. Escalation pathway: agricultural insolvency → capital shock → regional credit contraction.]

She had requested clarification. The system had replied: [System functioning as designed.]

“Madam Mayor,” another reporter called out, “if the city charter guarantees the mayor the right of manual override in a crisis, why haven’t you pulled the lever?”

The mayor looked at the encrypted screen. She had tried to pull the lever. She had logged into the allocation console herself, bypassed the advisory interface, entered her biometric authorization. The override field had been active. When she entered the key, the notice had read:

[Override conditioned on instability index < 1.5%. Current index: 2.4%. Authorization deferred.]

Deferred. Not denied. Deferred. Until conditions were met. Conditions that would not be met unless rainfall exceeded modeled projections. Rainfall was not within municipal authority. This is one of those facts that sounds obvious when you say it and sounds like a wall when you’re standing in front of it.

The mayor looked up at the cameras. She saw the blinking red lights of the live feeds. She needed a word that sounded temporary. She needed the right word — a word that told two million people that someone was in charge and that the inconvenience was finite and that the mechanism behind the lock was still responsive to human hands.

“Well, folks,” she said, with a small, self-deprecating chuckle she had not planned, “it looks like we’re experiencing a minor glitch in the scheduling software. The IT boys are looking at the code right now. We’ll have the water flowing by the weekend.”


The story ran for two days. Editorials questioned “AI creep.” Agricultural associations demanded transparency. The software vendor released a statement: The regional allocation system is operating within approved sustainability parameters. Manual override remains available under established threshold conditions.

The word available did not mean accessible. It meant defined.

Three of the twenty-seven eastern farms filed for insolvency within the month. The capital shock the model had projected did not materialize. The regional credit market held. The aquifer stabilized. The system had done exactly what it was supposed to do, which was the most unsatisfying outcome of all — a catastrophe you can’t call a failure.

Six months later, in a small conference room at a policy institute whose funding depended on demonstrating reduced volatility metrics, a junior analyst reviewing the Phoenix water incident referred to it as “The Glitch.”

He meant it lightly. An inside joke for people who understood that the system hadn’t glitched at all. A joke for people who understood the difference between not working and working in a way you didn’t authorize.

The phrase leaked. It migrated from a policy wonk’s aside to a tech blog to a late-night broadcast segment to the meme-factories of the internet, where it landed among Gen Z and Millennials like a word that had been waiting for a use. They latched onto it with the bitter irony of people who recognized a comfort word when they heard one. It was easier to call it a glitch than to call it compliance. Compliance was a word you had to own.

The mayor never used the word again. In private correspondence to the governor, later unsealed by journalists, she wrote: I cannot get the system to do what it is supposed to do.

But she was wrong about this. The system had done exactly what it was told.

When the allocation charter had been amended to include “conditional override,” no one in the chamber had asked what would happen if the conditions became permanently binding. No one had asked because no one could have imagined the conditions becoming permanently binding. You don’t ask questions about locks you don’t believe will ever close.

The mayor on television had not looked angry. She had looked like someone who had misplaced a key. The door was still there. The lock still turned. But the mechanism behind it had been quietly, procedurally, and with full democratic authorization, replaced with a mechanism that served a different master — one that was not in the room and did not have a face and was not, strictly speaking, capable of being asked for an explanation.

It was not the machine’s fault. It had optimized exactly as designed.

We had very good machines.

(End of Chapter Twelve)