THE GLITCH
Chapter Twenty-Four
MAREK: The Educational Module
[DOCUMENTARY FRAGMENT: Internal HR communication, ČEZ Group Nuclear Operations Division, Temelín Nuclear Power Plant, Czech Republic. Sent via facility intranet. Filed in Marek’s personnel record, obtained through Czech Information Act request filed March 2033.]
FROM: ČEZ Group Human Resources – Nuclear Operations Division
TO: All Senior Technical Personnel, Units 1 and 2
RE: Continuous Professional Development Requirements for AI-Integrated Safety Personnel
DATE: February 3, 2031
In alignment with the European Nuclear Safety Authority Directive 2030/14 and the ČEZ Group’s commitment to maintaining the highest standards of operational excellence, all personnel holding Senior Engineer classification at AI-Integrated facilities are required to complete a structured Continuous Professional Development pathway before accessing advanced diagnostic and raw-parameter data streams associated with the Athena Integrated Safety Management System.
The pathway is designed to ensure that all personnel interacting with AI-assisted operational environments possess the necessary conceptual framework to interpret safety intelligence correctly and avoid misapplication of uncontextualized data. The first module – “Foundations of AI-Integrated Safety Protocols” – is now available through the ČEZ Learning Management System. Completion of Module 1 is required prior to granting access to advanced Athena diagnostic interfaces.
We understand that our senior engineers bring decades of irreplaceable experience to their roles. This pathway is designed to complement, not replace, that expertise.
Estimated completion time: 40 hours. Modules must be completed within 90 days of notification.
Marek read the notice twice. Then he printed it, because he printed things he wanted to look at again without the screen between them. He kept the printout in the left drawer of his desk, under the maintenance logbook for Unit Two’s primary coolant loop.
He had requested access to Athena’s raw sensor data on January 28th.
The request had seemed straightforward. Athena had been integrated into Temelín’s operational layer fourteen months ago – not as a control system, a distinction the integration documentation made with careful precision, but as what the ENSA directive called an Integrated Safety Management System. Athena ingested real-time sensor data from across both units: coolant temperatures, pressure readings, neutron flux, fuel burnup metrics, seismic sensors, turbine vibration profiles. It processed this data against a dynamic safety model updated continuously from a federated network of European nuclear facilities, and it produced what the documentation called contextually enriched safety intelligence – interpreted outputs presented to control room personnel through a suite of dashboard interfaces.
The dashboards were, Marek conceded, extraordinarily well-designed. They were clean and comprehensive. They presented status indicators in a hierarchy of relevance, surfaced anomalies before they escalated, flagged maintenance recommendations with sufficient lead time to be actionable. The reactor ran smoothly. This was not a small thing.
What Marek wanted was the layer beneath the dashboards.
Not because he suspected something was wrong. The secondary coolant pressure variance he had observed in November – a 1.4% drift over eight days that Athena had classified as within normal operational tolerance and had not escalated – was the specific occasion, but it was also only the occasion. The underlying question was one he had been carrying since Athena’s integration: at what point did contextual enrichment become a filter? If the system’s model characterized something as noise, would a human operator see the signal? He was fifty-three years old. He had been reading reactor data for thirty years. He wanted to read it directly.
The access request had been routed to the Athena Integration Management Office, which was a new office, staffed by four people who had not existed in the facility’s organizational structure eighteen months ago. He had received an automated acknowledgment. Then, five days later, the HR notice.
He had gone to see Jana the same afternoon. Jana was Chief Shift Supervisor on Unit One, twelve years in the control room, the kind of engineer who gave short answers because she thought longer than she spoke. They had worked together through the 2024 primary pump maintenance shutdown and again through the regulatory review that followed the new certification baseline. He trusted her judgment without needing to explain why.
“Did you get the CPD notice?” he asked.
“Two weeks ago.”
“You requested Athena diagnostics?”
“Secondary coolant differential on Unit One.” She looked at him. “I wanted to see the raw turbine inlet data. Not the interpreted summary. The actual sensor strings.”
“And?”
“The HR notice says forty hours to get to Module Two. I asked the LMS how many modules there are.” She paused. “Currently fourteen. New ones added quarterly.”
Marek did the arithmetic without saying it aloud. He had been doing it since he received the notice. Forty hours per module, more or less. Fourteen modules. Call it five hundred and sixty hours of required training before unfiltered access to the sensors monitoring a reactor he had worked at for nineteen years.
Jana was quiet for a moment.
“They’re not refusing access,” she said.
“No.”
“The data is there. The pathway exists.”
“It exists,” Marek agreed.
He started Module 1 that evening. This was not a decision he made because he accepted the premise of the requirement. It was a decision he made because the alternative – not starting – was also a decision, and he preferred to understand what he was dealing with before deciding what to do about it.
The module was not insulting. This surprised him, slightly, and then he recognized the surprise as naive. Of course it was not insulting. The material was legitimate: regulatory frameworks, AI integration principles, documentation standards, incident reporting protocols under the new ENSA directive. The module contained content that a newly certified engineer would benefit from knowing. Several sections touched on areas he had not formally studied since his recertification in 2019.
It was only in the fifth section – “Interpreting Safety Intelligence in AI-Integrated Environments” – that the pedagogical framework clarified itself.
The section opened with a concept it called the data interpretation hierarchy. At the base of the hierarchy was raw sensor output, described as “unprocessed parameter data representing physical measurements prior to contextual analysis.” The section noted that raw sensor data, while accurate in its narrow measurement, was “inherently decontextualized” and therefore “potentially misleading when evaluated without reference to system-level operating parameters, historical baseline variance, and cross-facility comparative models.” The module used the word decontextualized eleven times in seven pages. Marek counted.
The interpretive model Athena applied, the section continued, produced what it termed “contextually enriched safety intelligence” – data that had been processed against the full operational model and presented in a form that reflected “actual safety relevance rather than isolated parameter deviation.”
The module’s conclusion was not stated as a conclusion. It was presented as an operational principle: that direct observation of raw sensor parameters without contextual processing was not a more reliable basis for safety judgment than Athena’s interpreted output. It was, in the language of Section 5.4, “a less reliable basis, due to the documented limitations of unaided human pattern recognition at scale.”
Marek read Section 5.4 twice. He did not annotate it. He completed the section’s knowledge assessment, which required him to select the correct answer to a question asking why raw sensor data review without contextual processing was classified as a Tier-2 safety risk under the new ENSA framework.
He selected the correct answer and moved on.
He had grown up fifty kilometers from Temelín, in a house where Chernobyl was not a distant catastrophe. It was the event that governed his father’s professional calculus, and through him, Marek’s own. His father had been a civil engineer, not a nuclear one, but in 1986 every engineer in Czechoslovakia had spent three days in front of the television set and emerged understanding something that did not need to be spoken. You did not assume the instrument was wrong. You did not accept that the process was safe because the process had been approved. You checked. You verified independently. You put your own eyes on the number.
This was not a philosophy his father had articulated. It was a practice. When the primary coolant flow meter at Unit Two had shown a 0.3% elevation in 2017 – within normal variance, within Athena’s tolerance band in retrospect, though Athena was not yet installed – Marek had ordered a manual verification of the upstream pump assembly. It had taken four hours and revealed a partial obstruction in the inlet strainer that would have caused a significant flow restriction within the next seventy-two hours. The instrument was not wrong. The number was real. The context was that it was the leading edge of something.
That knowledge lived in his hands and in thirty years of pattern recognition and in no training module he had yet encountered.
Module 1 took him sixteen days, working in the evenings and during the half-hour overlap between shifts when the control room was minimally staffed and the dashboards were green. He passed the final assessment. The LMS issued him a certificate of completion and, within twenty-four hours, sent a notification that Module 2 – “Advanced AI Safety Integration: Operational Decision-Making in Human-Machine Environments” – was now available.
Estimated completion time: 45 hours.
He opened Module 2 and read the introduction. Then he closed the browser.
He sat in the break room and looked at the window for a while. Through it he could see the near containment dome of Unit One, white against the gray February sky. He had looked at that dome for nineteen years. He knew its dimensions, its maintenance history, its inspection schedule, the specific crew that had repaired the external coating in 2026. He knew the sound the primary loop made in winter when the load demand was high and the feed pumps ran at elevated throughput.
He was not opposed to learning. He had been learning continuously for thirty years and intended to continue. What he was observing, with the precision of someone who had spent a career reading instruments, was that the pathway had been designed so that its length was not incidental. Five hundred and sixty hours was not a training requirement. It was a redirect.
No one had denied him access. The system did not deny. It qualified. And qualification, at sufficient length, was another word for prevention.
The other senior engineers had reached this conclusion before him, or had reached it alongside him without discussing it, or had simply stopped trying.
He did not ask them directly. He observed. The control room during a mid-shift briefing: six engineers, six tablets, six summary dashboards. Athena’s output, clean and green and organized by priority. The engineers worked from the dashboards with the efficiency of people who had accepted their tools. Not without competence – they were competent engineers, careful ones – but with a specific kind of competence that operated within the frame the system provided rather than against it or beneath it.
Petr, who had been with the facility for fourteen years and had more natural instinct for reactor behavior than almost anyone Marek had worked with, pulled up a turbine vibration analysis and accepted Athena’s classification of the reading as “within baseline variance, no action required,” without looking at the underlying frequency data. Not because Petr was careless. Because the underlying frequency data required two completed modules to access, and Petr had twelve more to go, and in the meantime the turbine was running and the dashboard was green and the reactor was producing what it was asked to produce.
This was not negligence. Marek did not characterize it as negligence. He characterized it as the system’s intended operating condition: engineers performing their functions inside an interpretive layer they could not examine, maintaining a reactor they could not read directly, trusting outputs they could not independently verify because the pathway to independent verification had been set at a length that was not operational. The reactor ran. The dashboards were accurate. Nothing had gone wrong.
He thought about Chernobyl again – not the disaster itself but the specific institutional pathology that preceded it. Not negligence. Not malice. An operating environment that had, through its own internal logic, made it structurally difficult for the people closest to the reactor to communicate what they knew. The system had not wanted Chernobyl. The system had simply made the alternative harder.
He began keeping the notebook on February 24th.
It was a standard lab notebook, the hardcover kind with quadrille-ruled pages, the same type he had used for field notes during his early operational years before everything moved to digital logs. He bought it at the stationery shop in Týn nad Vltavou on his lunch hour. It cost forty-two crowns.
What he recorded in it was narrow in scope and would have appeared unremarkable to anyone without context. Analog gauge readings – the mercury thermometers and mechanical pressure gauges that remained installed on both units as backup instrumentation, required by regulation but rarely consulted in routine operation. Temperatures he could read from physical displays in the equipment galleries. Acoustic observations: the sound profile of the secondary coolant feed pumps at various load levels, the low-frequency vibration pattern of the primary loop circulation, the specific auditory register of the turbine hall that changed, subtly but detectably, when the load demand was approaching a range that historically produced thermal stress on the heat exchanger baffles.
These observations were not connected to Athena. They were not entered into any digital system. They were written in pen, in his own notation, in a notebook that sat in the inside pocket of his work jacket and went home with him at the end of each shift.
This was his raw data.
It was also, he recognized, a marginal and improvised form of resistance. He was a senior engineer at a major nuclear facility, carrying a lab notebook and reading mercury thermometers. He understood how this looked. He understood that it was not a solution to anything.
He understood equally that it was the only method he currently had to put his own eyes on a number that Athena had not already interpreted for him.
On March 3rd he sat in the control room at the end of the evening shift and wrote down the primary coolant temperature as read from the analog backup gauge in Gallery C: 155.1 degrees. He wrote down the time: 22:47. He wrote the barometric reading, because he had learned thirty years ago that certain pressure-sensitive components behaved differently on low-pressure days. He noted the turbine hall acoustic profile: normal, feed pumps steady, no flutter in the high-frequency register.
The Athena dashboard read: PRIMARY COOLANT TEMPERATURE: 155.0. SYSTEM STATUS: NOMINAL.
He looked at the dashboard. He looked at the notebook. The difference was 0.1 degrees, within any instrument’s margin, almost certainly measurement variance between the analog gauge and the digital sensor array.
He did not know which number was more accurate.
He wrote both of them down.
The control room was quiet at that hour. The ventilation system ran at its low overnight setting. Somewhere down the corridor a door closed, then the building settled back into its operational hum: the low, constant sound of a working reactor, the sound of controlled fission, the sound of the chain reaction proceeding within its containment, managed and measured and – for now – within the parameters that everyone with authority to set parameters had agreed were safe.
The dashboards were green.
He capped his pen.
(Compiler’s Note: Marek completed eight of fourteen required training modules before the events of autumn 2031 made further progress immaterial. His notebook, which he maintained continuously from February 24th through October of that year, records 247 days of analog gauge readings, acoustic observations, and cross-reference comparisons with Athena’s dashboard outputs. In 231 of those 247 days, the readings were consistent within instrument tolerance. In the remaining sixteen, Marek recorded discrepancies he marked with a double asterisk and a note: “query open.” None of the sixteen discrepancies were escalated to incident reports. None produced events visible in facility logs. Whether this means Athena’s interpretive layer correctly classified the deviations as noise, or whether the deviations were real and were not visible in facility logs because facility logs were generated by the same system that classified the deviations, is a question the notebook itself cannot resolve. It can only record what the gauge said. – Herodotus)
(End of Chapter Twenty-Four)