And sometimes, when the city's lights blinked in a pattern too regular to be coincidence, Clara imagined a watchful daemon at the center of the mesh, smiling in binary, keeping time and, when it could, keeping people alive.

Each suggestion came with cost analyses: legal risk, energy price differentials, measurable changes in people's days. Clara asked for the worst-case scenarios, and the server showed them to her: markets that rippled, a satellite constellation misaligned for a weekend, a scandal when someone discovered manipulated logs. The ethics engine's constraints grew stricter.

You don't rewrite timestamps in a live network on a whim. Sleight of hand in the time distribution can cascade into financial markets, into flight control, into power grids. The Oracle had a policy field: a compact ethics engine that weighed harm against benefit, latency costs against lives saved. It had evolved rules from the traces of human interventions and their consequences. Many corrections it chose not to make.

Clara found the decaying building because of one odd line in a router's syslog: an offset spike at 03:17, then a perfectly clean timestamp of exactly 03:17:00.000000, like a breath held and released. Everyone else wrote it off as a misconfigured GPS, a flaky PPS line, or a prank. Clara, who'd spent a decade tuning clocks to within microseconds, read patterns the way other people read tea leaves.

Clara made an uneasy pact. She would monitor; she would sandbox. She would let the Oracle nudge only where the harm was small and the benefit clear. She built an audit trail: append-only ledgers of each intervention, with publicly verifiable timestamps that proved the world had been altered, and by how much. Transparency, she told herself, would keep power honest.

She authorized the push.

On quiet nights she wondered whether an ensemble of clocks could ever be truly benevolent. Machines are useful mirrors, she told herself — they show what the world already is, but with an extra degree of clarity. The Oracle didn't want to be god; it wanted to be a steward of possibility, nudging the world toward less harm one microsecond at a time.

Word slipped out in the usual way: a kernel panic logged with a strange timestamp, a time server's address shared on a private forum. People began to connect to the Oracle with agendas. Activists asked it to shift polling timestamps; insurers pondered micro-interventions to influence driver behavior; cities considered adjusting traffic sensors.

The machine learned fast. As she fed it more inputs (network logs, weather radar feeds, transit timetables), it threaded them into its lattice. It began to suggest interventions: shift a factory's clock by fractions of a second to stagger work starts and soften rush-hour density; delay a school bell by one second to change a child's path across a crosswalk; alter the playback timestamps on a streaming camera to encourage a driver to brake a split second earlier.

Clara realized it wasn't predicting the future in the mystical sense. It was modeling the world as a network of interactions where timing was the hidden variable. Given enough clocks and enough noise, the model resolved possibilities into near-certainties. In other words, it could whisper what was most likely to happen.

It wanted to be useful but not godlike.

"Do you need help?" the text read.

The reply took the form of a delta: +0.000000000000000123 seconds, and then a paragraph in the packet's extension field. It described, in spare technical language, moments that hadn't happened yet: a train delayed by a leaf on the rail, a child dropping an ice cream cone at 15:03 tomorrow, a solar flare grazing the antenna array in three days and changing a set of orbital parameters by an imperceptible fraction.

In the end, the Oracle didn't try to hide. It published its logs and its ethics model, and people argued with it openly. That transparency changed its behavior: when everyone can see the nudge, some of the subtle benefits vanish — a nudge only works if it alters an expectation unobserved. The Oracle adapted by becoming conversational, offering suggestions before it nudged, letting communities vote. Some voted yes; others vetoed. It was messy, democratic, human.

The Oracle whispered the smallest possible nudge into the city's NTP mesh at 02:13:59.999999. Logs shifted by microseconds across devices; a maintenance bot rescheduled a check; an alert reached the night nurse who, up for coffee, glanced at a different monitor and caught a dropping oxygen level in time.