I have a habit of becoming interested in technology trends only once they collide with reality. Flash memory wasn’t interesting to me because it was new – it was interesting because it broke long-held assumptions about how databases behaved under load.
Cloud computing wasn’t interesting to me because infrastructure became someone else’s problem. It became interesting when database owners started making uncomfortable compromises just to get revenue-affecting systems to run acceptably in the cloud. Compute was routinely overprovisioned to compensate for storage performance, leading to large bills for resources that were mostly idle. At the same time, “modernisation” began to feel less like an architectural necessity and more like a convenient justification for expensive consultancy services.
And now, just when I thought flashdba had nothing left to say, AI is following the same path.
We’ve Seen This Movie Before
For the last couple of years, most of the attention has been on training. Bigger models, more parameters, more GPUs, massive share prices. That focus made sense because training is visible, centralised and easy to reason about in isolation. But as inferencing starts to move up into the enterprise, something changes.
In the enterprise, inferencing stops being an interesting AI capability and starts becoming part of real business workflows. It gets embedded into customer interactions, operational decisions and automated processes that run continuously, not just when someone pastes a prompt into a chat window. At that point, the constraints change dramatically.
Enterprise inferencing is no longer about what a model knows. It is about what the business knows right now. And that is where things begin to feel very familiar to anyone responsible for systems of record.
Because once inferencing depends on real-time access to authoritative operational data, the centre of gravity shifts away from models and back towards databases. Latency matters. Consistency matters. Concurrency matters. Security boundaries matter. Above all, correctness matters.
This is the point at which inferencing stops looking like an AI problem and starts looking like what it actually is: a database problem, wearing an AI costume.
Inferencing Changes Once It Becomes Operational
While inferencing remains something that sits at the edge of the enterprise, its demands are relatively modest: a delayed response is tolerable and slightly stale data is acceptable. If an answer is occasionally wrong, the consequences are usually limited to a poor user experience rather than a failed business process.
That changes quickly once inferencing becomes operational. When it is embedded directly into business workflows, inferencing is no longer advisory; it becomes participatory. It influences decisions, triggers actions and – increasingly – operates in the same execution path as the systems of record themselves. At that point, inferencing stops consuming convenient snapshots of data and starts demanding access to live context data.
What Is Live Context?
By live context, I don’t mean training data, feature stores or yesterday’s replica. I mean current, authoritative operational data, accessed at the point a decision is being made. Data that reflects what is happening in the business right now, not what was true at some earlier point in time. This context is usually scoped to a specific customer, transaction or event and must be retrieved under the same consistency, security and governance constraints as the underlying system of record. In other words, a relational database. Your relational database.
Live context gravitates towards RDBMS systems of record. It does not appear spontaneously – it is created at the moment a business state changes. When an order is placed, a payment is authorised, an entitlement is updated or a limit is breached, that change becomes real only when the transaction is committed to the RDBMS. Until then, it is provisional.
Analytical platforms can consume that state later, but they do not create it. Feature stores, caches and replicas can approximate it, but they do so after the fact. The only place where the current state of the business definitively exists is inside the operational production databases that process and commit transactions.
As inferencing becomes dependent on live context, it is therefore pulled towards those databases. Not because they are designed for AI workloads, and certainly not because this is desirable, but because they are the source of truth. If an inference is expected to reflect what is true right now, it must, in some form, depend on the same data paths that make the business run.
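The commit boundary described above is easy to demonstrate. Below is a minimal sketch using SQLite (standing in for any transactional RDBMS), with a hypothetical `orders` table: a second connection playing the inferencing service cannot see an updated order status until the writing transaction commits – before that, the change is provisional and any "live context" read around it would be stale.

```python
import os
import sqlite3
import tempfile

# Hypothetical "orders" table standing in for a system of record.
path = os.path.join(tempfile.mkdtemp(), "live.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
writer.execute("INSERT INTO orders VALUES (1, 'PLACED')")
writer.commit()

# A separate connection plays the inferencing service reading live context.
reader = sqlite3.connect(path)

# The business state changes... but the transaction is not yet committed.
writer.execute("UPDATE orders SET status = 'SHIPPED' WHERE id = 1")

# The uncommitted change is provisional: the reader still sees 'PLACED'.
before = reader.execute(
    "SELECT status FROM orders WHERE id = 1"
).fetchall()[0][0]

writer.commit()  # the change becomes real only at commit time

# Only now does a fresh read reflect the true state of the business.
after = reader.execute(
    "SELECT status FROM orders WHERE id = 1"
).fetchall()[0][0]

print(before, after)  # PLACED SHIPPED
```

Any replica, cache or feature store sits downstream of that commit, which is why it can only ever approximate the current state after the fact.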
This is where the tension becomes unavoidable.
Inferencing Is Now A Database Problem
Once inferencing becomes dependent on live context, it inherits the constraints of the systems that provide that context. Performance, concurrency, availability, security and correctness are no longer secondary considerations. They become defining characteristics of whether inferencing can be trusted to operate inside business-critical workflows at all.
This is why enterprise AI initiatives are unlikely to succeed or fail based on model accuracy alone. They will succeed or fail based on how well inferencing workloads coexist with production databases that were never designed, built or costed with AI in mind. At that point, inferencing stops being an AI problem to be delegated elsewhere and becomes a database concern that must be understood, designed for and owned accordingly.
One outcome of this “unique” position was that many DBAs had to learn skills outside of their core profession (networking, Linux or Windows admin skills, SQL tuning, PL/SQL decoding, hostage negotiation etc.). I’d love to say this thirst for knowledge was due to professional pride, but the best DBAs I ever met simply learned these skills so they could prove they weren’t in the wrong and thus get an easier life. “Oh, you think your SQL runs slow because of my database, huh? Well, if you rewrote it like this, it runs in 10% of the time and doesn’t make all the lights go dim in the data centre, you imbecile…”