Chris Powell, Chief Scientist at SAIC, describes a future shaped by quantum and data convergence, where proximity replaces scale and intelligence forms at the source.
As Chief Scientist at SAIC, Chris Powell spends his days thinking about what happens when scale itself stops working, when the size and speed of the world overtake the systems built to understand it.
His office sits inside the company’s Technology Horizons group, a place meant to explore what comes after the limits of computing, and his view of that horizon is stark.
The next era, he says, will not be defined by how fast we can calculate but by how close we can think.
SAIC’s work covers an enormous stretch of modern complexity. Its systems run inside defense networks, guide digital twins for large engineering projects, process satellite imagery, and, increasingly, manage quantum experiments that sit somewhere between physics and information theory.
It’s Powell’s job to cut across all of it. He listens for the patterns that emerge when data grows faster than it can move, and he told us on a recent episode of Shared Everything that the problem used to be computation; now it’s distance.
The volume of information produced by the world has exceeded our ability to transport it, and that has changed the nature of the machine itself, he argues.
For most of computing’s history, the model was simple. Bring the data to the processor. Build bigger networks, faster memory, and larger systems to hold it. The industry spent half a century perfecting that motion. But Powell says that time has passed. Roughly 50 zettabytes of new data enter the world each year, and every byte that travels burns time and power.
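To see why transport is the bottleneck, a back-of-envelope sketch helps. This is my illustration, not a figure from Powell: it takes the ~50 zettabytes-per-year number above and asks how long one year's worth of data would take to cross a hypothetical sustained 1 Tb/s link.

```python
# Back-of-envelope sketch (illustrative only): time to move one year's
# worth of new data, assuming the ~50 ZB/year figure and a hypothetical
# sustained 1 Tb/s (125 GB/s) network link.
ZETTABYTE = 10**21  # bytes

data_bytes = 50 * ZETTABYTE          # ~50 ZB of new data per year
link_bytes_per_sec = 125 * 10**9     # 1 Tb/s expressed in bytes/second

seconds = data_bytes / link_bytes_per_sec
years = seconds / (365.25 * 24 * 3600)
print(f"{years:,.0f} years to move one year of data")  # ≈ 12,675 years
```

The exact link speed is an assumption, but the conclusion is insensitive to it: even a thousand such links in parallel leave the transfer more than a decade behind the data it is chasing.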
In Powell’s view, the work ahead is no longer about making data move faster; it’s about stillness, the moment when the machine stops chasing information and begins to think where the data already lives.
This idea guides nearly everything SAIC builds. In a hypersonic defense scenario, a decision must occur in seconds. No central system can keep pace with that. In weather prediction or satellite analysis, the data originates at the edge of the atmosphere and loses value every millisecond it waits for interpretation. The same is true, he adds, for quantum sensing, financial intelligence, and digital engineering among other areas.
The answer, Powell argues, is to turn every point of collection into a point of thought.
Powell describes VAST Data as a kind of architectural enabler for the shift he is trying to engineer. He explains that the structure of VAST’s data layer allows systems to bypass the rigid, sequential I/O patterns that limited past generations of supercomputing.
Traditional approaches relied on linear algebra routines that demanded strict organization of data, but the problems SAIC now faces are unpredictable, with random, unpatterned access. In that environment, Powell says, the organization of infrastructure matters as much as the algorithms themselves.
The design of VAST’s platform, with its ability to handle randomness at scale and sustain consistent performance, “inherently enables that kind of success,” he says, meaning it lets new quantum- and AI-driven workloads run without collapsing under their own data demands.
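The random-access point can be made concrete with a toy experiment. This is my illustration, not SAIC's workload: the same volume of reads issued sequentially and then in shuffled order against a flat buffer. Random order defeats the prefetching and caching that sequential, linear-algebra-style access enjoys, which is the pattern Powell says modern storage layers must absorb without reorganizing the data first.

```python
# Minimal sketch (illustrative): sequential vs. random reads over a buffer.
import random
import time

buf = bytearray(64 * 1024 * 1024)    # 64 MiB buffer standing in for storage
block = 4096                         # 4 KiB "I/O" size
offsets = list(range(0, len(buf), block))

def read_all(order):
    """Touch one byte per block in the given order; return elapsed time."""
    start = time.perf_counter()
    total = 0
    for off in order:
        total += buf[off]
    return time.perf_counter() - start

t_seq = read_all(offsets)            # sequential pass
random.shuffle(offsets)
t_rand = read_all(offsets)           # random pass
print(f"sequential {t_seq:.4f}s vs random {t_rand:.4f}s")
```

In RAM the gap is modest; against disk or a networked store it grows by orders of magnitude, which is why infrastructure that tolerates randomness natively matters as much as the algorithm on top of it.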
That change might sound subtle, but it rewrites the idea of what computing is. To bring processing to the data is to erase the space between storage and logic. The old boundaries (memory here, compute there, humans somewhere in between) collapse into a single continuous surface. Transistors can shrink only so far, he says. Beyond that, the limits are not mechanical but spatial. The shortest possible path is no path at all.
Quantum computing, in his view, is a sign of how this will unfold.
The real importance of quantum is that it can compute where the data lives. Every state exists at once until the problem defines its own outcome, and Powell sees that as the pattern that will guide classical systems too.
The future machine will no longer pull data through itself. It will exist within the data, shaped by it, learning from it in place.
Powell describes this shift as though it were already happening, and in many ways it is. SAIC’s research in digital engineering ties simulation, manufacturing, and operations into one continuous feedback loop. Each system refines the next, creating a world that models itself as it runs. The company’s mission, as he describes it, is not to build a faster computer but to design an environment where computation and observation occur in the same moment. That goal extends from national security to earth science to medicine.
Powell likes to recall the concept coined by his mentor, Steve Chen: a supercomputer in a soda can.
This was once a metaphor for impossible density, but perhaps there was a ring of prophecy to it. A system that once filled a datacenter can now live inside a satellite or a sensor pod, and power efficiency becomes the new measure of intelligence. Ten microwatts running for ten years in space, he tells us, would be a greater breakthrough than any exascale milestone.
The idea is not to make computing bigger but to make it disappear into the world.
And the more Powell describes this future, the more it sounds inevitable. The old strategy of adding scale has reached its end, and now we must look to proximity. What began as a push for performance has become a search for nearness, until the distance between data and understanding is gone.
When that happens, the machine will no longer stand apart from what it measures. It will simply be there, thinking inside the data itself.
In the end, Powell’s view of the future depends on architectures that can hold their own against the flood.
The systems that will matter most are the ones that treat randomness as native, that can organize and interpret data without forcing it into shape first. They will live closer to the source, fluent in the unpredictable, capable of turning chaos into understanding in real time.