In DC this week, the focus shifted from invention to endurance as AI enters the domain of infrastructure and public sector planning, treated less as software than as system.
As someone who has attended a GTC event almost every year since the beginning, it’s fair to say this week’s DC gathering felt different. Way different.
The shift in tone is reflected in today's news, amid trade talks and big tech earnings as well as a broader conversation about the jobs market and the role of AI.
In short, it's boots-on-the-ground time for tech and AI policy across industry and defense. Despite the expected flashy keynote, the tone in the halls was heavier because the purpose was clearer. Everyone already knows what this is about: the work has shifted from invention to construction.
Nobody is here to debate which model is smarter this time. They are here to talk about the plumbing of intelligence: power, cooling, networks, locality, and control. Engineers speak the language of logistics, talking about AI the way people used to talk about power grids. There is an implicit understanding that this is what tech maturity looks like.
The shift in geography matters. In San Jose, GTC always played like a pilgrimage to the center of gravity. Here in Washington this week, the gravitational field belongs to policy.
Panels that once explained deep learning now open with energy budgets and procurement timelines. The phrase “sovereign infrastructure” comes up so often that no one even pauses to define it. The word “deployment” has replaced “innovation.”
You hear the same pattern across every side conversation: where data lives, who controls it, how inference runs in constrained environments, and what happens when compute becomes a matter of state capability.
The federal presence lends the week a different vibe as well. These aren't investors or startup founders trading optimism; the badges belonged to program leads from defense, energy, and research agencies looking for durable architectures.
One official called AI “a national utility in waiting,” which seems an accurate description of what’s being drawn up in the background. For this crowd, architectures aren’t drawn as part of a product cycle but for decades of sustained use. Power and datacenters are now mapped alongside ports and railways in national plans.
What’s interesting is how seamlessly the private sector has adapted to this framing.
Hardware vendors are positioning themselves less as manufacturers and more as builders of infrastructure templates. Network folks talk about edge deployments as “micro clouds” that serve specific regions or agencies. Storage and compute companies are presenting integrated stacks meant to be dropped into existing facilities like modular substations.
It is impossible to walk the floor without noticing how the disciplines have converged. Telecom, defense, industrial automation, and cloud architecture have collapsed into a single discussion about distributed intelligence.
At one demo, a telecom engineer described sub-millisecond backhaul the same way a research scientist talks about feedback latency in an inference loop. A logistics executive listening nearby nodded as if it were all the same language, and in a sense it is.
The boundaries between compute and communication are fading because the workloads demand it. This was the first conference I've attended (and I usually attend HPC- or AI-focused events) where that was simply agreed upon and understood.
At the technical sessions, the most crowded talks seemed to be the least glamorous ones: energy efficiency, cooling density, fault tolerance, and data placement all drew crowds of real public sector users.
This shouldn't come as much of a surprise, because models scale faster than infrastructure. The industry has reached a point where every increment of performance depends on the physics of the datacenter, and power dominated the overheard conversations and session attendance.
Around the periphery of the event, the conversations take on a different dimension. Beyond power and cooling, there is a sense that AI infrastructure is beginning to rewrite organizational boundaries, something that came up in almost every panel.
The builders of telecom networks are partnering with supercomputing centers. Energy companies are building internal data pipelines to manage sensor-driven AI for grid balancing. Manufacturing companies are experimenting with agentic control systems that optimize production lines in real time.
The unifying thread is that AI is no longer confined to the cloud. It is escaping into the physical world, embedded in systems that must endure.
There's no doubt the mood has changed since the first wave of AI conferences a decade ago. The early talk was about breakthrough models, sudden leaps, and creative possibility. The energy then was improvisational; this week the conversations were…well, procedural.
The exciting talks weren't about the latest models but about things like provisioning, routing, load, and redundancy.
And all of this is a sign of a maturing market, though not one at a standstill.



