It’s not every day that a data company, a GPU powerhouse, and a century-old professional sports league share a room. But that’s exactly what happened when VAST, NVIDIA, and the National Hockey League (NHL®) took the stage at GTC 2025.
On paper, they have little in common, save for a shared belief that infrastructure should never be the limiting factor in achieving something bold.
Six years ago, the NHL came to VAST with a problem that, if you squint, doesn’t seem all that different from a hundred other AI-era data challenges.
They had a lot of video—around 550,000 hours of it. Footage dating back to 1917. Game tapes. Archival score sheets. Player photos. Documentary B-roll. 25 million discrete images, all told. But that wasn’t the problem.
The problem was they had no seamless way to make use of it.
Everything lived in a tape library perched on the 23rd floor of a skyscraper in Manhattan, across from Madison Square Garden.
The location was ironic. The sport’s past—its story, its soul—sat across the street from one of hockey’s most famous arenas, yet it was buried behind latency, access restrictions, and analog limitations.
The NHL had built a library. But what it needed was a live newsroom.

Hockey, unlike many other professional sports, is fast. So fast that even if you have the footage, you don’t have the time. Between the whistle and the next play, a commentator has maybe ten seconds to say something meaningful. Compare that to the NFL, where actual gameplay is something closer to 17 minutes across a three-hour broadcast, and the rest is pacing, storytelling, graphics, ads, and the like.
Hockey doesn’t have the luxury of pause. That's great for the sport, but not so great for the people trying to contextualize it.
So the NHL needed a system that could take that century of footage and make it actionable, usable. Not tomorrow, but during the game. If a rookie scores a goal, the broadcaster should be able to instantly pull the last time an American-born left wing did something similar. That kind of thing.
And that kind of real-time context isn’t just about elevating the fan experience; it’s a prerequisite for modern sports storytelling.
That’s where VAST entered the frame—not just as an IT vendor, but as a systems partner who could architect a real-time, AI-optimized data pipeline that could skate from the arena to the cloud to the commentary booth.
Over the last six years, VAST and the NHL built something most fans don’t even know exists. And why would they, when it all appears seamless? But behind that easy-looking delivery, every NHL arena—32 in total—now runs VAST infrastructure locally.
Game footage is ingested on site and shipped live to NHL headquarters in New York, where it’s edited, archived, and, more recently, parsed by AI models.
That dense historical archive, once locked away on LTO tapes, has been migrated to a scale-out flash platform that supports massive concurrency and data-intensive workloads.
Petabytes of data, terabytes per second of throughput.
Still, capacity alone doesn’t get you instant insights. That’s where NVIDIA stepped in. Adam Ryason and his team, who build agentic pipelines for video search and summarization, are behind the AI Blueprint: a modular, GPU-accelerated platform that lets organizations like the NHL take decades of video, chunk it into digestible pieces, describe each chunk in natural language, and stitch the resulting embeddings into a searchable vector database.
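To make the idea concrete, here is a toy sketch of that chunk, describe, embed, and search loop in Python. Everything here is illustrative: the class and function names are invented, and the bag-of-words "embedding" stands in for a real vision-language encoder. But the shape of the pipeline is the same: caption each chunk, vectorize the caption, and rank results by similarity.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' (a stand-in for a real vision-language encoder)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ClipIndex:
    """A minimal in-memory vector index over per-chunk captions."""
    def __init__(self):
        self.entries = []  # (caption, embedding) pairs

    def add(self, caption: str) -> None:
        self.entries.append((caption, embed(caption)))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[1]), reverse=True)
        return [caption for caption, _ in ranked[:k]]

index = ClipIndex()
# In the real pipeline, captions like these would be generated by a
# vision-language model describing each chunk of game footage.
index.add("rookie left wing scores a breakaway goal in the third period")
index.add("goalie makes a glove save during the power play")
index.add("crowd celebrates an overtime winner")

print(index.search("rookie goal", k=1))
# → ['rookie left wing scores a breakaway goal in the third period']
```

In production, the embedding would come from a GPU-hosted model and the index from a purpose-built vector database, but the query path, from free text to ranked clips, works the same way.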
The magic for an org like the NHL? You don’t need to know what file to open. You just need to know what you’re looking for.
The result of this combined effort is something like an AI-enhanced memory of the game, one that gets sharper over time.
The vision-language models are pretrained, then fine-tuned on hockey-specific footage to learn what, for instance, a power play looks like, what counts as a highlight, or even when the crowd starts losing it.
The retrieval pipeline connects to a RAG stack that lets editors, announcers, and even fans ask plain-language questions—“show me all rookie goals against this goalie” or “show me every time there was a breakaway in the third period last night”—and get answers, clips, and context in almost real time.
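Questions like those typically decompose into two parts: hard filters on structured metadata (period, event type, rookie status) and soft ranking on free text. Here is a hedged sketch of that pattern; the `Clip` fields and the `retrieve` helper are invented for illustration and are not the NHL’s actual stack.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    caption: str
    metadata: dict = field(default_factory=dict)

def retrieve(clips, query_terms, filters, k=3):
    """Filter clips on exact metadata, then rank by naive term overlap with the caption."""
    candidates = [
        c for c in clips
        if all(c.metadata.get(key) == value for key, value in filters.items())
    ]
    def score(clip):
        words = set(clip.caption.lower().split())
        return sum(term in words for term in query_terms)
    return sorted(candidates, key=score, reverse=True)[:k]

archive = [
    Clip("breakaway goal in the third period",
         {"event": "goal", "period": 3, "rookie": True}),
    Clip("power-play goal from the point",
         {"event": "goal", "period": 2, "rookie": False}),
    Clip("glove save on a one-timer",
         {"event": "save", "period": 3, "rookie": False}),
]

# "Show me every breakaway goal in the third period" → metadata filter + term rank.
hits = retrieve(archive, ["breakaway"], {"period": 3, "event": "goal"})
print([c.caption for c in hits])
# → ['breakaway goal in the third period']
```

The design point is the split itself: metadata narrows the candidate set cheaply and exactly, while similarity ranking handles the fuzzy part of the question.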
It’s not just good tech. It’s something closer to a creative revolution in how a sport narrates itself.
The NHL’s David Lehanski, Executive Vice President of Business Development & Innovation, talks about the future in terms of customization: fans choosing their own camera angles, stats overlays, even highlight reels generated just for them.
Want to follow your favorite player around for an entire game, ISO-cam style? Done.
Want to build a social media package that shows every goal by a rookie in the last decade scored against a particular goalie? You can do that too.
The tech stack doesn’t just support engagement; it enables personalization at a fidelity never seen before.
The archive is no longer a museum—it’s a living asset. It can show who these NHL players are, what they’ve done, how the game has changed. It can make a case for greatness in ways players won’t (and maybe shouldn’t) make for themselves.
It can show the next generation what’s possible, because it remembers everything.
This isn’t about hockey, really. It’s about what happens when an organization treats its history as strategic input.
You realize your data isn’t dead weight—it’s narrative potential waiting to be unlocked.
The NHL had the foresight to ask the question. VAST built the system that lets them answer it.
What used to be a vault is now a vision.
The arena is no longer just a theater. It’s a studio. And every second is searchable.
Watch the session from NVIDIA GTC 2025 here:
NHL and the NHL Shield are registered trademarks of the National Hockey League. © 2025 NHL. All Rights Reserved.