Seeing the Next Phase of AI: Adam Little Reflects on NVIDIA GTC 2026

Publish Date: 04-08-2026

By: Adam Little, Elevate Board of Directors President

Each year, NVIDIA GTC brings together some of the brightest minds in AI, infrastructure, and enterprise technology. The 2026 event in San Jose made one thing clear: we are entering a new phase of AI, moving beyond experimentation and into real-world execution at scale.

As I reflect on the event and the conversations that followed at our reception, a few themes stand out.

The Shift to Agentic and Physical AI

Three areas left a lasting impression: agentic AI, physical AI, and the Vera Rubin platform.

We are moving past large language models and chatbots into something far more capable. Tools like OpenClaw and NemoClaw are helping build an ecosystem where AI agents can reason, plan, and execute multi-step tasks. This shift fundamentally changes how organizations will approach automation and productivity.

Physical AI is also becoming real. Digital twins, robotics, and AI factories are no longer conceptual. They are actively shaping the future of manufacturing and operations. Integrating real-time factory data into digital environments is pushing the boundaries of what is possible while reducing risk in physical production.

The transition from Blackwell to Vera Rubin is not just another hardware upgrade. It represents a fundamental shift in the economics of inference at scale. Taken together, these innovations show that NVIDIA is no longer just a chip company. It is delivering the tools and blueprints for what feels like the next industrial revolution.

Where the Physical and Digital Worlds Meet

One of the most impactful sessions I attended was a panel on building the future of manufacturing, featuring leaders from Siemens, ABB Robotics, and SK Hynix.

What stood out was how seamlessly the physical and digital worlds are converging. The integration of real-time factory systems, such as laboratory information management systems (LIMS) and manufacturing execution systems (MES), into digital twin environments is already happening. With platforms like Omniverse enabling real-time simulation, organizations can test, refine, and optimize processes before deploying anything on the production line.

That level of visibility and control can significantly reduce errors and unlock new efficiency.

Infrastructure Still Matters

Infrastructure was another major theme, particularly Dell Technologies’ expanded AI Factory with NVIDIA.

Two areas stood out: the Dell Lightning File System and agentic AI blueprints. Scaling high-performance storage is one of the biggest challenges as AI adoption grows. At the same time, many organizations are still asking a basic but critical question: where do we start?

Bringing together industry leaders and delivering practical starting points through agentic blueprints helps organizations move from curiosity to execution.

Meeting Organizations Where They Are

At our post-event reception, this theme continued. While there was excitement around NVIDIA’s Vera Rubin architecture, not every organization is ready to jump into water-cooled racks and large-scale AI data centers.

This is a journey. Solutions from Intel, NVIDIA, and Dell Technologies are evolving to meet organizations where they are today. Options like Intel Xeon CPUs and Intel Gaudi AI accelerators provide alternative paths forward. The key takeaway is that there is no single right way to approach AI infrastructure; the only wrong move is not getting started at all.

The Data Problem and Workflow Shift

For practical takeaways, I would focus on two things: data and workflows.

First, the data problem. Getting data into a common structure with proper governance is critical. Start with data literacy and a shared data dictionary. If teams cannot align on business terms, AI systems will only amplify confusion.

Second, AI development is moving beyond prompts. Prompt engineering and chatbots still have value, but the real opportunity lies in workflows. Agentic systems that can execute multi-step processes are where organizations will begin to see immediate and meaningful impact.

Control, Governance, and Hybrid Models

Another trend was the growing demand for control over data and AI outcomes.

Solutions like the Nemotron family and Sovereign AI blueprints push more workloads back on-premises, giving organizations greater governance over sensitive data and reasoning processes. The public cloud still plays a role for burst capacity and scale.

This hybrid approach is quickly becoming the standard for enterprise AI strategy.

A New Perspective on Infrastructure

One surprising discussion centered on Groq and its Language Processing Unit technology.

At first, it seemed like an outlier in a world dominated by GPUs. But with NVIDIA integrating this technology into Vera Rubin, the direction is clear. Inference efficiency is emerging as the next bottleneck. For IT and data teams, the future of AI infrastructure is heterogeneous, combining different types of compute to achieve the best outcomes.

Final Thoughts

If there is one thing I would emphasize, it is that none of us have all the answers right now. The pace of change in AI is fast, and we are all learning as we go.

What matters most is staying engaged, experimenting, and learning from each other. I encourage all of you to continue the conversation within the Elevate community. Share what is working, what is not, and where you see value. That collective insight is one of our greatest strengths.

I would love to hear what you are seeing in your own organizations and how you are approaching this next phase of AI.