Davos 2026 | Power, security, and enterprise readiness to define next phase of AI adoption in 2026, say tech CEOs

At the World Economic Forum in Davos this year, conversations around artificial intelligence are shifting from promise to practice, with global technology leaders pointing to 2026 as a critical year when AI moves decisively from pilots to deployment at scale. Power availability, secure systems, and enterprise readiness are emerging as the key factors that will shape the next phase of adoption, executives told CNBC-TV18.

Speaking on the sidelines of the World Economic Forum 2026 in Davos, Jeetu Patel, President and Chief Product Officer at Cisco, said the AI industry is entering a new phase of maturity after years of experimentation. While 2025 saw companies move from simple chatbots to more advanced agentic AI, the coming year will be about putting those systems into production.

“2026 will be the production of agentic AI,” Patel said, adding that early forms of physical AI and large world models will also begin to surface, even as different approaches remain at varying stages of development.

That shift, Patel said, is driving investment into areas where constraints are becoming more visible. According to him, infrastructure, trust, and data are now the biggest bottlenecks for AI growth. There is simply not enough power, computing capacity, and network bandwidth to meet rising demand, while security and safety are essential if enterprises are to trust and use these systems at scale. Data gaps, particularly around machine-generated data, are also drawing attention. Patel noted that these areas align closely with Cisco’s portfolio, underpinning the company’s strong recent performance and outlook.

Power constraints are already emerging as a major risk to AI expansion, according to Varun Sivaram, Founder and CEO of Emerald AI. He warned that 2026 will be the year when energy shortages start to seriously limit AI ambitions in countries such as the US and India. “They have chips; they need power in 2026,” Sivaram said, pointing out that while tens of gigawatts of data centre capacity are planned, only a fraction can currently be connected to existing grids.

Emerald AI is attempting to address this challenge by making data centres more flexible in how they consume electricity. Sivaram said the Nvidia-backed company has moved from the lab to commercial deployment in just 18 months and has launched what it calls the world’s first power-flexible AI factory in the US. By allowing data centres to adjust power use in real time, he said they can connect to the grid much faster without pushing up electricity bills for surrounding communities, while making better use of spare capacity already available in energy systems.

For enterprises operating in an increasingly volatile and uncertain global environment, AI is also being seen as a critical decision-making tool. Chakri Gottemukkala, Co-Founder and CEO of o9 Solutions, said companies are facing unprecedented levels of complexity as supply chains and markets remain under strain. While large language models have attracted attention, he said their impact on enterprise decision-making has so far been limited. The next step, according to Gottemukkala, lies in combining the accessibility of LLMs with structured enterprise knowledge through what he described as neuro-symbolic AI, allowing insights to move beyond specialists and reach frontline teams more effectively.


As AI systems become more deeply embedded in business operations, concerns around trust and security are also rising. Jonathan Zanger, Chief Technology Officer at Check Point Software, said many AI solutions were not built with security in mind, leaving gaps that attackers can exploit. He said companies are now responding, with boards and CEOs significantly increasing budgets for AI security. “In 2026, we definitely need to double down on investment to ensure AI is adopted securely,” Zanger said.

Patel echoed that view, noting that while AI was initially used to strengthen cyber defences, companies now need to focus on securing AI itself. He said enterprise systems must be predictable and reliable, even as models grow more complex. Changes in architecture, driven partly by power constraints, are also reshaping how AI infrastructure is built, with multiple data centres increasingly linked to function as single virtual clusters.

Together, the comments from industry leaders in Davos underline a clear message: as AI moves into large-scale deployment, success in 2026 will depend less on experimentation and more on solving practical challenges around power, security, and enterprise integration.

Below is an excerpt from the discussion.

Q: Jeetu, let me start by asking you: what are the mega trends likely to shape the AI world in 2026?

Patel: In 2025 we had this move toward the second phase of AI, which was moving from chatbots to agentic AI. And I think there was a lot of experimentation, like you said, that happened. 2026 will be the production of agentic AI. But you’ll also start to see the early signs of physical AI with large world models start to come out as well. I think you’re going to start to see all three phases, each at a different stage of maturity.

Q: What does that mean in terms of investments? I mean, this has been a big tailwind for the US economy and, to some extent, for global growth. When we talk about investments, what is the market opportunity that you’re assessing at this point in time?

Patel: Investments are going to be in the areas where there are constraints. There are constraints in three areas: infrastructure, trust, and data. So if you start thinking about infrastructure, there is simply not enough power, compute, and network bandwidth in the world to satiate the needs of AI. That’s where we’re seeing a tremendous amount of investment. You also have to trust these systems; otherwise, you won’t end up using them. So security and safety will be a big deal. Then there’s the data gap, where machine data will actually start flowing into AI. Those are the three big areas where we are starting to see a fair amount of investment continuing to propagate.

Q: And for Cisco specifically, what does 2026 look like?

Patel: It turns out that those three areas happen to be directly congruent with the areas where we have a fair number of products.

Q: And just a coincidence?

Patel: But we’ve had a great year. The past few quarters have been fantastic from a growth perspective, and we continue to see that as the outlook moving forward.

Q: Let me first address the power issue with Varun. What problem is Emerald AI trying to solve today?

Sivaram: I think 2026 is the year the power bottleneck really bites for AI. In America, we’re trying to build 50 gigawatts worth of data centers in just the next three years, but only 25 can get plugged in. The same is true in India. Everywhere in the world, except for China—China will have 400 gigawatts of spare capacity by 2030 for AI. For countries like the US, India, or the UK, to be competitive and advance AI innovation, they need power. They have chips; they need power in 2026, and that’s what we are solving at Emerald AI.

Q: How confident are you about solving that problem, and what are you doing specifically to address it?

Sivaram: Emerald AI, as you mentioned, is an Nvidia-backed startup. Over the last 18 months, we have gone from lab bench to full-scale commercial deployment with Nvidia. We’ve announced the world’s first power-flexible AI factory in Virginia. Emerald AI is a software company that makes AI data centers power flexible. That means we can take an AI factory or data center and change the amount of power it uses on the fly. Today’s power grid is largely underused; only a few times a year does it reach peak load. During those moments, Emerald AI running on a data center—an NVIDIA data center, for example, or Oracle, one of our partners—can reduce the consumption of that data center so it can fit on the grid and get connected in six months or a year instead of 10 years. That means we can build far more data centers in the West or in India. And on top of that, we don’t raise prices for neighbouring communities. Donald Trump, our President, has been clear: we can’t have data centers increasing bills for everyone. Using Emerald AI, you can be power flexible as a data center. That means bills don’t go up as we better utilise the energy network, while taking advantage of 100 gigawatts of unused spare capacity on the power system today.
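The mechanism Sivaram describes, curtailing a data centre's draw when the surrounding grid nears peak load so it can connect without new capacity, can be illustrated with a minimal sketch. This is a hypothetical toy controller written for this article; the function name, parameters, and logic are assumptions for illustration and do not reflect Emerald AI's actual software.

```python
# Hypothetical sketch of power-flexible data-centre control. None of these
# names come from Emerald AI; they only illustrate capping a data centre's
# draw when the surrounding grid approaches its peak load.

def allowed_power_mw(grid_load_mw: float, grid_capacity_mw: float,
                     dc_max_mw: float, dc_min_mw: float) -> float:
    """Return how much power the data centre may draw right now.

    The headroom left on the grid bounds the data centre's draw; the
    centre never drops below dc_min_mw (critical load) and never
    exceeds dc_max_mw (its full demand).
    """
    headroom = max(grid_capacity_mw - grid_load_mw, 0.0)
    return min(dc_max_mw, max(dc_min_mw, headroom))

# Off-peak: plenty of headroom, so the data centre runs flat out.
assert allowed_power_mw(700, 1000, 100, 25) == 100
# Near peak: only 60 MW of headroom, so flexible workloads are curtailed.
assert allowed_power_mw(940, 1000, 100, 25) == 60
# At peak: the centre falls back to its critical minimum load.
assert allowed_power_mw(1000, 1000, 100, 25) == 25
```

Because the grid reaches peak load only a few hours a year, a centre that can shed load in those hours can be treated as fitting within existing headroom the rest of the time, which is the spare capacity Sivaram refers to.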

Q: Chakri, let me come to you. It’s a VUCA world on steroids, and supply chain resilience is being tested globally. What are clients asking of you today?

Gottemukkala: There was a study published recently that said 2025 was the most VUCA year yet, and 2026 started off that way as well. At o9, we help companies make decisions in a volatile, complex, uncertain, and ambiguous world. Volatility and complexity mean everyone faces many more situations daily, making decision-making more difficult. Regarding AI, we’ve been using different forms of AI for driving enterprise decision-making. The big trend I see is that while LLMs have been tried, there has been limited success in enterprise decision-making because you have to combine the accessibility LLMs provide with the structure and knowledge of the enterprise. The future I see is what we’re calling neuro-symbolic AI, where you combine the neural nets of LLMs with the symbolic representation of enterprise knowledge to drive decision-making. This brings access to knowledge from executives to the front lines. People can access models more easily rather than relying only on planners and analysts.
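The neuro-symbolic pattern Gottemukkala describes, a neural layer for accessibility paired with a symbolic layer encoding enterprise knowledge, can be sketched in miniature. Everything below is a hypothetical illustration: the rule names, the stubbed "neural" parser standing in for an LLM call, and the check logic are all assumptions, not o9 Solutions' implementation.

```python
# Hypothetical illustration of a neuro-symbolic split: a "neural" layer
# (a stub standing in for an LLM) turns a question into a structured
# intent, and a symbolic layer checks that intent against explicit
# enterprise rules before anything acts on it.

RULES = {
    "max_safety_stock_units": 5000,        # symbolic enterprise knowledge
    "approved_products": {"SKU-1", "SKU-2"},
}

def neural_parse(question: str) -> dict:
    # Stand-in for an LLM call that extracts a structured intent.
    # A real system would prompt a model; here the output is hard-coded.
    return {"action": "set_safety_stock", "product": "SKU-1", "units": 4000}

def symbolic_check(intent: dict) -> bool:
    # Deterministic rule evaluation over the structured intent.
    return (intent["product"] in RULES["approved_products"]
            and intent["units"] <= RULES["max_safety_stock_units"])

intent = neural_parse("Raise safety stock for SKU-1 ahead of the holidays")
assert symbolic_check(intent)  # within policy, so the action may proceed
```

The point of the split is that a frontline user interacts in natural language, while the decision itself is gated by deterministic rules the enterprise can audit.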

Q: Jonathan, let me come to you regarding Jeetu’s point about trust, safety, and security. That continues to be an issue. Are companies giving it the attention it deserves?

Zanger: I think it requires serious attention. In recent years, we’ve seen a lot of investment and creativity going toward developing AI applications. Much of the world’s innovation is focused on improving models and applications and developing infrastructure. On the flip side, many solutions were not developed with security in mind. When we look at AI applications or infrastructure, we often examine them from an attacker’s perspective and see gaps that threat actors can exploit. In 2026, we definitely need to double down on investment to ensure AI is adopted securely.

Q: Are CEOs putting money in that direction currently?

Zanger: Yes, we see increasing budgets, especially around AI security, like never before. Two years ago, this trend wasn’t as strong. Now it’s expanding. It’s a top priority for boards and CEOs.

Q: Top priority for CEOs: trust, safety, and security. Jeetu, as Chief Product Officer, how are you building that in?

Patel: Initially, AI was used for defences because attackers are getting sophisticated, so defences have to be at machine scale. Now, you can’t just use AI for defence; you have to secure AI itself. These models can behave unpredictably, but enterprise applications need to be deterministic. We need effective guardrails and validation. We’ve invested not only in AI for cyber defence but also in securing AI itself. The second area is architecture, which changes due to power constraints. Previously, a single GPU ran a model. Then clusters of GPUs in servers and data centers were needed. Now, multiple data centers can be connected across long distances to act as one virtual ultra-cluster. That requires a different set of silicon and chips, which we specialise in. This scale across data centers allows models to act as one virtual cluster, necessary as models grow larger and scaling laws persist.

Q: How scalable is this, and how much is already in use?

Patel: Hyper-scalers are now building clusters of hundreds of thousands of GPUs across different locations. Power is often unavailable locally, so data centers are built where power exists. If power is available across multiple locations, they are networked to operate as one virtual GPU. There’s also innovation on the power side, as Varun mentioned. All these factors together ready infrastructure for AI.

Watch the accompanying video for the full discussion.