Tech Explained: No network, no AI evolution — a simplified explanation of the latest connectivity debate and what it means for users.
Infrastructure can look “good enough” right up until the moment it isn’t.
Connectivity networks were designed for the way we have used digital services over the past 20 years. We downloaded files, browsed the web and streamed video. All of it is downlink-heavy usage.
AI flips that logic: it needs to be fed. When was the last time you checked your home WiFi’s upload speed? 100 Mbps down with 10 Mbps up is average today, but that asymmetry won’t work for the AI evolution.
The flow of data becomes more bidirectional and distributed, with far less tolerance for delay, jitter, packet loss and downtime. As consumers, we get frustrated with buffering; AI won’t tolerate it at all.
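To make two of those terms concrete, here is a minimal Python sketch, using hypothetical ping samples, that computes a simple jitter estimate (the variation between consecutive latency readings) and a packet-loss percentage. This is an illustration of the metrics, not any particular vendor’s measurement method:

```python
def jitter_ms(samples):
    """Mean absolute difference between consecutive latency samples (ms).

    A simple jitter estimate: steady latency gives a value near zero,
    while wildly varying latency gives a large value.
    """
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return sum(diffs) / len(diffs)


def packet_loss_pct(sent, received):
    """Percentage of packets that never arrived."""
    return 100.0 * (sent - received) / sent


# Hypothetical round-trip samples in milliseconds
samples = [32.0, 35.0, 31.0, 40.0, 33.0]
print(jitter_ms(samples))          # → 5.75  (mean of |3|, |4|, |9|, |7|)
print(packet_loss_pct(100, 98))    # → 2.0
```

A video stream can buffer its way past a few seconds of jitter; an AI system making a decision right now cannot, which is why these metrics matter more than raw download speed.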
AI is changing the geometry of the internet, and when the geometry changes, the network design must change with it.
Not just a faster network, but a new type of network
Nokia’s research makes clear that the current standards aren’t keeping up.
Three out of four decision-makers expect to need sub-30ms latency for AI within the next 2–3 years, including 13% targeting sub-10ms. That is a new consideration for network design and delivery.
Recent analysis from Ookla suggests U.S. networks have work to do: not a single state has latency below 30ms. Hawaii sits at the high end at 108ms, D.C. at the low end at 37ms, and only 15 states recorded readings below 50ms.
Latency is the time a network takes to respond. Generally speaking, lower is better: as the delay grows, it starts to feel like a bad video call where people talk over each other.
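As an illustration of what is actually being measured, round-trip latency is simply the time between sending a message and receiving the reply. This self-contained Python sketch times a round trip to a local echo server, which stands in for a remote endpoint (real-world numbers would of course be far higher than loopback):

```python
import socket
import threading
import time


def echo_once(server_sock):
    """Accept one connection and echo back whatever it sends."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(64))


# Local echo server on an ephemeral port (stand-in for a remote endpoint)
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=echo_once, args=(srv,), daemon=True).start()

cli = socket.socket()
cli.connect(srv.getsockname())

start = time.perf_counter()
cli.sendall(b"ping")
cli.recv(64)                                   # block until the echo returns
rtt_ms = (time.perf_counter() - start) * 1000  # round-trip time in ms

print(f"round-trip latency: {rtt_ms:.3f} ms")
cli.close()
srv.close()
```

Tools like ping report essentially this number; the sub-30ms targets cited above refer to the same round-trip measurement, just across a real network path.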
We haven’t had to worry much about latency to date because real-time responsiveness hasn’t been critical; we tolerate a small amount of buffering when streaming, for example.
For AI systems making decisions in the moment, this is not acceptable.
Fraud detection and payment authorization, autonomous transport safety systems, keeping the lights on when the grid is under stress: these are examples where low latency becomes mission-critical.
That’s why connectivity keeps surfacing as the real choke point. In Nokia’s research, 58% of decision-makers say the network is the single biggest barrier to scaling AI — not the models, but the ability to access them everywhere, in real time.
