Cisco’s Nvidia-powered AI Grid could be a turning point for telcos

  • Cisco has revealed a new AI Grid reference architecture with Nvidia for edge inferencing on telecom networks
  • Comcast is using AI Grid to roll out three trial use cases
  • AI Grid could help telcos tap into a huge new revenue opportunity

All eyes are on Nvidia this week and the announcements flowing out of its GTC event in San Jose. But it’s actually an announcement tucked away in an Nvidia-related Cisco press release that holds the most promise for telcos: AI Grid.

While the word “grid” calls to mind familiar electrical infrastructure, Cisco and Nvidia appear determined to build a grid of intelligence using existing telco networks.

In a nutshell, AI Grid is a reference design that combines the power of Cisco's Mobility Services Platform with Nvidia RTX PRO Blackwell Series GPUs. The idea is to push AI processing deeper into network infrastructure and closer to customers, creating a sort of network of intelligence.

As we’ve argued before, telcos happen to have the perfect locations for edge compute installations. They also have troves of customer data and relationships that they can parlay into meaningful AI services. This is exactly why AI Grid has potent promise.

“Think of AI Grid as a way of taking AI capabilities and applying them or enhancing existing services,” Kevin Wollenweber, Cisco SVP and GM of Data Center and Internet Infrastructure, told Fierce. New services, of course, promise to improve the user experience while also giving telcos a way to boost ARPU by charging for these new services. 

Comcast takes AI Grid to the streets

Comcast will be among the first operators to leverage AI Grid, announcing plans to roll out nationwide field trials of three use cases focused on advertising, gaming and an enterprise concierge.

“By bringing NVIDIA GPUs directly into our edge cloud, we can explore what becomes possible when AI inference happens only milliseconds from end users,” Comcast Chief Network Officer Elad Nafshi said in a statement.

The advertising use case with Decart will seek to deliver customized video advertisements at the household level, while the gaming application will seek to deliver ultra-low latency for online gaming. But it is the third use case – an enterprise concierge agent being launched in collaboration with startup Personal AI – which is perhaps the most compelling.

AI revenue opportunity

Suman Kanuganti, CEO at Personal AI, told Fierce that it will essentially be injecting AI into the call path as part of the deployment. And if that sounds familiar, yes, that’s exactly what folks like Alianza (via its Intelligent Communications Fabric) and T-Mobile (with its Live Translation service) have talked about doing.

Kanuganti noted that while there are only around 1 billion Google searches per day, there are 3 billion phone calls in the same timeframe. That’s a massive opportunity for telcos, which now have the ability to “offer this utility directly on their existing phone lines” to turn them into monetizable “AI lines.”

For consumers, Kanuganti said this service could end up delivering operators around a $10 bump in ARPU per subscriber. For businesses, the bump could be anywhere in the $30-60 range, he said. Multiply that by thousands or millions of lines and you can see why this kind of use case might be appealing for operators on the hunt for new revenue generation. 
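The back-of-the-envelope math here is simple but worth making concrete. The sketch below plugs in the figures Kanuganti cited (roughly $10/month per consumer line, $30–60/month per business line); the subscriber counts are hypothetical, chosen purely for illustration.

```python
# Rough revenue-uplift estimate for "AI lines", using the per-line
# figures cited in the article. Subscriber counts are hypothetical.

def monthly_uplift(consumer_lines: int, business_lines: int,
                   consumer_arpu: float = 10.0,
                   business_arpu_low: float = 30.0,
                   business_arpu_high: float = 60.0) -> tuple[float, float]:
    """Return (low, high) estimates of added monthly revenue in dollars."""
    base = consumer_lines * consumer_arpu
    return (base + business_lines * business_arpu_low,
            base + business_lines * business_arpu_high)

# Example: a mid-size operator with 1M consumer lines and 100k business lines.
low, high = monthly_uplift(1_000_000, 100_000)
print(f"${low / 1e6:.0f}M - ${high / 1e6:.0f}M per month")  # $13M - $16M per month
```

Even at these illustrative volumes, the service would add eight figures of monthly revenue, which helps explain the operator interest.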

In the enterprise concierge use case Comcast is deploying, the AI line will essentially become another staffer that can handle front desk tasks – answering calls with customized greetings, managing appointments and answering questions.

Kanuganti added that while LLMs tend to rely on large context windows to deliver results – in the realm of 32,000-plus tokens – Personal AI uses models that have contextual memory embedded within them. So, the size of its personalized AI models is only around 1 GB.

When you think about the fact that operators connect more than just phones – hello, underappreciated IoT devices – the opportunity around AI Grid gets really interesting. The biggest question now is whether operators can execute.