AI Inference Calculator

To help understand the business value of deploying AI at the edge, GSMA Intelligence has worked on a research series this year in collaboration with our partners Dell and NVIDIA. This involved a major report and three follow-up deep dives exploring individual use cases. However, operators need to be able to simulate the impacts in practice, not just in theory.
To that end, we have developed a plug-and-play calculator that operators and enterprises can populate with their own data to explore the downstream implications of running AI inference at the edge, including total cost of ownership. The calculator is free to use and includes a selection of parameters that you can customise to your own deployment plans. Over the remainder of 2025 and into 2026, we plan to add further AI inference use cases to flesh out what these look like for interested telcos and their partners.
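To illustrate the kind of comparison such a calculator supports, the sketch below amortises upfront and recurring costs into a total cost of ownership and a cost per inference. All function names, parameters and figures here are illustrative assumptions for the sketch, not values or formulas from the GSMA Intelligence tool itself.

```python
# Hypothetical TCO sketch: all names and numbers are illustrative
# assumptions, not taken from the GSMA Intelligence calculator.

def total_cost_of_ownership(capex, opex_per_year, years):
    """Upfront hardware cost plus recurring operating cost over the period."""
    return capex + opex_per_year * years

def cost_per_inference(tco, inferences_per_second, years, utilisation=0.7):
    """Amortise TCO across the inferences served during the period."""
    seconds = years * 365 * 24 * 3600
    total_inferences = inferences_per_second * utilisation * seconds
    return tco / total_inferences

# Made-up comparison of an on-premises edge deployment vs. a
# centralised cloud service over a five-year horizon.
edge_tco = total_cost_of_ownership(capex=250_000, opex_per_year=40_000, years=5)
cloud_tco = total_cost_of_ownership(capex=0, opex_per_year=120_000, years=5)

print(f"Edge 5-year TCO:  ${edge_tco:,.0f}")
print(f"Cloud 5-year TCO: ${cloud_tco:,.0f}")
```

The real calculator exposes more parameters than this (per use case), but the basic trade-off it explores is the same: higher edge capex against lower recurring costs, evaluated over a deployment horizon of your choosing.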
Please get in touch with any comments – feedback helps us refine the calculator so that it is as useful as possible.
Related research
AI inference in practice: time is money
As AI adoption grows, inferencing will accelerate, raising the question of where workloads will be processed and how they translate into business benefits. This analysis examines AI on the near edge in distributed telco data centres, with Kinetica highlighted as an example.
Distributed inference: how AI can turbocharge the edge
For several years, edge compute has been a key part of the 5G value proposition for telecoms operators selling into industries – even before AI. However, AI adds a new dimension to the value of edge through distributed inference.
AI inference in practice: choosing the right edge
As AI adoption grows, inferencing will accelerate, raising questions about workload processing and business benefits. This analysis examines how running AI workloads on the edge can deliver improved outcomes.