AI inference in practice: choosing the right edge

Inferencing is the real-time decision-making stage of an AI model. As AI adoption grows, inferencing will accelerate, raising questions about where workloads are processed and how they translate into business benefits. As outlined in "Distributed inference: how AI can turbocharge the edge", enterprises need to understand that where inferencing happens – whether at the user edge (device and enterprise) or the network edge (far edge, near edge and telco private cloud) – will have major implications for application performance, data sovereignty, resilience and energy efficiency.
This research forms part of a series illustrating the impact of AI inference, with each report focusing on a distinct edge location and featuring an example company. This analysis examines how running AI workloads at the edge can deliver improved outcomes, with Aible as the featured company.
Related research
AI inference in practice: time is money
As AI adoption grows, inferencing will accelerate, raising the question of where workloads will be processed and how they translate into business benefits. This analysis examines AI on the near edge in distributed telco data centres, with Kinetica highlighted as an example.
Distributed inference: how AI can turbocharge the edge
For several years, edge compute has been a key part of the 5G value proposition for telecoms operators selling into industries – even before AI. However, AI adds a new dimension to the value of edge through distributed inference.