Generative AI in Focus: Consumer Behaviour and Operator Views Survey Dashboard 2024

This report is available to those subscribed to the Digital Consumer module.
Against the current backdrop of growing interest and activity in generative AI, relevant insights are crucial for formulating strategies and plans. This dashboard comprises two main areas.
Building on our consumer survey in eight major countries worldwide, the first part of the dashboard provides the relevant data on consumer behaviour around generative AI in a consistent and structured way. We analyse consumers' awareness, usage and experience of generative AI, as well as their views on the areas that will benefit the most from it. A set of filters enables comparison of data for individual countries and for specific consumer segments (e.g. by age, network used or frequency of engagement in gaming).
The second part of the dashboard focuses on operators. Our survey of 100 operators worldwide reveals important insights into what operators think about the generative AI opportunity and how far they have progressed with their generative AI strategies. A set of filters enables comparison of data for individual regions and for specific operator segments (e.g. by type of operator, revenue or number of subscribers).
Related research
AI inference in practice: new intelligence from the hospital floor
Enterprises need to understand that where inferencing happens – whether at the user edge (device and enterprise/on-premises) or network edge (far edge, near edge and telco private cloud) – will have major implications for application performance, data sovereignty, resilience and energy efficiency. This analysis focuses on how running AI workloads on the enterprise edge (on-premises) can deliver improved outcomes.
MWC Shanghai 2025: a window into the future?
MWC Shanghai is in the books for another year, having attracted 45,000 visitors (from 12,500 companies), along with 400 exhibitors and partner groups. The numbers were up from the 2024 event by 13% and 92% respectively. This analysis highlights the key takeaways and implications from meetings, summits and announcements at the event – and in particular, whether the progress seen in China can be mapped to other regions.
AI inference in practice: choosing the right edge
As AI adoption grows, inferencing will accelerate, raising questions about workload processing and business benefits. This analysis examines how running AI workloads on the edge can deliver improved outcomes.
How to access this report
Annual subscription: Subscribe to our research modules for comprehensive access to more than 200 reports per year.
Contact our research team
Get in touch with us to find out more about our research topics and analysis.
Media
To cite our research, please see our citation policy in our Terms of Use, or contact our Media team for more information.
- 200 reports a year
- 50 million data points
- Over 350 metrics