On-Device Generative AI Expected to Drive Heterogeneous AI Chipset Shipments to Over 1.8 Billion by 2030
NEW YORK, Feb. 21, 2024 /PRNewswire/ -- Generative Artificial Intelligence (AI) workloads have moved beyond the bounds of cloud environments and can now run on-device, supported by heterogeneous AI chipsets. Combined with an abstraction layer that can efficiently distribute AI workloads between processing architectures and compressed LLMs with under 15 billion parameters, these chipsets can enable enterprises and consumers to run generative AI inferencing locally. Consequently, ABI Research, a global technology intelligence firm, estimates worldwide shipments of heterogeneous AI chipsets will exceed 1.8 billion by 2030 as laptops, smartphones, and other form factors increasingly ship with on-device AI capabilities.
- "Cloud deployment will act as a bottleneck for generative AI to scale due to data privacy, latency, and networking cost concerns."
- "What's new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between CPU, GPU, and NPU."
- Productivity AI applications running on-device, powered by heterogeneous AI chipsets, will drive significant market growth across personal and work devices.
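To make the workload-distribution idea above concrete, the following is a minimal, hypothetical sketch of how an abstraction layer might route parts of a generative AI inference pipeline to the CPU, GPU, or NPU of a heterogeneous chipset. All names and routing rules here are invented for illustration; no vendor SDK or real scheduler is implied.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """A unit of AI work to be placed on a processor (illustrative only)."""
    name: str
    parallelism: str        # "high" for matrix-heavy tensor ops, else "low"
    latency_sensitive: bool

def route(workload: Workload) -> str:
    """Pick a target processor, mimicking how an abstraction layer
    might split inference across a heterogeneous chipset."""
    if workload.parallelism == "high" and not workload.latency_sensitive:
        return "GPU"        # throughput-oriented batched matrix math
    if workload.parallelism == "high":
        return "NPU"        # low-power, low-latency tensor execution
    return "CPU"            # control flow, tokenization, scalar work

# Example placements for stages of an on-device LLM pipeline:
print(route(Workload("attention", "high", latency_sensitive=True)))       # NPU
print(route(Workload("batch-prefill", "high", latency_sensitive=False)))  # GPU
print(route(Workload("tokenizer", "low", latency_sensitive=True)))        # CPU
```

In a real system the dispatch decision would also weigh memory bandwidth, power budget, and which operators each accelerator supports, but the principle is the same: one scheduling layer hides the CPU/GPU/NPU split from the application.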