What advancements are shaping the future of AI, from the edge to the chip level? As the AI landscape rapidly evolves, the focus is shifting from centralized data processing to on-device intelligence, enabling more agile, secure, and instantaneous decision-making. The recent Embedded World 2024 event highlighted this shift, showcasing breakthroughs in edge AI and specialized AI chipsets that are transforming embedded systems and IoT solutions.
Edge AI, which enables AI processing directly on devices rather than through cloud-based systems, dominated discussions at Embedded World. This trend is reshaping industries by reducing latency, enhancing privacy, and lowering costs associated with cloud dependencies. Edge AI allows devices to make intelligent decisions independently, laying the groundwork for more advanced autonomous systems across sectors like healthcare, automotive, and manufacturing.
NVIDIA continues to lead in edge AI, extending its influence beyond gaming and data centers. Its powerful GPUs, essential for complex AI workloads, are increasingly deployed in industrial and autonomous settings. At Embedded World, NVIDIA’s hardware powered a range of AI-enabled edge solutions. Taiwan-based Aetina, for example, presented an industrial edge solution built on NVIDIA's Jetson AGX Orin module, which delivers up to 275 trillion operations per second (TOPS). NVIDIA also showcased a non-destructive fault detection method using ultrasonic testing, further advancing edge AI capabilities.
Evaluating device compatibility and performance without physical hardware is a significant challenge in AI deployment. New AI development platforms address this by letting developers simulate how a model will perform on specific chipsets. At Embedded World, Advantech showcased its EdgeAI SDK, which enables testing across AI chipsets from Intel, Qualcomm, NVIDIA, and others. Such solutions lower the barriers to AI adoption and simplify on-device integration.
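To make the idea concrete, here is a rough sketch, not Advantech's EdgeAI SDK, of the kind of per-chipset evaluation such platforms automate: the open-source ONNX Runtime times one model on every execution provider available on the host. The model file and input shape are placeholder assumptions.

```python
# A rough sketch (not Advantech's EdgeAI SDK): time the same ONNX model on each
# execution provider (CPU, GPU, NPU back ends, etc.) available on this machine.
import time
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"          # hypothetical model file
INPUT_SHAPE = (1, 3, 224, 224)     # hypothetical input shape for that model

def benchmark(provider: str, runs: int = 50) -> float:
    """Return the average inference latency in milliseconds for one provider."""
    session = ort.InferenceSession(MODEL_PATH, providers=[provider])
    input_name = session.get_inputs()[0].name
    x = np.random.rand(*INPUT_SHAPE).astype(np.float32)
    session.run(None, {input_name: x})                 # warm-up run
    start = time.perf_counter()
    for _ in range(runs):
        session.run(None, {input_name: x})
    return (time.perf_counter() - start) / runs * 1000

for provider in ort.get_available_providers():
    print(f"{provider}: {benchmark(provider):.2f} ms")
```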
As computing power at the edge increases, AI model training is shifting from the cloud to edge data centers or 'thick edge' servers. Companies like MAINGEAR and Phison have introduced high-performance workstations capable of on-premises AI training, reducing cloud dependency and enhancing data privacy. At Embedded World, Aetina launched its AIP-FR68 Edge AI Training platform, which combines multiple NVIDIA GPUs to deliver enough local computing power for AI training without relying on the cloud.
Integrating neural processing units (NPUs) into edge devices boosts AI inference capabilities, improving energy efficiency, multitasking, and thermal management. These advances are driving edge AI applications in wearables and sensor nodes. NXP, for example, presented its MCX N Series MCUs, which deliver ML inference up to 42 times faster than CPU-only execution. ARM’s Cortex-A55 paired with the Ethos-U65 NPU demonstrated roughly 70% of the CPU’s inference workload being offloaded to the NPU, improving efficiency for AI-enabled devices.
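As a rough illustration of how an application hands inference work to an NPU, the sketch below loads a quantized TensorFlow Lite model through a hardware delegate. The delegate library name and model file are hypothetical stand-ins; this is the general pattern, not NXP's or ARM's specific tooling.

```python
# A minimal sketch of delegating inference to an NPU via TensorFlow Lite.
# Ops the delegate supports run on the NPU; the rest fall back to the CPU.
import numpy as np
import tflite_runtime.interpreter as tflite

npu_delegate = tflite.load_delegate("libnpu_delegate.so")    # hypothetical .so name

interpreter = tflite.Interpreter(
    model_path="model_int8.tflite",                          # hypothetical quantized model
    experimental_delegates=[npu_delegate],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input that matches the model's expected shape and dtype.
x = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```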
Embedded cellular connectivity has enhanced the capabilities of AI-powered IoT devices, enabling localized decision-making. This development has significant implications for smart cities, factories, and other industries that require real-time processing with low latency. At Embedded World, Fibocom demonstrated a Qualcomm-powered smart mower capable of obstacle detection and autonomous operation without cloud reliance. Thundercomm’s EB3G2 IoT edge gateway also performed real-time human detection, useful for applications like security and traffic management.
Tiny AI embeds compact AI and ML models in resource-constrained devices, allowing them to make decisions independently of the cloud. This trend enhances privacy and reduces network load. MY VOICE AI showcased its NANOVOICE speaker verification solution for secure, low-power devices. SensiML’s smart drill prototype used tiny ML to classify fastening states in real time, and Nordic Semiconductor’s Thingy:53 IoT prototyping device performed anomaly detection using a tiny ML model, highlighting the potential of micro-edge AI.
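The sketch below gives a flavor of the kind of on-device anomaly detection the Thingy:53 demo performed, assuming windowed accelerometer readings and cheap statistical features. The features, thresholds, and synthetic data are illustrative, not Nordic Semiconductor's or SensiML's actual models.

```python
# A minimal sketch of tiny-ML-style anomaly detection: learn simple statistics
# of "normal" sensor windows, then flag windows whose features drift too far.
import numpy as np

def features(window: np.ndarray) -> np.ndarray:
    """Cheap per-window features suitable for a microcontroller: mean, std, peak."""
    return np.array([window.mean(), window.std(), np.abs(window).max()])

# "Training": compute feature statistics from windows of normal accelerometer data.
rng = np.random.default_rng(0)
normal_windows = rng.normal(0.0, 1.0, size=(200, 128))       # stand-in for recorded data
feats = np.stack([features(w) for w in normal_windows])
mu, sigma = feats.mean(axis=0), feats.std(axis=0) + 1e-8

def is_anomalous(window: np.ndarray, threshold: float = 4.0) -> bool:
    """Flag a window if any feature deviates more than `threshold` sigmas from normal."""
    z = np.abs((features(window) - mu) / sigma)
    return bool(z.max() > threshold)

print(is_anomalous(rng.normal(0.0, 1.0, 128)))   # normal vibration -> expected False
print(is_anomalous(rng.normal(0.0, 5.0, 128)))   # abnormal vibration -> expected True
```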
Embedded World 2024 showed edge AI advancing rapidly across micro, thin, and thick edge applications. By moving intelligence closer to data sources, edge AI is transforming industries with faster response times, greater data privacy, and reduced reliance on large cloud providers. This shift opens up new possibilities across the healthcare, automotive, and industrial sectors, where edge AI applications are set to redefine traditional operational models.
In the coming years, advances in edge AI will make IoT systems more efficient, enabling smart devices to make faster and more accurate decisions. Improvements in edge AI and AI chips will continue to drive automation, reduce costs, and reshape embedded computing. As development continues, edge AI’s role in IoT and AI ecosystems will undoubtedly expand.