Edge Impulse (a Qualcomm company) frames its platform as a model-to-firmware pipeline for edge AI: capture sensor or camera data, label it, train a compact model, then ship an optimized artifact that can run without a cloud round trip. The demos emphasize quantization, runtime portability, and repeatable edge MLOps where latency, privacy, and uptime matter for real work. https://edgeimpulse.com/
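To make the pipeline concrete, here is a minimal sketch of the deployment end on a Linux device, assuming the edge_impulse_linux Python SDK and a downloaded .eim object-detection model; the model filename and camera index are placeholders:

```python
# Minimal sketch of the deploy end: run a downloaded Edge Impulse .eim model
# locally with no cloud round trip. Assumes a Linux target with the
# edge_impulse_linux Python SDK; "modelfile.eim" and camera index 0 are placeholders.
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

with ImageImpulseRunner("modelfile.eim") as runner:
    model_info = runner.init()  # loads the impulse; returns project/label metadata
    print("Loaded:", model_info["project"]["name"])

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        # The SDK crops/resizes the frame to the impulse's expected input
        features, cropped = runner.get_features_from_image(
            cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        )
        res = runner.classify(features)
        for bb in res["result"].get("bounding_boxes", []):
            print(f"{bb['label']} {bb['value']:.2f} at ({bb['x']},{bb['y']})")
    cap.release()
```

The single .eim artifact bundles the compiled, typically quantized model, which is what makes the "no cloud round trip" part of the pitch work.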
One highlight is an XR industrial worker assistant running on TCL RayNeo X3 Pro glasses built on Snapdragon AR1, with a dual micro-display overlay and a forward camera. Edge Impulse trains a YOLO-class detector (their “YOLO Pro” variant) to identify specialized parts; a local Llama 3.2 flow then pulls the matching documentation and generates step-by-step guidance for the field crew: part numbers, installation notes, and what each part is for.
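The demo's exact retrieval flow isn't published, but the detect-then-explain pattern could look like this sketch, where the detected label keys into a local documentation store and a quantized Llama 3.2 runs via llama-cpp-python; PART_DOCS, the model filename, and the prompt wording are illustrative assumptions:

```python
# Hedged sketch of the detect-then-explain flow: a detected part label keys
# into a local documentation store, and a quantized Llama 3.2 (here via
# llama-cpp-python) turns it into worker guidance. PART_DOCS, the model
# filename, and the prompt wording are illustrative assumptions.
from llama_cpp import Llama

PART_DOCS = {  # stand-in for the demo's real documentation index
    "hydraulic_valve_a7": "Part #HV-A7. Torque to 25 Nm. Regulates boom pressure.",
}

llm = Llama(model_path="llama-3.2-3b-instruct-q4.gguf", n_ctx=2048)

def guide_for(detected_label: str) -> str:
    doc = PART_DOCS.get(detected_label, "No documentation on file.")
    prompt = (
        "You are a field-crew assistant. Using only this documentation, give "
        f"numbered install steps and the part's purpose.\n\nDoc: {doc}\n\nSteps:"
    )
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"].strip()

print(guide_for("hydraulic_valve_a7"))
```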
The workflow focus is data: capture images directly from the wearable, annotate in Studio, and iterate via active learning where an early model helps pre-label the next batch. They also point to connectors that let foundation models assist labeling, plus data augmentation and synthetic data generation to widen coverage. This segment was filmed at the Qualcomm booth during CES Las Vegas 2026, but the core story is a repeatable edge pipeline, not a one-off demo.
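A pre-labeling triage loop of the kind described could be as simple as this sketch; detect() and the two thresholds are assumptions for illustration, not Edge Impulse APIs:

```python
# Hedged sketch of the active-learning loop: the current model pre-labels the
# next capture batch and only uncertain images go to a human. detect() stands
# in for whatever inference call the pipeline uses; thresholds are tunable.
from typing import Callable

ACCEPT = 0.80  # auto-accept draft labels above this confidence
REVIEW = 0.30  # below this on every box, leave the image unlabeled

def triage(images: list, detect: Callable[[object], list[tuple[str, float]]]):
    drafts, review_queue = [], []
    for img in images:
        dets = detect(img)  # [(label, confidence), ...]
        if dets and all(c >= ACCEPT for _, c in dets):
            drafts.append((img, dets))        # upload as pre-labels
        elif any(c >= REVIEW for _, c in dets):
            review_queue.append((img, dets))  # human confirms or corrects
        # else: keep for the next training round as raw, unlabeled data
    return drafts, review_queue
```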
A second showcase moves to the factory line: vision-based defect detection on Qualcomm Dragonwing IQ9, positioned for on-device AI at up to 100 TOPS. The UI runs with Qt, while the model flags defective coffee pods in real time and an on-device Llama 3.2 3B interface answers queries like defect summaries or safety prompts, all offline on the same device.
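One plausible shape for the offline Q&A side, sketched below: detections become structured events, and an operator query is answered by feeding the aggregate into the same local-LLM pattern shown above. Field names and schema are illustrative, not the demo's actual code:

```python
# Hedged sketch of the offline Q&A side: detections become structured events,
# and an operator query is answered by feeding the aggregate to the on-device
# Llama 3.2 3B (same local-LLM pattern as above). Field names are illustrative.
from collections import Counter
from dataclasses import dataclass

@dataclass
class DefectEvent:
    timestamp: float
    label: str        # e.g. "torn_lid", "underfilled"
    confidence: float

def summarize(events: list[DefectEvent]) -> str:
    counts = Counter(e.label for e in events)
    lines = [f"- {label}: {n} pods" for label, n in counts.most_common()]
    return "Defects this shift:\n" + "\n".join(lines)

def build_prompt(events: list[DefectEvent], question: str) -> str:
    # This string goes to the local LLM; no data leaves the device.
    return f"{summarize(events)}\n\nOperator question: {question}\nAnswer:"
```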
They round it out with PPE and person detection on an industrial gateway, plus Arduino collaborations: the UNO Q hybrid board (Dragonwing QRB2210 MPU + STM32U585 MCU) using USB-C hubs for peripherals, wake-word keyword spotting, and App Lab flows to deploy Edge Impulse models. There’s also a cascaded pattern where a small on-device detector triggers a cloud VLM only when extra scene context is needed, a practical tradeoff for cost and scale.
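The cascade pays off because the expensive call happens only on the rare ambiguous frame; a minimal gating sketch, with ask_cloud_vlm() as a placeholder for any hosted VLM API:

```python
# Hedged sketch of the cascade: a cheap on-device detector handles every frame,
# and a cloud VLM is called only when local results are missing or low-confidence.
# ask_cloud_vlm() is a placeholder for any hosted VLM API; the threshold is the
# cost/latency knob.
ESCALATE_BELOW = 0.55

def handle_frame(frame, detect, ask_cloud_vlm):
    dets = detect(frame)  # on-device, every frame, effectively free
    if dets and all(conf >= ESCALATE_BELOW for _, conf in dets):
        return {"source": "edge", "detections": dets}
    # Rare path: one paid cloud call for richer scene context
    return {"source": "cloud", "description": ask_cloud_vlm(frame)}
```

Raising the threshold trades cloud cost for more context on ambiguous scenes; lowering it keeps almost everything on the edge.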
Edge Impulse XR + IQ9 edge AI: YOLO-Pro, Llama 3.2, AR1 smart glasses, defect detection
Edge Impulse on-device GenAI workflows: Hexagon NPU, QNN, 8-bit quant, Arduino UNO Q
I’m publishing 100+ videos from CES 2026, uploading about 4 per day at 5AM/11AM/5PM/11PM CET/EST. Check out all my CES 2026 videos in my playlist here: https://www.youtube.com/playlist?list=PL7xXqJFxvYvjaMwKMgLb6ja_yZuano19e
This video was filmed using the DJI Pocket 3 ($669 at https://amzn.to/4aMpKIC) with the dual wireless DJI Mic 2 and the DJI lapel microphone (https://amzn.to/3XIj3l8). Watch all my DJI Pocket 3 videos here: https://www.youtube.com/playlist?list=PL7xXqJFxvYvhDlWIAxm_pR9dp7ArSkhKK
Click the “Super Thanks” button below the video to send a highlighted comment! Brands I film are welcome to support my work this way 😁
Check out my video with Daylight Computer about their revolutionary Sunlight Readable Transflective LCD Display for Healthy Learning: https://www.youtube.com/watch?v=U98RuxkFDYY