TeleCANesis shows what “getting data where it needs to go” looks like inside a modern off-road vehicle platform: routing signals and commands between infotainment UI, instrument cluster, and embedded services so the right data arrives at the right endpoint with predictable timing. In this demo, that includes moving Bluetooth media metadata (track, artist) and control commands between the HMI layer and the Bluetooth stack, without each app hard-wiring every connection. https://telecanesis.com/
On the vehicle side, the same message routes carry speed, gear state, and other telemetry into the cluster, and can also drive body functions like lighting, or trigger logic such as enabling the reverse camera when the gear selector changes. The takeaway is less about a single widget and more about a reusable data plane: map signals once, then reuse them across displays, ECUs, and services as the product evolves, while keeping latency and ordering in check.
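To make the "map once, reuse everywhere" idea concrete, here is a minimal publish/subscribe sketch in C++. It is not the TeleCANesis API; the SignalBus class and the topic strings are invented purely for illustration. It just shows how one gear-state publish can feed both the cluster and the reverse-camera logic without each app hard-wiring a connection:
```cpp
// Minimal sketch of the "map a signal once, reuse it everywhere" idea.
// NOT the TeleCANesis API: SignalBus and the signal names are hypothetical.
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

class SignalBus {
public:
    using Handler = std::function<void(const std::string& value)>;

    // Endpoints subscribe to a named signal; they never talk to each other directly.
    void subscribe(const std::string& signal, Handler handler) {
        subscribers_[signal].push_back(std::move(handler));
    }

    // A producer (e.g. the CAN gateway) publishes once; every mapped
    // consumer receives the update in registration order.
    void publish(const std::string& signal, const std::string& value) {
        for (auto& handler : subscribers_[signal]) {
            handler(value);
        }
    }

private:
    std::map<std::string, std::vector<Handler>> subscribers_;
};

int main() {
    SignalBus bus;

    // Instrument cluster shows the current gear.
    bus.subscribe("vehicle.gear", [](const std::string& gear) {
        std::cout << "[cluster] gear indicator: " << gear << "\n";
    });

    // Camera service reacts to the same signal without a dedicated wire.
    bus.subscribe("vehicle.gear", [](const std::string& gear) {
        if (gear == "R") {
            std::cout << "[camera] reverse selected -> enable rear view\n";
        }
    });

    // One publish from the vehicle side fans out to both consumers.
    bus.publish("vehicle.gear", "R");
    return 0;
}
```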
There’s also a cabin detail from Ottawa Infotainment: audio is produced by transducers bonded into the roof and doors, so the panels themselves become the radiating surface instead of traditional speaker cones. The video was filmed at CES Las Vegas 2026, and the booth context matters because it ties UI, sensor inputs, and connectivity into one integrated experience rather than a lab bench.
Across the booth, TeleCANesis sits under multiple UI stacks and display technologies, feeding the same vehicle signals into different HMIs, and routing safety-related sensor data in other demos. A key point is how this scales when the compute architecture gets more complex: in a next-gen platform with a hypervisor and multiple guest environments, TeleCANesis acts as the messaging backbone between isolated partitions so apps can exchange only the intended data across a clean boundary.
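Here is a rough sketch of that partition-boundary idea, again with invented names (PartitionLink, the allow list) rather than anything from the actual product; the point is simply that only explicitly mapped topics cross from one guest to another:
```cpp
// Hedged sketch of "only intended data crosses the partition boundary".
// Hypothetical names; a real hypervisor setup would use shared memory or
// virtio transports between guests, not this in-process stand-in.
#include <iostream>
#include <set>
#include <string>

class PartitionLink {
public:
    explicit PartitionLink(std::set<std::string> allowedTopics)
        : allowed_(std::move(allowedTopics)) {}

    // Messages on topics outside the allow list never leave the partition.
    bool forward(const std::string& topic, const std::string& payload) {
        if (allowed_.count(topic) == 0) {
            std::cout << "[link] blocked topic: " << topic << "\n";
            return false;
        }
        std::cout << "[link] forwarded " << topic << " = " << payload << "\n";
        return true;
    }

private:
    std::set<std::string> allowed_;
};

int main() {
    // The guest running the infotainment UI may only see media metadata
    // and gear state, never raw data from the safety partition.
    PartitionLink toInfotainment({"media.track", "vehicle.gear"});

    toInfotainment.forward("media.track", "Song Title");   // allowed
    toInfotainment.forward("diag.brake_pressure", "42.0"); // blocked
    return 0;
}
```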
Under the hood, the approach leans on thin middleware plus model-driven configuration and automated code generation (including the TeleCANesis Hub toolkit built on QNX), which makes verification and safety/security certification more tractable than hand-written glue code. They describe using AI during project ingestion and setup, but keeping runtime messaging deterministic, because safety-critical routing is one of the places you can’t tolerate “creative” behavior from tooling. That split—AI to accelerate setup, determinism to ship—captures the engineering mindset in one shot.
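And a hedged sketch of why model-driven generation helps: if the routing table is emitted from the signal model at build time, runtime dispatch becomes a fixed, allocation-free lookup, which is easier to verify than hand-written glue. The table contents and names below are invented for illustration, not output from the TeleCANesis Hub toolkit:
```cpp
// Illustrative only: what generated routing glue can look like when the
// route table comes from a model at build time and dispatch stays static.
#include <array>
#include <cstdint>
#include <iostream>
#include <string_view>

struct Route {
    std::uint32_t signalId;     // source signal (would come from the model)
    std::string_view endpoint;  // destination service/display
};

// "Generated" section: in a real flow this array is produced by tooling
// from the signal model, not written by hand.
constexpr std::array<Route, 3> kRoutes{{
    {0x101, "cluster.speed"},
    {0x102, "cluster.gear"},
    {0x102, "camera.reverse_logic"},
}};

// Deterministic dispatch: same input, same order, no dynamic behavior.
void dispatch(std::uint32_t signalId, double value) {
    for (const auto& route : kRoutes) {
        if (route.signalId == signalId) {
            std::cout << "route 0x" << std::hex << signalId << std::dec
                      << " -> " << route.endpoint << " (" << value << ")\n";
        }
    }
}

int main() {
    dispatch(0x101, 57.0);  // speed update
    dispatch(0x102, 2.0);   // gear state fans out to two endpoints
    return 0;
}
```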
I’m publishing 100+ videos from CES 2026, uploading about 4 videos per day at 5AM/11AM/5PM/11PM CET/EST. Check out all my CES 2026 videos in my playlist here: https://www.youtube.com/playlist?list=PL7xXqJFxvYvjaMwKMgLb6ja_yZuano19e
This video was filmed using the DJI Pocket 3 ($669 at https://amzn.to/4aMpKIC) with the dual wireless DJI Mic 2 microphones and the DJI lapel microphone (https://amzn.to/3XIj3l8). Watch all my DJI Pocket 3 videos here: https://www.youtube.com/playlist?list=PL7xXqJFxvYvhDlWIAxm_pR9dp7ArSkhKK
Click the “Super Thanks” button below the video to send a highlighted comment! Brands I film are welcome to support my work in this way 😁
Check out my video with Daylight Computer about their revolutionary Sunlight Readable Transflective LCD Display for Healthy Learning: https://www.youtube.com/watch?v=U98RuxkFDYY



