Weebit Nano is positioning ReRAM as an embedded non-volatile memory alternative to flash for SoCs that need faster writes, lower power, better endurance, and easier scaling below 28 nm. In this interview, CEO Coby Hanoch explains why the company focuses on embedded NVM rather than bulk storage: the target is firmware, security keys, calibration data, AI coefficients, and instant-on system behavior integrated directly on the same die as compute and control logic. https://www.weebit-nano.com/
The key technical point is that Weebit’s ReRAM is a back-end-of-line technology, built between metal layers rather than in the silicon substrate. That matters for mixed-signal and analog-heavy designs, because it avoids many of the layout and process compromises associated with embedded flash. Hanoch describes the cell in simple terms: voltage moves ions to form or break a conductive path, switching between low and high resistance states that represent stored data.
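Hanoch's description of the cell can be captured in a toy model: a write voltage above a SET threshold forms the conductive filament (low resistance), a voltage below a RESET threshold breaks it (high resistance), and a small read voltage senses the state without changing it. All thresholds, resistance values, and names below are illustrative assumptions, not Weebit device parameters.

```python
# Toy model of one ReRAM cell, illustrating the switching idea only.
# Thresholds and resistance values are illustrative assumptions,
# not real Weebit device parameters.

LRS_OHMS = 10_000      # low-resistance state  -> logical 1
HRS_OHMS = 1_000_000   # high-resistance state -> logical 0
SET_V = 2.0            # above this, ions form a conductive filament
RESET_V = -2.0         # below this, the filament is broken

class ReRamCell:
    def __init__(self):
        self.resistance = HRS_OHMS  # start erased (high resistance)

    def apply_voltage(self, volts: float) -> None:
        """SET forms the filament (low R); RESET dissolves it (high R)."""
        if volts >= SET_V:
            self.resistance = LRS_OHMS
        elif volts <= RESET_V:
            self.resistance = HRS_OHMS
        # Voltages between the thresholds leave the state untouched,
        # which is what makes the cell non-volatile.

    def read(self) -> int:
        """A low read voltage senses resistance without disturbing it."""
        return 1 if self.resistance == LRS_OHMS else 0

cell = ReRamCell()
cell.apply_voltage(2.5)    # SET pulse
assert cell.read() == 1
cell.apply_voltage(0.1)    # read-level voltage: state is retained
assert cell.read() == 1
cell.apply_voltage(-2.5)   # RESET pulse
assert cell.read() == 0
```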
For edge AI, the pitch is especially clear. If model coefficients can live in embedded non-volatile memory on the AI chip, designers can avoid a separate external flash device, reduce board cost, shorten boot time, cut power draw, and remove a security exposure created when weights are copied at startup. That fits near-memory compute, and it also points toward in-memory compute, where analog-style ReRAM arrays may eventually support more efficient AI inference for gesture recognition, sensor workloads, and always-on edge devices.
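The in-memory compute idea mentioned above rests on simple circuit laws: if weights are stored as cell conductances in a crossbar array, applying input voltages to the rows makes each cell pass a current I = G·V (Ohm's law), and each column wire sums those currents (Kirchhoff's current law), so every column current is a dot product computed inside the memory. The sketch below is a purely numerical illustration of that mapping; the values are made up and no real device behavior (noise, nonlinearity, variability) is modeled.

```python
# Toy sketch of analog in-memory multiply-accumulate on a ReRAM crossbar.
# Weights live as cell conductances G (siemens); inputs arrive as row
# voltages V (volts). Per-cell current is I = G * V, and each column
# sums its cells' currents, yielding one dot product per column.
# Values are illustrative, not real device parameters.

def crossbar_mac(conductances, voltages):
    """conductances: rows x cols matrix of cell conductances (S);
    voltages: one input voltage per row (V).
    Returns column currents (A), i.e. voltages @ conductances."""
    n_rows = len(voltages)
    n_cols = len(conductances[0])
    return [
        sum(conductances[r][c] * voltages[r] for r in range(n_rows))
        for c in range(n_cols)
    ]

# 2 inputs, 3 outputs: a tiny layer's weights mapped to conductances
G = [
    [1e-4, 2e-4, 0.0],   # row 0
    [3e-4, 0.0, 1e-4],   # row 1
]
V = [0.5, 1.0]           # input activations encoded as voltages
currents = crossbar_mac(G, V)  # analog dot products, one per column
```

Because the multiply and the accumulate happen in the array itself, the weights never move over a bus at inference time, which is the efficiency argument for always-on edge workloads.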
The interview also shows why this matters beyond AI. Embedded ReRAM is relevant for power management ICs, MCUs, IoT nodes, automotive electronics, and aerospace-oriented designs that need retention without power, robust endurance, and tolerance for harsh conditions. Weebit highlights qualification work for automotive temperature ranges, points to radiation immunity as a useful side benefit, and stresses that the memory can be integrated without disturbing the carefully optimized analog portions of a chip.
Filmed at Embedded World 2026 in Nuremberg, the discussion captures a memory company moving from R&D into commercialization. Weebit already talks about customers such as onsemi and Texas Instruments, growing capacity targets in the embedded range, and a roadmap that connects embedded NVM with future AI architectures. The result is not “more storage” in the consumer sense, but a more integrated memory block for edge silicon where power, cost, area, boot latency, and security all matter at once.
All my Embedded World videos are in this playlist: https://www.youtube.com/playlist?list=PL7xXqJFxvYvjgUpdNMBkGzEWU6YVxR8Ga