
A 2.31 μJ/Inference Ultra-Low Power Always-on Event-Driven AI-IoT SoC With Switchable nvSRAM Compute-in-Memory Macro


Abstract:

Internet-of-Things (IoT) applications drive the demand for artificial intelligence (AI) system-on-chips (SoCs) for a vast range of always-on, ultra-low-power tasks such as human action recognition (HAR) for surveillance systems and face detection (FD) and recognition (FR) for home security. Previous AI-IoT SoCs still suffer from limited system efficiency caused by the high leakage power of SRAMs, heavy external memory access (EMA), and frequent on-chip data transfers. The proposed ultra-low-power RISC-V embedded AI-IoT SoC is composed of 1) a novel bit-line (BL) segmented coupled nvSRAM macro with switchable working modes (SRAM, non-volatile memory (NVM), and NVM computing-in-memory (CIM)) that performs pre-charge reuse, power gating, and local data swapping; 2) a hot-silent encoded (HSE) uDMA cluster with a 1 MB multi-bank eMRAM that reduces on-chip transmission power and eliminates EMA power; 3) an event-driven wake-up unit (EDWU) that skips unnecessary inferences; and 4) a RISC-V core with a dedicated ISA extension for the switchable working modes. The proposed SoC achieves an energy efficiency of 20.3–35.5 TOPS/W on ResNet-20 (8-bit fixed-point, FXP8) inference, a 2.82×–3.69× improvement over previous state-of-the-art (SOTA) AI-IoT SoCs.
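As a rough consistency check (assuming ResNet-20 for CIFAR-10 costs roughly 41 million MACs, i.e., about 82 million operations per inference, a workload estimate not stated in this abstract), the headline 2.31 μJ/inference figure follows directly from the peak efficiency:

\begin{equation*}
E_{\text{inference}} \approx \frac{\text{ops per inference}}{\text{energy efficiency}} \approx \frac{82\times 10^{6}\ \text{ops}}{35.5\times 10^{12}\ \text{ops/J}} \approx 2.3\ \mu\text{J}.
\end{equation*}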
Page(s): 2534–2538
Date of Publication: 08 March 2024
