Multi-Context TCAM-Based Selective Computing: Design Space Exploration for a Low-Power NN

Ren Arakawa, Naoya Onizawa, Jean-Philippe Diguet, Takahiro Hanyu

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)


In this paper, we propose a low-power memory-based computing architecture called the selective computing architecture (SCA). It consists of multipliers and an LUT (look-up table)-based component, namely a multi-context ternary content-addressable memory (MC-TCAM). Either component is selected depending on the input-data conditions of the neural network (NN). Compared with quantized NNs, the proposed architecture performs more accurate multiplication with low power consumption. When input data matching an entry stored in the MC-TCAM appears, the corresponding multiplication results for multiple weights are obtained directly. The MC-TCAM stores only short-bit-length input data, which leads to low-power computation. The performance of the SCA is determined by three physical parameters concerning the configuration of the MC-TCAM, and the power dissipation of the target NN can be minimized by exploring these parameters in the design space. Hardware based on the proposed architecture is evaluated using TSMC 65 nm CMOS technology and an MTJ model. For speech command recognition, the power consumption of the multiplications in the first convolutional layer of a convolutional NN is reduced by 67% compared to a solution relying only on multipliers.
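The selection mechanism described in the abstract can be illustrated with a minimal software sketch (not the authors' hardware implementation): a dictionary stands in for the MC-TCAM, caching precomputed products of frequent short-length input values with multiple weights, while a miss falls back to conventional multipliers. All names, weight values, and stored inputs below are hypothetical.

```python
# Hypothetical weights of one neuron group sharing the same inputs.
WEIGHTS = [3, -2, 7]

# Software stand-in for the MC-TCAM: each stored (short-length) input value
# maps to its precomputed products with all weights, so one match yields
# the multiplication results for multiple weights at once.
tcam = {x: [x * w for w in WEIGHTS] for x in (0, 1, 2)}

def selective_multiply(x):
    """Return x*w for every weight: LUT lookup on a TCAM hit, multipliers otherwise."""
    if x in tcam:                        # TCAM match -> single lookup, no multiplies
        return tcam[x]
    return [x * w for w in WEIGHTS]      # miss -> conventional multiplication

print(selective_multiply(2))  # hit: results fetched from the table
print(selective_multiply(5))  # miss: computed with multipliers
```

In the actual circuit, the trade-off between TCAM capacity, stored input length, and hit rate corresponds to the three physical parameters explored in the paper's design space.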

Original language: English
Article number: 9234692
Pages (from-to): 67-76
Number of pages: 10
Journal: IEEE Transactions on Circuits and Systems I: Regular Papers
Issue number: 1
Publication status: Published - Jan 2021


  • Neural networks
  • VLSI
  • look-up table
  • memory-based computing
  • ternary content-addressable memory


