| Gain Cell-based Analog Content Addressable Memory for Dynamic Associative Tasks in AI |
| Contribution to a conference proceedings | FZJ-2026-00228 |
2025
IEEE
Please use a persistent id in citations: doi:10.1109/ISCAS56072.2025.11044190
Abstract: Analog Content Addressable Memories (aCAMs) have proven useful for associative Compute-in-Memory (CIM) applications such as decision trees, finite state machines, and hyper-dimensional computing. While non-volatile implementations using FeFETs and ReRAM devices offer speed, power, and area advantages, they suffer from slow write speeds and limited write cycles, making them less suitable for computations involving fully dynamic data patterns. To address these limitations, in this work we propose a capacitor gain cell-based aCAM designed for dynamic processing, where frequent memory updates are required. Our system compares analog input voltages to boundaries stored in capacitors, enabling efficient dynamic tasks. We demonstrate the application of the aCAM within transformer attention mechanisms by replacing the softmax-scaled dot-product similarity with aCAM similarity, achieving competitive results. Circuit simulations on a TSMC 28 nm node show promising performance in terms of energy efficiency, precision, and latency, making the design well suited for fast, dynamic AI applications.
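The abstract describes replacing the softmax-scaled dot-product weights in attention with an aCAM-style match score against stored boundaries. A minimal software sketch of that idea is shown below; it assumes a simple soft interval-match model (sigmoid-edged windows per dimension) as a stand-in for the analog cell response, and the function names (`acam_similarity`, `acam_attention`) and the `sharpness` parameter are illustrative, not from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def acam_similarity(query, lo, hi, sharpness=10.0):
    # Soft interval match per stored row: the score is near 1 when every
    # dimension of the query lies inside that row's [lo, hi] window and
    # decays smoothly outside it (sigmoid edges mimic an analog response).
    scores = []
    for row_lo, row_hi in zip(lo, hi):
        s = 1.0
        for q, l, h in zip(query, row_lo, row_hi):
            s *= sigmoid(sharpness * (q - l)) * sigmoid(sharpness * (h - q))
        scores.append(s)
    return scores

def acam_attention(query, lo, hi, values):
    # Replace softmax-scaled dot-product weights with normalized
    # aCAM match scores, then mix the value vectors.
    scores = acam_similarity(query, lo, hi)
    total = sum(scores) or 1e-12
    weights = [s / total for s in scores]
    return [sum(w * v[d] for w, v in zip(weights, values))
            for d in range(len(values[0]))]
```

For example, a query of [0.5, 0.5] matched against a row with bounds [0, 1] per dimension scores far higher than against a row with bounds [2, 3], so attention mass concentrates on the matching row.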