Analog Softmax with Wide Input Current Range for In-Memory Computing
Contribution to conference proceedings / contribution to a book | FZJ-2026-00226
2025
IEEE
Please use a persistent id in citations: doi:10.1109/ISCAS56072.2025.11043251
Abstract: The Softmax activation function plays a pivotal role both in the attention mechanism of Transformers and in the final layer of neural networks performing classification. The Softmax function outputs probabilities by normalizing the input values, emphasizing the differences among them to highlight the largest values. In digital implementations, the complexity of softmax grows linearly with the number of inputs. In contrast, analog implementations enable parallel computation with lower latency. In this work, we demonstrate that this approach achieves more efficient scaling, with latency growing only logarithmically as the vector size increases. The analog softmax circuit is implemented in TSMC 28 nm PDK technology, capable of driving up to 128 inputs and producing an analog current output spanning three orders of magnitude. The study examines the circuit's power consumption, latency, and error, emphasizing its efficiency compared to the alternative approach of converting outputs to digital signals via ADCs and performing the softmax calculation digitally. By reducing reliance on these power-intensive operations, this work aims to significantly enhance energy efficiency in in-memory computing systems.
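For reference, the normalization described in the abstract can be sketched in a few lines of Python. This is a generic, numerically stable digital softmax (not the paper's analog circuit); the single pass over all n inputs illustrates the linear cost that the authors contrast with their parallel analog design:

```python
import math

def softmax(x):
    """Numerically stable softmax: subtract the maximum before
    exponentiating so that large inputs do not overflow."""
    m = max(x)                                # one linear pass to find the max
    exps = [math.exp(v - m) for v in x]       # one linear pass to exponentiate
    total = sum(exps)                         # one linear pass to normalize
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
# The outputs sum to 1, and the largest input receives the largest share,
# reflecting how softmax emphasizes differences among the input values.
```

Each step touches every element once, so a digital implementation (after ADC conversion) costs O(n) per vector; the paper's analog approach computes all branches in parallel on the current domain instead.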