001050456 001__ 1050456
001050456 005__ 20260121204316.0
001050456 0247_ $$2doi$$a10.1109/ISCAS56072.2025.11043251
001050456 037__ $$aFZJ-2026-00226
001050456 1001_ $$0P:(DE-HGF)0$$aDube, Aradhana$$b0$$eCorresponding author
001050456 1112_ $$a2025 IEEE International Symposium on Circuits and Systems (ISCAS)$$cLondon$$d2025-05-25 - 2025-05-28$$wUnited Kingdom
001050456 245__ $$aAnalog Softmax with Wide Input Current Range for In-Memory Computing
001050456 260__ $$bIEEE$$c2025
001050456 29510 $$a2025 IEEE International Symposium on Circuits and Systems (ISCAS) : [Proceedings] - IEEE, 2025. - ISBN 979-8-3503-5683-0 - doi:10.1109/ISCAS56072.2025.11043251
001050456 300__ $$a1-5
001050456 3367_ $$2ORCID$$aCONFERENCE_PAPER
001050456 3367_ $$033$$2EndNote$$aConference Paper
001050456 3367_ $$2BibTeX$$aINPROCEEDINGS
001050456 3367_ $$2DRIVER$$aconferenceObject
001050456 3367_ $$2DataCite$$aOutput Types/Conference Paper
001050456 3367_ $$0PUB:(DE-HGF)8$$2PUB:(DE-HGF)$$aContribution to a conference proceedings$$bcontrib$$mcontrib$$s1768997948_13601
001050456 3367_ $$0PUB:(DE-HGF)7$$2PUB:(DE-HGF)$$aContribution to a book$$mcontb
001050456 520__ $$aThe Softmax activation function plays a pivotal role both in the attention mechanism of Transformers and in the final layer of neural networks performing classification. The Softmax function outputs probabilities by normalizing the input values, emphasizing differences among them to highlight the largest values. In digital implementations, the complexity of softmax grows linearly with the number of inputs. In contrast, analog implementations enable parallel computation with lower latency. In this work, we demonstrate that this approach achieves a more efficient linear scaling of latency as the vector size increases logarithmically. The analog softmax circuits are implemented in TSMC 28 nm PDK technology, capable of driving up to 128 inputs and producing an analog current output spanning three orders of magnitude. The study examines the circuit's power consumption, latency, and error, emphasizing its efficiency compared to the alternative approach of converting outputs to digital signals via ADCs and performing the softmax calculation digitally. By reducing reliance on these power-intensive operations, this work aims to significantly enhance energy efficiency in in-memory computing systems.
001050456 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001050456 588__ $$aDataset connected to CrossRef Conference
001050456 7001_ $$0P:(DE-Juel1)192242$$aManea, Paul$$b1$$ufzj
001050456 7001_ $$0P:(DE-HGF)0$$aGibertini, Paolo$$b2
001050456 7001_ $$0P:(DE-HGF)0$$aCovi, Erika$$b3
001050456 7001_ $$0P:(DE-Juel1)188145$$aStrachan, John Paul$$b4$$ufzj
001050456 773__ $$a10.1109/ISCAS56072.2025.11043251
001050456 8564_ $$uhttps://ieeexplore.ieee.org/abstract/document/11043251
001050456 8564_ $$uhttps://juser.fz-juelich.de/record/1050456/files/Analog_Softmax_with_Wide_Input_Current_Range_for_In-Memory_Computing.pdf$$yRestricted
001050456 909CO $$ooai:juser.fz-juelich.de:1050456$$pVDB
001050456 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-HGF)0$$aForschungszentrum Jülich$$b0$$kFZJ
001050456 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)192242$$aForschungszentrum Jülich$$b1$$kFZJ
001050456 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)188145$$aForschungszentrum Jülich$$b4$$kFZJ
001050456 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001050456 920__ $$lyes
001050456 9201_ $$0I:(DE-Juel1)PGI-14-20210412$$kPGI-14$$lNeuromorphic Compute Nodes$$x0
001050456 980__ $$acontrib
001050456 980__ $$aVDB
001050456 980__ $$acontb
001050456 980__ $$aI:(DE-Juel1)PGI-14-20210412
001050456 980__ $$aUNRESTRICTED