Analog in-memory computing attention mechanism for fast and energy-efficient large language models
Journal Article | FZJ-2026-00224
Nature Research, London, 2025
Please use a persistent identifier in citations: doi:10.1038/s43588-025-00854-1 or doi:10.34734/FZJ-2026-00224