Analog in-memory computing attention mechanism for fast and energy-efficient large language models
| Type | Amount | VAT | Currency | Status | Cost centre |
| Hybrid-OA (Publish and Read) | 0.00 | 0.00 | EUR | | ZB |
| Sum | 0.00 | 0.00 | EUR | | |
| Total | 0.00 | | | | |
| Journal Article | FZJ-2026-00224 |
Nature Research, London, 2025
Please use a persistent identifier in citations: doi:10.1038/s43588-025-00854-1, doi:10.34734/FZJ-2026-00224