001028217 001__ 1028217
001028217 005__ 20250204113906.0
001028217 0247_ $$2doi$$a10.1038/s42256-024-00846-2
001028217 0247_ $$2datacite_doi$$a10.34734/FZJ-2024-04411
001028217 0247_ $$2WOS$$aWOS:001258009300004
001028217 037__ $$aFZJ-2024-04411
001028217 082__ $$a004
001028217 1001_ $$0P:(DE-Juel1)201426$$aRenner, Alpha$$b0$$eCorresponding author
001028217 245__ $$aVisual odometry with neuromorphic resonator networks
001028217 260__ $$aLondon$$bSpringer Nature Publishing$$c2024
001028217 3367_ $$2DRIVER$$aarticle
001028217 3367_ $$2DataCite$$aOutput Types/Journal article
001028217 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1734427835_28897
001028217 3367_ $$2BibTeX$$aARTICLE
001028217 3367_ $$2ORCID$$aJOURNAL_ARTICLE
001028217 3367_ $$00$$2EndNote$$aJournal Article
001028217 520__ $$aVisual odometry (VO) is a method used to estimate the self-motion of a mobile robot using visual sensors. Unlike odometry methods that integrate differential measurements from sensors such as inertial sensors or wheel encoders, and can therefore accumulate errors, VO is not compromised by drift. However, image-based VO is computationally demanding, limiting its application in use cases with low-latency, low-memory and low-energy requirements. Neuromorphic hardware offers low-power solutions to many vision and artificial intelligence problems, but designing such solutions is complicated and they often have to be assembled from scratch. Here we propose the use of vector symbolic architecture (VSA) as an abstraction layer to design algorithms compatible with neuromorphic hardware. Building on a VSA model for scene analysis, described in our companion paper, we present a modular neuromorphic algorithm that achieves state-of-the-art performance on two-dimensional VO tasks. Specifically, the proposed algorithm stores and updates a working memory of the presented visual environment. Based on this working memory, a resonator network estimates the changing location and orientation of the camera. We experimentally validate the neuromorphic VSA-based approach to VO with two benchmarks: one based on an event-camera dataset and the other in a dynamic scene with a robotic task.
001028217 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001028217 588__ $$aDataset connected to DataCite
001028217 7001_ $$00000-0002-3954-9688$$aSupic, Lazar$$b1$$eCorresponding author
001028217 7001_ $$00000-0001-7460-2467$$aDanielescu, Andreea$$b2
001028217 7001_ $$00000-0002-7109-1689$$aIndiveri, Giacomo$$b3
001028217 7001_ $$00000-0001-8248-4544$$aFrady, E. Paxon$$b4
001028217 7001_ $$00000-0002-6738-9263$$aSommer, Friedrich T.$$b5$$eCorresponding author
001028217 7001_ $$00000-0003-4684-202X$$aSandamirskaya, Yulia$$b6$$eCorresponding author
001028217 773__ $$0PERI:(DE-600)2933875-X$$a10.1038/s42256-024-00846-2$$n6$$p653–663$$tNature machine intelligence$$v6$$x2522-5839$$y2024
001028217 8564_ $$uhttps://juser.fz-juelich.de/record/1028217/files/2209.02000v3.pdf$$yOpenAccess
001028217 909CO $$ooai:juser.fz-juelich.de:1028217$$pdnbdelivery$$pdriver$$pVDB$$popen_access$$popenaire
001028217 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)201426$$aForschungszentrum Jülich$$b0$$kFZJ
001028217 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001028217 9141_ $$y2024
001028217 915__ $$0StatID:(DE-HGF)0160$$2StatID$$aDBCoverage$$bEssential Science Indicators$$d2023-08-29
001028217 915__ $$0StatID:(DE-HGF)0113$$2StatID$$aWoS$$bScience Citation Index Expanded$$d2023-08-29
001028217 915__ $$0StatID:(DE-HGF)3003$$2StatID$$aDEAL Nature$$d2023-08-29$$wger
001028217 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001028217 915__ $$0StatID:(DE-HGF)0100$$2StatID$$aJCR$$bNAT MACH INTELL : 2022$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)1160$$2StatID$$aDBCoverage$$bCurrent Contents - Engineering, Computing and Technology$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2024-12-05
001028217 915__ $$0StatID:(DE-HGF)9920$$2StatID$$aIF >= 20$$bNAT MACH INTELL : 2022$$d2024-12-05
001028217 9201_ $$0I:(DE-Juel1)PGI-15-20210701$$kPGI-15$$lNeuromorphic Software Eco System$$x0
001028217 980__ $$ajournal
001028217 980__ $$aVDB
001028217 980__ $$aUNRESTRICTED
001028217 980__ $$aI:(DE-Juel1)PGI-15-20210701
001028217 9801_ $$aFullTexts