001047299 001__ 1047299
001047299 005__ 20251023202112.0
001047299 0247_ $$2doi$$a10.48550/ARXIV.2503.09244
001047299 037__ $$aFZJ-2025-04214
001047299 1001_ $$0P:(DE-Juel1)175101$$aPaul, Richard D.$$b0
001047299 245__ $$aHow To Make Your Cell Tracker Say 'I dunno!'
001047299 260__ $$barXiv$$c2025
001047299 3367_ $$0PUB:(DE-HGF)25$$2PUB:(DE-HGF)$$aPreprint$$bpreprint$$mpreprint$$s1761207040_20943
001047299 3367_ $$2ORCID$$aWORKING_PAPER
001047299 3367_ $$028$$2EndNote$$aElectronic Article
001047299 3367_ $$2DRIVER$$apreprint
001047299 3367_ $$2BibTeX$$aARTICLE
001047299 3367_ $$2DataCite$$aOutput Types/Working Paper
001047299 500__ $$aRDP is funded by the Helmholtz School for Data Science in Life, Earth, and Energy (HDS-LEE). DR’s research is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – 548823575. This work was supported by the President’s Initiative and Networking Funds of the Helmholtz Association of German Research Centres [EMSIG ZT-I-PF-04-044].
001047299 520__ $$aCell tracking is a key computational task in live-cell microscopy, but fully automated analysis of high-throughput imaging requires reliable and, thus, uncertainty-aware data analysis tools, as the amount of data recorded within a single experiment exceeds what humans are able to review. Here, we propose and benchmark various methods to reason about and quantify uncertainty in linear assignment-based cell tracking algorithms. Our methods take inspiration from statistics and machine learning, leveraging two perspectives on the cell tracking problem explored throughout this work: considering it as a Bayesian inference problem and as a classification problem. Our methods have a framework-like character in that they equip any frame-to-frame tracking method with uncertainty quantification. We demonstrate this by applying them to various existing tracking algorithms, including recently presented Transformer-based trackers, and show empirically that our methods yield useful and well-calibrated tracking uncertainties.
001047299 536__ $$0G:(DE-HGF)POF4-2171$$a2171 - Biological and environmental resources for sustainable use (POF4-217)$$cPOF4-217$$fPOF IV$$x0
001047299 588__ $$aDataset connected to DataCite
001047299 650_7 $$2Other$$aComputer Vision and Pattern Recognition (cs.CV)
001047299 650_7 $$2Other$$aQuantitative Methods (q-bio.QM)
001047299 650_7 $$2Other$$aApplications (stat.AP)
001047299 650_7 $$2Other$$aFOS: Computer and information sciences
001047299 650_7 $$2Other$$aFOS: Biological sciences
001047299 7001_ $$0P:(DE-Juel1)176923$$aSeiffarth, Johannes$$b1
001047299 7001_ $$0P:(DE-HGF)0$$aRügamer, David$$b2
001047299 7001_ $$0P:(DE-Juel1)129394$$aScharr, Hanno$$b3$$ufzj
001047299 7001_ $$0P:(DE-Juel1)129051$$aNöh, Katharina$$b4
001047299 773__ $$a10.48550/ARXIV.2503.09244
001047299 909CO $$ooai:juser.fz-juelich.de:1047299$$pVDB
001047299 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)175101$$aForschungszentrum Jülich$$b0$$kFZJ
001047299 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176923$$aForschungszentrum Jülich$$b1$$kFZJ
001047299 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)129394$$aForschungszentrum Jülich$$b3$$kFZJ
001047299 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)129051$$aForschungszentrum Jülich$$b4$$kFZJ
001047299 9131_ $$0G:(DE-HGF)POF4-217$$1G:(DE-HGF)POF4-210$$2G:(DE-HGF)POF4-200$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-2171$$aDE-HGF$$bForschungsbereich Erde und Umwelt$$lErde im Wandel – Unsere Zukunft nachhaltig gestalten$$vFür eine nachhaltige Bio-Ökonomie – von Ressourcen zu Produkten$$x0
001047299 9141_ $$y2025
001047299 9201_ $$0I:(DE-Juel1)IBG-1-20101118$$kIBG-1$$lBiotechnologie$$x0
001047299 9201_ $$0I:(DE-Juel1)IAS-8-20210421$$kIAS-8$$lDatenanalyse und Maschinenlernen$$x1
001047299 980__ $$apreprint
001047299 980__ $$aVDB
001047299 980__ $$aI:(DE-Juel1)IBG-1-20101118
001047299 980__ $$aI:(DE-Juel1)IAS-8-20210421
001047299 980__ $$aUNRESTRICTED