001025235 001__ 1025235
001025235 005__ 20250204113839.0
001025235 0247_ $$2doi$$a10.1007/s40747-024-01422-2
001025235 0247_ $$2ISSN$$a2199-4536
001025235 0247_ $$2ISSN$$a2198-6053
001025235 0247_ $$2datacite_doi$$a10.34734/FZJ-2024-02803
001025235 0247_ $$2WOS$$aWOS:001220476300001
001025235 037__ $$aFZJ-2024-02803
001025235 041__ $$aEnglish
001025235 082__ $$a004
001025235 1001_ $$0P:(DE-Juel1)185971$$aAlia, Ahmed$$b0$$eCorresponding author
001025235 245__ $$aA novel Voronoi-based convolutional neural network framework for pushing person detection in crowd videos
001025235 260__ $$aSwitzerland$$bSpringer Nature$$c2024
001025235 3367_ $$2DRIVER$$aarticle
001025235 3367_ $$2DataCite$$aOutput Types/Journal article
001025235 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1714719160_25114
001025235 3367_ $$2BibTeX$$aARTICLE
001025235 3367_ $$2ORCID$$aJOURNAL_ARTICLE
001025235 3367_ $$00$$2EndNote$$aJournal Article
001025235 520__ $$aAnalyzing the microscopic dynamics of pushing behavior within crowds can offer valuable insights into crowd patterns and interactions. By identifying instances of pushing in crowd videos, a deeper understanding of when, where, and why such behavior occurs can be achieved. This knowledge is crucial to creating more effective crowd management strategies, optimizing crowd flow, and enhancing overall crowd experiences. However, manually identifying pushing behavior at the microscopic level is challenging, and the existing automatic approaches cannot detect such microscopic behavior. Thus, this article introduces a novel automatic framework for identifying pushing in videos of crowds on a microscopic level. The framework comprises two main components: (i) feature extraction and (ii) video detection. In the feature extraction component, a new Voronoi-based method is developed for determining the local regions associated with each person in the input video. Subsequently, these regions are fed into the EfficientNetV1B0 Convolutional Neural Network to extract the deep features of each person over time. In the second component, a combination of a fully connected layer with a Sigmoid activation function is employed to analyze these deep features and annotate the individuals involved in pushing within the video. The framework is trained and evaluated on a new dataset created using six real-world experiments, including their corresponding ground truths. The experimental findings demonstrate that the proposed framework outperforms state-of-the-art approaches, as well as seven baseline methods used for comparative analysis.
001025235 536__ $$0G:(DE-HGF)POF4-5111$$a5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511)$$cPOF4-511$$fPOF IV$$x0
001025235 536__ $$0G:(BMBF)01DH16027$$aPilotprojekt zur Entwicklung eines palästinensisch-deutschen Forschungs- und Promotionsprogramms 'Palestinian-German Science Bridge' (01DH16027)$$c01DH16027$$x1
001025235 536__ $$0G:(GEPRIS)491111487$$aDFG project 491111487 - Open-Access-Publikationskosten / 2022 - 2024 / Forschungszentrum Jülich (OAPKFZJ) (491111487)$$c491111487$$x2
001025235 588__ $$aDataset connected to CrossRef, Journals: juser.fz-juelich.de
001025235 7001_ $$0P:(DE-HGF)0$$aMaree, Mohammed$$b1
001025235 7001_ $$0P:(DE-Juel1)132077$$aChraibi, Mohcine$$b2
001025235 7001_ $$0P:(DE-Juel1)132266$$aSeyfried, Armin$$b3
001025235 773__ $$0PERI:(DE-600)2834740-7$$a10.1007/s40747-024-01422-2$$p27$$tComplex & intelligent systems$$v0$$x2199-4536$$y2024
001025235 8564_ $$uhttps://juser.fz-juelich.de/record/1025235/files/s40747-024-01422-2%20%281%29.pdf$$yOpenAccess
001025235 8564_ $$uhttps://juser.fz-juelich.de/record/1025235/files/s40747-024-01422-2%20%281%29.gif?subformat=icon$$xicon$$yOpenAccess
001025235 8564_ $$uhttps://juser.fz-juelich.de/record/1025235/files/s40747-024-01422-2%20%281%29.jpg?subformat=icon-1440$$xicon-1440$$yOpenAccess
001025235 8564_ $$uhttps://juser.fz-juelich.de/record/1025235/files/s40747-024-01422-2%20%281%29.jpg?subformat=icon-180$$xicon-180$$yOpenAccess
001025235 8564_ $$uhttps://juser.fz-juelich.de/record/1025235/files/s40747-024-01422-2%20%281%29.jpg?subformat=icon-640$$xicon-640$$yOpenAccess
001025235 8767_ $$8SN-2024-00571-b$$d2024-09-11$$eAPC$$jZahlung erfolgt
001025235 909CO $$ooai:juser.fz-juelich.de:1025235$$pdnbdelivery$$popenCost$$pVDB$$pdriver$$pOpenAPC$$popen_access$$popenaire
001025235 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)185971$$aForschungszentrum Jülich$$b0$$kFZJ
001025235 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)132077$$aForschungszentrum Jülich$$b2$$kFZJ
001025235 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)132266$$aForschungszentrum Jülich$$b3$$kFZJ
001025235 9131_ $$0G:(DE-HGF)POF4-511$$1G:(DE-HGF)POF4-510$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5111$$aDE-HGF$$bKey Technologies$$lEngineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action$$vEnabling Computational- & Data-Intensive Science and Engineering$$x0
001025235 9141_ $$y2024
001025235 915pc $$0PC:(DE-HGF)0000$$2APC$$aAPC keys set
001025235 915pc $$0PC:(DE-HGF)0001$$2APC$$aLocal Funding
001025235 915pc $$0PC:(DE-HGF)0002$$2APC$$aDFG OA Publikationskosten
001025235 915pc $$0PC:(DE-HGF)0003$$2APC$$aDOAJ Journal
001025235 915pc $$0PC:(DE-HGF)0113$$2APC$$aDEAL: Springer Nature 2020
001025235 915__ $$0StatID:(DE-HGF)0160$$2StatID$$aDBCoverage$$bEssential Science Indicators$$d2023-10-26
001025235 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
001025235 915__ $$0StatID:(DE-HGF)0113$$2StatID$$aWoS$$bScience Citation Index Expanded$$d2023-10-26
001025235 915__ $$0StatID:(DE-HGF)0700$$2StatID$$aFees$$d2023-10-26
001025235 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001025235 915__ $$0StatID:(DE-HGF)0561$$2StatID$$aArticle Processing Charges$$d2023-10-26
001025235 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2024-12-20
001025235 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2024-12-20
001025235 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal$$d2024-04-10T15:36:50Z
001025235 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ$$d2024-04-10T15:36:50Z
001025235 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bDOAJ : Double anonymous peer review$$d2024-04-10T15:36:50Z
001025235 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2024-12-20
001025235 915__ $$0StatID:(DE-HGF)1160$$2StatID$$aDBCoverage$$bCurrent Contents - Engineering, Computing and Technology$$d2024-12-20
001025235 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2024-12-20
001025235 920__ $$lyes
001025235 9201_ $$0I:(DE-Juel1)IAS-7-20180321$$kIAS-7$$lZivile Sicherheitsforschung$$x0
001025235 9801_ $$aFullTexts
001025235 980__ $$ajournal
001025235 980__ $$aVDB
001025235 980__ $$aUNRESTRICTED
001025235 980__ $$aI:(DE-Juel1)IAS-7-20180321
001025235 980__ $$aAPC