000830014 001__ 830014
000830014 005__ 20210129230416.0
000830014 0247_ $$2doi$$a10.1016/j.isprsjprs.2015.08.002
000830014 0247_ $$2ISSN$$a0924-2716
000830014 0247_ $$2ISSN$$a1872-8235
000830014 0247_ $$2WOS$$aWOS:000363075300020
000830014 0247_ $$2altmetric$$aaltmetric:4602753
000830014 037__ $$aFZJ-2017-03616
000830014 082__ $$a550
000830014 1001_ $$0P:(DE-HGF)0$$aAasen, Helge$$b0$$eCorresponding author
000830014 245__ $$aGenerating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance
000830014 260__ $$aAmsterdam [u.a.]$$bElsevier$$c2015
000830014 3367_ $$2DRIVER$$aarticle
000830014 3367_ $$2DataCite$$aOutput Types/Journal article
000830014 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1495182315_3840
000830014 3367_ $$2BibTeX$$aARTICLE
000830014 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000830014 3367_ $$00$$2EndNote$$aJournal Article
000830014 520__ $$aThis paper describes a novel method to derive 3D hyperspectral information from lightweight snapshot cameras for unmanned aerial vehicles for vegetation monitoring. Snapshot cameras record an image cube with one spectral and two spatial dimensions with every exposure. First, we describe and apply methods to radiometrically characterize and calibrate these cameras. Then, we introduce our processing chain to derive 3D hyperspectral information from the calibrated image cubes based on structure from motion. The approach includes a novel way for quality assurance of the data which is used to assess the quality of the hyperspectral data for every single pixel in the final data product. The result is a hyperspectral digital surface model as a representation of the surface in 3D space linked with the hyperspectral information emitted and reflected by the objects covered by the surface. In this study we use the hyperspectral camera Cubert UHD 185-Firefly, which collects 125 bands from 450 to 950 nm. The obtained data product has a spatial resolution of approximately 1 cm for the spatial and 21 cm for the hyperspectral information. The radiometric calibration yields good results with less than 1% offset in reflectance compared to an ASD FieldSpec 3 for most of the spectral range. The quality assurance information shows that the radiometric precision is better than 0.13% for the derived data product. We apply the approach to data from a flight campaign in a barley experiment with different varieties during the growth stage heading (BBCH 52 – 59) to demonstrate the feasibility for vegetation monitoring in the context of precision agriculture. The plant parameters retrieved from the data product correspond to in-field measurements of a single date field campaign for plant height (R² = 0.7), chlorophyll (BGI2, R² = 0.52), LAI (RDVI, R² = 0.32) and biomass (RDVI, R² = 0.29). Our approach can also be applied for other image-frame cameras as long as the individual bands of the image cube are spatially co-registered beforehand.
000830014 536__ $$0G:(DE-HGF)POF3-582$$a582 - Plant Science (POF3-582)$$cPOF3-582$$fPOF III$$x0
000830014 588__ $$aDataset connected to CrossRef
000830014 65027 $$0V:(DE-MLZ)SciArea-160$$2V:(DE-HGF)$$aBiology$$x0
000830014 7001_ $$0P:(DE-Juel1)145906$$aBurkart, Andreas$$b1$$ufzj
000830014 7001_ $$0P:(DE-HGF)0$$aBolten, Andreas$$b2
000830014 7001_ $$0P:(DE-HGF)0$$aBareth, Georg$$b3
000830014 773__ $$0PERI:(DE-600)2012663-3$$a10.1016/j.isprsjprs.2015.08.002$$gVol. 108, p. 245 - 259$$p245 - 259$$tISPRS journal of photogrammetry and remote sensing$$v108$$x0924-2716$$y2015
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.pdf$$yRestricted
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.gif?subformat=icon$$xicon$$yRestricted
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.jpg?subformat=icon-1440$$xicon-1440$$yRestricted
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.jpg?subformat=icon-180$$xicon-180$$yRestricted
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.jpg?subformat=icon-640$$xicon-640$$yRestricted
000830014 8564_ $$uhttps://juser.fz-juelich.de/record/830014/files/1-s2.0-S0924271615001938-main.pdf?subformat=pdfa$$xpdfa$$yRestricted
000830014 909CO $$ooai:juser.fz-juelich.de:830014$$pVDB
000830014 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)145906$$aForschungszentrum Jülich$$b1$$kFZJ
000830014 9131_ $$0G:(DE-HGF)POF3-582$$1G:(DE-HGF)POF3-580$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lKey Technologies for the Bioeconomy$$vPlant Science$$x0
000830014 915__ $$0StatID:(DE-HGF)0420$$2StatID$$aNationallizenz
000830014 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline
000830014 915__ $$0StatID:(DE-HGF)0100$$2StatID$$aJCR$$bISPRS J PHOTOGRAMM : 2015
000830014 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS
000830014 915__ $$0StatID:(DE-HGF)0600$$2StatID$$aDBCoverage$$bEbsco Academic Search
000830014 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bASC
000830014 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bThomson Reuters Master Journal List
000830014 915__ $$0StatID:(DE-HGF)0111$$2StatID$$aWoS$$bScience Citation Index Expanded
000830014 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection
000830014 915__ $$0StatID:(DE-HGF)1160$$2StatID$$aDBCoverage$$bCurrent Contents - Engineering, Computing and Technology
000830014 915__ $$0StatID:(DE-HGF)9900$$2StatID$$aIF < 5
000830014 920__ $$lyes
000830014 9201_ $$0I:(DE-Juel1)IBG-2-20101118$$kIBG-2$$lPflanzenwissenschaften$$x0
000830014 980__ $$ajournal
000830014 980__ $$aVDB
000830014 980__ $$aI:(DE-Juel1)IBG-2-20101118
000830014 980__ $$aUNRESTRICTED