Hyperspectral Remote Sensing: Instrument Design, Field Campaigns and Data Analysis
Hyperspectral remote sensing is still a young field of research, but one attracting considerable interest, justified by its promising ability to detect and quantify targets. Recent developments in camera technology and single-board computers have enabled extremely capable hyperspectral imaging systems with low cost and small footprints. This thesis explores several topics of relevance for hyperspectral remote sensing, always with a strong practical component. My intent is that this document can serve as a guide for a journey into the world of spectral image sensing, analysis, and understanding. The main contributions are:

- Low-cost hyperspectral instrument design: a DIY instrument based on low-cost components and additive manufacturing technology is described. Such instruments can be valuable both as research and education tools.

- Integration of remote sensing instruments on UAVs: a practical guide focusing on a lightweight hyperspectral imaging payload with a push-broom imager, GPS, and an Inertial Measurement Unit (IMU), as well as data synchronization and acquisition systems.

- Algorithms for data enhancement: first, a method for separating the effects of shadows (de-shadowing) and other partially known changes in lighting conditions from the actual targets of the analysis, such as the effects of the physical, chemical, or biological properties of the ground, which are of interest. Second, a method for fusing co-registered image data of high spatial and low spectral resolution (e.g., RGB) with data of low spatial and high spectral resolution (hyperspectral). This is made possible by exploiting the overlap in the phenomena observed by the two cameras to build a model through least-squares projection.

- Algorithms for understandable data compression: a novel method and software system for rational handling of time series of multichannel measurements.
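The least-squares fusion idea can be sketched as follows. This is a minimal illustration, not the thesis implementation: the array shapes, variable names, and simulated data are assumptions, and real use would require co-registered imagery.

```python
# Sketch: fuse co-registered RGB (high spatial, low spectral resolution)
# with hyperspectral data (low spatial, high spectral resolution) by
# least-squares projection. All data below is simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Low-resolution scene: n_pixels observed by both cameras.
n_pixels, n_rgb, n_hs = 500, 3, 50
hs_low = rng.random((n_pixels, n_hs))         # hyperspectral pixels
rgb_low = hs_low @ rng.random((n_hs, n_rgb))  # co-registered RGB of same pixels

# Fit a linear model mapping RGB responses to hyperspectral spectra.
coef, *_ = np.linalg.lstsq(rgb_low, hs_low, rcond=None)

# Apply the model to high-spatial-resolution RGB pixels to estimate
# a high-spatial-resolution hyperspectral image.
rgb_high = rng.random((2000, n_rgb))
hs_high_est = rgb_high @ coef
```

The model is only as good as the spectral overlap between the two cameras: three RGB channels cannot resolve fifty bands exactly, so the projection recovers the part of the spectral variation that the RGB camera actually observes.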
This quantitative learning tool, the On-The-Fly Processing (OTFP), develops reduced-rank bilinear subspace models that summarize massive streams of multivariate responses, capturing the evolving covariation patterns among the many input variables over time and space. This enables considerable data compression without significant loss of useful systematic information.
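The core of a reduced-rank bilinear model can be sketched with a truncated SVD, approximating a data matrix X as scores times loadings, X ≈ T Pᵀ. This is a hedged illustration of the general technique, not the OTFP software itself, and the simulated stream below is an assumption.

```python
# Sketch: reduced-rank bilinear compression of a multichannel time series,
# X (time x channels) ~= T @ P.T with a few components. Illustrative only;
# OTFP additionally updates such models incrementally on streaming data.
import numpy as np

rng = np.random.default_rng(1)

# Simulate 1000 time samples of 200 correlated channels driven by 5 factors.
scores_true = rng.standard_normal((1000, 5))
loadings_true = rng.standard_normal((5, 200))
X = scores_true @ loadings_true + 0.01 * rng.standard_normal((1000, 200))

# Truncated SVD gives the best rank-5 bilinear approximation.
rank = 5
U, s, Vt = np.linalg.svd(X, full_matrices=False)
T = U[:, :rank] * s[:rank]   # scores: evolution over time
P = Vt[:rank].T              # loadings: covariation across channels

X_hat = T @ P.T
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)

# Storage drops from 1000*200 values to 1000*5 + 200*5,
# while the systematic covariation is retained.
```

Because the channels covary, storing the scores and loadings in place of the full matrix compresses the stream roughly twentyfold here while discarding mostly unsystematic noise.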