The importance of vegetation structure and biomass in controlling land-atmosphere exchange is widely recognized, but measurements of canopy structure are challenging, time-consuming, and often rely on destructive methods. The Microsoft Kinect is an infrared sensor designed for video gaming that outputs synchronized color and depth images and has the potential to allow rapid characterization of vegetation structure. We compared depth images from a Kinect sensor with manual measurements of plant structure and size for two species growing in a California grassland. The depth images agreed well with manual horizontal and vertical measurements of plant size. Similarly, plant volumes calculated with a three-dimensional convex hull approach were well related to plant biomass. The Kinect showed some limitations for ecological observation associated with its short measurement range and daytime light contamination. Nonetheless, the Kinect's light weight, fast acquisition time, low power requirement, and low cost make it a promising tool for rapid field surveys of canopy structure, especially in small-statured vegetation.
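To illustrate the kind of processing the abstract describes, the sketch below shows one plausible way to estimate a plant's convex hull volume from a Kinect depth image: back-project depth pixels to a 3-D point cloud with a pinhole camera model, then take the volume of the cloud's convex hull (here via scipy.spatial.ConvexHull). This is not the authors' code; the intrinsics are nominal Kinect v1 values, the depth image is assumed to be in metres, and segmentation of the plant from its background is assumed to have been done beforehand (e.g., by cropping to the plant's pixel footprint).

```python
# Hedged sketch: convex hull volume of a plant from a Kinect-style depth image.
# Assumptions: depth in metres, nominal (uncalibrated) Kinect v1 intrinsics,
# and a depth image already segmented so that only plant pixels are nonzero.
import numpy as np
from scipy.spatial import ConvexHull


def depth_to_points(depth_m, fx=585.0, fy=585.0, cx=319.5, cy=239.5):
    """Back-project an (H, W) depth image in metres to an (N, 3) point cloud,
    dropping pixels with no depth return (zeros)."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    valid = depth_m > 0
    z = depth_m[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.column_stack([x, y, z])


def convex_hull_volume(points):
    """Volume (m^3) of the 3-D convex hull enclosing the point cloud."""
    return ConvexHull(points).volume


if __name__ == "__main__":
    # Synthetic stand-in for a segmented plant ~1.2 m from the sensor.
    rng = np.random.default_rng(0)
    depth = np.zeros((480, 640))
    depth[200:280, 280:360] = 1.2 + 0.05 * rng.random((80, 80))
    pts = depth_to_points(depth)
    print(f"points: {len(pts)}, hull volume: {convex_hull_volume(pts):.4f} m^3")
```

In practice the hull volume computed this way would then be regressed against harvested biomass, as the abstract's volume-biomass relationship suggests; the convex hull deliberately overestimates occupied space for open canopies, so the regression absorbs that bias.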