Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Niko Viljanen is active.

Publication


Featured research published by Niko Viljanen.


Remote Sensing | 2015

Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level

R. Näsi; Eija Honkavaara; Päivi Lyytikäinen-Saarenmaa; Minna Blomqvist; Paula Litkey; Teemu Hakala; Niko Viljanen; Tuula Kantola; Topi-Mikko Tapio Tanhuanpää; Markus Holopainen

Low-cost, miniaturized hyperspectral imaging technology is becoming available for small unmanned aerial vehicle (UAV) platforms. This technology can be efficient in carrying out small-area inspections of anomalous reflectance characteristics of trees at a very high level of detail. The increased frequency and intensity of insect-induced forest disturbance has established a new demand for effective methods suitable for mapping and monitoring tasks. In this investigation, a novel miniaturized hyperspectral frame imaging sensor operating in the wavelength range of 500–900 nm was used to identify mature Norway spruce (Picea abies L. Karst.) trees at different phases of infestation by the European spruce bark beetle (Ips typographus L.). We developed a new processing method for analyzing spectral characteristics of high spatial resolution photogrammetric and hyperspectral images in forested environments, as well as for identifying individual anomalous trees. The dense point clouds, measured using image matching, enabled detection of single trees with an accuracy of 74.7%. We classified the trees into classes of healthy, infested and dead, and the results were promising. The best overall accuracy was 76% (Cohen’s kappa 0.60) when using three classes (healthy, infested, dead). For two classes (healthy, dead), the best overall accuracy was 90% (kappa 0.80). The survey methodology based on high-resolution hyperspectral imaging will be of high practical value for forest health management, indicating the status of a bark beetle outbreak in time.
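The accuracy figures quoted above (overall accuracy together with Cohen's kappa) can both be derived from a single confusion matrix. A minimal numpy sketch, using a hypothetical 3-class (healthy / infested / dead) matrix rather than the study's actual data:

```python
import numpy as np

def overall_accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total  # fraction on the diagonal
    # Chance agreement: product of marginal class frequencies.
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Hypothetical counts, for illustration only:
cm = [[30, 5, 1],
      [6, 20, 4],
      [1, 3, 30]]
oa, kappa = overall_accuracy_and_kappa(cm)
```

Kappa discounts the agreement expected by chance, which is why it is lower than the raw overall accuracy.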


Remote Sensing | 2017

Individual Tree Detection and Classification with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging

Olli Nevalainen; Eija Honkavaara; Sakari Tuominen; Niko Viljanen; Teemu Hakala; Xiaowei Yu; Juha Hyyppä; Heikki Saari; Ilkka Pölönen; Nilton Nobuhiro Imai; Antonio Maria Garcia Tommaselli

Small unmanned aerial vehicle (UAV) based remote sensing is a rapidly evolving technology. Novel sensors and methods are entering the market, offering completely new possibilities to carry out remote sensing tasks. Three-dimensional (3D) hyperspectral remote sensing is a novel and powerful technology that has recently become available to small UAVs. This study investigated the performance of UAV-based photogrammetry and hyperspectral imaging in individual tree detection and tree species classification in boreal forests. Data from eleven test sites with 4151 reference trees representing various tree species and developmental stages were collected in June 2014, using a UAV remote sensing system equipped with a frame-format hyperspectral camera and an RGB camera, in highly variable weather conditions. Dense point clouds were measured photogrammetrically by automatic image matching using high-resolution RGB images with a 5 cm point interval. Spectral features were obtained from the hyperspectral image blocks, the large radiometric variation of which was compensated for by using a novel approach based on radiometric block adjustment with the support of in-flight irradiance observations. Spectral and 3D point cloud features were used in the classification experiment with various classifiers. The best results were obtained with Random Forest and Multilayer Perceptron (MLP), which both gave a 95% overall accuracy and an F-score of 0.93. Accuracy of individual tree identification from the photogrammetric point clouds varied between 40% and 95%, depending on the characteristics of the area. Challenges in reference measurements might also have reduced these numbers. Results were promising, indicating that hyperspectral 3D remote sensing was operational from a UAV platform even in very difficult conditions. These novel methods are expected to provide a powerful tool for automating various environmental close-range remote sensing tasks in the very near future.
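The F-score reported above is the harmonic mean of precision and recall. A short sketch with hypothetical true-positive, false-positive and false-negative counts (not counts from the study):

```python
def f_score(tp: int, fp: int, fn: int) -> float:
    """F-score (harmonic mean of precision and recall) from raw counts."""
    precision = tp / (tp + fp)  # fraction of predictions that were correct
    recall = tp / (tp + fn)     # fraction of reference trees that were found
    return 2 * precision * recall / (precision + recall)

# Illustrative counts only:
score = f_score(tp=90, fp=8, fn=6)
```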


IEEE Transactions on Geoscience and Remote Sensing | 2016

Remote Sensing of 3-D Geometry and Surface Moisture of a Peat Production Area Using Hyperspectral Frame Cameras in Visible to Short-Wave Infrared Spectral Ranges Onboard a Small Unmanned Airborne Vehicle (UAV)

Eija Honkavaara; Matti Eskelinen; Ilkka Pölönen; Heikki Saari; Harri Ojanen; Rami Mannila; Christer Holmlund; Teemu Hakala; Paula Litkey; Tomi Rosnell; Niko Viljanen; Merja Pulkkanen

Miniaturized hyperspectral imaging sensors are becoming available to small unmanned airborne vehicle (UAV) platforms. Imaging concepts based on frame format offer an attractive alternative to conventional hyperspectral pushbroom scanners because they enable enhanced processing and interpretation potential by allowing for acquisition of the 3-D geometry of the object and multiple object views together with the hyperspectral reflectance signatures. The objective of this investigation was to study the performance of novel visible and near-infrared (VNIR) and short-wave infrared (SWIR) hyperspectral frame cameras based on a tunable Fabry-Pérot interferometer (FPI) in measuring a 3-D digital surface model and the surface moisture of a peat production area. UAV image blocks were captured with ground sample distances (GSDs) of 15, 9.5, and 2.5 cm with the SWIR, VNIR, and consumer RGB cameras, respectively. Georeferencing showed consistent behavior, with accuracy levels better than GSD for the FPI cameras. The best accuracy in moisture estimation was obtained when using the reflectance difference of the SWIR band at 1246 nm and of the VNIR band at 859 nm, which gave a root mean square error (rmse) of 5.21 pp (pp is the mass fraction in percentage points) and a normalized rmse of 7.61%. The results are encouraging, indicating that UAV-based remote sensing could significantly improve the efficiency and environmental safety aspects of peat production.
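The moisture estimation described above uses the difference between two reflectance bands as the predictor in a regression. A minimal sketch of that idea, assuming a simple least-squares line; the band names follow the abstract (SWIR 1246 nm, VNIR 859 nm) but all values are made up for illustration:

```python
import numpy as np

# Illustrative per-plot reflectances and reference moistures (mass fraction
# in percentage points, pp); NOT data from the paper.
r_swir_1246 = np.array([0.30, 0.26, 0.22, 0.18, 0.14])
r_vnir_859  = np.array([0.35, 0.34, 0.33, 0.32, 0.31])
moisture_pp = np.array([20.0, 30.0, 40.0, 50.0, 60.0])

diff = r_swir_1246 - r_vnir_859                # band-difference predictor
slope, intercept = np.polyfit(diff, moisture_pp, 1)  # least-squares line
predicted = slope * diff + intercept
rmse = np.sqrt(np.mean((predicted - moisture_pp) ** 2))   # error in pp
nrmse = rmse / (moisture_pp.max() - moisture_pp.min())    # normalized RMSE
```

Because water absorbs strongly in the SWIR, wetter peat tends to lower the 1246 nm reflectance relative to the 859 nm band, which is what makes the difference a useful predictor.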


Remote Sensing | 2018

Assessment of Classifiers and Remote Sensing Features of Hyperspectral Imagery and Stereo-Photogrammetric Point Clouds for Recognition of Tree Species in a Forest Area of High Species Diversity

Sakari Tuominen; R. Näsi; Eija Honkavaara; Andras Balazs; Teemu Hakala; Niko Viljanen; Ilkka Pölönen; Heikki Saari; Harri Ojanen

Recognition of tree species and geospatial information on tree species composition is essential for forest management. In this study, tree species recognition was examined using hyperspectral imagery from visible to near-infrared (VNIR) and short-wave infrared (SWIR) camera sensors in combination with a 3D photogrammetric canopy surface model based on RGB camera stereo-imagery. An arboretum with a diverse selection of 26 tree species from 14 genera was used as a test area. Aerial hyperspectral imagery and high spatial resolution photogrammetric color imagery were acquired from the test area using unmanned aerial vehicle (UAV) borne sensors. Hyperspectral imagery was processed to calibrated reflectance mosaics and was tested along with the mosaics based on original image digital number values (DN). Two alternative classifiers were tested for predicting the tree species and genus, as well as for selecting an optimal set of remote sensing features for this task: a k nearest neighbor (k-nn) method combined with a genetic algorithm, and a random forest method. The combination of VNIR, SWIR, and 3D features performed better than any of the data sets individually. Furthermore, the calibrated reflectance values performed better compared to uncorrected DN values. These trends were similar with both tested classifiers. Of the classifiers, the k-nn combined with the genetic algorithm provided consistently better results than the random forest algorithm. The best result was thus achieved using calibrated reflectance features from VNIR and SWIR imagery together with 3D point cloud features; the proportion of correctly classified trees was 0.823 for tree species and 0.869 for tree genus.


Sensors | 2018

Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization

Teemu Hakala; Lauri Markelin; Eija Honkavaara; Barry Scott; Theo Theocharous; Olli Nevalainen; R. Näsi; Juha Suomalainen; Niko Viljanen; Claire Greenwell; Nigel P. Fox

Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common as they provide more abundant information of the object compared to traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processes. Absolute calibration of the sensor provides a possibility for physical modelling of the imaging process and enables efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing, based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications as it does not require in situ reflectance panels for converting the sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tunable Fabry-Pérot interferometer (FPI) based hyperspectral camera at the National Physical Laboratory (NPL) (Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of the different channels was within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels that were due to the multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK).
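The direct reflectance principle described above can be sketched in one line: with an absolutely calibrated camera giving at-sensor radiance L and a simultaneous downwelling irradiance measurement E, the reflectance factor of a Lambertian target is R = pi * L / E, with no reference panel needed. The per-band values below are illustrative, not the paper's measurements:

```python
import numpy as np

# At-sensor radiance per band (W m^-2 sr^-1 nm^-1) and downwelling
# irradiance per band (W m^-2 nm^-1); illustrative numbers only.
L = np.array([0.030, 0.045, 0.060])
E = np.array([1.20, 1.10, 0.95])

# Hemispherical-directional reflectance factor, assuming a Lambertian target.
R = np.pi * L / E
```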


Remote Sensing | 2018

Estimating Biomass and Nitrogen Amount of Barley and Grass Using UAV and Aircraft Based Spectral and Photogrammetric 3D Features

R. Näsi; Niko Viljanen; Jere Kaivosoja; Katja Alhonoja; Teemu Hakala; Lauri Markelin; Eija Honkavaara

The timely estimation of crop biomass and nitrogen content is a crucial step in various tasks in precision agriculture, for example in fertilization optimization. Remote sensing using drones and aircraft offers a feasible tool to carry out this task. Our objective was to develop and assess a methodology for crop biomass and nitrogen estimation, integrating spectral and 3D features that can be extracted using airborne miniaturized multispectral, hyperspectral and colour (RGB) cameras. We used Random Forest (RF) as the estimator, and in addition Simple Linear Regression (SLR) was used to validate the consistency of the RF results. The method was assessed with empirical datasets captured over a barley field and a grass silage trial site using a hyperspectral camera based on the Fabry-Pérot interferometer (FPI) and a regular RGB camera onboard a drone and an aircraft. Agricultural reference measurements included fresh yield (FY), dry matter yield (DMY) and amount of nitrogen. In DMY estimation of barley, the Pearson Correlation Coefficient (PCC) and the normalized Root Mean Square Error (RMSE%) were at best 0.95 and 33.2%, respectively; and in the grass DMY estimation, the best results were 0.79 and 1.9%, respectively. In the nitrogen amount estimations of barley, the PCC and RMSE% were at best 0.97 and 21.6%, respectively. In the biomass estimation, the best results were obtained when integrating hyperspectral and 3D features, but the integration of RGB images and 3D features also provided results that were almost as good. In nitrogen content estimation, the hyperspectral camera gave the best results. We concluded that the integration of spectral and high spatial resolution 3D features and radiometric calibration was necessary to optimize the accuracy.
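The two accuracy measures quoted above are straightforward to compute from reference and estimated values. A minimal sketch, assuming RMSE% is the RMSE normalized by the mean of the reference values; the yield numbers are hypothetical, not data from the paper:

```python
import numpy as np

def pcc_and_rmse_percent(reference, estimate):
    """Pearson correlation coefficient and RMSE normalized by the
    reference mean (RMSE%)."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    pcc = np.corrcoef(reference, estimate)[0, 1]
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    return pcc, 100.0 * rmse / reference.mean()

# Hypothetical dry matter yields (e.g. t/ha), for illustration only:
ref = [2.0, 3.0, 4.0, 5.0, 6.0]
est = [2.2, 2.9, 4.3, 4.8, 6.1]
pcc, rmse_pct = pcc_and_rmse_percent(ref, est)
```

Note that PCC is a dimensionless correlation in [-1, 1], while RMSE% carries the percent sign, which is why a value like "0.95 and 33.2%" pairs a plain coefficient with a percentage.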


Urban Forestry & Urban Greening | 2018

Remote sensing of bark beetle damage in urban forests at individual tree level using a novel hyperspectral camera from UAV and aircraft

R. Näsi; Eija Honkavaara; Minna Blomqvist; Päivi Lyytikäinen-Saarenmaa; Teemu Hakala; Niko Viljanen; Tuula Kantola; Markus Holopainen


ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences | 2014

Autonomous hyperspectral UAS photogrammetry for environmental monitoring applications

Eija Honkavaara; Teemu Hakala; L. Markelin; Anttoni Jaakkola; Heikki Saari; Harri Ojanen; Ilkka Pölönen; Sakari Tuominen; R. Näsi; Tomi Rosnell; Niko Viljanen


ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences | 2016

UAS based tree species identification using the novel FPI based hyperspectral cameras in visible, NIR and SWIR spectral ranges

R. Näsi; Eija Honkavaara; Sakari Tuominen; Heikki Saari; Ilkka Pölönen; Teemu Hakala; Niko Viljanen; J. Soukkamäki; I. Näkki; Harri Ojanen; J. Reinikainen


Silva Fennica | 2017

Hyperspectral UAV-imagery and photogrammetric canopy height model in estimating forest stand variables

Sakari Tuominen; Andras Balazs; Eija Honkavaara; Ilkka Pölönen; Heikki Saari; Teemu Hakala; Niko Viljanen

Collaboration


Dive into Niko Viljanen's collaborations.

Top Co-Authors

Eija Honkavaara (Finnish Geodetic Institute)
Teemu Hakala (Finnish Geodetic Institute)
R. Näsi (Finnish Geodetic Institute)
Heikki Saari (VTT Technical Research Centre of Finland)
Ilkka Pölönen (Information Technology University)
Harri Ojanen (VTT Technical Research Centre of Finland)
Lauri Markelin (Finnish Geodetic Institute)
Sakari Tuominen (Finnish Forest Research Institute)
Jere Kaivosoja (VTT Technical Research Centre of Finland)
Tomi Rosnell (Finnish Geodetic Institute)