Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jonas Nygårds is active.

Publication


Featured research published by Jonas Nygårds.


International Conference on Robotics and Automation | 2008

C-SAM: Multi-Robot SLAM using square root information smoothing

Lars A. A. Andersson; Jonas Nygårds

This paper presents collaborative smoothing and mapping (C-SAM) as a viable approach to the multi-robot map-alignment problem. The method enables a team of robots to build joint maps with or without initial knowledge of their relative poses. To accomplish simultaneous localization and mapping, it uses square root information smoothing (SRIS). In contrast to traditional extended Kalman filter (EKF) methods, the smoothing does not discard any information and is therefore better equipped to deal with non-linear process and measurement models. The proposed method does not require the collaborating robots to have initial correspondence. The key contribution of this work is an optimal smoothing algorithm for merging maps that are created by different robots, independently or in groups. The method not only joins maps from different robots, it also recovers the complete trajectory of each robot involved in the map joining. It is also shown how data association between duplicate features is performed and how this reduces uncertainty in the complete map. Two simulated scenarios are presented in which the C-SAM algorithm is applied to two individually created maps: one simply joins the maps into a larger map, while the other demonstrates sensor extension.
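
The computational core of SRIS-style smoothing is a sparse least-squares problem over all poses and landmarks, solved by matrix factorization rather than filtering. As a rough illustration only (a toy 1-D problem, not the authors' C-SAM code), the sketch below stacks whitened odometry and landmark constraints into a Jacobian and recovers the full trajectory with a QR factorization, the "square root" of the information matrix:

    import numpy as np

    # Toy 1-D smoothing problem: states = [x0, x1, x2, l] (three poses, one landmark).
    # Each row of A encodes one measurement, whitened by its standard deviation.
    rows, rhs = [], []

    def add_constraint(jac, meas, sigma):
        rows.append(np.asarray(jac, dtype=float) / sigma)
        rhs.append(meas / sigma)

    # Prior anchoring the first pose at 0.
    add_constraint([1, 0, 0, 0], 0.0, 0.01)
    # Odometry: x1 - x0 ~ 1.0, x2 - x1 ~ 1.0
    add_constraint([-1, 1, 0, 0], 1.0, 0.1)
    add_constraint([0, -1, 1, 0], 1.0, 0.1)
    # Landmark range from x0 and from x2 (duplicate feature gives an implicit loop closure).
    add_constraint([-1, 0, 0, 1], 2.5, 0.2)
    add_constraint([0, 0, -1, 1], 0.6, 0.2)

    A = np.vstack(rows)
    b = np.array(rhs)

    # "Square root information" solve: factor A = QR and back-substitute,
    # instead of forming the information matrix A^T A explicitly.
    Q, R = np.linalg.qr(A)
    x = np.linalg.solve(R, Q.T @ b)
    print("poses and landmark estimate:", x)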


International Conference on Information Fusion | 2010

Crowd behavior analysis under cameras network fusion using probabilistic methods

Paulo Drews; João Quintas; Jorge Dias; Maria Andersson; Jonas Nygårds; Joakim Rydell

The use of cameras in surveillance has increased in recent years due to the low cost of the sensors and the demand for surveillance of public places. However, manual analysis of this data is impracticable, so automatic and robust methods for processing such large quantities of data are required. This paper proposes a framework to address the problem. Crowd analysis is performed on information from a camera network using optical flow, and hidden Markov models and Bayesian networks are compared for understanding the behavior of agents in the scene. Experimental results are obtained for several sequences in which fights and robberies occur. The results are a promising step toward an automatic system for detecting abnormal events.
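
To illustrate the kind of sequence classification the paper compares, here is a minimal sketch of a discrete-observation hidden Markov model evaluated with the scaled forward algorithm. The quantized optical-flow symbols and all model parameters are invented placeholders, not the trained models from the paper:

    import numpy as np

    def hmm_log_likelihood(obs, pi, A, B):
        """Scaled forward algorithm: log P(obs | HMM) for discrete observations."""
        alpha = pi * B[:, obs[0]]
        log_lik = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            c = alpha.sum()
            log_lik += np.log(c)
            alpha /= c
        return log_lik

    # Hypothetical 2-state models over 3 quantized optical-flow levels (low/med/high).
    pi = np.array([0.5, 0.5])
    A  = np.array([[0.9, 0.1], [0.1, 0.9]])
    B_calm  = np.array([[0.8, 0.15, 0.05], [0.6, 0.3, 0.1]])
    B_fight = np.array([[0.1, 0.3, 0.6], [0.2, 0.3, 0.5]])

    obs = [2, 2, 1, 2, 2, 2]        # mostly high flow magnitudes
    ll_fight = hmm_log_likelihood(obs, pi, A, B_fight)
    ll_calm  = hmm_log_likelihood(obs, pi, A, B_calm)
    print("classified as:", "fight" if ll_fight > ll_calm else "calm")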


Intelligent Robots and Systems | 2006

Concurrent Path and Sensor Planning for a UAV - Towards an Information Based Approach Incorporating Models of Environment and Sensor

Per Skoglar; Jonas Nygårds; Morgan Ulvklo

In this paper we propose a framework for autonomous path and sensor planning of a UAV performing a surveillance and exploration mission. Concurrent path and sensor planning is a very challenging problem, since realistic models of platforms, sensors, and the environment are complex and inherently non-linear. Our approach is one attempt to address these issues and formulate an optimization problem using tools from information theory.
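
One common way to turn such a formulation into code is to score candidate sensor positions by the expected information gain about a target estimate. The sketch below does this for a single linearized range measurement and a Gaussian target prior; it is only an assumed stand-in for the criterion developed in the paper:

    import numpy as np

    def expected_info_gain(p_sensor, target_mean, P_prior, sigma_r=2.0):
        """Entropy reduction (nats) of a Gaussian target estimate after one
        range measurement, linearized about the current estimate (EKF-style)."""
        d = target_mean - p_sensor
        r = np.linalg.norm(d)
        H = (d / r).reshape(1, 2)            # Jacobian of range w.r.t. target position
        info_prior = np.linalg.inv(P_prior)
        info_post  = info_prior + H.T @ H / sigma_r**2
        return 0.5 * (np.linalg.slogdet(info_post)[1] - np.linalg.slogdet(info_prior)[1])

    target_mean = np.array([50.0, 20.0])
    P_prior = np.diag([100.0, 9.0])          # uncertainty mostly along x (assumed)
    candidates = [np.array([0.0, 20.0]), np.array([50.0, -40.0]), np.array([50.0, 80.0])]

    gains = [expected_info_gain(c, target_mean, P_prior) for c in candidates]
    best = candidates[int(np.argmax(gains))]
    print("gains:", gains, "-> best viewpoint:", best)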


Intelligent Robots and Systems | 2002

Robust and efficient tracking in image sequences using a Kalman filter and an affine motion model

Henrik Karlsson; Jonas Nygårds

This paper summarizes an existing tracking algorithm and presents an approach to improving it by incorporating a Kalman filter. The aim is to develop a lightweight, robust, real-time tracking system to be used on an experimental geo-referenced camera platform. The result is a more robust, occlusion-resistant tracking algorithm, demonstrated on several tracking scenarios.
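
A minimal sketch of the filtering idea, assuming a constant-velocity model for an image-plane point and made-up noise parameters (the paper's affine motion model and tuning are not reproduced here). Skipping the update step when a measurement is missing is one simple way to ride out short occlusions:

    import numpy as np

    dt = 1.0                      # frame period (assumed)
    # Constant-velocity model for an image-plane point: state = [u, v, du, dv].
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = 0.5 * np.eye(4)           # process noise (tuning value, assumed)
    R = 4.0 * np.eye(2)           # measurement noise in pixels^2 (assumed)

    def kf_step(x, P, z):
        x = F @ x                 # predict
        P = F @ P @ F.T + Q
        if z is not None:         # update; skip when the matcher reports occlusion
            y = z - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(4) - K @ H) @ P
        return x, P

    x, P = np.array([100.0, 80.0, 0.0, 0.0]), 25.0 * np.eye(4)
    for z in [np.array([102.0, 81.5]), None, np.array([106.0, 84.0])]:
        x, P = kf_step(x, P, z)
    print("tracked state [u, v, du, dv]:", x)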


Intelligent Robots and Systems | 1998

On covariances for fusing laser rangers and vision with sensors onboard a moving robot

Jonas Nygårds; Åke Wernersson

Consider a robot intended to measure or operate on man-made objects randomly located in its workspace. The optronic sensors onboard the robot are a scanning time-of-flight laser rangefinder and a CCD camera. The goal of the paper is to give explicit covariance matrices for the geometric primitives extracted from the surrounding workspace. Emphasis is on the correlation properties of the stochastic error models during motion. Topics studied include: (i) covariance of Radon/Hough peaks for plane surfaces; (ii) covariances for the intersection of two planes; (iii) equations for combining vision features, plane surfaces and range discontinuities; and (iv) explicit equations for how the covariance matrices are transformed during robot motion. Typical applications are verification and updating of CAD models when navigating inside buildings and industrial plants, and accumulating sensor readings for a telecommanded robot.
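
Item (iv), propagating a feature covariance through robot motion, is essentially a first-order (Jacobian) transformation. The sketch below shows that step for a 2-D point feature and an uncertain planar pose; the specific covariances are placeholders and the formulation is a generic EKF-style approximation, not the paper's explicit derivation:

    import numpy as np

    def transform_feature(p, Sigma_p, pose, Sigma_pose):
        """First-order propagation of a 2-D feature covariance through an
        uncertain planar motion pose = (x, y, theta)."""
        x, y, th = pose
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        p_new = R @ p + np.array([x, y])

        J_p = R                                            # d p_new / d p
        J_pose = np.array([[1, 0, -s * p[0] - c * p[1]],
                           [0, 1,  c * p[0] - s * p[1]]])  # d p_new / d pose
        return p_new, J_p @ Sigma_p @ J_p.T + J_pose @ Sigma_pose @ J_pose.T

    p = np.array([2.0, 1.0])
    Sigma_p = np.diag([0.02, 0.05])                         # feature covariance (assumed)
    pose = (1.0, 0.5, np.deg2rad(30))
    Sigma_pose = np.diag([0.01, 0.01, np.deg2rad(2)**2])    # odometry uncertainty (assumed)
    p_new, Sigma_new = transform_feature(p, Sigma_p, pose, Sigma_pose)
    print(p_new, "\n", Sigma_new)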


IFAC Proceedings Volumes | 1993

Change Detection in Natural Scenes Using Laser Range Measurements from a Mobile Robot

P.L. Klöör; P. Lunquist; P. Ohlsson; Jonas Nygårds; Åke Wernersson

The problem studied in this paper is to detect changes in both indoor scenes and natural outdoor scenes. The sensor is a scanning range-measuring laser mounted on a mobile robot. The system should have the capability to detect when objects have been moved, have disappeared, or when new objects have entered the area to be monitored. The mobile robot must also be able to navigate despite changes in the scene. The paper describes a method for simultaneous change detection and pose estimation. The algorithms are presently being tested on outdoor scenes. The key tool is the distance transform.
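
As a rough illustration of using a distance transform for change detection (not the authors' implementation), the sketch below builds a reference occupancy grid, computes the Euclidean distance from every cell to the nearest previously observed obstacle with scipy, and flags new scan points that land too far from the reference structure:

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    # Reference map: 1 = free space, 0 = previously observed obstacle cell.
    ref = np.ones((100, 100), dtype=np.uint8)
    ref[40:60, 70] = 0                       # a wall segment seen in the reference scan

    # Distance (in cells) from every cell to the nearest reference obstacle.
    dist_to_ref = distance_transform_edt(ref)

    # New laser scan converted to grid cells (hypothetical points).
    scan_cells = np.array([[50, 70], [55, 70], [20, 20]])   # last point is a new object

    cell_size = 0.1                          # metres per cell (assumed)
    threshold = 0.5                          # metres
    d = dist_to_ref[scan_cells[:, 0], scan_cells[:, 1]] * cell_size
    changed = d > threshold
    print("changed points:", scan_cells[changed])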


International Conference on Robotics and Automation | 1992

On sensor feedback for gripping an object within prescribed posture tolerances

Åke Wernersson; Bengt Boberg; Bernt Nilsson; Jonas Nygårds; T. Rydberg

The problem addressed is sensor feedback for guiding a robot to the correct gripping point, arriving within prescribed tolerances. The solution is divided into three parts: the extraction of posture parameters from a range camera, dynamic filtering to reduce spurious responses, and a linear quadratic Gaussian (LQG) designed feedback law to guide the robot to the correct gripping point. The prescribed tolerances were reached within a reasonable time. For transparent results, a mobile robot with three state variables was considered. This is a first simple case toward a unified theory of tolerance-controlled gripping.
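
A hedged sketch of the third part, the LQG-style feedback law: a steady-state LQR gain is computed by iterating the discrete Riccati equation for an assumed three-state approach model (the model, the weights, and the omission of the estimator are all simplifications of the paper's setup):

    import numpy as np

    # Placeholder linearized approach model (assumed, not the paper's model):
    # state = [along-track error, cross-track error, heading error],
    # inputs = [speed correction, turn rate], linearized about nominal speed v0.
    dt, v0 = 0.1, 0.5
    A = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, dt * v0],
                  [0.0, 0.0, 1.0]])
    B = np.array([[dt, 0.0],
                  [0.0, 0.0],
                  [0.0, dt]])
    Q  = np.diag([10.0, 10.0, 1.0])     # posture-tolerance weights (assumed)
    Rw = np.diag([1.0, 1.0])            # control-effort weights (assumed)

    # Steady-state LQR gain via discrete Riccati iteration.
    P = Q.copy()
    for _ in range(300):
        K = np.linalg.solve(Rw + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)

    x = np.array([0.4, 0.2, 0.3])       # initial posture error (m, m, rad)
    for _ in range(100):
        x = (A - B @ K) @ x             # in the paper this would act on filtered estimates
    print("posture error after 10 s:", x)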


International Conference on Robotics and Automation | 1996

Sensor motion planning with active uncertainty reduction: gripping and docking tasks

Bernt Nilsson; Jonas Nygårds

This paper deals with planning the motion of a robot approaching a box-shaped object using a range sensor. The plan is given as the minimizing solution of a criterion approximating the expectation of the quadratic error of the final position. The expectation is calculated using Gaussian sum approximations of the distributions of the stochastic variables involved. An approximation of the information received by a range sensor measuring a box is also presented. The resulting criterion is minimized over the sensor positions by a gradient descent algorithm, yielding sub-optimal solutions. Even though the solutions are only sub-optimal, they display both probing and cautious behavior reflecting the uncertainties involved. As new observations become available, the future sensor positions are replanned with the horizon reduced by one step, adjusting the solution with the updated position estimate and covariance.
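
The sketch below mimics the overall structure, gradient descent over the next sensor position to minimize an expected-error criterion, but with a single-Gaussian proxy (posterior covariance trace after one linearized range measurement) instead of the paper's Gaussian-sum expectation; all values are assumed:

    import numpy as np

    P_prior = np.diag([0.5, 0.1])           # prior covariance of the box position (assumed)
    box = np.array([3.0, 1.0])              # current estimate of the box centre
    sigma_r = 0.05                          # range-noise standard deviation (assumed)

    def expected_cost(sensor_pos):
        """Proxy for the expected quadratic final-position error: the trace of the
        posterior covariance after one linearized range measurement, plus a small
        travel penalty."""
        d = box - sensor_pos
        r = max(np.linalg.norm(d), 1e-6)
        H = (d / r).reshape(1, 2)
        P_post = np.linalg.inv(np.linalg.inv(P_prior) + H.T @ H / sigma_r**2)
        return np.trace(P_post) + 0.01 * np.linalg.norm(sensor_pos)

    # Plain numerical gradient descent over the next sensor position.
    pos, step, eps = np.array([0.0, 0.0]), 0.5, 1e-4
    for _ in range(200):
        grad = np.array([(expected_cost(pos + eps * e) - expected_cost(pos - eps * e))
                         / (2 * eps) for e in np.eye(2)])
        pos = pos - step * grad
    print("planned sensor position:", pos)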


International Conference on Indoor Positioning and Indoor Navigation | 2014

Cooperative localization using a foot-mounted inertial navigation system and ultrawideband ranging

Fredrik Olsson; Jouni Rantakokko; Jonas Nygårds

This paper aims to evaluate the performance gains that can be obtained by introducing cooperative localization in an indoor firefighter localization system, through the use of scenario-based simulations. Robust and accurate indoor localization for firefighters is a problem that is not yet resolved. Foot-mounted inertial navigation systems (INS) have been examined for first responder localization, but they have an accumulating position error that grows over time. By using ultrawideband (UWB) ranging between the firefighters and combining range measurements with position and uncertainty estimates from the foot-mounted INS via a cooperative localization approach it is possible to reduce the position error significantly. An error model for the position estimates received from single and dual foot-mounted INS is proposed based on experimental results, and it contains a scaling error which depends on the distance travelled and a heading error which grows linearly over time. The position error for dead-reckoning systems depends on the type of movement. Similarly, an error model for the UWB range measurements was designed where the range measurements experience a bias and variance, which is determined by the number of walls between the transmitter and receiver. By implementing these error models in a scenario-based simulation environment it is possible to evaluate the performance gain of different cooperative localization algorithms. A centralized extended Kalman Filter (EKF) algorithm has been implemented, and the position accuracy and heading improvements are evaluated over a smoke diving operation scenario. The cooperative localization scheme reduces the position errors by up to 70% in a scenario where a three-person smoke diver team performs a search and rescue operation.
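
The centralized EKF fusion step can be sketched as a single range update on the stacked state of two agents; the state, covariances, noise level, and the constant wall bias below are all assumed values, not those identified in the paper:

    import numpy as np

    # Joint state: planar positions of two firefighters, [x1, y1, x2, y2].
    x = np.array([10.0, 2.0, 4.0, 8.0])
    P = np.diag([4.0, 4.0, 1.0, 1.0])       # INS position uncertainties (assumed)

    def uwb_range_update(x, P, z, sigma=0.3, bias=0.5):
        """Centralized EKF update with one UWB inter-agent range measurement.
        'bias' stands in for the wall-dependent bias model described in the paper."""
        d = x[0:2] - x[2:4]
        r = np.linalg.norm(d)
        h = r + bias                                     # predicted (biased) range
        H = np.hstack([d / r, -d / r]).reshape(1, 4)     # Jacobian of range w.r.t. state
        S = H @ P @ H.T + sigma**2
        K = (P @ H.T) / S
        x_new = x + (K * (z - h)).ravel()
        P_new = (np.eye(4) - K @ H) @ P
        return x_new, P_new

    z = 9.3                                              # measured range through one wall (m)
    x, P = uwb_range_update(x, P, z)
    print(x, "\n", np.diag(P))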


Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications Conference | 2004

Image processing and sensor management for autonomous UAV surveillance

Morgan Ulvklo; Jonas Nygårds; Jorgen M. Karlholm; Per Skoglar

This paper describes a framework for image processing and sensor management for an autonomous unmanned airborne surveillance system equipped with infrared and video sensors. Our working hypothesis is that integrating the detection-tracking-classification chain with spatial awareness makes intelligent autonomous data acquisition possible by means of active sensor control. A central part of the framework is a surveillance scene representation suitable for target tracking, geolocation, and sensor data fusion involving multiple platforms. The representation, based on simultaneous localization and mapping (SLAM), takes into account uncertainties associated with sensor data, platform navigation, and prior knowledge. A client/server approach for online-adaptable surveillance missions is introduced. The presented system is designed to simultaneously and autonomously perform the following tasks: provide wide-area coverage from multiple viewpoints by means of a step-stare procedure, detect and track multiple stationary and moving ground targets, perform detailed analysis of detected regions of interest, and generate precise target coordinates by means of multi-view geolocation techniques.
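
As an illustration of multi-view geolocation in its simplest form (not the system's actual method), the sketch below triangulates a ground target from two bearing-only observations taken from different platform positions by a linear least-squares ray intersection; all positions and bearings are hypothetical:

    import numpy as np

    def triangulate(positions, bearings):
        """Least-squares intersection of 2-D bearing rays: each observation is a
        platform position and a bearing (rad) toward the target."""
        A = np.zeros((2, 2))
        b = np.zeros(2)
        for p, th in zip(positions, bearings):
            u = np.array([np.cos(th), np.sin(th)])
            N = np.eye(2) - np.outer(u, u)       # projector orthogonal to the ray
            A += N
            b += N @ p
        return np.linalg.solve(A, b)

    # Hypothetical ground-projected UAV positions and measured bearings to a target.
    positions = [np.array([0.0, 0.0]), np.array([200.0, 0.0])]
    bearings  = [np.deg2rad(45.0), np.deg2rad(135.0)]
    print("geolocated target:", triangulate(positions, bearings))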

Collaboration


Dive into Jonas Nygårds's collaborations.

Top Co-Authors

Åke Wernersson
Luleå University of Technology

Morgan Ulvklo
Swedish Defence Research Agency

Jouni Rantakokko
Swedish Defence Research Agency

Jorgen M. Karlholm
Swedish Defence Research Agency