Publication


Featured research published by Jens Arnspang.


International Conference on Pattern Recognition | 2000

Mirror-based trinocular systems in robot-vision

Brian Kirstein Ramsgaard; Ivar Balslev; Jens Arnspang

A new variant of multi-ocular stereo vision has been developed. The system involves a single camera and two orthogonal planar mirrors. The resulting device is a low-cost, compact sensor, particularly suitable for depth determination in robot vision applications. The motivation for the work is the need for a sensor determining the spatial coordinates of a robot tool and the object to be processed. The system inherently possesses fewer calibration parameters and provides higher accuracy in depth determination than traditional two-camera stereo systems. A prototype of the new device was built, and test results are presented.
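
A standard way to see why mirrors supply extra views (a sketch, not taken from the paper itself): a planar mirror {x : n·x + d = 0} with unit normal n turns a camera with center C into a virtual camera with the reflected center

  C' = C - 2 (n·C + d) n

(and a left-right mirrored image), so a single camera facing two orthogonal mirrors behaves like a calibrated trinocular rig, and depth follows by ordinary triangulation of the real and virtual viewing rays.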


Computer Analysis of Images and Patterns | 1995

Using Mirror Cameras for Estimating Depth

Jens Arnspang; Henrik Nielsen; Morten H. Christensen; Knud Henriksen

A mirror camera arrangement attached to a conventional perspective camera is suggested for computing the depth of spatial points. The arrangement simulates a multi-camera setup in which all internal parameters are equal and a common cyclopean system is well defined. A linear system, from which spatial point coordinates may be determined, is derived. Computational experiments with actual mirror camera data are reported. The average relative error on estimated depth values was 0.6%.
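
As a complement, a minimal triangulation sketch (using NumPy; the function and variable names are illustrative and not the paper's linear system): once the real and virtual viewpoints are known, a spatial point can be recovered from its viewing rays by linear least squares.

import numpy as np

def triangulate(centres, directions):
    # Least-squares intersection of the rays x = c_i + t * d_i:
    # minimise the summed squared distances to all rays.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centres, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Illustration: one real camera at the origin and two hypothetical
# virtual (mirror) cameras, observing a point at depth 2 m.
centres = [np.array([0.0, 0.0, 0.0]),
           np.array([0.4, 0.0, 0.0]),
           np.array([0.0, 0.4, 0.0])]
point = np.array([0.1, 0.2, 2.0])
directions = [point - c for c in centres]   # ideal, noise-free rays
print(triangulate(centres, directions))     # ~ [0.1, 0.2, 2.0]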


Image and Vision Computing | 1993

Motion constraint equations based on constant image irradiance

Jens Arnspang

From the basic assumption of constant image irradiance used by Horn and Schunck to derive the classic motion constraint equation, infinite sets of independent motion constraint equations are derived, relating image irradiance with limited-order optic flow. Alternatives to the classic motion constraint equation are furthermore pointed out, relating optic flow with the length of iso-irradiance contours, area and volume measures, and their partial derivatives.
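
For reference, the classic constraint mentioned above follows from requiring the image irradiance I(x, y, t) to be constant along the motion of an image point (the standard Horn and Schunck argument, restated here as a sketch):

  dI/dt = I_x u + I_y v + I_t = 0,

where (u, v) = (dx/dt, dy/dt) is the optic flow and I_x, I_y, I_t are the partial derivatives of the irradiance. The families of equations described in the abstract generalise this single constraint.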


Pattern Recognition Letters | 1989

Image irradiance equations for a zooming camera

Jens Arnspang; Jun Ma

Image irradiance equations for a camera zooming onto a static surface are discussed. Surface range may be determined directly from image irradiance. The classic motion constraint equation is not valid during zooming; its exact and approximate extensions are discussed.
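
One way to see the difficulty (an illustrative sketch under a pinhole model with time-varying focal length f(t); this is not the paper's derivation): a static point at depth Z projects to x = f X / Z, so pure zooming induces a purely radial flow

  dx/dt = (f'/f) x,   dy/dt = (f'/f) y,

while the image irradiance itself varies as the focal length (and hence the effective f-number) changes, so irradiance is no longer conserved along this flow and the classic constraint I_x u + I_y v + I_t = 0 has to be replaced by an extended equation.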


Computer Music Modeling and Retrieval | 2003

Extraction of structural patterns in popular melodies

Esben Skovenborg; Jens Arnspang

A method is presented to extract musical features from melodic material. Various viewpoints are defined to focus on complementary aspects of the material. To model the melodic context, two measures of entropy are employed: a set of trained probabilistic models captures local structure via the information-theoretic notion of unpredictability, and an alternative entropy measure based on adaptive coding is developed to reflect phrasing or motifs. A collection of popular music, in the form of MIDI files, is analysed using the entropy measures and techniques from pattern recognition. To visualise the topology of the ‘tune space’, a self-organising map is trained with the extracted feature parameters, leading to the Tune Map.
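
As a rough, self-contained illustration of the entropy idea (not the paper's trained models; the melody, names and smoothing choices below are assumptions for the example): the unpredictability of a melody can be scored as the average negative log-probability of its pitch intervals under a simple bigram model.

import math
from collections import Counter

def interval_bigram_entropy(pitches, alphabet_size=25):
    # Cross-entropy in bits per interval under an add-one-smoothed bigram
    # model trained on the sequence itself; lower means more repetitive.
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    bigrams = Counter(zip(intervals, intervals[1:]))
    contexts = Counter(intervals[:-1])
    total = 0.0
    for (prev, cur), n in bigrams.items():
        p = (n + 1) / (contexts[prev] + alphabet_size)
        total += n * -math.log2(p)
    return total / max(1, sum(bigrams.values()))

# Example: a repeated C-major scale fragment given as MIDI pitches.
melody = [60, 62, 64, 65, 67, 65, 64, 62, 60, 62, 64, 65, 67, 65, 64, 62]
print(round(interval_bigram_entropy(melody), 2))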


Computer Analysis of Images and Patterns | 1999

Relating Scene Depth to Image Ratios

Jens Arnspang; Knud Henriksen; Fredrik Bergholm

An alternative to the classic depth-from-stereo-disparities approach is presented. In this new approach, two scene points with different and finite depths are viewed by two identical cameras with parallel optic axes. Both frontal views and side views, where both cameras are rotated by the same angle, are addressed. A simple construction for the vanishing point of the line connecting the two scene points is presented. For both frontal and side views it is shown that the relative scene depth of two points equals the reciprocal of the ratio of the image-plane distances from the image points to the vanishing point of the line they define. For side views it is furthermore shown how the lens plane separation and ratios of image-plane distances to vanishing points directly determine the absolute depths of the scene points. Neither camera focal length, image-plane optic center, image coordinate scale, nor coordinate disparities are used in the calculations of the absolute scene depths.
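
The central relation can be verified directly (a sketch under a standard pinhole model with focal length f, not a reproduction of the paper's construction): write the two scene points as P_2 = P_1 + t D with line direction D = (D_x, D_y, D_z), so the vanishing point of their line is v = f (D_x/D_z, D_y/D_z) and the image of P_i = (X_i, Y_i, Z_i) is p_i = f (X_i/Z_i, Y_i/Z_i). Then

  |p_i - v| = (f / (Z_i |D_z|)) |(X_i D_z - Z_i D_x, Y_i D_z - Z_i D_y)|,

and the vector on the right is the same for both points because the terms in t cancel, so |p_i - v| is proportional to 1/Z_i and Z_1 / Z_2 = |p_2 - v| / |p_1 - v|, which is the reciprocal-ratio statement in the abstract.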


Computer Analysis of Images and Patterns | 1995

Estimating Time to Contact with Curves, Avoiding Calibration and Aperture Problem

Jens Arnspang; Knud Henriksen; Robert Stahr

A set of simple time-to-contact estimators is derived, using isolated points or curve segments. For this purpose the use of both optic flow and optic acceleration is suggested. For curves it is pointed out that there is no aperture problem, since the normal flow and acceleration of the curve segment are sufficient for estimating time to contact. Time to contact with a curve segment may be calculated without calibrating the camera focal length or camera coordinate system, without computing spatial velocity and depth maps, and without computing the complete optic flow field for the curve segment. Computational illustrations with actual camera data are reported.
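
The flavour of such estimators can be seen from the classic point-based relation (a sketch assuming pure translation along the optical axis, with the focus of expansion at the image origin and x = f X / Z for constant X):

  dx/dt = -x (dZ/dt) / Z,   hence   time to contact = -Z / (dZ/dt) = x / (dx/dt),

so time to contact equals the image distance to the focus of expansion divided by the image speed, with no need for the focal length or for metric depth. The abstract's point is that for a curve segment the required image measurements can be taken from normal flow and acceleration alone, so the aperture problem does not arise.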


Pattern Recognition Letters | 1991

On the use of time varying shading and surface rim irradiance

Jens Arnspang

The questions of using time-varying image irradiance, produced by a static object and a distant moving light source, and of using the image irradiance of the visual rim are addressed. Motion constraint equations relating image irradiance with light source position and motion, and with surface orientation and visual surface curvature, are derived. Possible linear determination schemes are pointed out.
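
A minimal illustration of such a constraint (assuming a Lambertian surface patch with albedo ρ, unit normal n and a distant light source with time-varying direction s(t); this is an assumption for illustration, not necessarily the paper's exact model): the image irradiance of a static point is I = ρ n·s(t), so

  dI/dt = ρ n·(ds/dt),

and measuring I and dI/dt at several points gives linear constraints coupling the unknown surface orientation with the light source direction and its motion.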


Pattern Recognition Letters | 1989

On the use of the horizon of a translating planar curve

Jens Arnspang

Translation of a planar curve is considered. Using the horizon, the orientation of the curve plane and of any curve tangent may be determined from one image, the optic flow and focus of expansion from two images, and the spatial velocity and position from three images.


Proceedings of the Workshop on Visual Motion | 1989

Moving towards the horizon of a planar curve

Jens Arnspang

The use of the horizon of a planar and convex curve is discussed. The spatial orientation of the curve plane and of any curve tangent may be determined unambiguously from image coordinates in one image. Curve points may be matched geometrically from two images of an image sequence produced by translation or acceleration without rotation. The focus of expansion during translation may be constructed geometrically from two images. The spatial position and velocity of a curve point may be determined unscaled if the spatial acceleration is known and the optic flow and optic acceleration are nonzero and non-aligned. All of these determination schemes are unambiguous and involve very few calculations. Algorithms are suggested for determining the horizon from an image sequence in which the horizon is not visible.
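
The first claim has a compact geometric reading (a sketch assuming a calibrated pinhole camera with focal length f, not taken from the paper): any direction D = (D_x, D_y, D_z) lying in the curve's plane with normal n satisfies n·D = 0, so its vanishing point (f D_x/D_z, f D_y/D_z) lies on the image line

  n_x x + n_y y + n_z f = 0.

The horizon (vanishing line) of the plane therefore determines the plane normal n up to scale from a single image, and intersecting the image of any curve tangent with the horizon gives that tangent's vanishing point and hence its spatial direction.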

Collaboration


Dive into Jens Arnspang's collaborations.

Top Co-Authors


Knud Henriksen

University of Copenhagen


Cynthia M. Grund

University of Southern Denmark


Henrik Nielsen

University of Copenhagen


Jun Ma

University of Copenhagen
