Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Roland J. Arsenault is active.

Publication


Featured research published by Roland J. Arsenault.


Human Factors in Computing Systems | 2000

Eye-hand co-ordination with force feedback

Roland J. Arsenault; Colin Ware

The term eye-hand co-ordination refers to hand movements controlled with visual feedback and reinforced by hand contact with objects. A correct perspective view of a virtual environment enables normal eye-hand co-ordination skills to be applied. But is it necessary for rapid interaction with 3D objects? A study of rapid hand movements is reported using an apparatus designed so that the user can touch a virtual object in the same place where he or she sees it. A Fitts tapping task is used to assess the effect of both contact with virtual objects and real-time update of the centre of perspective based on the user's actual eye position. A Polhemus tracker is used to measure the user's head position and from this estimate their eye position. In half of the conditions, head-tracked perspective is employed so that visual feedback is accurate, while in the other half a fixed eye position is assumed. A Phantom force feedback device is used to make it possible to touch the targets in selected conditions. Subjects were required to change their viewing position periodically to assess the importance of correct perspective and of touching the targets in maintaining eye-hand co-ordination. The results show that accurate perspective improves performance by an average of 9% and contact improves it a further 12%. A more detailed analysis shows the advantages of head tracking to be greater for whole-arm movements in comparison with movements from the elbow.
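
A Fitts tapping task of the kind used here is conventionally scored by relating target distance and width to measured movement time. The sketch below is a minimal illustration of that scoring, not code from the study; it uses the common Shannon formulation of the index of difficulty, and the movement times are hypothetical.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s: index of difficulty over measured movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical numbers: a 9% faster movement time at the same difficulty,
# loosely mirroring the reported head-tracking advantage.
print(throughput(0.30, 0.03, 0.80))          # fixed-perspective condition
print(throughput(0.30, 0.03, 0.80 * 0.91))   # head-tracked condition
```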


Biology Letters | 2007

'Megapclicks': acoustic click trains and buzzes produced during night-time foraging of humpback whales (Megaptera novaeangliae)

Alison K. Stimpert; David N. Wiley; Whitlow W. L. Au; Mark Johnson; Roland J. Arsenault

Humpback whales (Megaptera novaeangliae) exhibit a variety of foraging behaviours, but neither they nor any baleen whale are known to produce broadband clicks in association with feeding, as do many odontocetes. We recorded underwater behaviour of humpback whales in a northwest Atlantic feeding area using suction-cup attached, multi-sensor, acoustic tags (DTAGs). Here we describe the first recordings of click production associated with underwater lunges from baleen whales. Recordings of over 34 000 ‘megapclicks’ from two whales indicated relatively low received levels at the tag (between 143 and 154 dB re 1 μPa pp), most energy below 2 kHz, and interclick intervals often decreasing towards the end of click trains to form a buzz. All clicks were recorded during night-time hours. Sharp body rolls also occurred at the end of click bouts containing buzzes, suggesting feeding events. This acoustic behaviour seems to form part of a night-time feeding tactic for humpbacks and also expands the known acoustic repertoire of baleen whales in general.
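
The buzz pattern described above, interclick intervals shrinking toward the end of a click train, lends itself to simple programmatic detection. The following Python sketch flags runs of short intervals as candidate buzzes; the function name and threshold values are illustrative assumptions, not values or code from the study.

```python
import numpy as np

def find_buzzes(click_times, max_ici=0.05, min_run=5):
    """Return (start, end) times of candidate buzzes: runs of at least
    min_run consecutive inter-click intervals below max_ici seconds."""
    icis = np.diff(np.asarray(click_times, float))
    fast = icis < max_ici
    buzzes, run_start = [], None
    for i, is_fast in enumerate(fast):
        if is_fast and run_start is None:
            run_start = i
        elif not is_fast and run_start is not None:
            if i - run_start >= min_run:
                buzzes.append((click_times[run_start], click_times[i]))
            run_start = None
    if run_start is not None and len(fast) - run_start >= min_run:
        buzzes.append((click_times[run_start], click_times[-1]))
    return buzzes
```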


IEEE Computer Graphics and Applications | 2006

Visualizing the underwater behavior of humpback whales

Colin Ware; Roland J. Arsenault; Matthew D Plumlee; David N. Wiley

A new collaboration between visualization experts, engineers, and marine biologists has changed this. For the first time, we can see and study the foraging behavior of humpback whales. Our study's primary objective was furthering the science of marine mammal ethology. We also had a second objective: field testing GeoZui4D, an innovative test bench for investigating effective ways of navigating through time-varying geospatial data.


Oceans Conference | 2001

GeoZui3D: data fusion for interpreting oceanographic data

Colin Ware; Matthew D Plumlee; Roland J. Arsenault; Larry A. Mayer; Shep Smith

GeoZui3D stands for geographic zooming user interface. It is a new visualization software system designed for interpreting multiple sources of 3D data. The system supports gridded terrain models, triangular meshes, curtain plots, and a number of other display objects. A novel center-of-workspace interaction method unifies a number of aspects of the interface: it creates a simple viewpoint control method, it helps link multiple views, and it is ideal for stereoscopic viewing. GeoZui3D has a number of features to support real-time input. Through a CORBA interface, external entities can influence the position and state of objects in the display. Extra windows can be attached to moving objects, allowing their position and data to be monitored. We describe the application of this system for heterogeneous data fusion, for multibeam QC, and for ROV/AUV monitoring.
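
One plausible reading of the center-of-workspace idea is that all viewpoint motion is expressed relative to a fixed 3D point that stays centered in the view, so zooming becomes moving the eye along the line to that point. The sketch below illustrates only that reading; the function and parameters are hypothetical and not GeoZui3D's API.

```python
import numpy as np

def zoom_about_center(eye, center, factor):
    """Move the eye along the line to the workspace center; the center
    itself stays fixed in the view. factor < 1 zooms in, > 1 zooms out."""
    eye, center = np.asarray(eye, float), np.asarray(center, float)
    return center + factor * (eye - center)

# Linked views would simply share the same `center` point.
new_eye = zoom_about_center(eye=[0.0, 0.0, 10.0], center=[0.0, 0.0, 0.0], factor=0.5)
```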


Applied Perception in Graphics and Visualization | 2004

Frames of reference in virtual object rotation

Colin Ware; Roland J. Arsenault

It is difficult with most current computer interfaces to rotate a virtual object so that it matches the orientation of another virtual object. Times to perform this simple task can exceed 20 seconds, whereas the same kind of rotation can be accomplished with real objects, and with some VR interfaces, in less than two seconds. In many advanced 3D user interfaces, the hand manipulating a virtual object is not in the same place as the object being manipulated. The available evidence suggests that this is not usually a significant problem for manipulations requiring translations of virtual objects, but it is when rotations are required. We hypothesize that the problems may be caused by frame-of-reference effects: mismatches between the visual frame of reference and the haptic frame of reference. Here we report two experiments designed to study interactions between visual and haptic reference frames. In our first study, we investigated the effect of rotating the frame of the controller with respect to the frame of the object being rotated. We measured a broad U-shaped relationship: subjects could tolerate quite large mismatches, but when the orientation mismatch approached 90 degrees, performance deteriorated rapidly, by up to a factor of 5. In our second experiment, we manipulated both rotational and translational correspondence between visual and haptic frames of reference. We predicted that the haptic reference frame might rotate in egocentric coordinates when the input device was in a different location than the virtual object. The experimental results showed a change in the direction predicted; they are consistent with a rotation of the haptic frame of reference, although only by about half the magnitude predicted. Implications for the design of control devices are discussed.
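
The orientation mismatch manipulated in the first experiment reduces to the angle of the relative rotation between the two frames. Below is a small sketch of that computation using SciPy's rotation utilities; the function name is an illustrative assumption, not the experiment's code.

```python
import math
from scipy.spatial.transform import Rotation as R

def frame_mismatch_deg(visual_quat, haptic_quat):
    """Angle in degrees of the relative rotation between the visual and
    haptic reference frames, each given as an (x, y, z, w) quaternion."""
    rel = R.from_quat(visual_quat) * R.from_quat(haptic_quat).inv()
    return math.degrees(rel.magnitude())

# Per the study's U-shaped result, mismatches approaching 90 degrees
# are where performance would be expected to deteriorate sharply.
```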


International Conference on Robotics and Automation | 2005

A System for Real-Time Spatio-Temporal 3-D Data Visualization in Underwater Robotic Exploration

Stephen C. Martin; Louis L. Whitcomb; Roland J. Arsenault; Matthew D Plumlee; Colin Ware

This paper reports the development of a real-time human-computer interface (HCI) system that enables a human operator to more effectively utilize the large volume of quantitative data (navigation, scientific, and vehicle status) generated in real time by the sensor suites of underwater robotic vehicles. The system provides interactive 3-D graphical interfaces that display, under user control, the quantitative spatial and temporal sensor data presently available to pilots and users only as two-dimensional plots and numerical displays. The system can presently display real-time bathymetric renderings of the sea floor based upon vehicle navigation and sonar sensor data; vehicle trajectory data; a variety of scalar-valued sensor data; and geo-referenced targets and waypoints. We report the accuracy of the real-time navigation and bathymetric sonar data processing by comparing the real-time sonar bathymetry of our test tank floor to a high-resolution laser scan of the same tank floor. The real-time sonar bathymetry is shown to compare favorably to the laser scan data.
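
At its simplest, a real-time bathymetric rendering of this kind rests on binning geo-referenced soundings into a regular grid. The sketch below is a deliberately simplified stand-in assuming soundings already merged from navigation and sonar data; it is not the paper's pipeline.

```python
import numpy as np

def grid_soundings(x, y, z, cell=0.05):
    """Average depth soundings (x, y, z arrays in metres) into square
    cells of side `cell`; cells with no soundings stay NaN."""
    x, y, z = (np.asarray(a, float) for a in (x, y, z))
    ix = np.floor((x - x.min()) / cell).astype(int)
    iy = np.floor((y - y.min()) / cell).astype(int)
    sums = np.zeros((ix.max() + 1, iy.max() + 1))
    counts = np.zeros_like(sums)
    np.add.at(sums, (ix, iy), z)      # accumulate depths per cell
    np.add.at(counts, (ix, iy), 1.0)  # count soundings per cell
    grid = np.full_like(sums, np.nan)
    hit = counts > 0
    grid[hit] = sums[hit] / counts[hit]
    return grid
```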


Human Factors | 2012

Target Finding With a Spatially Aware Handheld Chart Display

Colin Ware; Roland J. Arsenault

Objective: The objective was to evaluate the use of a spatially aware handheld chart display in a comparison with a track-up fixed display configuration and to investigate how cognitive strategies vary when performing the task of matching chart symbols with environmental features under different display geometries and task constraints. Background: Small-screen devices containing both accelerometers and magnetometers support the development of spatially aware handheld maps. These can be designed so that symbols representing targets in the external world appear in a perspective view determined by the orientation of the device. Method: A panoramic display was used to simulate a marine environment. The task involved matching targets in the scene to symbols on simulated chart displays. In Experiment 1, a spatially aware handheld chart display was compared to a fixed track-up chart display. In Experiment 2, a gaze monitoring system was added and the distance between the chart display and the scene viewpoint was varied. Results: All respondents were faster with the handheld device. Novices were much more accurate with the handheld device. People allocated their gaze very differently if they had to move between a map display and a view of the environment. Conclusion: There may be important benefits to spatially aware handheld displays in reducing errors relating to common navigation tasks. Application: Both the difficulty of spatial transformations and the allocation of attention should be considered in the design of chart displays.
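
A spatially aware display of this kind must map a target's world bearing, together with the device's sensed compass heading, to a screen position. The sketch below shows the horizontal half of that projection under simplifying assumptions (no pitch or roll compensation); the names and parameters are hypothetical, not the experimental software.

```python
import math

def target_x_px(device_heading_deg, target_bearing_deg,
                fov_deg=60.0, width_px=800):
    """Horizontal pixel position of a target symbol in a perspective view
    steered by the device's heading; None if outside the field of view."""
    # Signed angle from the view axis to the target, wrapped to [-180, 180).
    rel = (target_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2.0:
        return None
    half = width_px / 2.0
    return half + half * math.tan(math.radians(rel)) / math.tan(math.radians(fov_deg / 2.0))
```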


Journal of the Acoustical Society of America | 2006

Megapclicks: Tagged humpback whales produce click trains and buzzes during foraging

Alison K. Stimpert; Whitlow W. L. Au; Mark Johnson; David N. Wiley; Roland J. Arsenault

Mysticetes and odontocetes differ in many morphological and behavioral aspects related to feeding. Odontocetes echolocate using clicks to localize prey, while the mechanism through which mysticetes orient and find food is unknown. In this study, DTAGs were deployed on four humpback whales (Megaptera novaeangliae) observed feeding off the northeastern United States. From two of the four tagged whales, bouts of clicks produced with a temporal pattern resembling odontocete echolocation were recorded and 102 bouts of megapclicks were identified. Spectrally, megapclicks were unlike previously reported humpback sounds or odontocete clicks, and also had much lower amplitudes (peak frequency ∼1700 Hz, RL 148 ± 6 dB re 1 μPa peak). Interclick intervals (ICI) were bimodally distributed (peaks at 25 and 80 ms), with shorter ICIs occurring at the end of megapclick bouts, in a pattern similar to an odontocete terminal buzz used in foraging. Behavioral sensor data supported a possible foraging function for megapclicks. ...


Journal of the Acoustical Society of America | 2007

Whale tracking underwater: High frequency acoustic pingers and the instrumented tag (DTAG)

Val E. Schmidt; Thomas C. Weber; Colin Ware; Roland J. Arsenault; David N. Wiley; Mark Johnson; Erik Dawe; Ari S. Friedlaender

Since 2004, scientists have been tagging and tracking humpback whales in Stellwagen Bank National Marine Sanctuary to better understand their behavior. Stellwagen Bank is a shoal area east of Boston and north of Cape Cod, MA, where many species of baleen whale feed during the summer months. Instrumented tags (DTAGs) are suction-cupped to the whale's back from a RHIB. DTAGs, developed at WHOI, record whale pitch, roll, and heading, 3-D acceleration, depth, and sound for up to 20 h. A pseudotrack for the tagged whale can be generated using visual fixes at the surface and dead reckoning while the whale is underwater. During extended dives, the solution is expected to exhibit substantial drift, placing limits on the ability to understand feeding behavior, mother-calf interactions, etc. In order to develop higher-accuracy whale tracks, three GPS-positioned high-frequency (25–32 kHz) acoustic pingers were deployed around tagged animals in July 2007. The pingers produce time-encoded pulses from known positions, which are recorded along with whale vocalizations and ambient noise on the whale tag. Pulse arrival times from each pinger are converted into ranges from the known pinger locations to generate an underwater whale track. Results from this work will be presented.
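
The pinger geometry described here amounts to trilateration: each pulse's travel time multiplied by the sound speed (roughly 1500 m/s in seawater) gives a range to a known position, and three or more ranges fix the tag horizontally. The least-squares sketch below uses hypothetical names and assumes the tag depth is known from the DTAG's pressure sensor and the pingers sit near the surface; it is not the authors' processing code.

```python
import numpy as np

def locate_tag(pingers, ranges, tag_depth):
    """Least-squares horizontal fix from slant ranges to GPS-positioned
    pingers. pingers: (n, 2) array of x, y in metres (n >= 3);
    ranges: slant ranges in metres (pulse travel time x ~1500 m/s)."""
    pingers = np.asarray(pingers, float)
    # Remove the vertical component, assuming near-surface pingers.
    horiz = np.sqrt(np.maximum(np.asarray(ranges, float) ** 2 - tag_depth ** 2, 0.0))
    # Subtract the first range-circle equation from the others to linearize.
    p0, r0 = pingers[0], horiz[0]
    A = 2.0 * (pingers[1:] - p0)
    b = (r0 ** 2 - horiz[1:] ** 2
         + np.sum(pingers[1:] ** 2, axis=1) - np.sum(p0 ** 2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy
```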


Journal of the Acoustical Society of America | 2006

Linking sound production and dive behavior in feeding humpback whales

Alison K. Stimpert; Whitlow W. L. Au; David N. Wiley; Kenneth A Shorter; Kira Barton; Mark Johnson; Colin Ware; Roland J. Arsenault

Acoustic studies of baleen whales are becoming increasingly common. However, few studies combine acoustic data with technologies that allow sound production to be placed in a detailed behavioral context. Noninvasive digital acoustic recording tags (DTAGs) were attached to humpback whales (Megaptera novaeangliae) on the western North Atlantic's Great South Channel feeding grounds to study foraging and acoustic behavior. Acoustic records totaling 48.4 data hours from four attachments were aurally and automatically analyzed, and occurrences of several sound types were correlated with body orientation and dive behavior of the tagged subject. Whales produced a wide variety of sound types, which differed in depth and context of production. Long, tonal groans and moans (3-s average duration, 500-Hz peak frequency) occurred more frequently in upper portions of the water column, in contrast to broadband paired bursts (50-ms average duration, 300-Hz peak frequency), which appeared associated with bottom ...

Collaboration


Dive into Roland J. Arsenault's collaborations.

Top Co-Authors

Colin Ware, University of New Hampshire
Matthew D Plumlee, University of New Hampshire
David N. Wiley, National Oceanic and Atmospheric Administration
Larry A. Mayer, University of New Hampshire
Mark Johnson, University of St Andrews
Rick T Brennan, University of New Hampshire
Alison K. Stimpert, Moss Landing Marine Laboratories
Kira Barton, University of Michigan
Lindsay Gee, University of New Hampshire