Publication


Featured research published by Michal Bujacz.


Conference on Human System Interactions | 2008

Remote guidance for the blind — A proposed teleassistance system and navigation trials

Michal Bujacz; P. Barański; M. Moranski; Pawel Strumillo; Andrzej Materka

The paper presents initial research on a system for remote guidance of the blind. The concept is based on the idea that a blind pedestrian can be aided by spoken instructions from an operator who receives a video stream from a camera carried by the visually impaired user. An early prototype utilizing two laptop PCs and a wireless Internet connection is used in orientation and mobility trials, which aim to measure the potential usefulness of the system and discover possible problems with user-operator communication or device design. Test results show a quantitative performance increase when traveling with a remote guide: a 15-50% speed increase and nearly halved navigational task times. The main success, however, is the engendered feeling of safety when assisted and the enthusiasm with which the concept was welcomed by blind trial participants.


Archive | 2011

Sonification of 3D Scenes in an Electronic Travel Aid for the Blind

Michal Bujacz; Michal Pec; Piotr Skulimowski; Pawel Strumillo; Andrzej Materka

Sight, hearing and touch are the sensory modalities that play a dominating role in spatial perception in humans, i.e. the ability to recognize the geometrical structure of the surrounding environment, awareness of self-location in surrounding space, and determining the location of nearby objects in terms of depth and direction. Information streams from these senses are continuously integrated and processed in the brain, so that a cognitive representation of the 3D environment can be accurately built, whether the observer is stationary or in motion. Each of the three senses uses different cues for exploring the environment and features a different perception range (Hall, 1966). Touch provides information on the so-called near space (also termed haptic space), whereas vision and hearing are capable of yielding percepts representing objects or events in the so-called far space.

Spatial orientation, in the sense of locating scene elements, is the key capability allowing humans to interact with the surrounding environment, e.g. reaching objects, avoiding obstacles, wayfinding (Golledge, 1999) and determining one's own location with respect to the environment. An important aspect of locating objects in 3D space is the integration of percepts coming from different senses. Understanding distance to objects (depth perception) is made possible by concurrent binocular viewing and touching of near-space objects (Millar, 1994). For locating and recognizing far-space objects, vision and hearing cooperate to determine the distance, bearing and type of objects. The field of view of vision is limited to the space in front of the observer, whereas hearing is omnidirectional and sound sources can be located even if occluded by other objects.

Correct reproduction of sensory stimuli is important in virtual reality systems, in which 3D vision based technologies are predominantly employed for creating immersive artificial environments. Many applications can greatly benefit from building acoustic 3D spaces (e.g. for operators of complex control panels, or for in-field communication of soldiers in combat or firemen). If such spaces are appropriately synthesized, perception capacity and immersion in the environment can be considerably enhanced (Castro, 2006). It has also been shown that if spatial instead of monophonic sounds are used, the reaction time to acoustic stimuli becomes shorter and the listener is less prone to fatigue (Moore, 2004). Because of the enriched acoustic experience such devices offer (e.g. spaciousness and interactivity), they are frequently termed auditory display systems, and such systems have recently been gaining in importance.


Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2007 | 2007

Head related transfer functions measurement and processing for the purpose of creating a spatial sound environment

Michal Pec; Michal Bujacz; Pawel Strumillo

The use of Head Related Transfer Functions (HRTFs) in audio processing is a popular method of obtaining spatialized sound. HRTFs describe the disturbances caused in a sound wave by the human body, especially the head and the ear pinnae. Since these shapes are unique, HRTFs differ greatly from person to person; for this reason, measurement of personalized HRTFs is justified. Measured HRTFs also need further processing before they can be utilized in a system producing spatialized sound. This paper describes a system designed for efficient collection of Head Related Transfer Functions, as well as the measurement, interpolation and verification procedures.
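
As a rough illustration of how such measured HRTFs are applied, the sketch below convolves a mono signal with a left/right pair of head related impulse responses (HRIRs, the time-domain form of HRTFs) to produce a spatialized stereo signal. This is a minimal Python sketch, not the authors' measurement system; the HRIR arrays are assumed to come from a measurement like the one described above.

import numpy as np
from scipy.signal import fftconvolve

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal at the direction the HRIR pair was measured for."""
    # Convolution with each ear's impulse response applies the direction-
    # dependent filtering (interaural time/level differences, pinna cues).
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    stereo = np.stack([left, right], axis=1)
    # Normalize to avoid clipping when writing to a fixed-point audio file.
    return stereo / np.max(np.abs(stereo))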


International Conference on Computers Helping People with Special Needs | 2016

Sound of Vision - Spatial Audio Output and Sonification Approaches

Michal Bujacz; Karol Kropidlowski; Gabriel Ivanica; Alin Moldoveanu; Charalampos Saitis; Adam B. Csapo; György Wersényi; Simone Spagnol; Ómar I. Jóhannesson; Runar Unnthorsson; Mikolai Rotnicki; Piotr Witek

The paper summarizes a number of audio-related studies conducted by the Sound of Vision consortium, which focuses on the construction of a new prototype electronic travel aid for the blind. Different solutions for spatial audio were compared by testing sound localization accuracy in a number of setups, comparing plain stereo panning with generic and individual HRTFs, as well as testing different types of stereo headphones vs custom designed quadrophonic proximaural headphones. A number of proposed sonification approaches were tested by sighted and blind volunteers for accuracy and efficiency in representing simple virtual environments.
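
For contrast with HRTF rendering, the plain stereo panning baseline compared in the study can be sketched as a constant-power pan, which places a source using level differences only and carries none of the spectral cues of an HRTF. A minimal Python sketch, with the gain law as an assumption (the paper does not specify its panning implementation):

import numpy as np

def constant_power_pan(mono, azimuth_deg):
    """Pan a mono signal; azimuth_deg in [-90, 90], negative = left."""
    # Map azimuth onto [0, pi/2] and use sin/cos gains so that the summed
    # power stays constant across the stereo image.
    theta = (azimuth_deg + 90.0) / 180.0 * (np.pi / 2.0)
    return np.stack([np.cos(theta) * mono, np.sin(theta) * mono], axis=1)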


Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2009 | 2009

Dead reckoning navigation: supplementing pedestrian GPS with an accelerometer-based pedometer and an electronic compass

P. Barański; Michal Bujacz; Pawel Strumillo

The article presents a prototype wearable device that corrects inaccurate GPS readouts during pedestrian travel. The electronic circuit consists of a microcontroller, an accelerometer and a digital compass. The accelerometer readouts are filtered to detect the pedestrian's steps and are also used to estimate the stride length. The digital compass provides the direction of motion. When the GPS parameters warn of a high dilution of precision, the pedestrian's location is corrected using data from the accelerometer and the digital compass.
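
The core dead-reckoning update is simple enough to sketch: detect steps in the vertical acceleration, then advance the position estimate along the compass heading by the estimated stride length. The threshold value and the naive detector below are illustrative assumptions, not the calibrated filtering described in the paper.

import math

def count_steps(accel_vertical, threshold=1.3):
    """Naive step detector: count upward crossings of an acceleration
    threshold (in g). A real detector would band-pass filter first."""
    steps, above = 0, False
    for a in accel_vertical:
        if a > threshold and not above:
            steps, above = steps + 1, True
        elif a < threshold:
            above = False
    return steps

def dead_reckon(position, heading_deg, stride_m):
    """Advance an (x, y) position in metres by one stride along the
    compass heading (0 deg = north/+y, 90 deg = east/+x)."""
    heading = math.radians(heading_deg)
    x, y = position
    return (x + stride_m * math.sin(heading), y + stride_m * math.cos(heading))

When the GPS dilution of precision becomes acceptable again, the dead-reckoned estimate would presumably be re-anchored to the next good GPS fix.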


International Conference on Signals and Electronic Systems | 2008

Individual HRTF measurements for accurate obstacle sonification in an electronic travel aid for the blind

Michal Pec; Michal Bujacz; Pawel Strumillo; Andrzej Materka

The article presents a study of virtual sound source localization errors with the use of personalized head related transfer functions (HRTFs), in the context of the design of an electronic travel aid for the blind. The proposed device for aiding visually disabled individuals in independent mobility requires presentation of spatial sounds, which leads to the need for a system for efficient individual HRTF measurement. Measurements were performed for 15 sighted and blind individuals. Verification trials limited to the frontal hemisphere have shown that virtual sound sources can be localized with average errors of 6.36° in azimuth and 9.47° in elevation.
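
The reported accuracy figures correspond to a mean absolute angular error over the verification trials. A worked Python example with fabricated trial data, purely to show the metric (the real study averaged over actual listener responses):

def mean_abs_errors(trials):
    """trials: (true_az, true_el, resp_az, resp_el) tuples, in degrees,
    restricted to the frontal hemisphere so no azimuth wrap-around occurs."""
    az = sum(abs(ra - ta) for ta, te, ra, re in trials) / len(trials)
    el = sum(abs(re - te) for ta, te, ra, re in trials) / len(trials)
    return az, el

# Three mock trials; values are placeholders, not data from the study.
print(mean_abs_errors([(0, 0, 5, -8), (30, 10, 24, 21), (-45, 0, -40, 7)]))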


Wireless Communications and Mobile Computing | 2018

Current Use and Future Perspectives of Spatial Audio Technologies in Electronic Travel Aids

Simone Spagnol; György Wersényi; Michal Bujacz; Oana Bălan; Marcelo Herrera Martínez; Alin Moldoveanu; Runar Unnthorsson

Electronic travel aids (ETAs) have been in focus since technology allowed the design of relatively small, light, and mobile devices for assisting the visually impaired. Since visually impaired persons rely on spatial audio cues as their primary sense of orientation, providing an accurate virtual auditory representation of the environment is essential. This paper gives an overview of the current state of spatial audio technologies that can be incorporated in ETAs, with a focus on user requirements. Most currently available ETAs either fail to address user requirements or underestimate the potential of spatial sound itself, which may explain, among other reasons, why no single ETA has gained widespread acceptance in the blind community. We believe there is ample space for applying the technologies presented in this paper, with the aim of progressively bridging the gap between accessibility and accuracy of spatial audio in ETAs.


Archive | 2018

Different Approaches to Aiding Blind Persons in Mobility and Navigation in the “Naviton” and “Sound of Vision” Projects

Pawel Strumillo; Michal Bujacz; P. Baranski; Piotr Skulimowski; Piotr Korbel; Mateusz Owczarek; K. Tomalczyk; Alin Moldoveanu; Runar Unnthorsson

In this chapter, we summarize several years of research efforts at the Lodz University of Technology aimed at building ICT (Information and Communications Technology) based systems for aiding the blind in travel and navigation, carried out mainly as part of the “Naviton” project (http://www.naviton.pl), funded by a Polish Ministry of Higher Education grant. We report on the different approaches we took toward these challenging goals, which comprise the following prototype solutions: (1) a sonified stereovision system for obstacle avoidance and environment imaging, (2) radio beacons for local navigation, (3) a remote assistance system, (4) mobile navigation applications, (5) real-time tracking of public transport vehicles, and (6) haptic imaging. We briefly describe these technologies and discuss user feedback from the trials of these technological aids. Finally, we point out the key objectives and first results of a Horizon 2020 project entitled “Sound of Vision: natural sense of vision through acoustics and haptics” (http://www.soundofvision.net) that started in 2015.


Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2007 | 2007

Synthesizing a 3D auditory scene for use in an electronic travel aid for the blind

Michal Bujacz; Pawel Strumillo

A system for auditory presentation of 3D scenes to the blind is presented, with the focus of the paper on the synthesis of sound codes suitable to carry important scene information. First, a short review of existing electronic travel aids (ETAs) for the blind is provided. Second, the wearable ETA device currently under development at the Technical University of Lodz is outlined, along with its system modules: 3D scene reconstruction, object (obstacle) selection, synthesis of the sound code, and the application of head related transfer functions (HRTFs) for generating spatialized sound. The importance of psychoacoustics, especially Bregman's theory of auditory streams, is analyzed, and the proposed methods of sound code synthesis are presented, along with the software used for their verification.
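
One common sonification convention for such sound codes, shown below purely as a hedged illustration (the parameter choices are assumptions, not the system's actual code design), is to emit tone bursts whose repetition rate increases as an obstacle gets closer:

import numpy as np

def obstacle_to_click_train(distance_m, fs=44100, duration_s=1.0):
    """Return a tone-burst train whose rate grows for nearer obstacles."""
    rate_hz = 8.0 / max(distance_m, 0.5)       # bursts per second (assumed law)
    t = np.arange(int(fs * duration_s)) / fs
    carrier = np.sin(2 * np.pi * 880.0 * t)    # 880 Hz tone
    gate = (np.mod(t * rate_hz, 1.0) < 0.1)    # 10% duty-cycle on/off gate
    return carrier * gate

The resulting buffer would then be filtered with the user's HRTFs so that the bursts appear to arrive from the obstacle's direction.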


International Conference on Computers Helping People with Special Needs | 2018

EchoVis: Training Echolocation Using Binaural Recordings – Initial Benchmark Results

Michal Bujacz; Marta Szyrman; Grzegorz Górski; Rafał Charłampowicz; Sławomir Strugarek; Adam Bancarewicz; Anna Trzmiel; Agnieszka Nelec; Piotr Witek; Aleksander Waszkielewicz

In this paper, we describe a recently begun project aimed at teaching echolocation using a mobile game. The presented research concerns initial echolocation tests with real-world obstacles and similar tests performed using binaural recordings. Tests that included detection and recognition of large obstacles in various environments (a padded room, a non-padded room, and outdoors) were performed by three groups of 10 persons each: blind children, blind adults and sighted adults. A mixed group of volunteers also tested binaural recordings of the same environments using a mobile application for Android and iOS devices. The presented preliminary research shows a large variance in the echolocation ability of the participants. Fewer than 20% of the 30 volunteers could reliably (with >80% certainty) localize 1 m and 2 m wide walls at distances of 1 to 3 m, while about as many showed no echolocation skills and answered at a random level. On average, sighted adults performed better in the echolocation tests than blind children, but worse than blind adults. Tests in outdoor environments showed much better results than indoors, and the padded room was marginally better for echolocation than the non-padded room. Performance with recordings was generally worse than in the analogous real-world tests, but the same trends could be clearly observed, e.g. a proportional drop-off in correctness with distance. The tests with recordings also demonstrated that a repeatable pre-recorded or synthesized click originating from a loudspeaker was a better solution than recordings of live clicker sounds.
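
"Answered at a random level" can be made precise with a one-sided binomial test of whether a participant's hit rate exceeds chance. A small Python sketch using scipy, with illustrative counts rather than the study's data:

from scipy.stats import binomtest

def above_chance(correct, total, chance=0.5, alpha=0.05):
    """True if the hit rate is significantly above the guessing level."""
    return binomtest(correct, total, p=chance, alternative='greater').pvalue < alpha

# E.g. 17 correct out of 20 yes/no obstacle-detection trials:
print(above_chance(17, 20))  # True: well above a 50% guessing rate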

Collaboration


Dive into Michal Bujacz's collaborations.

Top Co-Authors

Piotr Skulimowski
Lodz University of Technology

Alin Moldoveanu
Politehnica University of Bucharest

György Wersényi
Széchenyi István University

Andrzej Materka
Lodz University of Technology

Andrzej Radecki
Lodz University of Technology