Publications


Featured research published by Silvia Cirstea.


Hearing Research | 2014

A summary of research investigating echolocation abilities of blind and sighted humans

Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan; Brian C. J. Moore

There is currently considerable interest in the consequences of loss in one sensory modality on the remaining senses. Much of this work has focused on the development of enhanced auditory abilities among blind individuals, who are often able to use sound to navigate through space. It has now been established that many blind individuals produce sound emissions and use the returning echoes to provide them with information about objects in their surroundings, in a similar manner to bats navigating in the dark. In this review, we summarize current knowledge regarding human echolocation. Some blind individuals develop remarkable echolocation abilities, and are able to assess the position, size, distance, shape, and material of objects using reflected sound waves. After training, normally sighted people are also able to use echolocation to perceive objects, and can develop abilities comparable to, but typically somewhat poorer than, those of blind people. The underlying cues and mechanisms, operable range, spatial acuity and neurological underpinnings of echolocation are described. Echolocation can result in functional real-life benefits. It is possible that these benefits can be optimized via suitable training, especially among those with recently acquired blindness, but this requires further study. Areas for further research are identified.
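
As a rough illustration of the timing information an echolocator works with, the sketch below computes the round-trip delay of an echo returning from an object at a given distance, assuming a speed of sound of about 343 m/s in air; the distances and resulting delays are illustrative only and are not taken from the review.

    # Illustrative sketch: round-trip echo delay for an object at distance d,
    # assuming a speed of sound of ~343 m/s in air (values are examples only).
    SPEED_OF_SOUND = 343.0  # m/s

    def echo_delay(distance_m: float) -> float:
        """Return the round-trip travel time (s) of an echo from an object."""
        return 2.0 * distance_m / SPEED_OF_SOUND

    for d in (0.5, 1.0, 2.0, 4.0):
        print(f"object at {d:4.1f} m -> echo arrives after {echo_delay(d) * 1000:5.1f} ms")
    # e.g. an object 2 m away returns an echo after roughly 11.7 ms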


IEEE Transactions on Industrial Electronics | 2010

Direct Neural-Network Hardware-Implementation Algorithm

Andrei Dinu; Marcian Cirstea; Silvia Cirstea

An algorithm for compact neural-network hardware implementation is presented, which exploits the special properties of the Boolean functions describing the operation of artificial neurons with a step activation function. The algorithm comprises three steps: digitization of the artificial-neural-network (ANN) mathematical model, conversion of the digitized model into a logic-gate structure, and hardware optimization by elimination of redundant logic gates. A set of C++ programs automates the algorithm, generating optimized very high speed integrated circuit hardware description language (VHDL) code. This strategy bridges the gap between ANN design software and hardware design packages such as Xilinx tools. Although the method is directly applicable only to neurons with step activation functions, it can be extended to sigmoidal activation functions.
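
A minimal sketch of the first two steps of the algorithm, under the assumption of binary inputs and a step activation function: the neuron is evaluated over all input combinations to obtain its Boolean truth table, whose minterms could then be minimized and mapped to logic gates. The weights and threshold below are hypothetical, and the gate-level optimization and VHDL-generation steps are omitted.

    # Sketch of digitizing a step-activation neuron into a Boolean truth table
    # (hypothetical weights/threshold; the gate-level optimization step is omitted).
    from itertools import product

    def step_neuron(inputs, weights, threshold):
        """Step activation: output 1 if the weighted sum reaches the threshold."""
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

    weights, threshold = [2, 1, 1], 2          # example neuron parameters
    minterms = []                              # input combinations giving output 1
    for bits in product([0, 1], repeat=len(weights)):
        if step_neuron(bits, weights, threshold):
            minterms.append(bits)

    # Each minterm corresponds to an AND of (possibly negated) inputs; OR-ing the
    # minterms gives a Boolean expression that a logic minimizer could reduce
    # before mapping to gates.
    print(minterms)  # e.g. [(0, 1, 1), (1, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1)]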


Experimental Brain Research | 2013

Evidence for enhanced discrimination of virtual auditory distance among blind listeners using level and direct-to-reverberant cues

Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan

Totally blind listeners often demonstrate better-than-normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance than sighted, age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use the two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.
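
The direct-to-reverberant ratio used as a cue above can be estimated from a room impulse response by comparing the energy of the direct sound with the energy of the later reflections. The sketch below illustrates the idea on a toy impulse response; the 2.5 ms direct-sound window and all signal parameters are assumptions for illustration, not the procedure used in the study.

    # Illustrative direct-to-reverberant ratio (D/R) from a room impulse response.
    # The 2.5 ms direct-sound window is an assumed, commonly used convention.
    import numpy as np

    def direct_to_reverberant_db(ir, fs, direct_window_ms=2.5):
        """D/R in dB: energy in the window around the direct sound vs. the rest."""
        onset = int(np.argmax(np.abs(ir)))               # index of the direct path
        end = onset + int(fs * direct_window_ms / 1000)
        direct_energy = np.sum(ir[onset:end] ** 2)
        reverberant_energy = np.sum(ir[end:] ** 2)
        return 10 * np.log10(direct_energy / reverberant_energy)

    # Toy impulse response: a direct spike followed by exponentially decaying noise.
    fs = 16000
    rng = np.random.default_rng(0)
    ir = np.zeros(fs)          # 1 s of samples
    ir[100] = 1.0              # direct sound
    tail = rng.standard_normal(fs - 200) * 0.05 * np.exp(-np.arange(fs - 200) / 4000)
    ir[200:] = tail
    print(f"D/R = {direct_to_reverberant_db(ir, fs):.1f} dB")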


Attention, Perception, & Psychophysics | 2016

Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss

Andrew J. Kolarik; Brian C. J. Moore; Pavel Zahorik; Silvia Cirstea; Shahina Pardhan

Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.
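
For intuition about the sound-level cue mentioned above: in the free field, level falls by roughly 6 dB for each doubling of source distance (the inverse-square law). The short sketch below applies that relationship; the 1 m reference distance is an illustrative choice, not a value from the review.

    # Inverse-square-law level cue: ~6 dB drop per doubling of distance (free field).
    # The reference distance below is an illustrative value, not data from the review.
    import math

    def level_drop_db(distance_m: float, reference_m: float = 1.0) -> float:
        """Level decrease (dB) relative to the level at the reference distance."""
        return 20.0 * math.log10(distance_m / reference_m)

    for d in (1, 2, 4, 8):
        print(f"{d} m: {level_drop_db(d):4.1f} dB below the 1 m level")
    # 2 m -> 6.0 dB, 4 m -> 12.0 dB, 8 m -> 18.1 dB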


Journal of the Acoustical Society of America | 2013

Discrimination of virtual auditory distance using level and direct-to-reverberant ratio cues

Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan

The study investigated how listeners used level and direct-to-reverberant ratio (D/R) cues to discriminate distances to virtual sound sources. Sentence pairs were presented at virtual distances in simulated rooms that were either reverberant or anechoic. Performance on the basis of level was generally better than performance based on D/R. Increasing room reverberation time improved performance based on the D/R cue such that the two cues provided equally effective information at further virtual source distances in highly reverberant environments. Orientation of the listener within the virtual room did not affect performance.


Journal of the Acoustical Society of America | 2013

An assessment of virtual auditory distance judgments among blind and sighted listeners

Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan; Brian C. J. Moore

Auditory distance perception is a crucial component of blind listeners’ spatial awareness. Many studies have reported supra-normal spatial auditory abilities among blind individuals, such as enhanced azimuthal localization [Voss et al. (2004)] and distance discrimination [Kolarik et al. (in press)]. However, it is not known whether blind listeners are better able to use acoustic information to enhance judgments of distance to single sound sources, or whether lack of visual spatial cues prevents calibration of auditory distance information, leading to worse performance than for sighted listeners. Blind and sighted listeners were presented with single, stationary virtual sound sources between 1.22 and 13.79 m away in a virtual anechoic environment simulated using an image-source model. Stimuli were spoken sentences. Sighted listeners systematically underestimated distance to remote virtual sources, while blind listeners overestimated the distance to nearby virtual sources and underestimated it for remote virtual sources. The findings suggest that blind listeners are less accurate at judging absolute distance, and experience a compression of the auditory world, relative to sighted listeners. The results support a perceptual deficiency hypothesis for absolute distance judgments, suggesting that compensatory processes for audition do not develop among blind listeners when estimating the distance to single, stationary sound sources.
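
A toy version of the image-source technique used to create the virtual sources: each wall of a rectangular room produces a mirror-image copy of the source, and the distance to each image determines the delay (and attenuation) of the corresponding reflection. The room dimensions and positions below are hypothetical, and only first-order images are computed.

    # First-order image-source sketch for a rectangular ("shoebox") room.
    # Room dimensions and positions are hypothetical; real simulations use
    # higher-order images and frequency-dependent wall absorption.
    import math

    SPEED_OF_SOUND = 343.0                  # m/s
    room = (5.0, 4.0, 3.0)                  # room size (x, y, z) in metres
    src = (1.5, 2.0, 1.2)                   # source position
    lst = (4.0, 2.0, 1.2)                   # listener position

    def first_order_images(source, room_dims):
        """Mirror the source in each of the six walls of the shoebox room."""
        images = []
        for axis, size in enumerate(room_dims):
            for wall in (0.0, size):
                img = list(source)
                img[axis] = 2.0 * wall - img[axis]
                images.append(tuple(img))
        return images

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    print(f"direct path: {dist(src, lst):.2f} m, {1000 * dist(src, lst) / SPEED_OF_SOUND:.1f} ms")
    for img in first_order_images(src, room):
        d = dist(img, lst)
        print(f"image at {img}: {d:.2f} m, arrives after {1000 * d / SPEED_OF_SOUND:.1f} ms")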


Journal of the Acoustical Society of America | 2011

Perceiving auditory distance using level and direct-to-reverberant ratio cues

Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan

The study investigated how level and reverberation cues contribute to distance discrimination, and how accuracy is affected by reverberation cue strength. Sentence pairs were presented at distances between 1 and 8 m in a virtual room simulated using an image-source model and two reverberation settings (lower and higher). Listeners performed discrimination judgments in three conditions: level cue only (Level-Only), reverberation only (Equalized), and both cues available (Normal). Percentage correct judgment of which sentence was closer was measured. Distance discrimination was best in the Normal condition. Thresholds for perceiving the difference in distance between sentences were lower (i.e., performance was significantly better, p<0.05) for closer than for further targets in the Normal and Level-Only conditions. In contrast, in the Equalized condition, thresholds were lower for further than for closer targets. Thresholds were lower at higher reverberation in the Equalized condition, and for furt...


Experimental Brain Research | 2017

Auditory spatial representations of the world are compressed in blind humans

Andrew J. Kolarik; Shahina Pardhan; Silvia Cirstea; Brian C. J. Moore

Compared to sighted listeners, blind listeners often display enhanced auditory spatial abilities such as localization in azimuth. However, less is known about whether blind humans can accurately judge distance in extrapersonal space using auditory cues alone. Using virtualization techniques, we show that auditory spatial representations of the world beyond the peripersonal space of blind listeners are compressed compared to those for normally sighted controls. Blind participants overestimated the distance to nearby sources and underestimated the distance to remote sound sources, in both reverberant and anechoic environments, and for speech, music, and noise signals. Functions relating judged and actual virtual distance were well fitted by compressive power functions, indicating that the absence of visual information regarding the distance of sound sources may prevent accurate calibration of the distance information provided by auditory signals.
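
The compressive power functions referred to above take the form judged distance = k * (actual distance)^a, with an exponent a < 1 indicating compression; fitting a straight line in log-log coordinates recovers k and a. The sketch below fits such a function to invented response data, not to the data reported in the paper.

    # Fitting a compressive power function, judged = k * actual**a, by linear
    # regression in log-log space. The response data here are invented for
    # illustration, not taken from the study.
    import numpy as np

    actual = np.array([1.0, 2.0, 4.0, 8.0, 13.0])          # virtual source distances (m)
    judged = np.array([1.3, 2.1, 3.2, 4.9, 6.5])           # hypothetical judgments (m)

    a, log_k = np.polyfit(np.log(actual), np.log(judged), 1)
    k = np.exp(log_k)
    print(f"judged ≈ {k:.2f} * actual**{a:.2f}")            # a < 1 indicates compression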


Conference of the Industrial Electronics Society | 2006

Reusable VHDL Architectures for Induction Motor PWM Vector Control, targeting FPGAs

Abdulmagid Aounis; Silvia Cirstea; Marcian Cirstea

The paper presents a new approach to the modelling, simulation, digital controller design and implementation of an induction motor vector control system. The method uses the very high speed integrated circuit hardware description language (VHDL) as a single EDA environment for system modelling, evaluation and controller design. Simulation and experimental results are presented. Using VHDL for power-drive behavioural modelling and FPGA implementation of the control strategy offers several benefits: holistic analysis of the system so that an overall optimised control strategy can be adopted; efficient reuse of the typical vector control blocks across a range of digital controllers and IP cores; support for combined FPGA/microprocessor (core) digital controllers through hardware-software codesign; and, not least, a single flexible CAD environment for all phases of digital controller design and implementation, leading to short development time and short time to market.


International Symposium on Industrial Electronics | 2008

Depth extraction from 3D-integral images approached as an inverse problem

Silvia Cirstea; Amar Aggoun; Malcolm McCormick

The paper presents two methods for the extraction of depth information from planar recorded data of 3D (three-dimensional) integral images. A description of the integral imaging system and the associated point spread function is presented. Depth estimation from 3D-integral images is formulated as an inverse problem of integral image formation. To address the ill-posedness of the problem, approximate solutions are sought using so-called regularization methods. Two regularization schemes for obtaining constrained least-squares solutions are presented. The first algorithm is based on the projected Landweber method. The second is a constrained version of Tikhonov's regularization method for ill-posed problems. Finally, illustrative simulation results are given.
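
For a linear model b = A x of image formation, the two regularization schemes described can be sketched as follows: the projected Landweber iteration enforces a non-negativity constraint after each gradient step, while Tikhonov regularization solves a penalized least-squares problem in closed form. The operator, data, step size and regularization weight below are placeholders, not the integral-imaging point spread function from the paper.

    # Sketches of the two regularization schemes for an ill-posed linear problem
    # b = A x. A, b, the step size and the regularization weight are placeholders.
    import numpy as np

    def projected_landweber(A, b, tau, n_iter=500):
        """Landweber iteration with projection onto the non-negative orthant."""
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            x = x + tau * A.T @ (b - A @ x)     # gradient step on ||A x - b||^2 / 2
            x = np.maximum(x, 0.0)              # constraint: keep the estimate non-negative
        return x

    def tikhonov(A, b, alpha):
        """Closed-form minimizer of ||A x - b||^2 + alpha ||x||^2."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((30, 20))
    x_true = np.maximum(rng.standard_normal(20), 0.0)
    b = A @ x_true + 0.01 * rng.standard_normal(30)

    tau = 1.0 / np.linalg.norm(A, 2) ** 2       # step size within the convergence range
    print("Landweber error:", np.linalg.norm(projected_landweber(A, b, tau) - x_true))
    print("Tikhonov  error:", np.linalg.norm(tikhonov(A, b, alpha=0.1) - x_true))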

Collaboration


Silvia Cirstea's top co-authors and their affiliations:

Andrew Kolarik, Anglia Ruskin University
J.A. Barrett, Anglia Ruskin University
M.M. De Souza, Centro Universitário da FEI
A. Parera-Ruiz, Anglia Ruskin University