Neil Mennie
University of Nottingham Malaysia Campus
Publication
Featured research published by Neil Mennie.
Perception | 1999
Michael F. Land; Neil Mennie; Jennifer Rusted
The aim of this study was to determine the pattern of fixations during the performance of a well-learned task in a natural setting (making tea), and to classify the types of monitoring action that the eyes perform. We used a head-mounted eye-movement video camera, which provided a continuous view of the scene ahead, with a dot indicating foveal direction with an accuracy of about 1 deg. A second video camera recorded the subject's activities from across the room. The videos were linked and analysed frame by frame. Foveal direction was always close to the object being manipulated, and very few fixations were irrelevant to the task. The first object-related fixation typically led the first indication of manipulation by 0.56 s, and vision moved to the next object about 0.61 s before manipulation of the previous object was complete. Each object-related act that did not involve a waiting period lasted an average of 3.3 s and involved about 7 fixations. Roughly a third of all fixations on objects could be definitely identified with one of four monitoring functions: locating objects used later in the process, directing the hand or object in the hand to a new location, guiding the approach of one object to another (eg kettle and lid), and checking the state of some variable (eg water level). We conclude that although the actions of tea-making are ‘automated’ and proceed with little conscious involvement, the eyes closely monitor every step of the process. This type of unconscious attention must be a common phenomenon in everyday life.
Journal of Neurophysiology | 2008
Doris I. Braun; Neil Mennie; Christoph Rasche; Alexander C. Schütz; Michael J. Hawken; Karl R. Gegenfurtner
At slow speeds, chromatic isoluminant stimuli are perceived to move much slower than comparable luminance stimuli. We investigated whether smooth pursuit eye movements to isoluminant stimuli show an analogous slowing. Besides pursuit speed and latency, we studied speed judgments to the same stimuli during fixation and pursuit. Stimuli were either large sine wave gratings or small Gaussian blobs moving horizontally at speeds between 1 and 11 deg/s. Targets were defined by luminance contrast or color. Confirming prior studies, we found that speed judgments of isoluminant stimuli during fixation showed a substantial slowing when compared with luminance stimuli. A similarly strong and significant effect of isoluminance was found for pursuit initiation: compared with luminance targets of matched contrasts, latencies of pursuit initiation were delayed by 50 ms at all speeds, and eye accelerations were reduced for isoluminant targets. A small difference was found between steady-state eye velocities for luminance and isoluminant targets. For comparison, we measured latencies of saccades to luminance and isoluminant stimuli under similar conditions, but the effect of isoluminance was found only for pursuit. Parallel psychophysical experiments revealed that, unlike speed judgments of moving isoluminant stimuli made during fixation, judgments made during pursuit are veridical for the same stimuli at all speeds. Therefore, information about target speed seems to be available for pursuit eye movements and for speed judgments during pursuit, but is degraded for perceptual speed judgments during fixation and for pursuit initiation.
Neuropsychologia | 2010
Emer M. E. Forde; Jennifer Rusted; Neil Mennie; Michael F. Land; Glyn W. Humphreys
We examined eye movements in a patient, FK, who has action disorganisation syndrome (ADS), as he performed the everyday task of making a cup of tea. We compared his eye movements with those of a person with Alzheimer's disease and with healthy volunteers. Despite his very disorganised behaviour, many aspects of FK's eye movements were relatively normal. However, unlike normal participants, FK made no advance glances to objects that were about to be used, and he made increased numbers of fixations to irrelevant objects during the task. There were also differences in the durations of his eye movements during correct actions and during his perseverative and task-addition responses. We discuss the implications for understanding ADS and the cognitive processes required for correctly performing everyday tasks.
Human Vision and Electronic Imaging Conference | 2003
Roxanne L. Canosa; Jeff B. Pelz; Neil Mennie; Joseph Peak
Eye movements are an external manifestation of selective attention and can play an important role in indicating which attributes of a scene carry the most pertinent information. Models that predict gaze distribution often define a local conspicuity value that relies on low-level image features to indicate the perceived salience of an image region. While such bottom-up models have some success in predicting fixation densities in simple 2D images, success with natural scenes requires an understanding of the goals of the observer, including the perceived usefulness of an object in the context of an explicit or implicit task. In the present study, observers viewed natural images while their eye movements were recorded. Eye movement patterns revealed that subjects preferentially fixated objects relevant for potential actions implied by the gist of the scene, rather than selecting targets based purely on image features. A proto-object map is constructed that is based on highly textured regions of the image that predict the location of potential objects. This map is used as a mask to inhibit the unimportant low-level features and enhance the important features to constrain the regions of potential interest. The resulting importance map correlates well with subject fixations on natural-task images.
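The masking scheme this abstract describes, gating a low-level feature map with a texture-based proto-object map, can be sketched roughly as follows. This is an illustrative sketch, not the paper's implementation: local variance as the texture measure and the mean-based default threshold are assumptions made for the example.

```python
import numpy as np

def local_variance(img, win=5):
    """Local variance as a crude texture measure: highly textured
    regions are taken to predict the presence of potential objects."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

def importance_map(img, saliency, win=5, thresh=None):
    """Gate a low-level saliency map with a proto-object mask so that
    only textured (object-like) regions keep their saliency values."""
    tex = local_variance(img, win)
    if thresh is None:
        thresh = tex.mean()  # assumed threshold choice for the sketch
    mask = (tex > thresh).astype(float)
    return saliency * mask
```

Any real implementation would replace the local-variance stage with the paper's actual proto-object construction; the point of the sketch is only the masking step, where saliency outside textured regions is inhibited.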
Archive | 2014
M. M. Shahimin; Z. Mohammed; N. H. Saliman; N. Mohamad-Fadzil; N. A. Razali; H. A. Mutalib; Neil Mennie
This chapter reports the use of an infrared eye tracker in providing more objective and quantitative results when evaluating the reading performance of a congenital nystagmus subject. A 14-year-old boy was fitted with silicone hydrogel contact lenses to correct his high refractive errors. His distance visual acuities were 6/19 (with both spectacles and contact lenses), and his near visual acuity improved from N10 with spectacles to N8 after 10 weeks of wearing the prescribed contact lenses. His reading speed more than doubled, from 44 to 105 words per minute (wpm), with the contact lenses. The improvement in reading speed was consistent with the eye movement recordings, which revealed a staircase eye movement pattern, typically found in individuals with normal vision. Furthermore, his mean fixation duration during reading was found to be longer with contact lenses than with spectacles.
Cognitive Vision | 2009
Geoffrey M. Underwood; Neil Mennie; Katherine Humphrey; Jean Underwood
Two experiments examined the eye movements made when remembering pictures of real-world objects and scenes, and when those images are imagined rather than inspected. In Experiment 1, arrays of simple objects were first shown, and eye movements were used to indicate the location of an object declared as having been present in the array. Experiment 2 investigated the similarity of eye fixation scanpaths between the initial encoding of a picture of a real-world scene, a second viewing of that picture, and trying to imagine the picture from memory. Closer similarities were observed between phases that involved more similar tasks, and the scanpaths were just as similar whether the task was performed immediately or after two days. The possibility raised by these results is that images can be retrieved from memory by re-instating the sequence of fixations made during their initial encoding.
PLOS ONE | 2013
Shern Shiou Tan; Tomas Maul; Neil Mennie
Background: Visual-to-auditory conversion systems have been in existence for several decades. Image sonification systems are among the front runners in providing visual capabilities to blind users, and their auditory cues are easier to learn and adapt to than those of other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly.
Methodology: Performance is measured by both the interpretability and the information preservation of visual-to-auditory conversions. Interpretability is measured by computing the correlation of inter-image distance (IID) and inter-sound distance (ISD), whereas information preservation is computed by applying information theory to measure the entropy of both the visual and the corresponding auditory signals. These measurements provide a basis and some insights into how the systems work.
Conclusions: With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost-effectively regain enough visual function to lead secure and productive lives.
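The two measures described in the abstract, an IID/ISD correlation for interpretability and an entropy measure for information preservation, can be sketched minimally as below. The specific choices here (Euclidean distance on flattened signals, Pearson correlation, histogram-based entropy) are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np
from itertools import combinations

def pairwise_distances(items):
    """Euclidean distances between all pairs of flattened signals."""
    return np.array([np.linalg.norm(a.ravel() - b.ravel())
                     for a, b in combinations(items, 2)])

def interpretability(images, sounds):
    """Pearson correlation between inter-image distances (IID) and
    inter-sound distances (ISD): high correlation means the conversion
    preserves the similarity structure of the image set."""
    iid = pairwise_distances(images)
    isd = pairwise_distances(sounds)
    return np.corrcoef(iid, isd)[0, 1]

def shannon_entropy(signal, bins=32):
    """Entropy (bits) of a signal's value histogram, a crude measure
    of how much information the signal carries."""
    hist, _ = np.histogram(np.asarray(signal).ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

As a sanity check, a conversion that is an exact affine map of the images yields an IID/ISD correlation of 1, and a constant signal has zero entropy.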
bioRxiv | 2017
Neil Mennie; Rachael C. Symonds; Mazrul Mahadzir
Anthocyanins are an important part of the human diet and among the most commonly consumed plant secondary metabolites. They are potent antioxidants, and in several recent studies the ingestion of anthocyanins has been linked to positive health benefits for humans. Here, we show that when given a choice between two alternative samples of cabbage to ingest, captive-born orangutans (n = 6) voluntarily chose the sample that contained greater amounts of anthocyanin. This occurred when they had to decide between samples of red cabbage (Brassica oleracea var. capitata f. rubra) from the same plant (p<0.05), and between samples of green cabbage (Brassica oleracea var. capitata) (p<0.01). This indicates that anthocyanin holds a reward value for these hominids. There was no difference in L*a*b* colour between ingested and discarded samples of red cabbage, but when the choice was between two green samples, the animals chose samples that were more green and yellow. There was also no difference in lightness (L*) between chosen and discarded samples of either plant. It is therefore unclear whether the animals use leaf colour in decision-making. In addition to other macronutrients provided by plants, anthocyanin is also chosen by these endangered apes.
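The colour comparisons reported in this abstract use the CIELAB (L*a*b*) space, in which a* runs green-to-red (negative = greener) and b* blue-to-yellow (positive = yellower). A minimal sketch of such a comparison, using the standard CIE76 colour difference rather than the authors' actual analysis code, might look like this:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference (Delta E*ab) between two L*a*b* triples:
    the Euclidean distance in CIELAB space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def channel_differences(chosen, discarded):
    """Per-channel L*, a*, b* differences between a chosen and a
    discarded sample: negative da = chosen is greener, positive
    db = chosen is yellower (standard CIELAB conventions)."""
    dL, da, db = (c - d for c, d in zip(chosen, discarded))
    return {"dL": dL, "da": da, "db": db}
```

For example, comparing a chosen sample (50, -20, 30) against a discarded one (50, -10, 20) would show the chosen leaf as greener (da = -10) and yellower (db = 10), with no lightness difference, the pattern the abstract reports for green cabbage.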
PeerJ | 2015
Shern Shiou Tan; Tomas Maul; Neil Mennie
Loss of vision is a severe impairment to the dominant sensory system. It often has a catastrophic effect upon the sufferer, with knock-on effects on their standard of living, their ability to support themselves, and their care-givers' lives. Research into visual impairments is multi-faceted, focusing on the causes of these debilitating conditions as well as attempting to improve the daily lives of affected individuals. One such method is the use of a sensory substitution device. Our proposed system, Luminophonics, focuses on visual-to-auditory cross-modal information conversion. A visual-to-audio sensory substitution device is a type of system that obtains a continual stream of visual inputs and converts it into a corresponding auditory soundscape. Ultimately, this device allows the visually impaired to visualize the surrounding environment simply by listening to the generated soundscape. Even though there is huge potential for this kind of device,
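The abstract does not specify Luminophonics' actual conversion scheme, so purely as an illustrative stand-in, here is a minimal column-scan sonification of the kind used by classic visual-to-auditory systems such as The vOICe: image columns are scanned left to right over time, each row is assigned a sine tone (top row = highest pitch), and pixel brightness sets that tone's amplitude. All parameter choices (sample rate, frequency range, scan duration) are assumptions for the sketch.

```python
import numpy as np

def sonify(image, duration=1.0, sr=8000, f_lo=200.0, f_hi=2000.0):
    """Column-scan sonification (illustrative only, not Luminophonics):
    columns map to time, rows to sine-tone frequency (top = high pitch),
    and brightness (0..1) to each tone's amplitude. Returns a waveform
    normalised to peak amplitude 1."""
    rows, cols = image.shape
    n = int(duration * sr)
    t = np.arange(n) / sr
    freqs = np.linspace(f_hi, f_lo, rows)            # top row = highest pitch
    col_idx = np.minimum((t / duration * cols).astype(int), cols - 1)
    amps = image[:, col_idx]                         # (rows, n) amplitude envelope
    wave = (amps * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
    peak = np.abs(wave).max()
    return wave / peak if peak > 0 else wave
```

A bright pixel near the top-left of the image thus produces a high-pitched tone at the start of the soundscape, which is the kind of mapping a user learns to interpret over time.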
Experimental Brain Research | 2007
Neil Mennie; Mary Hayhoe; Brian Sullivan