
Publication


Featured research published by John R. Iversen.


Experimental Brain Research | 2005

The influence of metricality and modality on synchronization with a beat

Aniruddh D. Patel; John R. Iversen; Yanqing Chen; Bruno H. Repp

The great majority of the world’s music is metrical, i.e., has periodic structure at multiple time scales. Does the metrical structure of a non-isochronous rhythm improve synchronization with a beat compared to synchronization with an isochronous sequence at the beat period? Beat synchronization is usually associated with auditory stimuli, but are people able to extract a beat from rhythmic visual sequences with metrical structure? We addressed these questions by presenting listeners with rhythmic patterns which were either isochronous or non-isochronous in either the auditory or visual modality, and by asking them to tap to the beat, which was prescribed to occur at 800-ms intervals. For auditory patterns, we found that a strongly metrical structure did not improve overall accuracy of synchronization compared with isochronous patterns of the same beat period, though it did influence the higher-level patterning of taps. Synchronization was impaired in weakly metrical patterns in which some beats were silent. For the visual patterns, we found that participants were generally unable to synchronize to metrical non-isochronous rhythms, or to rapid isochronous rhythms. This suggests that beat perception and synchronization have a special affinity with the auditory system.


Annals of the New York Academy of Sciences | 2009

Top-Down Control of Rhythm Perception Modulates Early Auditory Responses

John R. Iversen; Bruno H. Repp; Aniruddh D. Patel

Our perceptions are shaped by both extrinsic stimuli and intrinsic interpretation. The perceptual experience of a simple rhythm, for example, depends upon its metrical interpretation (where one hears the beat). Such interpretation can be altered at will, providing a model to study the interaction of endogenous and exogenous influences in the cognitive organization of perception. Using magnetoencephalography (MEG), we measured brain responses evoked by a repeating, rhythmically ambiguous phrase (two tones followed by a rest). In separate trials listeners were instructed to impose different metrical organizations on the rhythm by mentally placing the downbeat on either the first or the second tone. Since the stimulus was invariant, differences in brain activity between the two conditions should relate to endogenous metrical interpretation. Metrical interpretation influenced early evoked neural responses to tones, specifically in the upper beta range (20–30 Hz). Beta response was stronger (by 64% on average) when a tone was imagined to be the beat, compared to when it was not. A second experiment established that the beta increase closely resembles that due to physical accents, and thus may represent the genesis of a subjective accent. The results demonstrate endogenous modulation of early auditory responses, and suggest a unique role for the beta band in linking of endogenous and exogenous processing. Given the suggested role of beta in motor processing and long‐range intracortical coordination, it is hypothesized that the motor system influences metrical interpretation of sound, even in the absence of overt movement.


Journal of the Acoustical Society of America | 2008

Perception of rhythmic grouping depends on auditory experience

John R. Iversen; Aniruddh D. Patel; Kengo Ohgushi

Many aspects of perception are known to be shaped by experience, but others are thought to be innate universal properties of the brain. A specific example comes from rhythm perception, where one of the fundamental perceptual operations is the grouping of successive events into higher-level patterns, an operation critical to the perception of language and music. Grouping has long been thought to be governed by innate perceptual principles established a century ago. The current work demonstrates instead that grouping can be strongly dependent on culture. Native English and Japanese speakers were tested for their perception of grouping of simple rhythmic sequences of tones. Members of the two cultures showed different patterns of perceptual grouping, demonstrating that these basic auditory processes are not universal but are shaped by experience. It is suggested that the observed perceptual differences reflect the rhythms of the two languages, and that native language can exert an influence on general auditory perception at a basic level.


Journal of the Acoustical Society of America | 2006

Comparing the rhythm and melody of speech and music: The case of British English and French

Aniruddh D. Patel; John R. Iversen; Jason C. Rosenberg

For over half a century, musicologists and linguists have suggested that the prosody of a culture's native language is reflected in the rhythms and melodies of its instrumental music. Testing this idea requires quantitative methods for comparing musical and spoken rhythm and melody. This study applies such methods to the speech and music of England and France. The results reveal that music reflects patterns of durational contrast between successive vowels in spoken sentences, as well as patterns of pitch interval variability in speech. The methods presented here are suitable for studying speech-music relations in a broad range of cultures.


Trends in Cognitive Sciences | 2007

The linguistic benefits of musical abilities

Aniruddh D. Patel; John R. Iversen

Growing evidence points to a link between musical abilities and certain phonetic and prosodic skills in language. However, the mechanisms that underlie these relations are not well understood. A recent study by Wong et al. suggests that musical training sharpens the subcortical encoding of linguistic pitch patterns. We consider the implications of their methods and findings for establishing a link between musical training and phonetic abilities more generally.


Annals of the New York Academy of Sciences | 2009

Studying Synchronization to a Musical Beat in Nonhuman Animals

Aniruddh D. Patel; John R. Iversen; Micah R. Bregman; Irena Schulz

The recent discovery of spontaneous synchronization to music in a nonhuman animal (the sulphur‐crested cockatoo Cacatua galerita eleonora) raises several questions. How does this behavior differ from nonmusical synchronization abilities in other species, such as synchronized frog calls or firefly flashes? What significance does the behavior have for debates over the evolution of human music? What kinds of animals can synchronize to musical rhythms, and what are the key methodological issues for research in this area? This paper addresses these questions and proposes some refinements to the “vocal learning and rhythmic synchronization hypothesis.”


Aphasiology | 2008

Musical syntactic processing in agrammatic Broca's aphasia

Aniruddh D. Patel; John R. Iversen; Marlies Wassenaar; Peter Hagoort

Background: Growing evidence for overlap in the syntactic processing of language and music in non‐brain‐damaged individuals leads to the question of whether aphasic individuals with grammatical comprehension problems in language also have problems processing structural relations in music. Aims: The current study sought to test musical syntactic processing in individuals with Broca's aphasia and grammatical comprehension deficits, using both explicit and implicit tasks. Methods & Procedures: Two experiments were conducted. In the first experiment 12 individuals with Broca's aphasia (and 14 matched controls) were tested for their sensitivity to grammatical and semantic relations in sentences, and for their sensitivity to musical syntactic (harmonic) relations in chord sequences. An explicit task (acceptability judgement of novel sequences) was used. The second experiment, with 9 individuals with Broca's aphasia (and 12 matched controls), probed musical syntactic processing using an implicit task (harmonic priming). Outcomes & Results: In both experiments the aphasic group showed impaired processing of musical syntactic relations. Control experiments indicated that this could not be attributed to low‐level problems with the perception of pitch patterns or with auditory short‐term memory for tones. Conclusions: The results suggest that musical syntactic processing in agrammatic aphasia deserves systematic investigation, and that such studies could help probe the nature of the processing deficits underlying linguistic agrammatism. Methodological suggestions are offered for future work in this little‐explored area.


Psychological Research-psychologische Forschung | 2013

Synchronization with competing visual and auditory rhythms: bouncing ball meets metronome

Michael J. Hove; John R. Iversen; Allen Zhang; Bruno H. Repp

Synchronization of finger taps with periodically flashing visual stimuli is known to be much more variable than synchronization with an auditory metronome. When one of these rhythms is the synchronization target and the other serves as a distracter at various temporal offsets, strong auditory dominance is observed. However, it has recently been shown that visuomotor synchronization improves substantially with moving stimuli such as a continuously bouncing ball. The present study pitted a bouncing ball against an auditory metronome in a target–distracter synchronization paradigm, with the participants being auditory experts (musicians) and visual experts (video gamers and ball players). Synchronization was still less variable with auditory than with visual target stimuli in both groups. For musicians, auditory stimuli tended to be more distracting than visual stimuli, whereas the opposite was the case for the visual experts. Overall, there was no main effect of distracter modality. Thus, a distracting spatiotemporal visual rhythm can be as effective as a distracting auditory rhythm in its capacity to perturb synchronous movement, but its effectiveness also depends on modality-specific expertise.


Communicative & Integrative Biology | 2009

Avian and human movement to music: Two further parallels

Aniruddh D. Patel; John R. Iversen; Micah R. Bregman; Irena Schulz

It has recently been demonstrated that a nonhuman animal (the medium sulphur-crested cockatoo Cacatua galerita eleonora) can entrain its rhythmic movements to the beat of human music across a wide range of tempi. Entrainment occurs in "synchronized bouts", occasional stretches of synchrony embedded in longer sequences of rhythmic movement to music. Here we examine non-synchronized rhythmic movements made while dancing to music, and find strong evidence for a preferred tempo around 126 beats per minute (bpm). The animal shows best synchronization to music when the musical tempo is near this preferred tempo. The tendency to dance to music at a preferred tempo, and to synchronize best when the music is near this tempo, parallels how young humans move to music. These findings support the idea that avian and human synchronization to music have similar neurobiological foundations.


Journal of the Acoustical Society of America | 2006

Nonlinguistic rhythm perception depends on culture and reflects the rhythms of speech: Evidence from English and Japanese

Aniruddh D. Patel; John R. Iversen; Kengo Ohgushi

Does a listener’s native language influence the perception of nonverbal rhythms? One way to address this question is to examine the perception of rhythmic grouping, as suggested by Jakobson, Fant, and Halle in Preliminaries to Speech Analysis. Grouping refers to the perceptual clustering of events into higher-level units such as phrases. Several principles of grouping of auditory patterns, established a century ago, have come to be thought of as innate properties of perception. The supporting data have come entirely from Western cultures, however. In this study, native English and Japanese speakers were tested for their perception of grouping in simple rhythmic sequences of tones. Members of the two cultures showed different patterns of perceptual grouping, demonstrating that basic auditory processes are shaped by culture. Furthermore, measurements of the temporal structure of English versus Japanese speech and music suggest that the observed differences reflect the rhythms of the two languages, rather than innate properties of perception.

Collaboration


Dive into John R. Iversen's collaborations.

Top Co-Authors


Irena Schulz

University of California


Makiko Sadakata

Radboud University Nijmegen


Peter Desain

Radboud University Nijmegen
