Publication


Featured research published by Julie A. Brefczynski-Lewis.


Proceedings of the National Academy of Sciences of the United States of America | 2007

Neural correlates of attentional expertise in long-term meditation practitioners

Julie A. Brefczynski-Lewis; Antoine Lutz; H. S. Schaefer; D. B. Levinson; Richard J. Davidson

Meditation refers to a family of mental training practices designed to familiarize the practitioner with specific types of mental processes. One of the most basic forms is concentration meditation, in which sustained attention is focused on an object such as a small visual stimulus or the breath. Using functional MRI in age-matched participants, we found that activation in a network of brain regions typically involved in sustained attention followed an inverted U-shaped curve: expert meditators (EMs) with an average of 19,000 h of practice had more activation than novices, but EMs with an average of 44,000 h had less. In response to distracter sounds used to probe the meditation, EMs vs. novices had less brain activation in regions related to discursive thoughts and emotions and more activation in regions related to response inhibition and attention. Correlation with hours of practice suggests possible plasticity in these mechanisms.


PLOS ONE | 2008

Regulation of the Neural Circuitry of Emotion by Compassion Meditation: Effects of Meditative Expertise

Antoine Lutz; Julie A. Brefczynski-Lewis; Tom Johnstone; Richard J. Davidson

Recent brain imaging studies using functional magnetic resonance imaging (fMRI) have implicated the insula and anterior cingulate cortices in the empathic response to another's pain. However, virtually nothing is known about the impact of the voluntary generation of compassion on this network. To investigate these questions, we assessed brain activity using fMRI while novice and expert meditation practitioners generated a loving-kindness-compassion meditation state. To probe affective reactivity, we presented emotional and neutral sounds during the meditation and comparison periods. Our main hypothesis was that the concern for others cultivated during this form of meditation enhances affective processing, in particular in response to sounds of distress, and that this response to emotional sounds is modulated by the degree of meditation training. The presentation of the emotional sounds was associated with increased pupil diameter and activation of limbic regions (insula and cingulate cortices) during meditation (versus rest). During meditation, insula activation in response to negative sounds (relative to positive or neutral sounds) was greater in expert than in novice meditators. The strength of insula activation was also associated with self-reported intensity of the meditation for both groups. These results support the role of the limbic circuitry in emotion sharing. The comparison of meditation versus rest between experts and novices also showed increased activation in the amygdala, right temporo-parietal junction (TPJ), and right posterior superior temporal sulcus (pSTS) in response to all sounds, suggesting greater detection of the emotional sounds and enhanced mentation in response to emotional human vocalizations for experts than novices during meditation. Together these data indicate that the mental expertise to cultivate positive emotion alters the activation of circuitries previously linked to empathy and theory of mind in response to emotional stimuli.


The Journal of Neuroscience | 2009

Human Cortical Organization for Processing Vocalizations Indicates Representation of Harmonic Structure as a Signal Attribute

James W. Lewis; William J. Talkington; Nathan A. Walker; George A. Spirou; Audrey Jajosky; Chris Frum; Julie A. Brefczynski-Lewis

The ability to detect and rapidly process harmonic sounds, which in nature are typical of animal vocalizations and speech, can be critical for communication among conspecifics and for survival. Single-unit studies have reported neurons in auditory cortex sensitive to specific combinations of frequencies (e.g., harmonics), theorized to rapidly abstract or filter for specific structures of incoming sounds, where large ensembles of such neurons may constitute spectral templates. We studied the contribution of harmonic structure to activation of putative spectral templates in human auditory cortex by using a wide variety of animal vocalizations, as well as artificially constructed iterated rippled noises (IRNs). Both the IRNs and vocalization sounds were quantitatively characterized by calculating a global harmonics-to-noise ratio (HNR). Using functional MRI, we identified HNR-sensitive regions when presenting either artificial IRNs or recordings of natural animal vocalizations. This activation included regions situated between functionally defined primary auditory cortices and regions preferential for processing human nonverbal vocalizations or speech sounds. These results demonstrate that the HNR of sound reflects an important second-order acoustic signal attribute that parametrically activates distinct pathways of human auditory cortex. Thus, these results provide novel support for the presence of spectral templates, which may subserve a major role in the hierarchical processing of vocalizations as a distinct category of behaviorally relevant sound.
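The global harmonics-to-noise ratio that characterizes each stimulus can be illustrated with a short sketch. The study's exact HNR computation is not given in the abstract; the autocorrelation-based estimate below, with a hypothetical pitch search range of 50–1000 Hz, is one standard way to quantify how periodic, and hence harmonic, a sound is.

```python
import numpy as np

def global_hnr_db(signal, sr, fmin=50.0, fmax=1000.0):
    """Estimate a global harmonics-to-noise ratio in dB from the peak of the
    normalized autocorrelation within a plausible pitch (F0) lag range."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    # Full autocorrelation, keep non-negative lags, normalize so lag 0 == 1.
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]
    lo, hi = int(sr / fmax), int(sr / fmin)
    r = np.clip(np.max(ac[lo:hi]), 1e-6, 1.0 - 1e-6)  # periodicity strength
    # Ratio of periodic energy (r) to aperiodic energy (1 - r), in decibels.
    return 10.0 * np.log10(r / (1.0 - r))
```

A highly harmonic sound such as a pure tone yields a large positive HNR, while broadband noise yields a negative one, giving a single scalar that can be entered as a parametric regressor.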


Journal of Cognitive Neuroscience | 2006

Lefties Get It “Right” When Hearing Tool Sounds

James W. Lewis; Raymond E. Phinney; Julie A. Brefczynski-Lewis; Edgar A. DeYoe

Our ability to manipulate and understand the use of a wide range of tools is a feature that sets humans apart from other animals. In right-handers, we previously reported that hearing hand-manipulated tool sounds preferentially activates a left hemisphere network of motor-related brain regions hypothesized to be related to handedness. Using functional magnetic resonance imaging, we compared cortical activation in strongly right-handed versus left-handed listeners categorizing tool sounds relative to animal vocalizations. Here we show that tool sounds preferentially evoke activity predominantly in the hemisphere opposite the dominant hand, in specific high-level motor-related and multisensory cortical regions, as determined by a separate task involving pantomiming tool-use gestures. This organization presumably reflects the idea that we typically learn the meaning of tool sounds in the context of using them with our dominant hand, such that the networks underlying motor imagery or action schemas may be recruited to facilitate recognition.


Perspectives on Psychological Science | 2018

Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation

Nicholas T. Van Dam; Marieke K. van Vugt; David R. Vago; Laura Schmalzl; Clifford D. Saron; Andrew Olendzki; Ted Meissner; Sara W. Lazar; Catherine E. Kerr; Jolie Gorchov; Kieran C. R. Fox; Brent A. Field; Willoughby B. Britton; Julie A. Brefczynski-Lewis; David E. Meyer

During the past two decades, mindfulness meditation has gone from being a fringe topic of scientific investigation to being an occasional replacement for psychotherapy, tool of corporate well-being, widely implemented educational practice, and "key to building more resilient soldiers." Yet the mindfulness movement and empirical evidence supporting it have not gone without criticism. Misinformation and poor methodology associated with past studies of mindfulness may lead public consumers to be harmed, misled, and disappointed. Addressing such concerns, the present article discusses the difficulties of defining mindfulness, delineates the proper scope of research into mindfulness practices, and explicates crucial methodological issues for interpreting results from investigations of mindfulness. To do so, the authors draw on their diverse areas of expertise to review the present state of mindfulness research, comprehensively summarizing what we do and do not know, while providing a prescriptive agenda for contemplative science, with a particular focus on assessment, mindfulness training, possible adverse effects, and intersection with brain imaging. Our goals are to inform interested scientists, the news media, and the public; to minimize harm; to curb poor research practices; and to staunch the flow of misinformation about the benefits, costs, and future prospects of mindfulness meditation.


Journal of Cognitive Neuroscience | 2009

The topography of visuospatial attention as revealed by a novel visual field mapping technique

Julie A. Brefczynski-Lewis; Ritobrato Datta; James W. Lewis; Edgar A. DeYoe

Previously, we and others have shown that attention can enhance visual processing in a spatially specific manner that is retinotopically mapped in the occipital cortex. However, it is difficult to appreciate the functional significance of the spatial pattern of cortical activation just by examining the brain maps. In this study, we visualize the neural representation of the “spotlight” of attention using a back-projection of attention-related brain activation onto a diagram of the visual field. In the two main experiments, we examine the topography of attentional activation in the occipital and parietal cortices. In retinotopic areas, attentional enhancement is strongest at the locations of the attended target, but also spreads to nearby locations and even weakly to restricted locations in the opposite visual field. The dispersion of attentional effects around an attended site increases with the eccentricity of the target in a manner that roughly corresponds to a constant area of spread within the cortex. When averaged across multiple observers, these patterns appear consistent with a gradient model of spatial attention. However, individual observers exhibit complex variations that are unique but reproducible. Overall, these results suggest that the topography of visual attention for each individual is composed of a common theme plus a personal variation that may reflect their own unique “attentional style.”


Human Brain Mapping | 2011

Cortical network differences in the sighted versus early blind for recognition of human-produced action sounds.

James W. Lewis; Chris Frum; Julie A. Brefczynski-Lewis; William J. Talkington; Nathan A. Walker; Kristina M. Rapuano; Amanda L. Kovach

Both sighted and blind individuals can readily interpret meaning behind everyday real-world sounds. In sighted listeners, we previously reported that regions along the bilateral posterior superior temporal sulci (pSTS) and middle temporal gyri (pMTG) are preferentially activated when presented with recognizable action sounds. These regions have generally been hypothesized to represent primary loci for complex motion processing, including visual biological motion processing and audio-visual integration. However, it remained unclear whether, or to what degree, life-long visual experience might impact functions related to hearing perception or memory of sound-source actions. Using functional magnetic resonance imaging (fMRI), we compared brain regions activated in congenitally blind versus sighted listeners in response to hearing a wide range of recognizable human-produced action sounds (excluding vocalizations) versus unrecognized, backward-played versions of those sounds. Here, we show that recognized human action sounds commonly evoked activity in both groups along most of the left pSTS/pMTG complex, though with relatively greater activity in the right pSTS/pMTG by the blind group. These results indicate that portions of the postero-lateral temporal cortices contain domain-specific hubs for biological and/or complex motion processing independent of sensory-modality experience. Contrasting the two groups, the sighted listeners preferentially activated bilateral parietal plus medial and lateral frontal networks, whereas the blind listeners preferentially activated left anterior insula plus bilateral anterior calcarine and medial occipital regions, including what would otherwise have been visual-related cortex. These global-level network differences suggest that blind and sighted listeners may preferentially use different memory retrieval strategies when hearing and attempting to recognize action sounds.


Frontiers in Human Neuroscience | 2011

In the blink of an eye: neural responses elicited to viewing the eye blinks of another individual.

Julie A. Brefczynski-Lewis; Michael E. Berrebi; Marie E. McNeely; Amy L. Prostko; Aina Puce

Facial movements have the potential to be powerful social signals. Previous studies have shown that eye gaze changes and simple mouth movements can elicit robust neural responses, which can be altered as a function of potential social significance. Eye blinks are frequent events and are usually not deliberately communicative, yet blink rate is known to influence social perception. Here, we studied event-related potentials (ERPs) elicited to observing non-task-relevant blinks, eye closure, and eye gaze changes in a centrally presented natural face stimulus. Our first hypothesis (H1) that blinks would produce robust ERPs (N170 and later ERP components) was validated, suggesting that the brain may register and process all types of eye movement for potential social relevance. We also predicted an amplitude gradient for ERPs as a function of gaze change, relative to eye closure and then blinks (H2). H2 was only partly validated: large temporo-occipital N170s to all eye change conditions were observed and did not significantly differ between blinks and other conditions. However, blinks elicited late ERPs that, although robust, were significantly smaller relative to gaze conditions. Our data indicate that small and task-irrelevant facial movements such as blinks are measurably registered by the observer's brain. This finding is suggestive of the potential social significance of blinks, which, in turn, has implications for the study of social cognition and use of real-life social scenarios.


Frontiers in Human Neuroscience | 2013

Multiple faces elicit augmented neural activity

Aina Puce; Marie E. McNeely; Michael E. Berrebi; James C. Thompson; Jillian E. Hardee; Julie A. Brefczynski-Lewis

How do our brains respond when we are being watched by a group of people? Despite the large volume of literature devoted to face processing, this question has received very little attention. Here we measured the effects on the face-sensitive N170 and other ERPs of viewing displays of one, two, and three faces in two experiments. In Experiment 1, overall image brightness and contrast were adjusted to be constant, whereas in Experiment 2 local contrast and brightness of individual faces were not manipulated. A robust positive-negative-positive (P100-N170-P250) ERP complex and an additional late positive ERP, the P400, were elicited to all stimulus types. As the number of faces in the display increased, N170 amplitude increased for both stimulus sets, and latency increased in Experiment 2. P100 latency and P250 amplitude were affected by changes in overall brightness and contrast, but not by the number of faces in the display per se. In Experiment 1, when overall brightness and contrast were adjusted to be constant, later ERP (P250 and P400) latencies showed differences as a function of hemisphere. Hence, our data indicate that the N170 increases in magnitude when multiple faces are seen, apparently impervious to basic low-level stimulus features including stimulus size. Outstanding questions remain regarding category-sensitive neural activity elicited by viewing multiple items of stimulus categories other than faces.


IEEE Nuclear Science Symposium | 2011

HelmetPET: A silicon photomultiplier based wearable brain imager

S. Majewski; James Proffitt; Julie A. Brefczynski-Lewis; Alexander V. Stolin; Andrew G. Weisenberger; Wenze Xi; R. Wojcik

We are developing the HelmetPET, a wearable human PET brain imager with the potential application of evaluating brain function using PET-based radiopharmaceuticals in standing, balancing, or moving patients. The HelmetPET is composed of two rings of radiation detectors that together provide a cylindrical reconstructed volume with an axial length of 5 cm. Each ring is composed of twenty 2.5 cm2 silicon photomultiplier (SiPM) based detector modules. Each detector module is composed of a 5×5 array of twenty-five Hamamatsu S10362-33-050P Multi Pixel Photon Counters (MPPCs). The 3 mm2 MPPCs are arranged on a 5 mm step. Each MPPC module is coupled to a LYSO scintillator crystal array; two different LYSO pixel arrays are used: 1.0×1.0×10 mm3 and 1.5×1.5×10 mm3. The current phase of the project is to equip the forty 2.5 cm2 detector modules with resistive readout, assemble them in a helmet-type head support, and suspend them from a flexible mechanical mount.
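The geometry described above, two rings of twenty evenly spaced modules stacked axially to give a roughly 5 cm axial field of view, can be sketched numerically. The ring radius below is a hypothetical placeholder, since the abstract does not state one, and the z offsets simply center two 2.5 cm-long rings about the origin.

```python
import math

def ring_module_centers(n_modules=20, radius_cm=12.0, z_cm=0.0):
    """(x, y, z) centers of detector modules spaced evenly around one ring.

    radius_cm is a hypothetical value, not given in the abstract."""
    step = 2.0 * math.pi / n_modules
    return [(radius_cm * math.cos(i * step),
             radius_cm * math.sin(i * step),
             z_cm)
            for i in range(n_modules)]

# Two rings of 2.5 cm-long modules stacked axially -> ~5 cm axial FOV.
helmet = [m for z in (-1.25, 1.25) for m in ring_module_centers(z_cm=z)]
```

This yields the forty module positions mentioned in the abstract (twenty per ring), which is the kind of layout a list-mode reconstruction would take as its detector description.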

Collaboration

Top co-authors of Julie A. Brefczynski-Lewis:

James W. Lewis (West Virginia University)
Aina Puce (Indiana University Bloomington)
Chris Frum (West Virginia University)
James Proffitt (Thomas Jefferson National Accelerator Facility)
Jinyi Qi (University of California)