Jaewon Hwang
University of Rochester
Publications
Featured research published by Jaewon Hwang.
Neuron | 2008
Soyoun Kim; Jaewon Hwang; Daeyeol Lee
Reward from a particular action is seldom immediate, and the influence of such a delayed outcome on choice decreases with delay. It has been postulated that when faced with immediate and delayed rewards, decision makers choose the option with the maximum temporally discounted value. We examined the preference of monkeys for delayed reward in an intertemporal choice task and the neural basis for real-time computation of temporally discounted values in the dorsolateral prefrontal cortex. During this task, the locations of the targets associated with small or large rewards and their corresponding delays were randomly varied. We found that prefrontal neurons often encoded the temporally discounted value of reward expected from a particular option. Furthermore, activity tended to increase with discounted values for targets presented in the neuron's preferred direction, suggesting that activity related to temporally discounted values in the prefrontal cortex might determine the animal's behavior during intertemporal choice.
Frontiers in Behavioral Neuroscience | 2009
Jaewon Hwang; Soyoun Kim; Daeyeol Lee
Humans and animals are more likely to take an action leading to an immediate reward than actions with delayed rewards of similar magnitudes. Although such devaluation of delayed rewards has been almost universally described by hyperbolic discount functions, the rate of this temporal discounting varies substantially among different animal species. This might be due in part to differences in how the information about reward is presented to decision makers. In previous animal studies, reward delays or magnitudes were gradually adjusted across trials, so the animals learned the properties of future rewards from the rewards they waited for and consumed previously. In contrast, verbal cues have been used commonly in human studies. In the present study, rhesus monkeys were trained in a novel inter-temporal choice task in which the magnitude and delay of reward were indicated symbolically using visual cues and varied randomly across trials. We found that monkeys could extract the information about reward delays from visual symbols regardless of the number of symbols used to indicate the delay. The rate of temporal discounting observed in the present study was comparable to the previous estimates in other mammals, and the animals' choice behavior was largely consistent with hyperbolic discounting. Our results also suggest that the rate of temporal discounting might be influenced by contextual factors, such as the novelty of the task. The flexibility furnished by this new inter-temporal choice task might be useful for future neurobiological investigations on inter-temporal choice in non-human primates.
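The hyperbolic discount function referenced in this abstract can be sketched in a few lines. The discount rate k and the reward values below are illustrative placeholders, not parameters estimated in the study:

```python
def discounted_value(magnitude: float, delay: float, k: float) -> float:
    """Hyperbolic discounting: V = A / (1 + k * D)."""
    return magnitude / (1.0 + k * delay)

def choose(small: float, delay_small: float,
           large: float, delay_large: float, k: float) -> str:
    """Pick the option with the higher temporally discounted value."""
    v_small = discounted_value(small, delay_small, k)
    v_large = discounted_value(large, delay_large, k)
    return "large" if v_large > v_small else "small"

# With an assumed k = 0.2 per second, a 10-unit reward delayed by 10 s
# is discounted to 10 / (1 + 0.2 * 10) = 10 / 3, so a 4-unit immediate
# reward would be preferred instead.
```

Because the denominator grows linearly with delay, hyperbolic discounting falls off steeply at short delays and more gradually at long ones, which is the signature that distinguishes it from exponential discounting in choice data.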
Neural Networks | 2009
Soyoun Kim; Jaewon Hwang; Hyojung Seo; Daeyeol Lee
Humans and animals often must choose between rewards that differ in their qualities, magnitudes, immediacy, and likelihood, and must estimate these multiple reward parameters from their experience. However, the neural basis for such complex decision making is not well understood. To understand the role of the primate prefrontal cortex in determining the subjective value of delayed or uncertain reward, we examined the activity of individual prefrontal neurons during an inter-temporal choice task and a computer-simulated competitive game. Consistent with the findings from previous studies in humans and other animals, the monkeys' behavior during inter-temporal choice was well accounted for by a hyperbolic discount function. In addition, the activity of many neurons in the lateral prefrontal cortex reflected the signals related to the magnitude and delay of the reward expected from a particular action, and often encoded the difference in temporally discounted values that predicted the animal's choice. During a computerized matching pennies game, the animals approximated the optimal strategy, known as the Nash equilibrium, using a reinforcement learning algorithm. We also found that many neurons in the lateral prefrontal cortex conveyed the signals related to the animals' previous choices and their outcomes, suggesting that this cortical area might play an important role in forming associations between actions and their outcomes. These results show that the primate lateral prefrontal cortex plays a central role in estimating the values of alternative actions based on multiple sources of information.
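A reinforcement-learning account of the matching-pennies behavior described above can be illustrated with a minimal value-update and softmax-choice sketch. The learning rate and inverse temperature are assumptions for illustration, not the parameters fitted in the study:

```python
import math

def update_value(values, action, reward, alpha=0.1):
    """Move the chosen action's value toward the obtained reward (delta rule)."""
    values = list(values)
    values[action] += alpha * (reward - values[action])
    return values

def choice_probs(values, beta=3.0):
    """Softmax over action values: higher-valued actions are chosen more often."""
    exps = [math.exp(beta * v) for v in values]
    z = sum(exps)
    return [e / z for e in exps]

# After a rewarded choice of action 0, that action's value rises and the
# model becomes more likely to repeat it; the prefrontal signals about
# previous choices and outcomes could support this kind of update.
values = update_value([0.0, 0.0], action=0, reward=1.0)
probs = choice_probs(values)
```

Against an exploiting opponent in matching pennies, such a learner is driven toward choosing each side with equal probability, which is the Nash equilibrium the animals approximated.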
Neuroscience | 2012
Lizabeth M. Romanski; Jaewon Hwang
A number of studies have demonstrated that the relative timing of audiovisual stimuli is especially important for multisensory integration of speech signals, although the neuronal mechanisms underlying this complex behavior are unknown. Temporal coincidence and congruency are thought to underlie the successful merging of two intermodal stimuli into a coherent perceptual representation. It has been previously shown that single neurons in the non-human primate prefrontal cortex integrate face and vocalization information. However, these multisensory responses and the degree to which they depend on temporal coincidence have yet to be determined. In this study we analyzed the response latency of ventrolateral prefrontal (VLPFC) neurons to face, vocalization and combined face-vocalization stimuli, as well as an offset (asynchronous) version of the face-vocalization stimulus. Our results indicate that for most prefrontal multisensory neurons, the response latency for the vocalization was the shortest, followed by the combined face-vocalization stimuli; the face stimulus had the longest onset response latency. When tested with a dynamic face-vocalization stimulus that had been temporally offset (asynchronous), one-third of multisensory cells in VLPFC demonstrated a change in response compared to the response to the natural, synchronous face-vocalization movie. Our results indicate that prefrontal neurons are sensitive to the temporal properties of audiovisual stimuli. A disruption in the temporal synchrony of an audiovisual signal that results in a change in the firing of communication-related prefrontal neurons could underlie the loss of intelligibility that occurs with asynchronous speech stimuli.
The Journal of Neuroscience | 2015
Jaewon Hwang; Lizabeth M. Romanski
During communication we combine auditory and visual information. Neurophysiological research in nonhuman primates has shown that single neurons in ventrolateral prefrontal cortex (VLPFC) exhibit multisensory responses to faces and vocalizations presented simultaneously. However, whether VLPFC is also involved in maintaining those communication stimuli in working memory or combining stored information across different modalities is unknown, although its human homolog, the inferior frontal gyrus, is known to be important in integrating verbal information from auditory and visual working memory. To address this question, we recorded from VLPFC while rhesus macaques (Macaca mulatta) performed an audiovisual working memory task. Unlike traditional match-to-sample/nonmatch-to-sample paradigms, which use unimodal memoranda, our nonmatch-to-sample task used dynamic movies consisting of both facial gestures and the accompanying vocalizations. For the nonmatch conditions, a change in the auditory component (vocalization), the visual component (face), or both components was detected. Our results show that VLPFC neurons are activated by stimulus and task factors: while some neurons simply responded to a particular face or a vocalization regardless of the task period, others exhibited activity patterns typically related to working memory such as sustained delay activity and match enhancement/suppression. In addition, we found neurons that detected the component change during the nonmatch period. Interestingly, some of these neurons were sensitive to the change of both components and therefore combined information from auditory and visual working memory. These results suggest that VLPFC is not only involved in the perceptual processing of faces and vocalizations but also in their mnemonic processing.
Frontiers in Neuroscience | 2012
Soyoun Kim; Xinying Cai; Jaewon Hwang; Daeyeol Lee
The value of an object acquired by a particular action often determines the motivation to produce that action. Previous studies found neural signals related to the values of different objects or goods in the orbitofrontal cortex, while the values of outcomes expected from different actions are broadly represented in multiple brain areas implicated in movement planning. However, how the brain combines the values associated with various objects and the information about their locations is not known. In this study, we tested whether the neurons in the dorsolateral prefrontal cortex (DLPFC) and striatum in rhesus monkeys might contribute to translating the value signals between multiple frames of reference. Monkeys were trained to perform an oculomotor intertemporal choice task in which the color of a saccade target and the number of its surrounding dots signaled the magnitude of reward and its delay, respectively. In both DLPFC and striatum, temporally discounted values (DVs) associated with specific target colors and locations were encoded by partially overlapping populations of neurons. In the DLPFC, the information about reward delays and DVs of rewards available from specific target locations emerged earlier than the corresponding signals for target colors. Similar results were reproduced by a simple network model built to compute DVs of rewards in different locations. Therefore, DLPFC might play an important role in estimating the values of different actions by combining the previously learned values of objects and their present locations.
International Symposium on Neural Networks | 2009
Soyoun Kim; Jaewon Hwang; Hyojung Seo; Daeyeol Lee
The Journal of Neuroscience | 2015
Bethany Plakke; Jaewon Hwang; Lizabeth M. Romanski
Archive | 2015
Keisetsu Shima; Jun Tanji; Masahiko Takada; Yoshihiro Hirata; Shigehiro Miyachi; Ken-ichi Inoue; Taihei Ninomiya; Daisuke Takahara; Jaewon Hwang; Lizabeth M. Romanski; Joo-Hyun Song; Robert M. McPeek
Archive | 2015
Asif A. Ghazanfar; Chandramouli Chandrasekaran; Luis Lemus; Dipanwita Ghose; Alexander Maier; Aaron R. Nidiffer; Mark T. Wallace; C Perrodin; Christoph Kayser; Nk Logothetis; Christopher I. Petkov; Jaewon Hwang; Lizabeth M. Romanski