Publication


Featured research published by Kazuya Mera.


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2004

Emotion Analyzing Method Using Physiological State

Kazuya Mera; Takumi Ichimura

We propose a method to calculate the client's emotion from the content of the dialogue and the client's physical state. First, the system analyzes the client's utterances grammatically and calculates the degree of preference for each case-frame element. The system also extracts 20 features from four physiological signals (blood pressure, skin conductance, respiration, and heart rate) based on Picard's research. Both sets of data are input into a parallel sandglass-type neural network to calculate the user's pleasure/displeasure for the sentence. The pleasure/displeasure is then classified into 20 emotions.
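
A minimal sketch of the "parallel sandglass-type" idea described above: two branches (dialogue-preference features and the 20 physiological features), each squeezed through a narrow bottleneck layer, merged into a single pleasure/displeasure score. The layer sizes, feature dimensions, and random weights are assumptions for illustration; in the paper the network is trained.

```python
# Illustrative sketch (not the authors' code): a two-branch "sandglass" network
# whose branches are merged to produce a pleasure/displeasure score.
import numpy as np

rng = np.random.default_rng(0)

def branch(x, n_hidden):
    """One sandglass branch: wide input -> narrow bottleneck (tanh). Weights are random stand-ins."""
    w = rng.normal(scale=0.1, size=(x.size, n_hidden))
    return np.tanh(x @ w)

dialogue_feats = rng.normal(size=8)     # preference degrees of case-frame elements (assumed size)
physio_feats   = rng.normal(size=20)    # 20 features from the four physiological signals

merged = np.concatenate([branch(dialogue_feats, 3), branch(physio_feats, 3)])
w_out = rng.normal(scale=0.1, size=merged.size)
pleasure = np.tanh(merged @ w_out)      # >0: pleasure, <0: displeasure
print(f"pleasure/displeasure score: {pleasure:+.3f}")
```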


Computational Intelligence | 2013

Emotion-oriented agent in mental state transition learning network

Takumi Ichimura; Kazuya Mera

The mental state transition network, which consists of mental states connected to one another, is a basic concept for approximating human psychological and mental responses. It can represent the transition from one emotional state to another in response to a stimulus calculated by the Emotion Generating Calculations method. In this paper, an agent using the mental state transition network interacts with humans to realize smooth communication through two kinds of reinforcement learning methods. Experimental results show the variation of humans' delicate emotions. The proposed technique is expected to serve as an emotion-oriented interface for the treatment of mental disorders.
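
A rough sketch of the mental-state-transition idea: states connected by transition weights, with a stimulus (an emotion value) pushing the transition toward one state or another. The states, weights, and update rule here are assumptions for illustration; the paper adjusts the transitions with reinforcement learning.

```python
# Toy mental state transition network: the next state is sampled from
# base transition weights biased by the current stimulus.
import random

states = ["calm", "happy", "sad", "angry"]
# transition[s][t] = base tendency to move from state s to state t
transition = {s: {t: 0.25 for t in states} for s in states}

def step(current, stimulus):
    """stimulus: dict mapping a target state to a push strength in [0, 1]."""
    weights = [transition[current][t] + stimulus.get(t, 0.0) for t in states]
    return random.choices(states, weights=weights)[0]

state = "calm"
state = step(state, {"happy": 0.8})   # a pleasant utterance pushes toward "happy"
print(state)
```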


Systems, Man and Cybernetics | 2012

A method using acoustic features to detect inadequate utterances in medical communication

Michihisa Kurisu; Kazuya Mera; Ryunosuke Wada; Yoshiaki Kurosawa; Toshiyuki Takezawa

We previously proposed a method that uses grammatical features to detect inadequate utterances by doctors. However, nonverbal information, such as that conveyed by gestures, facial expressions, and tone of voice, is also important. In this paper, we propose a method that uses eight acoustic features to detect three types of mental states (sincerity, confidence, and doubtfulness/acceptance). A Support Vector Machine (SVM) is used to learn these features. Experiments showed that the system's accuracy and recall rates ranged from 0.79 to 0.91 and from 0.80 to 0.94, respectively.
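
A minimal sketch of the classification setup described above, assuming eight acoustic features per utterance and three mental-state labels. The feature values, label names, and toy data are stand-ins, not the authors' dataset.

```python
# Illustrative SVM over eight acoustic features with three mental-state labels.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 8))   # 8 acoustic features per utterance (e.g. pitch/power statistics)
y = rng.choice(["sincerity", "confidence", "doubtfulness"], size=60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:3]))      # predicted mental states for the first three utterances
```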


2011 International Conference on Speech Database and Assessments (Oriental COCOSDA) | 2011

A question-and-answer classification technique for constructing and managing spoken dialog system

Ryosuke Inoue; Yoshiaki Kurosawa; Kazuya Mera; Toshiyuki Takezawa

To recognize user speech accurately and respond to it appropriately, a spoken dialog system usually uses a question-and-answer database (QADB) that contains many question-and-answer pairs. The system first selects the question example that is most similar to the recognition result for the input voice. The answer sentence paired with the selected question example is then output to the user. Many systems use a large database so that a more appropriate answer can be output. However, with such a database, the waiting time increases because the system must find the most appropriate question example among a vast number of candidates. We propose a method of classifying the queries in the QADB. By classifying question examples into clusters using probabilistic latent semantic analysis (pLSA), an appropriate question example can be found more quickly than with the conventional method. We evaluated the validity of the proposed method by varying several parameters.
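
A sketch of the cluster-then-search idea: question examples are grouped into topic clusters offline, and at query time the system searches only within the nearest cluster. pLSA itself is not available in scikit-learn, so NMF over TF-IDF is used here as a rough stand-in for topic-based clustering; the QA pairs are invented for illustration.

```python
# Cluster-then-search over a toy question-and-answer database (QADB).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.metrics.pairwise import cosine_similarity

qa_pairs = [
    ("What time does the clinic open?", "It opens at 9 a.m."),
    ("How do I book an appointment?", "Please call the front desk."),
    ("Where is the pharmacy?", "On the first floor, next to reception."),
    ("Can I cancel my reservation?", "Yes, up to one day in advance."),
]
questions = [q for q, _ in qa_pairs]

vec = TfidfVectorizer().fit(questions)
Q = vec.transform(questions)
topics = NMF(n_components=2, init="nndsvda", random_state=0).fit(Q)
cluster_of = topics.transform(Q).argmax(axis=1)        # cluster id per question example

def answer(user_utterance: str) -> str:
    q = vec.transform([user_utterance])
    c = topics.transform(q).argmax()                    # pick the nearest cluster first ...
    idx = np.flatnonzero(cluster_of == c)
    best = idx[cosine_similarity(q, Q[idx]).argmax()]   # ... then search only within it
    return qa_pairs[best][1]

print(answer("what time do you open"))
```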


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2006

Expressed emotion calculation method according to the user's personality

Kazuya Mera; Takumi Ichimura

In human communication, people do not always express the emotions they feel inside. In this paper, we propose a method to calculate "expressed emotions" from "aroused emotions" based on the personality of the agent, the situation of the conversation, and the relationship with the partner. We consider five personality factors (extroversion, agreeableness, conscientiousness, neuroticism, and openness to experience) based on the "Big 5" model. The effect of each personality factor is calculated to amplify or suppress the aroused emotions. The amplification/suppression effects of the five personality factors are collated, and the degree of the expressed emotion is calculated.
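
A toy sketch of the amplify/suppress step: Big 5 personality scores modify the degree of each aroused emotion before it is expressed. The factor weights and the mapping from factors to emotions below are invented for illustration; the paper defines its own rules.

```python
# Illustrative amplification/suppression of aroused emotions by personality factors.
personality = {          # assumed 0..1 scores for the agent
    "extroversion": 0.8, "agreeableness": 0.6, "conscientiousness": 0.4,
    "neuroticism": 0.2, "openness": 0.7,
}

# Hypothetical per-emotion modifiers: positive -> amplify, negative -> suppress.
modifiers = {
    "joy":   +0.5 * personality["extroversion"] - 0.3 * (1 - personality["agreeableness"]),
    "anger": -0.6 * personality["agreeableness"] + 0.4 * personality["neuroticism"],
}

aroused = {"joy": 0.7, "anger": 0.5}
expressed = {e: max(0.0, d * (1.0 + modifiers[e])) for e, d in aroused.items()}
print(expressed)
```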


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2004

Emotion Oriented Intelligent System for Elderly People

Kazuya Mera; Yoshiaki Kurosawa; Takumi Ichimura

We propose an "emotion oriented intelligent interface for elderly people" so that they can access computers easily. We apply three methods concerning natural language dialogue and emotion: analyzing the user's utterances, estimating and expressing the user's emotions, and analyzing the user's intention from his/her utterances. With these three methods, the user can communicate with the system naturally. We constructed an interface system based on these methods, and it has been applied to a "web-based health service system for elderly people."


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2003

Emotion Generating Calculations Based on Hidden Markov Model

Kazuya Mera; Takumi Ichimura

Emotion Generating Calculations (EGC) can calculate a person's current emotion from the content of an utterance. In this paper, we extend EGC to account for the variation of emotions based on a Hidden Markov Model (HMM). Because EGC covers 20 types of emotion, a 20-dimensional vector is computed for each stimulus from the external world. The enfeeblement (decay) of the emotion vector's magnitude is expressed by an exponential distribution function, and the current emotion vector is calculated based on the HMM.
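
A sketch of the decay step only: the 20-dimensional emotion vector weakens exponentially over time and is then updated by the vector elicited by a new stimulus. The decay rate and vectors are assumptions, and the HMM that governs state transitions is omitted.

```python
# Illustrative enfeeblement (exponential decay) of a 20-dimensional emotion vector.
import numpy as np

N_EMOTIONS = 20
LAMBDA = 0.1                                   # assumed decay rate per unit time

def decay(emotion_vec, dt):
    return emotion_vec * np.exp(-LAMBDA * dt)

def update(emotion_vec, stimulus_vec, dt):
    return decay(emotion_vec, dt) + stimulus_vec

current = np.zeros(N_EMOTIONS)
stimulus = np.zeros(N_EMOTIONS)
stimulus[3] = 1.0                              # e.g. "joy" elicited by an utterance
current = update(current, stimulus, dt=5.0)
print(current[3])
```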


Journal of the Acoustical Society of America | 2016

Natural language dialog system considering speaker’s emotion for open-ended conversation

Takumi Takahashi; Kazuya Mera; Yoshiaki Kurosawa; Toshiyuki Takezawa

To respond appropriately to an utterance, a human-like communication system should consider not only the words in the utterance but also the speaker's emotion. We thus propose a natural language dialog system that can estimate the user's emotion from utterances and respond on the basis of the estimated emotion. To estimate the speaker's emotion (positive, negative, or neutral), 384 acoustic features extracted from an utterance are fed to a Support Vector Machine (SVM). Artificial Intelligence Markup Language (AIML)-based response-generation rules are expanded so that the speaker's emotion can be used as a condition of these rules. Two experiments were carried out to compare impressions of a dialog agent that considered emotion (the proposed system) with those of an agent that did not (the previous system). In the first experiment, 10 subjects evaluated their impressions after watching four conversation videos (no emotion estimation, correct emotion estimation, inadequate emotion estimation, and imperfect emotion ...
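
A minimal sketch of emotion-conditioned response selection in the spirit of the expanded AIML rules: the same user pattern maps to different replies depending on the emotion estimated for the utterance. The patterns, replies, and lookup scheme are invented for illustration, not the system's actual rule set.

```python
# Illustrative emotion-conditioned response rules (pattern, estimated emotion) -> reply.
rules = {
    ("HOW ARE YOU", "positive"): "I'm great! You sound cheerful too.",
    ("HOW ARE YOU", "negative"): "I'm fine. You sound a little down. Is something wrong?",
    ("HOW ARE YOU", "neutral"):  "I'm fine, thank you.",
}

def respond(utterance: str, estimated_emotion: str) -> str:
    key = (utterance.strip().upper(), estimated_emotion)
    return rules.get(key, "I see.")

print(respond("how are you", "negative"))
```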


Systems, Man and Cybernetics | 2011

Classification of EGC output and Mental State Transition Network using Self Organizing Map

Kazuya Mera; Takumi Ichimura

The Mental State Transition Network, which consists of mental states connected to one another, is a basic concept for approximating human psychological and mental responses. It can represent the transition from one emotional state to another in response to a stimulus computed by the Emotion Generating Calculations (EGC) method. However, although EGC can calculate the degrees of 20 emotions in parallel, this method ignores all emotions except the one with the strongest effect. In this paper, we investigate the discrepancy between the emotion groups produced by EGC and the clusters of sentence-emotion relations produced by a Self-Organizing Map. The mental state transitions according to the group of emotions on the map; for example, a set of emotions in one group shifts the mental state to "happy," and the negative mental state is enfeebled.
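
A minimal Self-Organizing Map sketch (not the authors' implementation) for mapping 20-dimensional EGC emotion vectors onto a small 2-D grid so that similar sentences fall into nearby cells. The map size, learning rate, neighborhood width, and random training data are assumptions.

```python
# Tiny SOM over 20-dimensional emotion vectors.
import numpy as np

rng = np.random.default_rng(0)
MAP_W, MAP_H, DIM = 5, 5, 20
weights = rng.random((MAP_W, MAP_H, DIM))
coords = np.stack(np.meshgrid(np.arange(MAP_W), np.arange(MAP_H), indexing="ij"), axis=-1)

def winner(x):
    """Grid cell whose weight vector is closest to x (best matching unit)."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(d.argmin(), d.shape)

def train(data, epochs=50, lr=0.5, sigma=1.5):
    global weights
    for _ in range(epochs):
        for x in data:
            w = np.array(winner(x))
            # Gaussian neighborhood around the winning cell
            h = np.exp(-np.sum((coords - w) ** 2, axis=-1) / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)

egc_vectors = rng.random((100, DIM))   # stand-in for per-sentence EGC outputs
train(egc_vectors)
print(winner(egc_vectors[0]))          # map cell the first sentence falls into
```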


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2000

Emotion-based planning evaluation method

Kazuya Mera; S. Kawamoto; Mitsuko Yamura-Takei; Teruaki Aizawa

A number of researchers have suggested that emotional systems in agents might have great practical value. We propose a system that makes decisions about a plan based on emotion (i.e., pleasure/displeasure) extracted from knowledge, in an attempt to build a human-like computer agent that can communicate better with users. We present an emotion-based plan evaluation method, provide results simulated on naturally occurring short-story scenarios, and show that plan evaluation based on the artificial emotion generated by our method successfully matches evaluation based on human emotion.
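
A toy sketch of emotion-based plan evaluation: each candidate plan is scored by summing the pleasure (positive) and displeasure (negative) degrees its predicted outcomes arouse, and the plan with the highest total is chosen. The plans and degrees below are invented for illustration.

```python
# Illustrative plan evaluation by summed pleasure/displeasure of predicted outcomes.
plans = {
    "take the bus": [("arrive on time", +0.6), ("crowded ride", -0.3)],
    "walk":         [("get exercise", +0.4), ("arrive late", -0.7)],
}

def evaluate(plan):
    return sum(degree for _, degree in plans[plan])

best = max(plans, key=evaluate)
print(best, evaluate(best))
```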

Collaboration


Dive into Kazuya Mera's collaboration.

Top Co-Authors

Takumi Ichimura
Prefectural University of Hiroshima

Toshiyuki Yamashita
Tokyo Metropolitan University

Teruaki Aizawa
Hiroshima City University

Akira Hara
Hiroshima City University

Makoto Yoshie
Hiroshima City University

Ryunosuke Wada
Hiroshima City University