
Publications


Featured research published by Hamido Fujita.


Knowledge Based Systems | 2015

Application of entropies for automated diagnosis of epilepsy using EEG signals

U. Rajendra Acharya; Hamido Fujita; K. Vidya Sudarshan; Shreya Bhat; Joel E.W. Koh

Highlights: Epilepsy can be detected using EEG signals. Entropy indicates the complexity of the EEG signal. Various entropies are used to diagnose epilepsy. Unique ranges for the various entropies are proposed.

Epilepsy is a neurological disorder of the brain that is difficult to diagnose visually from electroencephalogram (EEG) signals. Hence, automated detection of epilepsy using EEG signals would be a useful tool in the medical field. Automating epilepsy detection with signal processing techniques such as the wavelet transform and entropies may optimise the performance of the system. Many algorithms have been developed to diagnose the presence of seizures in EEG signals. Entropy is a nonlinear parameter that reflects the complexity of the EEG signal, and many entropies have been used to differentiate normal, interictal and ictal EEG signals. This paper discusses the various entropies used for automated diagnosis of epilepsy using EEG signals. We present unique ranges for the various entropies used to differentiate normal, interictal and ictal EEG signals, and rank them according to their ability to discriminate the three classes. These entropies can be used to classify the different stages of epilepsy and can also be applied to other biomedical problems.
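
A minimal sketch of two of the entropy measures this survey covers, computed for a one-dimensional EEG segment. This is not the paper's code: the synthetic signal, the histogram bin count, and the SampEn parameters (m = 2, r = 0.2 times the standard deviation) are common illustrative defaults, not the paper's reported settings.

```python
import numpy as np

def shannon_entropy(x, bins=64):
    """Shannon entropy of the amplitude histogram of signal x (bits)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                        # ignore empty bins
    return -np.sum(p * np.log2(p))

def sample_entropy(x, m=2, r=None):
    """Naive O(N^2) sample entropy SampEn(m, r) of signal x."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)             # common choice: 20% of the SD
    n = len(x)

    def count_matches(mm):
        # embed the signal into overlapping vectors of length mm
        emb = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        count = 0
        for i in range(len(emb) - 1):
            # Chebyshev distance between template i and all later templates
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4, 4 * 173)      # ~173 Hz, 4-second synthetic segment
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
    print("Shannon entropy:", shannon_entropy(eeg))
    print("Sample entropy :", sample_entropy(eeg))
```

In the paper's setting, such values would be computed per EEG segment and their ranges compared across the normal, interictal and ictal classes.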


Knowledge Based Systems | 2017

A visual interaction consensus model for social network group decision making with trust propagation

Jian Wu; Francisco Chiclana; Hamido Fujita; Enrique Herrera-Viedma

A theoretical visual interaction framework to model consensus in social network group decision making (SN-GDM) is put forward with the following three main components: (1) construction of the trust relationship; (2) a trust-based recommendation mechanism; and (3) a visual adoption mechanism. To do so, dual trust propagation is investigated to complete incomplete trust relationships through trusted third partners, in a way that fits intuition in these cases: trust values decrease while distrust values increase along the propagation path. The trust relationship is used to determine the trust degree of experts and to aggregate individual opinions into a collective one. Three levels of consensus degree are defined and used to identify inconsistent experts. A trust-based recommendation mechanism is developed to generate advice according to individual trust relationships, making the recommendations more likely to be accepted by the inconsistent experts and thus helping to achieve higher levels of consensus. It therefore has an advantage over existing interaction models because it does not force the inconsistent experts to accept advice irrespective of their trust in it. Finally, a visual adoption mechanism, which provides visual representations of experts' individual consensus positions before and after adopting the recommendation advice, is presented and analysed theoretically. Experts can select appropriate feedback parameters to achieve a balance between group consensus and individual independence. Consequently, the proposed visual interaction model adds real and needed flexibility in guiding the consensus reaching process in SN-GDM.
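
An illustrative sketch only, not the paper's propagation operators: one simple way to propagate trust/distrust scores through a trusted third party that reproduces the qualitative behaviour described in the abstract (trust decays, distrust grows along the path). The product and probabilistic-sum operators below are common textbook choices chosen here as assumptions.

```python
from dataclasses import dataclass

@dataclass
class TrustScore:
    trust: float     # in [0, 1]
    distrust: float  # in [0, 1]

def propagate(ab: TrustScore, bc: TrustScore) -> TrustScore:
    """Estimate A's (trust, distrust) towards C via intermediary B."""
    trust = ab.trust * bc.trust                                       # never grows
    distrust = ab.distrust + bc.distrust - ab.distrust * bc.distrust  # never shrinks
    return TrustScore(trust, distrust)

if __name__ == "__main__":
    a_to_b = TrustScore(trust=0.9, distrust=0.1)
    b_to_c = TrustScore(trust=0.8, distrust=0.2)
    a_to_c = propagate(a_to_b, b_to_c)
    print(a_to_c)  # trust 0.72 < min(0.9, 0.8); distrust 0.28 > max(0.1, 0.2)
```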


Information Sciences | 2016

Towards felicitous decision making

Hai Wang; Zeshui Xu; Hamido Fujita; Shousheng Liu

The era of Big Data has arrived, along with the large-volume, complex and growing data generated by many distinct sources. Nowadays, nearly every aspect of modern society is affected by Big Data, including medicine, health care, business, management and government. It has been receiving growing attention from researchers in many disciplines, including the natural sciences, life sciences, engineering and even the arts and humanities, and it is leading to new research paradigms and ways of thinking. Many existing and emerging tools improve our ability to make more felicitous decisions than ever before. This paper presents an overview of Big Data covering four issues, namely: (i) the concepts, characteristics and processing paradigms of Big Data; (ii) the state-of-the-art techniques for decision making with Big Data; (iii) felicitous decision-making applications of Big Data in social science; and (iv) the current challenges of Big Data as well as possible future directions.


Knowledge Based Systems | 2009

Intelligent human interface based on mental cloning-based software

Hamido Fujita; Jun Hakura; Masaki Kurematsu

This paper reports on our experience in adapting the emotional experience of software engineers in the evolutionary design of software systems. It is a progress report on the state of the art of the multidisciplinary technologies needed to establish the best harmonic engagement between a human user and a software application, based on cognitive analysis. The best engagement performance has been achieved by using facial and voice analysis together; through these, we have measured (collected and quantified) and observed user behaviour and, accordingly, enhanced the engagement with a generative interactive scenario. The approach has been tested using a famous literary figure (Kenji Miyazawa).


Knowledge Based Systems | 2017

Automated detection of coronary artery disease using different durations of ECG segments with convolutional neural network

U. Rajendra Acharya; Hamido Fujita; Oh Shu Lih; Muhammad Adam; Jen Hong Tan; Chua Kuang Chua

Coronary artery disease (CAD) is caused by the blockage of the inner walls of the coronary arteries by plaque. This constriction reduces blood flow to the heart muscle, resulting in myocardial infarction (MI). The electrocardiogram (ECG) is commonly used to screen cardiac health. ECG signals are nonstationary and nonlinear in nature, and the transient disease indicators may appear randomly on the time scale. Therefore, diagnosing abnormal beats manually is arduous, time consuming and prone to human error; an automated diagnosis system overcomes these problems. In this study, convolutional neural network (CNN) structures comprising four convolutional layers, four max-pooling layers and three fully connected layers are proposed for the diagnosis of CAD using two-second and five-second ECG signal segments. The deep CNN is able to differentiate between normal and abnormal ECG with an accuracy of 94.95%, sensitivity of 93.72% and specificity of 95.18% for Net 1 (two seconds), and an accuracy of 95.11%, sensitivity of 91.13% and specificity of 95.88% for Net 2 (five seconds). The proposed system can help clinicians make accurate and reliable CAD diagnoses from ECG signals.
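
The sketch below follows only the layer counts stated in the abstract (four convolutional, four max-pooling and three fully connected layers); everything else is an illustrative assumption rather than the paper's architecture: kernel sizes, channel widths, and a 250 Hz sampling rate (so a two-second segment is 500 samples) are placeholders.

```python
import torch
import torch.nn as nn

class EcgCnn(nn.Module):
    """1-D CNN: 4 conv layers, 4 max-pooling layers, 3 fully connected layers."""
    def __init__(self, segment_len=500, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        pooled_len = segment_len // 16          # four halvings of the time axis
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * pooled_len, 64), nn.ReLU(),
            nn.Linear(64, 16), nn.ReLU(),
            nn.Linear(16, n_classes),           # normal vs. CAD
        )

    def forward(self, x):                       # x: (batch, 1, segment_len)
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EcgCnn()
    dummy = torch.randn(4, 1, 500)              # batch of 4 two-second segments
    print(model(dummy).shape)                   # torch.Size([4, 2])
```

A five-second variant (Net 2) would simply use a longer `segment_len`, with the first fully connected layer resized accordingly.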


Knowledge Based Systems | 2015

An integrated index for detection of Sudden Cardiac Death using Discrete Wavelet Transform and nonlinear features

U. Rajendra Acharya; Hamido Fujita; Vidya K. Sudarshan; Vinitha Sree; Lim Wei Jie Eugene; Dhanjoo N. Ghista; Ru San Tan

Highlights: A novel Sudden Cardiac Death Index (SCDI) is proposed using ECG signals. Nonlinear features are extracted from DWT coefficients. The SCDI is formulated using nonlinear features. The SCDI accurately predicts SCD 4 min before its onset.

Early prediction of persons at risk of Sudden Cardiac Death (SCD), with or without the onset of Ventricular Tachycardia (VT) or Ventricular Fibrillation (VF), remains a continuing challenge for clinicians. In this work, we present a novel integrated index for prediction of SCD with a high level of accuracy using electrocardiogram (ECG) signals. To achieve this, nonlinear features (Fractal Dimension (FD), Hurst exponent (H), Detrended Fluctuation Analysis (DFA), Approximate Entropy (ApproxEnt), Sample Entropy (SampEnt), and Correlation Dimension (CD)) are first extracted from the second-level Discrete Wavelet Transform (DWT) decomposition of the ECG signal. The extracted nonlinear features are ranked by t-value, and a combination of the highly ranked features is then used to formulate an integrated Sudden Cardiac Death Index (SCDI). This novel SCDI can be used to accurately predict SCD, using just one numerical value, four minutes before the episode. The nonlinear features are also fed to the following classifiers: Decision Tree (DT), k-Nearest Neighbour (KNN), and Support Vector Machine (SVM). The combination of DWT and nonlinear analysis of ECG signals is able to predict SCD with an accuracy of 92.11% (KNN), 98.68% (SVM), 93.42% (KNN) and 92.11% (SVM) for the first, second, third and fourth minutes before the occurrence of SCD, respectively. The proposed SCDI will constitute a valuable tool for medical professionals in SCD prediction.
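
A heavily simplified sketch of the pipeline shape described in the abstract: a level-2 DWT via PyWavelets, placeholder features per segment, t-value based ranking, and a weighted-sum index. The wavelet ('db4'), the two toy features, the synthetic data and the index weights are assumptions for illustration, not the paper's SCDI formulation.

```python
import numpy as np
import pywt
from scipy.stats import ttest_ind

def toy_features(segment):
    """Toy feature vector from the level-2 DWT detail coefficients."""
    _, d2, _ = pywt.wavedec(segment, 'db4', level=2)   # returns (cA2, cD2, cD1)
    hist, _ = np.histogram(d2, bins=32)
    p = hist / hist.sum()
    p = p[p > 0]
    shannon = -np.sum(p * np.log2(p))
    return np.array([np.std(d2), shannon])

rng = np.random.default_rng(1)
# Stand-in groups: white noise vs. random-walk segments (not real ECG data).
normal = np.array([toy_features(rng.standard_normal(1000)) for _ in range(30)])
at_risk = np.array([toy_features(np.cumsum(rng.standard_normal(1000))) for _ in range(30)])

# Rank features by the magnitude of their t-value between the two groups.
t_vals = ttest_ind(normal, at_risk, axis=0).statistic
order = np.argsort(-np.abs(t_vals))
print("feature ranking by |t|:", order)

# Integrated index: weighted sum of the ranked features (weights are arbitrary
# here; the paper derives its own SCDI formulation).
weights = np.abs(t_vals[order]) / np.abs(t_vals[order]).sum()
index_normal = normal[:, order] @ weights
index_risk = at_risk[:, order] @ weights
print("mean index (normal): %.3f, (at-risk): %.3f"
      % (index_normal.mean(), index_risk.mean()))
```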


Information Fusion | 2018

A minimum adjustment cost feedback mechanism based consensus model for group decision making under social network with distributed linguistic trust

Jian Wu; Lifang Dai; Francisco Chiclana; Hamido Fujita; Enrique Herrera-Viedma

A theoretical feedback mechanism framework to model consensus in social network group decision making (SN-GDM) is proposed with the following two main components: (1) the modelling of trust relationships with linguistic information; and (2) a minimum adjustment cost feedback mechanism. To do so, a distributed linguistic trust decision making space is defined, which includes the novel concepts of distributed linguistic trust functions, expectation degree, uncertainty degrees and a ranking method. Then, a social network analysis (SNA) methodology is developed to represent and model trust relationships within a networked group, and the trust in-degree centrality indexes are calculated to assign an importance degree to the associated user. To identify the inconsistent users, three levels of consensus degree with distributed linguistic trust functions are calculated. A novel feedback mechanism is then activated to generate recommendation advice for the inconsistent users to increase the group consensus degree. Its novelty is that it produces the boundary feedback parameter based on a minimum adjustment cost optimisation model. Therefore, the inconsistent users are able to reach the threshold value of group consensus with a minimum modification of their opinions, i.e. a minimum adjustment cost, which provides the optimum balance between group consensus and individual independence. Finally, after consensus has been achieved, a ranking order relation for distributed linguistic trust functions is constructed to select the most appropriate consensus alternative.
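
An illustrative sketch of one step mentioned in the abstract: computing trust in-degree centrality from a weighted, directed trust network and normalising it into importance degrees for the users. The example network, its weights and the simple normalisation are assumptions, not the paper's distributed-linguistic construction.

```python
import networkx as nx

G = nx.DiGraph()
# Edge u -> v with weight = trust that user u places in user v (values made up).
G.add_weighted_edges_from([
    ("e1", "e2", 0.8), ("e1", "e3", 0.6),
    ("e2", "e3", 0.9), ("e3", "e1", 0.4),
    ("e4", "e3", 0.7), ("e4", "e1", 0.5),
])

# Trust in-degree centrality: total trust received by each user.
in_trust = dict(G.in_degree(weight="weight"))
total = sum(in_trust.values())
importance = {u: w / total for u, w in in_trust.items()}
print(importance)  # weights usable to aggregate individual opinions
```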


Knowledge Based Systems | 2016

A hybrid approach to the sentiment analysis problem at the sentence level

Orestes Appel; Francisco Chiclana; Jenny Carter; Hamido Fujita

The objective of this article is to present a hybrid approach to the sentiment analysis problem at the sentence level. This new method uses essential natural language processing (NLP) techniques, a sentiment lexicon enhanced with the assistance of SentiWordNet, and fuzzy sets to estimate the semantic orientation polarity and its intensity for sentences, which provides a foundation for computing with sentiments. The proposed hybrid method is applied to three different datasets and the results are compared to those obtained using Naive Bayes and Maximum Entropy techniques. It is demonstrated that the presented hybrid approach is more accurate and precise than both Naive Bayes and Maximum Entropy techniques when the latter are used in isolation. In addition, it is shown that, when applied to datasets containing snippets, the proposed method performs similarly to state-of-the-art techniques.
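
A toy sketch of the general idea, not the authors' hybrid system: score a sentence with a tiny hand-made lexicon, then use triangular fuzzy sets to turn the raw score into a graded polarity intensity. The lexicon entries, the negation handling and the membership functions are made-up placeholders; the paper builds its lexicon with SentiWordNet and a richer NLP pipeline.

```python
LEXICON = {"good": 0.6, "great": 0.8, "bad": -0.6, "terrible": -0.9}

def sentence_score(sentence):
    """Crude lexicon score in [-1, 1] with single-token negation flipping."""
    score, flip = 0.0, 1.0
    for tok in sentence.lower().split():
        if tok == "not":
            flip = -1.0
            continue
        score += flip * LEXICON.get(tok, 0.0)
        flip = 1.0
    return max(-1.0, min(1.0, score))

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_intensity(score):
    """Graded positive polarity: three overlapping intensity sets."""
    return {
        "slightly positive": triangular(score, 0.0, 0.25, 0.5),
        "moderately positive": triangular(score, 0.25, 0.5, 0.75),
        "very positive": triangular(score, 0.5, 1.0, 1.5),
    }

if __name__ == "__main__":
    s = "the movie was great and not terrible"
    sc = sentence_score(s)
    print(sc, fuzzy_intensity(sc))
```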


Knowledge Based Systems | 2016

Automated detection and localization of myocardial infarction using electrocardiogram

U. Rajendra Acharya; Hamido Fujita; K. Vidya Sudarshan; Shu Lih Oh; Muhammad Adam; Joel E.W. Koh; Jen-Hong Tan; Dhanjoo N. Ghista; Roshan Joy Martis; Chua Kuang Chua; Chua Kok Poo; Ru San Tan

Identification and timely interpretation of changes occurring in the 12 electrocardiogram (ECG) leads are crucial for identifying the type of myocardial infarction (MI). However, manual annotation of this complex, nonlinear ECG signal is not only cumbersome and time consuming but also inaccurate. Hence, there is a need for computer-aided techniques for ECG signal analysis, and for incorporating such software into ECG equipment to enable automated detection of MI in clinics. Therefore, this paper proposes a novel method for automated detection and localization of MI using ECG signal analysis. In our study, a total of 611,405 beats (125,652 normal beats and 485,753 MI beats) from 200 twelve-lead ECG subjects (52 normal and 148 with MI) are segmented from the 12-lead ECG signals. First, the ECG signals obtained from the 12 leads are subjected to the discrete wavelet transform (DWT) up to four levels of decomposition. Then, 12 nonlinear features, namely approximate entropy, signal energy, fuzzy entropy, Kolmogorov-Sinai entropy, permutation entropy, Renyi entropy, Shannon entropy, Tsallis entropy, wavelet entropy, fractal dimension, Kolmogorov complexity, and the largest Lyapunov exponent, are extracted from these DWT coefficients. The extracted features are ranked by t-value and then fed one by one into the k-nearest neighbour (KNN) classifier to obtain the highest classification performance with the minimum number of features. Our proposed method achieves the highest average accuracy of 98.80%, sensitivity of 99.45% and specificity of 96.27% in classifying normal and MI ECG (two classes), using 47 features obtained from lead 11 (V5). We also obtain the highest average accuracy of 98.74%, sensitivity of 99.55% and specificity of 99.16% in differentiating the 10 types of MI and normal ECG beats (11 classes), using 25 features obtained from lead 9 (V3). In addition, our results achieve an accuracy of 99.97% in locating inferior posterior infarction using only the lead 9 (V3) ECG signal. Our proposed method can be used as an automated diagnostic tool (i) for the detection of the different (10 types of) MI using the 12-lead ECG signal, and (ii) to locate the MI by analysing only one lead, without the need to analyse the other leads. Thus, our proposed algorithm and computerized system software, incorporated into ECG equipment, can aid physicians and clinicians in locating MIs accurately and quickly, thereby providing adequate time for the requisite treatment decisions.
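
A sketch of the "feed ranked features one by one into a KNN classifier" step described above, on synthetic stand-in data (in the paper the features are the 12 DWT-domain nonlinear measures computed per lead). The data generation, the number of neighbours and the 5-fold evaluation are illustrative assumptions, not the paper's protocol.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X_normal = rng.normal(0.0, 1.0, size=(100, 12))
X_mi = rng.normal(0.3, 1.0, size=(100, 12))      # shifted class, 12 features
X = np.vstack([X_normal, X_mi])
y = np.array([0] * 100 + [1] * 100)

# Rank the 12 features by |t| between the two classes.
t_vals = ttest_ind(X[y == 0], X[y == 1], axis=0).statistic
ranked = np.argsort(-np.abs(t_vals))

# Add ranked features one at a time; keep the smallest subset with best accuracy.
best_acc, best_k = 0.0, 0
knn = KNeighborsClassifier(n_neighbors=5)
for k in range(1, len(ranked) + 1):
    acc = cross_val_score(knn, X[:, ranked[:k]], y, cv=5).mean()
    if acc > best_acc:
        best_acc, best_k = acc, k
print("best accuracy %.3f using the top %d ranked features" % (best_acc, best_k))
```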


Knowledge Based Systems | 2015

Systematic mapping study on granular computing

Saber Salehi; Ali Selamat; Hamido Fujita

Highlights: This paper surveys research on granular computing from 2012 to 2014. We defined four classification schemes to map the selected studies. A total of 112 relevant articles published in established journals were studied. 39% of the relevant articles belong to the rough set framework. The Fuzzy-GrC algorithm outperformed the other clustering algorithms in our experiments.

Granular computing has attracted many researchers as a new and rapidly growing paradigm of information processing. In this paper, we apply a systematic mapping study to classify granular computing research and to assess its strengths and gaps. Our search scope is limited to Science Direct and IEEE Transactions papers published between January 2012 and August 2014. We defined four classification schemes (focus area, contribution type, research type and framework) to map the selected studies. The mapping results show that almost half of the studies, by focus area, belong to the data analysis category, and that most of the selected papers propose solutions under the research type scheme. The distribution of papers among the tool, method and enhancement categories of contribution type is almost equal. Moreover, 39% of the relevant papers belong to the rough set framework. The results also show that little attention has been paid to cluster analysis in existing frameworks for discovering granules for classification. We therefore applied five clustering algorithms (DBSCAN, c-means, k-means, GAk-means and Fuzzy-GrC) to three datasets from the UCI repository to compare the form of the resulting information granules, and then classified the patterns into specific classes based on their geometry and membership. The information granules are compared in terms of coverage, misclassification and accuracy. The experimental results mostly show the Fuzzy-GrC and GAk-means algorithms to be superior to the other clustering algorithms, while the c-means clustering algorithm performs worse than the others.
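
A small comparison in the spirit of the paper's experiment: run two of the clustering algorithms it mentions (k-means and DBSCAN) on a readily available dataset (Iris, standing in for the UCI datasets used in the paper) and score the resulting granules against the known classes. The adjusted Rand index is used here as a stand-in for the coverage, misclassification and accuracy criteria; the DBSCAN parameters are illustrative guesses.

```python
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans, DBSCAN
from sklearn.metrics import adjusted_rand_score

X, y = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Two of the five algorithms compared in the paper (fuzzy variants omitted here).
labels_km = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
labels_db = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)

print("k-means ARI:", round(adjusted_rand_score(y, labels_km), 3))
print("DBSCAN  ARI:", round(adjusted_rand_score(y, labels_db), 3))
```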

Collaboration


Dive into Hamido Fujita's collaborations.

Top Co-Authors

Jun Hakura, Iwate Prefectural University
Masaki Kurematsu, Iwate Prefectural University
Tianrui Li, Southwest Jiaotong University
Jen Hong Tan, National University of Singapore