Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Junhai Zhai is active.

Publication


Featured research published by Junhai Zhai.


Information Sciences | 2008

Induction of multiple fuzzy decision trees based on rough set technique

Xi-Zhao Wang; Junhai Zhai; Shu-Xia Lu

The integration of fuzzy sets and rough sets can lead to a hybrid soft-computing technique which has been applied successfully to many fields such as machine learning, pattern recognition and image processing. The key to this soft-computing technique is how to set up and make use of the fuzzy attribute reduct in fuzzy rough set theory. Given a fuzzy information system, we may find many fuzzy attribute reducts, each of which can contribute differently to decision-making. If only one fuzzy attribute reduct, perhaps the most important one, is selected to induce decision rules, some useful information hidden in the other reducts will unavoidably be lost. To make full use of the information provided by every individual fuzzy attribute reduct in a fuzzy information system, this paper presents a novel induction of multiple fuzzy decision trees based on the rough set technique. The induction consists of three stages. First, several fuzzy attribute reducts are found by a similarity-based approach; then a fuzzy decision tree is generated for each fuzzy attribute reduct according to the fuzzy ID3 algorithm. Finally, the fuzzy integral is used as a fusion tool to integrate the generated decision trees: it combines the outputs of the multiple fuzzy decision trees into the final decision result. An illustration is given to show the proposed fusion scheme. A numerical experiment on real data indicates that, for learning problems with many attributes, the proposed multiple-tree induction is superior to single-tree induction based on an individual reduct or on the entire feature set.
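
The fusion stage can be sketched as follows. The snippet below shows only stage three, combining the class confidences produced by several already-trained fuzzy decision trees with a Choquet fuzzy integral; the reduct search and the fuzzy ID3 trees themselves are assumed to exist elsewhere, and the distorted-probability fuzzy measure and the example confidences are illustrative choices rather than the paper's exact construction.

import numpy as np

def choquet_integral(values, measure):
    """Choquet integral of `values` (one score per tree) with respect to
    `measure`, a function mapping a frozenset of tree indices to [0, 1]
    with measure(empty set) = 0 and measure(all trees) = 1."""
    values = np.asarray(values, dtype=float)
    order = np.argsort(values)                 # ascending
    total, prev = 0.0, 0.0
    for k, idx in enumerate(order):
        remaining = frozenset(order[k:])       # trees whose score >= values[idx]
        total += (values[idx] - prev) * measure(remaining)
        prev = values[idx]
    return total

def distorted_probability(weights, q=0.5):
    """An illustrative non-additive fuzzy measure: g(A) = (sum of weights in A)^q."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return lambda subset: float(sum(w[i] for i in subset)) ** q

# Hypothetical class confidences of three trees (one per fuzzy attribute reduct)
# for a single test sample; columns are classes.
tree_outputs = np.array([[0.8, 0.2],
                         [0.6, 0.4],
                         [0.3, 0.7]])
g = distorted_probability(weights=[0.5, 0.3, 0.2])        # assumed per-tree importance
fused = [choquet_integral(tree_outputs[:, c], g) for c in range(tree_outputs.shape[1])]
print("fused confidences:", fused, "-> predicted class", int(np.argmax(fused)))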


Soft Computing | 2012

Dynamic ensemble extreme learning machine based on sample entropy

Junhai Zhai; Hong-yu Xu; Xi-Zhao Wang

Extreme learning machine (ELM) is a learning algorithm proposed for single-hidden-layer feed-forward neural networks. By randomly selecting the input weights and hidden-layer biases, ELM overcomes many drawbacks of traditional gradient-based learning algorithms, such as local minima, improper learning rates, and slow learning. However, ELM suffers from instability and over-fitting, especially on large datasets. In this paper, a dynamic ensemble extreme learning machine based on sample entropy is proposed, which can alleviate the problems of instability and over-fitting to some extent and increase the prediction accuracy. The experimental results show that the proposed approach is robust and efficient.
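
For readers unfamiliar with ELM, the following minimal sketch shows the base learner such an ensemble is built on: input weights and hidden biases are drawn at random, and only the output weights are solved in closed form via the Moore-Penrose pseudo-inverse. The class name SimpleELM and all parameter values are illustrative; the dynamic, sample-entropy-based weighting described in the abstract is not reproduced here.

import numpy as np

class SimpleELM:
    """Illustrative single-hidden-layer ELM: random input weights and biases,
    output weights solved in closed form with the pseudo-inverse."""

    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng(0)

    def _hidden(self, X):
        # Sigmoid activations of the randomly projected inputs.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, Y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
        self.b = self.rng.normal(size=self.n_hidden)                # random hidden biases
        self.beta = np.linalg.pinv(self._hidden(X)) @ Y             # closed-form output weights
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage with one-hot targets for a two-class problem.
X = np.vstack([np.random.randn(50, 3) + 2, np.random.randn(50, 3) - 2])
Y = np.vstack([np.tile([1, 0], (50, 1)), np.tile([0, 1], (50, 1))])
pred = SimpleELM(n_hidden=20).fit(X, Y).predict(X)
print("training accuracy:", (pred.argmax(1) == Y.argmax(1)).mean())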


Soft Computing | 2011

Fuzzy decision tree based on fuzzy-rough technique

Junhai Zhai

Using an efficient criterion for selecting fuzzy conditional attributes (i.e., expanded attributes) is important for the generation of fuzzy decision trees. Given a fuzzy information system (FIS), fuzzy conditional attributes play a crucial role in fuzzy decision making. Moreover, different fuzzy conditional attributes have different influences on decision making, and some of them may be more important than others. Two well-known criteria employed to select expanded attributes are fuzzy classification entropy and classification ambiguity, both of which essentially use a ratio of uncertainty to measure the significance of fuzzy conditional attributes. Based on the fuzzy-rough technique, this paper proposes a new criterion in which expanded attributes are selected by the significance of fuzzy conditional attributes with respect to fuzzy decision attributes. An illustrative example as well as experimental results demonstrates the effectiveness of the proposed method.
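
A minimal sketch of the kind of fuzzy-rough significance measure the abstract refers to is given below: the dependency of the decision on a set of condition attributes, computed from a fuzzy similarity relation and the fuzzy lower approximation, and the significance of an attribute as the increase in dependency when it is added. The similarity relation and the Kleene-Dienes implicator are common textbook choices and not necessarily the exact operators used in the paper.

import numpy as np

def similarity_relation(X, attrs):
    """Fuzzy similarity R_B(x, y) = min over a in B of 1 - |a(x) - a(y)| / range(a)."""
    n = X.shape[0]
    R = np.ones((n, n))
    for a in attrs:
        col = X[:, a]
        spread = col.max() - col.min()
        spread = spread if spread > 0 else 1.0
        R = np.minimum(R, 1.0 - np.abs(col[:, None] - col[None, :]) / spread)
    return R

def dependency(X, y, attrs):
    """gamma_B(d): mean membership of the samples to the fuzzy positive region,
    using the Kleene-Dienes implicator max(1 - R(x, y), D(y))."""
    R = similarity_relation(X, attrs)
    pos = np.zeros(X.shape[0])
    for cls in np.unique(y):
        D = (y == cls).astype(float)                              # crisp decision class
        lower = np.min(np.maximum(1.0 - R, D[None, :]), axis=1)   # fuzzy lower approximation
        pos = np.maximum(pos, lower)
    return pos.mean()

def significance(X, y, attr, base_attrs):
    """Increase in dependency when `attr` is added to `base_attrs`."""
    return dependency(X, y, list(base_attrs) + [attr]) - dependency(X, y, list(base_attrs))

# Toy usage: rank candidate expanded attributes by their significance.
X = np.random.rand(30, 4)
y = (X[:, 0] > 0.5).astype(int)
print({a: round(significance(X, y, a, base_attrs=[]), 3) for a in range(4)})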


International Journal of Pattern Recognition and Artificial Intelligence | 2008

Fast fuzzy multicategory SVM based on support vector domain description

Xi-Zhao Wang; Shu-Xia Lu; Junhai Zhai

This paper proposes a fast fuzzy multicategory support vector machine classifier (FMSVM) based on support vector domain description (SVDD). The main idea is that the proposed FMSVM is obtained by considering all data directly in one optimization formulation, assigning a fuzzy membership to each input point. The fuzzy membership is determined by SVDD. To make the support vector machine (SVM) more practical, we use an implementation of a modified sequential minimal optimization (SMO) algorithm that can quickly solve the SVM quadratic programming (QP) problems without any extra matrix storage or numerical QP optimization steps. Compared with existing SVMs, the newly proposed FMSVM, which uses the L2-norm in the objective function, shows improved classification accuracy and reduced sensitivity to noise and outliers. The experiments also show the efficiency of the modified SMO in speeding up the training of the SVM.
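
The membership-assignment idea can be sketched as follows: points far from the description of their own class receive a smaller fuzzy membership, so noise and outliers contribute less to the SVM training objective. The paper derives the description from SVDD; as a simpler stand-in, this sketch measures the kernel-space distance to the class mean, and the helper names and parameter values are illustrative. Training the multicategory SVM itself is not shown.

import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def class_memberships(X, y, gamma=0.5, delta=1e-3):
    """Fuzzy membership s_i = 1 - d_i / (d_max + delta) per class, where d_i is
    the kernel-space distance of x_i to its class centre (a stand-in for the
    SVDD sphere centre used in the paper)."""
    s = np.zeros(len(y))
    for cls in np.unique(y):
        Xi = X[y == cls]
        K = rbf_kernel(Xi, Xi, gamma)
        # ||phi(x) - centre||^2 = k(x, x) - 2 * mean_j k(x, x_j) + mean_jk k(x_j, x_k)
        d2 = np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()
        d = np.sqrt(np.maximum(d2, 0.0))
        s[y == cls] = 1.0 - d / (d.max() + delta)
    return s

# The memberships can then be passed as per-sample weights to an SVM trainer,
# e.g. sklearn's SVC().fit(X, y, sample_weight=class_memberships(X, y)).
X = np.vstack([np.random.randn(40, 2), np.random.randn(40, 2) + 3])
y = np.array([0] * 40 + [1] * 40)
print(class_memberships(X, y)[:5])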


International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems | 2013

Fusion of extreme learning machine with fuzzy integral

Junhai Zhai; Hongyu Xu; Yan Li

Extreme learning machine (ELM) is an efficient and practical learning algorithm for training single-hidden-layer feed-forward neural networks (SLFNs). ELM can provide good generalization performance at extremely fast learning speed. However, ELM suffers from instability and over-fitting, especially on relatively large datasets. Based on probabilistic SLFNs, this paper proposes an approach that fuses extreme learning machines with the fuzzy integral (F-ELM). The proposed algorithm consists of three stages. First, the bootstrap technique is employed to generate several subsets of the original dataset. Second, probabilistic SLFNs are trained with the ELM algorithm on each subset. Finally, the trained probabilistic SLFNs are fused with the fuzzy integral. The experimental results show that the proposed approach can alleviate the problems mentioned above to some extent and can increase the prediction accuracy.
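
A skeleton of the three-stage pipeline is given below, under the assumption that the SimpleELM, distorted_probability, and choquet_integral helpers from the earlier sketches in this section are available as stand-ins for the paper's probabilistic SLFNs and fuzzy measure.

import numpy as np

def f_elm_predict(X_train, Y_train, X_test, n_models=5, rng=None):
    """Three-stage skeleton: bootstrap subsets -> train one ELM per subset ->
    fuse the outputs with a fuzzy integral (equal importance assumed)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(X_train)
    models = []
    for _ in range(n_models):                                   # stage 1: bootstrap subsets
        idx = rng.integers(0, n, size=n)
        models.append(SimpleELM(n_hidden=20, rng=rng).fit(X_train[idx], Y_train[idx]))  # stage 2
    outputs = np.stack([m.predict(X_test) for m in models])     # (models, samples, classes)
    g = distorted_probability(weights=np.ones(n_models))        # assumed equal model importance
    fused = np.array([[choquet_integral(outputs[:, i, c], g)    # stage 3: fuzzy-integral fusion
                       for c in range(outputs.shape[2])]
                      for i in range(outputs.shape[1])])
    return fused.argmax(axis=1)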


Journal of Intelligent and Fuzzy Systems | 2013

An improved algorithm for calculating fuzzy attribute reducts

Junhai Zhai; Mengyao Zhai; Chenyan Bai

The fuzzy rough attribute reduct has been widely used to remove redundant real-valued attributes without discretization. To date, there are two existing fuzzy rough attribute reduction methods: one is based on the dependency function and the other on the discernibility matrix. The former, proposed by Shen in 2002, can deal with fuzzy decision tables (FDTs) with real-valued condition attributes and fuzzy decision attributes. However, this algorithm does not converge on many real datasets, and its computational complexity increases exponentially with the number of input variables. The latter, proposed by Tsang in 2008, can only deal with fuzzy decision tables with real-valued condition attributes and symbol-valued decision attributes. In this paper, we extend the latter method and propose two algorithms for calculating fuzzy rough attribute reducts of fuzzy decision tables with real-valued condition and decision attributes. The first algorithm computes all fuzzy attribute reducts, but its computational complexity increases exponentially with the number of attributes. The second, a heuristic variant of the first, finds one near-optimal reduct. The experimental results show that the proposed methods are feasible and effective.
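
The heuristic (second) algorithm can be illustrated with a greedy, QuickReduct-style search that repeatedly adds the attribute giving the largest gain in fuzzy-rough dependency. This is a generic sketch in the spirit of the abstract, not the paper's discernibility-matrix construction, and it reuses the dependency function from the fuzzy-rough sketch given earlier in this section.

def greedy_fuzzy_reduct(X, y, tol=1e-6):
    """Greedily add the attribute that most increases the fuzzy-rough dependency
    until the dependency of the full attribute set is (nearly) reached."""
    all_attrs = list(range(X.shape[1]))
    target = dependency(X, y, all_attrs)
    reduct, best = [], 0.0
    while best + tol < target:
        gains = {a: dependency(X, y, reduct + [a]) for a in all_attrs if a not in reduct}
        a_star = max(gains, key=gains.get)
        if gains[a_star] <= best + tol:        # no remaining attribute helps
            break
        reduct.append(a_star)
        best = gains[a_star]
    return reduct

# Example: greedy_fuzzy_reduct(X, y) with X, y as in the earlier toy example.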


Journal of Intelligent and Fuzzy Systems | 2016

A cross-selection instance algorithm

Junhai Zhai; Ta Li; Xizhao Wang

Motivated by the idea of cross-validation, a novel instance selection algorithm is proposed in this paper. The novelties of the proposed algorithm are that (1) it cross-selects the important instances from the original data set with a committee, and (2) it can deal with the problem of selecting instances from large data sets. We experimentally compared our algorithm with five state-of-the-art approaches, namely CNN, ENN, RNN, MCS, and ICF, on 3 artificial data sets and 6 UCI data sets, including 4 large data sets ranging from 130K to 4898K instances in size. The experimental results show that the proposed algorithm is very efficient and effective, especially on large data sets.
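
The abstract gives only the high-level idea, so the following is one plausible reading rather than the paper's exact rule: the data are split into folds, each fold is scored by simple 1-NN committee members trained on the other folds, and an instance is retained only if a majority of the committee classifies it correctly. All function names and parameters are illustrative.

import numpy as np

def one_nn_predict(train_X, train_y, query_X):
    d = ((query_X[:, None, :] - train_X[None, :, :]) ** 2).sum(-1)
    return train_y[d.argmin(axis=1)]

def cross_select(X, y, n_folds=5, rng=None):
    """Keep an instance only if a majority of 1-NN committee members trained on
    the other folds classify it correctly (a hypothetical selection rule)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    keep = np.zeros(len(X), dtype=bool)
    for i, fold in enumerate(folds):
        members = [f for j, f in enumerate(folds) if j != i]     # one committee member per fold
        votes = np.stack([one_nn_predict(X[m], y[m], X[fold]) == y[fold] for m in members])
        keep[fold] = votes.sum(axis=0) > len(members) / 2        # majority vote
    return X[keep], y[keep]

X = np.vstack([np.random.randn(100, 2), np.random.randn(100, 2) + 3])
y = np.array([0] * 100 + [1] * 100)
X_sel, y_sel = cross_select(X, y)
print("kept", len(X_sel), "of", len(X), "instances")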


International Conference on Machine Learning and Cybernetics | 2014

Monotonic Decision Tree for Interval Valued Data

Hong Zhu; Junhai Zhai; Shanshan Wang; Xi-Zhao Wang

Traditional decision tree algorithms for interval-valued data can only deal with non-ordinal classification problems. In this paper, we present an algorithm for ordinal classification problems in which both the interval-valued condition attributes and the decision attributes satisfy the monotonicity requirement. The algorithm uses rank mutual information to select expanded attributes, which guarantees that the output decision tree is monotonic. The proposed algorithm is illustrated by a numerical example, and a monotonically consistent decision tree is generated. The design of the algorithm can provide useful guidelines for extending real-valued attributes to interval-valued attributes in ordinal decision tree induction.
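
Rank mutual information for ordinary real-valued attributes can be sketched as below; the paper's contribution is its extension to interval-valued attributes, which would replace the plain >= comparison with an interval order. The formula follows the usual rank-entropy definition with the dominating-set convention noted in the comments; the toy data are illustrative.

import numpy as np

def rank_mutual_information(a, d):
    """RMI(a; d) = -(1/n) * sum_i log(|[x_i]_a| * |[x_i]_d| / (n * |[x_i]_a & [x_i]_d|)),
    where [x_i]_a = { x_j : a(x_j) >= a(x_i) } is the set dominating x_i on a."""
    a, d = np.asarray(a, float), np.asarray(d, float)
    n = len(a)
    dom_a = a[None, :] >= a[:, None]           # dom_a[i, j]: x_j dominates x_i on the attribute
    dom_d = d[None, :] >= d[:, None]           # the same for the decision
    na = dom_a.sum(axis=1)
    nd = dom_d.sum(axis=1)
    nad = (dom_a & dom_d).sum(axis=1)
    return float(-np.mean(np.log(na * nd / (n * nad))))

# Toy usage: a monotonically related attribute scores well above a random one.
x = np.random.rand(50)
label = (x > 0.5).astype(float)
print("RMI(informative):", rank_mutual_information(x, label))
print("RMI(random):     ", rank_mutual_information(np.random.rand(50), label))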


Intelligent Data Analysis | 2014

Condensed fuzzy nearest neighbor methods based on fuzzy rough set technique

Junhai Zhai; Meng-Yao Zhai; Xiaomeng Kang

As a generalization of the K-nearest neighbor (K-NN) algorithm, the fuzzy K-nearest neighbor (fuzzy K-NN) algorithm was originally developed by Keller in 1985 to overcome one of the drawbacks of K-NN, namely that all instances are considered equally important. However, the fuzzy K-NN algorithm still suffers from the same large memory requirement as K-NN. To deal with this problem, based on the fuzzy rough set technique, this paper proposes two condensed fuzzy nearest neighbor methods, denoted CFK-NN1 and CFK-NN2, together with a modified fuzzy K-NN. Both CFK-NN1 and CFK-NN2 consist of three steps: (1) obtaining a fuzzy attribute reduct based on the fuzzy rough set technique; (2) finding two sets of prototypes, one selected from the fuzzy positive region (CFK-NN1) and the other from the fuzzy boundary region (CFK-NN2); and (3) extracting fuzzy classification rules with the modified fuzzy K-NN from the two sets of prototypes. Extensive experiments and statistical analysis are conducted to verify the effectiveness of the proposed methods. Both the experimental results and the statistical analysis demonstrate that the proposed methods outperform related methods such as CNN, ENN, and ICF.
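
Keller's fuzzy K-NN rule, which the condensed methods build on, can be sketched as follows: each of the k nearest neighbors contributes its fuzzy class membership weighted by inverse distance, so instances are no longer treated as equally important. The prototype-selection stages based on the fuzzy positive and boundary regions are not shown, and crisp one-hot labels are used as the initial memberships for simplicity.

import numpy as np

def fuzzy_knn_memberships(train_X, train_u, query_X, k=5, m=2.0, eps=1e-9):
    """train_u: (n_train, n_classes) fuzzy membership matrix; returns the
    predicted membership matrix for the query points (Keller's rule)."""
    d = np.sqrt(((query_X[:, None, :] - train_X[None, :, :]) ** 2).sum(-1))
    nn = np.argsort(d, axis=1)[:, :k]                      # indices of the k nearest neighbours
    d_nn = np.take_along_axis(d, nn, axis=1)
    w = 1.0 / (d_nn ** (2.0 / (m - 1.0)) + eps)            # inverse-distance weights
    u_nn = train_u[nn]                                     # (n_query, k, n_classes)
    return (w[:, :, None] * u_nn).sum(axis=1) / w.sum(axis=1, keepdims=True)

# Toy usage with crisp one-hot labels as the initial memberships.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 3])
y = np.array([0] * 50 + [1] * 50)
U = np.eye(2)[y]
memberships = fuzzy_knn_memberships(X, U, np.array([[0.0, 0.0], [3.0, 3.0]]))
print(memberships, "->", memberships.argmax(axis=1))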


Soft Computing | 2006

A Nonlinear Integral Defined on Partition and its Application to Decision Trees

Xi-Zhao Wang; Su-Fang Zhang; Junhai Zhai

Nonlinear integrals play an important role in information fusion. So far, all existing nonlinear integrals of a function with respect to a set function have been defined on a subset of a space. In many information fusion problems, such as decision tree generation in inductive learning, we often need to deal with a function defined on a partition of the space. Motivated by minimizing the classification information entropy of a partition while generating decision trees, this paper proposes a nonlinear integral of a function with respect to a nonnegative set function on a partition, and provides the conclusion that the weighted entropy of the union of several subsets is not less than the sum of the weighted entropies of those subsets. It is shown that selecting a single attribute by its entropy is better than selecting the union of several attributes in generating rules by decision trees.
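
The entropy inequality behind this conclusion can be checked numerically: merging blocks of a partition never decreases the weighted classification entropy. The toy labels and partition below are arbitrary; the paper establishes the result in the nonlinear-integral setting rather than by this direct computation.

import numpy as np

def weighted_entropy(labels, blocks):
    """Sum over blocks of (block size / n) * entropy of the class distribution in the block."""
    n = len(labels)
    total = 0.0
    for block in blocks:
        y = labels[block]
        p = np.bincount(y) / len(y)
        p = p[p > 0]
        total += (len(y) / n) * -(p * np.log2(p)).sum()
    return total

labels = np.array([0, 0, 0, 1, 1, 1, 0, 1])
fine   = [np.array([0, 1, 2]), np.array([3, 4, 5]), np.array([6, 7])]   # a 3-block partition
coarse = [np.concatenate([fine[0], fine[1]]), fine[2]]                  # merge the first two blocks
print("fine partition:  ", weighted_entropy(labels, fine))              # 0.25
print("coarse partition:", weighted_entropy(labels, coarse))            # 1.0 (never smaller)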

Collaboration


Dive into Junhai Zhai's collaborations.

Top Co-Authors

Su-Fang Zhang
China Meteorological Administration