LiMin Fu
University of Florida
Publications
Featured research published by LiMin Fu.
IEEE Transactions on Neural Networks | 1998
LiMin Fu
A challenging problem in machine learning is to discover the domain rules from a limited number of instances. In a large, complex domain, the rules learned by the computer are often only approximate. To address this problem, this paper describes the CFNet, which bases its activation function on the certainty factor (CF) model of expert systems. A new analysis of the computational complexity of rule learning in general is provided. A further analysis shows how this complexity can be reduced to the point where the domain rules can be learned accurately by capitalizing on the characteristics of the CFNet's activation function. The claimed capability is supported by empirical evaluations and comparisons with related systems.
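The MYCIN-style CF combining rule that the abstract's "certainty factor (CF) model" refers to is standard; the sketch below is only an illustration of how it could serve as an activation function. The function names and the sequential left-fold aggregation are assumptions, not the CFNet's published formulation.

```python
import numpy as np

def cf_combine(a, b):
    """MYCIN-style combination of two certainty factors in [-1, 1].
    Same-sign evidence reinforces; mixed-sign evidence attenuates."""
    if a >= 0 and b >= 0:
        return a + b * (1 - a)
    if a < 0 and b < 0:
        return a + b * (1 + a)
    return (a + b) / (1 - min(abs(a), abs(b)))

def cf_activation(weighted_inputs):
    """Aggregate a unit's weighted inputs with the CF combining rule
    instead of a sigmoid, folding them in one at a time."""
    cf = 0.0
    for x in weighted_inputs:
        cf = cf_combine(cf, float(np.clip(x, -1.0, 1.0)))
    return cf
```

Note how two confirming inputs of 0.5 combine to 0.75 rather than 1.0, mirroring the diminishing-returns behavior of CF-based expert systems.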
international symposium on neural networks | 1992
LiMin Fu
Knowledgetron, a novel intelligent system which derives rule-based expert systems from neural networks trained by a special computational model, is described. The knowledge of such neural networks is extracted and represented as production rules. The main consideration is that the generated rule-based system performs as well as the original neural network. The system consists of two coupled components. One is the KTBP trainer, which is applied to a multilayer neural network for learning from the data. The trained neural network is translated into a rule-based system by the second component, the KT translator. The feasibility and validity of Knowledgetron have been demonstrated on both small and large neural networks for practical applications.
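As an illustration of the kind of weight-to-rule translation described above, here is a toy sketch for a single unit: it searches the unit's positive weights for minimal input subsets that guarantee firing even when every other input contributes its worst-case negative weight. The exhaustive search, the zero firing threshold, and all names are hypothetical, not the published KT translator algorithm.

```python
from itertools import combinations

def kt_extract(weights, bias, names):
    """Toy rule extraction for one unit with the given input weights and bias.
    Emits IF-style antecedent conjunctions that suffice to fire the unit
    (net input > 0) under worst-case activity of the remaining inputs."""
    pos = [i for i, w in enumerate(weights) if w > 0]
    worst = sum(w for w in weights if w < 0)  # all inhibitory inputs active
    rules = []
    for r in range(1, len(pos) + 1):
        for subset in combinations(pos, r):
            if any(set(p) <= set(subset) for p in rules):
                continue  # skip supersets of an already-found minimal rule
            if sum(weights[i] for i in subset) + worst + bias > 0:
                rules.append(subset)
    return [" AND ".join(names[i] for i in s) for s in rules]
```

For weights [3.0, 2.0, -1.0] and bias -2.5, neither positive input alone overcomes the worst case, but the pair does, yielding the single rule "a AND b".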
IEEE Transactions on Neural Networks | 1998
LiMin Fu
The computational framework of rule-based neural networks inherits from the neural network and the inference engine of an expert system. In one approach, the network activation function is based on the certainty factor (CF) model of MYCIN-like systems. In this paper, it is shown theoretically that the neural network using the CF-based activation function requires relatively small sample sizes for correct generalization. This result is also confirmed by empirical studies in several independent domains.
IEEE Transactions on Neural Networks | 1996
LiMin Fu
A major development in knowledge-based neural networks is the integration of symbolic expert rule-based knowledge into neural networks, resulting in so-called rule-based neural (or connectionist) networks. An expert network here refers to a particular construct in which the uncertainty management model of symbolic expert systems is mapped into the activation function of the neural network. This paper addresses a yet-to-be-answered question: Why can expert networks generalize more effectively from a finite number of training instances than multilayered perceptrons? It formally shows that expert networks reduce generalization dimensionality and require relatively small sample sizes for correct generalization.
international symposium on neural networks | 1994
LiMin Fu; Hui-Huang Hsu; Jose C. Principe
How to learn new knowledge without forgetting old knowledge is a key issue in designing an incremental-learning neural network. In this paper, we present a rule-based connectionist approach in which old knowledge is preserved by bounding weight modifications. In addition, some heuristics are developed for avoiding overtraining of the network and adding new hidden units. The feasibility of this approach is demonstrated for classification problems including the iris and the promoter domains.
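A minimal sketch of the bounding idea, assuming a simple per-weight clipping radius around the pre-incremental weights; the paper's actual bounding heuristics, overtraining avoidance, and hidden-unit addition are not reproduced here.

```python
import numpy as np

def bounded_update(w, grad, lr, w_old, bound):
    """Take one gradient step, then clip each weight to stay within
    `bound` of its value before incremental training began, so the
    knowledge encoded in the old weights is not overwritten."""
    w_new = w - lr * grad
    return np.clip(w_new, w_old - bound, w_old + bound)
```

A large gradient on new data can then move a weight only as far as the bound allows, trading some plasticity for retention of the old rules.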
international symposium on neural networks | 1994
Hyeoncheol Kim; LiMin Fu
How to obtain maximum generalization and fault tolerance has been an important issue in designing a feedforward network. Research on rule-based neural networks suggests that generalization of a neural network is related to the directions of the pattern vectors encoded by hidden units, while fault tolerance depends on the magnitudes of the weights. In this paper, a rule-based neural network is shown to be better than a standard neural network in both generalization and fault tolerance. In addition, a formal measure for evaluating network fault tolerance is introduced.
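The formal measure itself is not given in this abstract; as an illustrative stand-in, fault tolerance can be probed empirically by injecting stuck-at-zero faults into a fraction of the weights and measuring residual accuracy. The function names, fault model, and sampling scheme below are assumptions.

```python
import numpy as np

def stuck_at_accuracy(predict, weights, X, y, drop_frac=0.1, trials=20, seed=0):
    """Repeatedly zero a random fraction of the weights ('stuck-at-zero'
    faults) and return the mean accuracy of the damaged network."""
    rng = np.random.default_rng(seed)
    flat = weights.ravel()
    n_fault = max(1, int(drop_frac * flat.size))
    accs = []
    for _ in range(trials):
        damaged = flat.copy()
        damaged[rng.choice(flat.size, n_fault, replace=False)] = 0.0
        preds = predict(damaged.reshape(weights.shape), X)
        accs.append(np.mean(preds == y))
    return float(np.mean(accs))
```

A network whose accuracy degrades gracefully under this probe is, in this informal sense, the more fault-tolerant one.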
international symposium on neural networks | 1991
LiMin Fu
Summary form only given. Attention is given to Knowledgetron, a novel, domain-independent system that automatically translates the knowledge of an artificial neural network, trained by backpropagation to classify data correctly, into a set of if-then rules. Knowledgetron generates rules from an adapted net efficiently. Learning parameters of the net that critically influence the system behavior are identified. The system can deal with both single-layer and multilayer networks, and can learn both confirming and disconfirming rules. The tractability of Knowledgetron's learning algorithm was shown analytically. Empirically, Knowledgetron was successfully demonstrated in the domain of wind-shear detection by infrared sensors.
international symposium on neural networks | 1999
LiMin Fu
Discovering underlying domain regularities or rules has long been a major goal of scientific research (knowledge discovery) and engineering application (problem solving). However, when the domain rules become complex, current machine learning programs learn only approximate rather than true domain rules from a limited amount of observed data. This paper presents a new neural-network-based system intended to discover the domain rules precisely, with neither false positives nor false negatives. In a performance study, this system is ten times more accurate than the best-known rule-learning system, C4.5, in terms of the rate of false rules induced from the data.
international symposium on neural networks | 1996
Hui-Huang Hsu; LiMin Fu; Jose C. Principe
This paper presents a gamma neural network that learns temporal context through adaptation of the depth of the maintained short-term memory. The capability of context learning by the gamma network is demonstrated in the domain of sleep staging.
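The gamma memory underlying such a network is a cascade of identical leaky integrators whose single parameter mu trades memory resolution for depth. A minimal sketch of the recursion follows; the adaptation of mu that the paper performs to learn memory depth is omitted here.

```python
import numpy as np

def gamma_memory(signal, taps, mu):
    """Run a gamma short-term memory over a 1-D signal.
    Each tap is a leaky integrator of the previous tap:
        x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1),
    with x_0(t) the current input. Returns the tap outputs per step."""
    x = np.zeros(taps + 1)
    outputs = []
    for s in signal:
        prev = x.copy()
        x[0] = s
        for k in range(1, taps + 1):
            x[k] = (1 - mu) * prev[k] + mu * prev[k - 1]
        outputs.append(x[1:].copy())
    return np.array(outputs)
```

An impulse fed into a one-tap memory with mu = 0.5 decays geometrically (0, 0.5, 0.25, ...), showing how smaller mu stretches the memory over more past samples.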
international symposium on neural networks | 1994
LiMin Fu
Shows how to represent rule-based domain knowledge in the framework of a neural network, and how to represent neural network knowledge in the form of rules for connectionist symbolic processing.