Publication
Featured research published by Rafal Adamczak.
Neural Processing Letters | 1998
Włodzisław Duch; Rafal Adamczak; Krzysztof Grąbczewski
Three neural-based methods for the extraction of logical rules from data are presented. These methods facilitate conversion of graded-response neural networks into networks performing logical functions. The MLP2LN method converts a standard MLP into a network performing logical operations (LN). C-MLP2LN is a constructive algorithm creating such MLP networks. Logical interpretation is assured by adding constraints to the cost function, forcing the weights to ±1 or 0. Skeletal networks emerge, ensuring that a minimal number of logical rules is found. In both methods, rules covering many training examples are generated before more specific rules covering exceptions. The third method, FSM2LN, is based on probability density estimation. Several examples of the performance of these methods are presented.
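As an illustration of the weight-constraint idea, the sketch below (our own, not code from the paper) shows a regularization term of the kind described: one part pulls weights toward zero, another pulls the surviving weights toward +1 or -1, so that trained neurons can be read as logical functions. The exact form of the constraint term and the coefficients lam1 and lam2 are assumptions made for this sketch.

import numpy as np

def mlp2ln_penalty(weights, lam1=1e-4, lam2=1e-3):
    """Illustrative constraint term: the first part simplifies the network by
    pulling weights toward zero, the second pulls remaining weights toward +1
    or -1 so the unit can be interpreted as a logical (rule-like) function."""
    w = np.asarray(weights, dtype=float)
    pull_to_zero = 0.5 * lam1 * np.sum(w ** 2)
    pull_to_unit = 0.5 * lam2 * np.sum(w ** 2 * (w - 1.0) ** 2 * (w + 1.0) ** 2)
    return pull_to_zero + pull_to_unit

# A weight vector already close to {-1, 0, +1} incurs almost no extra penalty,
# while intermediate values are pushed toward those three targets.
print(mlp2ln_penalty([0.0, 1.0, -1.0]))
print(mlp2ln_penalty([0.5, -0.4, 0.7]))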
International Journal of Computational Intelligence and Applications | 2000
Włodzisław Duch; Rafal Adamczak; Krzysztof Grąbczewski; Grzegorz Żal; Yoichi Hayashi
A comprehensive methodology for the extraction of optimal sets of logical rules using neural networks and global minimization procedures has been developed. Initial rules are extracted using density-estimation neural networks with rectangular functions or multi-layered perceptron (MLP) networks trained with a constrained backpropagation algorithm, transforming MLPs into simpler networks performing logical functions. A constructive algorithm called C-MLP2LN is proposed, in which rules of increasing specificity are generated consecutively by adding more nodes to the network. Neural rule extraction is followed by optimization of the rules using global minimization techniques. Estimation of the confidence of various sets of rules is discussed. The hybrid approach to rule extraction has been applied to a number of benchmark and real-life problems with very good results. In many cases crisp logical rules are quite satisfactory, but sometimes fuzzy rules may be significantly more accurate.
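To illustrate the rule-optimization step, here is a minimal sketch assuming a single one-dimensional crisp rule and using SciPy's differential evolution as the global minimizer; the synthetic data, the rule form, and the choice of optimizer are our assumptions, not taken from the paper.

import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical data: one continuous feature x with a binary class label y.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(2.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
y = np.concatenate([np.zeros(200), np.ones(200)])

def rule_error(theta):
    """Training error of the crisp rule 'IF x > theta THEN class 1'."""
    predictions = (x > theta[0]).astype(int)
    return np.mean(predictions != y)

# Global search over the rule threshold; the paper discusses global
# minimization in general terms, so this particular optimizer is our choice.
result = differential_evolution(rule_error, bounds=[(x.min(), x.max())], seed=0)
print(f"optimal threshold ~ {result.x[0]:.2f}, training error ~ {result.fun:.3f}")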
Journal of Advanced Computational Intelligence and Intelligent Informatics | 1999
Włodzisław Duch; Rafal Adamczak; Krzysztof Grabczewski; Grzegorz Zal
A methodology for the extraction of optimal sets of logical rules using neural networks and global minimization procedures has been developed. Initial rules are extracted using density-estimation neural networks with rectangular functions or multi-layered perceptron (MLP) networks trained with a constrained backpropagation algorithm, transforming MLPs into simpler networks performing logical functions. A constructive algorithm called C-MLP2LN is proposed, in which rules of increasing specificity are generated consecutively by adding more nodes to the network. Neural rule extraction is followed by optimization of the rules using global minimization techniques. Estimation of the confidence of various sets of rules is discussed. The hybrid approach to rule extraction has been applied to a number of benchmark and real-life problems with very good results.
Intelligent Information Systems | 2000
Włodzisław Duch; Norbert Jankowski; Krzysztof Grabczewski; Rafal Adamczak
Machine learning methods are frequently used to create rule-based classifiers. For continuous features, the linguistic variables used in the conditions of the rules are defined by membership functions. These linguistic variables should be optimized at the level of single rules or sets of rules. Assuming Gaussian uncertainty of the input values makes it possible to increase the accuracy of predictions and to estimate the probabilities of different classes. Detailed interpretation of relevant rules is possible using (probabilistic) confidence intervals. A real-life example of such an interpretation is given for personality disorders. The approach to optimization and interpretation described here is applicable to any rule-based system.
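A minimal sketch of the Gaussian-uncertainty idea, under our reading of the abstract: if a measured value x carries Gaussian noise with standard deviation s, the crisp condition x > theta holds with a probability given by the Gaussian CDF, and a conjunctive rule can be scored by the product of such probabilities (independence of the uncertainties is assumed here purely for illustration).

from math import erf, sqrt

def p_condition_true(x, theta, s):
    """Probability that the condition 'x > theta' holds when the measured
    value x carries Gaussian uncertainty with standard deviation s.
    This is the Gaussian CDF evaluated at (x - theta) / s."""
    return 0.5 * (1.0 + erf((x - theta) / (s * sqrt(2.0))))

def p_rule_true(values, thresholds, s):
    """Probability that a conjunctive rule (all conditions 'x_i > theta_i')
    holds, treating the per-feature uncertainties as independent."""
    p = 1.0
    for x, theta in zip(values, thresholds):
        p *= p_condition_true(x, theta, s)
    return p

# A value just above its threshold satisfies the condition with only ~58%
# probability once the measurement uncertainty is taken into account.
print(round(p_condition_true(x=5.2, theta=5.0, s=1.0), 2))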
Neural Processing Letters | 1999
Włodzisław Duch; Rafal Adamczak; Geerd H. F. Diercksen
Multilayer perceptrons (MLPs) use scalar products to compute the weighted activation of neurons, providing decision borders built from combinations of soft hyperplanes. The weighted fan-in activation function may be replaced by a distance function between the inputs and the weights, offering a natural generalization of the standard MLP model. Non-Euclidean distance functions may also be introduced by normalization of the input vectors in an extended feature space. Both approaches influence the shapes of the decision borders dramatically. An illustrative example showing these changes is provided.
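The following sketch contrasts a standard scalar-product unit with a distance-based unit of the kind described; the Minkowski distance and the specific transfer function used here are illustrative assumptions rather than the exact formulation from the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dot_product_neuron(x, w, bias):
    """Standard MLP unit: weighted fan-in (scalar product) activation,
    which yields hyperplane-shaped decision borders."""
    return sigmoid(np.dot(w, x) + bias)

def distance_neuron(x, w, threshold, alpha=2.0):
    """Distance-based unit: the scalar product is replaced by a Minkowski
    distance between the input and the weight vector (alpha selects the
    norm), which changes the shape of the decision borders."""
    d = np.sum(np.abs(x - w) ** alpha) ** (1.0 / alpha)
    return sigmoid(threshold - d)

x = np.array([0.5, -1.2])
w = np.array([1.0, -1.0])
print(dot_product_neuron(x, w, bias=0.0))
print(distance_neuron(x, w, threshold=1.0, alpha=2.0))  # Euclidean
print(distance_neuron(x, w, threshold=1.0, alpha=1.0))  # city-block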
International Symposium on Neural Networks | 1999
Włodzisław Duch; Rafal Adamczak
Multilayer perceptrons (MLPs) use scalar products to compute the weighted activation of neurons, providing decision borders built from combinations of soft hyperplanes. The weighted fan-in activation function corresponds to a Euclidean distance function used to compute the similarity between the input and weight vectors. Replacing the fan-in activation function with a non-Euclidean distance function offers a natural generalization of the standard MLP model, providing more flexible decision borders. An alternative way leading to similar results is based on renormalization of the input vectors using non-Euclidean norms in extended feature spaces. Both approaches influence the shapes of the decision borders dramatically, making it possible to reduce the complexity of MLP networks.
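A small sketch of the renormalization idea under our reading of the abstract: each input vector receives one extra component chosen so that all extended vectors share the same Euclidean norm, after which scalar-product activations behave like distance-based ones. The helper name and the choice of radius are ours, not the paper's.

import numpy as np

def extend_and_renormalize(X, R=None):
    """Append one extra component so every extended vector has the same
    Euclidean norm R (R must be at least the largest input norm). On the
    resulting sphere, scalar-product activations become equivalent to
    distance-based ones."""
    X = np.asarray(X, dtype=float)
    norms = np.linalg.norm(X, axis=1)
    if R is None:
        R = norms.max()
    extra = np.sqrt(np.maximum(R ** 2 - norms ** 2, 0.0))
    return np.column_stack([X, extra])

X = np.array([[0.5, 1.0], [2.0, -1.0], [0.0, 0.3]])
X_ext = extend_and_renormalize(X)
print(np.linalg.norm(X_ext, axis=1))  # all equal to the chosen radius R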
IEEE Transactions on Neural Networks | 2001
Włodzisław Duch; Rafal Adamczak; Krzysztof Grabczewski
Archive | 2000
Rafal Adamczak; Norbert Jankowski
Archive | 2000
Włodzisław Duch; Rafal Adamczak; Krzysztof Grąbczewski
European Symposium on Artificial Neural Networks | 1997
Włodzisław Duch; Rafal Adamczak; Krzysztof Grabczewski; Masumi Ishikawa; Hiroki Ueda