Adam Kowalczyk
Telecom Australia
Publications
Featured publications by Adam Kowalczyk.
Neural Networks | 1993
Nicholas J. Redding; Adam Kowalczyk; Tom Downs
Constructive learning algorithms are important because they address two practical difficulties of learning in artificial neural networks. First, it is not always possible to determine the minimal network consistent with a particular problem. Second, algorithms like backpropagation can require networks that are larger than the minimal architecture for satisfactory convergence. Further, constructive algorithms have the advantage that polynomial-time learning is possible if network size is chosen by the learning algorithm so that learning the problem under consideration is simplified. This article considers the representational ability of feedforward networks (FFNs) in terms of the fan-in required by the hidden units of a network. We define network order to be the maximum fan-in of the hidden units of a network. We prove, in terms of the problems they may represent, that a higher-order network (HON) is at least as powerful as any other FFN architecture of the same order. Next, we present a detailed theoretical development of a constructive, polynomial-time algorithm that will determine an exact HON realization with minimal order for an arbitrary binary or bipolar mapping problem. This algorithm does not have any parameters that need tuning for good performance. We show how an FFN with sigmoidal hidden units can be determined from the HON realization in polynomial time. Last, simulation results of the constructive HON algorithm are presented for the two-or-more-clumps problem, demonstrating that the algorithm performs well when compared with the Tiling and Upstart algorithms.
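As a rough illustration of the terminology (a minimal sketch of my own, not the paper's constructive algorithm): a higher-order network can be viewed as a threshold unit over monomial hidden units, and the network order is simply the largest fan-in among those monomials.

    # Minimal sketch, not the paper's algorithm: a higher-order network (HON)
    # over bipolar (+1/-1) inputs, with each hidden unit a monomial over a
    # subset of the inputs. "Network order" is the maximum fan-in (subset size)
    # of those hidden units.
    from itertools import product
    from math import prod

    def hon_output(x, terms):
        """x: tuple of +/-1 inputs; terms: list of (weight, input_index_tuple)."""
        s = sum(w * prod(x[i] for i in idx) for w, idx in terms)
        return 1 if s >= 0 else -1

    def network_order(terms):
        return max(len(idx) for _, idx in terms)

    # A single order-2 term realizes the bipolar "inputs agree" (XNOR) mapping.
    terms = [(1.0, (0, 1))]
    print(network_order(terms))            # 2
    for x in product((-1, 1), repeat=2):
        print(x, hon_output(x, terms))     # +1 when x[0] == x[1], else -1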
IEEE Transactions on Neural Networks | 1994
Adam Kowalczyk; Herman L. Ferrá
Introduces a class of simple polynomial neural network classifiers, called mask perceptrons. A series of algorithms for practical development of such structures is outlined. It relies on ordering the input attributes with respect to their potential usefulness and on heuristic-driven generation and selection of hidden units (monomial terms), in order to combat the exponential explosion in the number of higher-order monomial terms to choose from. Results of tests for two popular machine learning benchmark domains (mushroom classification and faulty LED display) and for two nonstandard domains (spoken digit recognition and article category determination) are given. All results are compared against a number of other classifiers. A procedure for converting a mask perceptron to classical logic production rules is outlined and shown to produce a number of 100% accurate simple rules after training on 6-20% of a database.
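The general shape of such a term-selection procedure might look like the following sketch (my own simplification, not the authors' algorithm): rank attributes by a crude usefulness score, enumerate candidate monomials only over the top-ranked attributes, and greedily keep terms that improve a simple linear-threshold fit. The correlation-based score, the least-squares surrogate fit, and the parameters top_k, max_order and max_terms are all assumptions for the illustration.

    # Hedged sketch of a mask-perceptron-style classifier: attribute ranking plus
    # greedy selection of monomial terms. The usefulness score (absolute
    # correlation) and the least-squares surrogate fit are my assumptions; the
    # paper's heuristics are not reproduced here.
    import numpy as np
    from itertools import combinations

    def rank_attributes(X, y, top_k):
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
        return list(np.argsort(scores)[::-1][:top_k])

    def monomial(X, idx):
        return np.prod(X[:, list(idx)], axis=1)

    def greedy_mask_perceptron(X, y, top_k=5, max_order=2, max_terms=10):
        """X: (n_samples, n_attrs) array of +/-1 attributes; y: +/-1 labels."""
        attrs = rank_attributes(X, y, top_k)
        candidates = [c for r in range(1, max_order + 1)
                      for c in combinations(attrs, r)]
        chosen, feats = [], [np.ones(len(y))]          # start from a bias term
        for _ in range(max_terms):
            best = None
            for c in candidates:
                if c in chosen:
                    continue
                F = np.column_stack(feats + [monomial(X, c)])
                w, *_ = np.linalg.lstsq(F, y, rcond=None)
                acc = np.mean(np.sign(F @ w) == y)
                if best is None or acc > best[0]:
                    best = (acc, c)
            if best is None:
                break
            chosen.append(best[1])
            feats.append(monomial(X, best[1]))
        F = np.column_stack(feats)
        w, *_ = np.linalg.lstsq(F, y, rcond=None)
        return chosen, w                               # selected terms and weights

Restricting the candidate monomials to the top-ranked attributes is what keeps the search tractable, which is the point the abstract makes about combating the exponential number of higher-order terms.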
international symposium on neural networks | 1991
Adam Kowalczyk; Herman L. Ferrá; Ken Gardiner
It is demonstrated by example that neural networks can be used successfully for automatic extraction of production rules from empirical data. The case considered is a popular public-domain database of 8124 mushrooms. With the use of a term-selection algorithm, a number of very accurate mask perceptrons (a kind of high-order network or polynomial classifier) have been developed. Rounding of synaptic weights was then applied, leading in many cases to networks with integer weights, which were subsequently converted to production rules. It is also shown that focusing the network's attention onto a smaller subset of useful attributes, ordered by decreasing discriminating ability, helps significantly in accurate rule generation.
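To make the rule-conversion step concrete, here is a hedged illustration (my own, with made-up attribute names, not the procedure or the rules from the paper): once the weights are small integers, any monomial term whose weight alone clears the decision threshold can be read off as an IF-THEN production rule.

    # Hedged illustration of turning integer-weight monomial terms over binary
    # attributes into production rules. The threshold test and the attribute
    # names are assumptions for the example, not taken from the paper.
    def terms_to_rules(terms, threshold, attr_names, class_name):
        """terms: list of (integer_weight, attribute_index_tuple)."""
        rules = []
        for w, idx in terms:
            if w >= threshold:   # this term on its own is enough to fire the unit
                condition = " AND ".join(attr_names[i] for i in idx)
                rules.append(f"IF {condition} THEN {class_name}")
        return rules

    # Hypothetical example in the spirit of the mushroom data:
    terms = [(3, (0, 2)), (1, (1,))]
    print(terms_to_rules(terms, threshold=2,
                         attr_names=["odor=foul", "cap=red", "gill=narrow"],
                         class_name="poisonous"))
    # -> ['IF odor=foul AND gill=narrow THEN poisonous']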
SIAM Journal on Applied Mathematics | 1987
S. Janeczko; Adam Kowalczyk
A classification of typical 3-dimensional Lagrangian singularities with $Z_2 \oplus Z_2$ symmetry (independent changes of sign in two coordinates) is presented. This provides the finite classification of typical and structurally stable local forms of a class of 4-dimensional internal energies with the uniaxial ferromagnet symmetry. The equivalence relation in the latter classification preserves the basic thermodynamic features of internal energies: the symmetry, the internal stability regions, and the inequalities of chemical potentials for states of the system with respectively equal remaining thermodynamic forces. As an example of applications, a phenomenological model of the Curie point for a uniaxial ferromagnet is presented. This demonstrates an alternative to classical "ad hoc" approaches in phenomenological modelling of critical phenomena.
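Written out in coordinates (my notation, not the paper's), the $Z_2 \oplus Z_2$ action in question changes the signs of two distinguished coordinates independently,

$(\varepsilon_1, \varepsilon_2) \cdot (q_1, q_2, q_3) = (\varepsilon_1 q_1, \varepsilon_2 q_2, q_3), \qquad \varepsilon_1, \varepsilon_2 \in \{+1, -1\},$

and the classification is carried out up to equivalences that preserve this symmetry.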
Reports on Mathematical Physics | 1988
S. Janeczko; Adam Kowalczyk
One of the useful methods of mathematical physics is the one arising from symplectic geometry, associating the singularities of Lagrangian submanifolds with optical caustics, phase transitions, bifurcation patterns, obstacle geometry, etc. In this paper we derive stability criteria for singularities of equivariant Lagrangian submanifolds with a compact Lie group action determined by a system with symmetry. The recognition problem is solved and a classification list for stable $(Z_2)^q$-equivariant singularities is given. We find that the classified stable local models occur as possible realizations of the equilibrium states in symmetry breaking and structural phase transitions. Additionally, the connection between two technically different infinitesimal G-stability conditions for equivariant Lagrangian submanifolds (via generating functions and via Morse families) is studied and an alternative approach is proposed.
international symposium on neural networks | 1994
Adam Kowalczyk
This paper extends the classical results of T. Cover (1965) and others on the separating capacity of families of nonlinear neurons of the form $x \mapsto \mathrm{sgn}\big(\sum_{i=1}^{d} w_i \phi_i(x)\big)$, where the $w_i \in E$ are real coefficients (synaptic weights), the $\phi_i : E^n \to E$ are functions (the measurement transformation), and sgn is the signum function on $E$. We show that the capacity of such a system is $2 \dim \phi$ input patterns, i.e. twice the number of linearly independent functions in the set $\phi_1, \ldots, \phi_d$, if the functions $\phi_i$ are analytic. This is achieved by showing that in such a case Cover's assumption of $\phi$-general position of the input vectors is almost universally satisfied.
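A worked instance of the capacity formula (my own illustration under the stated assumptions, not an example taken from the paper): take the measurement functions $\phi_1, \ldots, \phi_d$ to be all monomials of degree at most $r$ in $n$ real variables; these are analytic and linearly independent, so

$\dim \phi = \binom{n+r}{r}$, and the capacity is $2\binom{n+r}{r}$ input patterns.

For $r = 1$ this gives $2(n+1)$, Cover's classical capacity for affine threshold units (linear units with a bias term).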
international symposium on neural networks | 1991
Herman L. Ferrá; Adam Kowalczyk; Andrew Jennings
The authors introduce an algorithm for the selection and ordering of input attributes based on a generalization of the notion of conditional entropy to the fuzzy case. The algorithm is relatively inexpensive computationally and efficient, as demonstrated in a number of reported experiments. The experimental results support the observation that preselection and ordering of a small number of effective input features constitute an important factor in the development of efficient neural network classifiers.
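As a rough sketch of the underlying idea (standard crisp conditional entropy; the fuzzy generalization used in the paper is not reproduced here), attributes can be ranked by how much knowing them reduces uncertainty about the class label:

    # Hedged sketch: rank discrete attributes by the conditional entropy H(Y | X_j).
    # Lower conditional entropy means a more informative attribute. The fuzzy
    # variant described in the paper is not implemented here.
    import math
    from collections import Counter, defaultdict

    def conditional_entropy(xs, ys):
        """H(Y | X) for a discrete attribute column xs and class labels ys."""
        n = len(ys)
        by_x = defaultdict(list)
        for x, y in zip(xs, ys):
            by_x[x].append(y)
        h = 0.0
        for group in by_x.values():
            p_x = len(group) / n
            counts = Counter(group)
            h += p_x * -sum((c / len(group)) * math.log2(c / len(group))
                            for c in counts.values())
        return h

    def order_attributes(columns, ys):
        """Return attribute indices sorted from most to least informative."""
        return sorted(range(len(columns)),
                      key=lambda j: conditional_entropy(columns[j], ys))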
Communications in Mathematical Physics | 1993
S. Janeczko; Adam Kowalczyk
The paper provides the complete list of local models for $(Z_2)^l$-invariant generic germs of Lagrangian submanifolds of dimension $\leq 3$. The classification is carried out directly for generating functions of Lagrangian submanifolds and contains both elementary singularities and non-elementary ones with continuous moduli. The results demonstrate, in particular, that in contrast to the non-equivariant case, the classification of equivariant Lagrangian singularities is not subordinate to the classification of symmetric functions up to right equivariant equivalences.
international symposium on neural networks | 1991
Adam Kowalczyk
Archive | 2005
Herman L. Ferrá; Robert Palmer; Michael John Dale; Peter Kenneth Campbell; Karl Alan Christiansen; Adam Kowalczyk; Jacek Szymanski