John M. Zelle
University of Texas at Austin
Publication
Featured research published by John M. Zelle.
international symposium on neural networks | 1992
Paul T. Baffes; John M. Zelle
Concepts based on two observations about perceptrons are presented: (1) when the perceptron learning algorithm cycles among hyperplanes, the hyperplanes may be compared in order to select the one that gives the best split of the examples, and (2) it is always possible for the perceptron to build a hyperplane that separates at least one example from all the rest. The authors describe the Extentron, which grows multi-layer networks capable of distinguishing nonlinearly separable data using the simple perceptron rule for linear threshold units. The resulting algorithm is simple, very fast, scales well to large problems, retains the convergence properties of the perceptron, and can be completely specified using only two parameters. Results are presented comparing the Extentron to other neural network paradigms and to symbolic learning systems.
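The perceptron rule the abstract builds on is easy to sketch. The following minimal illustration (not the authors' implementation) assumes NumPy and labels in {-1, +1}: a linear threshold unit is trained with the standard perceptron update while the best hyperplane seen so far is remembered, the comparison step used when the algorithm cycles on non-separable data; Extentron then grows a new layer from the examples such a hyperplane separates.

import numpy as np

def train_perceptron(X, y, epochs=100):
    """Perceptron rule for a linear threshold unit that keeps the best
    hyperplane seen so far. X: (n, d) examples; y: (n,) labels in {-1, +1}.
    Returns the weights (bias folded in) giving the best split found."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # fold the bias into the weights
    w = np.zeros(Xb.shape[1])
    best_w, best_errors = w.copy(), len(y)
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:              # misclassified: apply the perceptron rule
                w = w + yi * xi
        errors = int(np.sum(np.sign(Xb @ w) != y))
        if errors < best_errors:                # compare hyperplanes, keep the best split
            best_w, best_errors = w.copy(), errors
        if errors == 0:
            break
    return best_w

On a linearly separable problem such as logical AND (X = [[0,0],[0,1],[1,0],[1,1]], y = [-1,-1,-1,+1]) this reaches a zero-error hyperplane, retaining the perceptron's convergence property the abstract mentions.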
international joint conference on artificial intelligence | 1996
John M. Zelle; Raymond J. Mooney
This paper presents results from recent experiments with Chill, a corpus-based parser acquisition system. Chill treats language acquisition as the learning of search-control rules within a logic program. Unlike many current corpus-based approaches that use statistical learning algorithms, Chill uses techniques from inductive logic programming (ILP) to learn relational representations. Chill is a very flexible system and has been used to learn parsers that produce syntactic parse trees, case-role analyses, and executable database queries. The reported experiments compare Chill's performance to that of a more naive application of ILP to parser acquisition. The results show that ILP techniques, as employed in Chill, are a viable alternative to statistical methods and that the control-rule framework is fundamental to Chill's success.
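Chill itself induces its control rules in a Prolog setting with ILP; the toy Python sketch below only illustrates the control-rule framework the abstract describes, with a hand-written control function standing in for the learned relational rules. All operator and category names here are hypothetical.

def shift(state):
    # move the next input token onto the stack
    stack, buf = state
    return (stack + [buf[0]], buf[1:])

def reduce_np(state):
    # combine a determiner and a noun on top of the stack into an NP
    stack, buf = state
    return (stack[:-2] + [("NP", stack[-2], stack[-1])], buf)

def control(state):
    # stand-in for learned search-control rules: choose which
    # generic parsing operator to apply in the current state
    stack, buf = state
    if len(stack) >= 2 and stack[-2][0] == "Det" and stack[-1][0] == "N":
        return reduce_np
    return shift

def parse(tokens):
    state = ([], list(tokens))
    while state[1] or control(state) is not shift:
        state = control(state)(state)
    return state[0]

Here parse([("Det", "the"), ("N", "dog")]) returns [("NP", ("Det", "the"), ("N", "dog"))]: the generic operators define the possible moves, and the control function, which Chill would learn from a corpus rather than hand-code, decides which move to make in each parser state.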
technical symposium on computer science education | 2006
David Ranum; Bradley N. Miller; John M. Zelle; Mark Guzdial
Learning computer science requires deliberate and incremental exposure to the fundamental ideas of the discipline. This paper will describe our initial experience teaching an introductory computer science sequence using the programming language Python. We will present our position and then use specific examples to show how Python can provide an exceptional environment for teaching computer science.
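The paper's own classroom examples are not reproduced in this abstract; the snippet below is merely illustrative of the kind of first-semester exercise the position paper relies on, showing how a Python program stays close to the underlying algorithm with little syntactic ceremony (the file name is hypothetical).

def average_scores(filename):
    # average one exam score per line of a text file
    total, count = 0.0, 0
    with open(filename) as infile:
        for line in infile:
            total += float(line)
            count += 1
    return total / count if count else 0.0

print(average_scores("scores.txt"))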
Intelligence/SIGART Bulletin | 1994
Raymond J. Mooney; John M. Zelle
This paper presents a review of recent work that integrates methods from Inductive Logic Programming (ILP) and Explanation-Based Learning (EBL). ILP and EBL methods have complementary strengths and weaknesses, and a number of recent projects have effectively combined them into systems with better performance than either of the individual approaches. In particular, integrated systems have been developed for guiding induction with prior knowledge (ML-Smart, FOCL, GRENDEL), refining imperfect domain theories (FORTE, AUDREY, Rx), and learning effective search-control knowledge (AxA-EBL, DOLPHIN).
national conference on artificial intelligence | 1996
John M. Zelle; Raymond J. Mooney
national conference on artificial intelligence | 1993
John M. Zelle; Raymond J. Mooney
Archive | 2003
John M. Zelle
international joint conference on artificial intelligence | 1993
John M. Zelle; Raymond J. Mooney
international conference on machine learning | 1994
John M. Zelle; Raymond J. Mooney; Joshua B. Konvisser
Archive | 1996
John M. Zelle