
Publication


Featured research published by H. David Mathias.


Conference on Learning Theory | 1996

Teaching a Smarter Learner

Sally A. Goldman; H. David Mathias

We introduce a formal model of teaching in which the teacher is tailored to a particular learner, yet the teaching protocol is designed so that no collusion is possible. Not surprisingly, such a model remedies the nonintuitive aspects of other models in which the teacher must successfully teach any consistent learner. We prove that any class that can be exactly identified by a deterministic polynomial-time algorithm with access to a very rich set of example-based queries is teachable by a computationally unbounded teacher and a polynomial-time learner. In addition, we present other general results relating this model of teaching to various previous results. We also consider the problem of designing teacher/learner pairs in which both the teacher and learner are polynomial-time algorithms and describe teacher/learner pairs for the classes of 1-decision lists and Horn sentences.


SIAM Journal on Computing | 1999

Exact Learning of Discretized Geometric Concepts

Nader H. Bshouty; Paul W. Goldberg; Sally A. Goldman; H. David Mathias

We first present an algorithm that uses membership and equivalence queries to exactly identify a discretized geometric concept defined by the union of m axis-parallel boxes in d-dimensional discretized Euclidean space where each coordinate can have n discrete values. This algorithm receives at most md counterexamples and uses time and membership queries polynomial in m and log n for any constant d. Furthermore, all equivalence queries can be formulated as the union of O(md log m) axis-parallel boxes. Next, we show how to extend our algorithm to efficiently learn, from only equivalence queries, any discretized geometric concept generated from any number of halfspaces with any number of known (to the learner) slopes in a constant dimensional space. In particular, our algorithm exactly learns (from equivalence queries only) unions of discretized axis-parallel boxes in constant dimensional space in polynomial time. Furthermore, this equivalence query only algorithm can be modified to handle a polynomial number of lies in the counterexamples provided by the environment. Finally, we introduce a new complexity measure that better captures the complexity of the union of boxes than simply the number of boxes and the dimension. Our new measure, σ, is the number of segments in the target, where a segment is a maximum portion of one of the sides of the target that lies entirely inside or entirely outside each of the other halfspaces defining the target. We present a modification of our first algorithm that uses time and queries polynomial in σ and log n. In fact, the time and queries (both membership and equivalence) used by this single algorithm are polynomial for either m or d constant.
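As a toy illustration of the membership-query primitive that algorithms of this kind build on (not the paper's algorithm itself), the sketch below learns a single axis-parallel box over the discretized grid {0, ..., n−1}^d by binary-searching each face from one known positive point, using O(d log n) membership queries; all names are illustrative.

```python
def learn_box(member, pos, n):
    """Learn an axis-parallel box over {0..n-1}^d with membership queries.

    member(x) -> bool is the membership oracle for the target box; pos is a
    tuple known to lie inside the box. Each face is found by binary search,
    so O(d log n) queries suffice in total.
    """
    d = len(pos)
    box = []
    for i in range(d):
        # Binary-search the smallest in-box value on coordinate i.
        a, b = 0, pos[i]
        while a < b:
            m = (a + b) // 2
            if member(pos[:i] + (m,) + pos[i + 1:]):
                b = m
            else:
                a = m + 1
        lo = a
        # Binary-search the largest in-box value on coordinate i.
        a, b = pos[i], n - 1
        while a < b:
            m = (a + b + 1) // 2
            if member(pos[:i] + (m,) + pos[i + 1:]):
                a = m
            else:
                b = m - 1
        box.append((lo, a))
    return box
```

For a union of m boxes the real algorithm must additionally disentangle which box a counterexample belongs to, which is where the md counterexample bound and the equivalence queries come in.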


Conference on Learning Theory | 1992

Learning k -term DNF formulas with an incomplete membership oracle

Sally A. Goldman; H. David Mathias

We consider the problem of learning k-term DNF formulas using equivalence queries and incomplete membership queries as defined by Angluin and Slonim. We demonstrate that this model can be applied to non-monotone classes. Namely, we describe a polynomial-time algorithm that constructs a k-term DNF formula representationally equivalent to the target using incomplete membership queries and equivalence queries from the class of DNF formulas.


Conference on Learning Theory | 1994

Learning unions of boxes with membership and equivalence queries

Paul W. Goldberg; Sally A. Goldman; H. David Mathias



Conference on Learning Theory | 1993

Teaching a smart learner

Sally A. Goldman; H. David Mathias



Symposium on the Theory of Computing | 1996

Noise-tolerant distribution-free learning of general geometric concepts

Nader H. Bshouty; Sally A. Goldman; H. David Mathias; Subhash Suri; Hisao Tamaki



Conference on Learning Theory | 1995

DNF—if you can't learn 'em, teach 'em: an interactive model of teaching

H. David Mathias



Information & Computation | 1998

Noise-Tolerant Parallel Learning of Geometric Concepts

Nader H. Bshouty; Sally A. Goldman; H. David Mathias



Conference on Learning Theory | 1995

Noise-tolerant parallel learning of geometric concepts

Nader H. Bshouty; Sally A. Goldman; H. David Mathias



Archive | 1994

Exact learning of discretized concepts

Nader H. Bshouty; Paul W. Goldberg; Sally A. Goldman; H. David Mathias

We present two algorithms that use membership and equivalence queries to exactly identify the concepts given by the union of s discretized axis-parallel boxes in d-dimensional discretized Euclidean space where there are n discrete values that each coordinate can have. The first algorithm receives at most sd counterexamples and uses time and membership queries polynomial in s and log n for any constant d. Further, all equivalence queries made can be formulated as the union of O(sd log s) axis-parallel boxes. Next, we introduce a new complexity measure that better captures the complexity of a union of boxes than simply the number of boxes and dimensions. Our new measure, σ, is the number of segments in the target polyhedron, where a segment is a maximum portion of one of the sides of the polyhedron that lies entirely inside or entirely outside each of the other halfspaces defining the polyhedron. We then present an improvement of our first algorithm that uses time and queries polynomial in σ and log n. The hypothesis class used here is decision trees of height at most 2sd. Further, we can show that the time and queries used by this algorithm are polynomial in d and log n for any constant s, thus establishing the exact learnability of DNF formulas with a constant number of terms. In fact, this single algorithm is efficient for either s or d constant.
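For intuition about why equivalence queries alone can suffice, here is a hedged sketch of the simplest case, a single box (s = 1), rather than the paper's algorithm: the learner proposes the bounding box of the positive counterexamples seen so far. Since that hypothesis is always contained in the target, every counterexample is a positive point that enlarges it, so at most d(n − 1) + 1 counterexamples are needed. The oracle is simulated here by brute-force grid search; all names are illustrative.

```python
from itertools import product

def in_box(box, x):
    """True if point x lies in the axis-parallel box (list of (lo, hi) pairs)."""
    return all(lo <= xi <= hi for (lo, hi), xi in zip(box, x))

def learn_box_eq(target, n):
    """Exactly learn one axis-parallel box over {0..n-1}^d using only
    (simulated) equivalence queries."""
    d = len(target)
    hyp = None  # empty hypothesis: every point classified negative
    while True:
        # Equivalence query: find any point where hypothesis and target disagree.
        cex = next((x for x in product(range(n), repeat=d)
                    if in_box(target, x) != (hyp is not None and in_box(hyp, x))),
                   None)
        if cex is None:
            return hyp  # hypothesis equals the target exactly
        # hyp is always a subset of the target, so cex is a positive point:
        # grow the bounding box of positive counterexamples to cover it.
        if hyp is None:
            hyp = [(c, c) for c in cex]
        else:
            hyp = [(min(lo, c), max(hi, c)) for (lo, hi), c in zip(hyp, cex)]
```

Each counterexample strictly enlarges the hypothesis along some coordinate, which gives the d(n − 1) + 1 bound; handling unions of boxes, or adversarially chosen counterexamples with lies, is where the machinery of the papers above takes over.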

Collaboration

Top co-authors of H. David Mathias:

Sally A. Goldman (Washington University in St. Louis)
Nader H. Bshouty (Technion – Israel Institute of Technology)
Subhash Suri (University of California)