Publication


Featured research published by Gerald Gazdar.


Archive | 1981

Unbounded Dependencies and Coordinate Structure

Gerald Gazdar

Consider eliminating the transformational component of a generative grammar. In particular, consider the elimination of all movement rules, whether bounded or unbounded, and all rules making reference to identity of indices. Suppose, in fact, that the permitted class of generative grammars constituted a subset of those phrase structure grammars capable only of generating context-free languages. Such a move would have two important metatheoretical consequences, one having to do with learnability, the other with processability. In the first place, we would be imposing a rather dramatic restriction on the class of grammars that the language acquisition device needs to consider as candidates for the language being learned. And in the second place, we would have the beginnings of an explanation for the obvious, but largely ignored, fact that humans process the utterances they hear very rapidly. Sentences of a context-free language are provably parsable in time at most proportional to the cube of the length of the sentence (Younger (1967), Earley (1970)). But no such restrictive result holds for the recursive or recursively enumerable sets potentially generable by grammars which include a transformational component.
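The cubic-time parsability result cited above (Younger's CYK algorithm) can be illustrated with a minimal recognizer for a grammar in Chomsky normal form. This is a sketch, not anything from the paper; the toy grammar and category names are invented for illustration:

```python
def cyk_recognize(words, lexicon, binary_rules, start="S"):
    """CYK recognizer for a CNF grammar: O(n^3 * |G|) time in the
    sentence length n, per Younger (1967)."""
    n = len(words)
    # table[i][j] = set of nonterminals deriving words[i:j]
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1] = {A for A, word in lexicon if word == w}
    for span in range(2, n + 1):          # widths 2..n
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):     # split point
                for A, (B, C) in binary_rules:
                    if B in table[i][k] and C in table[k][j]:
                        table[i][j].add(A)
    return start in table[0][n]

# Toy CNF grammar: S -> NP VP, VP -> V NP
lexicon = [("NP", "she"), ("V", "saw"), ("NP", "him")]
binary_rules = [("S", ("NP", "VP")), ("VP", ("V", "NP"))]
print(cyk_recognize(["she", "saw", "him"], lexicon, binary_rules))  # True
```

The three nested loops over span width, start position, and split point are where the cubic bound comes from.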


Natural Language and Linguistic Theory | 1985

Coordination and how to distinguish categories

Ivan A. Sag; Gerald Gazdar; Thomas Wasow; Steven Weisler

Conclusion. In this paper we have presented a detailed treatment of key problems in the syntax of coordination in English, one which goes well beyond previous treatments in the breadth of its coverage. The separation of immediate dominance rules from linear precedence rules has played an essential role in our analysis. It is this aspect of Generalized Phrase Structure Grammar that allows the full range of conjunctions in English to be treated in a unified manner using a small set of constructs. This same factoring of dominance and ordering information is what allows us to account for such problems as the peculiar properties of the coordination of embedded clauses and NPs, as we have shown. In addition, it is the interplay of various independently motivated principles in GPSG, such as the Head Feature Convention and the Foot Feature Principle, that enables one to derive, rather than stipulate, a solution to such long-standing problems as the facts commonly discussed in terms of the Coordinate Structure Constraint and the Across-the-Board Convention.

Over twenty years ago, the syntax of coordination was a key topic in the discussions that led to the widespread acceptance of transformational grammar. It is curious, then, that even today no version of transformational grammar has succeeded in explaining, and often not even in describing, well-known and very basic facts about coordination (e.g., the fact that arbitrary tensed VPs can coordinate with each other). Moreover, the various instances of coordination of unlike categories, of which we have provided an account without appeal to any ancillary devices or ad hoc principles, have received no serious analysis within the transformational tradition.

Of course, much remains to be done on the grammar of coordinate constructions. Among the problems we have addressed insufficiently or not at all are the precise formulation of the syntax and semantics of non-constituent ellipsis, the treatment of ‘right node raising’ constructions, and the semantic peculiarities of N1-coordination discussed by Bergmann (1982). Nevertheless, the present paper improves on earlier generative treatments of coordination by broadening the coverage while at the same time stipulating less.
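The separation of immediate dominance (ID) from linear precedence (LP) that the abstract credits can be made concrete with a small sketch: an ID rule says only what a mother may dominate, and LP statements independently constrain the order of the daughters. The rule and category names below are illustrative, not taken from the paper:

```python
from itertools import permutations

def id_lp_expand(id_rule, lp_constraints):
    """Expand an immediate-dominance rule into the set of ordered
    phrase structure rules consistent with linear-precedence statements.
    id_rule: (mother, daughters); lp_constraints: set of (a, b) pairs
    meaning a must precede b whenever both are daughters."""
    mother, daughters = id_rule
    rules = []
    for order in permutations(daughters):
        pos = {cat: i for i, cat in enumerate(order)}
        if all(pos[a] < pos[b] for a, b in lp_constraints
               if a in pos and b in pos):
            rules.append((mother, list(order)))
    return rules

# ID rule VP -> {V, NP, PP} with LP statements V < NP and V < PP:
# only the verb-initial orders survive.
print(id_lp_expand(("VP", ["V", "NP", "PP"]), {("V", "NP"), ("V", "PP")}))
# [('VP', ['V', 'NP', 'PP']), ('VP', ['V', 'PP', 'NP'])]
```

Because the LP statements apply across all ID rules at once, a single ordering generalization need only be stated once rather than repeated in every rule.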


Archive | 1988

Applicability of Indexed Grammars to Natural Languages

Gerald Gazdar

If we take the class of context-free phrase structure grammars (CF-PSGs) and modify it so that (i) grammars are allowed to make use of finite feature systems and (ii) rules are permitted to manipulate the features in arbitrary ways, then what we end up with is equivalent to what we started out with. Suppose, however, that we take the class of context-free phrase structure grammars and modify it so that (i) grammars are allowed to employ a single designated feature that takes stacks of items drawn from some finite set as its values, and (ii) rules are permitted to push items onto, pop items from, and copy the stack. What we end up with now is no longer equivalent to the CF-PSGs but is significantly more powerful, namely the indexed grammars (Aho, 1968). This class of grammars has been alluded to a number of times in the recent linguistic literature: by Klein (1981) in connection with nested comparative constructions, by Dahl (1982) in connection with topicalised pronouns, by Engdahl (1982) and Gazdar (1982) in connection with Scandinavian unbounded dependencies, by Huybregts (1984) and Pulman and Ritchie (1984) in connection with Dutch, by Marsh and Partee (1984) in connection with variable binding, and doubtless elsewhere as well.
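The push/pop/copy stack discipline can be seen in the textbook indexed grammar for a^n b^n c^n, a language no CF-PSG can generate. The simulation below is a sketch (the grammar is the standard one from the formal-language literature, not from this paper):

```python
def derive_anbncn(n):
    """Simulate a derivation under the indexed grammar
        S[s]  -> S[fs]            (push index f)
        S[s]  -> A[s] B[s] C[s]   (copy the stack to every daughter)
        A[fs] -> 'a' A[s];  A[] -> ''   (pop, emitting a terminal;
                                         likewise B/C with 'b'/'c')
    yielding a^n b^n c^n."""
    stack = ["f"] * n                 # n applications of the push rule

    def pop_out(terminal, stk):
        # X[fs] -> terminal X[s] repeatedly, then X[] -> ''
        out = []
        while stk:
            stk.pop()
            out.append(terminal)
        return "".join(out)

    # S -> A B C: each daughter receives its own copy of the stack,
    # which is what forces the three counts to agree.
    return (pop_out("a", list(stack))
            + pop_out("b", list(stack))
            + pop_out("c", list(stack)))

print(derive_anbncn(3))  # aaabbbccc
```

Copying the stack to all daughters is exactly the operation that takes the formalism beyond context-free power.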


Linguistics and Philosophy | 1982

Natural languages and context-free languages

Geoffrey K. Pullum; Gerald Gazdar

In his 1956 paper ‘Three Models for the Description of Language’ Noam Chomsky posed an interesting open question: when we consider the human languages purely as sets of strings of words (henceforth stringsets), do they always fall within the class called context-free languages (CFL’s)? Chomsky declared that he did not know the answer to this question, and turned to a very different set of questions concerning relative elegance and economy of different types of description. Since 1956 various authors (Chomsky included) have attempted to provide answers in the negative, and the negative answer is now the standardly accepted one. We take up the question again in this paper, and show that it is still open, as all the arguments for the negative answer that have been provided in the literature are either empirically or formally incorrect.


Conference of the European Chapter of the Association for Computational Linguistics | 1989

Inference in DATR

Roger Evans; Gerald Gazdar

DATR is a declarative language for representing a restricted class of inheritance networks, permitting both multiple and default inheritance. The principal intended area of application is the representation of lexical entries for natural language processing, and we use examples from this domain throughout. In this paper we present the syntax and inference mechanisms for the language. The goal of the DATR enterprise is the design of a simple language that (i) has the necessary expressive power to encode the lexical entries presupposed by contemporary work in the unification grammar tradition, (ii) can express all the evident generalizations about such entries, (iii) has an explicit theory of inference, (iv) is computationally tractable, and (v) has an explicit declarative semantics. The present paper is primarily concerned with (iii), though the examples used may hint at our strategy in respect of (i) and (ii).
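DATR's combination of multiple and default inheritance can be mimicked in a short sketch. The class below is not DATR itself, only an illustration of nonmonotonic default inheritance over lexical entries; the verb fragment and attribute names are invented:

```python
class Node:
    """A node in a DATR-style inheritance network: local equations
    override anything inherited, and unspecified paths default to
    the parent's value (nonmonotonic default inheritance)."""
    def __init__(self, parent=None, **equations):
        self.parent = parent
        self.equations = equations

    def query(self, path):
        if path in self.equations:
            return self.equations[path]        # local equation wins
        if self.parent is not None:
            return self.parent.query(path)     # inherit by default
        raise KeyError(path)

# Hypothetical fragment: regular verbs inherit from VERB; 'take'
# overrides only its past form and inherits everything else.
VERB = Node(syn_cat="verb", past_suffix="ed")
WALK = Node(parent=VERB, root="walk")
TAKE = Node(parent=VERB, root="take", past="took")

def past(node):
    try:
        return node.query("past")              # exceptional form wins
    except KeyError:
        return node.query("root") + node.query("past_suffix")

print(past(WALK))  # walked
print(past(TAKE))  # took
```

The point of the default mechanism is that the regular past-tense generalization is stated once at VERB, while an exception like TAKE states only what is idiosyncratic about it.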


New Generation Computing | 1985

Computationally relevant properties of natural languages and their grammars

Gerald Gazdar; Geoffrey K. Pullum

This paper surveys what is currently known about natural language morphology and syntax from the perspective of formal language theory. Firstly, the position of natural language word-sets and sentence-sets on the formal language hierarchy is discussed. Secondly, the contemporary use by linguists of a range of formal grammars (from finite state transducers to indexed grammars) in both word-syntax (i.e. morphology) and sentence-syntax is sketched. Finally, recent developments such as feature theory, the use of extension and unification, default mechanisms, and metagrammatical techniques, are outlined.
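Finite state transducers, the weakest device in the range the abstract mentions, are the standard tool for word-syntax. The toy sketch below (my own example, not one from the paper) transduces an abstract plural boundary into its surface form:

```python
SIBILANTS = set("sxz")

def plural_fst(form):
    """Sketch of a finite-state transducer for one toy morphological
    alternation: at the morpheme boundary '+', insert 'e' if the stem
    ends in a sibilant (fox+s -> foxes, cat+s -> cats). States:
    'q0' = default, 'q1' = just read a sibilant."""
    state, out = "q0", []
    for ch in form:
        if ch == "+":
            out.append("e" if state == "q1" else "")  # boundary deleted
            state = "q0"
        else:
            out.append(ch)
            state = "q1" if ch in SIBILANTS else "q0"
    return "".join(out)

print(plural_fst("fox+s"))  # foxes
print(plural_fst("cat+s"))  # cats
```

A real morphological transducer would have many more states and an acceptance condition, but the essential point survives: the mapping from underlying to surface form is computed left to right with finite memory.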


Meeting of the Association for Computational Linguistics | 1995

Encoding Lexicalized Tree Adjoining Grammars with a Nonmonotonic Inheritance Hierarchy

Roger Evans; Gerald Gazdar; David J. Weir

This paper shows how DATR, a widely used formal language for lexical knowledge representation, can be used to define an LTAG lexicon as an inheritance hierarchy with internal lexical rules. A bottom-up featural encoding is used for LTAG trees and this allows lexical rules to be implemented as covariation constraints within feature structures. Such an approach eliminates the considerable redundancy otherwise associated with an LTAG lexicon.


Journal of Linguistics | 1999

German noun inflection

Lynne J. Cahill; Gerald Gazdar

This is the second of a series of three papers that, taken together, will give an essentially complete account of inflection in standard German. In this paper we present that part of the account that covers nouns, one that captures all the regularities, subregularities and irregularities that are involved, but with a focus on the subregularities. Inflected forms are defined in terms of their syllable structure, as proposed in Cahill (1990a, b, 1993). The analysis is formulated as a DATR theory – a set of lexical axioms – from which all the relevant facts follow as theorems. DATR is a widely used formal lexical knowledge representation language developed for use in computational linguistics.


Linguistics | 1997

The inflectional phonology of German adjectives, determiners, and pronouns

Lynne J. Cahill; Gerald Gazdar

This is the first of a series of papers that, taken together, will give an essentially complete account of inflection in standard German. In this paper we present that part of the account that covers adjectives, determiners, and third-person pronouns, one that captures all the regularities, subregularities, and irregularities that are involved. The forms are defined in terms of their syllable structure, as proposed in Cahill (1990a, 1990b, 1993). The morphological treatment is based on ideas originally set out by Zwicky (1985). The analysis is formulated as a DATR theory - a set of lexical axioms - from which all the relevant facts follow as theorems. DATR is a widely used formal lexical knowledge representation language developed for use in computational linguistics.


Journal of Pragmatics | 1980

Pragmatics and logical form

Gerald Gazdar

It has often been observed that there exist certain apparent differences in meaning between natural language words like ‘some’, ‘possible’, ‘and’, and ‘or’, and their logical counterparts. These differences are described and an examination is made of their implications for the view that natural languages can be handled semantically in the same way that formal logical languages can be. It is argued that the differences can be naturally explained within an elaboration of Grice's pragmatic theory of conversation. The view of meaning that emerges from these considerations is a hybrid one, compounded from a restricted logicist semantics on the one hand, and a broadly based pragmatics on the other.

Collaboration


Dive into Gerald Gazdar's collaborations.

Top Co-Authors

Ewan Klein

University of Edinburgh

Roger Evans

University of Brighton
