Publications


Featured research published by Barend Beekhuizen.


Language and Speech | 2013

Three Design Principles of Language: The Search for Parsimony in Redundancy

Barend Beekhuizen; Rens Bod; Willem H. Zuidema

In this paper we present three design principles of language – experience, heterogeneity and redundancy – and review recent developments in a family of models incorporating them, namely Data-Oriented Parsing/Unsupervised Data-Oriented Parsing. Although the idea of some form of redundant storage has become part and parcel of parsing technologies and usage-based linguistic approaches alike, the question of how much of it is cognitively realistic and/or computationally optimal remains open. We argue that a segmentation-based approach (Bayesian Model Merging) combined with an all-subtrees approach reduces the number of rules needed to achieve optimal performance, thus making the parser more efficient. At the same time, starting from unsegmented wholes comes closer to the acquisitional situation of a language learner, and thus adds to the cognitive plausibility of the model.
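
The cost of the all-subtrees idea mentioned in this abstract is easiest to appreciate by counting fragments. The sketch below is an illustration of the general DOP counting recursion, not code from the paper; the toy sentence and its bracketing are made up. It shows how quickly the number of stored fragments grows even for a short sentence, which is what makes unrestricted redundant storage expensive and a search for parsimony worthwhile.

def count_fragments(tree):
    """Number of DOP fragments rooted at this node.

    A tree is a (label, children) tuple; a leaf is a bare string.
    For each child we either cut (leaving a frontier non-terminal)
    or plug in one of the fragments rooted at that child, so
    f(node) = product over children of (1 + f(child)).
    """
    if isinstance(tree, str):      # terminal symbol: contributes no fragments
        return 0
    _label, children = tree
    total = 1
    for child in children:
        total *= 1 + count_fragments(child)
    return total

# Toy parse of "the dog saw the cat" (hypothetical bracketing).
np1 = ("NP", [("Det", ["the"]), ("N", ["dog"])])
np2 = ("NP", [("Det", ["the"]), ("N", ["cat"])])
vp = ("VP", [("V", ["saw"]), np2])
s = ("S", [np1, vp])

print(count_fragments(s))  # 55 fragments rooted at S for this five-word tree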


Meeting of the Association for Computational Linguistics | 2014

A Usage-Based Model of Early Grammatical Development

Barend Beekhuizen; Rens Bod; Afsaneh Fazly; Suzanne Stevenson; Arie Verhagen

The representations and processes yielding the limited length and telegraphic style of language production early on in acquisition have received little attention in acquisitional modeling. In this paper, we present a model, starting with minimal linguistic representations, that incrementally builds up an inventory of increasingly long and abstract grammatical representations (form+meaning pairings), in line with the usage-based conception of language acquisition. We explore its performance on a comprehension and a generation task, showing that, over time, the model better understands the processed utterances, generates longer utterances, and better expresses the situation these utterances intend to refer to.


Proceedings of the Sixth Workshop on Cognitive Aspects of Computational Language Learning | 2015

Perceptual, conceptual, and frequency effects on error patterns in English color term acquisition

Barend Beekhuizen; Suzanne Stevenson

Children’s overextension errors in word usage can yield insights into the underlying representation of meaning. We simulate overextension patterns in the domain of color with two word-learning models, and look at the contribution of three possible factors: perceptual properties of the colors, typological prevalence of certain color groupings into categories (as a proxy for cognitive naturalness), and color term frequency. We find that the perceptual features provide the strongest predictors of the error pattern observed during development, and can effectively rule out color term frequency as an explanation. Typological prevalence is shown to correlate strongly with the perceptual dimensions of color, and hence provides no effect over and above the perceptual dimensions.


Cognitive Science | 2018

More Than the Eye Can See: A Computational Model of Color Term Acquisition and Color Discrimination

Barend Beekhuizen; Suzanne Stevenson

We explore the following two cognitive questions regarding crosslinguistic variation in lexical semantic systems: Why are some linguistic categories (that is, the associations between a term and a portion of the semantic space) harder to learn than others? How does learning a language-specific set of lexical categories affect processing in that semantic domain? Using a computational word-learner, and the domain of color as a testbed, we investigate these questions by modeling both child acquisition of color terms and adult behavior on a non-verbal color discrimination task. A further goal is to test an approach to lexical semantic representation based on the principle that the more languages label any two situations with the same word, the more conceptually similar those two situations are. We compare such a crosslinguistically based semantic space to one based on perceptual similarity. Our computational model suggests a mechanistic explanation for the interplay between term frequency and the semantic closeness of learned categories in developmental error patterns for color terms. Our model also indicates how linguistic relativity effects could arise from an acquisition mechanism that yields language-specific topologies for the same semantic domain. Moreover, we find that the crosslinguistically inspired semantic space supports these results at least as well as, and in some aspects better than, the purely perceptual one, thus confirming our approach as a practical and principled method for lexical semantic representation in cognitive modeling.
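
The crosslinguistic principle stated in this abstract can be made concrete with a small sketch. The code below is an assumed, illustrative formulation (the naming data, the function name, and the exact proportion-based measure are hypothetical, not taken from the paper): two situations, here color chips, count as more similar the larger the fraction of languages that label both with the same term.

from itertools import combinations

# Hypothetical elicitation data: language -> {chip id: color term used}.
naming = {
    "lang_A": {1: "red", 2: "red", 3: "yellow"},
    "lang_B": {1: "rot", 2: "gelb", 3: "gelb"},
    "lang_C": {1: "aka", 2: "aka", 3: "ki"},
}

def crossling_similarity(chip_i, chip_j, data):
    """Fraction of languages that give chips i and j the same term."""
    langs = [terms for terms in data.values()
             if chip_i in terms and chip_j in terms]
    if not langs:
        return 0.0
    same = sum(1 for terms in langs if terms[chip_i] == terms[chip_j])
    return same / len(langs)

for i, j in combinations([1, 2, 3], 2):
    print(i, j, crossling_similarity(i, j, naming))
# Chips 1 and 2 come out most similar: two of the three languages co-label them.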


Archive | 2014

3. Automating construction work: Data-Oriented Parsing and constructivist accounts of language acquisition

Barend Beekhuizen; Rens Bod

The constructionist approach to language has long proven its merits as a theoretical framework guiding linguistic observations. However, relatively little work has been dedicated to providing a precise, formalized definition of constructions and the mechanisms by means of which they are acquired. In giving an overview of recent work in Data-Oriented Parsing (DOP), we show how the theoretical development of construction grammar and usage-based approaches to language acquisition can benefit from the converging evidence and novel insights that computational models such as DOP can provide us with. In this chapter, we introduce DOP and compare its properties to usage-based and constructionist ideas about the nature of grammar and its acquisition. We discuss the unsupervised incarnation of DOP, U-DOP, and show how it can be used to address nativist hypotheses about the learnability of grammatical patterns. Finally, we propose an extension of the formalism that is able to learn a meaning-driven grammar from unstructured input data.


Cognitive Science | 2014

Learning Meaning without Primitives: Typology Predicts Developmental Patterns

Barend Beekhuizen; Afsaneh Fazly; Suzanne Stevenson


Cognitive Science | 2013

Word Learning in the Wild: What Natural Data Can Tell Us

Barend Beekhuizen; Afsaneh Fazly; Aida Nematzadeh; Suzanne Stevenson


Nederlandse Taalkunde | 2016

De zijnsstatus van de afhankelijke V1-constructie in het Nederlands [The ontological status of the dependent V1 construction in Dutch]

Barend Beekhuizen


Archive | 2017

3. Acquiring relational meaning from the situational context: What linguists can learn from analyzing videotaped interaction

Barend Beekhuizen; Rens Bod; Arie Verhagen; Jacqueline Evers-Vermeul; Elena Tribushinina


Cognitive Science | 2017

Calculating Probabilities Simplifies Word Learning

Aida Nematzadeh; Barend Beekhuizen; Shanshan Huang; Suzanne Stevenson

Collaboration


Top co-author: Rens Bod, University of Amsterdam