Benjamin Börschinger
Macquarie University
Publications
Featured research published by Benjamin Börschinger.
International Conference on Acoustics, Speech, and Signal Processing | 2013
Aren Jansen; Emmanuel Dupoux; Sharon Goldwater; Mark Johnson; Sanjeev Khudanpur; Kenneth Church; Naomi H. Feldman; Hynek Hermansky; Florian Metze; Richard C. Rose; Michael L. Seltzer; Pascal Clark; Ian McGraw; Balakrishnan Varadarajan; Erin Bennett; Benjamin Börschinger; Justin Chiu; Ewan Dunbar; Abdellah Fourtassi; David F. Harwath; Chia-ying Lee; Keith Levin; Atta Norouzian; Vijayaditya Peddinti; Rachael Richardson; Thomas Schatz; Samuel Thomas
We summarize the accomplishments of a multi-disciplinary workshop exploring the computational and scientific issues surrounding zero resource (unsupervised) speech technologies and related models of early language acquisition. Centered around the tasks of phonetic and lexical discovery, we consider unified evaluation metrics, present two new approaches for improving speaker independence in the absence of supervision, and evaluate the application of Bayesian word segmentation algorithms to automatic subword unit tokenizations. Finally, we present two strategies for integrating zero resource techniques into supervised settings, demonstrating the potential of unsupervised methods to improve mainstream technologies.
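The Bayesian word segmentation algorithms mentioned here are commonly implemented as Gibbs samplers over word-boundary indicators: each pass visits every potential boundary, temporarily removes the word or words it touches from the lexicon counts, and resamples the boundary in proportion to the predictive probability of one long word versus two shorter words. The following is a minimal sketch of such a sampler, assuming a Dirichlet-process unigram lexicon with a uniform-character, geometric-length base distribution; the constants ALPHA and P_STOP and all function names are illustrative choices for this example, not the workshop's actual models or settings.

import random
from collections import Counter

ALPHA = 20.0    # DP concentration parameter (illustrative value)
P_STOP = 0.5    # geometric word-length prior in the base distribution

def base_prob(word, n_chars):
    # Base distribution P0: characters uniform, word length geometric.
    return (1.0 / n_chars) ** len(word) * P_STOP * (1.0 - P_STOP) ** (len(word) - 1)

def word_prob(word, counts, total, n_chars):
    # Chinese-restaurant-process predictive probability given current counts.
    return (counts[word] + ALPHA * base_prob(word, n_chars)) / (total + ALPHA)

def words_of(u, b):
    # Split utterance u according to its boundary indicators b (len(u) - 1 of them).
    out, start = [], 0
    for j, cut in enumerate(b, start=1):
        if cut:
            out.append(u[start:j])
            start = j
    out.append(u[start:])
    return out

def gibbs_segment(utterances, n_iters=200, seed=0):
    rng = random.Random(seed)
    n_chars = len({c for u in utterances for c in u})
    bounds = [[rng.random() < 0.5 for _ in range(len(u) - 1)] for u in utterances]
    counts = Counter(w for u, b in zip(utterances, bounds) for w in words_of(u, b))
    total = sum(counts.values())

    for _ in range(n_iters):
        for i, u in enumerate(utterances):
            for j in range(len(u) - 1):
                # Words currently adjacent to the boundary between u[j] and u[j+1].
                left = j
                while left > 0 and not bounds[i][left - 1]:
                    left -= 1
                right = j + 1
                while right < len(u) - 1 and not bounds[i][right]:
                    right += 1
                w1, w2, w12 = u[left:j + 1], u[j + 1:right + 1], u[left:right + 1]
                # Remove the affected word(s) before scoring both hypotheses.
                if bounds[i][j]:
                    counts[w1] -= 1
                    counts[w2] -= 1
                    total -= 2
                else:
                    counts[w12] -= 1
                    total -= 1
                p_join = word_prob(w12, counts, total, n_chars)
                p_split = word_prob(w1, counts, total, n_chars)
                counts[w1] += 1
                p_split *= word_prob(w2, counts, total + 1, n_chars)
                counts[w1] -= 1
                cut = rng.random() < p_split / (p_split + p_join)
                bounds[i][j] = cut
                if cut:
                    counts[w1] += 1
                    counts[w2] += 1
                    total += 2
                else:
                    counts[w12] += 1
                    total += 1

    return [words_of(u, b) for u, b in zip(utterances, bounds)]

print(gibbs_segment(["thedog", "thedogbarks", "adoghides", "thedoghides"]))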
International Joint Conference on Natural Language Processing | 2015
Zhendong Zhao; Lan Du; Benjamin Börschinger; John K. Pate; Massimiliano Ciaramita; Mark Steedman; Mark Johnson
Most existing topic models make the bag-of-words assumption that words are generated independently, and so ignore potentially useful information about word order. Previous attempts to use collocations (short sequences of adjacent words) in topic models have either relied on a pipeline approach, restricted attention to bigrams, or resulted in models whose inference does not scale to large corpora. This paper studies how to simultaneously learn both collocations and their topic assignments. We present an efficient reformulation of the Adaptor Grammar-based topical collocation model (AG-colloc) (Johnson, 2010), and develop a point-wise sampling algorithm for posterior inference in this new formulation. We further improve the efficiency of the sampling algorithm by exploiting sparsity and parallelising inference. Experimental results on text classification, information retrieval and human evaluation tasks across a range of datasets show that this reformulation scales to hundreds of thousands of documents while maintaining the good performance of the AG-colloc model.
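As a rough illustration of what a point-wise move looks like when collocations and topics are learned jointly, the sketch below resamples, for one boundary between two adjacent tokens in a document, whether they form a single collocation and which topic or topics the resulting unit or units receive, under a collapsed Dirichlet-multinomial topic model whose "words" are collocation strings. This is an assumed toy model for exposition, not the AG-colloc reformulation itself; the constants K, ALPHA, BETA and VOCAB_SIZE and all names are hypothetical, and the grammar-based probability of the segmentation is omitted.

import random
from collections import defaultdict

K = 5               # number of topics (illustrative)
ALPHA = 0.5         # Dirichlet prior on per-document topic proportions
BETA = 0.1          # Dirichlet prior on per-topic collocation distributions
VOCAB_SIZE = 10000  # assumed size of the candidate-collocation vocabulary

def pointwise_resample(doc_id, left_tok, right_tok, counts, rng):
    # counts = (n_dz, n_zc, n_z, n_d): doc-topic, topic-collocation, topic and
    # document totals, with the two tokens being resampled already removed.
    n_dz, n_zc, n_z, n_d = counts
    merged = left_tok + "_" + right_tok

    def theta(z, extra_z=None):
        # Collapsed P(topic z | document), optionally correcting the counts
        # for one already-placed token with topic extra_z.
        bump = 1 if extra_z == z else 0
        add = 1 if extra_z is not None else 0
        return (n_dz[doc_id][z] + bump + ALPHA) / (n_d[doc_id] + add + K * ALPHA)

    def phi(tok, z, extra=None):
        # Collapsed P(collocation tok | topic z), with the same correction.
        bump = 1 if extra == (tok, z) else 0
        add = 1 if extra is not None and extra[1] == z else 0
        return (n_zc[z][tok] + bump + BETA) / (n_z[z] + add + VOCAB_SIZE * BETA)

    # Enumerate every option: merge into one collocation with topic z, or keep
    # two tokens with topics (z1, z2). A full model would also weight each
    # option by the probability of the segmentation itself; omitted here.
    options, weights = [], []
    for z in range(K):
        options.append(("merge", z))
        weights.append(theta(z) * phi(merged, z))
    for z1 in range(K):
        p1 = theta(z1) * phi(left_tok, z1)
        for z2 in range(K):
            p2 = theta(z2, extra_z=z1) * phi(right_tok, z2, extra=(left_tok, z1))
            options.append(("split", z1, z2))
            weights.append(p1 * p2)
    return rng.choices(options, weights=weights, k=1)[0]

# Usage with empty counts (priors only); a real sampler would sweep all
# boundaries in all documents and update the counts after each draw.
empty = (defaultdict(lambda: defaultdict(int)), defaultdict(lambda: defaultdict(int)),
         defaultdict(int), defaultdict(int))
print(pointwise_resample("doc0", "topic", "model", empty, random.Random(0)))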
Empirical Methods in Natural Language Processing | 2011
Benjamin Börschinger; Bevan K. Jones; Mark Johnson
Transactions of the Association for Computational Linguistics | 2014
Benjamin Börschinger; Mark Johnson
Meeting of the Association for Computational Linguistics | 2012
Benjamin Börschinger; Mark Johnson
Proceedings of the Australasian Language Technology Association Workshop 2011 | 2011
Benjamin Börschinger; Mark Johnson
Meeting of the Association for Computational Linguistics | 2013
Benjamin Börschinger; Mark Johnson; Katherine Demuth
International Conference on Computational Linguistics | 2014
Gabriel Synnaeve; Isabelle Dautriche; Benjamin Börschinger; Mark Johnson; Emmanuel Dupoux
Proceedings of the Fourth Annual Workshop on Cognitive Modeling and Computational Linguistics (CMCL) | 2013
Abdellah Fourtassi; Benjamin Börschinger; Mark Johnson; Emmanuel Dupoux
Cognitive Science | 2012
Stephan C. Meylan; Chigusa Kurumada; Michael C. Frank; Benjamin Börschinger; Mark Johnson