Merel Scholman
Saarland University
Publication
Featured research published by Merel Scholman.
Dialogue & Discourse | 2016
Merel Scholman; Jacqueline Evers-Vermeul; Ted Sanders
Over the last decades, annotating discourse coherence relations has gained increasing interest within the linguistics research community. Because of the complexity of coherence relations, there is no agreed-upon annotation standard, and current annotation methods often lack a systematic ordering of coherence relations. In this article, we investigate the usability of the cognitive approach to coherence relations, developed by Sanders et al. (1992, 1993), for discourse annotation. The theory proposes a taxonomy of coherence relations in terms of four cognitive primitives. We first develop a systematic, step-wise annotation process. The reliability of this annotation scheme is then tested in an annotation experiment with non-trained, non-expert annotators. An implicit and an explicit version of the annotation instructions were created to determine whether the type of instruction influences annotator agreement. The results show that two of the four primitives, polarity and order of the segments, can be applied reliably by non-trained annotators. The other two primitives, basic operation and source of coherence, are more problematic. Participants using the explicit instructions show higher agreement on the primitives than participants using the implicit instructions. These results are comparable to agreement statistics of other discourse corpora annotated by trained, expert annotators. Given that non-trained, non-expert annotators show similar levels of agreement, these results indicate that the cognitive approach to coherence relations is a promising method for annotating discourse.
Linguistic Annotation Workshop | 2017
Merel Scholman; Vera Demberg
Traditional discourse annotation tasks are considered costly and time-consuming, and the reliability and validity of these tasks are in question. In this paper, we investigate whether crowdsourcing can be used to obtain reliable discourse relation annotations. We also examine the influence of context on the reliability of the data. The results of a crowdsourced connective insertion task showed that the method can be used to obtain reliable annotations: the majority of the inserted connectives converged with the original label. Further, the method is sensitive to the fact that multiple senses can often be inferred for a single relation. Regarding the presence of context, the results show no significant difference in the distributions of insertions between conditions overall. However, a by-item comparison revealed several characteristics of segments that determine whether the presence of context makes a difference in annotations. The findings discussed in this paper can be taken as evidence that crowdsourcing is a valuable method for obtaining insights into the sense(s) of relations.
Language Resources and Evaluation | 2016
Ines Rehbein; Merel Scholman; Vera Demberg
Dialogue & Discourse | 2016
Merel Scholman; Jacqueline Evers-Vermeul; Ted Sanders
Journal of Memory and Language | 2017
Merel Scholman; Hannah Rohde; Vera Demberg
Corpus Linguistics and Linguistic Theory | 2018
Ted Sanders; Vera Demberg; Jet Hoek; Merel Scholman; Fatemeh Torabi Asr; Sandrine Zufferey; Jacqueline Evers-Vermeul
Meeting of the Association for Computational Linguistics | 2017
Jet Hoek; Merel Scholman
arXiv: Computation and Language | 2017
Vera Demberg; Fatemeh Torabi Asr; Merel Scholman
Dialogue & Discourse | 2017
Jacqueline Evers-Vermeul; Jet Hoek; Merel Scholman
Dialogue & Discourse | 2017
Merel Scholman; Vera Demberg