Publication


Featured research published by Özge Alaçam.


International Conference on Universal Access in Human-Computer Interaction | 2013

Towards designing audio assistance for comprehending haptic graphs: a multimodal perspective

Özge Alaçam; Christopher Habel; Cengiz Acartürk

Statistical graphs, such as line graphs, are widely used in multimodal communication settings. Language accompanies graphs, and humans produce gestures during the course of communication. For visually impaired people, haptic-audio interfaces provide perceptual access to graphical representations. The local and sequential character of haptic perception, however, introduces limitations in perceiving hard-to-encode information, which can be resolved by providing audio assistance. In this article, we first present a review of multimodal interactions between gesture, language, and graphical representations. We then focus on methodologies for investigating hard-to-encode information in graph comprehension. Finally, we present a case study that provides insight for designing audio assistance.


International Conference of Design, User Experience, and Usability | 2014

Developing a Verbal Assistance System for Line Graph Comprehension

Cengiz Acartürk; Özge Alaçam; Christopher Habel

Statistical graphs have been designed for accessible use by visually impaired users, and haptic devices provide an appropriate interface for haptic exploration of such graphs. However, haptic exploration of statistical graphs reveals a more local and sequential inspection pattern compared to visual exploration. This difference is usually attributed to different characteristics of the exploration processes, such as the bandwidth of information extraction. To facilitate information extraction from statistical graphs, alternative sensory modalities have been employed. In particular, line graphs have been represented by sound, leading to sonified graphs. Despite their demonstrated facilitating effects, sonified graphs have limitations with complex line representations. One method of overcoming those difficulties is to develop a verbal assistance system for haptic line graph comprehension. In the present article, we summarize our studies on designing and developing such a system, and we present the findings of a set of studies conducted with blindfolded and visually impaired participants.


International Conference of Design, User Experience, and Usability | 2015

Haptic Exploration Patterns in Virtual Line-Graph Comprehension

Özge Alaçam; Cengiz Acartürk; Christopher Habel

Multi-modal interfaces that provide haptic access to statistical line graphs, combined with verbal assistance, are proposed as an effective tool to meet the needs of visually impaired people. Graphs do not only present data; they also elicit the extraction of second-order entities such as maxima or trends, which are closely linked to the shape properties of the graphs. In an experimental study, we investigated collaborative joint activities between haptic explorers of graphs and verbal assistants who helped the explorers to conceptualize local and non-local second-order concepts. The assistants have to decide not only what to say but, in particular, when to say it. Based on the empirical data from this experiment, we describe in the present paper the design of a feature set for describing patterns of haptic exploration that is able to characterize the need for verbal assistance during the course of haptic exploration. We employed a supervised classification algorithm, namely the J48 decision tree. The constructed features, ranging from basic low-level user-action features to complex high-level conceptual features, were categorized into four feature sets. All feature-set combinations achieved a high accuracy level. The best results in terms of sensitivity and specificity were achieved by adding the low-level graphical features.
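
As a rough illustration of the classification setup described above, the sketch below trains a decision tree to flag moments in a haptic exploration that might call for verbal assistance. It is not the authors' pipeline: the paper used Weka's J48 (a C4.5 implementation), approximated here by scikit-learn's CART-based DecisionTreeClassifier, and the feature names and data are hypothetical placeholders.

```python
# Illustrative stand-in for a J48-style classifier over haptic-exploration features.
# NOTE: feature names and synthetic data are hypothetical; this is not the original pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(0)
n = 400

# Hypothetical features, spanning low-level user actions to graphical properties:
# dwell time, movement speed, distance to the nearest extremum, local curvature.
X = np.column_stack([
    rng.exponential(1.0, n),   # dwell_time_s
    rng.normal(0.5, 0.2, n),   # movement_speed
    rng.uniform(0.0, 1.0, n),  # dist_to_extremum
    rng.normal(0.0, 1.0, n),   # local_curvature
])
# Hypothetical label: 1 = explorer would benefit from verbal assistance now.
y = ((X[:, 0] > 1.2) & (X[:, 2] > 0.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)

# Report accuracy, sensitivity (recall of the positive class), and specificity
# (recall of the negative class), the measures discussed in the abstract.
print("accuracy:   ", accuracy_score(y_test, pred))
print("sensitivity:", recall_score(y_test, pred, pos_label=1))
print("specificity:", recall_score(y_test, pred, pos_label=0))
```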


Diagrams '10: Proceedings of the 6th International Conference on Diagrammatic Representation and Inference | 2010

Effects of graph type in the comprehension of cyclic events

Özge Alaçam; Annette Hohenberger; Kursat Cagiltay

This study presents an analysis of the effect of different graph types on the comprehension of cyclic events. The results indicated that although round and linear graph designs are informationally equivalent, round graphs are computationally better suited than linear graphs for the interpretation of cyclic concepts.


Annual International Symposium on Information Management and Big Data | 2017

A Multi-modal Data-Set for Systematic Analyses of Linguistic Ambiguities in Situated Contexts

Özge Alaçam; Tobias Staron; Wolfgang Menzel

Human situated language processing involves the interaction of linguistic and visual processing, and this cross-modal integration helps to resolve ambiguities and to predict what will be revealed next in an unfolding sentence during spoken communication. However, most state-of-the-art parsing approaches rely solely on the language modality. This paper introduces a multi-modal data-set addressing challenging linguistic structures and visual complexities that state-of-the-art parsers should be able to deal with. It also briefly describes a multi-modal parsing approach and a proof-of-concept study showing the contribution of employing visual information during disambiguation.
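
To make the disambiguation idea concrete, here is a deliberately tiny sketch, not the paper's parser, in which candidate interpretations of an ambiguous prepositional-phrase attachment are filtered against relations observed in a visual scene. All relation names and data structures are hypothetical.

```python
# Toy illustration of visual context constraining an ambiguous attachment decision.
from typing import List, Set, Tuple

Relation = Tuple[str, str, str]  # (subject, relation, object)

def disambiguate(candidates: List[Relation], scene: Set[Relation]) -> Relation:
    """Return the candidate interpretation supported by the scene, if any."""
    supported = [c for c in candidates if c in scene]
    # Fall back to the language-only preference (first candidate) if the scene is uninformative.
    return supported[0] if supported else candidates[0]

# "The woman saw the man with the telescope": does the PP attach to the verb
# (instrument reading) or to the noun (modifier reading)?
candidates = [
    ("woman", "sees_with", "telescope"),  # verb attachment: the woman uses the telescope
    ("man", "holds", "telescope"),        # noun attachment: the man has the telescope
]
scene = {("man", "holds", "telescope"), ("woman", "stands_near", "man")}

print(disambiguate(candidates, scene))  # -> ('man', 'holds', 'telescope')
```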


Cognitive Processing | 2015

Switching reference frame preferences during verbally assisted haptic graph comprehension

Özge Alaçam; Christopher Habel; Cengiz Acartürk

Haptic-audio interfaces allow haptic exploration of statistical line graphs accompanied by sound or speech, thus providing access to exploration by visually impaired people. Verbally assisted haptic graph exploration can be seen as a task-oriented collaborative activity between two partners, a haptic explorer and an observing assistant, who have individual preferences for using reference frames. The experimental findings reveal that haptic explorers' spatial reference frames are mostly induced by hand movements, leading to an action perspective instead of the conventional left-to-right spatiotemporal perspective. Moreover, the communicational goal may result in a switch in perspective.


International Conference on Human-Computer Interaction | 2009

A Usability Study of WebMaps with Eye Tracking Tool: The Effects of Iconic Representation of Information

Özge Alaçam; Mustafa Dalcı


Journal of Eye Movement Research | 2008

Multimodal Comprehension of Language and Graphics: Graphs with and without annotations

Cengiz Acartürk; Christopher Habel; Kursat Cagiltay; Özge Alaçam


Cognitive Science | 2012

Gestures in Communication through Line Graphs

Cengiz Acartürk; Özge Alaçam


Proceedings of the 2nd European and the 5th Nordic Symposium on Multimodal Communication, August 6-8, 2014, Tartu, Estonia | 2015

Verbally Assisted Haptic Graph Comprehension: The Role of Taking Initiative in Joint Activity

Özge Alaçam; Christopher Habel; Cengiz Acartürk

Collaboration


Dive into Özge Alaçam's collaborations.

Top Co-Authors

Cengiz Acartürk, Middle East Technical University
Kursat Cagiltay, Middle East Technical University
Annette Hohenberger, Middle East Technical University
Feride Erdal, Middle East Technical University
Mustafa Dalcı, Middle East Technical University