
Publication


Featured research published by Lauren R. Milne.


Conference on Computers and Accessibility | 2014

BraillePlay: educational smartphone games for blind children

Lauren R. Milne; Cynthia L. Bennett; Richard E. Ladner; Shiri Azenkot

There are many educational smartphone games for children, but few are accessible to blind children. We present BraillePlay, a suite of accessible games for smartphones that teach Braille character encodings to promote Braille literacy. The BraillePlay games are based on VBraille, a method for displaying Braille characters on a smartphone. BraillePlay includes four games of varying levels of difficulty: VBReader and VBWriter simulate Braille flashcards, and VBHangman and VBGhost incorporate Braille character identification and recall into word games. We evaluated BraillePlay with a longitudinal study in the wild with eight blind children. Through logged usage data and extensive interviews, we found that all but one participant were able to play the games independently and found them enjoyable. We also found evidence that some children learned Braille concepts. We distill implications for the design of games for blind children and discuss lessons learned.
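The VBraille idea underlying these games can be illustrated with a small sketch. The dot layout and hit-testing logic below are illustrative assumptions, not the published implementation: a braille cell has six dot positions in two columns, a touch is mapped to the dot region under the finger, and the phone vibrates when that dot is raised for the current letter.

```python
# Sketch of a VBraille-style touch cell (illustrative, not the published code).
# A braille cell has six dot positions, numbered 1-3 down the left column
# and 4-6 down the right column. Each letter raises a subset of dots.

# Standard braille encodings for a few letters (dot numbers).
BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4},
    "d": {1, 4, 5}, "l": {1, 2, 3},
}

def dot_at(x, y, cols=2, rows=3):
    """Map a touch at normalized (x, y) in [0, 1) to a dot number 1-6."""
    col = min(int(x * cols), cols - 1)      # 0 = left column, 1 = right
    row = min(int(y * rows), rows - 1)      # 0 = top .. 2 = bottom
    return col * 3 + row + 1

def should_vibrate(letter, x, y):
    """True if the touched region holds a raised dot for this letter."""
    return dot_at(x, y) in BRAILLE[letter]

# Touching the top-left of the cell for "a" hits dot 1, which is raised.
print(should_vibrate("a", 0.1, 0.1))   # True
print(should_vibrate("a", 0.9, 0.9))   # dot 6 is not raised for "a": False
```

The dot numbering and letter encodings follow the standard braille convention; only the touch-to-dot mapping is a guess at how such a display could work.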


Conference on Computers and Accessibility | 2015

Exploring the Opportunities and Challenges with Exercise Technologies for People who are Blind or Low-Vision

Kyle Rector; Lauren R. Milne; Richard E. Ladner; Batya Friedman; Julie A. Kientz

People who are blind or low-vision may have a harder time participating in exercise due to inaccessibility or lack of experience. We employed Value Sensitive Design (VSD) to explore the potential of technology to enhance exercise for people who are blind or low-vision. We conducted 20 semi-structured interviews about exercise and technology with 10 people who are blind or low-vision and 10 people who facilitate fitness for people who are blind or low-vision. We also conducted a survey with 76 people to learn about outsider perceptions of hypothetical exercise with people who are blind or low-vision. Based on our interviews and survey, we found opportunities for technology development in four areas: 1) mainstream exercise classes, 2) exercise with sighted guides, 3) rigorous outdoor activity, and 4) navigation of exercise spaces. Design considerations should include when and how to deliver auditory or haptic information based on exercise and context, and whether it is acceptable to develop less mainstream technologies if they enhance mainstream exercise. The findings of this work seek to inform the design of accessible exercise technologies.


Conference on Computers and Accessibility | 2014

Tactile graphics with a voice: using QR codes to access text in tactile graphics

Catherine M. Baker; Lauren R. Milne; Jeffrey Scofield; Cynthia L. Bennett; Richard E. Ladner

Textbook figures are often converted into a tactile format for access by blind students. These figures are not truly accessible unless the text within the figures is also made accessible. A common solution to access text in a tactile image is to use embossed Braille. We have developed an alternative to Braille that uses QR codes for students who want tactile graphics, but prefer the text in figures be spoken, rather than in Braille. Tactile Graphics with a Voice (TGV) allows text within tactile graphics to be accessible by using a talking QR code reader app on a smartphone. To evaluate TGV, we performed a longitudinal study where ten blind and low-vision participants were asked to complete tasks using three picture-taking guidance techniques: 1) no guidance, 2) verbal guidance, and 3) finger-pointing guidance. Our results show that TGV is an effective way to access text in tactile graphics, especially for those blind users who are not fluent in Braille. In addition, guidance preferences varied, with each of the guidance techniques being preferred by at least one participant.


Human Factors in Computing Systems | 2015

StructJumper: A Tool to Help Blind Programmers Navigate and Understand the Structure of Code

Catherine M. Baker; Lauren R. Milne; Richard E. Ladner

It can be difficult for a blind developer to understand and navigate through a large amount of code quickly, as they are unable to skim as easily as their sighted counterparts. To help blind developers overcome this problem, we present StructJumper, an Eclipse plugin that creates a hierarchical tree based on the nesting structure of a Java class. The programmer can use the TreeView to get an overview of the code structure of the class (including all the methods and control flow statements) and can quickly switch between the TreeView and the Text Editor to get an idea of where they are within the nested structure. To evaluate StructJumper, we had seven blind programmers complete three tasks with and without our tool. We found that participants said they would use StructJumper, and we observed a trend toward faster task completion with it.
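StructJumper itself targets Java inside Eclipse, but the core idea of a tree mirroring the nesting of methods and control-flow statements can be sketched in a few lines. This analogous sketch (an assumption about the approach, not the plugin's code) uses Python's `ast` module to collect nested functions and control-flow nodes with their depth:

```python
# Sketch of a StructJumper-style nesting tree. StructJumper parses Java in
# Eclipse; this analogous sketch uses Python's stdlib ast module instead.
import ast

SOURCE = """
def search(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1
"""

# Node types that would appear in the hierarchical tree.
INTERESTING = (ast.FunctionDef, ast.For, ast.While, ast.If, ast.Try)

def nesting_tree(node, depth=0, out=None):
    """Collect (depth, label) pairs for functions and control-flow statements."""
    if out is None:
        out = []
    for child in ast.iter_child_nodes(node):
        if isinstance(child, INTERESTING):
            label = getattr(child, "name", type(child).__name__)
            out.append((depth, label))
            nesting_tree(child, depth + 1, out)
        else:
            # Other nodes don't add a tree level; keep scanning inside them.
            nesting_tree(child, depth, out)
    return out

for depth, label in nesting_tree(ast.parse(SOURCE)):
    print("  " * depth + label)
# search
#   For
#     If
```

A screen reader walking this indented tree gives the same "where am I in the nesting?" answer that the TreeView provides visually.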


ACM Transactions on Accessible Computing | 2016

Tactile Graphics with a Voice

Catherine M. Baker; Lauren R. Milne; Ryan Drapeau; Jeffrey Scofield; Cynthia L. Bennett; Richard E. Ladner

We discuss the development of Tactile Graphics with a Voice (TGV), a system used to access label information in tactile graphics using QR codes. Blind students often rely on tactile graphics to access textbook images. Many textbook images have a large number of text labels that need to be made accessible. In order to do so, we propose TGV, which uses QR codes to replace the text, as an alternative to Braille. The codes are read with a smartphone application. We evaluated the system with a longitudinal study where 10 blind and low-vision participants completed tasks using three different modes on the smartphone application: (1) no guidance, (2) verbal guidance, and (3) finger-pointing guidance. Our results show that TGV is an effective way to access text in tactile graphics, especially for those blind users who are not fluent in Braille. We also found that preferences varied greatly across the modes, indicating that future work should support multiple modes. We expand upon the algorithms we used to implement the finger-pointing guidance and to automatically place QR codes on documents. We also discuss work we have started on creating a Google Glass version of the application.
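One sub-step of finger-pointing guidance can be illustrated with simple geometry: when several QR codes are detected in the camera frame, pick the one whose center lies closest to the detected fingertip. The selection rule below is purely an illustrative assumption; the published algorithms cover detection and guidance in full.

```python
# Illustrative sketch of one sub-step of finger-pointing guidance: among
# several detected QR codes, select the one nearest the fingertip.
# (An assumption about the selection rule, not the published TGV algorithm.)
import math

def center(box):
    """Center of a bounding box given as (x, y, width, height) in pixels."""
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def pick_code(finger, boxes):
    """Return the index of the QR bounding box whose center is nearest."""
    return min(
        range(len(boxes)),
        key=lambda i: math.dist(finger, center(boxes[i])),
    )

# Two detected codes; the fingertip is near the second one.
boxes = [(0, 0, 50, 50), (200, 40, 50, 50)]
print(pick_code((210, 60), boxes))   # 1
```

In a real pipeline the bounding boxes would come from a QR detector and the fingertip from the camera image; here both are hard-coded for the sake of a runnable example.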


ACM SIGACCESS Accessibility and Computing | 2017

Blocks4All: making block programming languages accessible for blind children

Lauren R. Milne

Block programming languages, such as Scratch and Blockly, are being used to introduce children to programming, but because they rely heavily on visual aspects, blind children are being left behind their peers in access to computer science education. We propose finding new techniques to make these types of programs accessible to blind children. We plan to use an iterative design process to create a web-based application on a touchscreen laptop, where children can synthesize music using different instruments and recordings of themselves. We plan to work with students at a local school to test and refine initial prototypes in a workshop setting. We will then evaluate the final prototype in a longitudinal study with students: collecting the programs that they create over a two-week period, and conducting observations and interviews throughout that period in order to evaluate Blocks4All.


Human Factors in Computing Systems | 2018

Values, Identity, and Social Translucence: Neurodiverse Student Teams in Higher Education

Annuska Zolyomi; Anne Spencer Ross; Arpita Bhattacharya; Lauren R. Milne; Sean A. Munson

To successfully function within a team, students must develop a range of skills for communication, organization, and conflict resolution. For students on the autism spectrum, these skills mirror the social, communicative, and cognitive experiences that can often be challenging for these learners. Since instructors and students collaborate using a mix of technology, we investigated the technology needs of neurodiverse teams composed of autistic and non-autistic students. We interviewed seven autistic students and five employees of disability services in higher education. Our analysis focused on technology stakeholder values, stages of small-group development, and Social Translucence -- a model for online collaboration highlighting principles of visibility, awareness, and accountability. Despite motivation to succeed, neurodiverse students have difficulty expressing individual differences and addressing team conflict. To support future design of technology for neurodiverse teams, we propose: (1) a design space and design concepts including collaborative and affective computing tools, and (2) extending Social Translucence to account for student and group identities.


Conference on Computers and Accessibility | 2014

Tactile graphics with a voice demonstration

Catherine M. Baker; Lauren R. Milne; Jeffrey Scofield; Cynthia L. Bennett; Richard E. Ladner

Textbook images are converted into tactile graphics to be made accessible to blind and low-vision students. The text labels on these graphics are an important part of the image and must be made accessible as well. The graphics usually have the labels embossed in Braille. However, there are some blind and low-vision students who cannot read Braille and need to be able to access the labels in a different manner. We present Tactile Graphics with a Voice, a system that encodes the labels in QR codes, which can be read aloud using TGV, the application we developed. TGV provides feedback to support the user in scanning the QR code and allows the user to select which QR code to scan when multiple are close together.


Human Factors in Computing Systems | 2018

Blocks4All: Overcoming Accessibility Barriers to Blocks Programming for Children with Visual Impairments

Lauren R. Milne; Richard E. Ladner

Blocks-based programming environments are a popular tool to teach children to program, but they rely heavily on visual metaphors and are therefore not fully accessible for children with visual impairments. We evaluated existing blocks-based environments and identified five major accessibility barriers for visually impaired users. We explored techniques to overcome these barriers in an interview with a teacher of the visually impaired and formative studies on a touchscreen blocks-based environment with five children with visual impairments. We distill our findings on usable touchscreen interactions into guidelines for designers of blocks-based environments.


Conference on Computers and Accessibility | 2017

Blocks4All Demonstration: a Blocks-Based Programming Environment for Blind Children

Lauren R. Milne; Catherine M. Baker; Richard E. Ladner

Blocks-based programming environments, such as Scratch and Blockly, are designed to make learning programming easier for young children. They are increasingly being used for both formal and informal curricula, such as many of Code.org's Hour of Code projects. However, these block-based environments rely heavily on visual metaphors and interactions, making them inaccessible for blind children. We describe our initial design of a touchscreen-based blocks environment that is accessible for blind children.

Collaboration


Dive into Lauren R. Milne's collaborations.

Top Co-Authors


Sean A. Munson

University of Washington

Batya Friedman

University of Washington
