Brad A. Myers
Carnegie Mellon University
Publications
Featured research published by Brad A. Myers.
ACM Transactions on Computer-Human Interaction | 2000
Brad A. Myers; Scott E. Hudson; Randy Pausch
A user interface software tool helps developers design and implement the user interface. Research on past tools has had enormous impact on today's developers—virtually all applications today are built using some form of user interface tool. In this article, we consider cases of both success and failure in past user interface tools. From these cases we extract a set of themes which can serve as lessons for future work. Using these themes, past tools can be characterized by what aspects of the user interface they addressed, their threshold and ceiling, what path of least resistance they offer, how predictable they are to use, and whether they addressed a target that became irrelevant. We believe the lessons of these past themes are particularly important now, because increasingly rapid technological changes are likely to significantly change user interfaces. We are at the dawn of an era where user interfaces are about to break out of the “desktop” box where they have been stuck for the past 15 years. The next millennium will open with an increasing diversity of user interfaces on an increasing diversity of computerized devices. These devices include hand-held personal digital assistants (PDAs), cell phones, pagers, computerized pens, computerized notepads, and various kinds of desk- and wall-size computers, as well as devices in everyday objects (such as those mounted on refrigerators, or even embedded in truck tires). The increased connectivity of computers, initially evidenced by the World Wide Web, but spreading also with technologies such as personal-area networks, will also have a profound effect on the user interface to computers. Another important force will be recognition-based user interfaces, especially speech, and camera-based vision systems. Other changes we see are an increasing need for 3D and end-user customization, programming, and scripting. All of these changes will require significant support from the underlying user interface software tools.
Human Factors in Computing Systems | 1986
William Buxton; Brad A. Myers
Two experiments were run to investigate two-handed input. The experimental tasks were representative of those found in CAD and office information systems. Experiment one involved the performance of a compound selection/positioning task. The two sub-tasks were performed by different hands using separate transducers. Without prompting, novice subjects adopted strategies that involved performing the two sub-tasks simultaneously. We interpret this as a demonstration that, in the appropriate context, users are capable of simultaneously providing continuous data from two hands without significant overhead. The results also show that the speed of performing the task was strongly correlated to the degree of parallelism employed. Experiment two involved the performance of a compound navigation/selection task. It compared a one-handed versus two-handed method for finding and selecting words in a document. The two-handed method significantly outperformed the commonly used one-handed method by a number of measures. Unlike experiment one, only two subjects adopted strategies that used both hands simultaneously. The benefits of the two-handed technique, therefore, are interpreted as being due to efficiency of hand motion. However, the two subjects who did use parallel strategies had the two fastest times of all subjects.
Human Factors in Computing Systems | 1995
James A. Landay; Brad A. Myers
Current interactive user interface construction tools are often more of a hindrance than a benefit during the early stages of user interface design. These tools take too much time to use and force designers to specify more of the design details than they wish to at this early stage. Most interface designers, especially those who have a background in graphic design, prefer to sketch early interface ideas on paper or on a white-board. We are developing an interactive tool that allows designers to quickly sketch an interface using an electronic pad and stylus. Our tool preserves the important properties of paper: A rough drawing can be produced very quickly and the medium is very flexible. However, unlike a paper sketch this electronic sketch can easily be exercised and modified. In addition, our system allows designers to examine, annotate, and edit a complete history of the design. When the designer is satisfied with this early prototype, the system can transform the sketch into a complete, finished interface in a specified look-and-feel. This transformation takes place with the guidance of the designer. By supporting the early design phases of the software life-cycle, our tool should both ease the prototyping of user interfaces and improve the speed with which a final interface can be created.
IEEE Computer | 2001
James A. Landay; Brad A. Myers
Researchers at University of California, Berkeley and Carnegie Mellon University have designed, implemented, and evaluated SILK (Sketching Interfaces Like Krazy), an informal sketching tool that combines many of the benefits of paper-based sketching with the merits of current electronic tools. With SILK, designers can quickly sketch an interface using an electronic pad and stylus, and SILK recognizes widgets and other interface elements as the designer draws them. Unlike paper-based sketching, however, designers can exercise these elements in their sketchy state. For example, a sketched scroll-bar is likely to contain an elevator or thumbnail, the small rectangle a user drags with a mouse. In a paper sketch, the elevator would just sit there, but in a SILK sketch, designers can drag it up and down, which lets them test component or widget behavior. SILK also supports the creation of storyboards: the arrangement of sketches to show how design elements behave, such as how a dialog box appears when the user activates a button. Storyboards are important because they give designers a way to show colleagues, customers, or end users early on how an interface will behave.
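The storyboard idea described above can be illustrated with a toy model (purely illustrative; this is not SILK's internal representation): each screen is a named sketch, and a transition maps a (screen, sketched element) pair to the screen shown when that element is activated during playback.

```python
# Hypothetical storyboard model: screens are named sketches, and activating
# a sketched element on one screen moves playback to another screen.
transitions = {
    ("main", "ok_button"): "dialog",     # activating the OK button on "main"
    ("dialog", "close_button"): "main",  # closing the dialog returns to "main"
}

def activate(screen, element):
    """Return the next screen when `element` on `screen` is activated."""
    # Elements with no outgoing transition leave the current screen unchanged.
    return transitions.get((screen, element), screen)

print(activate("main", "ok_button"))  # the sketched dialog box appears
```

This mirrors the dialog-box example in the abstract: linking the sketched button to the sketched dialog lets colleagues or end users see the behavior before any real interface exists.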
Conference on Computer Supported Cooperative Work | 1998
Brad A. Myers; Herb Stiel; Robert Gargiulo
The Pebbles project is creating applications to connect multiple Personal Digital Assistants (PDAs) to a main computer such as a PC. We are using 3Com PalmPilots because they are starting to be ubiquitous. We created the “Remote Commander” application to allow users to take turns sending input from their PalmPilots to the PC as if they were using the PC’s mouse and keyboard. “PebblesDraw” is a shared whiteboard application we built that allows all of the users to send input simultaneously while sharing the same PC display. We are investigating the use of these applications in various contexts, such as co-located meetings.
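The turn-taking behavior of Remote Commander can be sketched as follows (a hypothetical model, not the actual Pebbles protocol): several PDAs are connected, but only the one currently holding the turn has its events delivered to the PC's input queue.

```python
# Hypothetical turn-taking input forwarding, in the spirit of Remote Commander:
# many PDAs connect, but only the turn holder's input reaches the PC.
class RemoteCommander:
    def __init__(self):
        self.turn_holder = None  # PDA currently allowed to send input
        self.pc_events = []      # events delivered to the PC's mouse/keyboard

    def take_turn(self, pda_id):
        self.turn_holder = pda_id

    def send(self, pda_id, event):
        """Forward `event` to the PC only if `pda_id` holds the turn."""
        if pda_id == self.turn_holder:
            self.pc_events.append(event)
            return True
        return False  # input from other PDAs is ignored until they take a turn

rc = RemoteCommander()
rc.take_turn("pilot-1")
rc.send("pilot-1", ("key", "a"))  # delivered
rc.send("pilot-2", ("key", "b"))  # dropped: not this PDA's turn
```

A shared-display application like PebblesDraw would instead accept input from all connected PDAs simultaneously, which is the key difference the abstract draws between the two applications.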
Interactions | 1998
Brad A. Myers
This article summarizes the historical development of major advances in human-computer interaction technology, emphasizing the pivotal role of university research in the advancement of the field.
ACM Computing Surveys | 2011
Andrew J. Ko; Robin Abraham; Laura Beckwith; Alan F. Blackwell; Margaret M. Burnett; Martin Erwig; Christopher Scaffidi; Joseph Lawrance; Henry Lieberman; Brad A. Myers; Mary Beth Rosson; Gregg Rothermel; Mary Shaw; Susan Wiedenbeck
Most programs today are written not by professional software developers, but by people with expertise in other domains working towards goals for which they need computational support. For example, a teacher might write a grading spreadsheet to save time grading, or an interaction designer might use an interface builder to test some user interface design ideas. Although these end-user programmers may not have the same goals as professional developers, they do face many of the same software engineering challenges, including understanding their requirements, as well as making decisions about design, reuse, integration, testing, and debugging. This article summarizes and classifies research on these activities, defining the area of End-User Software Engineering (EUSE) and related terminology. The article then discusses empirical research about end-user software engineering activities and the technologies designed to support them. The article also addresses several crosscutting issues in the design of EUSE tools, including the roles of risk, reward, domain complexity, and self-efficacy, as well as the potential of educating users about software engineering principles.
User Interface Software and Technology | 2002
Jeffrey Nichols; Brad A. Myers; Michael J. Higgins; Joseph Hughes; Thomas K. Harris; Roni Rosenfeld; Mathilde Pignol
The personal universal controller (PUC) is an approach for improving the interfaces to complex appliances by introducing an intermediary graphical or speech interface. A PUC engages in two-way communication with everyday appliances, first downloading a specification of the appliance's functions, and then automatically creating an interface for controlling that appliance. The specification of each appliance includes a high-level description of every function, a hierarchical grouping of those functions, and dependency information, which relates the availability of each function to the appliance's state. Dependency information makes it easier for designers to create specifications and helps the automatic interface generators produce a higher quality result. We describe the architecture that supports the PUC, and the interface generators that use our specification language to build high-quality graphical and speech interfaces.
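The three parts of the appliance specification described above can be sketched as a small data structure (the names and structure here are illustrative assumptions, not the actual PUC specification language): per-function descriptions, a hierarchical grouping, and state-based dependency information that an interface generator can use to enable or disable controls.

```python
# Hypothetical appliance specification with the three components the
# abstract names: function descriptions, grouping, and dependencies.
APPLIANCE_SPEC = {
    "functions": {
        "power":  {"label": "Power",  "type": "toggle"},
        "play":   {"label": "Play",   "type": "command"},
        "volume": {"label": "Volume", "type": "range", "min": 0, "max": 10},
    },
    # Hierarchical grouping: a generator can map groups onto panels or menus.
    "groups": [
        {"name": "Basic", "members": ["power", "play"]},
        {"name": "Audio", "members": ["volume"]},
    ],
    # Dependency information: a function is available only when the current
    # appliance state satisfies its condition.
    "dependencies": {
        "play":   lambda state: state["power"] == "on",
        "volume": lambda state: state["power"] == "on",
    },
}

def available_functions(spec, state):
    """Return the functions an interface generator should enable for `state`."""
    deps = spec["dependencies"]
    return [name for name in spec["functions"]
            if deps.get(name, lambda s: True)(state)]

# With the appliance off, only the power toggle should be enabled.
print(available_functions(APPLIANCE_SPEC, {"power": "off"}))
```

Encoding dependencies declaratively like this is what lets a generator gray out unavailable controls automatically, which is the "higher quality result" the abstract attributes to dependency information.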
Symposium on Visual Languages and Human-Centric Computing | 2004
Andrew J. Ko; Brad A. Myers; Htet Htet Aung
As programming skills increase in demand and utility, the learnability of end-user programming systems is of utmost importance. However, research on learning barriers in programming systems has primarily focused on languages, overlooking potential barriers in the environment and accompanying libraries. To address this, a study of beginning programmers learning Visual Basic.NET was performed. This identified six types of barriers: design, selection, coordination, use, understanding, and information. These barriers inspire a new metaphor of computation, which provides a more learner-centric view of programming system design.
ACM Transactions on Computer-Human Interaction | 1995
Brad A. Myers
Almost as long as there have been user interfaces, there have been special software systems and tools to help design and implement the user interface software. Many of these tools have demonstrated significant productivity gains for programmers, and have become important commercial products. Others have proven less successful at supporting the kinds of user interfaces people want to build. This article discusses the different kinds of user interface software tools, and investigates why some approaches have worked and others have not. Many examples of commercial and research systems are included. Finally, current research directions and open issues in the field are discussed.