
Publication


Featured research published by Patricia Maes.


user interface software and technology | 2010

LuminAR: portable robotic augmented reality interface design and prototype

Natan Linder; Patricia Maes

In this paper we introduce LuminAR: a prototype for a new portable and compact projector-camera system designed to use the traditional incandescent bulb interface as a power source, and a robotic desk lamp that carries it, enabling dynamic motion capabilities. We are exploring how the LuminAR system, embodied in the familiar form factor of a classic Anglepoise lamp, may evolve into a new class of robotic, digital information devices.


augmented human international conference | 2013

EyeRing: a finger-worn input device for seamless interactions with our surroundings

Suranga Nanayakkara; Roy Shilkrot; Kian Peen Yeo; Patricia Maes

Finger-worn interfaces remain a vastly unexplored space for user interfaces, despite the fact that our fingers and hands are naturally used for referencing and interacting with the environment. In this paper we present design guidelines and implementation of a finger-worn I/O device, the EyeRing, which leverages the universal and natural gesture of pointing. We present use cases of EyeRing for both visually impaired and sighted people. We discuss initial reactions from visually impaired users which suggest that EyeRing may indeed offer a more seamless solution for dealing with their immediate surroundings than the solutions they currently use. We also report on a user study that demonstrates how EyeRing reduces effort and disruption to a sighted user. We conclude that this highly promising form factor offers both audiences enhanced, seamless interaction with information related to objects in the environment.


human factors in computing systems | 2011

MemTable: an integrated system for capture and recall of shared histories in group workspaces

Seth E. Hunter; Patricia Maes; Stacey D. Scott; Henry Kaufman

This paper presents the design, implementation, and evaluation of an interactive tabletop system that supports co-located meeting capture and asynchronous search and review of past meetings. The goal of the project is to evaluate the design of a conference table that augments the everyday work patterns of small collaborative groups by incorporating an integrated annotation system. We present a holistic design that values hardware ergonomics, supports heterogeneous input modalities, generates a memory of all user interactions, and provides access to historical data on and off the table. We present a user evaluation that assesses the usefulness of the input modalities and software features, and validates the effectiveness of the MemTable system as a tool for assisting memory recall.


human factors in computing systems | 2012

EyeRing: a finger-worn assistant

Suranga Nanayakkara; Roy Shilkrot; Patricia Maes

Finger-worn interfaces are a vastly unexplored space for interaction design, opening a world of possibilities for solving day-to-day problems for both visually impaired and sighted people. In this work we present EyeRing, a novel design and concept for a finger-worn device. We show how the proposed system may serve numerous applications for visually impaired people, such as recognizing currency notes and navigating, as well as helping sighted people tour an unknown city or intuitively translate signage. The ring apparatus is autonomous; however, it is complemented by a mobile phone or other computation device, to which it connects wirelessly, and an earpiece for information retrieval. Finally, we discuss how finger-worn sensors may be extended and applied to other domains.


human computer interaction with mobile devices and services | 2011

PoCoMo: projected collaboration using mobile devices

Roy Shilkrot; Seth E. Hunter; Patricia Maes

As personal projection devices become more common they will be able to support a range of exciting and unexplored social applications. We present a novel system and method that enables playful social interactions between multiple projected characters. The prototype consists of two mobile projector-camera systems, with lightly modified existing hardware, and computer vision algorithms to support a selection of applications and example scenarios. Our system allows participants to discover the characteristics and behaviors of other characters projected in the environment. The characters are guided by hand movements, and can respond to objects and other characters, to simulate a mixed reality of life-like entities.


human factors in computing systems | 2012

EyeRing: an eye on a finger

Suranga Nanayakkara; Roy Shilkrot; Patricia Maes

Finger-worn devices are a greatly underutilized form of interaction with the surrounding world. By putting a camera on a finger we show that many visual analysis applications, for visually impaired people as well as the sighted, prove seamless and easy. We present EyeRing, a ring-mounted camera, to enable applications such as identifying currency and navigating, as well as helping sighted people tour an unknown city or intuitively translate signage. The ring apparatus is autonomous; however, our system also includes a mobile phone or computation device, to which it connects wirelessly, and an earpiece for information retrieval. Finally, we discuss how different finger-worn sensors may be extended and applied to other domains.


international conference on computer graphics and interactive techniques | 2014

Physical rendering with a digital airbrush

Roy Shilkrot; Patricia Maes; Amit Zoran

Airbrush painting is an expressive art form that allows for unrepeatable spray patterns and unique ink staining. Artists utilize these properties while painting, expressing subjective style and artistic intentions. We present an augmented airbrush device that acts both as a physical spraying device and an intelligent digital guiding tool, maintaining both manual and computerized control. We demonstrate our custom-designed hardware and the numerous algorithms that control it through hands-on usage examples of a human-computer collaborative physical painting effort.


augmented human international conference | 2013

The design of artifacts for augmenting intellect

Cassandra T. Xia; Patricia Maes

Fifty years ago, Doug Engelbart created a conceptual framework for augmenting human intellect in the context of problem-solving. We expand upon Engelbart's framework and use his concepts of process hierarchies and artifact augmentation for the design of personal intelligence augmentation (IA) systems within the domains of memory, motivation, decision making, and mood. This paper proposes a systematic design methodology for personal IA devices, organizes existing IA research within a logical framework, and uncovers underexplored areas of IA that could benefit from the invention of new artifacts.


international symposium on mixed and augmented reality | 2014

nARratives of augmented worlds

Roy Shilkrot; Nick Montfort; Patricia Maes

This paper presents an examination of augmented reality (AR) as a rising form of interactive narrative that combines computer-generated elements with reality, fictional with non-fictional objects, in the same immersive experience. Based on contemporary theory in narratology, we propose to view this blending of reality worlds as a metalepsis, a transgression of the boundaries between reality and fiction, and argue that authors could benefit from using existing conventions of narration to emphasize the transgressed boundaries, as is done in other media. Our contribution is three-fold: first, we analyze the inherent connection between narrative, immersion, interactivity, fictionality, and AR using narrative theory; second, we comparatively survey actual works in AR narratives from the past 15 years based on these elements of the theory; lastly, we postulate a future for AR narratives through the perspective of the advancing technologies of both interactive narratives and AR.


international conference on computer graphics and interactive techniques | 2015

Z-drawing: a flying agent system for computer-assisted drawing

Sang-won Leigh; Harshit Agrawal; Patricia Maes

We present a drone-based drawing system where a user's sketch on a desk is transformed across scale and time and transferred onto a larger canvas at a distance in real time. Various spatio-temporal transformations, such as scaling, mirroring, time stretching, recording and playing back over time, and simultaneously drawing at multiple locations, allow for creating various artistic effects. The unrestricted motion of the drone promises scalability and huge potential as an artistic medium.

Collaboration


Dive into Patricia Maes's collaborations.

Top Co-Authors

Roy Shilkrot, Massachusetts Institute of Technology
Seth E. Hunter, Massachusetts Institute of Technology
Natan Linder, Massachusetts Institute of Technology
Amit Zoran, Hebrew University of Jerusalem
Cassandra T. Xia, Massachusetts Institute of Technology
David Merrill, Massachusetts Institute of Technology
Harshit Agrawal, Massachusetts Institute of Technology
Jeevan James Kalanithi, Massachusetts Institute of Technology
Joseph A. Paradiso, Massachusetts Institute of Technology