Matthew Aldrich
Massachusetts Institute of Technology
Publications
Featured research published by Matthew Aldrich.
Wearable and Implantable Body Sensor Networks | 2013
Brian Mayton; Nan Zhao; Matthew Aldrich; Nicholas Gillian; Joseph A. Paradiso
WristQue combines environmental and inertial sensing with precise indoor localization in a wristband wearable device that serves as the user's personal control interface to networked infrastructure. WristQue enables users to take control of devices around them by pointing to select and gesturing to control. At the same time, it uniquely identifies and locates users to deliver personalized automatic control of the user's environment. In this paper, the hardware and software components of the WristQue system are introduced, and a number of applications for lighting and HVAC control are presented, using pointing and gesturing as a new human interface to these networked systems.
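The point-to-select interaction described above can be sketched as a simple geometric test: given the wristband's localized position, a pointing direction from inertial sensing, and known device positions, select the device whose bearing lies closest to the pointing ray. This is a minimal illustration, not the paper's actual selection algorithm; all names and the angular threshold are assumptions.

```python
import math

def select_device(user_pos, pointing_dir, devices, max_angle_deg=15.0):
    """Return the name of the device closest to the pointing ray, or None.

    user_pos: (x, y, z) wrist position from indoor localization
    pointing_dir: unit vector for the pointing direction (inertial sensing)
    devices: dict mapping device name -> (x, y, z) position
    max_angle_deg: hypothetical acceptance cone half-angle
    """
    best, best_angle = None, max_angle_deg
    for name, pos in devices.items():
        # vector from the user to the candidate device
        v = [p - u for p, u in zip(pos, user_pos)]
        norm = math.sqrt(sum(c * c for c in v))
        if norm == 0.0:
            continue
        cos_a = sum(a * b for a, b in zip(pointing_dir, v)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

A gesture recognizer would then interpret subsequent wrist motion as a command for the selected device.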
Proceedings of SPIE | 2011
ByungKun Lee; Matthew Aldrich; Joseph A. Paradiso
The inherent control flexibility implied by solid-state lighting - united with the rich details offered by sensor networks - prompts us to rethink lighting control. In this research, we propose several techniques for measuring work surface illuminance and ambient light using a sensor network. The primary goal of this research is to measure work surface illuminance without distraction to the user. We discuss these techniques, including the lessons learned from our prior research. We present a new method for measuring the illuminance contribution of an arbitrary luminaire at the work surface by decomposing the modulated light into its fundamental and harmonic components.
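The decomposition idea in the abstract can be illustrated with a toy model: if each luminaire is modulated at a distinct tag frequency, a DFT of the sensed light isolates each luminaire's fundamental, from which its work-surface contribution can be recovered even in the presence of unmodulated daylight. The modulation scheme, frequencies, and amplitudes below are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

fs = 10_000                      # assumed sensor sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)  # one-second measurement window

# Two luminaires square-wave modulated (50% duty) at distinct tag
# frequencies, plus unmodulated ambient daylight at the work surface.
f1, f2 = 120.0, 180.0
a1, a2, ambient = 300.0, 150.0, 80.0  # lux contributions (hypothetical)
square = lambda f: np.sign(np.sin(2 * np.pi * f * t)) * 0.5 + 0.5
signal = a1 * square(f1) + a2 * square(f2) + ambient

spectrum = np.fft.rfft(signal) / len(signal)
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)

def contribution(f_mod):
    """Recover one luminaire's lux contribution from its fundamental.

    A 50%-duty square wave of amplitude A has a fundamental component of
    amplitude 2A/pi, so A = fundamental * pi / 2.
    """
    k = np.argmin(np.abs(freqs - f_mod))
    fundamental = 2.0 * np.abs(spectrum[k])  # sinusoid amplitude at bin k
    return fundamental * np.pi / 2.0
```

Because daylight contributes only at DC, each `contribution(f)` isolates the corresponding luminaire regardless of ambient conditions.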
SPIE Newsroom | 2011
Joseph A. Paradiso; Matthew Aldrich; Nan Zhao
Lighting control is in the midst of radical change. Present-day state-of-the-art lighting systems tend to be extremely complex. They exhibit very flexible actuation possibilities, with many degrees of freedom that can be exploited to answer dynamic lighting needs. However, user interfaces to these systems are woefully lacking. They are frequently relegated to a panel of buttons that select particular presets that are often cryptically defined, poorly labeled, and seldom desirable. Higher-end lighting systems can provide affordances such as a touch screen to select particular, graphically illustrated presets, but the straitjacketed assumptions that are made often lead to frustration.

Most attempts to integrate sensor feedback into commercial systems exploit simple motion sensors that tend to activate all lighting in a space when one occupant moves. This is generally energy wasteful. The converse, turning off all lights when the occupant does not move for a while, may also be a wrong choice that can irritate the user. Today's crude motion sensors represent a coarse attempt at leveraging limited information and simple context to reduce energy consumption. Balancing precise control and energy efficiency remains a goal in modern lighting systems [1-10], since lighting accounts for 22% of all electricity consumed in the United States [11].

The inherent control flexibility implied by solid-state lighting, combined with a rich description of a user's environment provided by emerging sensor networks, offers a chance to rethink our present modes of lighting control. It also requires us to consider the importance of color and efficacy [12-18]. Building upon the inherently digital nature of this technology, we discuss our vision of lighting control and suggest several highly responsive schemes that are adept at meeting users' needs while mitigating energy usage. Our research aims at minimizing the energy spent on lighting while simultaneously maximizing the light source's usefulness.
Figure 1. The lighting network consists of LED light sources, optional incandescent and fluorescent sources, and ambient room conditions (daylight), measured by a single sensor node or a group of sensor nodes. These nodes return intensity and color information to the control node (e.g., a computer) for processing. Here, intensity is controlled via a bidirectional link between the artificial-light sources and either the sensor node or the LEDs.
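The feedback loop in Figure 1 can be sketched as a simple integral controller: the sensor node reports illuminance, and the control node nudges the LED dimming level toward a setpoint, so artificial light automatically backs off as daylight rises. The gain, plant model, and lux values below are illustrative assumptions, not figures from the article.

```python
def step_controller(duty, measured_lux, target_lux, gain=0.001):
    """One iteration of an integral dimming loop: adjust the LED duty
    cycle toward the illuminance setpoint reported by the sensor node."""
    error = target_lux - measured_lux
    duty += gain * error
    return min(1.0, max(0.0, duty))  # clamp to the valid dimming range

# Simulate convergence: sensed lux is daylight plus a contribution
# proportional to the LED duty cycle (hypothetical plant model).
daylight, led_max = 120.0, 600.0
duty = 0.0
for _ in range(200):
    measured = daylight + led_max * duty
    duty = step_controller(duty, measured, target_lux=400.0)
```

With this loop, the LEDs supply only the lux that daylight does not, which is the energy-saving behavior the article argues for.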
Proceedings of the 2nd ACM International Conference on Embedded Systems for Energy-Efficient Built Environments | 2015
Nan Zhao; Matthew Aldrich; Christoph Reinhart; Joseph A. Paradiso
An increasing number of internet-connected LED lighting fixtures and bulbs have recently become available. This development, in combination with emerging hardware and software solutions for activity recognition, establishes an infrastructure for context-aware lighting. Automated lighting control could potentially provide a better user experience, increased comfort, higher productivity, and energy savings compared to static uniform illumination. The first question that comes to mind when thinking about context-aware lighting is how to determine the relevant activities and contexts. Do we need different lighting for reading a magazine and reading a book, or perhaps just different lighting for reading versus talking on the phone? How do we identify the relevant situations, and what are the preferred lighting settings? In this paper we present three steps we took to answer these questions and demonstrate them via an adaptive five-channel solid-state lighting system with continuous contextual control. We implemented a multidimensional user interface for manual control as well as an autonomous solution using wearable sensors. We enable a simple set of sensors to manipulate complicated lighting scenarios by indirectly simplifying and reducing the complexity of the sensor-lighting control space using human-derived criteria. In a preliminary user study, we estimated significant energy savings of up to 52% and identified multiple future research directions, including behavioral feedback.
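One way to picture the reduction of the sensor-lighting control space is to map each recognized activity to a point on a few human-derived axes (say, overall intensity and color temperature) and blend those points by classifier confidence. The activities, preset values, and blending rule below are hypothetical illustrations, not the presets or method from the paper's user study.

```python
# Hypothetical activity presets on two human-derived control axes.
PRESETS = {
    "reading":  {"intensity": 0.9, "cct_kelvin": 4500.0},
    "phone":    {"intensity": 0.4, "cct_kelvin": 3000.0},
    "relaxing": {"intensity": 0.2, "cct_kelvin": 2700.0},
}

def blend_setting(confidences):
    """Blend presets by activity-classifier confidence, collapsing a
    five-channel fixture's control space to two interpretable axes.

    confidences: dict mapping activity name -> nonnegative weight
    """
    total = sum(confidences.values())
    out = {"intensity": 0.0, "cct_kelvin": 0.0}
    for activity, w in confidences.items():
        for axis in out:
            out[axis] += (w / total) * PRESETS[activity][axis]
    return out
```

A downstream stage would then translate the two-axis setting into per-channel drive levels for the fixture.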
Proceedings of SPIE | 2010
Matthew Aldrich; Nan Zhao; Joseph A. Paradiso
Archive | 2010
Matthew Aldrich; Mark Feldmeier; Joseph A. Paradiso
IEEE Sensors | 2013
Matthew Aldrich; Akash Badshah; Brian Mayton; Nan Zhao; Joseph A. Paradiso
Archive | 2010
Matthew Aldrich
Archive | 2014
Matthew Aldrich
Archive | 2015
Joseph A. Paradiso; Matthew Aldrich; Nan Zhao