Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Brian Mayton is active.

Publication


Featured research published by Brian Mayton.


International Conference on Robotics and Automation | 2010

An Electric Field Pretouch system for grasping and co-manipulation

Brian Mayton; Louis LeGrand; Joshua R. Smith

Pretouch sensing is longer range than contact, but shorter range than vision. The hypothesis motivating this work is that closed loop feedback based on short range but non-contact measurements can improve the reliability of manipulation. This paper presents a grasping system that is guided at short range by Electric Field (EF) Pretouch. We describe two sets of experiments. The first set of experiments involves human-to-robot and robot-to-human handoff, including the use of EF Pretouch to detect whether or not a human is also touching an object that the robot is holding, which we call the “co-manipulation state.” In the second set of experiments, the robot picks up standalone objects. We describe a number of techniques that servo the arm and fingers in order to both collect relevant geometrical information, and to actually perform the manipulation task.
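As a rough illustration of the closed-loop pretouch servoing described above, the Python sketch below closes each finger until its electric-field reading crosses a threshold. read_ef_signal, close_finger, and the threshold are hypothetical placeholders, not the paper's interfaces or values.

```python
# Illustrative closed-loop pretouch servo: close each finger until its
# electric-field (EF) measurement indicates the object is within pretouch
# range, then stop short of contact. Sensor/actuator calls are placeholders.

import random

def read_ef_signal(finger_id):
    """Placeholder for an EF pretouch measurement on one fingertip.
    Returns a unitless signal that grows as the fingertip nears the object."""
    return random.uniform(0.0, 1.0)

def close_finger(finger_id, step_rad=0.01):
    """Placeholder for commanding a small closing increment on one finger."""
    print(f"finger {finger_id}: close by {step_rad} rad")

def pretouch_grasp(finger_ids, ef_threshold=0.8, max_steps=200):
    """Servo each finger until its EF signal crosses ef_threshold."""
    done = set()
    for _ in range(max_steps):
        for fid in finger_ids:
            if fid in done:
                continue
            if read_ef_signal(fid) >= ef_threshold:
                done.add(fid)      # fingertip is in pretouch range; hold
            else:
                close_finger(fid)  # keep closing toward the object
        if done == set(finger_ids):
            return True
    return False

if __name__ == "__main__":
    pretouch_grasp(finger_ids=[0, 1, 2])
```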


International Conference on Robotics and Automation | 2011

Gambit: An autonomous chess-playing robotic system

Cynthia Matuszek; Brian Mayton; Roberto Aimi; Marc Peter Deisenroth; Liefeng Bo; Robert Chu; Mike Kung; Louis LeGrand; Joshua R. Smith; Dieter Fox

This paper presents Gambit, a custom, mid-cost 6-DoF robot manipulator system that can play physical board games against human opponents in non-idealized environments. Historically, unconstrained robotic manipulation in board games has often proven to be more challenging than the underlying game reasoning, making it an ideal testbed for small-scale manipulation. The Gambit system includes a low-cost Kinect-style visual sensor, a custom manipulator, and state-of-the-art learning algorithms for automatic detection and recognition of the board and objects on it. As a use-case, we describe playing chess quickly and accurately with arbitrary, uninstrumented boards and pieces, demonstrating that Gambit's engineering and design represent a new state-of-the-art in fast, robust tabletop manipulation.
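The sketch below outlines a minimal sense-reason-act turn of the kind such a system performs; detect_board_state, choose_move, and execute_move are hypothetical stand-ins for the paper's perception, game-reasoning, and manipulation stages, not Gambit's actual interfaces.

```python
# Illustrative sense-reason-act loop for a chess-playing manipulator.
# All three stages are stubs standing in for perception (Kinect-style
# sensing and recognition), game reasoning, and arm control.

def detect_board_state():
    """Placeholder: return the board state recovered from vision."""
    return "startpos"

def choose_move(board_state):
    """Placeholder: return a move such as 'e2e4' from a chess engine."""
    return "e2e4"

def execute_move(move):
    """Placeholder: pick up the piece at move[:2] and place it at move[2:]."""
    print(f"pick {move[:2]}, place {move[2:]}")

def play_one_turn():
    state = detect_board_state()   # vision: locate board and pieces
    move = choose_move(state)      # reasoning: select a legal move
    execute_move(move)             # manipulation: move the physical piece

if __name__ == "__main__":
    play_one_turn()
```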


IEEE Sensors | 2011

DoppelLab: Tools for exploring and harnessing multimodal sensor network data

Gershon Dublon; Laurel S. Pardue; Brian Mayton; Noah Swartz; Nicholas Joliat; Patrick Hurst; Joseph A. Paradiso

We present DoppelLab, an immersive sensor data browser built on a 3-d game engine. DoppelLab unifies independent sensor networks and data sources within the spatial framework of a building. Animated visualizations and sonifications serve as representations of real-time data within the virtual space.
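A minimal sketch, under assumed data formats, of how a real-time reading might be mapped to a visualization parameter at a sensor's location in the building model; the sensor names, positions, and color mapping are illustrative only, not DoppelLab's implementation.

```python
# Illustrative mapping from a real-time sensor reading to a visualization
# parameter at the sensor's location in a building model, in the spirit of
# DoppelLab's animated visualizations. Values and ranges are made up.

def temperature_to_color(temp_c, t_min=15.0, t_max=30.0):
    """Map a temperature in Celsius to an RGB color (blue=cold, red=hot)."""
    t = min(max((temp_c - t_min) / (t_max - t_min), 0.0), 1.0)
    return (int(255 * t), 0, int(255 * (1.0 - t)))

# Each sensor carries its position in the building's coordinate frame so the
# renderer can draw its reading in place.
sensors = [
    {"id": "thermo-3F-12", "pos": (12.4, 3.0, 31.7), "temp_c": 27.5},
    {"id": "thermo-1F-02", "pos": (4.1, 0.0, 8.9), "temp_c": 18.2},
]

for s in sensors:
    r, g, b = temperature_to_color(s["temp_c"])
    print(f"{s['id']} at {s['pos']}: draw marker with color ({r},{g},{b})")
```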


International Conference on Robotics and Automation | 2010

Robot, feed thyself: Plugging in to unmodified electrical outlets by sensing emitted AC electric fields

Brian Mayton; Louis LeGrand; Joshua R. Smith

We describe a robot that is able to autonomously plug itself in to standard, unmodified electrical outlets by sensing the 60 Hz electric fields emitted from the outlet. The building electrical infrastructure is not modified in any way. Unlike previous powerline localization work, no additional signal is injected into the power lines; the AC power carrier signal already present in the outlet is used as the localization beacon. This technique is faster, more accurate, and potentially less expensive than previously reported vision-based systems for autonomous plugging in.
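A small, self-contained sketch of the kind of measurement such a system could servo on: estimating the 60 Hz amplitude of a sampled electric-field signal with a single-bin DFT. The sample rate and synthetic signal are assumptions for the example, not values from the paper.

```python
# Illustrative 60 Hz amplitude estimation from a sampled electric-field
# signal, the kind of measurement a robot could servo on to localize an
# outlet. Signal generation and sample rate are made up for the example.

import math

def amplitude_at(samples, fs, f0=60.0):
    """Single-bin DFT (Goertzel-style) amplitude estimate at frequency f0."""
    n = len(samples)
    re = sum(x * math.cos(2 * math.pi * f0 * i / fs) for i, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * f0 * i / fs) for i, x in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

if __name__ == "__main__":
    fs = 4000.0  # samples per second (illustrative)
    # Synthetic field measurement: a 60 Hz carrier plus a weaker harmonic.
    samples = [0.7 * math.sin(2 * math.pi * 60.0 * i / fs)
               + 0.05 * math.sin(2 * math.pi * 300.0 * i / fs)
               for i in range(400)]
    print(f"estimated 60 Hz amplitude: {amplitude_at(samples, fs):.3f}")
    # A plugging-in controller would move the plug in the direction that
    # increases this amplitude (and balances it across multiple electrodes).
```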


IEEE Sensors | 2012

TRUSS: Tracking Risk with Ubiquitous Smart Sensing

Brian Mayton; Gershon Dublon; Sebastian Palacios; Joseph A. Paradiso

We present TRUSS, or Tracking Risk with Ubiquitous Smart Sensing, a novel system that infers and renders safety context on construction sites by fusing data from wearable devices, distributed sensing infrastructure, and video. Wearables stream real-time levels of dangerous gases, dust, noise, light quality, altitude, and motion to base stations that synchronize the mobile devices, monitor the environment, and capture video. At the same time, low-power video collection and processing nodes track the workers as they move through the view of the cameras, identifying the tracks using information from the sensors. These processes together connect the context-mining wearable sensors to the video; information derived from the sensor data is used to highlight salient elements in the video stream. The augmented stream in turn provides users with better understanding of real-time risks, and supports informed decision-making. We tested our system in an initial deployment on an active construction site.
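One way to "identify the tracks using information from the sensors," shown as a hedged sketch: correlate each anonymous video track's motion energy with each wearable's accelerometer activity and keep the best match. The data and the correlation-based matching are illustrative assumptions, not the deployed pipeline.

```python
# Illustrative association of anonymous video tracks with identified wearable
# sensors by correlating motion energy over a shared timebase. Data are synthetic.

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

# Per-frame motion energy of two anonymous tracks from the video pipeline.
tracks = {
    "track_1": [0.1, 0.9, 0.8, 0.2, 0.1, 0.7],
    "track_2": [0.5, 0.4, 0.5, 0.6, 0.5, 0.4],
}
# Accelerometer activity streamed from two workers' wearables, same timebase.
wearables = {
    "worker_A": [0.2, 1.0, 0.9, 0.1, 0.2, 0.8],
    "worker_B": [0.6, 0.5, 0.4, 0.6, 0.5, 0.5],
}

for tid, tsig in tracks.items():
    best = max(wearables, key=lambda w: pearson(tsig, wearables[w]))
    print(f"{tid} -> {best}")
```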


Wearable and Implantable Body Sensor Networks | 2013

WristQue: A personal sensor wristband

Brian Mayton; Nan Zhao; Matthew Aldrich; Nicholas Gillian; Joseph A. Paradiso

WristQue combines environmental and inertial sensing with precise indoor localization into a wristband wearable device that serves as the user's personal control interface to networked infrastructure. WristQue enables users to take control of devices around them by pointing to select and gesturing to control. At the same time, it uniquely identifies and locates users to deliver personalized automatic control of the user's environment. In this paper, the hardware and software components of the WristQue system are introduced, and a number of applications for lighting and HVAC control are presented, using pointing and gesturing as a new human interface to these networked systems.
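A minimal sketch of point-to-select under assumed geometry: given the wristband's indoor position and a pointing direction from its inertial sensors, pick the networked device closest to the pointing ray. Device names, positions, and the angular tolerance are invented for the example.

```python
# Illustrative "point to select": choose the device whose direction from the
# wristband is closest (in angle) to the direction the user is pointing.

import math

def angle_to(origin, direction, target):
    """Angle in radians between the pointing direction and the ray to target."""
    v = [t - o for t, o in zip(target, origin)]
    dot = sum(d * w for d, w in zip(direction, v))
    nd = math.sqrt(sum(d * d for d in direction))
    nv = math.sqrt(sum(w * w for w in v))
    return math.acos(max(-1.0, min(1.0, dot / (nd * nv))))

devices = {
    "lamp_desk":    (2.0, 1.5, 0.8),
    "lamp_ceiling": (2.0, 2.6, 3.0),
    "hvac_vent":    (5.0, 2.8, 1.0),
}

def select_device(wrist_pos, pointing_dir, max_angle_rad=0.3):
    """Return the device the user is pointing at, or None if nothing is close."""
    best = min(devices, key=lambda d: angle_to(wrist_pos, pointing_dir, devices[d]))
    return best if angle_to(wrist_pos, pointing_dir, devices[best]) <= max_angle_rad else None

if __name__ == "__main__":
    print(select_device(wrist_pos=(1.0, 1.2, 1.0), pointing_dir=(1.0, 0.3, -0.2)))
```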


Biomedical Engineering Systems and Technologies | 2016

Thumbs-Up

Asaph Azaria; Brian Mayton; Joseph A. Paradiso


IEEE Sensors | 2013

Random walk and lighting control

Matthew Aldrich; Akash Badshah; Brian Mayton; Nan Zhao; Joseph A. Paradiso


Technical Symposium on Computer Science Education | 2008

Multi-player soccer and wireless embedded systems

Gaetano Borriello; Carl Hartung; Bruce Hemingway; Karl Koscher; Brian Mayton


Presence: Teleoperators and Virtual Environments | 2017

The Networked Sensory Landscape: Capturing and Experiencing Ecological Change Across Scales

Brian Mayton; Gershon Dublon; Spencer Russell; Evan F. Lynch; Donald Derek Haddad; Vasant Ramasubramanian; Clement Duhart; Glorianna Davenport; Joseph A. Paradiso

Collaboration


Dive into Brian Mayton's collaborations.

Top Co-Authors

Joseph A. Paradiso (Massachusetts Institute of Technology)
Gershon Dublon (Massachusetts Institute of Technology)
Asaph Azaria (Massachusetts Institute of Technology)
Donald Derek Haddad (Massachusetts Institute of Technology)
Matthew Aldrich (Massachusetts Institute of Technology)
Nan Zhao (Massachusetts Institute of Technology)
Nicholas Joliat (Massachusetts Institute of Technology)
Spencer Russell (Massachusetts Institute of Technology)