Publication


Featured research published by Seyed Abbas Sadat.


International Conference on Robotics and Automation | 2014

Feature-rich path planning for robust navigation of MAVs with Mono-SLAM

Seyed Abbas Sadat; Kyle Chutskoff; Damir Jungic; Jens Wawerla; Richard T. Vaughan

We present a path planning method for MAVs with vision-only MonoSLAM that generates safe paths to a goal according to the information richness of the environment. The planner runs on top of monocular SLAM and uses the available information about the structure of the environment and the visibility of features to find trajectories that maintain visual contact with feature-rich areas. The MAV continuously re-plans as it explores and updates the feature points in the map. In real-world experiments we show that, by considering the distribution of visual features, our system is able to avoid paths that lead into visually poor sections of the environment. If the same system ignores the availability of visually informative regions during planning, it is unable to estimate its state accurately and fails to reach its goal.
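The abstract describes a planner that scores trajectories not only by length but by how many mapped features stay in the camera's view. A minimal sketch of such a feature-aware cost function is shown below; the frustum model, function names, and weights are illustrative assumptions, not the paper's implementation:

```python
import math

def visible_features(pose, features, fov_deg=90.0, max_range=5.0):
    """Count mapped feature points inside a simple 2D camera frustum at `pose`.

    `pose` is (x, y, heading_rad); `features` is a list of (x, y) points.
    """
    x, y, heading = pose
    half_fov = math.radians(fov_deg) / 2.0
    count = 0
    for fx, fy in features:
        dx, dy = fx - x, fy - y
        if math.hypot(dx, dy) > max_range:
            continue
        bearing = math.atan2(dy, dx) - heading
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
        if abs(bearing) <= half_fov:
            count += 1
    return count

def path_cost(path, features, length_weight=1.0, info_weight=2.0, min_features=5):
    """Path length penalized at every pose that sees too few features."""
    cost = 0.0
    for i, pose in enumerate(path):
        if i > 0:
            cost += length_weight * math.hypot(pose[0] - path[i - 1][0],
                                               pose[1] - path[i - 1][1])
        seen = visible_features(pose, features)
        cost += info_weight * max(0, min_features - seen)  # visually poor pose penalty
    return cost
```

With this kind of cost, a search-based planner naturally prefers a slightly longer route through feature-rich space over a short route through a featureless region, which matches the behavior the abstract reports.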


Intelligent Robots and Systems | 2015

UAV, do you see me? Establishing mutual attention between an uninstrumented human and an outdoor UAV in flight

Mani Monajjemi; Jake Bruce; Seyed Abbas Sadat; Jens Wawerla; Richard T. Vaughan

We present the first demonstration of establishing mutual attention between an outdoor UAV in autonomous normal flight and an uninstrumented human user. We use the familiar periodic waving gesture as a signal to attract the UAV's attention. The UAV can discriminate this gesture from human walking and running, which appear similarly periodic. Once a signaling person is observed and tracked, the UAV acknowledges that the user has its attention by hovering and performing a "wobble" behavior. Both parties are then ready for further interaction. The system runs on board the UAV using a single camera for input and is demonstrated working reliably in real-robot trials.
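A common generic approach to detecting a periodic waving gesture is to inspect the frequency content of a tracked point's displacement over time. The sketch below illustrates that generic idea only; the band limits, thresholds, and function names are assumptions, not the detector used in the paper:

```python
import numpy as np

def dominant_frequency(signal, fps):
    """Return (frequency_hz, relative_power) of the strongest non-DC component."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                      # remove DC offset
    spectrum = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    k = int(np.argmax(spectrum[1:]) + 1)        # skip the DC bin
    total = spectrum.sum()
    rel_power = spectrum[k] / total if total > 0 else 0.0
    return freqs[k], rel_power

def looks_like_waving(signal, fps, band=(1.0, 4.0), min_power=0.3):
    """Heuristic: waving shows a strong periodic component around 1-4 Hz."""
    f, p = dominant_frequency(signal, fps)
    return band[0] <= f <= band[1] and p >= min_power
```

A slow drift (such as a person walking across the frame) concentrates power at very low frequencies and fails the band test, which hints at how a periodic wave can be separated from other periodic-looking motion.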


Human-Robot Interaction | 2014

You are green: a touch-to-name interaction in an integrated multi-modal multi-robot HRI system

Shokoofeh Pourmehr; Valiallah Monajjemi; Seyed Abbas Sadat; Fei Zhan; Jens Wawerla; Greg Mori; Richard T. Vaughan

We present a multi-modal multi-robot interaction whereby a user can identify an individual robot or a group of robots using haptic stimuli, and name them using a voice command (e.g. "You two are green"). Subsequent commands can be addressed to the same robot(s) by name (e.g. "Green! Take off!"). We demonstrate this as part of a real-world integrated system in which a user commands teams of autonomous robots in a coordinated exploration task.


Canadian Conference on Computer and Robot Vision | 2012

BRaVO: Biased Reciprocal Velocity Obstacles Break Symmetry in Dense Robot Populations

Seyed Abbas Sadat; Richard T. Vaughan

We present an extension to the Reciprocal Velocity Obstacles (RVO) approach to multi-robot collision avoidance that alleviates congestion caused by symmetric situations in dense conditions. We show that in a resource transportation task, RVO robots are unable to make progress because crowds of robots with opposing navigation goals form at the source and the sink. We introduce Biased Reciprocal Velocity Obstacles (BRVO), which breaks the symmetry among robots by giving priority to robots leaving a task-related place of interest. We compare BRVO to RVO in two experiments and show that BRVO resolves congestion much more quickly.
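The symmetry-breaking idea can be pictured as an asymmetric split of collision-avoidance responsibility between a pair of robots: plain RVO splits the effort 50/50, while a bias makes the lower-priority robot yield more. The share values and blending function below are illustrative assumptions, not the paper's formulation:

```python
def avoidance_shares(priority_a, priority_b):
    """Split collision-avoidance responsibility between two robots.

    Plain RVO uses (0.5, 0.5) for every pair; biasing the split toward the
    lower-priority robot breaks the symmetric deadlocks that cause congestion.
    """
    if priority_a == priority_b:
        return 0.5, 0.5
    if priority_a > priority_b:
        return 0.2, 0.8          # illustrative bias: the yielding robot does more
    return 0.8, 0.2

def adjusted_velocity(pref_vel, avoid_vel, share):
    """Blend a robot's preferred velocity toward its avoiding velocity by `share`."""
    return tuple(p + share * (a - p) for p, a in zip(pref_vel, avoid_vel))
```

In the transportation scenario from the abstract, robots leaving the source or sink would get the higher priority, so approaching robots absorb most of the deviation and the crowd clears instead of mirroring each other's avoidance maneuvers.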


Human-Robot Interaction | 2014

Integrating multi-modal interfaces to command UAVs

Valiallah Monajjemi; Shokoofeh Pourmehr; Seyed Abbas Sadat; Fei Zhan; Jens Wawerla; Greg Mori; Richard T. Vaughan

We present an integrated human-robot interaction system that enables a user to select and command a team of two Unmanned Aerial Vehicles (UAVs) using voice, touch, face engagement, and hand gestures. The system integrates multiple human multi-robot interaction interfaces as well as a navigation and mapping algorithm in a coherent semi-realistic scenario. The task of the UAVs is to explore and map a simulated Mars environment.


International Conference on Robotics and Automation | 2015

Fractal trajectories for online non-uniform aerial coverage

Seyed Abbas Sadat; Jens Wawerla; Richard T. Vaughan


Intelligent Robots and Systems | 2014

Recursive non-uniform coverage of unknown terrains for UAVs

Seyed Abbas Sadat; Jens Wawerla; Richard T. Vaughan


Artificial Life | 2010

SO-LOST: An Ant-Trail Algorithm for Multi-Robot Navigation with Active Interference Reduction

Seyed Abbas Sadat; Richard T. Vaughan


International Conference on Robotics and Automation | 2010

Blinkered LOST: Restricting sensor field of view can improve scalability in emergent multi-robot trail following

Seyed Abbas Sadat; Richard T. Vaughan


Adaptive Agents and Multi-Agent Systems | 2012

MO-LOST: adaptive ant trail untangling in multi-objective multi-colony robot foraging

Zhao Song; Seyed Abbas Sadat; Richard T. Vaughan

Collaboration


Dive into Seyed Abbas Sadat's collaborations.

Top Co-Authors

Jens Wawerla (Simon Fraser University)
Fei Zhan (Simon Fraser University)
Greg Mori (Simon Fraser University)
Zhao Song (Simon Fraser University)
Damir Jungic (Simon Fraser University)
Jake Bruce (Simon Fraser University)