Ilya Afanasyev
Robotics Institute
Publications
Featured research published by Ilya Afanasyev.
International Conference on Informatics in Control, Automation and Robotics | 2015
Ramil Khusainov; Ilya Shimchik; Ilya Afanasyev; Evgeni Magid
In the near future anthropomorphic robots will become an important part of our everyday routine. To successfully perform various tasks, these robots require stable walking control algorithms that guarantee dynamic balance of biped locomotion. Our research focuses on the development of locomotion algorithms that provide effective anthropomorphic walking. As a target robotic platform we use an experimental model of a human-size robot, the novel Russian robot AR-601M. In this paper we introduce the AR-601M robot and present an 11-DoF biped model that simulates a simplified AR-601M. The simulation model is implemented in the Matlab/Simulink environment and uses walking primitives to provide dynamically stable locomotion.
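To make the notion of a walking primitive concrete, here is a minimal Python sketch of a parametrized swing-foot trajectory; the paper's model lives in Matlab/Simulink, and the primitive shape, parameter names and numeric values below are illustrative assumptions only.

import numpy as np

def swing_foot_primitive(t, step_time, step_length, step_height):
    """Illustrative walking primitive: a smooth swing-foot trajectory over
    one step, parametrized by step length, clearance height and duration."""
    s = np.clip(t / step_time, 0.0, 1.0)                          # normalized phase in [0, 1]
    x = step_length * (s - np.sin(2 * np.pi * s) / (2 * np.pi))   # cycloidal forward motion
    z = step_height * 0.5 * (1 - np.cos(2 * np.pi * s))           # lift and lower the foot
    return x, z

# Sample the primitive over a 1 s step of 0.1 m length and 0.04 m clearance
ts = np.linspace(0.0, 1.0, 11)
trajectory = [swing_foot_primitive(t, step_time=1.0, step_length=0.1, step_height=0.04) for t in ts]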
Advanced Concepts for Intelligent Vision Systems | 2015
Ilya Afanasyev; Artur Sagitov; Evgeni Magid
Modern robot simulators offer robust physics engines, high-quality graphics, and convenient interfaces, allowing researchers to substitute physical systems with simulation models in order to pre-estimate the performance of theoretical findings before applying them to real robots. This paper describes a Gazebo simulation approach to simultaneous localization and mapping (SLAM) based on the Robot Operating System (ROS) using the PR2 robot. The ROS-based SLAM approach applies Rao-Blackwellized particle filters to laser data to localize the PR2 robot in an unknown environment and build a map. A 3D model of a real room was obtained from camera shots and reconstructed with Autodesk 123D Catch and MeshLab software. The results demonstrate the fidelity of the simulated 3D room to the ROS-calculated map obtained from the robot's laser system, and the feasibility of applying ROS-based SLAM with a Gazebo-simulated mobile robot in a camera-reconstructed 3D environment. This approach will be further extended to ROS-based Gazebo simulations of the Russian anthropomorphic robot AR-601M.
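The SLAM approach above rests on a Rao-Blackwellized particle filter, in which each particle carries both a pose hypothesis and its own occupancy map. The following Python sketch shows that update loop in outline; the Particle class and the motion_model, scan_likelihood and update_map helpers are illustrative placeholders, not the actual ROS gmapping API.

import copy
import random

class Particle:
    """One RBPF hypothesis: a robot pose plus its own occupancy map."""
    def __init__(self, pose):
        self.pose = pose      # (x, y, theta)
        self.grid = {}        # per-particle occupancy map (sparse dict here)
        self.weight = 1.0

def rbpf_update(particles, odometry, scan, motion_model, scan_likelihood, update_map):
    """One SLAM step: predict, weight by scan likelihood, map, resample."""
    for p in particles:
        p.pose = motion_model(p.pose, odometry)            # sample from the motion model
        p.weight *= scan_likelihood(scan, p.pose, p.grid)  # how well the scan fits p's map
        update_map(p.grid, p.pose, scan)                   # integrate the scan into p's map
    total = sum(p.weight for p in particles)
    for p in particles:
        p.weight /= total
    n_eff = 1.0 / sum(p.weight ** 2 for p in particles)    # effective sample size
    if n_eff < len(particles) / 2:                         # resample only when needed
        chosen = random.choices(particles, weights=[p.weight for p in particles], k=len(particles))
        particles = [copy.deepcopy(p) for p in chosen]
        for p in particles:
            p.weight = 1.0 / len(particles)
    return particles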
International Conference on Machine Vision | 2017
Alexander Buyval; Ilya Afanasyev; Evgeni Magid
This paper presents a comparison of four recent ROS-based monocular SLAM-related methods: ORB-SLAM, REMODE, LSD-SLAM, and DPPTAM, and analyzes their feasibility for a mobile robot application in an indoor environment. We tested these methods using video data recorded with a conventional wide-angle full HD webcam with a rolling shutter. The camera was mounted on a human-operated prototype of an unmanned ground vehicle, which followed a closed-loop trajectory. Both feature-based methods (ORB-SLAM, REMODE) and direct SLAM-related algorithms (LSD-SLAM, DPPTAM) demonstrated reasonably good results in detecting volumetric objects, corners, obstacles and other local features. However, we encountered difficulties with reconstructing the homogeneously colored walls typical for offices, since all of these methods left empty spaces in the reconstructed sparse 3D scene. This may cause collisions of an autonomously guided robot with featureless walls and thus limits the applicability of maps obtained by the considered monocular SLAM-related methods for indoor robot navigation.
Archive | 2016
Ramil Khusainov; Ilya Shimchik; Ilya Afanasyev; Evgeni Magid
In the past decades research on bipedal robots has gained significant attention as the technology progresses towards practical humanoid robot assistants. Serious challenges of human-like biped locomotion include obtaining the multi-functionality, energy efficiency and flexibility of human gait. In this paper we present the Russian biped robot AR-601M and its locomotion modelling in the Simulink environment using a walking primitives approach. We consider two robot models, with 6 and 12 Degrees of Freedom (DoF) in the legs, using the same walking strategies. While the 6-DoF model is constrained to move only in the sagittal plane, the 12-DoF model supports 3D motion and precisely reflects the hardware of the AR-601M robot legs. The locomotion algorithm utilizes position control and involves inverse kinematics computations for the joints. The resulting simulation of robot locomotion is dynamically stable for both models at a small step length and short step time with relatively long damping pauses between steps.
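The inverse kinematics step mentioned above can be illustrated with the textbook two-link case in the sagittal plane; real leg chains such as AR-601M's have more joints, and the link lengths and target below are made-up example values.

import math

def planar_leg_ik(x, z, l_thigh, l_shin):
    """Analytical 2-link inverse kinematics in the sagittal plane: given a
    foot position (x, z) relative to the hip, return hip and knee angles.
    A simplified illustration of the IK used by a position-controlled gait."""
    d2 = x * x + z * z
    if math.sqrt(d2) > l_thigh + l_shin:
        raise ValueError("target out of reach")
    # Knee angle from the law of cosines (0 = fully extended leg)
    cos_knee = (d2 - l_thigh ** 2 - l_shin ** 2) / (2 * l_thigh * l_shin)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to the foot minus the knee-bend offset
    hip = math.atan2(x, -z) - math.atan2(l_shin * math.sin(knee), l_thigh + l_shin * math.cos(knee))
    return hip, knee

# Example: foot 5 cm forward and 55 cm below the hip, 0.3 m thigh and shin
print(planar_leg_ik(0.05, -0.55, 0.3, 0.3))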
International Symposium on Mechatronics and its Applications | 2015
Bulat Gabbasov; Igor Danilov; Ilya Afanasyev; Evgeni Magid
This paper presents a biomechanical analysis of human locomotion recorded by a Motion Capture (MoCap) system based on four Kinect 2 sensors and iPi Soft markerless tracking and visualization technology. To analyze the multi-depth-sensor video recordings we use the iPi Mocap Studio software and the iPi Biomech Add-on plug-in, which provide visual and biomechanical human gait data: linear and angular joint coordinates, velocities, accelerations, center of mass (CoM) position, skeleton and 3D point cloud. The final analysis was performed in the MATLAB environment, calculating the zero moment point (ZMP) and ground projection of the CoM (GCoM) trajectories from human body dynamics by treating the human body as a single mass point, followed by GCoM and ZMP error estimation. The further objective of our research is to reproduce the human-like gait obtained with our MoCap system on the Russian biped robot AR-601M.
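The single-mass ZMP computation described above amounts to a short formula; a minimal Python sketch follows (the array layout and function names are assumptions, and the paper's MATLAB processing may differ in detail).

import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def zmp_single_mass(com, com_acc, z_ground=0.0):
    """ZMP of a single point mass from its CoM position and acceleration.
    com and com_acc are (N, 3) arrays of [x, y, z] samples; returns the
    (N, 2) ZMP coordinates on the ground plane."""
    x, y, z = com[:, 0], com[:, 1], com[:, 2]
    ax, ay, az = com_acc[:, 0], com_acc[:, 1], com_acc[:, 2]
    x_zmp = x - (z - z_ground) * ax / (az + G)
    y_zmp = y - (z - z_ground) * ay / (az + G)
    return np.stack([x_zmp, y_zmp], axis=1)

def gcom(com):
    """Ground projection of the CoM: drop the vertical coordinate."""
    return com[:, :2]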
Journal of Robotics, Networking and Artificial Life | 2016
Ramil Khusainov; Ilya Afanasyev; Leysan Sabirova; Evgeni Magid
Developing control algorithms for humanoid locomotion is difficult due to a humanoid's high number of Degrees of Freedom and system nonlinearity. Moreover, a humanoid may lose balance as its walking speed changes. We present AR-601M humanoid locomotion modelling with virtual height inverted pendulum and preview control approaches. The influence of step length and step period on walking stability was investigated and simulated within the Simulink environment. The maximal leg joint torques and corresponding trajectories achievable in simulation were verified against the values attainable by the robot's motors.
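As background for the preview control approach, here is a Python sketch of the standard discrete cart-table (linear inverted pendulum) model on which ZMP preview control is usually built; it assumes a constant CoM height z_c, whereas the virtual height variant used in the paper adapts this, so treat these matrices as the textbook baseline rather than the paper's exact model.

import numpy as np

def cart_table_model(dt, z_c, g=9.81):
    """Discrete cart-table model for one axis: state is [CoM position,
    velocity, acceleration], input is CoM jerk, output is the ZMP."""
    A = np.array([[1.0, dt, dt**2 / 2.0],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    B = np.array([[dt**3 / 6.0],
                  [dt**2 / 2.0],
                  [dt]])
    C = np.array([[1.0, 0.0, -z_c / g]])   # ZMP = x - (z_c / g) * x_ddot
    return A, B, C

# One simulation step of the model under a jerk command u
A, B, C = cart_table_model(dt=0.01, z_c=0.6)
x = np.zeros((3, 1))
u = np.array([[0.5]])
x_next = A @ x + B @ u
zmp = C @ x_next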
2017 International Conference on Mechanical, System and Control Engineering (ICMSC) | 2017
Evgeni Magid; Roman Lavrenov; Ilya Afanasyev
Optimal path planning in dynamic environments for an unmanned vehicle is a complex task of mobile robotics that requires an integrated approach. This paper describes a path planning algorithm that builds a preliminary motion trajectory using global information about the environment and then dynamically adjusts the path in real time by varying the objective function weights. We introduce a set of key parameters for path optimization and the algorithm's implementation in MATLAB. The developed algorithm is suitable for fast and robust trajectory tuning to a dynamically changing environment and is capable of providing efficient planning for mobile robots.
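To illustrate what varying objective function weights means in practice, here is a hedged Python sketch of a weighted path cost combining length and obstacle clearance; the specific terms, weights and penalty shape are assumptions for illustration, not the paper's actual objective.

import math

def path_cost(path, obstacles, w_length=1.0, w_clearance=1.0, safe_dist=1.0):
    """Illustrative weighted objective for a candidate path: total length
    plus a penalty for passing close to obstacles. Re-weighting the two
    terms online is the kind of tuning the planner performs."""
    length = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    penalty = 0.0
    for p in path:
        d_min = min(math.dist(p, obs) for obs in obstacles)
        if d_min < safe_dist:
            penalty += (safe_dist - d_min) ** 2   # quadratic penalty inside the safety margin
    return w_length * length + w_clearance * penalty

# The same path scored under two clearance weightings
path = [(0, 0), (1, 0.2), (2, 0.0)]
obstacles = [(1.0, 0.6)]
print(path_cost(path, obstacles, w_length=1.0, w_clearance=1.0))
print(path_cost(path, obstacles, w_length=1.0, w_clearance=5.0))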
International Conference on Computer Vision Theory and Applications | 2016
Igor Danilov; Bulat Gabbasov; Ilya Afanasyev; Evgeni Magid
This article presents methods of zero moment point (ZMP) trajectory evaluation for human locomotion by processing biomechanical data recorded with a Kinect-based motion capture (MoCap) system. Our MoCap system consists of four Kinect 2 sensors and uses commercial iPi Soft markerless tracking and visualization technology. We apply the iPi Mocap Studio software to multi-depth-sensor video recordings, acquiring visual and biomechanical human gait data, including linear and angular coordinates, velocities, accelerations and center of mass (CoM) position of each joint. Finally, we compute ZMP and ground projection of the CoM (GCoM) trajectories from human body dynamics in MATLAB by two methods, in which the human body is treated as (1) a single mass point or (2) multiple mass points (with subsequent ZMP calculation via the inertia tensor). The further objective of our research is to reproduce the human-like gait with the Russian biped robot AR-601M.
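For the multiple-mass-point case, the standard multi-body ZMP equations can be sketched in a few lines of Python; the array layout is an assumption, and the paper's MATLAB pipeline may organize the computation differently.

import numpy as np

G = 9.81

def zmp_multi_mass(pos, acc, mass, h_dot):
    """ZMP from a multi-segment body model. pos and acc are (N, 3) arrays of
    segment CoM positions and accelerations, mass is (N,), and h_dot is the
    (N, 3) rate of change of each segment's angular momentum about its own
    CoM (e.g. I @ alpha + cross(omega, I @ omega) from the inertia tensor)."""
    w = mass * (acc[:, 2] + G)                      # m_i * (z_ddot_i + g)
    denom = w.sum()
    x_zmp = (w @ pos[:, 0] - (mass * acc[:, 0]) @ pos[:, 2] - h_dot[:, 1].sum()) / denom
    y_zmp = (w @ pos[:, 1] - (mass * acc[:, 1]) @ pos[:, 2] + h_dot[:, 0].sum()) / denom
    return x_zmp, y_zmp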
International Conference on Informatics in Control, Automation and Robotics | 2017
Dmitry Popov; Alexandr Klimchik; Ilya Afanasyev
This paper presents a low-cost anthropomorphic robot, covering its design, simulation, manufacturing and experiments. The robot design was inspired by the open-source Poppy Humanoid project and enhanced to a 12-DoF lower-limb structure, providing additional capability to develop more natural, fast and stable biped walking. The current robot design has a non-spherical hip joint, which precludes an analytical solution for the inverse kinematics, so a hybrid solution was presented instead. The problem of robot joint compliance was addressed using the virtual joint method for stiffness modeling, with further compensation of elastic deflections caused by the weight of the robot links. Finally, we modeled the robot's lower part in the V-REP simulator, manufactured its prototype using 3D printing technology, and implemented ZMP preview control, with experiments demonstrating stable biped locomotion.
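The deflection-compensation idea can be illustrated with a short Python sketch in which each joint is a torsional virtual spring; the gravity-torque model, stiffness values and compensation-by-offset scheme below are assumptions for illustration, not the paper's identified stiffness model.

import numpy as np

def compensated_command(q_des, gravity_torque, joint_stiffness):
    """Elastic-deflection compensation in the spirit of the virtual joint
    method: under a load torque tau a joint deflects by tau / K, so we
    command the desired angles minus the predicted deflection to cancel
    the sag caused by link weight (to first order)."""
    tau = gravity_torque(q_des)                 # load torques at the desired posture
    deflection = tau / joint_stiffness          # per-joint elastic deflection, K^-1 * tau
    return q_des - deflection                   # offset so the loaded pose reaches q_des

# Toy example: two joints with 200 and 150 N*m/rad virtual springs
q_des = np.array([0.3, -0.6])
k = np.array([200.0, 150.0])
grav = lambda q: np.array([12.0 * np.cos(q[0]), 4.0 * np.cos(q[0] + q[1])])  # illustrative model
print(compensated_command(q_des, grav, k))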
International Conference on Informatics in Control, Automation and Robotics | 2017
Maxim Sokolov; Oleg Bulichev; Ilya Afanasyev
This article presents a comparative analysis of ROS-based monocular visual odometry, lidar odometry and ground-truth-related path estimation for a crawler-type robot in an indoor environment. We tested these methods with the crawler robot "Engineer", which was teleoperated in a small office-style indoor workspace. Since the robot's onboard computer cannot run the ROS packages for lidar odometry and visual SLAM simultaneously, we computed lidar odometry online, while video data from the onboard camera was processed offline by the ORB-SLAM and LSD-SLAM algorithms. Because crawler robot motion is accompanied by significant vibrations, we encountered problems with these visual SLAM methods, which decreased the accuracy of trajectory estimation or even caused visual odometry failures, in spite of using a video stabilization filter. The comparative analysis showed that lidar odometry stays close to the ground truth, whereas visual odometry can demonstrate significant trajectory deviations.
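A trajectory-deviation comparison of this kind typically reduces to an error metric over time-associated poses; the following minimal Python sketch computes a translational RMSE under the assumption that the trajectories are already associated and aligned, which is a simplification of a full odometry evaluation.

import numpy as np

def trajectory_rmse(estimated, ground_truth):
    """Root-mean-square translational error between two already-associated
    trajectories, given as (N, 3) arrays of positions at matching timestamps."""
    diff = estimated - ground_truth
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# Example with a constant lateral drift of 5 cm
gt = np.column_stack([np.linspace(0, 2, 50), np.zeros(50), np.zeros(50)])
est = gt + np.array([0.0, 0.05, 0.0])
print(trajectory_rmse(est, gt))   # ~0.05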