Publication


Featured research published by Ronny Mardiyanto.


Artificial Life and Robotics | 2009

Electric wheelchair control with gaze direction and eye blinking

Djoko Purwanto; Ronny Mardiyanto; Kohei Arai

We propose an electric wheelchair controlled by gaze direction and eye blinking. A camera is set up in front of a wheelchair user to capture image information. The sequentially captured images are interpreted to obtain the gaze direction and eye blinking properties. The gaze direction is expressed by the horizontal angle of the gaze, and this is derived from the triangle formed by the centers of the eyes and the nose. The gaze direction and eye blinking are used to provide direction and timing commands, respectively. The direction command relates to the direction of movement of the electric wheelchair, and the timing command relates to the time when the wheelchair should move. The timing command with an eye blinking mechanism is designed to generate ready, backward movement, and stop commands for the electric wheelchair. Furthermore, to move at a certain velocity, the electric wheelchair also receives a velocity command as well as the direction and timing commands. A disturbance observer-based control system is used to control the direction and velocity. For safety purposes, an emergency stop is generated when the electric wheelchair user does not focus their gaze consistently in any direction for a specified time. A number of simulations and experiments were conducted with the electric wheelchair in a laboratory environment.
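
The eye/nose triangle geometry lends itself to a short sketch. The Python snippet below is an illustrative reconstruction, not the authors' published geometry: the landmark inputs, dead-zone threshold, and command names are assumptions. It estimates the horizontal angle from the nose's offset relative to the eye midpoint and maps it to a direction command.

```python
import math

def horizontal_gaze_angle(left_eye, right_eye, nose):
    """Approximate the horizontal gaze angle (degrees) from the
    triangle formed by the two eye centers and the nose tip.

    Inputs are (x, y) pixel coordinates. Assumption: when the face
    turns, the nose shifts horizontally relative to the midpoint of
    the eyes, and that offset, normalized by the inter-eye distance,
    approximates the turn angle.
    """
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    eye_dist = math.hypot(right_eye[0] - left_eye[0],
                          right_eye[1] - left_eye[1])
    offset = nose[0] - mid_x               # signed horizontal offset
    return math.degrees(math.atan2(offset, eye_dist))

def direction_command(angle_deg, dead_zone=10.0):
    """Map the angle to a wheelchair direction command, with a dead
    zone (hypothetical threshold) so gaze jitter does not move the
    chair."""
    if angle_deg > dead_zone:
        return "RIGHT"
    if angle_deg < -dead_zone:
        return "LEFT"
    return "FORWARD"
```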


International Conference on Information Technology: New Generations | 2011

Eye-based HCI with Full Specification of Mouse and Keyboard Using Pupil Knowledge in the Gaze Estimation

Kohei Arai; Ronny Mardiyanto

Eye-based Human Computer Interaction (HCI) with full specification of mouse and keyboard by human eyes only is proposed, in particular for disabled persons and as an input device for wearable computing. It utilizes pupil knowledge in the gaze estimation process. Existing conventional eye-mouse and gaze-mouse systems are not robust against users whose eyes differ in color, shape, and size, and are also affected by illumination changes, user movement (attitude changes), etc. Using knowledge about these features, such influences are eliminated. The proposed eye-based HCI system also allows simultaneous input of three or more keys, for instance Ctrl+Alt+Del for initiating the task manager, as well as left/right click and drag/drop mouse events. Although currently commercially available screen keyboards allow simultaneous input of two keys, such as Shift+* or Alt+*, they do not allow three-key input at once. This full specification of mouse events and keyboard functions is available in the proposed HCI system. Experimental results with users of six different nationalities, with differing eye features, show the effectiveness of using this knowledge to improve gaze estimation accuracy.


International Conference on Information Technology: New Generations | 2011

Comparative Study on Blink Detection and Gaze Estimation Methods for HCI, in Particular, Gabor Filter Utilized Blink Detection Method

Kohei Arai; Ronny Mardiyanto

Blink detection is used for a variety of applications such as Human-Computer Interaction, wearable computing, etc. A blink detection method using a Gabor filter is proposed to realize highly accurate blink detection. The blink detection accuracy is evaluated with several subjects and compared to other existing conventional methods. Through the comparison, it is found that the proposed Gabor-filter-based blink detection method is superior to the other methods.
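
As a rough illustration of the technique, the sketch below applies a single horizontally oriented Gabor kernel to an eye region with OpenCV and flags a blink when the response energy drops. The kernel parameters and the drop ratio are assumed values, not those reported in the paper.

```python
import cv2
import numpy as np

# A horizontally oriented Gabor kernel responds strongly to eyelid
# edges; the response energy over the eye region changes when the
# eye closes. Parameters below are illustrative assumptions.
kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0,
                            theta=np.pi / 2,   # horizontal structures
                            lambd=10.0, gamma=0.5, psi=0)

def gabor_energy(eye_roi_gray):
    """Mean magnitude of the Gabor response over an eye region."""
    response = cv2.filter2D(eye_roi_gray.astype(np.float32),
                            cv2.CV_32F, kernel)
    return float(np.mean(np.abs(response)))

def is_blink(eye_roi_gray, open_eye_energy, drop_ratio=0.6):
    """Flag a blink when the response energy falls below a fraction
    of a calibrated open-eye energy (hypothetical criterion)."""
    return gabor_energy(eye_roi_gray) < drop_ratio * open_eye_energy
```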


International Journal of Advanced Computer Science and Applications | 2011

Eye-based Human Computer Interaction Allowing Phoning, Reading E-Book/E-Comic/E-Learning, Internet Browsing, and TV Information Extraction

Kohei Arai; Ronny Mardiyanto

An eye-based Human-Computer Interaction (HCI) system which allows phoning, reading e-book/e-comic/e-learning content, internet browsing, and TV information extraction is proposed for handicapped students in e-learning applications. Conventional eye-based HCI applications face problems with accuracy and processing speed. We develop new interfaces that improve the key-in accuracy and processing speed of eye-based input, in particular for e-learning applications. We propose an eye-based HCI utilizing camera-mounted glasses for gaze estimation. We use the estimated gaze to control the user interface, e.g., navigation of e-comic/e-book/e-learning content, phoning, internet browsing, and TV information extraction. We develop interfaces including a standard interface navigator with five keys, a single-line moving keyboard, and a multi-line moving keyboard, in order to provide the aforementioned functions without sacrificing accuracy. The experimental results show that the proposed system performs the aforementioned functions in real time.


International Journal of Advanced Computer Science and Applications | 2013

Method for Psychological Status Estimation by Gaze Location Monitoring Using Eye-Based Human-Computer Interaction

Kohei Arai; Ronny Mardiyanto

A method for psychological status estimation by gaze location monitoring using Eye-Based Human-Computer Interaction (EBHCI) is proposed. Through an experiment with English-book reading of e-learning content, the relation between psychological status and the distance between the correct reading location in the English sentence and the corresponding location derived from EBHCI is clarified. Psychological status is estimated from the peak alpha frequency derived from EEG signals. It is concluded that psychological status can be estimated by gaze location monitoring.
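
A minimal sketch of the peak-alpha-frequency computation the abstract refers to, assuming the conventional 8-13 Hz alpha band and Welch's method (the paper's exact band limits and spectral estimator are not stated here):

```python
import numpy as np
from scipy.signal import welch

def peak_alpha_frequency(eeg, fs):
    """Estimate the peak alpha frequency (PAF) of an EEG trace.

    `eeg` is a 1-D signal, `fs` the sampling rate in Hz. PAF is
    taken as the frequency of maximum power within the alpha band
    (8-13 Hz, the conventional limits).
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    alpha = (freqs >= 8.0) & (freqs <= 13.0)
    return freqs[alpha][np.argmax(psd[alpha])]
```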


International Journal of Advanced Computer Science and Applications | 2011

Autonomous Control of Eye Based Electric Wheel Chair with Obstacle Avoidance and Shortest Path Findings Based on Dijkstra Algorithm

Kohei Arai; Ronny Mardiyanto

An autonomous Eye-Based Electric Wheel Chair (EBEWC) control system which allows a handicapped person (user) to control their EWC with their eyes only is proposed. Using EBEWC, the user can move anywhere they want on the same floor of a hospital autonomously, with obstacle avoidance based on a visible camera and an ultrasonic sensor. The user can also control the EBEWC directly with their eyes. The most appropriate route has to be determined while avoiding obstacles, and then autonomous real-time control has to be performed; processing time, autonomous obstacle avoidance, and determination of the most appropriate route are therefore important for the proposed EBEWC. All the required performances are evaluated and validated. Obstacles can be avoided using images acquired with a forward-looking camera. The proposed EBEWC system allows creation of a floor layout map that contains obstacle locations in real time. The created and updated maps can be shared by the electric wheelchairs on the same floor of a hospital. Experimental data show that the system allows computer input (more than 80 keys) almost perfectly and that the electric wheelchair can be controlled safely with human eyes only.
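
The shortest-path step can be illustrated with a compact Dijkstra sketch over an occupancy-grid floor map. The grid representation and unit move costs are assumptions for illustration, not the EBEWC's actual map format.

```python
import heapq

def dijkstra_route(grid, start, goal):
    """Shortest route on a floor-layout grid (0 = free, 1 = obstacle)
    using Dijkstra's algorithm with 4-connected moves of unit cost.
    `start` and `goal` are (row, col) tuples; returns the path as a
    list of cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, cell = heapq.heappop(heap)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue                      # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal not in dist:
        return None
    # Reconstruct the path from goal back to start.
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]
```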


International Journal of Advanced Research in Artificial Intelligence | 2014

Speed and Vibration Performance as well as Obstacle Avoidance Performance of Electric Wheel Chair Controlled by Human Eyes Only

Kohei Arai; Ronny Mardiyanto

An evaluation of the speed and vibration performance, as well as the obstacle avoidance performance, of the previously proposed Electric Wheel Chair (EWC) controlled by human eyes only is conducted. Experimental results show acceptable speed, vibration, and obstacle avoidance performance for disabled persons. More importantly, disabled persons are satisfied with the proposed EWC because it works with their eyes only: without using hands or fingers, they can control the EWC freely.


International Conference on Information Technology: New Generations | 2011

Eye Based HCI with Moving Keyboard for Reducing Fatigue Effects

Kohei Arai; Ronny Mardiyanto

An eye-based HCI (Human-Computer Interaction) system with a moving keyboard is proposed for reducing physical and psychological fatigue. Fatigue occurs as the gaze angle and direction are changed to select designated keys, and builds up when users operate an eye-based HCI system for a long period, depending on the number of gaze changes as well as the traveling length of the eye gaze position on the computer screen. This paper proposes an eye-based HCI with a moving keyboard layout that reduces such fatigue effects. The only thing the user has to do is look at the center of the computer screen and four adjacent keys: right, left, up, and down. Using an EEG (electroencephalograph) analyzer, a comparative study has been conducted between the proposed eye-based HCI with moving keyboard and an existing conventional eye-based HCI. Taking Event-Related Desynchronization (ERD) into account, user fatigue is measured. The results show that the fatigue effect of the proposed HCI is smaller than that of the conventional HCI.
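
For reference, the classical ERD computation works out as sketched below; the alpha-band limits and window handling are conventional assumptions rather than the paper's exact settings.

```python
import numpy as np
from scipy.signal import welch

def band_power(sig, fs, lo=8.0, hi=13.0):
    """Mean alpha-band power of an EEG segment (conventional 8-13 Hz)."""
    freqs, psd = welch(sig, fs=fs, nperseg=int(fs))
    band = (freqs >= lo) & (freqs <= hi)
    return float(np.mean(psd[band]))

def erd_percent(baseline_seg, task_seg, fs):
    """Classical ERD definition: percentage band-power change of a
    task segment relative to a rest baseline. Negative values mean
    desynchronization, used here as a fatigue/load indicator."""
    r = band_power(baseline_seg, fs)
    a = band_power(task_seg, fs)
    return (a - r) / r * 100.0
```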


International Conference on Computational Science and Its Applications | 2010

Improvement of gaze estimation robustness using pupil knowledge

Kohei Arai; Ronny Mardiyanto

This paper presents an eye gaze estimation system which is robust across various users. Our method utilizes an IR camera mounted on glasses to allow user movement. Pupil knowledge such as shape, size, location, and motion is used, applied in order of priority: pupil appearance (size, color, and shape) is used as the first priority; when this step fails, the pupil is estimated based on its location as the second priority; when both steps fail, we estimate the pupil based on its motion as the last priority. The aim of the proposed method is to make the system work for various users as well as to overcome problems associated with illumination changes and user movement. The proposed system is tested with several users of various races and nationalities, and the experimental results are compared to the well-known adaptive threshold method and template matching method. The proposed method shows good performance, robustness, accuracy, and stability against illumination changes without any prior calibration.
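
The priority cascade can be sketched as a simple fallback chain. The OpenCV-based snippet below is a hypothetical reconstruction: the size, roundness, search-radius, and difference thresholds are illustrative, not the values used in the paper.

```python
import cv2
import numpy as np

def _darkest_blob(gray, mask=None):
    """Appearance cue: threshold the darkest pixels and keep a blob
    whose size and roundness look pupil-like (thresholds are
    illustrative assumptions)."""
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    if mask is not None:
        binary = cv2.bitwise_and(binary, mask)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        area = cv2.contourArea(c)
        if not 50 < area < 5000:          # size knowledge
            continue
        perim = cv2.arcLength(c, True)
        if perim == 0 or 4 * np.pi * area / perim ** 2 < 0.6:
            continue                      # shape knowledge (roundness)
        m = cv2.moments(c)
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return None

def detect_pupil(gray, prev_center=None, prev_gray=None):
    """Priority cascade from the abstract: appearance first, then
    location, then motion. A sketch, not the authors' exact code."""
    # 1st priority: appearance over the whole eye image.
    center = _darkest_blob(gray)
    if center is None and prev_center is not None:
        # 2nd priority: restrict the search to a window around the
        # last known pupil location.
        mask = np.zeros_like(gray)
        cv2.circle(mask, (int(prev_center[0]), int(prev_center[1])),
                   40, 255, -1)
        center = _darkest_blob(gray, mask)
    if center is None and prev_gray is not None:
        # Last priority: take the centroid of the moving region.
        diff = cv2.absdiff(gray, prev_gray)
        _, moving = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        m = cv2.moments(moving)
        if m["m00"] > 0:
            center = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return center
```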


International Journal of Advanced Research in Artificial Intelligence | 2013

Eye-Based Domestic Robot Allowing Patient to Be Self-Services and Communications Remotely

Kohei Arai; Ronny Mardiyanto

An eye-based domestic helper is proposed for helping patients be self-sufficient in a hospital environment. This kind of system benefits patients who cannot move around, especially stroke patients who spend their days lying in bed; they cannot move around due to body malfunction, and the only information that can still be retrieved from them is from the eyes. In this research, we develop a new system in the form of an eye-controlled domestic helper robot that allows the patient to be self-sufficient and to speak remotely. First, we estimate the user's gaze with a camera mounted on the user's glasses. Once the eye image is captured, several image processing steps are used to estimate the gaze. The eye image is cropped from the source to narrow the search area. We detect the center of the eye by locating the pupil. The pupil and other eye components can easily be distinguished by color: because the pupil is darker than the rest, we simply apply adaptive thresholding to separate it. Using a simple eye model, we estimate the gaze from the pupil location. Next, the obtained gaze value is used as an input command to the domestic robot. The user can control the robot's movement by eye, and can also produce speech through text-to-speech functionality. We use an infant robot as our domestic robot and control its movement by sending commands via serial communication (using a USB-to-serial adapter). Three types of command, move forward, turn left, and turn right, are used to move the robot. On the robot, we place another camera to capture the scene in front of it. The robot and the user are separated by distance and connected over a TCP/IP network, which allows the user to control the robot remotely. We set up the robot as the server and the user's computer as the client: the robot streams video of the scene and receives commands sent by the client, while the client (user) receives the video stream from the server and controls the robot's movement by sending commands over the network. The user can control the robot remotely even over long distances because they can see the scene in front of the robot. We have tested the performance of our robot controlled over the TCP/IP network: an experiment measuring the robot's maneuverability from a start point while avoiding and passing obstacles has been conducted in our laboratory. With our system, patients in hospital can take care of themselves.
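
The client side of the server/client split described above can be sketched in a few lines. The host address, port, and one-byte command encoding are illustrative assumptions, not the authors' protocol.

```python
import socket

# The robot runs a TCP server; the user's machine connects and
# sends one of the three movement commands. Command bytes are
# hypothetical placeholders.
COMMANDS = {"forward": b"F", "left": b"L", "right": b"R"}

def send_command(cmd, host="192.168.0.10", port=5000):
    """Open a TCP connection to the robot and send a single
    movement command derived from the gaze estimate."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(COMMANDS[cmd])

# e.g. send_command("forward") when the user gazes upward
```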

Collaboration


Dive into Ronny Mardiyanto's collaboration.

Top Co-Authors


Djoko Purwanto

Sepuluh Nopember Institute of Technology
