Publication


Featured research published by Siddharth S. Rautaray.


Artificial Intelligence Review | 2015

Vision based hand gesture recognition for human computer interaction: a survey

Siddharth S. Rautaray; Anupam Agrawal

As computers become more pervasive in society, facilitating natural human–computer interaction (HCI) will have a positive impact on their use. Hence, there has been growing interest in the development of new approaches and technologies for bridging the human–computer barrier. The ultimate aim is to bring HCI to a regime where interactions with computers will be as natural as an interaction between humans, and to this end, incorporating gestures in HCI is an important research area. Gestures have long been considered an interaction technique that can potentially deliver more natural, creative and intuitive methods for communicating with our computers. This paper provides an analysis of comparative surveys done in this area. The use of hand gestures as a natural interface serves as a motivating force for research in gesture taxonomies, their representations and recognition techniques, and software platforms and frameworks, all of which are discussed briefly in this paper. It focuses on the three main phases of hand gesture recognition, i.e. detection, tracking and recognition. Different applications which employ hand gestures for efficient interaction are discussed under core and advanced application domains. This paper also provides an analysis of existing literature related to gesture recognition systems for human–computer interaction by categorizing it under different key parameters. It further discusses the advances that are needed to improve present hand gesture recognition systems so that they can be widely used for efficient human–computer interaction. The main goal of this survey is to provide researchers in the field of gesture-based HCI with a summary of progress achieved to date and to help identify areas where further research is needed.


Multimedia Signal Processing | 2011

Interaction with virtual game through hand gesture recognition

Siddharth S. Rautaray; Anupam Agrawal

Hand gesture recognition systems for virtual reality applications provide users an enhanced interaction experience, as they integrate virtual and real-world objects. Growth in computer-based virtual environments and the development of user interfaces influence changes in Human-Computer Interaction (HCI). A gesture recognition based interaction interface endows interaction with more realism and immersion compared to traditional devices. The system enables a physically realistic mode of interaction with the virtual environment. The hand gesture recognition interface proposed and implemented in this paper consists of detection, tracking and recognition modules. For the implementation of these modules, various image processing algorithms such as Camshift, Lucas-Kanade optical flow and Haar-like features have been employed. Comprehensive user acceptability testing has been carried out to demonstrate the accuracy, usefulness and ease of use of the proposed and implemented hand gesture recognition system. The vocabulary of hand gesture communication offers many variations, ranging from the simple action of pointing with a finger, to using the hands to move objects around, to rather complex gestures that express feelings. The proposed hand gesture recognition system offers an alternative to traditional input devices for interaction with virtual environments. The gesture based interaction interface proposed here can be applied to many applications such as virtual reality, sign language and games, though the present paper considers games as the application domain.
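As an illustration of the tracking stage, the sketch below shows Camshift-based hand tracking with OpenCV, one of the algorithms named above. It assumes a webcam source and a placeholder initial hand window supplied by a separate detection step (for example a Haar-like feature detector); it is a minimal sketch, not the authors' implementation.

```python
# Camshift hand tracking with OpenCV (sketch). The initial window (x, y, w, h)
# is a placeholder; in the paper a detection step would supply it.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # assumed webcam source
ok, frame = cap.read()
x, y, w, h = 300, 200, 100, 100                # placeholder initial hand window

# Hue histogram of the hand region, back-projected in later frames.
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # CamShift adapts the window size and orientation to follow the hand.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    pts = np.int32(cv2.boxPoints(rot_rect))
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow('hand tracking', frame)
    if cv2.waitKey(30) & 0xFF == 27:           # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```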


Proceedings of the First International Conference on Intelligent Interactive Technologies and Multimedia | 2010

A novel human computer interface based on hand gesture recognition using computer vision techniques

Siddharth S. Rautaray; Anupam Agrawal

In daily life, human beings communicate with each other and use a broad range of gestures in the process of interaction. Apart from interpersonal communication, many hours are spent interacting with electronic devices. In the last decade, new classes of devices for accessing information have emerged along with increased connectivity. In parallel to the proliferation of these devices, new interaction styles have been explored. The objective of this paper is to provide a gesture based interface for controlling applications such as a media player using computer vision techniques. The human computer interface application consists of a central computational module which applies Principal Component Analysis to gesture images, finds the feature vectors of the gestures and saves them into an XML file. Recognition of a gesture is done by the K-Nearest Neighbour algorithm. The training images are made by cropping the hand gesture from a static background, detecting the hand motion using the Lucas-Kanade pyramidal optical flow algorithm. This hand gesture recognition technique not only replaces the use of the mouse to control the media player but also provides different gesture commands which are useful in controlling the application.
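The PCA-plus-K-Nearest-Neighbour recognition step described above can be sketched as follows, using scikit-learn as a stand-in for the paper's own implementation (which stores the feature vectors in an XML file). The training arrays are hypothetical flattened grayscale gesture images, not the authors' dataset.

```python
# PCA features + k-NN gesture recognition (sketch, hypothetical data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X_train = rng.random((200, 64 * 64))      # 200 cropped 64x64 gesture images, flattened
y_train = rng.integers(0, 5, 200)         # 5 hypothetical gesture classes

pca = PCA(n_components=30)                # project images onto principal components
features = pca.fit_transform(X_train)

knn = KNeighborsClassifier(n_neighbors=3) # classify a gesture by its nearest neighbours
knn.fit(features, y_train)

new_image = rng.random((1, 64 * 64))      # an incoming gesture frame, flattened
label = knn.predict(pca.transform(new_image))[0]
print('recognized gesture class:', label)
```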


International Conference on Recent Advances in Information Technology | 2012

Hand data glove: A new generation real-time mouse for Human-Computer Interaction

Piyush Kumar; Siddharth S. Rautaray; Anupam Agrawal

Human-Computer Interaction (HCI) is a field in which developers build user-friendly systems. In this paper, a real-time Human-Computer Interaction system based on a hand data glove and a k-NN classifier for gesture recognition is proposed. HCI is becoming more and more natural and intuitive to use. The hand is the part of the body most frequently used for interaction in digital environments, and thus the complexity and flexibility of hand motion is a research topic. To recognize hand gestures accurately and successfully, a data glove is used. Here, the glove captures the current position and angles of the hand and fingers, which are then classified using the k-NN classifier. The gestures classified are clicking, dragging, rotating, pointing and the idle position. On recognizing these gestures, relevant actions are taken, such as air writing and 3D sketching by tracking the hand's path. The glove can also be used to control an image browser tool. The results show that the glove is better for interaction than a conventional keyboard and mouse, as the interaction process is more accurate and natural; it also enhances the user's interaction and feeling of immersion.
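A minimal sketch of the k-NN classification over glove readings is given below. The feature layout (five finger flex angles plus two orientation angles) and the sample values are assumptions for illustration, not the authors' sensor format.

```python
# k-NN classification of data-glove readings into gestures (sketch).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

GESTURES = ['click', 'drag', 'rotate', 'point', 'idle']

# Each training sample: [thumb, index, middle, ring, little flex angles, roll, pitch]
rng = np.random.default_rng(1)
X_train = rng.uniform(0, 90, size=(250, 7))       # hypothetical calibration samples
y_train = rng.integers(0, len(GESTURES), size=250)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

reading = np.array([[5.0, 70.0, 72.0, 68.0, 65.0, 10.0, -4.0]])  # one glove frame
print('gesture:', GESTURES[knn.predict(reading)[0]])
```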


International Journal of Computer Applications | 2010

A Vision based Hand Gesture Interface for Controlling VLC Media Player

Siddharth S. Rautaray; Anupam Agrawal

Human-Computer Interaction can gain several advantages from the introduction of different natural forms of device-free communication. Gestures are a natural form of action which we often use in our daily life for interaction; using them as a communication medium with computers therefore generates a new paradigm of interaction. This paper implements computer vision and gesture recognition techniques and develops a vision-based, low-cost input device for controlling the VLC player through gestures. The application consists of a central computational module which applies Principal Component Analysis to gesture images, finds the feature vectors of the gestures and saves them into an XML file. Recognition of a gesture is done by the K-Nearest Neighbour algorithm. The theoretical analysis of the approach shows how to do recognition against a static background. The training images are made by cropping the hand gesture from the static background, detecting the hand motion using the Lucas-Kanade pyramidal optical flow algorithm. This hand gesture recognition technique not only replaces the use of the mouse to control the VLC player but also provides a gesture vocabulary which is useful in controlling the application.
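The hand-motion detection step based on Lucas-Kanade pyramidal optical flow can be sketched with OpenCV as below. The webcam source, feature-tracking parameters and motion threshold are assumptions; the resulting bounding box would be used to crop a training image from the static background.

```python
# Detecting hand motion between two consecutive frames with pyramidal
# Lucas-Kanade optical flow (sketch). The bounding box of the moving points
# gives a candidate region to crop as a training image.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                                  # assumed webcam source
ok, prev = cap.read()
ok, curr = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
curr_gray = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)

# Corners to track in the first frame.
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade flow from the first frame to the second.
p1, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None,
                                           winSize=(15, 15), maxLevel=3)

good_new = p1[status == 1]                                 # points found in both frames
good_old = p0[status == 1]
motion = np.linalg.norm(good_new - good_old, axis=1)
moving = good_new[motion > 2.0]                            # assumed motion threshold (pixels)

if len(moving) > 0:
    x, y, w, h = cv2.boundingRect(moving.astype(np.float32))
    hand_crop = curr[y:y + h, x:x + w]                     # candidate hand image to save
    print('moving region:', (x, y, w, h))

cap.release()
```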


IEEE International Conference on Technology Enhanced Education (ICTEE) | 2012

Design of gesture recognition system for dynamic user interface

Siddharth S. Rautaray; Anupam Agrawal

With the escalating role of computers in the educational system, human-computer interaction is becoming a gradually more important part of it. The general belief is that with progress in computing speed, communication technologies and display techniques, existing HCI techniques may become a constraint on the effective utilization of the available information flow. The development of user interfaces influences changes in Human-Computer Interaction (HCI). Human hand gestures are a widely used mode of non-verbal interaction. The vocabulary of hand gesture communication has many variations. It ranges from the simple actions of pointing with a finger and using the hands to move objects around, to more complex gestures that express feelings and communicate with others. Hand gestures also play a prominent role in teaching, since explanations and examples depend heavily on them. The naturalness and intuitiveness of hand gestures have been an immense motivation for researchers in the field of Human-Computer Interaction to research and develop more promising means of interaction between humans and computers. HCI research is driven by the central aim of removing complex and cumbersome interaction devices and replacing them with more obvious and expressive means of interaction, such as hand gestures, that come easily to users with the least cognitive burden. This paper designs a simple, natural system for gestural interaction between the user and the computer, providing a dynamic user interface. The gesture recognition system uses image processing techniques for detection, segmentation, tracking and recognition of hand gestures, converting them into meaningful commands. This hand gesture recognition system has been proposed, designed and developed with the intention of making it a substitute for the mouse in a dynamic user interface between human and machine. Hence, instead of making the effort to develop a new vocabulary of hand gestures, we have mapped the control instruction set of the mouse to a subset of the most discriminating hand gestures, so that we get a robust interface. The interface proposed here can be applied to different applications such as image browsers, games etc.
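A minimal sketch of mapping recognized gestures onto the mouse's control instruction set is shown below. The gesture labels are hypothetical, and the pyautogui library is used only as a convenient stand-in for issuing mouse commands; it is not the system described in the paper.

```python
# Mapping recognized hand gestures to mouse actions (sketch).
import pyautogui

def execute(gesture, dx=0, dy=0):
    """Translate one recognized gesture into a mouse action."""
    if gesture == 'move':
        pyautogui.moveRel(dx, dy)       # hand displacement -> cursor displacement
    elif gesture == 'left_click':
        pyautogui.click(button='left')
    elif gesture == 'right_click':
        pyautogui.click(button='right')
    elif gesture == 'scroll_up':
        pyautogui.scroll(3)
    elif gesture == 'scroll_down':
        pyautogui.scroll(-3)
    # an unrecognized or 'idle' gesture maps to no action

# Example: a stream of (gesture, dx, dy) tuples from the recognition module.
for gesture, dx, dy in [('move', 15, -10), ('left_click', 0, 0)]:
    execute(gesture, dx, dy)
```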


International Journal of Computer Applications | 2011

A Real Time Hand Tracking System for Interactive Applications

Siddharth S. Rautaray; Anupam Agrawal

In vision-based hand tracking systems, color plays an important role in detection. Skin color detection is widely used in different interactive applications, e.g. face and hand tracking and detecting people in video databases. This paper implements an effective hand tracking technique which is based on color detection. In this technique, segmentation of the hand from the background takes place in real time based on the color distribution. This technique provides two main benefits: the process of tracking is fast, as the segmentation is performed only in a specified area surrounding the hand, and the technique is highly robust under different lighting conditions. To check the performance of the implemented technique, a number of experiments have been performed. The implemented technique will be useful in various real-time interactive applications, such as gesture recognition, augmented reality, virtual reality etc.
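A minimal sketch of this color-based segmentation restricted to a window around the last hand position is given below. The HSV skin range, window size and initial position are illustrative assumptions, not the paper's tuned values.

```python
# Skin-colour segmentation inside a window around the last hand position (sketch).
import cv2
import numpy as np

SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)      # assumed HSV skin range
SKIN_HIGH = np.array([25, 255, 255], dtype=np.uint8)

def track_hand(frame, last_center, win=120):
    """Segment skin colour in a window around last_center; return the new centre."""
    cx, cy = last_center
    h, w = frame.shape[:2]
    x0, y0 = max(cx - win, 0), max(cy - win, 0)
    x1, y1 = min(cx + win, w), min(cy + win, h)
    roi = frame[y0:y1, x0:x1]

    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    m = cv2.moments(mask)
    if m['m00'] == 0:
        return last_center                             # hand lost; keep old position
    return (x0 + int(m['m10'] / m['m00']), y0 + int(m['m01'] / m['m00']))

cap = cv2.VideoCapture(0)                              # assumed webcam source
center = (320, 240)                                    # assumed initial hand position
ok, frame = cap.read()
if ok:
    center = track_hand(frame, center)
    print('hand center:', center)
cap.release()
```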


PerMIn'12 Proceedings of the First Indo-Japan Conference on Perception and Machine Intelligence | 2012

Human computer interaction with hand gestures in virtual environment

Siddharth S. Rautaray; Anand Kumar; Anupam Agrawal

With the ever-increasing growth of virtual environments based upon computer systems, demands for new kinds of interaction devices have emerged. The presently used devices such as keyboard, mouse and pen are cumbersome within these promising applications. The development of user interfaces influences changes in Human-Computer Interaction (HCI). This paper focuses on designing an application, using computer vision and gesture recognition techniques, that provides a relatively economical input device for interacting with virtual games using hand gestures. The architecture of the gesture recognition system comprises different image processing techniques, such as Camshift and the Lucas-Kanade technique, for tracking the hand and its gestures. Haar-like features locate the position of the hand, and the gesture made by the located hand image is then recognized. Gestures are modeled for recognition by matching the convexity defects present in the hand with the assigned gestures. The virtual game is created using the OpenGL library. The application uses seven gestures for manipulating the virtual game. The main contribution of this hand gesture recognition system is to provide a substitute for input devices during interaction with virtual games. Hence, instead of making the effort to develop a new vocabulary of hand gestures, we have mapped the control instruction set of the mouse to a subset of the most discriminating hand gestures, so that we get a robust interface.
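The convexity-defect matching mentioned above can be sketched as follows: defects of the hand contour (the valleys between extended fingers) are counted and the count is looked up in a gesture table. The binary hand mask, depth threshold and gesture table are illustrative assumptions, not the paper's seven-gesture vocabulary.

```python
# Counting convexity defects of a hand contour with OpenCV (sketch).
import cv2
import numpy as np

def count_defects(hand_mask):
    """Count deep convexity defects of the largest contour in a binary mask."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # defects[:, 0] = (start idx, end idx, farthest idx, depth * 256)
    deep = defects[:, 0, 3] / 256.0 > 20.0             # keep only pronounced valleys
    return int(deep.sum())

# Example: map a defect count to a gesture (hypothetical table).
GESTURE_BY_DEFECTS = {0: 'fist', 1: 'two fingers', 2: 'three fingers',
                      3: 'four fingers', 4: 'open palm'}
mask = np.zeros((240, 320), np.uint8)
cv2.circle(mask, (160, 120), 60, 255, -1)              # toy stand-in for a hand mask
print(GESTURE_BY_DEFECTS.get(count_defects(mask), 'unknown'))
```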


International Conference on Intelligent Interactive Technologies and Multimedia | 2013

Adaptive Hand Gesture Recognition System for Multiple Applications

Siddharth S. Rautaray; Anupam Agrawal

With the increasing role of computing devices, facilitating natural human-computer interaction (HCI) will have a positive impact on their usage and acceptance as a whole. Techniques such as vision, sound and speech recognition allow for a much richer form of interaction between the user and machine. The emphasis is on providing a natural form of interface for interaction. As gesture commands are found to be natural for humans, the development of gesture based system interfaces has become an important research area. One of the drawbacks of present gesture recognition systems is that they are application dependent, which makes it difficult to transfer one gesture control interface to multiple applications. This paper focuses on designing a hand gesture recognition system which is adaptable to multiple applications, thus making the gesture recognition system application adaptive. The designed system comprises different processing steps such as detection, segmentation, tracking and recognition. To make the system application-adaptive, different quantitative and qualitative parameters have been taken into consideration. The quantitative parameters include the gesture recognition rate, the features extracted and the root mean square error of the system, and the qualitative parameters include intuitiveness, accuracy, stress/comfort, computational efficiency, the user's tolerance and real-time performance of the proposed system. These parameters have a vital impact on the performance of the proposed application adaptive hand gesture recognition system.
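The two quantitative parameters named above, recognition rate and root mean square error, can be computed as in the sketch below; the label and measurement values are hypothetical.

```python
# Recognition rate and RMSE over a batch of test gestures (sketch, fake data).
import numpy as np

y_true = np.array([0, 1, 2, 2, 3, 1, 0, 4, 4, 2])     # ground-truth gesture classes
y_pred = np.array([0, 1, 2, 1, 3, 1, 0, 4, 3, 2])     # classifier output

recognition_rate = np.mean(y_pred == y_true) * 100.0  # percentage of correct gestures
print(f'recognition rate: {recognition_rate:.1f}%')

# RMSE between a tracked quantity (e.g. estimated hand-centre x position) and the
# annotated ground truth, per frame.
estimated = np.array([101.0, 130.5, 150.2, 180.9])
ground_truth = np.array([100.0, 128.0, 152.0, 178.0])
rmse = np.sqrt(np.mean((estimated - ground_truth) ** 2))
print(f'RMSE: {rmse:.2f} px')
```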


International Conference on Parallel Distributed Computing Technologies and Applications | 2011

Manipulating Objects through Hand Gesture Recognition in Virtual Environment

Siddharth S. Rautaray; Anupam Agrawal

Virtual environments have always been considered a means for more visceral and efficient human-computer interaction by a diversified range of applications. The spectrum of applications includes analysis of complex scientific data, medical training, military simulation, phobia therapy and virtual prototyping. With the evolution of ubiquitous computing, current user interaction approaches with keyboard, mouse and pen are not sufficient for the still-widening spectrum of human-computer interaction. Gloves and sensor based trackers are unwieldy, constraining and uncomfortable to use. Due to the limitations of these devices, the usable command set of such applications is also limited. Direct use of the hands as an input device is an innovative method for providing natural human-computer interaction, which has its inheritance from text-based interfaces, through 2D graphical interfaces and multimedia-supported interfaces, to full-fledged multi-participant Virtual Environment (VE) systems. This work conceives a future era of human-computer interaction with 3D applications where the user may be able to move and rotate objects simply by moving and rotating the hand, all without the help of any input device.

Collaboration


Dive into Siddharth S. Rautaray's collaborations.

Top Co-Authors


Anupam Agrawal

Indian Institute of Information Technology

Anand Kumar

Indian Institute of Information Technology
