Publication


Featured research published by Sanjay K. Boddhu.


bioinformatics and bioengineering | 2007

A 2D Vibration Array as an Assistive Device for Visually Impaired

Dimitrios Dakopoulos; Sanjay K. Boddhu; Nikolaos G. Bourbakis

This paper deals with the design, simulation and implementation of a 2D vibration array used as a major component of an assistive wearable navigation device for the visually impaired. The 2D vibration array consists of 16 (4x4) miniature vibrators connected to a portable computer, which is the main computing component of the entire wearable navigation system, called Tyflos. Tyflos consists of two miniature cameras (attached to a pair of dark glasses), a microphone, an ear speaker, the 2D vibration array, and a portable computer. The cameras capture images from the surrounding environment and, after appropriate processing, 3D representations are created. These 3D space representations are projected onto the 2D array, whose elements vibrate at levels corresponding to the distances of the surrounding obstacles. The 2D array is attached to the user's chest in order to provide the appropriate sensation (via vibrations) of the distances from the surroundings.
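
To illustrate the kind of depth-to-vibration mapping the abstract describes, here is a minimal sketch that reduces a depth image to discrete intensity levels for a 4x4 vibrator grid. The function name, parameter ranges and discretization scheme are illustrative assumptions, not details from the paper.

```python
import numpy as np

def depth_to_vibration(depth_map: np.ndarray,
                       grid=(4, 4),
                       max_range_m: float = 5.0,
                       levels: int = 8) -> np.ndarray:
    """Reduce a depth map (meters) to discrete per-vibrator intensity levels."""
    h, w = depth_map.shape
    gh, gw = grid
    intensities = np.zeros(grid, dtype=int)
    for i in range(gh):
        for j in range(gw):
            cell = depth_map[i * h // gh:(i + 1) * h // gh,
                             j * w // gw:(j + 1) * w // gw]
            nearest = np.clip(cell.min(), 0.0, max_range_m)
            # Closer obstacle -> higher vibration level (0 = off).
            intensities[i, j] = round((1.0 - nearest / max_range_m) * (levels - 1))
    return intensities

# Example: a synthetic depth map with an obstacle in the upper-left region.
depth = np.full((64, 64), 5.0)
depth[:16, :16] = 0.8          # obstacle ~0.8 m away
print(depth_to_vibration(depth))
```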


Proceedings of SPIE | 2013

A collaborative smartphone sensing platform for detecting and tracking hostile drones

Sanjay K. Boddhu; Matt McCartney; Oliver Ceccopieri; Robert L. Williams

In recent years, not only the United States Armed Services but other law-enforcement agencies have shown increasing interest in employing drones for various surveillance and reconnaissance purposes. Further, recent advancements in autonomous drone control and navigation technology have tremendously increased the geographic extent of drone-based missions beyond conventional line-of-sight coverage. Without any sophisticated requirement on data links to control them remotely (human-in-the-loop), drones are proving to be a reliable and effective means of securing personnel and soldiers operating in hostile environments. However, this autonomous breed of drones can prove to be a significant threat when acquired by antisocial groups who wish to target property and life in urban settlements. To further escalate the issue, standard detection techniques like RADARs and RF data-link signature scanners prove futile: the drones are small enough to evade detection by RADAR-based systems in urban environments and, being autonomous, can operate without a traceable active data link (RF). Hence, towards investigating practical solutions to the issue, the research team at AFRL's Tec^Edge Labs under the SATE and YATE programs has developed a highly scalable, geographically distributable and easily deployable smartphone-based collaborative platform that can aid in detecting and tracking unidentified hostile drones. In its current state, this collaborative platform, built on the "Human-as-Sensors" paradigm, consists primarily of an intelligent smartphone application that leverages appropriate sensors on the device to capture a drone's attributes (flight direction, orientation, shape, color, etc.), with real-time collaboration capabilities through a highly composable sensor cloud, and an intelligent processing module (based on a probabilistic model) that can estimate and predict the possible flight path of a hostile drone from multiple geographically distributed observation data points. The platform has been field tested and proven effective in providing a real-time alerting mechanism for personnel in the field to avert or subdue the potential damage caused by detected hostile drones.
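
As an illustration of flight-path prediction from distributed sightings, here is a minimal sketch that fits a constant-velocity model to timestamped observations via least squares and extrapolates it forward. The paper's actual probabilistic model is not specified here; this stand-in and all names are assumptions.

```python
import numpy as np

def predict_position(times, positions, t_future):
    """times: (n,); positions: (n, 2) observed coordinates.
    Returns the extrapolated position at t_future."""
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    A = np.column_stack([t, np.ones_like(t)])       # model: p(t) = v * t + p0
    coeffs, *_ = np.linalg.lstsq(A, p, rcond=None)  # coeffs[0] = v, coeffs[1] = p0
    return coeffs[0] * t_future + coeffs[1]

# Three observers report the same drone at different times and places.
times = [0.0, 10.0, 20.0]
sightings = [(0.0, 0.0), (50.0, 5.0), (100.0, 12.0)]
print(predict_position(times, sightings, t_future=30.0))
```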


International Journal of Intelligent Computing and Cybernetics | 2010

Evolving neuromorphic flight control for a flapping‐wing mechanical insect

Sanjay K. Boddhu; John C. Gallagher

Purpose – The purpose of this paper is to present an approach that employs evolvable hardware concepts to construct flapping-wing mechanism controllers for micro robots, with the evolved, dynamically complex controllers embedded in a physically realizable, micro-scale reconfigurable substrate.

Design/methodology/approach – A continuous time recurrent neural network (CTRNN)-evolvable hardware framework (a neuromorphic variant of evolvable hardware) and its methodologies are employed in designing the evolution experiments. The CTRNN is selected as the neuromorphic reconfigurable substrate, with the efficient Minipop evolutionary algorithm configured to drive the evolution process. The reconfigurable CTRNN substrate is preferred for this study because of its universal dynamics approximation capabilities and the prospect of realizing it in small-area, low-power chips, properties which are a basic requirement for flapping-wing based micro robots.
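
For reference, the standard CTRNN dynamics underlying this kind of substrate are tau_i * dy_i/dt = -y_i + sum_j w_ji * sigma(y_j + theta_j) + I_i; the sketch below integrates them with forward Euler. Network size and parameter values are illustrative, and in the paper's setting the weights would come from the evolutionary algorithm, not random initialization.

```python
import numpy as np

def sigma(x):
    return 1.0 / (1.0 + np.exp(-x))    # logistic activation

def ctrnn_step(y, w, theta, tau, inputs, dt=0.01):
    """One forward-Euler step of the CTRNN state vector y."""
    dydt = (-y + w.T @ sigma(y + theta) + inputs) / tau
    return y + dt * dydt

rng = np.random.default_rng(0)
n = 4                                   # a small 4-neuron network
y = np.zeros(n)
w = rng.normal(0.0, 2.0, size=(n, n))   # weights would be evolved, not random
theta = rng.normal(0.0, 1.0, size=n)
tau = np.full(n, 0.5)
for _ in range(1000):
    y = ctrnn_step(y, w, theta, tau, inputs=np.zeros(n))
print(sigma(y + theta))                 # neuron outputs after settling
```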


Proceedings of SPIE | 2013

Context-aware event detection smartphone application for first responders

Sanjay K. Boddhu; Rakesh Dave; Matt McCartney; James West; Robert L. Williams

The rise of social networking platforms like Twitter, Facebook, etc., has provided seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding events happening in their immediate vicinity in real time. The human-centric data generated in this "human-as-sensor" approach is tremendously valuable, as it is mostly delivered with apt annotations and ground truth that would be missing in traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). When appropriately employed, this real-time data can support detection of localized events like fires, accidents and shootings as they unfold, and pinpoint the individuals affected by those events. This spatio-temporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at AFRL Tec^Edge Discovery Labs has demonstrated the feasibility of developing smartphone applications that provide an augmented-reality view of detected events in a given geographical location (localized) and also provide an event-search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that deals with challenges like identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.
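
To make the aggregation step concrete, here is a minimal sketch that greedily groups reports falling within a distance and time window of one another. The thresholds, record fields and greedy strategy are assumptions for illustration, not the framework's actual pipeline.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Report:
    lat: float
    lon: float
    t: float        # seconds since epoch
    text: str

def haversine_km(a: Report, b: Report) -> float:
    """Great-circle distance between two reports in kilometers."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def group_events(reports, max_km=0.5, max_dt=900.0):
    """Return lists of reports that plausibly describe the same event."""
    events = []
    for r in sorted(reports, key=lambda r: r.t):
        for ev in events:
            if abs(r.t - ev[-1].t) <= max_dt and haversine_km(r, ev[-1]) <= max_km:
                ev.append(r)
                break
        else:
            events.append([r])
    return events

reports = [Report(39.760, -84.190, 0.0, "smoke"),
           Report(39.761, -84.191, 120.0, "fire on 3rd st"),
           Report(40.000, -84.000, 60.0, "crash")]
print([len(ev) for ev in group_events(reports)])   # -> [2, 1]
```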


world congress on computational intelligence | 2008

Evolved neuromorphic flight control for a flapping-wing mechanical insect model

Sanjay K. Boddhu; John C. Gallagher

This paper examines the feasibility of evolving analog neuromorphic devices to control flight in a realistic flapping-wing mechanical insect model. It summarizes relevant prior results in controlling a legged robot and explains why those results are relevant to the problem of winged flight. It then presents the outcomes of experiments to evolve flight controllers and discusses the implications of those results and possible future work.


genetic and evolutionary computation conference | 2005

Evolving analog controllers for correcting thermoacoustic instability in real hardware

Saranyan Vigraham; John C. Gallagher; Sanjay K. Boddhu

Previous research demonstrated that Evolvable Hardware (EH) techniques can be employed to suppress Thermoacoustic (TA) instability in a computer simulated combustion chamber. Though that work established basic feasibility, there were still significant questions concerning whether those techniques would function in the real world. This paper presents the results of the next incremental step between controlling in pure simulation and controlling a real combustion chamber. In it, we will examine issues involved with using EH methods to learn to control a hardware analog circuit model of a combustion chamber. In so doing, we establish that the basic methods work when interfaced to real hardware and uncover some interesting, potentially critical, differences between simulation and real environments. We will also establish that both the EA methods and the underlying reconfigurable hardware can be expected to learn effectively in noisy control environments and that they are well-suited for upcoming use in a live engine.
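
As a sketch of the evolutionary loop such EH systems run against a plant, here is a minimal (1+1)-style mutate-evaluate-select cycle, with a stub standing in for the hardware fitness measurement. The function names and the toy objective are hypothetical, not the paper's method.

```python
import random

def evaluate_on_hardware(params):
    """Stub fitness: in a real system this would drive the analog circuit
    model and measure residual oscillation. Higher is better here."""
    return -sum((p - 0.3) ** 2 for p in params)   # toy surrogate objective

def evolve(n_params=8, generations=200, step=0.05, seed=1):
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(n_params)]
    parent_fit = evaluate_on_hardware(parent)
    for _ in range(generations):
        child = [p + rng.gauss(0, step) for p in parent]
        child_fit = evaluate_on_hardware(child)
        if child_fit >= parent_fit:               # >= allows drift on plateaus
            parent, parent_fit = child, child_fit
    return parent, parent_fit

best, fit = evolve()
print(round(fit, 4))
```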


congress on evolutionary computation | 2005

A reconfigurable continuous time recurrent neural network for evolvable hardware applications

John C. Gallagher; Sanjay K. Boddhu; Saranyan Vigraham

Evolvable hardware is reconfigurable hardware plus an evolutionary algorithm. Continuous time recurrent neural networks (CTRNNs) have been proposed for use as the reconfigurable hardware component. Until recently, however, nearly all CTRNN-based EH was simulation-based. This poster details a design for a reconfigurable analog CTRNN computer that supports both extrinsic and intrinsic CTRNN evolvable hardware.
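
A small sketch of the extrinsic/intrinsic distinction: the same selection loop can score candidate configurations either in software simulation (extrinsic) or by configuring and measuring the physical device (intrinsic). All class names and stub bodies below are illustrative assumptions.

```python
import random

class ExtrinsicEvaluator:
    """Extrinsic EH: fitness comes from a software model of the CTRNN."""
    def fitness(self, genome):
        # Stand-in for a full CTRNN simulation of this parameter vector.
        return -sum(g * g for g in genome)

class IntrinsicEvaluator:
    """Intrinsic EH: fitness comes from configuring and measuring hardware."""
    def fitness(self, genome):
        # A real implementation would write `genome` to the analog chip and
        # measure its response; here we fake a noisy measurement instead.
        return -sum(g * g for g in genome) + random.gauss(0, 0.01)

def best_of(evaluator, candidates):
    return max(candidates, key=evaluator.fitness)

pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(10)]
print(best_of(ExtrinsicEvaluator(), pop))
```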


analysis, design, and evaluation of human-machine systems | 2013

Towards building a generic sensor cloud for human-centric sensing applications

Sanjay K. Boddhu; Ed Wasser; Shiva Satya Bhupathiraju; Matt McCartney; James West; Robert L. Williams

The advances in smart-device technology and their profuse availability have made the prospect of human-centric sensing and computing paradigms a viable reality. Various operational intelligent systems already exist in domains like defense, healthcare, energy and disaster management that have been developed by employing human-centric sensing as their backbone. But to support building more complex or novel human-centric systems that have to integrate with existing sensors/devices and possible future sensors, there exist practical issues like accommodating disparate data formats, modalities and connectivity interfaces. These low-level issues make integrating different sensing devices and fusing sensed data a challenging and time-consuming process, delaying the high-level work that can be targeted towards solving real-world problems. In this paper, a generic architecture is presented that has been developed to solve the mentioned challenges and to support seamless integration and development of human-centric sensing devices and platforms.
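
To illustrate the integration problem the architecture targets, here is a minimal sketch in which per-device adapters normalize disparate payloads into one common record for downstream fusion. The record fields, adapter registry and message formats are assumptions, not the paper's design.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class SensorRecord:
    source: str
    timestamp: float
    lat: float
    lon: float
    payload: Dict[str, Any]

ADAPTERS: Dict[str, Callable[[dict], SensorRecord]] = {}

def adapter(kind: str):
    """Register a normalizer for one device/message format."""
    def register(fn):
        ADAPTERS[kind] = fn
        return fn
    return register

@adapter("smartphone_v1")
def from_smartphone(msg: dict) -> SensorRecord:
    return SensorRecord("smartphone", msg["ts"], msg["loc"]["lat"],
                        msg["loc"]["lon"], {"accel": msg["accel"]})

@adapter("weather_station")
def from_station(msg: dict) -> SensorRecord:
    return SensorRecord("station", msg["time_utc"], msg["latitude"],
                        msg["longitude"], {"temp_c": msg["temperature"]})

def normalize(kind: str, msg: dict) -> SensorRecord:
    return ADAPTERS[kind](msg)

print(normalize("smartphone_v1",
                {"ts": 1.0, "loc": {"lat": 39.8, "lon": -84.1},
                 "accel": [0.0, 0.0, 9.8]}))
```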


analysis, design, and evaluation of human-machine systems | 2013

Augmenting Situational Awareness for First responders using Social media as a sensor

Rakesh Dave; Sanjay K. Boddhu; Matt McCartney; James West

First responders to an emergency situation rely on ground truths measured by various sensing mechanisms for effective decision making. These sensors are typically airborne or ground based. Seamless sharing of information among users of social networking platforms provides a unique type of sensor. This human-as-a-sensor is already deployed in the field and only requires harvesting of the information to glean ground truth. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding events happening in their immediate vicinity in real time. The information provided by these sensors is already annotated with descriptions such as "urgency" or "critically wounded" which normally would not be found in traditional machine-based sensors. When appropriately employed, this real-time data can support detection of localized events like fires, accidents and shootings as they unfold, and pinpoint the individuals affected by those events. The spatio-temporal information can be indexed, grouped and deployed on smartphones and other devices that first responders can use in the field to augment decision making. In this vein, under the SATE and YATE programs, the research team at AFRL Tec^Edge Discovery Labs has demonstrated the feasibility of developing smartphone applications that provide an augmented-reality view of detected events in a given geographical location (localized) and also provide an event-search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that deals with challenges like identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.
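
As an illustration of the indexing and grouping step, here is a minimal sketch that buckets reports into coarse latitude/longitude grid cells and time slots so nearby, recent reports can be retrieved together. Cell and slot sizes are illustrative choices, not values from the paper.

```python
from collections import defaultdict

CELL_DEG = 0.01        # ~1 km grid cells at mid-latitudes
SLOT_SEC = 600         # 10-minute time slots

def key(lat, lon, t):
    return (round(lat / CELL_DEG), round(lon / CELL_DEG), int(t // SLOT_SEC))

index = defaultdict(list)

def add_report(lat, lon, t, text):
    index[key(lat, lon, t)].append(text)

def query(lat, lon, t):
    """Return reports in the same and adjacent cells/slots."""
    k0 = key(lat, lon, t)
    hits = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            for dk in (-1, 0, 1):
                hits.extend(index[(k0[0] + di, k0[1] + dj, k0[2] + dk)])
    return hits

add_report(39.7589, -84.1916, 1000.0, "smoke near 3rd street")
print(query(39.7592, -84.1920, 1200.0))   # finds the nearby, recent report
```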


2009 IEEE Workshop on Evolvable and Adaptive Hardware | 2009

Evolving non-autonomous neuromorphic flight control for a flapping-wing mechanical insect

Sanjay K. Boddhu; John C. Gallagher

In previous work, it was demonstrated that one can effectively employ the CTRNN-EH methodology (a neuromorphic variant of the EH method) to evolve autonomous neuromorphic flight controllers for a flapping-wing robot. This paper further explores that prospect by evolving non-autonomous neuromorphic controllers, which would render them potentially deployable in a realistic micro-scale flapping-wing robot. A summary of the previous work on evolving autonomous flight controllers for a flapping-wing mechanical insect is provided, along with the incentives for evolving non-autonomous controllers for the same platform. Further, the details of the experimental design and the outcomes of experiments to evolve non-autonomous flight controllers are provided, with appropriate discussion and implications for possible future work.

Collaboration


Dive into Sanjay K. Boddhu's collaboration.

Top Co-Authors

Monica Sam (Wright State University)
Robert L. Williams (Air Force Research Laboratory)
Matt McCartney (Air Force Research Laboratory)
James West (Wright State University)