Publication


Featured research published by Anh Nguyen.


International Conference on Embedded Networked Sensor Systems | 2016

A Lightweight and Inexpensive In-ear Sensing System For Automatic Whole-night Sleep Stage Monitoring

Anh Nguyen; Raghda Alqurashi; Zohreh Raghebi; Farnoush Banaei-Kashani; Ann C. Halbower; Tam Vu

This paper introduces LIBS, a lightweight and inexpensive wearable sensing system that can capture the electrical activity of the human brain, eyes, and facial muscles with two pairs of custom-built flexible electrodes, each embedded in an off-the-shelf foam earplug. We also propose a supervised non-negative matrix factorization algorithm to adaptively analyze and extract these bioelectrical signals from the single mixed in-ear channel collected by the sensor. While LIBS can enable a wide class of low-cost self-care, human-computer interaction, and health monitoring applications, we demonstrate its medical potential by developing an autonomous whole-night sleep staging system utilizing LIBS's outputs. We constructed a hardware prototype from off-the-shelf electronic components and used it to conduct 38 hours of sleep studies on 8 participants over a period of 30 days. Our evaluation results show that LIBS can monitor biosignals representing brain activities, eye movements, and muscle contractions with excellent fidelity, such that it can be used for sleep stage classification with an average accuracy of more than 95%.
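The core signal-processing step described above, recovering EEG-, EOG-, and EMG-like components from one mixed in-ear channel with supervised non-negative matrix factorization, can be illustrated with a minimal sketch. This is not the LIBS implementation; the function and variable names are hypothetical, and it assumes per-source spectral bases were already learned from ground-truth recordings.

```python
# Minimal sketch of supervised NMF source separation (illustrative only, not
# the LIBS implementation). W_eeg, W_eog, W_emg are spectral basis matrices
# assumed to have been learned beforehand from ground-truth recordings.
import numpy as np

def separate_sources(V, W_eeg, W_eog, W_emg, n_iter=200, eps=1e-9):
    """V: magnitude spectrogram (freq x time) of the single mixed in-ear channel."""
    W = np.hstack([W_eeg, W_eog, W_emg])               # fixed, pre-trained bases
    H = np.abs(np.random.rand(W.shape[1], V.shape[1])) # activations to estimate
    for _ in range(n_iter):                            # multiplicative updates on H only
        H *= (W.T @ V) / (W.T @ (W @ H) + eps)
    n1, n2 = W_eeg.shape[1], W_eog.shape[1]
    groups = [slice(0, n1), slice(n1, n1 + n2), slice(n1 + n2, W.shape[1])]
    mix_est = W @ H + eps
    # Wiener-style masks split each time-frequency bin into EEG-, EOG-, EMG-like parts
    return [(W[:, g] @ H[g, :]) / mix_est * V for g in groups]
```

Keeping the bases fixed and updating only the activations is what makes the factorization "supervised": the single mixture is explained in terms of source-specific dictionaries.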


Proceedings of the 2nd Workshop on Micro Aerial Vehicle Networks, Systems, and Applications for Civilian Use | 2016

Investigating Cost-effective RF-based Detection of Drones

Phuc Nguyen; Mahesh Ravindranatha; Anh Nguyen; Richard Han; Tam Vu

Beyond their benign uses, civilian drones have increasingly been used in problematic ways that have stirred concern among the public and authorities. While many anti-drone systems have been proposed to take them down, such systems often rely on the fundamental assumption that the presence of the drone has already been detected and is known to the defender. However, there is a lack of automated, cost-effective drone detection systems. In this paper, we investigate a drone detection system designed to autonomously detect and characterize drones using radio frequency wireless signals. In particular, two technical approaches are proposed. The first approach is active tracking, where the system sends a radio signal and then listens for its reflected component. The second approach is passive listening, where the system receives, extracts, and then analyzes observed wireless signals. We perform a set of preliminary experiments to explore the feasibility of the approaches using WARP and USRP software-defined radio platforms. Our preliminary results illustrate the feasibility of the proposed system and identify challenges for future research.
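As a rough illustration of the passive-listening approach (receive and analyze observed wireless signals), the sketch below flags unusual RF energy in a band of interest from captured IQ samples. It is a toy energy detector under assumed parameters, not the WARP/USRP pipeline evaluated in the paper.

```python
# Toy energy detector over captured IQ samples (illustrative only; parameters
# and names are assumptions, not the paper's implementation).
import numpy as np
from scipy.signal import welch

def band_energy_detect(iq, fs, f_lo, f_hi, threshold_db):
    """iq: complex baseband samples from the SDR; fs: sample rate in Hz."""
    f, pxx = welch(iq, fs=fs, nperseg=4096, return_onesided=False)
    in_band = (f >= f_lo) & (f <= f_hi)
    band_power_db = 10 * np.log10(np.mean(pxx[in_band]) + 1e-15)
    # Flag a candidate transmitter when band power exceeds the noise-floor threshold
    return band_power_db > threshold_db, band_power_db
```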


International Conference on Mobile Systems, Applications, and Services | 2017

Matthan: Drone Presence Detection by Identifying Physical Signatures in the Drone's RF Communication

Phuc Nguyen; Hoang Truong; Mahesh Ravindranathan; Anh Nguyen; Richard Han; Tam Vu

Drones are increasingly flying in sensitive airspace where their presence may cause harm, such as near airports, forest fires, large crowded events, secure buildings, and even jails. This problem is likely to grow given the rapid proliferation of drones for commerce, monitoring, recreation, and other applications. A cost-effective detection system is needed to warn of the presence of drones in such cases. In this paper, we explore the feasibility of inexpensive RF-based detection of the presence of drones. We examine whether physical characteristics of the drone, such as body vibration and body shifting, can be detected in the wireless signal transmitted by drones during communication. We consider whether the received drone signals are uniquely differentiated from other mobile wireless phenomena such as cars equipped with Wi-Fi or humans carrying a mobile phone. The sensitivity of detection at distances of hundreds of meters, as well as the accuracy of the overall detection system, are evaluated using a software-defined radio (SDR) implementation.
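A minimal sketch of the underlying idea, looking for a periodic body-vibration signature in the received signal strength of a transmitter, is shown below. The frequency band and scoring rule are placeholder assumptions, not Matthan's actual detector.

```python
# Toy vibration-signature score on a received-signal-strength series
# (illustrative only; band and scoring are assumptions).
import numpy as np

def vibration_score(rssi, fs, band=(50.0, 200.0)):
    """rssi: uniformly sampled RSSI series; fs: sample rate in Hz."""
    x = (rssi - np.mean(rssi)) * np.hanning(len(rssi))   # detrend and window
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Fraction of spectral energy falling in the candidate vibration band
    return spectrum[in_band].sum() / (spectrum.sum() + 1e-12)
```

A high score suggests a periodic modulation consistent with a vibrating body, which a car-mounted Wi-Fi radio or a carried phone would not normally produce.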


Proceedings of the 2016 Workshop on Wearable Systems and Applications | 2016

In-ear Biosignal Recording System: A Wearable For Automatic Whole-night Sleep Staging

Anh Nguyen; Raghda Alqurashi; Zohreh Raghebi; Farnoush Banaei-Kashani; Ann C. Halbower; Thang N. Dinh; Tam Vu

In this work, we present a low-cost and lightweight wearable sensing system that can monitor bioelectrical signals generated by electrically active tissues across the brain, the eyes, and the facial muscles from inside the human ears. Our work addresses two key aspects of the sensing: the construction of the electrodes and the extraction of the biosignals using a supervised non-negative matrix factorization learning algorithm. To illustrate the usefulness of the system, we developed an autonomous sleep staging system using the output of our proposed in-ear sensing system. We prototyped the device and evaluated its sleep stage classification performance on 8 participants over a period of 1 month. With 94% accuracy on average, the evaluation results show that our wearable sensing system is a promising way to monitor brain, eye, and facial muscle signals with reasonable fidelity from the human ear canals.
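For the downstream sleep-staging step, a common formulation is to compute band-power features per 30-second epoch of the extracted biosignal and feed them to an off-the-shelf classifier. The sketch below illustrates that generic formulation only; the paper's actual features and classifier may differ.

```python
# Generic epoch-based sleep stager sketch (illustrative; not the paper's model).
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def epoch_features(epoch, fs):
    """Band powers of one 30-second epoch of the extracted biosignal."""
    f, pxx = welch(epoch, fs=fs, nperseg=int(4 * fs))
    return [np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
            for lo, hi in BANDS.values()]

def train_stager(epochs, labels, fs):
    """epochs: list of 1-D arrays; labels: expert-scored sleep stages."""
    X = np.array([epoch_features(e, fs) for e in epochs])
    return RandomForestClassifier(n_estimators=200).fit(X, labels)
```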


Proceedings of the 9th ACM Workshop on Wireless of the Students, by the Students, and for the Students | 2017

Demo: Low-power Capacitive Sensing Wristband for Hand Gesture Recognition

Hoang Truong; Phuc Nguyen; Nam Bui; Anh Nguyen; Tam Vu

Along with the rapid development of human-computer interaction (HCI), controlling objects and smart devices remotely has become a trend driven by consumer demand. Hand gesture recognition is among the methods attracting the most attention in this field, yet many aspects remain to be explored and improved. We propose a low-cost, low-power, wristband-form hand gesture recognition system based on capacitive sensing. We provide an open-source system that includes low-power, low-cost hardware components and a user-friendly software stack. This system will be available for users and developers to customize various hand gesture sets and to integrate into third-party applications, from remote computer commands to video game controllers.
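To illustrate the recognition side, a minimal sketch of classifying gestures from a handful of capacitive channel readings is shown below. The channel layout and nearest-centroid classifier are assumptions for illustration, not the demo's actual firmware or software stack.

```python
# Nearest-centroid gesture classifier over per-channel capacitance readings
# (illustrative sketch; names and method are assumptions).
import numpy as np

class GestureClassifier:
    def __init__(self):
        self.centroids = {}          # gesture name -> mean feature vector

    def fit(self, samples):
        """samples: dict of gesture name -> array of shape (n_examples, n_channels)."""
        self.centroids = {g: x.mean(axis=0) for g, x in samples.items()}

    def predict(self, reading):
        """reading: 1-D array of per-channel capacitance values."""
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(reading - self.centroids[g]))
```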


European Symposium on Research in Computer Security | 2016

Android Permission Recommendation Using Transitive Bayesian Inference Model

Bahman Rashidi; Carol J. Fung; Anh Nguyen; Tam Vu

In the current Android architecture, users have to decide whether an app is safe to use or not. Tech-savvy users can make correct decisions to avoid unnecessary privacy breaches. However, most users may have difficulty making correct decisions. DroidNet is an Android permission recommendation framework based on crowdsourcing. In this framework, DroidNet runs new apps under a probation mode without granting their permission requests up-front. It provides recommendations on whether to accept or reject the permission requests based on decisions from peer expert users. To identify expert users, we propose an expertise rating algorithm using a transitional Bayesian inference model. The recommendation is based on the aggregated expert responses and their confidence level. Our evaluation results demonstrate that, given a sufficient number of experts in the network, DroidNet can provide accurate recommendations and cover the majority of app requests starting from only a small set of initial experts.
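The recommendation step can be illustrated by a toy aggregation of peer experts' accept/reject votes, weighted by their expertise scores and gated by a confidence threshold. This is only a simplification of the idea; DroidNet's transitional Bayesian inference model is more involved, and the names below are hypothetical.

```python
# Toy expertise-weighted vote aggregation for one permission request
# (illustrative only; not DroidNet's actual model).
def recommend(votes, expertise, confidence_threshold=0.8):
    """votes: dict user_id -> True (accept) / False (reject);
       expertise: dict user_id -> expertise weight in [0, 1]."""
    total = sum(expertise[u] for u in votes)
    if total == 0:
        return None                       # no qualified experts: keep app in probation
    accept = sum(expertise[u] for u, v in votes.items() if v)
    confidence = max(accept, total - accept) / total
    if confidence < confidence_threshold:
        return None                       # not confident enough to recommend either way
    return accept >= total - accept       # True = accept, False = reject
```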


Construction Research Congress 2016 | 2016

Sensing Occupant Comfort Using Wearable Technologies

Moatassem Abdallah; Caroline M. Clevenger; Tam Vu; Anh Nguyen

Thermal comfort of building occupants is a major criterion in evaluating the performance of building systems. It is also a dominant factor in designing and optimizing a building's operation. However, existing thermal comfort models, such as Fanger's model currently adopted by ASHRAE Standard 55, rely on factors that require bulky and expensive equipment to measure. This paper takes a radically different approach toward measuring the thermal comfort of building occupants by leveraging the ever-increasing capacity and capability of mobile and wearable devices. Today's commercial off-the-shelf (COTS) wearable devices can unobtrusively capture a number of important parameters that may be used to measure the thermal comfort of building occupants, including ambient air temperature, relative humidity, skin temperature, perspiration rate, and heart rate. This research evaluates such opportunities by fusing traditional environmental sensing data streams with newly available wearable sensing information. Furthermore, it identifies challenges in using existing wearable devices and in developing new models to predict human thermal comfort. Findings from this exploratory study identify the inaccuracy of sensors in cellphones and wearables as a challenge, yet one which can be addressed using customized wearables. The study also suggests there is high potential for developing new models that predict human thermal sensation using artificial neural networks and additional factors that can be individually, unobtrusively, and dynamically measured using wearables.
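As a rough illustration of the modeling direction suggested above, the sketch below fits a small neural network that maps wearable-measurable features to a thermal sensation vote. The feature list and the 7-point sensation target are assumptions for illustration, not the study's actual model.

```python
# Small neural-network thermal sensation model (illustrative sketch; feature
# names and scale are assumptions).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

FEATURES = ["air_temp_c", "rel_humidity", "skin_temp_c", "perspiration_rate", "heart_rate_bpm"]

def train_comfort_model(X, y):
    """X: (n_samples, len(FEATURES)); y: thermal sensation votes, e.g. -3..+3."""
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000))
    return model.fit(X, y)
```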


International Conference on Mobile Systems, Applications, and Services | 2018

TYTH-Typing On Your Teeth: Tongue-Teeth Localization for Human-Computer Interface

Phuc Nguyen; Nam Bui; Anh Nguyen; Hoang Truong; Abhijit Suresh; Matt Whitlock; Duy Pham; Thang N. Dinh; Tam Vu

This paper explores a new wearable system, called TYTH, that enables a novel form of human-computer interaction based on the relative location of, and interaction between, the user's tongue and teeth. TYTH allows its user to interact with a computing system by tapping on their teeth. This form of interaction is analogous to using a finger to type on a keypad, except that the tongue substitutes for the finger and the teeth for the keyboard. We study the neurological and anatomical structures of the tongue to design TYTH so that the obtrusiveness and social awkwardness caused by the wearable are minimized while its accuracy and sensing sensitivity are maximized. From behind the user's ears, TYTH senses the brain and muscle signals that control tongue movement and captures the miniature skin surface deformation caused by tongue movement. We model the relationship between tongue movement and the recorded signals, from which a tongue localization technique and a tongue-teeth tapping detection technique are derived. Through a prototype implementation and an evaluation with 15 subjects, we show that TYTH can be used as a form of hands-free human-computer interaction with an 88.61% detection rate and a promising adoption rate by users.
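A minimal sketch of the event-detection step, treating a tongue-teeth tap as a short energy burst in the behind-the-ear channels before handing the window to a location classifier, is given below. The window size and threshold are placeholder assumptions rather than TYTH's parameters.

```python
# Toy burst detector over multi-channel behind-the-ear recordings
# (illustrative only; parameters are assumptions).
import numpy as np

def detect_taps(signal, fs, win_s=0.2, threshold=None):
    """signal: (n_channels, n_samples) filtered behind-the-ear recordings."""
    win = int(win_s * fs)
    energy = np.array([np.sum(signal[:, i:i + win] ** 2)
                       for i in range(0, signal.shape[1] - win, win)])
    if threshold is None:
        threshold = energy.mean() + 3 * energy.std()   # simple adaptive threshold
    # Return sample offsets of windows whose energy exceeds the threshold
    return [i * win for i, e in enumerate(energy) if e > threshold]
```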


IEEE Transactions on Information Forensics and Security | 2018

Android User Privacy Preserving Through Crowdsourcing

Bahman Rashidi; Carol J. Fung; Anh Nguyen; Tam Vu; Elisa Bertino

In the current Android architecture, users have to decide whether an app is safe to use or not. Expert users can make savvy decisions to avoid unnecessary privacy breaches. However, the majority of ordinary users are not technically capable of, or do not care about, weighing privacy implications to make safe decisions. To assist this less technical crowd, we propose DroidNet, an Android permission control framework based on crowdsourcing. At its core, DroidNet runs new apps under a probation mode without granting their permission requests up-front. It provides recommendations on whether to accept or reject the permission requests based on decisions from peer expert users. To identify expert users, we propose an expertise ranking algorithm using a transitional Bayesian inference model. The recommendation is based on the aggregated expert responses and their confidence level. Our simulation and real-user experimental results demonstrate that DroidNet provides accurate recommendations and covers the majority of app requests starting from only a small set of initial experts.


International Conference on Embedded Networked Sensor Systems | 2017

PhO2: Smartphone based Blood Oxygen Level Measurement Systems using Near-IR and RED Wave-guided Light

Nam Bui; Anh Nguyen; Phuc Nguyen; Hoang Truong; Ashwin Ashok; Thang N. Dinh; Robin R. Deterding; Tam Vu

Accurately measuring and monitoring patients' blood oxygen levels plays a critical role in today's clinical diagnosis and healthcare practices. Existing techniques, however, either require dedicated hardware or produce inaccurate measurements. To fill this gap, we propose a phone-based oxygen level estimation system, called PhO2, using the camera and flashlight functions that are readily available on today's off-the-shelf smartphones. Since the phone's camera and flashlight are not made for this purpose, utilizing them for oxygen level estimation poses many challenges. We introduce a cost-effective add-on together with a set of algorithms for spatial and spectral optical signal modulation to amplify the optical signal of interest while minimizing noise. A light-based pressure detection algorithm and a feedback mechanism are also proposed to mitigate the negative impact of users' behavior during the measurement. We also derive a non-linear referencing model that allows PhO2 to estimate the oxygen level from the color intensity ratios produced by the smartphone's camera. An evaluation using a custom-built optical element on a COTS smartphone with 6 subjects shows that PhO2 can estimate oxygen saturation within a 3.5% error rate compared to FDA-approved gold-standard pulse oximetry. A user study to gauge the reception of PhO2 shows that users are comfortable self-operating the device and willing to carry it when going out.
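The estimation rests on the classic pulse-oximetry idea of a "ratio of ratios" between the pulsatile and steady components of two wavelengths, mapped to SpO2 through a calibration curve. The sketch below shows that generic computation with placeholder coefficients; PhO2 derives its own non-linear referencing model rather than this simple linear map.

```python
# Generic ratio-of-ratios SpO2 sketch (illustrative; coefficients are
# placeholders, not PhO2's calibrated model).
import numpy as np

def ratio_of_ratios(red, ir):
    """red, ir: time series of mean pixel intensity for each wavelength."""
    ac = lambda x: np.percentile(x, 95) - np.percentile(x, 5)   # pulsatile component
    dc = lambda x: np.mean(x)                                    # steady component
    return (ac(red) / dc(red)) / (ac(ir) / dc(ir))

def spo2_estimate(red, ir, coeffs=(-16.0, 110.0)):
    # Placeholder linear calibration SpO2 ~ a*R + b; a real device fits this
    # curve (often non-linearly) against a reference oximeter.
    a, b = coeffs
    return a * ratio_of_ratios(red, ir) + b
```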

Collaboration


Dive into Anh Nguyen's collaborations.

Top Co-Authors

Tam Vu, University of Colorado Boulder
Phuc Nguyen, University of Colorado Boulder
Hoang Truong, University of Colorado Boulder
Ann C. Halbower, University of Colorado Denver
Nam Bui, University of Colorado Boulder
Raghda Alqurashi, University of Colorado Boulder
Thang N. Dinh, Virginia Commonwealth University
Zohreh Raghebi, University of Colorado Denver
Ashwin Ashok, Georgia State University