Lifford McLauchlan
Texas A&M University
Publications
Featured research published by Lifford McLauchlan.
Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing | 2015
Soumya Saha; Lifford McLauchlan
Topology control is critical to extending the lifetime of energy-constrained Wireless Sensor Networks (WSNs). Topology control mechanisms can be divided into two processes: topology construction and topology maintenance. During topology construction, one creates a reduced topology that preserves network connectivity and coverage. In topology maintenance, one recreates or changes the reduced topology when the network is no longer optimal. In this research the authors concentrate on the Minimum Spanning Tree (MST) problem, which commonly arises in the design of topology construction protocols for WSNs. Because running time and the number of messages successfully delivered are important metrics for distributed algorithms, much research has focused on simple, local, energy-efficient algorithms that create suboptimal MSTs for WSNs. In this research, two popular approaches to creating a spanning tree in WSNs are discussed: the Random Nearest Neighbor Tree (Random NNT) and the Euclidean Minimum Spanning Tree (Euclidean MST). Next, the authors propose a method whose goals are to balance the network load evenly among all of the nodes and to increase the number of successful message deliveries to the sink. Finally, the three algorithms are compared in the MATLAB environment. Simulation results demonstrate significant improvement in both load balancing and the number of message deliveries after implementation of the proposed algorithm.
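The paper's MATLAB code and its proposed load-balancing method are not reproduced here; as a minimal sketch of the two baseline constructions it compares, the following Python functions (names and the max-degree load proxy are ours) build a Euclidean MST via Prim's algorithm and a Random NNT by attaching each node, in random order, to its nearest already-connected neighbor:

```python
import math
import random

def euclidean_mst(points):
    """Prim's algorithm over the complete Euclidean graph; returns an edge list."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n      # cheapest connection cost to the growing tree
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] != -1:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
                    parent[v] = u
    return edges

def random_nnt(points, seed=0):
    """Random NNT: visit nodes in random order; each new node attaches
    to its nearest already-connected node."""
    order = list(range(len(points)))
    random.Random(seed).shuffle(order)
    connected = [order[0]]
    edges = []
    for u in order[1:]:
        v = min(connected, key=lambda w: math.dist(points[u], points[w]))
        edges.append((v, u))
        connected.append(u)
    return edges

def max_degree(edges, n):
    """Maximum node degree, a simple proxy for load imbalance."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return max(deg)
```

Both constructions yield a spanning tree (n − 1 edges), and the MST's total edge length is by definition no greater than the NNT's, which is the trade-off the abstract alludes to: NNT is simpler and more local, but suboptimal.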
Proceedings of SPIE | 2012
Mehrube Mehrubeoglu; Evan Ortlieb; Lifford McLauchlan; Linh Pham
A real-time iris detection and tracking algorithm has been implemented on a smart camera using LabVIEW graphical programming tools. The program detects the eye and finds the center of the iris, which is recorded and stored in Cartesian coordinates. In subsequent video frames, the location of the center of the iris corresponding to the previously detected eye is computed and recorded for a desired period of time, creating a list of coordinates representing the moving iris center location across image frames. We present an application for the developed smart camera iris tracking system that involves the assessment of reading patterns. The purpose of the study is to identify differences in the reading patterns of readers at various levels, to eventually determine successful reading strategies for improvement. The readers are positioned in front of a computer screen with a fixed camera directed at the reader's eyes. The readers are then asked to read preselected content on the computer screen: one passage formatted as traditional newspaper text and one as a Web page. The iris path is captured and stored in real time. The reading patterns are examined by analyzing the path of the iris movement. In this paper, the iris tracking system and algorithms, the application of the system to real-time capture of reading patterns, and the representation of 2D/3D iris tracks are presented with results and recommendations.
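The LabVIEW implementation itself is not shown in the abstract; as a rough illustration of the underlying idea (one estimated center per frame, accumulated into a coordinate list), here is a NumPy sketch in which the iris center is approximated by the centroid of dark pixels. The function names and the intensity threshold are our assumptions, not the paper's method:

```python
import numpy as np

def iris_center(frame, threshold=60):
    """Estimate the iris center as the centroid of dark pixels
    (the iris/pupil region is typically the darkest in an eye image).
    Returns (x, y) in pixel coordinates, or None if no dark region is found."""
    ys, xs = np.nonzero(frame < threshold)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def track(frames, threshold=60):
    """Build the per-frame list of iris-center coordinates described above."""
    return [iris_center(f, threshold) for f in frames]
```

A real system would first localize the eye region and reject blinks; this sketch only shows how a per-frame center estimate turns a video into a 2D path that can then be analyzed against the on-screen text layout.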
Proceedings of SPIE | 2005
Lifford McLauchlan; Mehrube Mehrubeoglu
In digital security and authentication, watermarking has emerged as a solution for unauthorized digital copying, broadcast monitoring, information embedding, and end-user and transaction authentication. In the field of watermarking, the discrete cosine transform (DCT) domain, like other transform domains, has been shown to be advantageous over most spatial-domain techniques due to its increased robustness to image processing operations and possible distortions. In this research an adaptive watermarking scheme and its implementation are investigated, in which the watermarks are embedded in the discrete cosine transform domain of images. The adaptive scheme has the advantage that the watermark strength can be adjusted according to image characteristics. In addition, the degradation of the watermarked image is analyzed. Finally, the system's resistance to attacks is demonstrated.
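The paper's adaptive scheme is not detailed in the abstract; as a hedged sketch of the general DCT-domain approach it builds on, the following Python/SciPy code additively embeds watermark bits at key-selected mid-band DCT coefficients with a fixed strength `alpha` (an adaptive scheme would instead vary `alpha` with local image characteristics). Function names and parameters are ours:

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_watermark(img, bits, alpha=8.0, seed=0):
    """Additively embed bits (+alpha for 1, -alpha for 0) at mid-band DCT
    coefficients selected pseudo-randomly; `seed` acts as the watermark key."""
    C = dctn(img.astype(float), norm="ortho")
    h, w = img.shape
    # mid-frequency band: robust to compression, less visible than low band
    band = [(r, c) for r in range(h // 8, h // 2) for c in range(w // 8, w // 2)]
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(band), size=len(bits), replace=False)
    coords = [band[i] for i in idx]
    for (r, c), b in zip(coords, bits):
        C[r, c] += alpha if b else -alpha
    return idctn(C, norm="ortho"), coords

def extract_watermark(img_wm, img_orig, coords):
    """Non-blind extraction: the sign of the coefficient difference
    between watermarked and original images recovers each bit."""
    d = dctn(np.asarray(img_wm, dtype=float), norm="ortho") - \
        dctn(np.asarray(img_orig, dtype=float), norm="ortho")
    return [1 if d[r, c] > 0 else 0 for r, c in coords]
```

Because the DCT is linear and orthonormal, extraction is exact in the attack-free case; robustness testing would re-extract after compression, filtering, or cropping, which is the kind of analysis the paper reports.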
12th Biennial International Conference on Engineering, Construction, and Operations in Challenging Environments; and Fourth NASA/ARO/ASCE Workshop on Granular Materials in Lunar and Martian Exploration | 2010
Jayson Durham; Mehrube Mehrubeoglu; Lifford McLauchlan; David Carter; Ross McBee
Micro Aerial Vehicles (MAVs) and larger Unmanned Aerial Vehicles (UAVs) constitute a powerful tool for the modern warfighter and first responder. However, every developing technology must first be tested before being deployed to the battlefield or hot spot to assure predictable behavior. In today's digital age, computers give designers and testers an unprecedented ability to quickly and easily test virtual models before physical prototypes are created and field tested. We have utilized emerging Free Open Source Software (FOSS) simulators and workbenches (e.g., FlightGear and AUV Workbench) to improve mission-driven MAV/UAV video and imagery capture and analysis capabilities. In this project, the use of FlightGear to analyze flight trajectories and the video capture capabilities of different UAVs is presented. FlightGear allows changeable flight dynamic models, weather, autopilot, GPS, and multiplayer modes to be included in the analysis of simulated flights. Another attractive aspect of FlightGear is its extensible and easy-to-modify implementation: modifications are accomplished by changing or adding human-readable XML files in the FlightGear subdirectories. In this effort, existing aerial vehicle models are incorporated into FlightGear to create flight trajectories and analyze video capture capabilities in order to determine the necessary toolset for analyzing imagery data to meet mission needs. The limitations of the existing camera systems as well as the effects of various flight parameters on mosaicking and georegistration are presented. A sample analysis of the effect of aerial vehicle speed on the overlap between frames captured at the typical 30 frames/second rate, which provides a basis for comparing flight attributes, is demonstrated together with other parameters that affect mission performance.
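The speed-versus-frame-overlap relationship mentioned at the end can be sketched with simple geometry. Assuming a nadir-pointing camera over flat terrain with a pinhole model (the 45° field of view and the function names below are our assumptions, not values from the paper):

```python
import math

def ground_footprint(altitude_m, fov_deg):
    """Along-track ground footprint length of a single frame for a
    nadir-pointing camera over flat terrain (pinhole model)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def frame_overlap(speed_mps, altitude_m, fov_deg=45.0, fps=30.0):
    """Fractional overlap between consecutive frames at a given ground speed:
    the vehicle advances speed/fps meters per frame, eating into the footprint."""
    advance = speed_mps / fps
    return max(0.0, 1.0 - advance / ground_footprint(altitude_m, fov_deg))
```

At a modest 20 m/s and 100 m altitude the overlap stays above 99%, which is why mosaicking is usually feasible at 30 fps; it degrades as speed rises or altitude (and hence footprint) shrinks, the trade-off the analysis in the paper explores.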
2013 ASEE Annual Conference & Exposition | 2013
Lifford McLauchlan; Mehrube Mehrubeoglu; Jayson Durham
2007 Annual Conference & Exposition | 2007
Mehrube Mehrubeoglu; Lifford McLauchlan
2016 ASEE Annual Conference & Exposition | 2016
Lifford McLauchlan; Mehrube Mehrubeoglu
Archive | 2014
Jayson Durham; Riley Zeller-Townson; Lifford McLauchlan; Mehrube Mehrubeoglu; Richard Cardenas; Fernando Dejesus; John McDonnell
2014 ASEE Annual Conference & Exposition | 2014
Lifford McLauchlan; Mehrube Mehrubeoglu
Archive | 2010
Jayson Durham; Mehrube Mehrubeoglu; Lifford McLauchlan