Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Claude Turner is active.

Publication


Featured research published by Claude Turner.


Technical Symposium on Computer Science Education | 2011

Security in computer literacy: a model for design, dissemination, and assessment

Claude Turner; Blair Taylor; Siddharth Kaza

While many colleges offer specialized security courses and tracks for students in computing majors, there are few information security offerings for non-computing majors. Information security is becoming increasingly critical in many fields, yet most computer literacy courses insufficiently address the security challenges our graduates face. This paper discusses the development and impact of a set of modules designed to integrate security into computer literacy courses across two universities and several community colleges in the state of Maryland. Comparative analyses of pre- and post-test results show significant improvement.


Procedia Computer Science | 2014

Applying Moving Average Filtering for Non-interactive Differential Privacy Settings

Kato Mivule; Claude Turner

One of the challenges of implementing differential data privacy is that the utility (usefulness) of the privatized data tends to diminish even as confidentiality is guaranteed. In such settings, due to excessive noise, the original data loses statistical significance despite the strong levels of confidentiality assured by differential privacy. This in turn makes the privatized data practically valueless to the consumer of the published data. Additionally, researchers have noted that finding an equilibrium between data privacy and utility requirements remains intractable, necessitating trade-offs. Therefore, as a contribution, we propose using the moving average filtering model for non-interactive differential privacy settings. In this model, various levels of differential privacy (DP) are applied to a data set, generating a variety of privatized data sets. The privatized data is passed through a moving average filter, and the new filtered privatized data sets that meet a set utility threshold are finally published. Preliminary results from this study show that adjustment of the ε (epsilon) parameter in the differential privacy process, together with the application of the moving average filter, might generate better data utility output while conserving privacy in non-interactive differential privacy settings.
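The pipeline the abstract describes (Laplace noise at several privacy levels, a moving average filter, then a utility check before publishing) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensitivity value, window size, and mean-absolute-error utility measure are assumptions chosen for the example.

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling from the Laplace distribution
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def privatize(data, epsilon, sensitivity=1.0, rng=random):
    # Differential privacy via additive Laplace noise, scale = sensitivity / epsilon
    scale = sensitivity / epsilon
    return [x + laplace_noise(scale, rng) for x in data]

def moving_average(data, window=3):
    # Trailing moving average filter to smooth out the privacy noise
    return [sum(data[max(0, i - window + 1):i + 1]) / (i + 1 - max(0, i - window + 1))
            for i in range(len(data))]

def mae(a, b):
    # Mean absolute error, used here as a simple utility (closeness) measure
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def publishable_sets(data, epsilons, utility_threshold, rng):
    # Generate one privatized set per epsilon, filter each, and keep only
    # the filtered sets whose error against the original stays under threshold
    results = []
    for eps in epsilons:
        filtered = moving_average(privatize(data, eps, rng=rng))
        if mae(filtered, data) <= utility_threshold:
            results.append((eps, filtered))
    return results
```

Smaller ε means more noise and stronger privacy, so the utility check typically admits only the larger-ε candidates unless the filter recovers enough signal.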


Proceedings of the ITiCSE Working Group Reports Conference on Innovation and Technology in Computer Science Education | 2013

Cybersecurity, women and minorities: findings and recommendations from a preliminary investigation

Rose Shumba; Kirsten Ferguson-Boucher; Elizabeth Sweedyk; Carol Taylor; Guy Franklin; Claude Turner; Corrine Sande; Gbemi Acholonu; Rebecca G. Bace; Laura L. Hall

This paper presents the work done by the ACM ITiCSE 2013 Conference Working Group (WG) on Cybersecurity, Women and Minorities: How to Succeed in the Career! The ITiCSE 2013 conference was held July 1-3, 2013, in Canterbury, United Kingdom. The overall goal of the WG was to conduct a preliminary investigation into the reasons behind the lack of women and minorities in the field of Cybersecurity. This is not just an issue of academic or research interest; it is important to ensuring that a greater number of women and minorities progress through a full career in Cybersecurity. There are currently no statistics available on the numbers of women and minorities either currently enrolled in or graduated from these programs. There is a need to explore the full range of factors that influence women's and minorities' decisions not to consider a career in Cybersecurity.


International Conference on Innovations in Information Technology | 2009

A steganographic computational paradigm for wireless sensor networks

Claude Turner

This article provides a brief review of steganography, steganalysis, and wireless sensor networks (WSNs). It also presents a computational framework for steganography in WSNs. The technique uses the concepts of redundancy and distributed computing to add robustness to the steganographic embedding and extraction algorithms.
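The redundancy idea can be sketched in miniature: each covert bit is embedded into the least significant bit of several sensor readings, and extraction recovers it by majority vote, so corruption of a minority of readings does not destroy the bit. This is an illustrative sketch of the redundancy concept only, not the paper's algorithm; the LSB scheme and copy count are assumptions.

```python
def embed(readings, bits, copies=3):
    # Hide each covert bit in the least significant bit (LSB)
    # of `copies` consecutive integer sensor readings
    out = list(readings)
    for i, b in enumerate(bits):
        for c in range(copies):
            idx = i * copies + c
            out[idx] = (out[idx] & ~1) | b
    return out

def extract(readings, nbits, copies=3):
    # Recover each bit by majority vote over its redundant copies,
    # tolerating corruption of a minority of readings
    bits = []
    for i in range(nbits):
        votes = sum(readings[i * copies + c] & 1 for c in range(copies))
        bits.append(1 if votes * 2 > copies else 0)
    return bits
```

In a WSN setting the redundant copies would be spread across different nodes, which is where the distributed-computing aspect comes in.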


Procedia Computer Science | 2012

Towards A Differential Privacy and Utility Preserving Machine Learning Classifier

Kato Mivule; Claude Turner; Soo-Yeon Ji

Many organizations transact in large amounts of data, often containing personally identifiable information (PII) and various confidential data. Such organizations are bound by state, federal, and international laws to ensure that the confidentiality of both individuals and sensitive data is not compromised. However, during the privacy preserving process, the utility of such datasets diminishes even while confidentiality is achieved, a problem that has been characterized as NP-hard. In this paper, we investigate a differential privacy machine learning ensemble classifier approach that seeks to preserve data privacy while maintaining an acceptable level of utility. The first step of the methodology applies a strong data privacy granting technique to a dataset using differential privacy. The resulting perturbed data is then passed through a machine learning ensemble classifier, which aims to reduce the classification error, or, equivalently, to increase utility. We then examine the association between the number of weak decision tree learners and data utility, which indicates whether the ensemble classifies more accurately as learners are added. Our results show that a combined adjustment of the privacy granting noise parameters and an increase in the number of weak learners in the ensemble might lead to a lower classification error.
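The second step, passing perturbed data through an ensemble of weak learners that vote on each prediction, can be sketched with one-dimensional decision stumps and bootstrap aggregation. Both choices are assumptions made for brevity here; the paper's actual weak learners, features, and noise parameters are not reproduced.

```python
import random

def stump_train(xs, ys):
    # Fit a one-threshold weak learner (decision stump) on 1-D features
    best = (xs[0], 1, 2.0)  # (threshold, polarity, error); error 2.0 = sentinel
    for t in sorted(set(xs)):
        for pol in (0, 1):
            err = sum(int((x >= t) == bool(pol)) != y
                      for x, y in zip(xs, ys)) / len(ys)
            if err < best[2]:
                best = (t, pol, err)
    return best[0], best[1]

def stump_predict(model, x):
    t, pol = model
    return int((x >= t) == bool(pol))

def train_ensemble(xs, ys, n_learners, rng):
    # Bootstrap-sample the (already privatized) data, one stump per sample
    models = []
    for _ in range(n_learners):
        idx = [rng.randrange(len(xs)) for _ in xs]
        models.append(stump_train([xs[i] for i in idx], [ys[i] for i in idx]))
    return models

def ensemble_predict(models, x):
    # Majority vote over the weak learners
    return int(sum(stump_predict(m, x) for m in models) * 2 > len(models))
```

The experiment in the abstract then amounts to sweeping `n_learners` (and the noise scale used before training) and recording the classification error at each setting.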


Procedia Computer Science | 2013

A Comparative Analysis of Data Privacy and Utility Parameter Adjustment, Using Machine Learning Classification as a Gauge

Kato Mivule; Claude Turner

During the data privacy process, the utility of datasets diminishes as sensitive information such as personally identifiable information (PII) is removed, transformed, or distorted to achieve confidentiality. The intractability of attaining an equilibrium between data privacy and utility needs is well documented; trade-offs are required, and making such trade-offs well remains problematic in its own right. Given this complexity, in this paper we empirically investigate which parameters could be fine-tuned to achieve an acceptable level of data privacy and utility during the data privacy process, while making reasonable trade-offs. We present the comparative classification error gauge (Comparative x-CEG) approach, a data utility quantification concept that employs machine learning classification techniques to gauge data utility based on classification error. In this approach, privatized datasets are passed through a series of classifiers, each of which returns a classification error, and the classifier with the lowest classification error is chosen. If that error is lower than or equal to a set threshold, better utility might be achieved; otherwise, the data privacy parameters are adjusted and the data re-evaluated with the chosen classifier. The process repeats x times until the desired threshold is reached. The goal is to generate empirical results across a range of parameter adjustments in the data privacy process, from which a threshold level might be chosen to make trade-offs. Our preliminary results show that, given such a range of empirical results, it might be possible to choose a trade-off point and publish privacy-compliant data with an acceptable level of utility.
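The loop at the heart of the approach, privatize, measure the best classification error, publish or adjust and retry, can be sketched generically. The function signatures here are illustrative assumptions: `privatize` and each classifier are treated as black boxes supplied by the caller, which is not how the paper packages them.

```python
def comparative_xceg(data, labels, classifiers, privatize, params, threshold):
    # Try each privacy parameter setting in turn. For each privatized
    # dataset, take the classifier with the lowest error; publish the
    # first dataset whose best error meets the utility threshold.
    for p in params:
        private = privatize(data, p)
        best_error = min(clf(private, labels) for clf in classifiers)
        if best_error <= threshold:
            return private, p, best_error
    return None  # no parameter setting reached the desired utility
```

Here the list `params` plays the role of the "x" repeated adjustments: the loop runs at most `len(params)` times before giving up.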


Procedia Computer Science | 2011

The Wavelet and Fourier Transforms in Feature Extraction for Text-Dependent, Filterbank-Based Speaker Recognition

Claude Turner; Anthony Joseph; Murat Aksu; Heather Langdond

An important step in speaker recognition is extracting features from raw speech that capture the unique characteristics of each speaker. The most widely used method of obtaining these features is the filterbank-based Mel Frequency Cepstral Coefficients (MFCC) approach. Typically, an important step in the process is the use of the discrete Fourier transform (DFT) to compute the spectrum of the speech waveform. However, over the past few years, the discrete wavelet transform (DWT) has gained remarkable attention and has been favored over the DFT in a wide variety of applications. This work compares the performance of the DFT with that of the DWT in the computation of MFCCs in the feature extraction process for speaker recognition. It is shown that the DWT results in a significantly lower order for the Gaussian Mixture Model (GMM) used to model speech, and a marginal improvement in accuracy.
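The front-end substitution the paper studies, replacing the DFT magnitude spectrum with wavelet coefficients before the mel filterbank and cepstral steps, can be illustrated with a naive DFT and a one-level Haar DWT. The Haar wavelet is chosen here purely for brevity; the paper does not prescribe it.

```python
import cmath

def dft_magnitudes(frame):
    # Naive O(n^2) discrete Fourier transform magnitude spectrum
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

def haar_dwt(frame):
    # One level of the Haar discrete wavelet transform:
    # approximation (low-pass) and detail (high-pass) coefficients
    s = 2 ** 0.5
    approx = [(frame[2 * i] + frame[2 * i + 1]) / s for i in range(len(frame) // 2)]
    detail = [(frame[2 * i] - frame[2 * i + 1]) / s for i in range(len(frame) // 2)]
    return approx, detail
```

In the MFCC pipeline, whichever of these outputs is chosen then feeds the mel filterbank, log compression, and discrete cosine transform stages.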


Procedia Computer Science | 2015

A Wavelet Packet and Mel-Frequency Cepstral Coefficients-Based Feature Extraction Method for Speaker Identification

Claude Turner; Anthony Joseph

One of the most widely used approaches for feature extraction in speaker recognition is the filterbank-based Mel Frequency Cepstral Coefficients (MFCC) approach. The main goal of feature extraction in this context is to extract features from raw speech that capture the unique characteristics of a particular individual. During the feature extraction process, the discrete Fourier transform (DFT) is typically employed to compute the spectrum of the speech waveform. However, over the past few years, the discrete wavelet transform (DWT) has gained remarkable attention and has been favored over the DFT in a wide variety of applications. The wavelet packet transform (WPT) is an extension of the DWT that adds more flexibility to the decomposition process. This work studies the impact on performance, with respect to accuracy and efficiency, when the WPT is used as a substitute for the DFT in the MFCC method. The novelty of our approach lies in its concentration on the wavelet and the decomposition level as the parameters influencing performance. We compare the performance of the DFT with the WPT, as well as with our previous work using the DWT. It is shown that the WPT results in a significantly lower order for the Gaussian Mixture Model (GMM) used to model speech, and a marginal improvement in accuracy with respect to the DFT. The WPT mirrors the DWT in terms of the order of the GMM and can perform as well as the DWT under certain conditions.
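The extra flexibility of the WPT over the DWT is that it recursively splits the detail band as well as the approximation band, yielding a full tree with 2^L leaf bands after L levels instead of the DWT's single approximation chain. A Haar-based sketch (the Haar filter is again an assumption for brevity, not the paper's choice):

```python
def wavelet_packet(frame, levels):
    # Full wavelet-packet tree: at every level, split EVERY band into
    # approximation and detail halves (the DWT splits only approximations)
    bands = [frame]
    s = 2 ** 0.5
    for _ in range(levels):
        next_bands = []
        for band in bands:
            next_bands.append([(band[2 * i] + band[2 * i + 1]) / s
                               for i in range(len(band) // 2)])
            next_bands.append([(band[2 * i] - band[2 * i + 1]) / s
                               for i in range(len(band) // 2)])
        bands = next_bands
    return bands
```

The decomposition level `levels` is exactly the parameter the paper singles out, together with the wavelet itself, as driving performance.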


ACM Inroads | 2015

LUCID: a visualization and broadcast system for cyber defense competitions

Claude Turner; Jie Yan; Dwight Richards; Pamela O'Brien; Jide Odubiyi; Quincy Brown

In this article, we discuss LUCID, a visualization and broadcast system targeted at improving a spectator's ability to understand and make sense of cyber defense competitions. The system aims to engage the spectator by presenting information pertinent to understanding the real-time events of the competition as they unfold. It accomplishes this through a combination of techniques, including real-time network security visualization, live video and audio monitoring, animation, computer graphics, user profiling, and commentary. We examine, specifically, how the LUCID system enables the audience to make sense of ongoing activities in a cyber defense competition.


Procedia Computer Science | 2014

The Treasury Bill Rate, the Great Recession, and Neural Networks Estimates of Real Business Sales

Anthony Joseph; Maurice Larrain; Claude Turner

This paper analyzes out-of-sample forecasts of real total business sales, studying monthly data from January 1970 to June 2012. The predictor variable, the 3-month Treasury bill interest rate, was used with both regression (as a benchmark) and neural network models. The neural network models were trained in supervised learning with the Levenberg-Marquardt backpropagation-through-time algorithm, and their prediction accuracy was confirmed with correlation coefficient and root mean square tests. The activation function used for the focused gamma models of the time-lag recurrent networks, in both the hidden and output layers, was tanh. The forecast period ranged from January 2006 to June 2012, thus encompassing the past recession. The real business sales variable is one of the indicators used as a coincident index of the U.S. business cycle, and is included among the variables studied by the Federal Reserve to formulate monetary policy. It is thus an important indicator surrogating for real GDP, which is reported quarterly and with a longer time delay. Our analysis shows that recent recessions have increased in duration, so that using a 36-month change to approximate an average cycle in estimating and forecasting is more relevant and accurate than the past usage of a 24-month change.
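The 36-month change the abstract refers to is the growth of each monthly observation relative to its value 36 months earlier. A sketch of that transform, with the percent-change form as an assumption since the paper does not spell out the exact formula:

```python
def n_month_change(series, n=36):
    # Percent change of each monthly observation versus n months earlier;
    # the output series is shorter than the input by n observations
    return [(series[i] - series[i - n]) / series[i - n] * 100
            for i in range(n, len(series))]
```

Switching `n` from 24 to 36 is the adjustment the authors argue better matches the longer duration of recent business cycles.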

Collaboration


Dive into Claude Turner's collaborations.

Top Co-Authors

Kato Mivule (Bowie State University)
David Enke (University of Minnesota)
Jie Yan (Bowie State University)
Alfred Herrera (Virginia Commonwealth University)
Andrew E. Mercer (Mississippi State University)