Manil Maskey
University of Alabama in Huntsville
Publications
Featured research published by Manil Maskey.
Ecological Informatics | 2010
Helen Conover; Gregoire Berthiau; Mike Botts; H. Michael Goodman; Xiang Li; Yue Lu; Manil Maskey; Kathryn Regner; Bradley T. Zavodsky
Standard interfaces for data and information access facilitate data management and usability by minimizing the effort required to acquire, catalog and integrate data from a variety of sources. The authors have prototyped several data management and analysis applications using Sensor Web Enablement Services, a suite of service protocols being developed by the Open Geospatial Consortium specifically for handling sensor data in near-real time. This paper provides a brief overview of some of the service protocols and describes how they are used in various sensor web projects involving near-real-time management of sensor data.
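As a hedged illustration of the kind of request these service protocols define, the sketch below builds an OGC Sensor Observation Service (SOS) GetObservation request using key-value-pair encoding. The endpoint URL and the offering/phenomenon identifiers are hypothetical; real deployments publish their own.

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint; real services publish their own base URLs
# and offering/phenomenon identifiers in their capabilities documents.
SOS_BASE = "https://example.org/sos"

def build_get_observation_url(offering, observed_property, start, end):
    """Build an OGC SOS 1.0 GetObservation request with KVP encoding."""
    params = {
        "service": "SOS",
        "request": "GetObservation",
        "version": "1.0.0",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": f"{start}/{end}",          # time window of interest
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return SOS_BASE + "?" + urlencode(params)

url = build_get_observation_url(
    "ATMOSPHERIC_TEMPERATURE",                  # illustrative offering id
    "urn:ogc:def:phenomenon:temperature",       # illustrative phenomenon urn
    "2010-06-01T00:00:00Z", "2010-06-01T06:00:00Z")
```

A client issues this URL over HTTP and receives an Observations & Measurements XML document in response, which is what makes near-real-time ingest pipelines interoperable across sensor providers.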
Earth Science Informatics | 2012
Manil Maskey; Ajinkya Kulkarni; Helen Conover; Udaysankar S. Nair; Sunil Movva
“Open science,” where researchers share and publish every element of their research process in addition to the final results, can foster novel ways of collaboration among researchers and has the potential to spontaneously create new virtual research collaborations. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. Advances in technology allow for software tools that can be used by different research groups and institutions to build and support virtual collaborations and infuse open science. This paper describes Talkoot, a software toolkit designed and developed by the authors to provide Earth Science researchers a ready-to-use knowledge management environment and an online platform for collaboration. Talkoot gives Earth Science researchers a means to systematically gather, tag and share their data, analysis workflows and research notes. These Talkoot features are designed to foster rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium-sized groups and research centers, as well as large enterprises such as national laboratories and federal agencies.
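The gather-tag-share workflow the abstract describes can be sketched as a minimal data structure. The class and method names below are illustrative, not Talkoot's actual API, which the abstract does not detail.

```python
from dataclasses import dataclass, field

# Minimal sketch of tag-based artifact sharing in the spirit of Talkoot;
# Artifact/Workspace are invented names, not Talkoot's real classes.
@dataclass
class Artifact:
    title: str
    kind: str                       # e.g. "dataset", "workflow", "note"
    tags: set = field(default_factory=set)

class Workspace:
    def __init__(self):
        self.artifacts = []

    def share(self, artifact):
        """Publish an artifact to the shared collaboration space."""
        self.artifacts.append(artifact)

    def find_by_tag(self, tag):
        """Retrieve every shared artifact carrying a given tag."""
        return [a for a in self.artifacts if tag in a.tags]

ws = Workspace()
ws.share(Artifact("MODIS AOD subset", "dataset", {"aerosols", "modis"}))
ws.share(Artifact("Dust-event case study", "note", {"aerosols", "dust"}))
hits = ws.find_by_tag("aerosols")   # both artifacts carry this tag
```

Tag-based retrieval like this is what lets a virtual community organized around a scientific interest, rather than an institution, discover each other's data and workflows.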
oceans conference | 2012
Eoin Howlett; Kyle Wilcox; Alex Crosby; Andrew Bird; Sara J. Graves; Manil Maskey; Ken Keiser; Richard A. Luettich; Richard P. Signell; Liz Smith; Don Wright; Jeffrey L. Hanson; Rebecca Baltes
Coastal waters and lowlands of the U.S. are threatened by climate change, sea-level rise, flooding, oxygen-depleted “dead zones”, oil spills and unforeseen disasters. With funding from the U.S. Integrated Ocean Observing System (IOOS®), the Southeast University Research Association (SURA) facilitated strong and strategic collaborations among experts from academia, federal operational centers and industry, and guided the U.S. IOOS Coastal and Ocean Modeling Testbed (COMT) through its successful pilot phase. The focus of this paper is the development of the cyberinfrastructure, including successes and challenges during this pilot phase of the COMT. This is the first testbed intended to serve multiple federal agencies and be focused on the coastal ocean and Great Lakes. The National Oceanic and Atmospheric Administration's (NOAA) National Center for Environmental Prediction (NCEP) has offered an operational base for the COMT, which addresses NCEP modeling challenges in coastal predictions by enabling the transition of research improvements into NCEP's operational forecast capability. Additional federal participants include the Navy, U.S. Geological Survey (USGS), Environmental Protection Agency and the U.S. Army Corps of Engineers (USACE). The mission of the COMT is to use targeted research and development to accelerate the transition of scientific and technical advances from the coastal and ocean modeling research community to improve identified operational ocean products and services (i.e., via research to applications and also applications to research). The vision of the program is to enhance the accuracy, reliability, and scope of the federal suite of operational ocean modeling products, while ensuring its user community is better equipped to solve challenging coastal problems and recognizes the COMT as the place where the best coastal science is operationalized.
Since its initiation in June 2010, the COMT has developed to include a flexible and extensible community research framework to test and evaluate predictive models to address key coastal environmental issues. Initially, the COMT addressed three general research challenges of socioeconomic relevance: estuarine hypoxia, shelf hypoxia, and coastal inundation. A cyberinfrastructure was developed to facilitate model assessment based on community standards, including a distributed data repository, automated cataloging mechanism, quick-browse facility, and tools for flexible and detailed scientific investigation of both model output and data. Models, tools and techniques from the Testbed are starting to be incorporated into the NOAA research and operational frameworks, reducing the transition time from research to federal operations. Ultimately, the COMT has had many successes as a pilot project and provides an effective and efficient environment for coordinating and improving coastal ocean and Great Lakes modeling efforts needed by the federal operational forecasting community.
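Model assessment in a testbed like this ultimately reduces to comparing model output against observations with standard skill metrics. The sketch below computes two common ones, bias and RMSE; the storm-surge sample values are invented for illustration, not COMT output.

```python
import math

def bias(model, obs):
    """Mean signed difference between modeled and observed values."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error of model values against observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Illustrative storm-surge heights in meters (values are made up).
modeled_surge = [0.8, 1.1, 1.6, 2.0]
observed_surge = [0.9, 1.0, 1.5, 2.2]

b = bias(modeled_surge, observed_surge)   # slight under-prediction
r = rmse(modeled_surge, observed_surge)
```

Running the same metric code uniformly across every model in the testbed is what makes the cross-model comparisons, and the research-to-operations decisions built on them, defensible.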
2015 19th International Conference on Information Visualisation | 2015
Manil Maskey; Timothy S. Newman
The role of a texture's directionality (i.e., orientedness) in multivariate visualization is explored. A key emphasis here is determining whether directional textures can be an effective component in the visualization of multiple-attribute data, in particular weather data. Toward that end, a new directional texture-based data visualization technique is described and exhibited. Results of user-based evaluations of directional textures in visualization are also reported.
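The core idea, encoding a data attribute in the orientation of texture strokes, can be sketched as a simple transfer function. A linear mapping onto a 0–90 degree range is assumed here for illustration; the paper's actual mapping may differ.

```python
# Sketch of an attribute-to-orientation transfer function for directional
# textures; the linear mapping and the 0-90 degree range are assumptions.
def attribute_to_angle(value, vmin, vmax, max_angle=90.0):
    """Map a scalar attribute into [0, max_angle] degrees for texture strokes."""
    t = (value - vmin) / (vmax - vmin)          # normalize to [0, 1]
    return max(0.0, min(max_angle, t * max_angle))

# e.g. wind speeds in m/s rendered as increasingly tilted strokes
angles = [attribute_to_angle(v, 0.0, 30.0) for v in (0.0, 15.0, 30.0)]
```

Because orientation is perceptually separable from color and brightness, a second or third attribute can share the same pixels, which is what makes directional textures attractive for multi-attribute weather fields.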
international geoscience and remote sensing symposium | 2008
Sara J. Graves; Christopher Lynnes; Manil Maskey; Ken Keiser; Long Pham
This paper describes approaches and methodologies facilitating the analysis of large amounts of distributed scientific data. Full-featured analysis tools, such as the Algorithm Development and Mining (ADaM) toolkit, and online data repositories now provide easy access and analysis capabilities for large amounts of data. However, there are obstacles to getting the analysis tools and the data together in a workable environment: does one bring the data to the tools or deploy the tools close to the data? The large size of many current Earth science datasets incurs significant overhead in network transfer for analysis workflows, even with current advanced networking capabilities. We are developing two solutions for this problem that address different analysis scenarios. The first is a Data Center Deployment of the analysis services for large data selections, orchestrated by a remotely defined analysis workflow. The second is a Data Mining Center approach of providing a cohesive analysis solution for smaller subsets of data. The two approaches can be complementary and thus provide flexibility for researchers to exploit the best solution for their data requirements.
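The trade-off between the two deployment scenarios can be sketched as a toy cost heuristic: ship the data only when the transfer is cheap enough. The threshold and rate below are assumed numbers for illustration, not figures from the paper.

```python
# Toy heuristic for "bring the data to the tools or the tools to the data";
# the transfer rate and acceptable-delay threshold are assumptions.
def choose_deployment(data_size_gb, transfer_rate_gb_per_hr=10.0,
                      acceptable_transfer_hr=1.0):
    """Return which of the two scenarios fits a given data volume."""
    transfer_hours = data_size_gb / transfer_rate_gb_per_hr
    if transfer_hours > acceptable_transfer_hr:
        return "data-center deployment"   # move the tools to the data
    return "data-mining center"           # move the subset to the tools

plan_large = choose_deployment(500.0)   # large selection stays at the data center
plan_small = choose_deployment(2.0)     # small subset ships to the mining center
```

In practice a workflow engine would make this choice per step, which is why the paper treats the two approaches as complementary rather than competing.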
international geoscience and remote sensing symposium | 2006
Helen Conover; Bruce Beaumont; M. Drewry; Sara J. Graves; Ken Keiser; Manil Maskey; Matthew H. Smith; Philip Bogden; Joanne Bintz
The Southeastern Universities Research Association (SURA) Coastal Ocean Observing and Prediction (SCOOP) program is a SURA Coastal Research initiative that is deploying cutting-edge information technology to advance the science of environmental prediction and hazard planning for our nation's coasts. SCOOP is a distributed program, incorporating heterogeneous data, software and hardware; thus the use of standards to enable interoperability is key to SCOOP's success. Standards activities range from internal coordination among SCOOP partners to participation in national standards efforts. As the lead partner in the SCOOP program for both data management and data translation, the University of Alabama in Huntsville (UAH) is developing a suite of advanced technologies to provide core data and information management services for scientific data, including the SCOOP Catalog and a suite of standards-based web services providing Catalog access. Currently under development is a web service that will export information on SCOOP data collections in a schema compliant with the Federal Geographic Data Committee's Content Standard for Digital Geospatial Metadata. SCOOP is also a participant in the OpenIOOS Interoperability Demonstration, which leverages Open Geospatial Consortium (OGC) standards such as the Web Map Service (WMS) and Web Feature Service (WFS) protocols to display near-real-time coastal observations together with water level, wave, and surge forecasts. SCOOP partners are also active participants in several data and metadata standards efforts, including the national ocean sciences Data Management and Communications metadata studies and the Marine Metadata Interoperability project. Continued close cooperation between the IT and coastal science modeling communities is producing positive results toward a real-time modeling environment that will benefit coastal stakeholders through better predictive capabilities.
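As a hedged sketch of the OGC standards the interoperability demonstration leverages, the snippet below assembles a WMS GetMap request. The endpoint and layer name are hypothetical; the key-value parameters follow the WMS 1.1.1 request convention.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint and layer; real servers advertise both in
# their GetCapabilities responses.
WMS_BASE = "https://example.org/wms"

def build_get_map_url(layer, bbox, width=800, height=600):
    """Build a WMS 1.1.1 GetMap request for a lon/lat bounding box."""
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": layer,
        "styles": "",
        "srs": "EPSG:4326",                         # lon/lat coordinates
        "bbox": ",".join(str(v) for v in bbox),     # minx,miny,maxx,maxy
        "width": width,
        "height": height,
        "format": "image/png",
    }
    return WMS_BASE + "?" + urlencode(params)

# Illustrative surge-forecast layer over the Gulf of Mexico
url = build_get_map_url("surge_forecast", (-98.0, 18.0, -80.0, 31.0))
```

Because every WMS server answers the same GetMap parameters with a map image, a single client can overlay observations and forecasts from many SCOOP partners without custom integration code.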
Journal of Applied Remote Sensing | 2017
Manil Maskey; J. J. Miller
Automated classification of images across image archives requires reducing the semantic gap between high-level features perceived by humans and low-level features encoded in images. Due to rapidly growing image archives in the Earth science domain, it is critical to automatically classify images for efficient sorting and discovery. In particular, classifying images based on the presence of Earth science phenomena allows users to perform climatology studies and investigate case studies. We present applications of deep learning-based classification of Earth science images.
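The paper applies deep convolutional networks; as a dependency-free stand-in, the sketch below shows only the final labeling step, assigning an image's feature vector to the nearest class centroid. The feature values and phenomenon names are invented for illustration, and a real pipeline would learn the features with a CNN rather than hand-specify them.

```python
import math

# Stand-in for phenomenon classification: nearest-centroid labeling of
# image feature vectors. Centroids and features are invented values; the
# paper's actual method is a deep convolutional network.
centroids = {
    "dust": (0.8, 0.1),
    "smoke": (0.2, 0.7),
}

def classify(feature_vector):
    """Label a feature vector with the class of its nearest centroid."""
    return min(centroids, key=lambda c: math.dist(centroids[c], feature_vector))

label = classify((0.75, 0.2))   # falls near the "dust" centroid
```

Once every archived image carries such a phenomenon label, climatology studies reduce to querying the labels rather than inspecting imagery by hand.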
international geoscience and remote sensing symposium | 2009
Sara J. Graves; Todd Berendes; Manil Maskey; Chidambaram Chidambaram; Sundar A. Christopher; Patrick Hogan; Tom Gaskins
There is a dearth of software tools that allow users to easily visualize, analyze and mine satellite imagery. The few tools that are available are expensive commercial packages that provide limited functionality. As part of a NASA funded project, a software tool named GLIDER is currently being developed to fill this void. GLIDER allows users to visualize and analyze satellite data in its native sensor view. Users can enhance the image by applying different image processing algorithms on the data. GLIDER provides the users with a full suite of pattern recognition and data mining algorithms that can be applied to the satellite imagery to extract thematic information. The suite of algorithms includes both supervised and unsupervised classification algorithms. In addition, users can project satellite imagery and analysis/mining results onto a 3D globe for visualization. GLIDER also allows users to add additional layers to the globe along with the projected image. Users can open multiple views within GLIDER to manage, visualize and analyze many data files all at once. This paper describes the features of GLIDER version 1.0.
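Among GLIDER's unsupervised classifiers, k-means clustering is the archetypal example. The minimal implementation below groups two-band pixel values into spectral classes; the pixel values are invented, and GLIDER's own implementation is not described in the abstract.

```python
import math
import random

# Minimal k-means of the kind an unsupervised image classifier uses;
# the two-band pixel values below are invented for illustration.
def kmeans(points, k, iterations=10, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # pick initial centers
    for _ in range(iterations):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

pixels = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.85, 0.75)]
centers, clusters = kmeans(pixels, k=2)      # two well-separated classes
```

Each resulting cluster corresponds to a thematic class (e.g. water vs. land); a supervised classifier would instead start from user-labeled training pixels.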
international conference on geoinformatics | 2009
Sara J. Graves; Todd Berendes; Manil Maskey; Chidambaram Chidambaram; Sundar A. Christopher; Patrick Hogan; Tom Gaskins
international conference on web services | 2017
Qihao Bao; Jia Zhang; Xiaoyi Duan; Tsengdar J. Lee; Yankai Zhang; Yuhao Xu; Seungwon Lee; Lei Pan; Patrick Gatlin; Manil Maskey
Service (API) discovery and recommendation is key to the widespread adoption of service-oriented architecture and service-oriented software engineering. Service recommendation typically relies on service linkage prediction calculated from the semantic distances (or similarities) among services based on their collection of inherent attributes. Given a specific context (mashup goal), however, different attributes may contribute differently to a service linkage. In this paper, instead of training a model for all attributes as a whole, a novel approach is presented to simultaneously train separate models for individual attributes. Meanwhile, a latent attribute modeling method is developed to reveal context-aware attribute distribution. Experiments over real-world datasets have demonstrated that this fine-grained method yields higher link prediction accuracy.
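The core idea, scoring a candidate link by combining per-attribute similarities with context-dependent weights, can be sketched as follows. The attribute names, embedding vectors, and weights are invented for illustration; the paper learns its per-attribute models rather than fixing weights by hand.

```python
import math

# Sketch of context-weighted, per-attribute link scoring; all names,
# vectors, and weights below are illustrative assumptions.
def norm(v):
    return math.sqrt(sum(x * x for x in v))

def cosine(a, b):
    """Cosine similarity between two attribute embeddings."""
    return sum(x * y for x, y in zip(a, b)) / (norm(a) * norm(b))

# One embedding per attribute for each service (values invented).
service_a = {"description": (1.0, 0.0), "category": (0.9, 0.4)}
service_b = {"description": (0.8, 0.6), "category": (0.9, 0.5)}

def link_score(s1, s2, context_weights):
    """Weight each attribute's similarity by its relevance to the context."""
    return sum(w * cosine(s1[attr], s2[attr])
               for attr, w in context_weights.items())

# In a mashup context where category matters more than free-text description:
score = link_score(service_a, service_b,
                   {"description": 0.3, "category": 0.7})
```

Training a separate similarity model per attribute, as the paper does, amounts to learning these weights (and the embeddings behind them) per context instead of collapsing all attributes into one model.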