Nina Siu-Ngan Lam
Louisiana State University
Publications
Featured research published by Nina Siu-Ngan Lam.
Cartography and Geographic Information Science | 1983
Nina Siu-Ngan Lam
Two forms of spatial interpolation, the interpolation of point and areal data, are distinguished. Traditionally, point interpolation is applied to isarithmic (that is, contour) mapping, and areal interpolation to isopleth mapping. Recently, areal interpolation techniques have been used to obtain data for a set of administrative or political districts from another set of districts whose boundaries do not coincide. For point interpolation, the numerous methods may be further classified into exact and approximate. Exact methods include most distance-weighting methods, Kriging, spline interpolation, interpolating polynomials, and finite-difference methods. Approximate methods include power-series trend models, Fourier models, distance-weighted least squares, and least-squares fitting with splines. Areal interpolation methods, on the other hand, are classified according to whether they preserve volume. Traditional areal interpolation methods which utilize point interpolation procedures are not volume-preserving,...
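As a concrete illustration of the exact, distance-weighting family mentioned above, here is a minimal inverse-distance-weighting (IDW) point interpolator in Python; the function name, the power default, and the snapping tolerance are illustrative choices, not details from the paper.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted point interpolation.

    An 'exact' method: weights blow up as a query point approaches a
    sample, so the fitted surface honours the observed values."""
    # pairwise distances, shape (n_query, n_known)
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    z = (w * z_known).sum(axis=1) / w.sum(axis=1)
    # snap queries that coincide with a sample to the sample value
    hit = d.min(axis=1) < 1e-12
    z[hit] = z_known[d.argmin(axis=1)[hit]]
    return z

# toy usage: a query midway between two samples gets their average
pts = np.array([[0.0, 0.0], [1.0, 0.0]])
vals = np.array([10.0, 20.0])
print(idw(pts, vals, np.array([[0.5, 0.0]])))   # -> [15.]
```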
International Journal of Remote Sensing | 2007
Lawrence M. Kiage; Kam-biu Liu; Nan D. Walker; Nina Siu-Ngan Lam; O. K. Huh
Many parts of East Africa are experiencing dramatic changes in land cover/use at a variety of spatial and temporal scales, due to both climatic variability and human activities. Information about such changes is often required for the planning, management, and conservation of natural resources. Several methods for land-cover change detection using Landsat TM/ETM+ imagery were employed for the Lake Baringo catchment in Kenya, East Africa. The Lake Baringo catchment is a good example of an environment experiencing remarkable land cover change from multiple causes. Both NDVI differencing and post-classification comparison effectively depicted the hotspots of land degradation and land cover/use change in the catchment. Change-detection analysis showed that forest cover was the most affected, in some sections recording reductions of over 40% in a 14-year period. Deforestation and subsequent land degradation have increased the sediment yield in the lake, resulting in a reduction in lake surface area of over 10% and increased turbidity, confirmed by the statistically significant increase (t = −84.699, p < 0.001) in albedo between 1986 and 2000. Although climatic variations may account for some of the changes in the lake catchment, most of the changes in land cover are inherently linked to the mounting human and livestock populations in the Lake Baringo catchment.
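A minimal sketch of the NDVI-differencing step described above, using NumPy. The reflectance arrays and the one-standard-deviation hotspot threshold are stand-ins for illustration, not values from the study.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index, guarded against division by zero."""
    red, nir = red.astype(float), nir.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

# stand-in reflectance arrays; in practice these are co-registered Landsat
# TM/ETM+ red and near-infrared bands for the two dates
rng = np.random.default_rng(0)
red_1986, nir_1986, red_2000, nir_2000 = rng.uniform(0.02, 0.6, (4, 64, 64))

# NDVI differencing: per-pixel change between the two dates
change = ndvi(red_2000, nir_2000) - ndvi(red_1986, nir_1986)

# flag pixels whose NDVI fell more than one standard deviation below the
# mean change as candidate degradation hotspots (threshold is illustrative)
hotspots = change < change.mean() - change.std()
```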
Computers & Geosciences | 1993
Sandeep Jaggi; Dale A. Quattrochi; Nina Siu-Ngan Lam
Fractal geometry is increasingly becoming a useful tool for modeling natural phenomena. As an alternative to Euclidean concepts, fractals allow for a more accurate representation of the complexity of natural boundaries and surfaces. The purpose of this paper is to introduce and implement three algorithms in C code for deriving fractal measurements from remotely sensed data. These three methods are: the line-divider method, the variogram method, and the triangular prism method. Remote-sensing data acquired by NASA's Calibrated Airborne Multispectral Scanner (CAMS) are used to compute the fractal dimension with each of the three methods. These data were obtained at a 30 m pixel spatial resolution over a portion of western Puerto Rico in January 1990. A description of the three methods, their implementation in a PC-compatible environment, and some results of applying these algorithms to remotely sensed image data are presented.
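Of the three algorithms, the triangular prism method lends itself to a compact sketch. The version below (in Python; the paper's own implementations are in C) sums the areas of four triangles erected over each s × s cell and regresses log total area on log step size, with D = 2 − slope; the step list and toy test are illustrative assumptions, not the paper's exact code.

```python
import numpy as np

def tri_area(p, q, r):
    """Area of a 3-D triangle via the cross-product formula."""
    return 0.5 * np.linalg.norm(np.cross(q - p, r - p))

def triangular_prism_dimension(z, steps=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a gridded surface z.

    For each step size s, tile the image with s x s cells; over each cell,
    erect four triangles joining the corner elevations to the cell-centre
    elevation (the mean of the corners) and sum their areas.  D follows
    from the log-log regression of total area on step size: A(s) ~ s**(2 - D)."""
    log_s, log_a = [], []
    n = min(z.shape)
    for s in steps:
        total = 0.0
        for i in range(0, n - s, s):
            for j in range(0, n - s, s):
                a = np.array([i, j, z[i, j]], float)
                b = np.array([i, j + s, z[i, j + s]], float)
                c = np.array([i + s, j + s, z[i + s, j + s]], float)
                d = np.array([i + s, j, z[i + s, j]], float)
                centre = np.array([i + s / 2.0, j + s / 2.0,
                                   (a[2] + b[2] + c[2] + d[2]) / 4.0])
                total += (tri_area(a, b, centre) + tri_area(b, c, centre)
                          + tri_area(c, d, centre) + tri_area(d, a, centre))
        log_s.append(np.log(s))
        log_a.append(np.log(total))
    slope = np.polyfit(log_s, log_a, 1)[0]
    return 2.0 - slope

# toy usage: a pure-noise surface should estimate near the rough end (D -> 3)
rng = np.random.default_rng(0)
print(triangular_prism_dimension(rng.normal(size=(65, 65))))
```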
Cartography and Geographic Information Science | 2002
Nina Siu-Ngan Lam; Hong Lie Qiu; Dale A. Quattrochi; Charles W. Emerson
Previously, we developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods that have been implemented in ICAMS: isarithm, variogram, and a modified version of triangular prism. To provide insights into how the fractal methods compare with conventional spatial techniques in measuring landscape complexity, the performance of two spatial autocorrelation methods, Moran's I and Geary's C, is also evaluated. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces having higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all surfaces, particularly those with high fractal dimensions. Like the fractal techniques, the spatial autocorrelation techniques were found to be useful for measuring complex images, but not images with low dimensionality. Fractal measurement methods, as well as spatial autocorrelation techniques, can be applied directly to unclassified images and could serve as tools for change detection and data mining.
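For the spatial autocorrelation side of the comparison, a minimal Moran's I for gridded images is sketched below, assuming rook (4-neighbour) contiguity with binary weights; ICAMS's exact weighting scheme is not specified here, so this is only a plausible stand-in.

```python
import numpy as np

def morans_i(img):
    """Moran's I for a 2-D image under rook (4-neighbour) contiguity.

    Values near +1 indicate strong positive spatial autocorrelation
    (smooth, low-dimension surfaces); values near 0 indicate noise."""
    z = img - img.mean()
    # sum z_i * z_j over unordered adjacent pairs, doubled for symmetry
    num = 2.0 * ((z[:, :-1] * z[:, 1:]).sum() + (z[:-1, :] * z[1:, :]).sum())
    # total weight S0: two directed links per adjacent pair
    w_sum = 2.0 * (z[:, 1:].size + z[1:, :].size)
    return (z.size / w_sum) * num / (z ** 2).sum()

# toy usage: white noise vs. a smooth ramp
rng = np.random.default_rng(0)
print(morans_i(rng.normal(size=(64, 64))))                        # near 0
print(morans_i(np.add.outer(np.arange(64.0), np.arange(64.0))))   # near +1
```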
International Journal of Remote Sensing | 2005
Charles W. Emerson; Nina Siu-Ngan Lam; Dale A. Quattrochi
The accuracy of traditional multispectral maximum-likelihood image classification is limited by the multi-modal statistical distributions of digital numbers from the complex, heterogeneous mixture of land cover types in urban areas. This work examines the utility of local variance, fractal dimension, and Moran's I index of spatial autocorrelation in segmenting multispectral satellite imagery, with the goal of improving urban land cover classification accuracy. Tools available in the ERDAS Imagine™ software package and the Image Characterization and Modeling System (ICAMS) were used to analyse Landsat ETM+ imagery of Atlanta, Georgia. Images were created from the ETM+ panchromatic band using the three texture indices. These texture images were added to the stack of multispectral bands and classified using a supervised maximum-likelihood technique. Although each texture band improved the classification accuracy over a multispectral-only effort, the addition of fractal dimension measures was particularly effective at resolving land cover classes within urbanized areas, as compared with per-pixel spectral classification techniques.
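A hedged sketch of the texture-augmented workflow: derive a local-variance texture image from the panchromatic band, append it to the multispectral stack, and classify with a Gaussian maximum-likelihood rule, approximated here by quadratic discriminant analysis, which likewise fits a per-class mean vector and covariance matrix. All arrays and training labels below are synthetic placeholders.

```python
import numpy as np
from scipy.ndimage import generic_filter
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# stand-ins for real imagery: a 6-band multispectral cube and a pan band
rng = np.random.default_rng(0)
multispectral = rng.uniform(size=(64, 64, 6))
pan = rng.uniform(size=(64, 64))

# local-variance texture image over a moving window of the pan band
texture = generic_filter(pan, np.var, size=7)

# append the texture image to the multispectral stack as an extra band
stack = np.dstack([multispectral, texture])
X = stack.reshape(-1, stack.shape[-1])

# hypothetical training data: labels for a random subset of pixels
train_idx = rng.choice(X.shape[0], size=500, replace=False)
train_labels = rng.integers(0, 4, size=500)   # 4 made-up land cover classes

# Gaussian maximum-likelihood classification: QDA fits one mean vector and
# covariance matrix per class, matching the classic ML decision rule
clf = QuadraticDiscriminantAnalysis().fit(X[train_idx], train_labels)
classified = clf.predict(X).reshape(pan.shape)
```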
Computers & Geosciences | 2005
Guiyun Zhou; Nina Siu-Ngan Lam
Fractal geometry has been actively researched in a variety of disciplines. The essential concept in fractal analysis is the fractal dimension. It is easy to compute the fractal dimension of truly self-similar objects; difficulties arise, however, when we try to compute the fractal dimension of surfaces that are not strictly self-similar. A number of fractal surface dimension estimators have been developed, but different estimators lead to different results. In this paper, we compared five fractal surface dimension estimators (triangular prism, isarithm, variogram, probability, and variation) using surfaces generated by three surface generation algorithms (shear displacement, Fourier filtering, and midpoint displacement). We found that, in terms of standard deviations and root mean square errors, the triangular prism and isarithm estimators perform best among the five methods studied.
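One of the three generators, midpoint displacement, can be sketched with the classic diamond-square scheme below; a surface built with roughness parameter H has nominal dimension D = 3 − H, giving the known value against which the five estimators can be scored (e.g., by RMSE over many realizations). The variant used in the paper may differ in detail.

```python
import numpy as np

def midpoint_displacement(exp, H, seed=0):
    """Diamond-square midpoint displacement on a (2**exp + 1)^2 grid.

    Roughness H in (0, 1); the nominal fractal dimension is D = 3 - H."""
    rng = np.random.default_rng(seed)
    n = 2 ** exp + 1
    z = np.zeros((n, n))
    step, scale = n - 1, 1.0
    while step > 1:
        half = step // 2
        # diamond step: displace the centre of every square
        for i in range(half, n, step):
            for j in range(half, n, step):
                z[i, j] = (z[i - half, j - half] + z[i - half, j + half]
                           + z[i + half, j - half]
                           + z[i + half, j + half]) / 4 + rng.normal(0.0, scale)
        # square step: displace each edge midpoint from its in-bounds neighbours
        for i in range(0, n, half):
            for j in range((i + half) % step, n, step):
                nbrs = [z[r, c] for r, c in ((i - half, j), (i + half, j),
                                             (i, j - half), (i, j + half))
                        if 0 <= r < n and 0 <= c < n]
                z[i, j] = np.mean(nbrs) + rng.normal(0.0, scale)
        step //= 2
        scale *= 2.0 ** (-H)   # shrink the perturbation at each finer level
    return z

# toy usage: a rough surface with H = 0.2, nominal D = 2.8
surface = midpoint_displacement(6, 0.2)
```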
PLOS ONE | 2009
Nina Siu-Ngan Lam; Kelley Pace; Richard Campanella; James P. LeSage; Helbert Arenas
Background: Empirical observations of how businesses respond after a major catastrophe are rare, especially for a catastrophe as great as Hurricane Katrina, which hit New Orleans, Louisiana on August 29, 2005. We analyzed repeated telephone surveys of New Orleans businesses conducted in December 2005, June 2006, and October 2007 to understand the factors that influenced decisions to reopen amid post-disaster uncertainty.
Methodology/Principal Findings: Businesses in the professional, scientific, and technical services group reopened the fastest in the near term, but differences in the rate of reopening across business types became indistinguishable in the longer term (around two years later). A reopening rate of 65% was found for all businesses by October 2007. Discriminant analysis showed significant differences between businesses that reopened and those that did not in how they rated the importance of various factors. Businesses that remained closed at the time of our third survey (two years after Katrina) ranked levee protection as the top concern immediately after Katrina, but damage to their premises and financing became major concerns in subsequent months, as reflected in the later surveys. For businesses that had reopened by the third survey, infrastructure protection, including levees, utilities, and communications, was the main concern in the earlier surveys; by the third survey, crime had become their top concern.
Conclusions/Significance: These findings underscore the need to have public policy and emergency plans, such as infrastructure protection, in place prior to the actual disaster, so that policy can be applied in a timely manner before business decisions to return or close are made. Our survey results, which include responses from both open and closed businesses, overcome the "survivorship bias" problem and provide empirical observations that should be useful for improving micro-level spatial economic modeling of the factors that influence business return decisions.
Natural Hazards Review | 2016
Nina Siu-Ngan Lam; Margaret A. Reams; Kenan Li; Chi Li; Lillian P. Mata
The abundant research examining aspects of social-ecological resilience, vulnerability, and hazards and risk assessment has yielded insights into these concepts and suggested the importance of quantifying them. Quantifying resilience is complicated by several factors, including the varying definitions of the term applied in the research, the difficulties involved in selecting and aggregating indicators of resilience, and the lack of empirical validation for the derived indices. This paper applies a new model, the resilience inference measurement (RIM) model, to quantify resilience to climate-related hazards for 52 U.S. counties along the northern Gulf of Mexico. The RIM model uses three elements (exposure, damage, and recovery indicators) to denote two relationships (vulnerability and adaptability), and employs both K-means clustering and discriminant analysis to derive the resilience rankings, thus enabling validation and inference. The results yielded a classification accuracy of 94.2% with 28 predictor variables. The approach is theoretically sound and can be applied to derive resilience indices for other study areas at different spatial and temporal scales.
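A minimal sketch of the RIM validation idea, under stated assumptions: K-means proposes resilience groupings from the indicator matrix, discriminant analysis re-predicts those groupings, and cross-validated accuracy serves as the internal validation score. The indicator matrix, cluster count, and cross-validation setup below are illustrative, not the paper's specification.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# stand-in indicator matrix: 52 counties x 28 predictor variables
rng = np.random.default_rng(0)
X = rng.normal(size=(52, 28))

# step 1: K-means proposes resilience groupings from the indicators
groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# step 2: discriminant analysis re-predicts the groups; cross-validated
# accuracy acts as the model's internal validation score
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, groups, cv=3).mean()
print(f"classification accuracy: {accuracy:.1%}")
```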
Computers & Geosciences | 2009
Wenxue Ju; Nina Siu-Ngan Lam
Despite the many applications of fractals in the geosciences, the problem of inconsistent results derived from different fractal calculation algorithms remains. Previous research found that the modified triangular prism method was the most accurate for calculating the fractal dimension of complex surfaces such as remote sensing images. However, when extending the technique to local measurements, new problems arise, and adjustments to the existing technique are needed. This paper introduces a new algorithm for calculating the fractal dimension within a local window based on the triangular prism method. Instead of using arbitrary geometric steps, the new algorithm computes the number of steps needed for the fractal calculation according to the window size. The new algorithm, called the divisor-step method, was tested on 4000 simulated surfaces and found to be more robust and accurate than the conventional geometric-step method. The new divisor-step method is recommended especially for local measurements.
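A sketch of the step-selection idea, assuming "divisor steps" means the integer divisors of the window width minus one, so that every step tiles the local window without remainder; the published rule may differ in detail. The resulting steps could be fed to a triangular prism estimator such as the one sketched earlier.

```python
def divisor_steps(window_size):
    """Step sizes for local fractal calculation: the integer divisors of
    (window_size - 1), so each step partitions the window exactly.
    This reading of the 'divisor-step' rule is an assumption."""
    n = window_size - 1
    return [d for d in range(1, n) if n % d == 0]

# e.g. a 17 x 17 local window -> steps [1, 2, 4, 8]
print(divisor_steps(17))
```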
PLOS ONE | 2012
Nina Siu-Ngan Lam; Helbert Arenas; Kelley Pace; James P. LeSage; Richard Campanella
We analyzed the business reopening process in New Orleans after Hurricane Katrina, which hit the region on August 29, 2005, to better understand the major predictors and how their impacts changed through time. A telephone survey of businesses in New Orleans was conducted in October 2007, 26 months after Hurricane Katrina. The data were analyzed using a modified spatial probit regression model to evaluate the importance of each predictor variable through time. The results suggest that the two most important reopening predictors throughout all time periods were the flood depth at the business location and business size, as represented by its wages in logarithmic form. Flood depth was a significant negative predictor and had the largest marginal effects on reopening probabilities. Smaller businesses had lower reopening probabilities than larger ones; however, the nonlinear response of business size suggests that recovery aid would be more effective for smaller businesses than for larger ones. The spatial spillover effect was a significant positive predictor, but only for the first nine months. The findings show clearly that flood protection is the overarching issue for New Orleans. A flood protection plan that reduces the vulnerability to and length of flooding would be the first and foremost step to mitigate the negative effects of climate-related hazards and enable speedy recovery. The findings cast doubt on current coastal protection efforts and add to the debate over whether coastal Louisiana will be sustainable or too costly to protect from further land loss and flooding given the threat of sea-level rise. Finally, a plan to help small businesses return would also be an effective recovery strategy, and the temporal window of opportunity that generates the greatest impact is the first 6 to 9 months after the disaster.
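As a non-spatial stand-in for the paper's modified spatial probit (the spatial spillover term is omitted here), the sketch below fits a plain probit of reopening on flood depth and log wages and reports marginal effects; the data and column names are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# stand-in survey records; column names are hypothetical
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "flood_depth_ft": rng.uniform(0, 12, 400),
    "log_wages": rng.normal(10, 1, 400),
})
# synthetic outcome: deeper flooding lowers, larger size raises, reopening odds
latent = -0.3 * df["flood_depth_ft"] + 0.5 * (df["log_wages"] - 10)
df["reopened"] = (latent + rng.normal(size=400) > -1).astype(int)

# plain (non-spatial) probit of reopening on flood depth and log wages
X = sm.add_constant(df[["flood_depth_ft", "log_wages"]])
res = sm.Probit(df["reopened"], X).fit(disp=False)

# marginal effects: how much each predictor shifts the reopening probability
print(res.get_margeff().summary())
```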