Publication


Featured research published by George Vellidis.


Ecology | 1998

SPATIOTEMPORAL DISTRIBUTIONS OF BACTERIVOROUS NEMATODES AND SOIL RESOURCES IN A RESTORED RIPARIAN WETLAND

Christien H. Ettema; David C. Coleman; George Vellidis; Richard Lowrance; Stephen L. Rathbun

Spatial and temporal variability in soil biotic populations reflects heterogeneity in soil resources, affects patterns of soil process rates, and facilitates coexistence of diverse biota. We investigated these relationships in a 0.7-ha restored riparian wetland in the Coastal Plain of Georgia, USA, for an abundant and diverse group of soil fauna, the bacterivorous nematodes. We quantified spatial distributions in four different seasons for the eight most dominant bacterivorous taxa in the wetland and related their individual distributions to patterns of microbial respiration, inorganic nitrogen, moisture, and soil organic matter. We used geostatistics to quantify spatial aggregation and draw isopleths. For all variates except two nematode taxa, 36-99% of sample population variance was spatially dependent, over ranges of 11-84 m. Isopleths and spatial trend analysis showed that individual bacterivorous taxa exhibited divergent spatial distributions, with populations aggregating into different hotspots in the wetland. Although these large-scale trends persisted at all sampling dates, small-scale patterning showed significant temporal variation due to rise and fall of local populations. Individual nematode distributions did not correspond well to the (temporally more static) soil resource patterns, except occasionally to soil moisture and nitrate content. We attribute the general lack of correlation between nematode and soil resource patterns in part to the young age (2.5-3.5 yr) of the investigated wetland site. Although nematode patterns remain inadequately explained, we suggest that the observed spatiotemporal divergence among populations of bacterivorous taxa has important implications for our understanding of soil ecosystem and community processes, notably the spatiotemporal distribution of nematode-influenced nitrogen cycling rates and the maintenance of field-scale nematode diversity.
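The geostatistical step described above rests on the empirical semivariogram, which measures how dissimilar sample values are as a function of separation distance; spatial dependence shows up as semivariance rising with distance up to a range. The sketch below is a minimal illustration with hypothetical sample coordinates and values, not the authors' code.

```python
# Sketch: empirical semivariogram for irregularly spaced point samples,
# the statistic behind range/spatial-dependence estimates (hypothetical data).
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Mean semivariance 0.5*(z_i - z_j)^2 for sample pairs binned by distance."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # pairwise separation distances and squared value differences
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # each pair counted once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)
```

Fitting a model (e.g., spherical) to these binned values is what yields the range and the spatially dependent fraction of variance reported in the abstract.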


Transactions of the ASABE | 2003

WATERSHED-SCALE SIMULATION OF SEDIMENT AND NUTRIENT LOADS IN GEORGIA COASTAL PLAIN STREAMS USING THE ANNUALIZED AGNPS MODEL

J. B. Suttles; George Vellidis; David D. Bosch; Richard Lowrance; J. M. Sheridan; E. L. Usery

Sediment and nutrient loadings in the Little River Research Watershed in south central Georgia were modeled using the continuous simulation Annualized Agricultural Nonpoint-Source Pollution (AnnAGNPS) model, part of the AGNPS suite of modeling components. Specifically, nitrogen, phosphorus, sediment, and runoff were predicted over a seven-year period. Land under cultivation makes up approximately 25% of the 333 km2 watershed. Livestock facilities include swine, poultry, dairy cows, and beef cattle. Results from the simulation were compared to seven years of monitoring data at the outlet of five nested subwatersheds and at the outlet of the Little River Research Watershed (LRRW). The average annual predicted runoff in the upper part of the watershed was one-third to half of observed runoff. In contrast, predicted runoff in the lower part of the watershed was close to observed, and was 100% of observed at the outlet of the watershed. Runoff underprediction was attributed to the method of landcover discretization. The extent of forest land in the upper watershed (55% to 63%) and the fragmented landscape that has relatively small fields surrounded by riparian forests and tracts of forest resulted in overestimation of forested area in the watershed. In addition to runoff, sediment and nutrient loads were also underpredicted in the upper part of the LRRW. Two factors are most likely responsible for underprediction. Runoff is underpredicted at these sites, which reduces the carrying capacity of sediment loads. In addition, the overestimation of forested areas at these sites coincides with underestimation of sediment-producing areas, such as cropland. In contrast to the upper part of the watershed, sediment and nutrient loads were overpredicted in the lower part of the watershed. This may have resulted from inadequately simulating nonpoint-source pollution attenuation by the extensive riparian forests and forested in-stream wetland areas found in these watersheds.
Prediction results can be improved through better input into the model, as well as modification of the processes within the model to account for forest and riparian conditions.


Transactions of the ASABE | 2009

Effect of Spatial Distribution of Rainfall on Temporal and Spatial Uncertainty of SWAT Output

J. Cho; David D. Bosch; Richard Lowrance; Timothy C. Strickland; George Vellidis

Accurate rainfall data are critical for accurate representation of temporal and spatial uncertainties of simulated watershed-scale hydrology and water quality from models. In addition, the methods used to incorporate the rainfall data into the simulation model can significantly impact the results. The objectives of this study were to (1) assess the hydrologic impacts of different methods for incorporating spatially variable rainfall input into the Soil and Water Assessment Tool (SWAT) in conjunction with subwatershed delineation level and (2) assess seasonal and spatial uncertainty in hydrologic and water quality simulations of SWAT with respect to rain gauge density. The study uses three different methods to incorporate spatially variable rainfall into the SWAT model and three levels of subwatershed delineation. The impacts of ten different gauge-density scenarios on hydrology and water quality were subsequently evaluated by using the highest gauge-density scenario as a baseline for comparison. Through the centroid method, which is currently used by the AVSWAT-X interface, variations in the representation of measured annual rainfall as model input and corresponding simulated streamflow increased as subwatershed delineation level decreased from high-density to low-density. The rainfall input by the Thiessen averaging method for each subwatershed (Thiessen method) and the inverse-distance-weighted averaging method for the entire watershed (average method) were not sensitive to subwatershed delineation. The impacts of delineation on streamflow were also less with these two methods. The Thiessen method is recommended for SWAT simulation of a watershed with high spatial variability of rainfall. The currently used AVSWAT-X centroid method will also accurately represent spatially variable rainfall if a subwatershed delineation is used that sufficiently incorporates the density of observed rainfall stations. 
As the number of rain gauges used for the simulation decreased, the uncertainty in the hydrologic and water quality model output increased exponentially. Total phosphorus was most sensitive to the changes in rain gauge density, with an average coefficient of variation of root mean square difference (CVRMSD) of 0.30 from three watersheds, followed by sediment, total nitrogen, and streamflow, showing CVRMSD values of 0.24, 0.18, and 0.17, respectively. Seasonal variations in simulated streamflow and water quality were higher during summer and fall seasons compared to spring and winter seasons. These seasonal and temporal variations can be attributed to the rainfall patterns within the watershed.
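Of the three rainfall-input methods compared above, the inverse-distance-weighted "average method" is the easiest to state concretely: each gauge contributes in proportion to the inverse of its distance (raised to a power) from the point or area being estimated. The sketch below is a minimal illustration with hypothetical gauge locations and rainfall depths; it is not the AVSWAT-X implementation.

```python
# Sketch: inverse-distance-weighted (IDW) rainfall estimate from point
# gauges, one of the interpolation approaches compared in the study
# (hypothetical gauge coordinates and depths).
import math

def idw_rainfall(target, gauges, power=2.0):
    """IDW estimate at `target` (x, y) from (x, y, depth_mm) gauge tuples."""
    num = den = 0.0
    for gx, gy, depth in gauges:
        d = math.hypot(target[0] - gx, target[1] - gy)
        if d == 0.0:                  # target coincides with a gauge
            return depth
        w = 1.0 / d ** power          # closer gauges weigh more
        num += w * depth
        den += w
    return num / den
```

Averaging such estimates over a subwatershed (or using Thiessen polygon areas as weights instead) gives the areal rainfall input that SWAT consumes per subwatershed.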


Journal of Soil and Water Conservation | 2010

Water quality effects of simulated conservation practice scenarios in the Little River Experimental watershed

J. Cho; George Vellidis; David D. Bosch; Richard Lowrance; Timothy C. Strickland

The goal of this study was to evaluate the water quality effects of alternative conservation practice scenarios using the SWAT (Soil and Water Assessment Tool) model in the Little River Experimental watershed, a representative coastal plain watershed located in southern Georgia. We simulated the water quality effect of two suites of upland conservation practices (CPs): one targeting erosion and the other targeting nutrients. We also simulated the impact of riparian forest buffers. Finally, we evaluated three different management scenarios for implementing the upland CPs: using a random approach, using subwatershed stream order as a prioritization criterion, and using subwatershed nonpoint source pollutant load as a prioritization criterion. The study showed that using subwatershed nonpoint source pollutant load as a prioritization criterion resulted in the most rapid water quality improvements. This improvement in water quality was nonlinear, while the other implementation schemes yielded linear returns. Full implementation of the suite of CPs targeting erosion resulted in the greatest reductions of sediment (54.7%) and total phosphorus (55.9%) loads from upland crop areas. Full implementation of the suite of CPs targeting nutrient reduction resulted in the greatest total nitrogen load reduction (10.3%). Overall, an intact riparian forest buffer offered the most comprehensive reduction of nonpoint source pollutant loads: 20.5% for sediment, 19.5% for total phosphorus, and 7.0% for total nitrogen. Simulation results indicate that at the current time, the single greatest contributor to nonpoint source pollutant reduction in the Little River Experimental watershed may be the current level of riparian forest cover.


Environmental Management | 2013

Model for Prioritizing Best Management Practice Implementation: Sediment Load Reduction

Taeil Jang; George Vellidis; Jeffrey B. Hyman; Erin S. Brooks; Lyubov A. Kurkalova; Jan Boll; Jaepil Cho

Understanding the best way to allocate limited resources is a constant challenge for water quality improvement efforts. The synoptic approach is a tool for geographic prioritization of these efforts. It uses a benefit-cost framework to calculate indices for functional criteria in subunits (watersheds, counties) of a region and then rank the subunits. The synoptic approach was specifically designed to incorporate best professional judgment in cases where information and resources are limited. To date, the synoptic approach has been applied primarily to local or regional wetland restoration prioritization projects. The goal of this work was to develop a synoptic model for prioritizing watersheds within which suites of agricultural best management practices (BMPs) can be implemented to reduce sediment load at the watershed outlets. The model ranks candidate watersheds within an ecoregion or river basin so that BMP implementation within the highest ranked watersheds will result in the most sediment load reduction per conservation dollar invested. The model can be applied anywhere and at many scales provided that the selected suite of BMPs is appropriate for the evaluation area's biophysical and climatic conditions. The model was specifically developed as a tool for prioritizing BMP implementation efforts in ecoregions containing watersheds associated with the USDA-NRCS Conservation Effects Assessment Project (CEAP). This paper presents the testing of the model in the Little River Experimental Watershed (LREW), which is located near Tifton, Georgia, USA, and is the CEAP watershed representing the southeastern coastal plain. The application of the model to the LREW demonstrated that the model represents the physical drivers of erosion and sediment loading well.
The application also showed that the model is quite responsive to social and economic drivers and is, therefore, best applied at a scale large enough to ensure differences in social and economic drivers across the candidate watersheds. The prioritization model will be used for planning purposes. Its results are visualized as maps which enable resource managers to identify watersheds within which BMP implementation would result in the most water quality improvement per conservation dollar invested.


Transactions of the ASABE | 1998

NITROGEN ASSIMILATION BY RIPARIAN BUFFER SYSTEMS RECEIVING SWINE LAGOON WASTEWATER

R. K. Hubbard; G. L. Newton; Jessica G. Davis; Richard Lowrance; George Vellidis; C. R. Dove

A three-year study was conducted to determine the feasibility of using riparian buffer systems to assimilate nitrogen (N) from swine lagoon effluent. Replicated 30 × 4 m plots were established at the interface of a pasture and riparian forest. Wastewater from the third lagoon of the University of Georgia Coastal Plain Experiment Station main swine research unit was applied to each plot by overland flow from tanks at the top end of each plot. The wastewater, which contained an average N concentration of 160 mg L-1, primarily as ammonium (NH4-N), was applied to the plots at two different rates (either once per week [1×, 1285 L/plot] or twice per week [2×, 2570 L/plot]). Three different vegetative buffer treatments were evaluated: (1) 10 m grass buffer draining into 20 m existing riparian zone vegetation; (2) 20 m grass buffer draining into 10 m existing riparian zone vegetation; and (3) 10 m grass buffer draining into 20 m maidencane (Panicum hemitomon). The effects of the wastewater on surface runoff and groundwater quality were evaluated by transects of surface runoff collectors, suction lysimeters, and shallow groundwater wells which extended from the top to the bottom of each plot. Data analyses showed differences due to wastewater application rate and distance downslope from the wastewater application pipe. Nitrogen concentrations increased over time at the top ends of the plots but showed little increase at the bottom ends of the plots. Overall, all three vegetative treatments were successful in assimilating N from the wastewater. The study showed that riparian buffer systems, where wastewater is applied by overland flow, can be effective in assimilating N contained within lagooned animal wastes.


Applied and Environmental Microbiology | 2014

Diversity and Antimicrobial Resistance of Salmonella enterica Isolates from Surface Water in Southeastern United States

Baoguang Li; George Vellidis; Huanli Liu; Michele Jay-Russell; Shaohua Zhao; Zonglin Hu; Anita C. Wright; Christopher A. Elkins

A study of prevalence, diversity, and antimicrobial resistance of Salmonella enterica in surface water in the southeastern United States was conducted. A new scheme was developed for recovery of Salmonella from irrigation pond water and compared with the FDA's Bacteriological Analytical Manual (8th ed., 2014) (BAM) method. Fifty-one isolates were recovered from 10 irrigation ponds in produce farms over a 2-year period; nine Salmonella serovars were identified by pulsed-field gel electrophoresis analysis, and the major serovar was Salmonella enterica serovar Newport (S. Newport, n = 29), followed by S. enterica serovar Enteritidis (n = 6), S. enterica serovar Muenchen (n = 4), S. enterica serovar Javiana (n = 3), S. enterica serovar Thompson (n = 2), and other serovars. It is noteworthy that the PulseNet patterns of some of the isolates were identical to those of the strains that were associated with the S. Thompson outbreaks in 2010, 2012, and 2013, S. Enteritidis outbreaks in 2011 and 2013, and an S. Javiana outbreak in 2012. Antimicrobial susceptibility testing confirmed 16 S. Newport isolates of the multidrug resistant-AmpC (MDR-AmpC) phenotype, which exhibited resistance to ampicillin, chloramphenicol, streptomycin, sulfamethoxazole, and tetracycline (ACSSuT), and to the 1st, 2nd, and 3rd generations of cephalosporins (cephalothin, amoxicillin-clavulanic acid, and ceftriaxone). Moreover, the S. Newport MDR-AmpC isolates had a PFGE pattern indistinguishable from the patterns of the isolates from clinical settings. These findings suggest that the irrigation water may be a potential source of contamination of Salmonella in fresh produce. The new Salmonella isolation scheme significantly increased recovery efficiency from 21.2% (36/170) to 29.4% (50/170) (P = 0.0002) and shortened the turnaround time from the 5 to 9 days required by the BAM method to 4 days, and thus may facilitate microbiological analysis of environmental water.


Transactions of the ASABE | 2009

Adapting the CROPGRO-Cotton model to simulate cotton biomass and yield under southern root-knot nematode parasitism.

Brenda V. Ortiz; Gerrit Hoogenboom; George Vellidis; Kenneth J. Boote; Richard F. Davis; Calvin D. Perry

Cotton (Gossypium hirsutum L.) yield losses by southern root-knot nematode (RKN; Meloidogyne incognita (Kofoid & White) Chitwood) are usually assessed after significant damage has been caused. However, estimation of potential yield reduction before planting is possible by using crop simulation. The main goal of this study was to adapt the Cropping System Model (CSM)-CROPGRO-Cotton for simulating growth and yield of cotton plants infected with RKN. Two hypotheses were evaluated to simulate RKN damage: (1) RKN acting as a sink for soluble assimilate, and (2) RKN inducing a reduction of root length per root mass and root density. The model was calibrated and adapted using data collected in an experiment that was conducted in 2007 and was part of a long-term crop rotation study. The experiment had a split-plot design, replicated six times, with drought stress levels assigned to the main plots and fumigation levels assigned to the subplots. The model was evaluated with seed cotton weight data collected in an experiment that was conducted in 2001 and was part of the same long-term crop rotation experiment. The fumigation treatments created various levels of RKN population densities. The model was adapted by coupling the RKN population to the removal of daily assimilates and decreasing root length per unit mass. The assimilate consumption rate was obtained after minimizing the error between simulated and observed biomass and yield components for the limited drought stress, non-fumigated treatment. Different values of root length per unit root weight (RFAC1) were used to account for early symptoms of RKN damage on leaf area index (LAI) and vegetative biomass under the non-fumigated, drought stress conditions. After model adaptation, the simulations indicated that LAI, total biomass, boll weight, and seed cotton decreased with elevated RKN population. The impact of RKN was more pronounced under severe drought stress. 
The lowest RMSE of LAI simulations occurred for the non-fumigated treatments under medium and severe drought stress (0.71 and 0.65 m2 m-2, respectively). Biomass was simulated with a prediction error within a range of 6% to 18.4% and seed cotton within a range of -11.2% to 2.7%. Seed cotton weight losses associated with RKN infection increased with the level of drought stress (9%, 20%, and 18% for the low, medium, and severe drought stress). Model evaluation showed that seed cotton weight was slightly more overpredicted for the fumigated than for the non-fumigated treatments, with prediction errors of 28.2%, 15.8%, and 2.0% for the low, medium, and severe drought stress, respectively. Similar to the calibration of the model, the yield losses increased with the combination of RKN and drought stress (20% and 29% for the low and severe drought stress). The results showed the potential for using the CSM-CROPGRO-Cotton model to account for RKN damage as well as to simulate yield reduction. However, further model evaluation might be needed to evaluate the values of assimilate consumption and root length per unit weight for different environmental conditions and management practices.


Applied and Environmental Microbiology | 2015

Distribution and Characterization of Salmonella enterica Isolates from Irrigation Ponds in the Southeastern United States.

Zhiyao Luo; Ganyu Gu; Amber Ginn; Mihai C. Giurcanu; Paige Adams; George Vellidis; Ariena H. C. van Bruggen; Michelle D. Danyluk; Anita C. Wright

Irrigation water has been implicated as a likely source of produce contamination by Salmonella enterica. Therefore, the distribution of S. enterica was surveyed monthly in irrigation ponds (n = 10) located within a prime agricultural region in southern Georgia and northern Florida. All ponds and 28.2% of all samples (n = 635) were positive for Salmonella, with an overall geometric mean concentration (0.26 most probable number [MPN]/liter) that was relatively low compared to prior reports for rivers in this region. Salmonella peaks were seasonal; the levels correlated with increased temperature and rainfall (P < 0.05). The numbers and occurrence were significantly higher in water (0.32 MPN/liter and 37% of samples) than in sediment (0.22 MPN/liter and 17% of samples) but did not vary with depth. Representative isolates (n = 185) from different ponds, sample types, and seasons were examined for resistance to 15 different antibiotics; most strains were resistant to streptomycin (98.9%), while 20% were multidrug resistant (MDR) for 2 to 6 antibiotics. DiversiLab repetitive extragenic palindromic-element sequence-based PCR (rep-PCR) revealed genetic diversity and showed 43 genotypes among 191 isolates, as defined by >95% similarity. The genotypes did not partition by pond, season, or sample type. Genetic similarity to known serotypes indicated Hadar, Montevideo, and Newport as the most prevalent. All ponds achieved the current safety standards for generic Escherichia coli in agricultural water, and regression modeling showed that the E. coli level was a significant predictor for the probability of Salmonella occurrence. However, persistent populations of Salmonella were widely distributed in irrigation ponds, and the associated risks for produce contamination and subsequent human exposure are unknown, supporting continued surveillance of this pathogen in agricultural settings.


Transactions of the ASABE | 1996

Nutrient Concentrations in the Soil Solution and Shallow Groundwater of a Liquid Dairy Manure Land Application Site

George Vellidis; R. K. Hubbard; Jessica G. Davis; Richard Lowrance; Randall G. Williams; J. C. Johnson; G. L. Newton

Land application of liquid animal manures offers the potential for recycling large volumes of slurries by using the nutrients available in the manure for plant growth in place of conventional inorganic fertilizers. A study was initiated to determine environmentally and economically sustainable liquid dairy manure application rates on a year-round forage production system. Treatments based on nitrogen application rates of 200, 400, 600, and 800 kg N ha-1 yr-1 were established. This work reports on nutrient concentrations in the soil solution of the vadose zone and in shallow groundwater after three years of land application. A 96-instrument network of high tension soil solution samplers was installed at 0.5, 1.0, 1.5, and 2.0 m depths and used to collect biweekly samples from June 1991 through September 1994. A network of 72 shallow groundwater monitoring wells was installed at 3 and 6 m depths and used to collect biweekly samples from May 1991 through September 1994. Statistically significant NO3-N treatment effects were observed at the 0.5, 1.0, 1.5, and 2.0 m depths. NO3-N treatment effects were not observed at the 3.0 or 6.0 m depths or at any depth for NH4-N, TKN, TN, PO4-P, and TP. Mean annual NO3-N soil solution concentrations ranged from a low of 1.45 mg L-1 to a high of 22.70 mg L-1. Concentrations of NH4-N and TKN were low for all depths while PO4-P and TP concentrations were nearly always below detection limits. After three years of study, treatment effects were clearly observed in the vadose zone. If not for very low subsoil permeability, it is likely that treatment effects would have been observed below 2.0 m.

Collaboration


Dive into George Vellidis's collaboration.

Top Co-Authors

Richard Lowrance

United States Environmental Protection Agency

David D. Bosch

Agricultural Research Service

R. K. Hubbard

Agricultural Research Service
