Robert O. Evans
North Carolina State University
Publications
Featured research published by Robert O. Evans.
Journal of Soil and Water Conservation | 2012
R. Wayne Skaggs; Norman R. Fausey; Robert O. Evans
This article introduces a series of papers that report results of field studies to determine the effectiveness of drainage water management (DWM) on conserving drainage water and reducing losses of nitrogen (N) to surface waters. The series is focused on the performance of the DWM (also called controlled drainage [CD]) practice in the US Midwest, where N leached from millions of acres of cropland contributes to surface water quality problems on both local and national scales. Results of these new studies are consistent with previous research reported in the literature showing that DWM can be used to reduce N losses (primarily in the nitrate nitrogen [NO3-N] form) from subsurface drained fields. The measured impact varied over a wide range (18% to more than 75% reduction in N loss to surface waters), depending on drainage system design, location, soil, and site conditions. Crop yields were increased by DWM on some sites and not on others, with the year-to-year impacts of DWM on yields dependent on weather conditions, as well as the above factors. Papers reporting advances in the development of datasets and models to predict the impact of drainage intensity and DWM on hydrology and water quality at watershed and…
Wetlands | 1991
R. W. Skaggs; J. W. Gilliam; Robert O. Evans
The hydrology of pocosins is dependent on plant, soil, site, and climatological factors. A simulation study was conducted to determine the effects of natural factors, such as depressional storage, and changes in drainage and land use on pocosin hydrology. The water management model DRAINMOD was used to simulate the hydrology of drained and undrained pocosins. Hourly rainfall data from a 33-year period of climatological record were used in DRAINMOD to predict evapotranspiration (ET), subsurface drainage, runoff, and water table depth on a day-by-day basis. Results were summarized to determine annual and long-term average effects. Pocosins with a large amount of surface depressional storage have water ponded on the surface during most of the year, high ET, and low surface runoff. As the depth of depressional storage decreases, average annual ET decreases, and runoff increases. Average annual runoff predicted for a natural pocosin on a Portsmouth soil near Wilmington, North Carolina increased from 277 mm to 384 mm as the depth of depressional storage was decreased from 300 mm to 5 mm. However, year-to-year variation in annual runoff was much greater than the effects of all other factors considered. Drainage ditches at spacings greater than 400 m had no effect on average annual runoff for constant surface depressional storage and conditions analyzed in this study. Decreasing the ditch spacing from 400 to 100 m increased average total annual outflow by only 7%. However, more than half of the predicted outflow for the 100 m spacing occurred as subsurface flow compared to less than 5% of the total for the 400 m spacing. Conversion from natural pocosin vegetation to a managed pine forest with a deeper rooting zone decreased predicted annual outflow by about 9%. Conversion to agricultural uses increased predicted average outflow by 7% compared to natural conditions.
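The depressional-storage effect described in this abstract can be illustrated with a toy daily water balance (a minimal sketch, not DRAINMOD itself): rainfall first fills surface storage, only the excess leaves as runoff, and ponded water is drawn down by a lumped infiltration-plus-ET loss. The function name, the synthetic rainfall series, and the 3 mm/day loss rate are hypothetical illustration choices; only the 5 mm and 300 mm storage depths come from the abstract.

```python
# Toy surface-water-balance sketch: rainfall fills depressional storage
# first; only the overflow leaves the site as runoff. A constant daily
# loss (3 mm/day here, a made-up value) lumps infiltration and ET drawn
# from the ponded water.

def annual_runoff(daily_rain_mm, storage_capacity_mm, daily_loss_mm=3.0):
    """Return total runoff (mm) for one year of daily rainfall depths."""
    ponded = 0.0
    runoff = 0.0
    for rain in daily_rain_mm:
        ponded += rain
        if ponded > storage_capacity_mm:
            runoff += ponded - storage_capacity_mm  # storage overflows
            ponded = storage_capacity_mm
        ponded = max(0.0, ponded - daily_loss_mm)   # infiltration + ET
    return runoff

rain = [10.0 if d % 3 == 0 else 0.0 for d in range(365)]  # synthetic year
deep = annual_runoff(rain, storage_capacity_mm=300.0)
shallow = annual_runoff(rain, storage_capacity_mm=5.0)
assert shallow > deep  # less depressional storage -> more runoff
```

Shrinking the storage capacity from 300 mm to 5 mm raises annual runoff, which is the direction of the effect the abstract reports (277 mm to 384 mm), though the magnitudes here are not calibrated to anything.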
Transactions of the ASABE | 1991
Robert O. Evans; R. W. Skaggs; R. E. Sneed
ABSTRACT High water table conditions reduce crop yields. This study developed corn and soybean relative yield models for high water table conditions. The relative yield models were based on Stress-Day-Index (SDI) relationships using SEW30 (0.3-m water table depth) to describe the high water table stress criteria and normalized SDI crop susceptibility (CS) factors. The normalized crop susceptibility (NCS) factors were determined from previous studies conducted in North Carolina. The models were developed using existing field data for SDI and crop yield from Ohio. The resulting corn model was tested against data from India and North Carolina and explained 69% of the relative yield variance for the pooled data. The soybean model explained 66% of the variance in relative yield for six years of soybean data from Ohio. The models developed should improve relative yield estimates using DRAINMOD, a water table management simulation model.
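The Stress-Day-Index idea in this abstract can be sketched in a few lines: SEW30 sums the daily water-table excess above a 30 cm reference depth, and the SDI weights each growth stage's SEW30 by a crop susceptibility factor. The CS values, the example water-table depths, and the linear yield-response slope below are hypothetical illustration values, not the paper's fitted coefficients.

```python
# Sketch of the SEW30 / Stress-Day-Index relative-yield concept.
# All numeric values here are illustrative, not from the paper.

def sew30(water_table_depths_cm):
    """Sum of (30 - depth) over days the water table is shallower than 30 cm."""
    return sum(max(0.0, 30.0 - d) for d in water_table_depths_cm)

def stress_day_index(stage_depths, stage_cs):
    """SDI = sum over growth stages of SEW30_i * CS_i."""
    return sum(sew30(depths) * cs for depths, cs in zip(stage_depths, stage_cs))

def relative_yield(sdi, slope=0.002):
    """Linear yield decline with SDI, floored at zero (illustrative form)."""
    return max(0.0, 1.0 - slope * sdi)

# Two growth stages: the first wetter (shallower water table) than the second.
stages = [[10.0, 15.0, 20.0], [40.0, 50.0, 35.0]]
cs = [0.6, 0.3]  # hypothetical normalized susceptibility factors
sdi = stress_day_index(stages, cs)  # only the wet stage contributes
```

Only days with a water table shallower than 30 cm add stress, so the second (drier) stage contributes nothing to the SDI in this example.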
Transactions of the ASABE | 1997
M. A. Brevé; R. W. Skaggs; J. W. Gilliam; J. E. Parsons; A. T. Mohammad; George M. Chescheir; Robert O. Evans
This study was conducted to evaluate the performance of DRAINMOD-N, a nitrogen fate and transport model for artificially drained soils, based on a comparison between predicted and observed hydrologic and nitrogen variables for an experimental site in eastern North Carolina. The site consisted of six plots drained by subsurface drain tubes 1.25 m deep and 23 m apart. Each plot was instrumented to measure water table depth, subsurface drainage, surface runoff and subirrigation rates. There were two replications of three water management treatments: conventional drainage, controlled drainage and subirrigation. Crops were winter wheat followed by soybean. Results showed the model did a good job in describing the hydrology of the site. On average the predicted daily water table depths were within 0.13 m of observed during the 14-month study period. Differences between predicted and observed cumulative subsurface drainage and surface runoff volumes were less than 0.10 and 0.09 m, respectively, for all treatments. Predictions for the movement and fate of nitrogen were also in good agreement with measured results. Simulated nitrate-nitrogen (NO3-N) losses in subsurface drainage water were within 1.5 kg/ha of the observed values for the 14-month period. Differences between simulated and observed total NO3-N losses (subsurface drainage plus surface runoff) were within 3.0 kg/ha. Results of this study indicated DRAINMOD-N could be used to simulate nitrogen losses in poorly drained soils with artificial drainage. The model, however, needs to be tested for longer periods of time and under different climatic conditions and soil types, before it can be recommended for general use.
Transactions of the ASABE | 1990
Robert O. Evans; R. W. Skaggs; R. E. Sneed
ABSTRACT Crop susceptibility factors for plants stressed by excessive soil water conditions (wet stress) are presented for five growth stages based on four years of experimental data for corn and five years of data for soybean. Corn was most susceptible to wet stress just prior to tasseling and soybean was most susceptible during the pod filling stage. A normalizing approach is presented that reduces the sensitivity of the crop susceptibility factor to the level of stress imposed. Evidence is presented based on data from three independent studies on corn to show that use of the normalized crop susceptibility (NCS) factor reduces its dependence on stress duration.
Transactions of the ASABE | 2010
R. W. Skaggs; Mohamed A. Youssef; J. W. Gilliam; Robert O. Evans
Field studies have shown that subsurface drainage systems can be managed to conserve water and reduce losses of nitrogen (N) to surface waters. The practice, called controlled drainage (CD) or drainage water management (DWM), is a viable alternative for reducing N loads from drained cropland, including millions of acres in the Midwest. This article reviews past studies on the effect of CD on drainage volumes and N losses for a wide range of soils and climatological conditions and uses simulations to examine mechanisms affecting the practice. Results published in the literature show that CD has reduced drainage volumes and N losses in drainage waters by 17% to over 80%, depending on soil properties, crops, drainage intensities, control strategies, and location. This study resulted in the following conclusions. CD reduces subsurface drainage and raises water tables, while increasing ET, seepage, and surface runoff. Seepage, which depends on soil properties and site conditions, is an important factor that often governs the effectiveness of CD. Experiments to determine the effect of CD on drainage volumes and N losses should be conducted on the field or watershed scale so that impacts of seepage are properly represented. Increases in ET in response to CD are important but are rarely greater than 10%. The effect of this increase in water use on drainage water loss is also less than 10% for most locations. CD reduces N losses in drainage water by about the same percentage as its effect on subsurface drainage volume in most cases. The effect of CD on N loss to surface waters depends on denitrification, both in the profile and in reduced zones along seepage paths. For soils that do not develop reduced zones, the effect of CD on N loss may be substantially less than its effect on drainage volume.
Transactions of the ASABE | 2002
Michael D. Dukes; Robert O. Evans; J. W. Gilliam; S. H. Kunickis
The effect of riparian buffer width and vegetation type on shallow groundwater quality has not been evaluated in the Middle Coastal Plain of North Carolina. Four riparian buffer vegetation types and no-buffer (no-till corn and rye rotation or pasture) were established at 8 and 15 m widths as follows: cool season grass (fescue), deep-rooted grass (switch grass), forest (pine and mixed hardwood), and native vegetation. Nested groundwater monitoring wells were installed at the field/buffer edge and the stream edge in the middle of each riparian buffer plot at three depths. Most deep, mid-depth, and shallow wells were 3.0 m, 1.8 m, and 0.6 m deep from the ground surface to the top of the 0.6 m perforated section, respectively. Wells were sampled for 23 months beginning July 1998. Although the ditch well nitrate-nitrogen concentrations at the middle well depth were significantly lower in the 15 m wide plots compared to the 8 m plots over half the monitoring period, extreme flooding as a result of a hurricane in the middle of the study confounded the results. The effect of vegetation was not significant at any time, including the no-buffer cropped and fertilized plots. The effect of vegetation was minimized because at the early stage in the buffer vegetation establishment, vegetative cover and root mass were not fully developed, the hurricane-induced flooding forced the re-establishment of several vegetation types (forest and fescue), and there was likely some mixing of groundwater flowing toward the vegetation plots. Establishment of buffers along streams where groundwater flowed away from the stream did not result in lower groundwater nitrate levels.
Transactions of the ASABE | 2006
Garry L. Grabow; Rodney L. Huffman; Robert O. Evans; David L. Jordan; R. C. Nuti
A subsurface drip irrigation (SDI) system was installed in 2001 in the Coastal Plain of North Carolina. Initially, four zones were installed, each with 0.91 m dripline spacing. In 2002, a fifth zone with 1.82 m dripline spacing was added. This system irrigated a cotton (Gossypium hirsutum L.) and peanut (Arachis hypogaea L.) rotation on a Norfolk sandy loam soil. Seed cotton yield data were collected from 2001 to 2004. In addition to SDI, overhead sprinkler irrigation was applied to cotton plots from 2001 to 2003. This study was concurrent with another study that evaluated the effect of irrigation system type, cotton growth regulator (mepiquat chloride), herbicide (glyphosate) treatment, and planting date on lint yield and quality. Although the soil is classified as a sandy loam, water moved laterally to the midpoint of the 1.82 m spaced dripline; this was likely due to the pan layer found at about 0.3 m, just below the dripline depth of 0.23 m. There was no difference in lateral water movement between the two dripline spacings. Seed cotton yield and irrigation water use efficiency were not statistically different between irrigation system types or dripline spacings over all years in the study. Seed cotton yield averaged 3.44 Mg ha-1 for the 0.91 m dripline spacing and 3.22 Mg ha-1 for the 1.82 m spacing for the three-year period 2002-2004 compared to an unirrigated average of 2.58 Mg ha-1 for the same period. Average irrigation water use efficiency was greater for the 0.91 m dripline spacing but not statistically different from the 1.82 m spacing. For 2001-2003, when sprinkler-irrigated plots existed, seed cotton yield averaged 3.55 Mg ha-1 for the 0.91 m dripline spacing, 3.35 Mg ha-1 for the sprinkler-irrigated plots, and 2.56 Mg ha-1 for the unirrigated plots. Drought conditions existed in 2002, when 258 mm of rain occurred between planting and final irrigation. The other growing seasons received relatively high amounts of rainfall: 524, 555, and 643 mm in 2001, 2003, and 2004, respectively.
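Irrigation water use efficiency, as compared between spacings in this abstract, is commonly expressed as the yield gain over the unirrigated control per unit of irrigation water applied. The yields below are the abstract's 2002-2004 means; the 150 mm applied depth is a hypothetical placeholder, since the abstract does not report applied irrigation amounts.

```python
# Back-of-envelope irrigation water use efficiency (IWUE) sketch.
# Yields (Mg/ha) are from the abstract; the applied depth is made up.

def iwue_kg_per_mm(yield_irr_mg_ha, yield_dry_mg_ha, applied_mm):
    """Yield gain (kg/ha) over the unirrigated control per mm applied."""
    gain_kg_ha = (yield_irr_mg_ha - yield_dry_mg_ha) * 1000.0  # Mg -> kg
    return gain_kg_ha / applied_mm

narrow = iwue_kg_per_mm(3.44, 2.58, applied_mm=150.0)  # 0.91 m spacing
wide = iwue_kg_per_mm(3.22, 2.58, applied_mm=150.0)    # 1.82 m spacing
assert narrow > wide  # matches the abstract's ranking of the spacings
```

With equal applied depths, the narrower spacing's higher yield directly translates into higher IWUE, consistent with the abstract's (statistically non-significant) ranking.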
Journal of Environmental Quality | 2016
S. E. King; Deanna Osmond; J. Smith; Michael R. Burchell; Michael D. Dukes; Robert O. Evans; S. Knies; S. Kunickis
Agricultural contributions of nitrogen are a serious concern for many water resources and have spurred the implementation of riparian buffer zones to reduce groundwater nitrate (NO3-). The optimum design for buffers is subject to debate, and there are few long-term studies. The objective of this project was to determine the effectiveness over time (12 yr) of buffer types (trees, switchgrass, fescue, native, and a control) and buffer widths (8 and 15 m) by measuring groundwater NO3-N and dissolved organic carbon (DOC) trends. At the intermediate groundwater depth (1.5-2.1 m), NO3-N reduction effectiveness was 2.5 times greater (46 vs. 16%) for the wider buffer, and, regardless of width, buffer effectiveness increased 0.62% yr-1. Buffer vegetative type was never statistically significant. In the deep-groundwater depth (2.1-3.5 m), there was no change in NO3-N removal over time, although the statistical interaction of width and vegetative type indicated a wide range of removal rates (19-82%). The DOC concentrations were analyzed at the field/buffer and buffer/stream sampling locations. Depending on location position and groundwater sampling depth, DOC concentrations ranged from 1.6 to 2.8 mg L-1 at Year 0 and increased at a rate of 0.13 to 0.18 mg L-1 yr-1 but always remained low (≤5.0 mg L-1). Greater DOC concentrations in the intermediate-depth groundwater did not increase NO3-N removal; redox measurements indicated intermittent reduced soil conditions may have been limiting. This study suggests that riparian buffer width, not vegetation, is more important for NO3-N removal in the middle coastal plain of North Carolina for a newly established buffer.
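The "reduction effectiveness" this abstract reports can be read as the relative drop in NO3-N concentration from the field/buffer edge to the buffer/stream edge. The concentrations below are hypothetical illustration values chosen to reproduce the reported 46% figure; the paper's actual measurements are not given in the abstract.

```python
# Sketch of a buffer's NO3-N reduction effectiveness as a relative
# concentration drop across the buffer. Input values are hypothetical.

def reduction_effectiveness(c_field_mg_l, c_stream_mg_l):
    """Percent NO3-N removed between field edge and stream edge."""
    return 100.0 * (c_field_mg_l - c_stream_mg_l) / c_field_mg_l

# A buffer that cuts 10.0 mg/L at the field edge to 5.4 mg/L at the
# stream edge would show the 46% effectiveness reported for the wider
# buffer at the intermediate depth.
wide_eff = reduction_effectiveness(10.0, 5.4)
```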
Transactions of the ASABE | 1994
C. L. Munster; R. W. Skaggs; J. E. Parsons; Robert O. Evans; J. W. Gilliam; M. A. Breve
The United States Geological Survey computer model Variably Saturated Two Dimensional Transport (VS2DT) was modified to treat boundary conditions imposed by parallel subsurface drain tubes. The modified model was used to simulate groundwater flow and aldicarb transport in research plots under conventional drainage, controlled drainage, and subirrigation. The reliability of the model was tested by comparing model predictions with field measurements.