
Publications


Featured research published by James L. Fouss.


Transactions of the ASABE | 1993

GLEAMS Hydrology Submodel Modified for Shallow Water Table Conditions

M. R. Reyes; R. L. Bengtson; James L. Fouss; James S. Rogers

GLEAMS-Water Table (GLEAMS-WT) is a modified version of GLEAMS that accounts for shallow water table fluctuations. The modification was accomplished by replacing the evapotranspiration and percolation algorithms in GLEAMS with routines that respond to a shallow water table; routines for depression storage, steady-state upward flux from the water table, and water table depth prediction were also added. The simulation performances of GLEAMS and GLEAMS-WT were evaluated by comparing their predictions with seven years (1981 through 1987) of measured data from a runoff-erosion-drainage experimental plot at Baton Rouge, Louisiana. The GLEAMS-WT predictions of surface runoff volume were very satisfactory: the total predicted volume for the seven years was only 0.6 cm (0%) greater than the observed volume, a significant improvement over GLEAMS, which underpredicted surface runoff volume by 54%. GLEAMS-WT predictions of water table depth were also satisfactory.
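
A minimal daily water-balance sketch in the general shape the abstract describes: depression storage fills before runoff begins, evapotranspiration is limited by water table depth, and a steady upward flux is added when the water table is shallow. All parameter names and functional forms here are illustrative assumptions, not the GLEAMS-WT source code.

```python
def daily_water_balance(rain_cm, wt_depth_cm, state, params):
    """Advance the soil-water state by one day; return (runoff_cm, state)."""
    # Depression storage fills first; only the excess becomes surface runoff.
    room = max(params["depression_storage_cm"] - state["depression_cm"], 0.0)
    stored = min(rain_cm, room)
    state["depression_cm"] += stored
    runoff_cm = rain_cm - stored

    # ET falls off as the water table deepens (illustrative linear form).
    et_cm = params["pet_cm"] * min(1.0, params["et_ref_depth_cm"] / max(wt_depth_cm, 1.0))

    # Steady-state upward flux from a shallow water table into the root zone.
    upflux_cm = params["upflux_cm"] if wt_depth_cm < params["critical_depth_cm"] else 0.0

    # Illustrative bookkeeping: stored surface water infiltrates the same day.
    state["soil_water_cm"] += stored + upflux_cm - et_cm
    return runoff_cm, state
```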


Transactions of the ASABE | 2004

CLIMATE IMPACTS ON NITRATE LOSS IN DRAINAGE WATERS FROM A SOUTHERN ALLUVIAL SOIL

B. C. Grigg; L. M. Southwick; James L. Fouss; T. S. Kornecki

Fertilizer nitrogen transported via agricultural drainage has caused eutrophication of nearby surface waters. In the Lower Mississippi River Valley region, periods of drought are occurring more frequently, yet the impacts of drought on nutrient loss from agricultural lands of this region have not been reported. Field studies were used to evaluate the impact of climate (rainfall) on nitrate loss from agricultural fields in both a normal (1996) and a drought (1999) period at the Ben Hur Water Quality Site in Baton Rouge, Louisiana. Four replicates of two treatments, surface drainage only (SUR) and surface drainage plus deep controlled drainage (DCD), were initiated on 0.21 ha plots planted to corn (Zea mays L.). After each rainfall/runoff event, runoff and subsurface drainage volumes were measured and analyzed for soluble nitrate concentration and loss. No significant drainage treatment impacts were found on runoff volume or nitrate loss in runoff. Nitrate loss in runoff was affected by climate, with a four-fold decrease during the drought caused by the decreased runoff volume. Conversely, the mass of nitrate lost in leachate increased two-fold during the drought. Diverting subsurface drainage effluent (DCD) to surface receiving waters increased nitrate transport to these waters by 2.6 times in the normal climate and over ten-fold during the drought, compared to SUR management. In either climate, but particularly during drought, subsurface drainage could accelerate eutrophication of receiving waters in this region. These results suggest that SUR, rather than DCD, should be the water management practice in this region.
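
As a point of reference, per-event nitrate loss is typically computed as flow volume times nitrate concentration, accumulated over the season; the sketch below illustrates that bookkeeping with hypothetical event values, not data from the study.

```python
def seasonal_nitrate_load_kg(events):
    """Sum per-event loads; each event is (flow_volume_L, nitrate_mg_per_L)."""
    total_mg = sum(volume_l * conc_mg_l for volume_l, conc_mg_l in events)
    return total_mg / 1e6  # mg -> kg

# Two hypothetical runoff events for one plot:
print(seasonal_nitrate_load_kg([(120_000, 4.2), (80_000, 3.1)]))  # 0.752 kg
```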


Transactions of the ASABE | 1994

GLEAMS-WT hydrology submodel modified to include subsurface drainage

M. R. Reyes; R. L. Bengtson; James L. Fouss

The model GLEAMS-SWAT (GLEAMS with Subsurface drainage and WAter Table) is a modified version of GLEAMS that accounts for shallow water table fluctuations and subsurface drainage. The modification was accomplished by incorporating a subsurface drainage routine into GLEAMS-WT. Simulation performances of GLEAMS and GLEAMS-SWAT were evaluated by comparing their predictions with seven years (1981-1987) of measured data from a runoff-erosion-drainage experimental plot at Baton Rouge, Louisiana. Validation showed that GLEAMS-SWAT predictions of surface runoff volume, subsurface drainage volume, total volume (surface runoff + subsurface drainage), and water table depth were satisfactory. Total predicted surface runoff volume for the seven-year period was 94% of the observed runoff volume, an improvement over GLEAMS, whose prediction was only 71% of the observed runoff. Subsurface drainage volume and total drainage (runoff + subsurface drainage) volume predictions were, respectively, 99% and 96% of the observed volumes. Predicted water table depth was deeper than the observed depth, especially during the pre-growing and growing seasons.
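
The abstract does not spell out the drainage routine itself; models of this family commonly use a steady-state Hooghoudt equation for flux to parallel drains, so the sketch below shows that standard relation as a plausible stand-in rather than the actual GLEAMS-SWAT code.

```python
def hooghoudt_flux_cm_hr(m_cm, ke_cm_hr, de_cm, spacing_cm):
    """Steady-state Hooghoudt flux to parallel drains (cm/hr).

    m_cm: midpoint water table height above the drain level
    ke_cm_hr: effective lateral hydraulic conductivity
    de_cm: equivalent depth from the drains to the restrictive layer
    spacing_cm: distance between drain lines
    """
    return (8.0 * ke_cm_hr * de_cm * m_cm + 4.0 * ke_cm_hr * m_cm ** 2) / spacing_cm ** 2

# Example: 15 m spacing, conductivity 1 cm/hr, 50 cm midpoint head.
print(hooghoudt_flux_cm_hr(m_cm=50, ke_cm_hr=1.0, de_cm=100, spacing_cm=1500))
```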


Transactions of the ASABE | 2003

DRAINAGE SYSTEM IMPACTS ON SURFACE RUNOFF, NITRATE LOSS, AND CROP YIELD ON A SOUTHERN ALLUVIAL SOIL

B. C. Grigg; L. M. Southwick; James L. Fouss; T. S. Kornecki

Excess rainfall and subsequent surface runoff are a challenge to farmers of the Lower Mississippi River Valley region. In 1993, we established an experimental field site in Baton Rouge, Louisiana, consisting of 16 hydraulically isolated plots (0.2 ha) on a Commerce soil (Aeric Fluvaquents). Our objective was to determine drainage system impacts on surface runoff, subsurface drainage effluent, nitrate loss, and corn (Zea mays L.) yield. We evaluated the following drainage systems (four replications) in 1995 and 1996: surface drainage only (SUR), controlled subsurface drainage at 1.1 m below the soil surface (DCD), and shallow water table control at a 0.8 m depth via controlled-drainage/subirrigation (CDSI). Planting date, fertility management, and minimum tillage were consistent across treatments. When compared to SUR, DCD and CDSI did not reduce surface runoff or nitrate loss in runoff. This is in contrast to previous research showing that subsurface drainage systems decreased runoff on this soil, the difference being that we did not use deep tillage. Our results suggest that subsurface drainage systems should be coupled with deep tillage to reduce nutrient loss in runoff from this alluvial soil. DCD and CDSI controlled the shallow water table, but the increased annual effluent from subsurface drainage increased nitrate loss compared to SUR. DCD and CDSI had no effect on corn yield under these rainfall conditions. With respect to nitrate loss and crop yield in this region, typical SUR drainage may be the best management practice (BMP) in the absence of effective runoff mitigation, such as deep tillage.


Transactions of the ASABE | 1989

Sump-Controlled Water Table Management Predicted with DRAINMOD

James L. Fouss; James S. Rogers; Cade E. Carter

In DRAINMOD simulations for a Commerce silt loam soil in the Mississippi Delta, fluctuations in the depth of the water table midway between subsurface-drainage/subirrigation conduits in a sump-controlled water table management system were predicted within an average deviation of 8 cm. The average daily sump water level from a field experiment was used as a model input to establish changes in the drainage outlet water level boundary conditions.
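
The 8 cm figure is an average deviation between simulated and measured midpoint water table depths; computing it is straightforward (the sample values below are hypothetical, not the paper's data).

```python
def mean_absolute_deviation_cm(predicted, observed):
    """Average absolute difference between paired depth series (cm)."""
    return sum(abs(p - o) for p, o in zip(predicted, observed)) / len(observed)

# Hypothetical daily midpoint water table depths (cm below the surface):
print(mean_absolute_deviation_cm([55, 62, 70, 81], [47, 68, 60, 89]))  # 8.0
```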


Agricultural Water Management | 1988

Rainfall probability forecasts used to manage a subdrainage-subirrigation system for watertable control

James R. Cooper; James L. Fouss

Water management of a subsurface-drainage and subirrigation system was simulated using a daily rainfall probability index (rpi) to control the watertable depth (wt) in the soil profile. Daily management of free drainage, controlled drainage, or subirrigation was based upon the rpi value. The rpi was computed from the daily rainfall probability in forecasts issued by the U.S. National Weather Service. Climatic data and weather forecast records (1979–1985) for the lower Mississippi Valley were used in the DRAINMOD program to simulate daily fluctuations in the watertable. Various statistical and summation equations were used for computing the rpi. Management success was evaluated by conditions of excess and deficit soil water in the root zone, and by predicted crop yield. Using only the 'today' and 'tonight' segments of the morning (5:25 h) forecast, 75% of the significant rainfall events occurring during the growing season were successfully predicted when the rpi ≥ 0.60. Free drainage in advance of predicted storms significantly reduced the duration of excess soil water in the root zone and increased simulated maize yield by 0 to 11%, compared to controlled drainage where the water level at the drain outlet was maintained constant at a level above the drain.
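
A sketch of the daily decision rule described above, assuming the index is taken from the 'today' and 'tonight' probability-of-precipitation values; the combination rule and mode names are illustrative, since the paper evaluated several statistical and summation equations for the rpi.

```python
def rainfall_probability_index(p_today, p_tonight):
    """One plausible rpi: the larger of the two forecast probabilities (0-1)."""
    return max(p_today, p_tonight)

def outlet_mode(rpi, wt_depth_cm, target_depth_cm=80.0):
    """Choose the day's water management mode from rpi and water table depth."""
    if rpi >= 0.60:                   # threshold from the abstract
        return "free drainage"        # lower the water table ahead of the storm
    if wt_depth_cm > target_depth_cm:
        return "subirrigation"        # water table too deep: add water
    return "controlled drainage"      # hold the outlet level steady

print(outlet_mode(rainfall_probability_index(0.7, 0.4), wt_depth_cm=65))  # free drainage
```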


Journal of Agricultural and Food Chemistry | 2009

Runoff and leaching of metolachlor from Mississippi River alluvial soil during seasons of average and below-average rainfall.

Lloyd M. Southwick; Timothy W. Appelboom; James L. Fouss

The movement of the herbicide metolachlor [2-chloro-N-(2-ethyl-6-methylphenyl)-N-(2-methoxy-1-methylethyl)acetamide] via runoff and leaching from 0.21 ha plots planted to corn on Mississippi River alluvial soil (Commerce silt loam) was measured for a six-year period, 1995-2000. The first three years received normal rainfall (near the 30-year average); the second three years experienced reduced rainfall. The 4-month periods before and after application were characterized by 1039 ± 148 mm of rainfall for 1995-1997 and by 674 ± 108 mm for 1998-2000. During the normal-rainfall years, 216 ± 150 mm of runoff occurred during the study seasons (the 4 months following herbicide application), accompanied by 76.9 ± 38.9 mm of leachate. For the low-rainfall years these amounts were 16.2 ± 18.2 mm of runoff (92% less than the normal years) and 45.1 ± 25.5 mm of leachate (41% less than the normal seasons). Runoff of metolachlor during the normal-rainfall seasons was 4.5-6.1% of application, whereas leaching was 0.10-0.18%. For the below-normal periods, these losses were 0.07-0.37% of application in runoff and 0.22-0.27% in leachate. Averaged over the three normal and the three below-normal seasons, a 35% reduction in rainfall was accompanied by a 97% reduction in runoff loss and a 71% increase in leachate loss of metolachlor on a percent-of-application basis. The data indicate an increase in preferential flow in the leaching movement of metolachlor from the surface soil layer during the reduced-rainfall periods. Even with increased preferential flow through the soil during the below-average rainfall seasons, leachate loss (percent of application) of the herbicide remained below 0.3%. Compared to the average-rainfall seasons of 1995-1997, the below-normal seasons of 1998-2000 were characterized by a 79% reduction in total runoff and leachate flow and by a 93% reduction in corresponding metolachlor movement via these routes. An added observation was that neither runoff of rainfall nor runoff loss of metolachlor was influenced by the presence of subsurface drains, compared to results from plots without such drains described in an earlier paper.
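
The percent changes quoted above follow directly from the seasonal means reported in the abstract; a quick check:

```python
def pct_change(before, after):
    """Signed percent change relative to the earlier value."""
    return 100.0 * (after - before) / before

print(pct_change(1039, 674))    # rainfall:  -35.1% (the "35% reduction")
print(pct_change(216, 16.2))    # runoff:    -92.5% (reported as 92% less)
print(pct_change(76.9, 45.1))   # leachate:  -41.3% (reported as 41% less)
```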


Transactions of the ASABE | 1995

COMPARISON OF EROSION PREDICTIONS WITH GLEAMS, GLEAMS-WT, AND GLEAMS-SWAT MODELS FOR ALLUVIAL SOILS

M. R. Reyes; R. L. Bengtson; James L. Fouss; Cade E. Carter

Simulation performances of GLEAMS, GLEAMS-WT, and GLEAMS-SWAT were evaluated by comparing their soil loss predictions with measured data from two runoff-erosion-drainage experimental plots at Baton Rouge, Louisiana. One of the experimental plots was surface drained only, and the other was both surface and subsurface drained. Although the hydrology components of GLEAMS-WT and GLEAMS-SWAT predicted surface runoff more accurately than the original GLEAMS, all three models seriously underpredicted total soil losses over a seven-year period (1981 to 1987). Transport capacity limited soil loss prediction values in the models. Hence, we recommend that any changes or modifications in the erosion submodel be focused on improving transport capacity simulation; changes in the detachment simulation routine may not be needed. A calibration parameter was added to the erosion subroutine to adjust transport capacity. However, even when the models were calibrated for a specific site, there were still substantial annual and monthly differences between predicted and observed soil losses. Keywords: GLEAMS, models, runoff, erosion.
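
The transport-capacity limit the authors describe behaves like the sketch below: delivered sediment is capped by what the flow can carry, and a calibration multiplier scales that cap. The names and multiplier form are illustrative, not the GLEAMS erosion code.

```python
def delivered_sediment_t_ha(detached_t_ha, transport_capacity_t_ha, cal=1.0):
    """Sediment yield limited by (calibrated) flow transport capacity."""
    return min(detached_t_ha, cal * transport_capacity_t_ha)

# When capacity sits far below detachment, predictions track capacity,
# which is why calibration targets the transport-capacity routine:
print(delivered_sediment_t_ha(detached_t_ha=12.0, transport_capacity_t_ha=3.0, cal=1.5))  # 4.5
```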


Transactions of the ASABE | 1988

Water Management Increases Sugarcane Yields

Cade E. Carter; James L. Fouss; Victor McDaniel

Three water management systems were installed on a 7-ha tract of Commerce silt loam soil in Assumption Parish, LA in 1983 and 1984 to determine if the water table could be managed on a field-size area and to determine sugarcane response to water management. Each system consisted of closely spaced (15 m) subsurface drain lines connected to a water control sump with facilities for removing water for drainage or adding water for subirrigation. Land adjacent to the tract on which the water management systems were installed was used as a check. Rainfall during the three-year experiment was near normal, 1,528 mm, except in 1985, when rainfall was 211 mm below normal. The soil and the crop responded favorably to water table management. In 1985, the systems were particularly useful for subirrigation during the summer drought. Sugar yields were 875, 1,656, and 1,321 kg/ha more than those in the check in 1984, 1985, and 1986, respectively. At 1987 sugar prices and drain installation cost estimates, these yield increases were more than enough to make annual payments on a water management system, excluding loan interest and tax credits. Crop production efficiency was enhanced, since the amount of sugar produced by the area without water management could be produced with 18% less land if water management were used.
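
The break-even claim can be checked with a standard loan-amortization payment; the sugar price, installation cost, rate, and term below are hypothetical placeholders, not the paper's 1987 figures.

```python
def annual_payment(principal, rate, years):
    """Level annual payment on an amortized loan (standard annuity formula)."""
    return principal * rate / (1.0 - (1.0 + rate) ** -years)

extra_revenue = 1321 * 0.30                 # kg/ha sugar gain x hypothetical $0.30/kg
payment = annual_payment(2500.0, 0.09, 15)  # hypothetical $/ha cost, 9%, 15 years
print(extra_revenue > payment)              # True for these placeholder values
```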


Portland, Oregon, July 9-12, 2006 | 2006

Methods for Removing Nitrate Nitrogen from Agricultural Drainage Waters: A Review and Assessment

Timothy W. Appelboom; James L. Fouss

Widespread adoption of conventional surface and subsurface drainage has increased nitrate losses from agricultural cropland to surface waters, because drainage water bypasses nutrient-removing landscape features (i.e., riparian buffers and wetlands) as it moves from the fields to the stream. A number of approaches have been identified to reduce these nitrate losses, including controlled drainage, routing of drainage water through natural/constructed wetlands and through constructed bioreactors, and in-stream denitrification. Controlled drainage and in-field bioreactors reduce the nitrate load carried off the field in drainage discharge, whereas natural/constructed wetlands and in-stream denitrification are post-drainage methods of nitrate reduction. The potential for nitrate reduction with each of these approaches is: approximately 50% for controlled drainage, 37% to 65% for natural/constructed wetlands (with up to an additional 18% if a berm is used in creating the wetland), 60% to 90% for constructed bioreactors, and 1% to 66% for in-stream denitrification. Combinations of these methods would lead to even higher nitrate removal. A combination of controlled drainage, a constructed wetland, and in-stream denitrification could result in more than 75% nitrate removal prior to release to larger streams or other surface waters, reducing water quality degradation.
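
The combined figure is consistent with treating the methods as sequential, independent removal stages, where the fraction remaining is the product of each stage's remaining fraction; that combination rule is an assumption for illustration, not stated in the review.

```python
def combined_removal(*stage_efficiencies):
    """Overall removal across sequential stages, each removing a 0-1 fraction."""
    remaining = 1.0
    for eff in stage_efficiencies:
        remaining *= 1.0 - eff
    return 1.0 - remaining

# Controlled drainage + constructed wetland + in-stream denitrification:
print(combined_removal(0.50, 0.37, 0.01))  # ~0.69 with low-end stage values
print(combined_removal(0.50, 0.65, 0.66))  # ~0.94 with high-end stage values
```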

Collaboration


Dive into James L. Fouss's collaborations.

Top Co-Authors

Cade E. Carter (Agricultural Research Service)
Lloyd M. Southwick (United States Department of Agriculture)
R. L. Bengtson (Louisiana State University)
Brandon C. Grigg (United States Department of Agriculture)
Guye H. Willis (United States Department of Agriculture)
James R. Cooper (Agricultural Research Service)
D. J. Boethel (Louisiana State University Agricultural Center)
Ted S. Kornecki (United States Department of Agriculture)