Similar literature: 20 records found.
1.
Nitrate in water removed from fields by subsurface drain ('tile') systems is often at concentrations exceeding the 10 mg N L(-1) maximum contaminant level (MCL) set by the USEPA for drinking water and has been implicated in contributing to the hypoxia problem within the northern Gulf of Mexico. Because previous research shows that N fertilizer management alone is not sufficient for reducing NO(3) concentrations in subsurface drainage below the MCL, additional approaches are needed. In this field study, we compared the NO(3) losses in tile drainage from a conventional drainage system (CN) consisting of a free-flowing pipe installed 1.2 m below the soil surface to losses in tile drainage from two alternative drainage designs. The alternative treatments were a deep tile (DT), where the tile drain was installed 0.6 m deeper than the conventional tile depth, but with the outlet maintained at 1.2 m, and a denitrification wall (DW), where trenches excavated parallel to the tile and filled with woodchips serve as additional carbon sources to increase denitrification. Four replicate 30.5- by 42.7-m field plots were installed for each treatment in 1999 and a corn-soybean rotation initiated in 2000. Over 5 yr (2001-2005) the tile flow from the DW treatment had annual average NO(3) concentrations significantly lower than the CN treatment (8.8 vs. 22.1 mg N L(-1)). This represented an annual reduction in NO(3) mass loss of 29 kg N ha(-1) or a 55% reduction in nitrate mass lost in tile drainage for the DW treatment. The DT treatment did not consistently lower NO(3) concentrations, nor reduce the annual NO(3) mass loss in drainage. The DT treatment did exhibit lower NO(3) concentrations in tile drainage than the CN treatment during late summer when tile flow rates were minimal. There was no difference in crop yields for any of the treatments. Thus, denitrification walls are able to substantially reduce NO(3) concentrations in tile drainage for at least 5 yr.
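The load and concentration figures in this abstract are linked through cumulative drain flow: 1 mm of drainage over 1 ha is 10 m(3), and 1 mg/L is 1 g/m(3), so load (kg/ha) = flow (mm) x concentration (mg/L) x 0.01. A minimal Python sketch of that conversion; the flow value is purely illustrative, not a measurement from the study (the study's 55% load reduction also reflects treatment differences in drain flow, which equal flows cannot reproduce):

def nitrate_load_kg_per_ha(flow_mm: float, conc_mg_per_l: float) -> float:
    # 1 mm drainage over 1 ha = 10 m^3; 1 mg/L = 1 g/m^3, hence the 0.01 factor.
    return flow_mm * conc_mg_per_l * 0.01

# Illustrative annual drain flow of 240 mm applied to both reported concentrations:
cn_load = nitrate_load_kg_per_ha(240.0, 22.1)  # conventional system, 22.1 mg N/L
dw_load = nitrate_load_kg_per_ha(240.0, 8.8)   # denitrification wall, 8.8 mg N/L
print(f"loss reduction: {cn_load - dw_load:.1f} kg N/ha ({1 - dw_load / cn_load:.0%})")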

2.
Abstract: In this study, a set of nitrogen reduction strategies was modeled to evaluate the feasibility of improving water quality to meet total maximum daily loads (TMDLs) in two agricultural watersheds. For this purpose, a spatial‐process model was calibrated and used to predict monthly nitrate losses (1994‐96) from Sand and Bevens Creek watersheds located in south‐central Minnesota. Statistical comparison of predicted and observed flow and nitrate losses gave r2 coefficients of 0.75 and 0.70 for Sand Creek watershed and 0.72 and 0.67 for Bevens Creek watershed, respectively. Modeled alternative agricultural management scenarios included six different N application rates over three application timings and three different percentages of crop land with subsurface drainage. Predicted annual nitrate losses were then compared with nitrate TMDLs assuming a 30% reduction in observed nitrate losses is required. Reductions of about 33% (8.6 to 5.8 kg/ha) and 35% (23 to 15 kg/ha) in existing annual nitrate losses are possible for Sand and Bevens Creek watersheds, respectively, by switching the timing of fertilizer application from fall to spring. Trends towards increases in tile‐drained crop land imply that attaining nitrate TMDLs in the future may require other alternative management practices in addition to fertilizer management, such as partial conversion of crop land to pasture.

3.
Subsurface tile drainage from row-crop agricultural production systems has been identified as a major source of nitrate entering surface waters in the Mississippi River basin. Noncontrollable factors such as precipitation and mineralization of soil organic matter have a tremendous effect on drainage losses, nitrate concentrations, and nitrate loadings in subsurface drainage water. Cropping system and nutrient management inputs are controllable factors that have a varying influence on nitrate losses. Row crops leak substantially greater amounts of nitrate compared with perennial crops; however, satisfactory economic return with many perennials is an obstacle at present. Improving N management by applying the correct rate of N at the optimum time and giving proper credits to previous legume crops and animal manure applications will also lead to reduced nitrate losses. Nitrate losses have been shown to be minimally affected by tillage systems compared with N management practices. Scientists and policymakers must understand these factors as they develop educational materials and environmental guidelines for reducing nitrate losses to surface waters.

4.
5.
Environmental pressure to reduce nutrient losses from agricultural fields has increased in recent years. To abate this nutrient loss to the environment, better management practices and new technologies need to be developed. Thus, research was conducted to evaluate whether subsurface banding poultry litter (PL) would reduce nitrogen (N) and phosphorus (P) loss in surface water runoff using a four-row prototype implement. Rainfall simulations were conducted to create a 40-min runoff event in an established bermudagrass (Cynodon dactylon L.) pasture on soil types common to the Coastal Plain and Piedmont regions. The Coastal Plain soil type was a Marvyn loamy sand (fine-loamy, kaolinitic, thermic Typic Kanhapludults) and the Piedmont soil type was a Hard Labor loamy sand (fine, kaolinitic, thermic Oxyaquic Kanhapludults). Treatments consisted of surface- and subsurface-applied PL at a rate of 9 Mg ha(-1), surface broadcast-applied commercial fertilizer (CF; urea and triple superphosphate blend) at the equivalent N (330 kg N ha(-1)) and P (315 kg P ha(-1)) content of PL, and a nonfertilized control. The greatest loss for inorganic N, total N, dissolved reactive P (DRP), and total P occurred with the surface broadcast treatments, with CF contributing to the greatest loss. The subsurface banded treatment reduced N and P in surface water runoff to levels comparable to the nonfertilized control. Subsurface banding of PL reduced concentrations of inorganic N by 91%, total N by 90%, DRP by 86%, and total P by 86% in runoff water compared with surface broadcast PL. These results show that subsurface band-applied PL can greatly reduce the impact of N and P loss to the environment compared with conventional surface-applied PL and CF practices.

6.
Nonpoint-source pollution by phosphorus (P) poses a threat to waters in the Taihu Lake basin in China. The potential transfer of P in rice (Oryza sativa L.) fields through surface drainage and subsurface flow was investigated under simulated conventional irrigation-drainage management. Surface drainage events were conducted to avoid overflow across the plots after heavy rainfall and for rice harvest, at which time P losses were also investigated. This study was conducted in 2001 in a long-term rice field experiment. The experimental plots were treated with 0, 26, or 52 kg P ha(-1) as superphosphate or 26 kg P ha(-1) with equal parts of P supplied as superphosphate and pig manure. Phosphorus concentrations and loads in field floodwater on plots receiving P rapidly declined in a nonlinear manner before the first drainage, three weeks after fertilizer application. The combined application of fertilizer and manure P resulted in higher P transfer potential in field floodwater than with fertilizer P alone one week after P application. Phosphorus concentrations in interflow water sampled by Teflon suction cups inserted at a depth of 150 to 200 mm gradually increased within two weeks after P application, then declined. The concentration of P in interflow water was related to soil P buildup from long-term P application, as well as recently applied P. The 26 kg P ha(-1) treatment (the conventional P rate in this region) resulted in a loss of 0.74 kg total phosphorus (TP) ha(-1) and a drainage-weighted average concentration of 0.25 mg TP L(-1) from the three surface drainage events. Results indicate that avoiding overflow drainage after P input and extending the time between P application and drainage may reduce P losses from rice paddies.

7.
A paired watershed study consisting of agroforestry (trees plus grass buffer strips), contour strips (grass buffer strips), and control treatments with a corn (Zea mays L.)-soybean [Glycine max (L.) Merr.] rotation was used to examine treatment effects on runoff, sediment, and nutrient losses. During the calibration period (1991-1997) and the subsequent three-year treatment period, runoff was measured in 0.91- and 1.37-m H-flumes with bubbler flow meters. Composite samples were analyzed for sediment, total phosphorus (TP), total nitrogen (TN), nitrate, and ammonium. Calibration equations developed to predict runoff, sediment, and nutrient losses explained 66 to 97% of the variability between treatment watersheds. The contour strip and agroforestry treatments reduced runoff by 10 and 1%, respectively, during the treatment period. In both treatments, most runoff reductions occurred in the second and third years after treatment establishment. The contour strip treatment reduced erosion by 19% in 1999, while erosion in the agroforestry treatment exceeded the predicted loss. Treatments reduced TP loss by 8 and 17% on the contour strip and agroforestry watersheds, respectively. Treatments did not result in reductions in TN during the first two years of the treatment period. The contour strip and agroforestry treatments reduced TN loss by 21 and 20%, respectively, during a large precipitation event in the third year. During the third year of treatments, nitrate N loss was reduced 24 and 37% by the contour strip and agroforestry treatments, respectively. Contour strip and agroforestry management practices effectively reduced nonpoint-source pollution in runoff from a corn-soybean rotation in the claypan soils of northeastern Missouri.
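The paired-watershed design mentioned above works by regressing treatment-watershed losses on control-watershed losses during the calibration period and then treating departures from that regression as the treatment effect. A hedged Python sketch of that calculation; the event loads are placeholders, not data from the study:

import numpy as np

def paired_watershed_reduction(control_cal, treat_cal, control_post, treat_post):
    # Fit the calibration equation (treatment load as a linear function of control load),
    # predict what the treatment watershed would have lost without treatment,
    # and report the fractional reduction relative to that prediction.
    slope, intercept = np.polyfit(control_cal, treat_cal, 1)
    predicted = slope * np.asarray(control_post) + intercept
    return 1.0 - np.asarray(treat_post).sum() / predicted.sum()

# Placeholder storm-event loads (kg/ha), purely illustrative:
r = paired_watershed_reduction(
    control_cal=[1.2, 0.8, 2.5, 0.4], treat_cal=[1.0, 0.7, 2.1, 0.3],
    control_post=[1.5, 0.9], treat_post=[1.0, 0.6])
print(f"estimated treatment-period reduction: {r:.0%}")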

8.
Predicting nitrate leaching under potato crops using transfer functions
Nitrate leaching is a major issue in many cultivated soils. Models that predict the major processes involved at the field scale could be used to test and improve management practices. This study aims to evaluate a simple transfer function approach to predict nitrate leaching in sandy soils. A convective lognormal transfer (CLT) function is convolved with functional equations simulating N mineralization, plant N uptake, N fertilizer dissolution, and nitrification at the soil surface to predict solute concentrations under potato (Solanum tuberosum L.) and barley (Hordeum vulgare L.) fields as a function of drainage water. Using this approach, nitrate flux concentrations measured in drainable lysimeters (1-m soil depth) were reasonably well predicted from 29 Apr. 1996 to 3 Dec. 1996. With average application rates of 16.9 g m(-2) of N fertilizer in potato crops, mean nitrate-leaching losses measured under potato were 8.5 g N m(-2). Tuber N uptake averaged 9.7 g N m(-2), and soil mineral N at the start (spring) and end (fall) of the N mass balance period averaged 1.7 and 4.5 g N m(-2), respectively. Soil N mineralization was estimated by difference (4.3 g N m(-2) on average) and was small compared with N fertilization. Small nitrate flux concentrations at the beginning of the cropping season (May) resulted mainly from initial soil nitrate concentrations. Measured and predicted nitrate flux concentrations increased significantly at mid-season (July-August) following major drainage events coupled with complete dissolution and nitrification of N fertilizers and declining N uptake by potato plants. Decreases in nitrate concentrations toward the end of the year (November-December) underlined the predominant effect of N fertilizers, applied for the most part at planting, acting as a pulse input of solute.
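The transfer-function approach summarized above treats the nitrate flux concentration at the lysimeter depth as the surface input convolved with a lognormal travel density expressed in cumulative drainage. The sketch below assumes the standard convective lognormal transfer (CLT) form; the parameters and the pulse input are placeholders, not the calibrated values from the study:

import numpy as np

def clt_pdf(i, mu, sigma):
    # Lognormal travel density in cumulative net drainage i (mm):
    # f(i) = exp(-(ln i - mu)^2 / (2 sigma^2)) / (i sigma sqrt(2 pi))
    i = np.asarray(i, dtype=float)
    out = np.zeros_like(i)
    pos = i > 0
    out[pos] = np.exp(-(np.log(i[pos]) - mu) ** 2 / (2 * sigma ** 2)) / (
        i[pos] * sigma * np.sqrt(2 * np.pi))
    return out

def flux_concentration(c_input, di, mu, sigma):
    # Discrete convolution of the surface flux concentration with the CLT kernel,
    # on a uniform grid of drainage increments di (mm); keep the causal part.
    kernel = clt_pdf(np.arange(len(c_input)) * di + di / 2.0, mu, sigma) * di
    return np.convolve(c_input, kernel)[: len(c_input)]

# Placeholder: a nitrate pulse at the surface between 20 and 40 mm of cumulative drainage.
c_in = np.zeros(400)
c_in[20:40] = 50.0  # mg N/L
c_out = flux_concentration(c_in, di=1.0, mu=np.log(120.0), sigma=0.6)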

9.
ABSTRACT: A large number of agricultural drainage wells (ADWs) are located in north-central Iowa. These wells permit sediments, pesticides, nitrate, and bacteria in surface and subsurface drainage water to enter regional aquifers that are currently being used for drinking-water supplies, mostly by rural families and communities. This paper reports some possible alternatives to control the entry of surface and subsurface drainage waters into groundwater systems, and describes a methodology to make comprehensive economic feasibility studies of alternative drainage outlets. The estimated cost of providing main subsurface drains varied from $220 to $960 per hectare. If the use of ADWs was completely eliminated without providing alternative drainage, it is estimated that the average annual loss to the farmers of the area would be at least $270 per hectare in reduced crop yields. Of course, losses would be weather dependent and highly variable. Management practices to reduce the pollutant load in water draining to ADWs are also discussed.

10.
Nitrate losses from subsurface tile drained row cropland in the Upper Midwest U.S. contribute to hypoxia in the Gulf of Mexico. Strategies are needed to reduce nitrate losses to the Mississippi River. This paper evaluates the effect of fertilizer rate and timing on nitrate losses in two (East and West) commercial row crop fields located in south-central Minnesota. The Agricultural Drainage and Pesticide Transport (ADAPT) model was calibrated and validated for monthly subsurface tile drain flow and nitrate losses for the period 1999-2003. Good agreement was found between observed and predicted tile drain flow and nitrate losses during the calibration period, with Nash-Sutcliffe modeling efficiencies of 0.75 and 0.56, respectively. Better agreement was observed for the validation period. The calibrated model was then used to evaluate the effects of rate and timing of fertilizer application on nitrate losses with a 50-yr climatic record (1954-2003). Significant reductions in nitrate losses were predicted by reducing fertilizer application rates and changing timing. A 13% reduction in nitrate losses was predicted when the fall fertilizer application rate was reduced from 180 to 123 kg/ha. A further 9% reduction in nitrate losses can be achieved by switching from fall to spring application. Larger reductions in nitrate losses would require changes in fertilizer rate and timing, as well as other practices such as changing tile drain spacings and/or depths, fall cover cropping, or conversion of crop land to pasture.
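The Nash-Sutcliffe modeling efficiencies reported above (0.75 for tile flow, 0.56 for nitrate loss) compare the squared simulation error with the variance of the observations; 1 is a perfect fit and 0 means the model is no better than the observed mean. A small Python sketch with made-up monthly values:

def nash_sutcliffe(observed, simulated):
    # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative monthly tile-flow depths (mm), not data from the study:
obs = [12.0, 30.0, 55.0, 40.0, 8.0, 2.0]
sim = [10.0, 35.0, 50.0, 38.0, 10.0, 4.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")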

11.
A significant portion of the NO3 from agricultural fields that contaminates surface waters in the Midwest Corn Belt is transported to streams or rivers by subsurface drainage systems or "tiles." Previous research has shown that N fertilizer management alone is not sufficient for reducing NO3 concentrations in subsurface drainage to acceptable levels; therefore, additional approaches need to be devised. We compared two cropping system modifications for NO3 concentration and load in subsurface drainage water for a no-till corn (Zea mays L.)-soybean (Glycine max [L.] Merr.) management system. In one treatment, eastern gamagrass (Tripsacum dactyloides L.) was grown in permanent 3.05-m-wide strips above the tiles. For the second treatment, a rye (Secale cereale L.) winter cover crop was seeded over the entire plot area each year near harvest and chemically killed before planting the following spring. Twelve 30.5- by 42.7-m subsurface-drained field plots were established in 1999 with an automated system for measuring tile flow and collecting flow-weighted samples. Both treatments and a control were initiated in 2000 and replicated four times. Full establishment of both treatments did not occur until fall 2001 because of dry conditions. Treatment comparisons were conducted from 2002 through 2005. The rye cover crop treatment significantly reduced subsurface drainage water flow-weighted NO3 concentrations and NO3 loads in all 4 yr. The rye cover crop treatment did not significantly reduce cumulative annual drainage. Averaged over 4 yr, the rye cover crop reduced flow-weighted NO3 concentrations by 59% and loads by 61%. The gamagrass strips did not significantly reduce cumulative drainage, the average annual flow-weighted NO3 concentrations, or cumulative NO3 loads averaged over the 4 yr. Rye winter cover crops grown after corn and soybean have the potential to reduce the NO3 concentrations and loads delivered to surface waters by subsurface drainage systems.

12.
Subsurface drainage is a beneficial water management practice in poorly drained soils but may also contribute substantial nitrate N loads to surface waters. This paper summarizes results from a 15-yr drainage study in Indiana that includes three drain spacings (5, 10, and 20 m) managed for 10 yr with chisel tillage in monoculture corn (Zea mays L.) and currently managed under a no-till corn-soybean [Glycine max (L.) Merr.] rotation. In general, drainflow and nitrate N losses per unit area were greater for narrower drain spacings. Drainflow removed between 8 and 26% of annual rainfall, depending on year and drain spacing. Nitrate N concentrations in drainflow did not vary with spacing, but concentrations have significantly decreased from the beginning to the end of the experiment. Flow-weighted mean concentrations decreased from 28 mg L(-1) in the 1986-1988 period to 8 mg L(-1) in the 1997-1999 period. The reduction in concentration was due both to a reduction in fertilizer N rates over the study period and to the addition of a winter cover crop as a "trap crop" after corn in the corn-soybean rotation. Annual nitrate N loads decreased from 38 kg ha(-1) in the 1986-1988 period to 15 kg ha(-1) in the 1997-1999 period. Most of the nitrate N losses occurred during the fallow season, when most of the drainage occurred. Results of this study underscore the necessity of long-term research on different soil types and in different climatic zones, to develop appropriate management strategies for both economic crop production and protection of environmental quality.
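The flow-weighted mean concentrations quoted above (28 mg L(-1) falling to 8 mg L(-1)) weight each sample by the drain flow it represents rather than averaging concentrations directly, so large, dilute flow events pull the value down. A short Python sketch with illustrative values:

def flow_weighted_mean(flows, concs):
    # Flow-weighted mean concentration = sum(Q_i * C_i) / sum(Q_i)
    return sum(q * c for q, c in zip(flows, concs)) / sum(flows)

# Illustrative per-event drain flows (mm) and NO3-N concentrations (mg/L):
flows = [5.0, 20.0, 60.0, 15.0]
concs = [30.0, 25.0, 12.0, 18.0]
print(f"flow-weighted mean = {flow_weighted_mean(flows, concs):.1f} mg/L")
# 16.4 mg/L here, versus a simple arithmetic mean of about 21 mg/L.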

13.
Controlled drainage and wetlands could be very effective practices to control nitrogen pollution in the low-lying agricultural plains of northeast Italy, but they are not as popular as in other countries. An experiment on lysimeters was therefore carried out in 1996-1998, with the dual aims of obtaining local information to encourage the implementation of these practices and of gaining more knowledge on the effects involved. Controlled drainage + subirrigation and wetlands were all considered as natural systems where alternative water table management could ameliorate water quality, and were compared with a typical water management scheme for crops in the open field. Eight treatments were considered: free drainage on maize (Zea mays L.) and sugarbeet (Beta vulgaris L.), two treatments of controlled drainage on the same crops, and five wetland treatments using common reed [Phragmites australis (Cav.) Trin. ex Steud.], common cattail (Typha latifolia L.), and tufted sedge (Carex elata All.), with different water table or flooding levels. Lysimeters received about 130 g m(-2) of N with fertilization and irrigation water, with small differences among treatments. The effects of treatments were more evident for NO3-N concentrations than for the other chemical parameters (total Kjeldahl nitrogen, pH, and electrical conductivity), with significantly different medians among free drainage (33 mg L(-1)), controlled drainage (1.6 and 2.6 mg L(-1)), and wetlands (0.5-0.7 mg L(-1)). Relative to free drainage, NO3-N losses were reduced by 46 to 63% in controlled drainage and by 95% on average in the wetlands. Wetlands also reduced losses of total dissolved solids from 253 g m(-2) (average of crop treatments) to 175 g m(-2) (average of wetlands).

14.
The nitrates (NO(3)-N) lost through subsurface drainage in the Midwest often exceed concentrations that cause deleterious effects on the receiving streams and lead to hypoxic conditions in the northern Gulf of Mexico. The use of drainage and water quality models along with observed data analysis may provide new insight into the water and nutrient balance in drained agricultural lands and enable evaluation of appropriate measures for reducing NO(3)-N losses. DRAINMOD-NII, a carbon (C) and nitrogen (N) simulation model, was field tested for the high organic matter Drummer soil in Indiana and used to predict the effects of fertilizer application rate and drainage water management (DWM) on NO(3)-N losses through subsurface drainage. The model was calibrated and validated for continuous corn (Zea mays L.) (CC) and corn-soybean [Glycine max (L.) Merr.] (CS) rotation treatments separately using 7 yr of drain flow and NO(3)-N concentration data. Among the treatments, the Nash-Sutcliffe efficiency of the monthly NO(3)-N loss predictions ranged from 0.30 to 0.86, and the percent error varied from -19 to 9%. The medians of the observed and predicted monthly NO(3)-N losses were not significantly different. When the fertilizer application rate was reduced by ~20%, the predicted NO(3)-N losses in drain flow from the CC treatment were reduced by 17% (95% confidence interval [CI], 11-25), while losses from the CS treatment were reduced by 10% (95% CI, 1-15). With DWM, the predicted average annual drain flow was reduced by about 56% (95% CI, 49-67), while the average annual NO(3)-N losses through drain flow were reduced by about 46% (95% CI, 32-57) for both tested crop rotations. However, the simulated NO(3)-N losses in surface runoff increased by about 3 to 4 kg ha(-1) with DWM. For the simulated conditions at the study site, implementing DWM along with reduced fertilizer application rates would be the best strategy to achieve the highest NO(3)-N loss reductions to surface water. The suggested best strategies would reduce the NO(3)-N losses to surface water by 38% (95% CI, 29-46) for the CC treatment and by 32% (95% CI, 23-40) for the CS treatment.

15.
In some high-fertility, high-stocking-density grazing systems, nitrate (NO(3)) leaching can be great, and ground water NO(3)-N concentrations can exceed maximum contaminant levels. To reduce high N leaching losses and concentrations, alternative management practices need to be used. At the North Appalachian Experimental Watershed near Coshocton, OH, two management practices were studied with regard to reducing NO(3)-N concentrations in ground water. These followed a fertilized, rotational grazing management practice under which ground water NO(3)-N concentrations had exceeded maximum contaminant levels. Using four small watersheds (each approximately 1 ha), rotational grazing of a grass forage without N fertilizer being applied and unfertilized grass forage removed as hay were used as alternative management practices to the previous fertilized pastures. Ground water was sampled at spring developments, which drained the watershed areas, over a 7-yr period. Peak ground water NO(3)-N concentrations before the 7-yr study period ranged from 13 to 25.5 mg L(-1). Ground water NO(3)-N concentrations progressively decreased under each watershed and both management practices. Following five years of the alternative management practices, ground water NO(3)-N concentrations ranged from 2.1 to 3.9 mg L(-1). Both grazing and haying, without N fertilizer being applied to the forage, were similarly effective in reducing the NO(3)-N levels in ground water. This research shows two management practices that can be effective in reducing high NO(3)-N concentrations resulting from high-fertility, high-stocking-density grazing systems, including an option to continue grazing.

16.
This study was designed to evaluate the improved version of the Root Zone Water Quality Model (RZWQM) using 6 yr (1992-1997) of field-measured data from a field within the Walnut Creek watershed located in central Iowa. Measured data included subsurface drainage flows, NO3-N concentrations and loads in subsurface drainage water, and corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] yields. The dominant soil within this field was Webster (fine-loamy, mixed, superactive, mesic Typic Endoaquolls) and the cropping system was a corn-soybean rotation. The model was calibrated with 1992 data and was validated with 1993 to 1997 data. Simulations of subsurface drainage flow closely matched observed data, showing a model efficiency (EF) of 0.99 and a difference (D) of 1% between measured and predicted data. The model simulated NO3-N losses with subsurface drainage water reasonably well, with EF = 0.8 and D = 13%. The simulated corn grain yields were in close agreement with measured data, with D < 10%. Nitrogen-scenario simulations demonstrated that the corn yield response function reached a plateau when the N-application rate exceeded 90 kg ha(-1). The fraction of applied N lost with subsurface drainage water varied from 7 to 16% when the N-application rate varied from 30 to 180 kg ha(-1), after accounting for the nitrate loss with no fertilizer application. These results indicate that the RZWQM has the potential to simulate the impact of N application rates on corn yields and NO3-N losses with subsurface drainage flows for agricultural fields in central Iowa.

17.
Large and repeated manure applications can exceed the P sorption capacity of soil and increase P leaching and losses through subsurface drainage. The objective of this study was to evaluate the fate of P applied with increasing N rates in dairy wastewater or poultry litter on grassland during a 4-yr period. In addition to P recovery in forage, soil-test phosphorus (STP) was monitored at depths to 180 cm in a Darco loamy sand (loamy, siliceous, semiactive, thermic Grossarenic Paleudults) twice annually. A split-plot arrangement of a randomized complete block design comprised four annual N rates (0, 250, 500, and 1000 kg ha(-1)) for each nutrient source on coastal bermudagrass [Cynodon dactylon (L.) Pers.] over-seeded with ryegrass (Lolium multiflorum L. cv. TAM90). Increasing annual rates of N and P in wastewater and poultry litter increased P removal in forage (P = 0.001). At the highest N rate of each nutrient source, less than 13% of applied P was recovered in forage. The highest N rates delivered 8 times more P in wastewater or 15 times more P in poultry litter than was removed in forage harvests during an average year. Compared with controls, annual P rates up to 188 kg ha(-1) in dairy wastewater did not increase STP concentrations at depths below 30 cm. In contrast, the highest annual P rate (590 kg ha(-1)) in poultry litter increased STP above that of controls at depth intervals to 120 cm during the first year of sampling. Increases in STP at depths below 30 cm in the Darco soil were indicative of excessive P rates that could contribute to nonpoint-source pollution in outflows from subsoil through subsurface drainage.

18.
Agriculture in the U.S. Midwest faces the formidable challenge of improving crop productivity while simultaneously mitigating the environmental consequences of intense management. This study examined the simultaneous response of nitrate nitrogen (NO3-N) leaching losses and maize (Zea mays L.) yield to varied fertilizer N management using field observations and the Integrated BIosphere Simulator (IBIS) model. The model was validated against six years of field observations in chisel-plowed maize plots receiving an optimal (180 kg N ha(-1)) fertilizer N application and in N-unfertilized plots on a silt loam soil near Arlington, Wisconsin. Predicted values of grain yield, harvest index, plant N uptake, residue C to N ratio, leaf area index (LAI), grain N, and drainage were within 20% of observations. However, simulated NO3-N leaching losses, NO3-N concentrations, and net N mineralization exhibited less interannual variability than observations, and had higher levels of error (20-65%). Potential effects of 30% higher (234 kg N ha(-1)) and 30% lower (126 kg N ha(-1)) fertilizer N use (from optimal) on NO3-N leaching loss and maize yield were simulated. A 30% increase in fertilizer N use increased annual NO3-N leaching by 56%, while yield increased by only 1%. The NO3-N concentration in the leachate solution at 1.4 m below the soil surface was 30.7 mg L(-1). When fertilizer N use was reduced by 30% (from optimal), annual NO3-N leaching losses declined by 42% after seven years, and annual average yield only decreased by 8%. However, NO3-N concentration in the leachate solution remained above 10 mg L(-1) (11.3 mg L(-1)). Clearly, nonlinear relationships existed between changes in fertilizer use and NO3-N leaching losses over time. Simulated changes in NO3-N leaching were greater in magnitude than fertilizer N use changes.

19.
ABSTRACT: The persistence of water quality problems has directed attention towards the reduction of agricultural nonpoint sources of phosphorus (P) and nitrogen (N). We assessed the practical impact of three management scenarios to reduce P and N losses from a mixed land use watershed in central Pennsylvania, USA. Under Scenario 1 (an agronomic soil P threshold of 100 mg Mehlich-3 P kg(-1), above which no crop response is expected), 81 percent of our watershed would receive no P as fertilizer or manure. Scenario 2 (an environmental soil P threshold of 195 mg Mehlich-3 P kg(-1), above which the loss of P in surface runoff and subsurface drainage increases greatly) restricts future P inputs in only 51 percent of the watershed. Finally, under Scenario 3 (P and N indices that account for likely source and transport risks), 25 percent of the watershed was at high or greater risk of P loss, while 60 percent of the watershed was classified as at high risk of nitrate (NO3) leaching. Areas at risk of P loss were near the stream channel, while areas at risk of NO3 leaching were near the boundaries of the watershed, where freely draining soils and high manure and fertilizer N applications coincide. Remedial measures to minimize P export should focus on critical source areas, while remedial measures to reduce N losses should be source based, concentrating on more efficient use of N by crops.
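The first two screening scenarios above reduce to simple decision rules once the soil-test thresholds are fixed. A hedged Python sketch of that logic; the thresholds come from the abstract, while the function name and wording are illustrative (Scenario 3 relies on site-specific source and transport factors and is not a single soil-test cutoff):

AGRONOMIC_THRESHOLD = 100      # mg Mehlich-3 P per kg soil (Scenario 1)
ENVIRONMENTAL_THRESHOLD = 195  # mg Mehlich-3 P per kg soil (Scenario 2)

def p_application_allowed(mehlich3_p: float, scenario: int) -> bool:
    # Return whether further P applications would be permitted for a field.
    if scenario == 1:
        return mehlich3_p < AGRONOMIC_THRESHOLD
    if scenario == 2:
        return mehlich3_p < ENVIRONMENTAL_THRESHOLD
    raise ValueError("Scenario 3 uses P and N indices, not a single soil-test threshold")

print(p_application_allowed(150.0, scenario=1))  # False: above the agronomic threshold
print(p_application_allowed(150.0, scenario=2))  # True: below the environmental threshold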

20.
ABSTRACT: Surface and subsurface drainage make crop production economically viable in much of southern Minnesota because drainage allows timely field operations and protects field crops from extended periods of flooded soil conditions. However, subsurface drainage has been shown to increase nitrate-nitrogen losses to receiving waters. When engaging in drainage activities, farmers are increasingly being asked to consider, apart from the economic profit, the environmental impact of drainage. The Agricultural Drainage and Pesticide Transport model (ADAPT) was used in this study to evaluate the impact of subsurface drainage design on the soil water balance over a two‐year period during which observed drainage discharge data were available. Twelve modeling scenarios incorporated four drainage coefficients (DC) of 0.64 cm/d, 0.95 cm/d, 1.27 cm/d, and 1.91 cm/d and three drain depths of 0.84 m, 1.15 m, and 1.45 m. The baseline condition corresponded to the drainage system specifications at the field site: a drain depth and spacing of 1.45 m and 28 m, respectively (DC of 0.64 cm/d). The results of the two‐year simulation suggested that for a given drainage coefficient, soils with shallower drains (but equal DC) generally have less subsurface drainage and can produce more runoff (but reduced total discharge) and evapotranspiration. The results also suggested that it may be possible to design for both water and nitrate-nitrogen reduction and for crop water needs.
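The abstract does not state which design equation links the drainage coefficients to the drain depth and spacing combinations, but the steady-state Hooghoudt equation is the conventional relation. The Python sketch below is therefore only an assumed illustration; the hydraulic conductivity, equivalent depth, and allowable head are placeholders chosen so that the 28-m spacing roughly reproduces the 0.64 cm/d baseline coefficient:

def hooghoudt_drainage_rate(spacing_m, k_m_per_d, d_equiv_m, head_m):
    # Steady-state Hooghoudt equation: q = (8*K*d*h + 4*K*h^2) / L^2, in m/d.
    return (8.0 * k_m_per_d * d_equiv_m * head_m
            + 4.0 * k_m_per_d * head_m ** 2) / spacing_m ** 2

# Assumed values (not from the study): K = 1.0 m/d, equivalent depth d = 1.0 m,
# allowable midpoint water table head h = 0.5 m.
for spacing in (20.0, 28.0, 40.0):
    q_cm_per_d = hooghoudt_drainage_rate(spacing, 1.0, 1.0, 0.5) * 100.0
    print(f"spacing {spacing:4.0f} m -> drainage coefficient {q_cm_per_d:.2f} cm/d")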
