Influence of Tillage Systems and Winter Cover on Off-Site Losses of Sediment, Nutrients and Atrazine in Silage Corn Production

Nonpoint source pollution has been recognized as a primary cause of water pollution in the United States, and agricultural activities have been cited as the leading contributor of nonpoint source pollutants. Runoff and eroded sediment are the primary transport agents for agrichemical losses from agricultural fields. Conservation tillage has been promoted over the past two decades as a cost-effective agronomic practice that can reduce runoff and erosion from agricultural fields. The goal of this study was to compare the edge-of-field losses of waterborne substances from conservation tillage and conventional tillage plots, both with and without a winter cover crop, under corn-for-silage management. Corn for silage is a prevailing practice in New England and comprises about 20 percent of the total acres harvested. Twelve field plots measuring 3.4 meters wide by 22.1 meters long with a 2.5 percent slope were equipped with an overland flow collection system. Runoff was monitored during the 1985 and 1986 growing seasons (June through November). Runoff samples were analyzed for sediment, nitrogen, and atrazine content. Runoff occurred in 22 of the 51 rainfall events during the study period. In all treatments, 57 to 62 percent of the runoff and 70 to 77 percent of the soil loss were associated with excessive rate storms. Runoff and soil loss were considerably higher on plots with less than 30 percent residue cover. Surface residue from the winter cover crop reduced runoff and soil loss by 29 and 54 percent, respectively, compared to plots without the winter cover crop.
Total nitrogen losses through overland flow during the 1986 growing season ranged from 0.33 to 3.42 kg/ha, or 0.1 to 1.3 percent of the applied nitrogen. Nitrogen losses were highest on plots without a winter cover crop. Total Kjeldahl nitrogen accounted for 89.5 to 94 percent of the total nitrogen loss. The greatest losses of total Kjeldahl nitrogen were associated with the events that had the greatest sediment movement.
Total atrazine losses through overland flow were less than 0.5 percent of that applied for all treatments. Atrazine losses were 74 percent lower in conservation tillage systems than in tillage systems with less than 30 percent residue cover. Tillage method had no significant effect on flow-weighted atrazine concentrations in runoff.
The hydrology component of the CREAMS computer model predicted runoff closest to the observed runoff values using the breakpoint method in the conventional system and the curve number method in the no-till system. The breakpoint method performed better than the curve number method for small, intense storms.
In order to obtain close agreement between predicted and observed runoff values, adjustments to soil parameter values beyond the recommended ranges were required (Vigon).
Agricultural activities have been cited as the leading contributor of nonpoint source pollutants in the United States.
According to the 1977 National Water Quality Inventory (USEPA, 1978), surface water quality was affected by agricultural nonpoint pollution in 68 percent of the drainage basins throughout the country. Another study conducted by Resources for the Future, Inc. concluded that about 66 percent of the suspended solids loading in our nation's rivers is attributed to agricultural sources (Duda, 1985). The Conservation Foundation estimates that agricultural sources contribute about 70 percent of the 4.5 billion tons of soil loss estimated to occur each year throughout the country (Clark, 1983).
Along with soil erosion, the increasing use of pesticides and fertilizers poses a major threat to the quality of surface waters and drinking water supplies. The use of commercial nitrogenous fertilizers has increased five-fold from 2,400,000 metric tons in 1960 to 11,300,000 metric tons in 1980 (Ritter, 1986). In 1980 about 300,000 tons of pesticides were used in agriculture. Pesticide use is projected to exceed 1 million tons by the end of the decade (Chesters and Schierow, 1985).
Considering that 63 percent of all nonfederal land in the United States is used for agricultural purposes (USDA, 1981), it is not surprising that agricultural activities have been cited as a major source of nonpoint source pollution. Best management practices designed to reduce soil erosion and agrichemical losses from croplands could lead to substantial improvements in the quality of surface water downstream from agricultural land.
Conservation tillage has been promoted over the past two decades as a cost-effective agronomic practice that can reduce overland flow and erosion. Conservation tillage, as defined by the Soil Conservation Society of America, is any form of tillage that leaves at least 30 percent of the soil surface covered with crop residue after planting (SCSA, 1982).
Conservation tillage practices range from no-till farming, where planting occurs in the undisturbed residue of the previous crop, to modified tillage practices such as chisel plowing, disking, or ridge planting. The conventional moldboard plow buries at least 95 percent of the surface residue, leaving the bare soil exposed to erosive elements.
The common element in all conservation tillage practices is that soil disturbance is reduced and an appreciable amount of crop residue is left at the surface. This additional residue on the soil surface has been shown to be very effective in reducing soil detachment and sediment loss. The goal of this study was to compare the edge-of-field losses of waterborne substances from conservation and conventionally tilled plots, both with and without the use of a winter cover crop, under a corn-for-silage management program. To accomplish this goal, a replicated field study of overland flow from natural rainstorms was conducted.
Corn harvested as silage is a principal crop in New England and constitutes about 20 percent of the total harvested acres.
Corn harvested for silage leaves relatively small amounts of crop residue on the surface after harvest (McGregor and Greer, 1982; Wendt and Burwell, 1985).

High erosion rates have been observed when insufficient plant residue has been left on the soil surface (Laflen and Colvin, 1981; Wendt and Burwell, 1985; Kenimer et al., 1986).
The combination of inadequate amounts of surface residue and the fact that a major portion of the farming in New England occurs on erosion-prone land makes silage corn production a significant potential source of nonpoint pollution. A secondary objective of this study was to assess the hydrology component of the Chemicals, Runoff, and Erosion from Agricultural Management Systems model, more commonly known as the CREAMS computer model (Knisel, 1980).

Numerous studies have reported substantial reductions in sediment loss under conservation tillage (Moldenhauer et al., 1983; Burwell and Kramer, 1983; McDowell and McGregor, 1984; Wendt and Burwell, 1985). The reduction has been mainly attributed to the ability of the surface residue to dissipate rainfall energy and, thereby, reduce soil particle detachment. Wischmeier and Smith (1978) computed that a soil surface covered with residue at rates of 20, 40, and 60 percent will receive 65, 35, and 25 percent, respectively, of the rainfall erosivity striking bare ground. Laflen and Colvin (1981) reported erosion to be an inverse, exponential function of percent residue cover.
The reduced sediment loss observed in conservation tillage systems can also be attributed to the effects of surface residue on runoff velocity. Niebling and Foster (1977) reported that runoff velocity decreased with increasing levels of residue. Partially incorporated corn stalk residue has likewise been reported to retard overland flow (Siemens and Oschwald, 1976; Mannering et al., 1966; Lindstrom et al., 1981; Laflen and Colvin, 1981).
In most of the reported studies, the chisel plow appears more effective than no-till in reducing runoff but less effective than no-till in reducing erosion. Lindstrom and Onstad (1984) attributed this to the higher infiltration rates that can be maintained under surface residue.

The partitioning between the sediment-adsorbed phase and the dissolved phase has been cited as the single most important factor in determining the fate of pesticides and nutrients in the field (Helling, 1970).
Adsorption, or the adhesion of a substance to a soil particle, is often described by the adsorption partition coefficient, Ks (Steenhuis and Walter, 1979). Ks is defined as the concentration of the substance adsorbed to sediment divided by the concentration of the substance in solution. Substances with a high Ks value (>1000), such as organic nitrogen, ammonium nitrogen, solid-phase phosphorus, and paraquat, will move with the soil.
Atrazine, a moderately adsorbed pesticide with a Ks value of about five, will move in solution and adsorb to sediment.
Nitrate nitrogen has a very low Ks value (0.05) and moves primarily in solution.
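The partitioning arithmetic can be illustrated with a short calculation. The Python sketch below assumes only the definition of Ks given above (sediment-phase concentration divided by solution-phase concentration); the sediment concentrations are illustrative values, not measurements from this study.

def adsorbed_fraction(ks_l_per_kg, sediment_g_per_l):
    """Fraction of the total chemical mass carried on sediment.

    mass adsorbed / mass dissolved = Ks * sediment concentration (kg/L),
    so the adsorbed fraction is r / (1 + r).
    """
    r = ks_l_per_kg * (sediment_g_per_l / 1000.0)  # convert g/L to kg/L
    return r / (1.0 + r)

# Atrazine (Ks ~ 5) in runoff carrying 10 g/L of sediment:
print(adsorbed_fraction(5.0, 10.0))     # ~0.05: moves mostly in solution
# A strongly adsorbed substance (Ks > 1000) in the same runoff:
print(adsorbed_fraction(1000.0, 10.0))  # ~0.91: moves mostly with the soil

At typical runoff sediment concentrations, even a Ks of five leaves most of the atrazine mass in solution, which is consistent with the predominantly dissolved-phase atrazine losses discussed later.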
For substances moderately or weakly adsorbed to sediment, the highest concentrations in runoff have been found in events occurring close to the time of application (Hall et al., 1972; Smith et al., 1974). Baker and Johnson (1979) found that alachlor and cyanazine, herbicides transported primarily with water, were present in runoff occurring soon after application, but concentrations declined rapidly in later runoff events.
Studies have demonstrated that chemicals which move adsorbed to sediment may have higher concentrations in the eroded material than in the soil from which it originated. Massey et al. (1952) reported that eroded soil material contained 3.4 times as much available-P as the in situ soil. The authors also found that the enrichment of the eroded soil was inversely proportional to sediment concentration and net sediment loss. This relationship was also observed in a study by Stoltenberg and White (1953): at sediment concentrations of 40,000, 2,700, and 440 milligrams per liter, the ratio of the nitrogen content of the eroded sediment to that of the in situ soil was 1.3, 2.0, and 5.0, respectively. The authors found that, through the selective erosion process, an increasing proportion of finer soil particles and organic matter was present in the runoff as the transport energy decreased. The lighter clay and organic matter particles were not as subject to deposition. Once suspended, these particles, with their adsorbed substances, were able to leave the field more readily than larger, heavier particles. The clay and organic matter particles have higher cation exchange capacities than the coarser particles and, therefore, can adsorb greater amounts of nutrients.

2.) Losses of Nitrogen from Agricultural Land
Researchers have found significant nitrogen losses associated with agricultural activity. Smolen (1981) monitored nutrient runoff from agricultural and non-agricultural watersheds for four years and reported a 1.5- to 2-fold increase in nitrogen concentration attributable to agricultural land use. Timmons et al. (1968) found N losses as high as 14.5 kg/ha per year from corn-cropped plots. In contrast, forested areas have been reported to have nitrogen losses ranging from less than 1 to 3.36 kg/ha (Frink, 1967).
In most cases, nitrogen leaving agricultural fields in surface runoff is in the organic-N form associated with eroded soil (Armstrong et al., 1974). Nitrate-N, the major anionic form of nitrogen, will normally be assimilated by plant roots or leach through the soil profile to the groundwater (Keeney, 1973). Losses of nitrate through overland flow are generally associated with storms occurring immediately after fertilization or with the leaching of nutrients from surface residue (Timmons et al., 1970).

Water Quality Impacts of Nitrogen Losses
Excessive offsite losses of nitrogen from fertilized agricultural fields can have substantial adverse effects on water quality. High nitrate concentrations in drinking water can cause methemoglobinemia ("blue baby syndrome") in infants during the first six months of life (National Research Council, 1978). The drinking water standard for nitrate-N concentration is set at 10 mg/l by the Environmental Protection Agency to safeguard against methemoglobinemia in infants (Safe Drinking Water Act, P.L. 93-523). In brackish and salt water systems, increased nitrogen inputs into surface waters can promote excessive eutrophication (Ryther and Dunstan, 1971; Harlin and Thorne-Miller, 1981). Nitrogen, as ammonia, can also be acutely toxic to fish.
The current water quality standard for unionized NH3, which is the form toxic to fish, is 0.02 mg/l. At common temperatures and near-neutral pH, 2 mg/l of NH4-N results in an NH3 concentration of about 0.02 mg/l; thus the 2 mg/l value for NH4-N is often quoted as a level of concern (U.S. EPA, 1976).
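The 2 mg/l level of concern can be reproduced from the ammonia equilibrium. The Python sketch below uses the widely cited Emerson et al. (1975) pKa-temperature relation, which is an assumption here rather than a formula taken from this study.

def unionized_nh3_mg_per_l(total_ammonia_n_mg_l, ph, temp_c):
    """Unionized NH3-N (mg/l) from total ammonia-N, pH, and temperature."""
    pka = 0.09018 + 2729.92 / (temp_c + 273.15)   # Emerson et al. (1975)
    fraction_nh3 = 1.0 / (1.0 + 10.0 ** (pka - ph))
    return total_ammonia_n_mg_l * fraction_nh3

# 2 mg/l of NH4-N near neutral pH at 25 C gives roughly the 0.02 mg/l standard:
print(round(unionized_nh3_mg_per_l(2.0, 7.25, 25.0), 3))  # ~0.020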

Influence of Tillage on Nitrogen Losses
Research has shown that concentrations of both soluble nitrogen (mg/l) and sediment-adsorbed nitrogen (mg/kg) are generally higher in runoff from conservation tillage systems than in runoff from conventionally plowed systems. However, total nitrogen loss depends primarily on the runoff volume and the sediment load generated from these systems. Romkens et al. (1973) reported a curvilinear relationship between soil loss and the loss of sediment-associated nitrogen. Conventional tillage had the highest loss of sediment-N even though conservation tillage systems, with less soil loss, had relatively higher concentrations of sediment-N. Differences in the nitrogen concentration of runoff sediment between tillage systems are primarily due to selective soil erosion processes. The authors also noted that the proportion of nitrogen that is organic, as well as the total nitrogen content of sediment, increased with the degree of erosion selectivity. Baker and Laflen (1982) observed higher concentrations of nitrate-N in runoff from conservation tillage plots than from conventionally tilled plots. However, conservation tillage plots lost less than one-half of the total loading of nitrate-N that was lost from the conventional plots. The difference in loading was the result of runoff from conventional plots being 3.3 times that of conservation tillage plots. Baker and Laflen (1982) found that increasing residue on the soil surface decreased the volume of runoff and, in turn, decreased nutrient losses by up to 80 percent for the plots with the greatest residue amounts. McDowell and McGregor (1980) reported that both the percentage of the total nitrogen and the concentrations of nitrate-N transported in solution from conservation tillage were significantly greater than in conventional tillage systems.
In conventional-till corn for grain, only 9 percent of the total nitrogen was transported in solution in runoff, compared to 40 percent in the no-till plots. However, conservation tillage systems reduced total nitrogen losses (solution plus sediment) relative to conventional tillage because of the significant reduction in soil loss. Soil loss was reduced by more than 92 percent in the reduced tillage and no-till systems, and total nitrogen loss was reduced by more than 70 percent compared to conventional tillage.
Increased concentrations of nitrate-N in the runoff from conservation tillage systems have been attributed to the combined effects of leaching of nitrogen from the surface residue, decreased fertilizer incorporation, and the enrichment of sediment as a result of selective erosion processes (McDowell and McGregor, 1984; Timmons et al., 1970).

Atrazine Background
Atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) was introduced into the market in 1958 by the Ciba-Geigy Corp. of Ardsley, New York. Atrazine accounts for nearly one half of the herbicides used in corn production and was identified in 1971 as the most heavily used herbicide in the United States (Shoemaker and Harris, 1979).
Atrazine is an effective inhibitor of photosynthetic electron transport in broadleaf and some grassy weeds. Corn has the ability to rapidly metabolize and, therefore, detoxify atrazine (Knisle, 1970). Its high degree of selectivity allows applications to the soil before and after crop emergence and greatly reduces the need for time-consuming tillage operations.

Atrazine Toxicity
Direct toxicity of atrazine to fish is fairly low, with a reported 96-hour LC50 for rainbow trout of 4.5 mg/l (Smith, 1982). Atrazine has been implicated in the widespread decline of submerged vascular plants in the Chesapeake Bay (Wu, 1977). In a microcosm bioassay, the authors examined the response of Vallisneria americana (water celery) to atrazine exposure.

Reported Ks values for atrazine range from 1 to 8, depending on the soil type and the amount of cations present (Cohen et al., 1984). Helling (1970), in his classification system of pesticide mobility in the environment, characterized atrazine as having intermediate mobility.
Atrazine has a solubility of 33 mg/l in water and will exist in both adsorbed and dissolved states in the soil environment (Colbert et al., 1975). Atrazine has been found to have a half-life of less than one month, but it can persist in the soil for up to 18 months (Wauchope, 1978). Persistence tends to be greater at lower soil depths and longer in northern latitudes (Kaufman and Kearney, 1970). Wauchope (1978) stated that atrazine had an apparent "runoff-available" half-life of 7 to 10 days, because the surface-applied herbicide is subject to volatilization, photodegradation, and leaching. Hall et al. (1972) found that at the recommended application rate of 2.24 kg/ha, 60 percent of the applied atrazine was lost from the soil through degradation after 1 month, and 91 percent was lost after 4 months. Sirens et al. (1973) found that about 15 percent of the applied atrazine remained after 2 weeks and less than 10 percent remained after 52 weeks.
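The persistence figures above are commonly summarized with a first-order decay model. The Python sketch below back-calculates an implied half-life from the Hall et al. (1972) one-month figure; treating degradation as strictly first-order is an assumption, and the values are illustrative.

import math

def remaining_fraction(days, half_life_days):
    """Fraction of a surface-applied chemical remaining after `days` of decay."""
    return math.exp(-math.log(2.0) * days / half_life_days)

# Implied field half-life if 40 percent of the atrazine remains at 30 days:
half_life = 30.0 * math.log(2.0) / -math.log(0.40)
print(round(half_life, 1))  # ~22.7 days

# With Wauchope's (1978) 7 to 10 day "runoff-available" half-life, little
# atrazine remains in the mixing zone one month after application:
print(round(remaining_fraction(30.0, 10.0), 3))  # ~0.125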

A number of studies have shown that concentrations and losses of surface-applied atrazine are highest when intense rainfall occurs immediately after herbicide application. In a rainfall simulation study in which rainfall was applied one hour after atrazine application, runoff samples collected at the onset of runoff had atrazine concentrations as high as 10.34 ppm, while samples collected at the end of the storm had concentrations as low as 0.34 ppm (White et al., 1967). Concentrations were 50 percent lower when the storm occurred 96 hours after application. Bailey et al. (1974) measured atrazine losses in runoff from bare soil plots during a 100-year storm occurring one hour after application at two application rates. Losses from plots treated with 3.36 kg/ha of atrazine were 10 to 13 percent of that applied, and losses from plots treated with 1.68 kg/ha were 6.5 to 12.5 percent of that applied.
Under natural rainfall conditions, Hall (1974) observed runoff losses of 5 percent of that applied when the first runoff event occurred 6 days after an atrazine application of 2.2 kg/ha. Atrazine concentrations were highest in the first two runoff events following application, and 87 to 93 percent of the total atrazine loss occurred in the first five runoff events in the month following application. Baker and Johnson (1979) found that seasonal losses were less than five percent in years when the first runoff-producing storms occurred two weeks or more after application. However, in one year in which a storm took place 24 hours after application, the losses were 16 percent of the applied atrazine.
Although the results of these studies vary in the quantities of atrazine lost in runoff, they all demonstrate that atrazine concentrations are much higher in the sediment portion of the runoff. Baker and Johnson (1979) found atrazine concentrations in eroded sediment five times as high as those in the runoff water, yet more atrazine was lost with the water portion of the runoff because the mass of water lost far exceeded the mass of sediment. Hall (1974) reported atrazine concentrations about 2.5 times higher on sediment than in water, but nearly 90 percent of the atrazine loss was in the dissolved phase. Bailey et al. (1974) also found that 70 to 80 percent of the atrazine loss was in the dissolved phase.

Influence of Tillage on Atrazine Losses
Researchers who have investigated the effects of conservation tillage on atrazine losses in runoff have reported contradictory results. Concentrations of atrazine in the runoff and in the eroded sediment are generally higher in conservation tillage systems than in conventional tillage systems. However, the total waterborne loss of atrazine depends on the effect of conservation tillage on total runoff and soil loss. Smith et al. (1974) reported substantially greater losses of atrazine from no-till plots than from conventional plots. For one storm, the maximum atrazine concentration in runoff was 0.87 ppm and the total loss was 28.9 g/ha from conventional plots, while no-till plots had a maximum concentration of 1.7 ppm and a total loss of 108.8 g/ha.

The sequence of agronomic activity is presented in Table 1. A winter cover crop of rye (Secale cereale L.) was planted after harvest, and percent residue cover was estimated using an established transect procedure (1977). Estimates of the total residue cover (kg/ha) were obtained by removing all the visible residue contained in 0.01 square meter. Three samples were taken from each plot and were oven-dried (60° C) for a week before weighing.

Monitoring Approach
The surface water collection system for each plot was designed as specified by the Field Manual for Research in Agricultural Hydrology (Brakensiek et al., 1979). The collection system was designed for the maximum runoff rate from a 10-year, 5-minute rainstorm (41 L/sec) with no infiltration (Hershfield, 1975). The storage capacity was designed to accommodate the runoff volume predicted by the SCS curve number method for a 5-year, 24-hour storm (Mockus, 1964). Runoff and soil loss were monitored for two growing seasons (1985 and 1986), from the time of pesticide application to several weeks after the establishment of the winter cover (June through November). The depth of water in the collection tank was measured after each runoff event for runoff volume calculations. Runoff generated from precipitation falling directly on the concrete was calculated and subtracted from the total volume.
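The storage sizing calculation can be sketched as follows. The design rainfall depth and curve number below are placeholders, since the exact design values are not reported here; only the plot dimensions come from the study.

def scs_runoff_depth_mm(p_mm, curve_number):
    """Runoff depth Q from the SCS curve number method with Ia = 0.2S."""
    s = 25.4 * (1000.0 / curve_number - 10.0)  # retention parameter, mm
    ia = 0.2 * s                               # initial abstraction, mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

PLOT_AREA_M2 = 3.4 * 22.1   # plot dimensions from the study
P_DESIGN_MM = 90.0          # assumed 5-year, 24-hour storm depth
CN = 85.0                   # assumed curve number for tilled cropland

q_mm = scs_runoff_depth_mm(P_DESIGN_MM, CN)
tank_volume_l = q_mm * PLOT_AREA_M2  # mm of depth over m^2 equals litres
print(round(q_mm, 1), "mm runoff ->", round(tank_volume_l), "L of storage")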
A two-liter aliquot sample was obtained from each collection tank with a manual pump after the sediment was thoroughly suspended by mixing.

A nonparametric statistical analysis was conducted using the Friedman two-way analysis of variance test, in which runoff depth for each treatment was ranked within each event (Daniel, 1978). This method accommodates the large variability that occurs between events due to differences in rainfall volume, rainfall intensity, canopy cover, and antecedent soil moisture conditions. The test indicates that chisel without the rye cover had significantly greater (p = 0.05) runoff than all other treatments except conventional without rye cover (Table 3).
Conventional tillage without rye cover also had significantly more runoff than chisel with rye cover and no-till with rye cover. Chisel with rye cover had significantly less runoff than all other treatments except no-till with rye cover.
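The event-ranked procedure can be sketched with standard statistical software. The fragment below applies SciPy's Friedman test to a made-up event-by-treatment matrix; the runoff depths are random placeholders, not the Table 3 data.

import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
runoff_mm = rng.gamma(shape=2.0, scale=2.0, size=(18, 6))  # 18 events x 6 treatments

# friedmanchisquare ranks within each event (block) internally, which is what
# accommodates the large event-to-event variability in rainfall and antecedent
# moisture noted above.
stat, p_value = friedmanchisquare(*runoff_mm.T)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.3f}")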

In the chisel plots without rye cover, surface sealing in the chisel furrows was observed after the first few rain events. These sealed furrows may have produced channelized flow downslope, which might account for the large variation in runoff totals between chisel plots with rye cover and chisel plots without rye cover. The surface sealing was not as evident on the chisel plots with rye cover. Lindstrom and Onstad (1984) reported that higher infiltration rates can be maintained on fields with surface residue than on fields with bare soil.
Percent cover after tillage was 77 and 80 percent in the chisel and no-till systems with rye cover, respectively, and only 9 percent in the conventional system (Table 4).
Based on the SCSA definition of conservation tillage (>30 percent surface residue after planting), only the chisel and no-till with rye cover treatments can be classified as conservation tillage systems (SCSA, 1982). Residue from the winter cover crop reduced total runoff by 60, 16, and 11 percent in the chisel, no-till, and conventional tillage treatments, respectively. Averaged over all three tillage treatments, residue from the winter cover crop reduced runoff by 29 percent. Wendt and Burwell (1985) reported that crop residue from a winter cover crop reduced annual runoff volumes in no-till silage corn by approximately 50 percent. A frequency analysis was employed to determine the frequency at which a specified runoff depth was exceeded within each treatment (Figure 2), as sketched below.
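A minimal version of that frequency analysis, assuming the common Weibull plotting position for exceedance probability (the plotting position actually used is not stated here), might look like this:

import numpy as np

def exceedance_curve(depths_mm):
    """Return (sorted depths, probability each depth is equalled or exceeded)."""
    d = np.sort(np.asarray(depths_mm))[::-1]          # largest depth first
    prob = np.arange(1, d.size + 1) / (d.size + 1.0)  # Weibull plotting position
    return d, prob

depths, prob = exceedance_curve([0.2, 1.5, 3.8, 0.0, 7.1, 0.6, 12.4, 2.2])
for q, p in zip(depths, prob):
    print(f"{q:5.1f} mm exceeded with probability {p:.2f}")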

Soil Loss
Average soil loss totals for all treatments are summarized by event in Table 5. The Friedman nonparametric statistical test indicates that chisel without rye cover had significantly more soil loss (p = 0.05) than all treatments that had rye cover. Chisel with rye cover had significantly less soil loss than all treatments without rye cover.
The fact that the treatments ranked in almost the same order for both runoff and soil loss suggests that the two processes are closely related. Quansah (1983) found that higher runoff velocities can increase soil particle detachment and that larger runoff volumes are capable of transporting more sediment. Since conventional tillage with rye cover had more runoff but less soil loss than no-till without rye cover, it appears that rye cover has more of an effect on soil loss than on overland flow. In a 6-year study, Wendt and Burwell (1985) also found that without a winter cover crop, no-till plots in corn-for-silage consistently had higher annual soil loss totals than no-till and conventional plots with the winter cover.
The highest concentrations were observed in the runoff from chisel and conventional tillage plots in storms occurring within several weeks after planting in the 1985 season. Several intense storms, including the largest event of the season (Aug. 26, 1985), occurred during cropstage P3, when the crop canopy is fully developed (Wischmeier and Smith, 1978). Sediment concentrations during these events were generally lower than in less intense rain events that occurred during other crop stages. This demonstrates the effectiveness of the crop canopy in reducing erosive storm energy.

Event-based Analysis
The majority of the soil loss and overland flow during both seasons was associated with the eight excessive rate storms that occurred over the two seasons of study. An excessive rate storm is defined by the National Weather Service as a storm that produces a depth (mm) greater than or equal to 5 + 0.25t, where t is the storm duration (minutes). Although these storms represented only 17 percent of the total precipitation during the study, they generated 57 to 67 percent of the total runoff and 70 to 77 percent of the total soil loss from all six treatments. Greer (1971) found during a 6-year study that excessive rate storms generated 77 percent of the soil loss.
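The excessive rate criterion is simple to apply. A minimal sketch, using the definition quoted above with illustrative storm values:

def is_excessive_rate(depth_mm, duration_min):
    """National Weather Service criterion: depth (mm) >= 5 + 0.25t."""
    return depth_mm >= 5.0 + 0.25 * duration_min

print(is_excessive_rate(20.0, 30.0))   # True: threshold is 12.5 mm
print(is_excessive_rate(20.0, 120.0))  # False: threshold is 35.0 mm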
Rye cover appears to be more effective in reducing runoff volume on the smaller events than on the larger events.
In runoff events that had less than 25.4 mm of rainfall, conventional, no-till, and chisel plots with rye cover had 36, 61, and 84 percent less runoff, respectively, than the same tillage treatments without rye cover. In comparison to the total runoff generated in the 18 events, runoff from the conventional, no-till, and chisel plots with rye cover was 11, 16, and 60 percent less, respectively, than from the same tillage treatments without rye cover. Rye cover also had a major influence on the occurrence of runoff. During 7 of the 18 events, plots with less than 20 percent surface residue cover generated runoff while the plots within the same tillage treatment that had substantial rye cover had no runoff (Table 3). This effect on the occurrence of runoff was even more apparent during the early season events that occurred in the seedbed cropstage of 1985. The plots with rye cover generated no runoff in the first three storms, except for the conventional plots with rye cover, which had runoff in the third storm. Kramer (1984) also noted that for small runoff events occurring in the seedbed cropstage, runoff was less frequent in conservation tillage than in conventional tillage.
Reducing the occurrence of runoff in the early season events has important ramifications for the offsite losses of soluble agrichemicals and nutrients. Higher losses of chemicals and nutrients can be expected in the early season events, when fertilizers and pesticides have been recently applied. Hall (1974) found that 87 to 93 percent of the total seasonal loss of atrazine occurred in the first five runoff events following application.

Soil loss generated in the early season events of 1985 was also considerable even though runoff volumes were relatively small (Tables 2 and 4). In the chisel, no-till, and conventional treatments without rye cover, 9, 17.5, and 40.5 percent of the total seasonal soil loss occurred in the first event (June 24). In the same event, less than 2 percent of the total seasonal runoff occurred in all three treatments. When the soil loss from the two subsequent storms is included, the percentages of the total seasonal soil loss increase to 23.5, 32, and 58 percent for the chisel, no-till, and conventional treatments without rye cover, respectively.

The large sediment movement observed on plots without cover in early storms is indicative of the vulnerability of the freshly tilled, exposed soil to erosion.

Total Kjeldahl nitrogen (TKN) losses in 1986 were highest in the tillage treatments without rye cover (Table 7). Chisel, no-till, and conventional tillage without rye cover had 90.5, 58, and 27 percent more TKN loss, respectively, than the same tillage treatments with rye cover. These differences in TKN losses closely parallel the differences in soil loss within the same treatments in 1986: total soil loss in chisel, no-till, and conventional tillage without rye cover was 82, 63, and 26 percent higher, respectively, than in the same tillage treatments with rye cover.
Total Kjeldahl nitrogen, which is a measure of the organic and ammonium nitrogen species, adsorbs readily to sediment particles and remains in the "runoff-mixing" zone of the soil profile for extended periods following application.
On an event basis, the greatest losses of TKN were associated with the events that had the largest sediment movement. On all twelve plots, 66 to 96 percent of the seasonal TKN loss occurred on July 13 and November 21.
These two events also generated 60 to 94 percent of the total soil loss that occurred in 1986. Other studies have also documented that high percentages of the total nitrogen removed are associated with large sediment movement (Romkens et al., 1973;McDowell and McGregor, 1980).
Total nitrate loading through runoff was particularly low for all treatments, accounting for only 6 to 11 percent of the total nitrogen loss. As with TKN, higher nitrate losses were apparent in tillage treatments without rye cover. Chisel and no-till treatments with rye cover had 80 and 40 percent less nitrate loss than conventional tillage with rye cover. Baker and Laflen (1982) reported an 82 percent reduction in nitrate loading on plots with 1,500 kg/ha of residue compared to plots with no residue, even though flow-weighted means were similar. In an extensive review of the literature, Baker and Laflen (1982) stated that seasonal losses of nitrate-N in overland flow generally range between 1.0 and 2.7 kg/ha; losses through leaching, however, can be as high as 20.0 kg/ha with 10.0 cm of percolation. The observed flow-weighted mean concentrations of nitrate during the 1986 season are extremely low (Table 8). Baker and Laflen (1982) reported flow-weighted mean concentrations of NO3-N ranging from 3.9 to 4.7 ppm in runoff from field plots fertilized with 143 kg/ha of nitrogen. Fertilizer nitrogen applied in the ammonium form must be nitrified before it can appear in runoff as nitrate, and even under optimum conditions these transformations would be expected to take at least 5 to 7 days (Keeney, 1973).

Following the first runoff event, a considerable amount (27.5 mm) of non-runoff-producing rainfall occurred before the next runoff event. Kanwar et al. (1985) showed that nitrate movement is primarily downward into the soil profile and that between 40 and 90 percent of the nitrate present after surface application may leach below 30 cm in the first 12.7 cm of rainfall.

Atrazine Concentrations in Runoff
Atrazine concentrations in the runoff water ranged from trace amounts (<1.5 ppb) to 275.6 ppb (Table 10). Ninety-five percent of all the runoff samples analyzed had atrazine concentrations below 30 ppb. Treatment effects on atrazine concentrations were not obvious: flow-weighted mean concentrations in runoff were not significantly different between treatments according to the Friedman nonparametric statistical test. Triplett et al. (1978) also found that concentrations of atrazine in runoff water were not influenced by tillage method.
The highest concentrations of atrazine in runoff were observed in the first event of the 1985 season, 12 days after atrazine application. Concentrations in this event exceeded 180 ppb, the level considered to be the "no adverse effect" concentration by the National Research Institute (1985). Concentrations never exceeded 25 ppb in any event that occurred more than 6 weeks after atrazine application.
The general decline in measured runoff concentrations with time after application coincides with the results of the regression analysis performed by Triplett et al. (1978), in which days after application versus concentration of atrazine in runoff showed a significant negative correlation (r = -0.90).

Atrazine Loss Through Runoff
Seasonal atrazine losses for both growing seasons were very low, with less than 0.01 percent of the applied atrazine lost through runoff (Table 11). Substantial portions of the applied atrazine are likely to have leached out of the "runoff mixing zone" with non-runoff-producing rainfall that occurred before the first runoff event in both seasons. Following atrazine application in 1985 and 1986, 30 and 150 mm of non-runoff-producing rainfall occurred, respectively, before the first runoff event.
Other studies have demonstrated that significant amounts of atrazine can be lost through leaching (Wehtje et al., 1984; Wu, 1980). Hall et al. (1972) found that 60 percent of the applied atrazine was lost through degradation in the soil one month after application.

The timing of the first runoff-producing storm relative to application is very important in determining atrazine runoff losses. Baker and Johnson (1979) observed losses of 10 percent of the applied atrazine when an intense rainstorm occurred 24 hours after atrazine application. In contrast, when the first runoff-producing storm did not occur until at least 2 weeks after application, average total growing season losses were less than 2 percent.
Conservation tillage systems did influence seasonal atrazine losses considerably by reducing runoff volumes and the occurrence of runoff in early season events. Average seasonal losses of atrazine from chisel and no-till plots with rye cover were 71 and 76 percent lower, respectively, than from the same treatments without rye cover. Kenimer et al. (1986) reported that conservation tillage systems reduced total atrazine loss by 92 percent compared to conventional tillage because runoff volumes were considerably lower in the conservation tillage systems.
Based on the results of this study, it would be difficult to make long-term predictions of the amount of atrazine that could potentially be lost through overland flow from corn-for-silage fields. During the two growing seasons of this study, a major runoff event did not occur until the month of July, which resulted in low seasonal losses of atrazine. A computer model that provides continuous simulation of runoff, erosion, and chemical transport processes using long-term weather data would greatly improve predictions of potential losses of atrazine from silage corn fields.

PERFORMANCE ASSESSMENT OF THE CREAMS HYDROLOGY COMPONENT
6.1 Modeling Approach

The CREAMS hydrology component incorporates two options for runoff prediction (Knisel, 1980). Model performance was evaluated with statistical measures drawn from the literature (Leggett and Williams, 1981; Thomann, 1982; Reckhow and Chapra, 1983), including the root mean square (RMS) error, the Reliability Index (K) developed by Leggett and Williams (1981), and a paired-comparison t-test, expressed as

t = d / (s_d / √n)

where d is the mean difference between observed and predicted values, s_d is the standard deviation of the differences, and n is the number of paired events.
If the t value generated by the equation is greater than the test statistic at the 0.05 level, the null hypothesis H0: d = 0 is rejected in favor of H1: d > 0.
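These evaluation statistics are straightforward to compute. The sketch below applies the RMS error, a Reliability Index in the form commonly attributed to Leggett and Williams (1981), and the paired t statistic to placeholder observed and predicted runoff values; the data are illustrative only.

import numpy as np

obs = np.array([1.2, 0.0, 5.4, 12.1, 0.8, 3.3])   # observed runoff, mm
pred = np.array([0.9, 0.4, 4.1, 9.8, 1.5, 2.6])   # predicted runoff, mm

rms_error = np.sqrt(np.mean((obs - pred) ** 2))

# Reliability index K: perfect agreement gives K = 1. Zeros are excluded
# because the index is defined on ratios of positive values.
mask = (obs > 0) & (pred > 0)
k = np.exp(np.sqrt(np.mean(np.log(obs[mask] / pred[mask]) ** 2)))

# Paired-comparison t statistic, t = d / (s_d / sqrt(n)):
d = obs - pred
t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

print(f"RMS error = {rms_error:.2f} mm, K = {k:.2f}, t = {t:.2f}")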

Hydrology Component Description
The curve number method (Option 1), as modified by Williams and LaSeur (1976), relates direct runoff to daily rainfall as a function of a curve number. The curve number is a function of soil type, residue cover, management practice, and antecedent rainfall. Daily runoff, Q (mm), is related to daily rainfall, P (mm), a retention parameter, S (mm), and an initial abstraction parameter, Ia (mm), as

Q = (P - Ia)^2 / (P - Ia + S)

The retention parameter, S, is scaled by the soil-water status of the root zone,

S = Smax (UL - SM) / UL

where UL is the upper limit of soil water storage in the root zone (mm) and SM is the soil-water content in the root zone (mm). The maximum storage, Smax (mm), is estimated from the curve number for moisture condition I.

The breakpoint intensity method (Option 2) is based on the Green and Ampt infiltration equation (Green and Ampt, 1911; Smith and Parlange, 1978), with A = RC·tp/2 and D = θs - θi, where θs is the water content at saturation, θi is the initial water content, and RC is the saturated infiltration rate (mm/hr). The infiltration-based model is highly sensitive to three parameters: GA, RC, and DS (Rudra et al., 1985). Runoff is initiated when the precipitation rate for a given interval exceeds the infiltration rate.
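A minimal sketch of the Option 1 calculation, assuming the moisture-scaled retention reconstructed above and the standard Ia = 0.2S abstraction (the parameter values are illustrative, not calibrated values from this study):

def daily_runoff_mm(p_mm, cn1, sm_mm, ul_mm):
    """Daily runoff Q = (P - Ia)^2 / (P - Ia + S), with the retention S
    depleted as the root zone wets up: S = Smax * (UL - SM) / UL."""
    s_max = 25.4 * (1000.0 / cn1 - 10.0)      # Smax from condition I curve number
    s = s_max * (ul_mm - sm_mm) / ul_mm
    ia = 0.2 * s                              # assumed initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Wetter antecedent conditions (larger SM) shrink S and raise predicted runoff:
print(round(daily_runoff_mm(40.0, cn1=65.0, sm_mm=60.0, ul_mm=150.0), 1))   # ~5.3
print(round(daily_runoff_mm(40.0, cn1=65.0, sm_mm=130.0, ul_mm=150.0), 1))  # ~24.2

The two calls illustrate why antecedent soil moisture matters to the curve number option: the same 40 mm of rain produces several times more predicted runoff under wet antecedent conditions.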
Several rain events in the conventional tillage treatment had predicted runoff when no runoff was observed (Table 15). The overpredictions for these events most likely occurred because the infiltration rate in the field was much higher following the disturbance of the soil surface by tillage.
Runoff was also overpredicted for several early season events in the no-till treatment (Table 16). In this treatment as well, surface sealing is likely to occur after several rain events. Changes in the infiltration rate that occur in the field as a result of surface sealing cannot be simulated by the model. To achieve close agreement with observed runoff values for the majority of the storms, which occurred later in the season, best-fit parameter values representing lower infiltration rates were required. However, the change in values was not as dramatic as in the conventional tillage (Table 12).
The breakpoint intensity method was much better than the curve number method at predicting runoff for small, intense storms in the conventional tillage treatment (Figure 4). With the curve number method there were twice as many days on which no runoff was predicted for an observed runoff event as with the breakpoint intensity method. The curve number method depends more on the amount of precipitation than on its intensity. As a result, the method misses runoff in storms with small rainfall amounts but intensities high enough to generate runoff. Failure to predict runoff in these events, especially if they occur close to the time of fertilizer or pesticide application, could lead to serious underestimates of edge-of-field losses of agrichemicals. As was seen in this study, even small runoff events (June 24, 1985) can transport large amounts of soluble pesticides if they occur close to the time of application. Hall (1974) stated that 87 to 93 percent of the total pesticide loss can occur in the first five storms after application.

Comparing the performance of the two runoff prediction methods for excessive rate storms, the breakpoint intensity method predicted runoff better than the curve number method. For the large excessive rate storms (7/13/86 and 8/8/86), major differences in predicted runoff volumes were observed between the breakpoint intensity method and the curve number method in the conventional tillage treatment (Table 13). Runoff predictions for the conventional treatment using the curve number method were less than observed for 7 of the 8 excessive rate storms. For four of the eight events no runoff was predicted at all, and for two of the events less than 1.0 mm of runoff was predicted. Accurate runoff predictions for excessive rate storms are critical because most of the soil movement and sediment-adsorbed nitrogen losses are associated with these storms.
In most cases, the hydrology component predicted runoff reasonably close to observed values. In the conventional tillage system, the breakpoint intensity method performed well for small, large, and excessive rate storms. However, the RMS error and the Reliability Index were relatively high, indicating a considerable amount of variability in the predictions. The linear regression of the curve number predictions on the observed values showed an acceptable correlation, but the problems associated with the excessive rate storms severely limit the utility of this method for estimating overall edge-of-field losses.
For the no-till treatments, the curve number method predicted runoff reasonably well. The breakpoint intensity method grossly underpredicted runoff for the two largest storms, and its statistical test results were the worst of all four simulations. The soil property dynamics of the no-till system are the least understood (Blevins, 1985). Further adjustments to soil property parameters beyond the recommended range of values would be required to greatly improve the runoff predictions for this treatment.

7. CONCLUSIONS

1) Surface rye crop residue can significantly reduce runoff and erosion losses from silage corn fields.

2) Reduced tillage without a winter cover crop does not significantly improve overland runoff and erosion control.
2) Reduced tillage without a winter cover crop does not significantly improve overland runoff and erosion control.
7) Eliminate chisel and no-till plots without rye cover from the field study so that replicates of the remaining treatments can be increased.