Using Drainage Lysimeters to Evaluate Irrigation and Nitrogen Interactions in Cotton Production

E.C. Martin, Maricopa Agricultural Center
E. J. Pegelow, Maricopa Agricultural Center
J. Watson, Maricopa Agricultural Center

Abstract

This is a continuing report on the effects of over-irrigation in cotton production. Started in the spring of 1995, this study uses drainage lysimeters to study the impact of over-irrigation on nitrate leaching losses. Yield and other growth components are also monitored to see what effect, if any, the over-irrigation has. The study was initiated at the Maricopa Agricultural Center, Maricopa, Arizona. The drainage lysimeters used are large, open-topped steel boxes filled with soil and placed underground in the experimental field. Crops are grown directly above the lysimeters, and the water that moves through the soil profile is collected at the bottom of each lysimeter and analyzed. In this study, two lysimeters were installed. The lysimeters were 80" wide (two row widths), five feet long, and six feet deep. They were placed 18 inches below the soil surface and filled with soil so as to best represent the soil in its natural condition. The data presented in this paper are from three years of an ongoing experiment. Throughout the growing season, water samples were taken from the lysimeters in the field. Nitrogen applications were made according to field conditions and weekly petiole sampling. Irrigations were scheduled according to field conditions and the AZSCHED irrigation scheduling program. Treatment one was irrigated according to the schedule recommended by AZSCHED; the amount applied was equal to the total crop water use since the last irrigation. In treatment two, the timing was the same as in treatment one, but the amount of irrigation water applied was 1.5 times that of treatment one. Yield samples taken at the end of each season showed no significant differences between treatments, with yields averaging about 1100 lb./acre of lint in 1995, 940 lb./acre of lint in 1996 and 1300 lb./acre in 1997. Cumulative drainage over the three years was 8 inches in lysimeter one and 28 inches in lysimeter two. Nitrate losses for the three years totaled 72.5 lb. N/acre for lysimeter one and 126 lb. N/acre for lysimeter two.

Introduction

Although many cotton growers are aware of the relationship between irrigation water applied and yield, very few know how their irrigation management affects nitrate losses out of the rootzone. In most cases, growers know that too little water can reduce crop yields, while too much water can cause excessive vegetative growth and a reduction in yield. However, little is known about the fate of nitrogen when excess irrigation causes excessive drainage. This paper discusses an ongoing project to study the interaction between irrigation strategy and the loss of nitrate below the rootzone in cotton production.

Materials and Methods

In the spring of 1994, two large, stainless steel drainage lysimeters were constructed and placed into the ground at the Maricopa Agricultural Center, Maricopa, AZ. The drainage lysimeters were large steel boxes, open at the top. The installation was similar to that described by Martin et al. (1994). The lysimeters used in this study were 80" wide (two row widths), five feet long and six feet deep. At the site where the lysimeters were to be placed, the soil was removed, layer by layer, and separated into individual piles. Once all of the soil was removed, the lysimeters were set in place, approximately 18 inches below the soil surface, and filled with soil, again layer by layer, so as to best represent the soil in its natural condition.

The lysimeters were placed approximately 75 ft. from the head end of the field. A collection bucket was placed in each lysimeter, and tubing was connected so that a pump could be used to drain the lysimeter when the collection bucket became full. Sampling was done weekly, though a sample was not always present because drainage had not always occurred. The leachate volume was measured, and samples were taken and analyzed for nitrate-N content.

Irrigation timing and amount of water applied were determined using neutron probe measurements, field observations and a computerized irrigation scheduling program called AZSCHED (Fox et al., 1992). The maximum allowable deficit (MAD) of soil water in the rootzone was set to 50%. Once the 50% MAD target was reached, the amount of water applied was the amount needed to refill the soil profile in the rootzone to 100%. Thus, at the time of irrigation, the amount of water applied was equal to the total crop water use (ETc) since the last irrigation, plus any system inefficiency. In this study, Treatment one was irrigated at a level of 1.0 * ETc. Treatment two was over-irrigated, and the amount applied was 1.5 * ETc. Each treatment was replicated four times, with one plot in each treatment containing a drainage lysimeter. Nitrogen applications were made based on University of Arizona recommendations using preseason soil sampling and weekly in-season petiole sampling (Doerge and Des Rosiers, 1992). The plots were dry planted and watered up (April 10, 1995; April 19, 1996; and April 1, 1997).
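The scheduling rule described above can be summarized with a short sketch. This is a simplification for illustration only, not the AZSCHED program itself; the rootzone available water capacity and the daily ETc values used here are hypothetical.

    # Simplified sketch of the MAD-based irrigation scheduling rule described
    # above. NOT the AZSCHED program; the rootzone available water capacity
    # (awc_in) and the daily crop water use values are hypothetical.

    def schedule_irrigations(daily_etc_in, awc_in=6.0, mad=0.50, factor=1.0):
        """Return a list of (day, inches_applied) irrigation events.

        An irrigation is triggered when cumulative ETc since the last
        irrigation depletes the MAD fraction (here 50%) of the available
        water; the amount applied refills the profile (factor = 1.0 for
        treatment one, 1.5 for the over-irrigated treatment two).
        """
        events = []
        depletion = 0.0
        for day, etc in enumerate(daily_etc_in, start=1):
            depletion += etc
            if depletion >= mad * awc_in:      # 50% MAD target reached
                events.append((day, factor * depletion))
                depletion = 0.0                # profile refilled to 100%
        return events

    # Example: a constant 0.25 inches/day of crop water use for 30 days
    print(schedule_irrigations([0.25] * 30, factor=1.5))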

Results and Discussion

Irrigation

The total amount of water applied to each treatment for 1995 is shown in Table 1. Water applied to each treatment remained the same until layby (July 19). Before this time, approximately four inches of water was applied to each treatment whenever irrigation was called for. This was done because four inches was the minimum amount of water that could be applied and still effectively cover the entire plot. In many cases, the target amount was less than four inches, but four inches was still applied. In 1996 (Table 2), less water was applied. Again, early season irrigations remained the same for the two treatments until the end of June, when different amounts could be applied without affecting irrigation efficiency. In 1997 (Table 3), slightly less water was applied, about the same amount as in 1996. Again, as in the other years, early irrigations were the same across both treatments until layby.

Nitrogen Applications

Nitrogen applications were made based on preseason soil sampling and in-season petiole sampling. In 1995, preseason soil tests showed a deficient level of soil nitrogen, and 40 lb./acre of N were applied. Another application of 50 lb./acre of N was made to the field on June 18, and a final application of 50 lb./acre of N was made July 18. All of the plots showed the same relative petiole concentrations, and there were little or no differences between treatments. In 1996, a preseason soil test showed no N deficit. Nitrogen was applied on May 23 at a rate of 60 lb. N/acre. On June 23, another application of 75 lb./acre of N was made. A final application of 30 lb. N/acre was made on July 20. In 1997, nitrogen was applied three times during the growing season. Based on petiole and plant mapping information, 47 lb. N/acre was applied on May 14. This was followed by two more applications of approximately 47 lb. N/acre each, on June 25 and July 9.

Yield

Yield data for 1995 were collected on November 22, 1995 (Table 1). There was no significant difference in yield between the treatments; although treatment two yielded slightly more, the difference was not significant. In 1996, harvest was on November 21, 1996 (Table 2). Again, there were no significant differences in yield between the treatments. In 1997 (Table 3), yields were higher than in the previous two years. Once again, treatment two had a slightly higher yield, but the difference was not significant.

Water Drainage

The lysimeters began to drain almost immediately after the first irrigation in 1995. The seasonal drainage data are shown in Table 1 and presented graphically in Fig. 1. As seen in Fig. 1, the lysimeters drained at approximately the same rate until about July 19 (layby). This makes sense, since the amount of water applied prior to July 19 was the same for both treatments. The graph also shows that lysimeter one (treatment one) had almost no drainage after layby. This was because, after layby, the target amount could be applied and treatment one did not receive any excess irrigation water. During the 1996 season (Table 2, Fig. 1), there was no drainage from either lysimeter for the first part of the season. This was primarily due to the low soil water content at the beginning of the season and the lag time associated with drainage lysimeters. Once drainage did begin, around the first of July, the lysimeters drained according to the irrigation application amounts, with lysimeter two draining more than lysimeter one. In 1997 (Table 3, Fig. 1), there was an increase in drainage. This was due in part to the irrigation of the winter rye crop, which left the soil profile quite full at the beginning of the cotton season. Early season rains also caused much of the drainage.

Nitrate Losses

The total amount of nitrogen recovered in the drainage water for 1995 is shown in Table 1 and Fig. 2. The nitrate-N losses follow the drainage water closely, as would be expected, since nitrates move readily with water. As with the drainage water, lysimeter two lost more nitrate-N (33 lb./acre) than lysimeter one (20 lb./acre). In 1996, this trend continued, but at a somewhat lower rate (Table 2, Fig. 2). Lysimeter two recorded 24 lb. N/acre leached and lysimeter one recorded almost 14 lb. N/acre leached. The differences in the leaching amounts were very similar to those measured in 1995. In 1997 (Table 3, Fig. 2), there was a large increase in both drainage and nitrate losses. This increase could have been due to the rains, or to the fact that rye planted as a winter cover crop may have kept nitrogen that would normally be lost over the winter tied up in organic matter, allowing a larger flush of nitrogen early in the 1997 season. Rains that occurred early in the season may also have caused additional nitrate leaching.
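Although the report does not show the calculation, a nitrate-N load of this kind can be estimated from the measured drainage depth and the nitrate-N concentration of the leachate. The sketch below assumes that approach, using the standard conversion that one acre-inch of water is roughly 102,790 liters; the event values are hypothetical examples, not measurements from this study.

    # Sketch of converting drainage depth (inches) and leachate nitrate-N
    # concentration (mg/L) into a per-acre nitrogen load. The drainage
    # depths and concentrations below are hypothetical, not data from
    # this study.

    LITERS_PER_ACRE_INCH = 102_790.0   # 1 acre-inch of water, in liters
    MG_PER_LB = 453_592.0              # milligrams per pound

    def nitrate_load_lb_per_acre(drainage_inches, nitrate_mg_per_l):
        """Nitrate-N load (lb N/acre) for a single drainage event."""
        liters_per_acre = drainage_inches * LITERS_PER_ACRE_INCH
        return liters_per_acre * nitrate_mg_per_l / MG_PER_LB

    # Hypothetical season of drainage events: (inches drained, mg/L nitrate-N)
    events = [(1.2, 15.0), (0.8, 22.0), (2.0, 10.0)]
    total = sum(nitrate_load_lb_per_acre(d, c) for d, c in events)
    print(f"Seasonal nitrate-N loss: {total:.1f} lb N/acre")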

Summary

The lysimeter data gathered thus far help to show that even with close scrutiny of irrigation water, some nitrate will be lost. These losses can be controlled, however; after three years, lysimeter one has lost an average of about 25 lb. N/acre per year (72.5 lb. N/acre over three years; Table 4). If irrigation water is not carefully managed, over-irrigation can easily translate into nitrate losses. Over the past three years, lysimeter two has lost an average of 42 lb. N/acre per year (126 lb. N/acre total).

References

  1. Doerge, T.A. and S. Des Rosiers. 1992. PLANTEST, Version 1. Cooperative Extension, University of Arizona, Tucson, AZ.
  2. Fox, F.A., Jr., T.F. Scherer, D.C. Slack and L.J. Clark. 1992. Arizona Irrigation Scheduling (AZSCHED Version 1.01): Users Manual. Cooperative Extension, University of Arizona, Tucson, AZ. Publication no. 191049.
  3. Martin, E.C., T.L. Loudon, J.T. Ritchie and A. Werner. 1994. Use of Drainage Lysimeters to Evaluate Nitrogen and Irrigation Management Strategies to Minimize Nitrate Leaching in Maize Production. Transactions of the ASAE 37(1):79-83.

This is a part of publication AZ1006: "Cotton: A College of Agriculture Report," 1998, College of Agriculture, The University of Arizona, Tucson, Arizona, 85721. Any products, services, or organizations that are mentioned, shown, or indirectly implied in this publication do not imply endorsement by The University of Arizona. The University is an Equal Opportunity/Affirmative Action Employer.