
Publication Date
10 November 2020

Interplay of Changing Irrigation Technologies and Water Reuse: Example from the Upper Snake River Basin, Idaho, USA

Subtitle
The volume of water needed to stabilize regional aquifers can exceed the amount that could be “saved” by improving irrigation efficiency.
Science

Access to irrigation water is critical for determining the future resiliency of many agricultural systems (Foley et al., 2011), and the challenges of providing irrigation water require closer scrutiny of its efficient use (Grafton et al., 2018). We quantify the impact of changing irrigation efficiency on basin water stocks, aquifer recharge, downstream discharge, and within-basin water reuse using the test case of the Upper Snake River Basin (USRB) of Idaho, USA, an intensive agricultural setting in the semiarid American West that relies on both surface water and groundwater. Guided by the key constraints of water resource management in the USRB, we framed a series of model parameterizations to test two hypotheses: (1) only a fraction of the incidental return flows eliminated by modernizing technology would be needed, if reintroduced as enhanced aquifer recharge (EAR), to maintain aquifer volume; and (2) increasing EAR would decrease irrigation water reuse in the basin. We performed a suite of simulations with the University of New Hampshire Water Balance Model (WBM), in which irrigation technologies are represented explicitly, using progressively more efficient irrigation technologies. Each simulation was run with and without sufficient EAR to maintain the aquifer head at today's level. We found support for the first hypothesis: beyond the recharge needed to eliminate aquifer drawdown under today's baseline parameterization, the additional EAR required was less than the water saved by increasing irrigation efficiency. Contrary to the second hypothesis, however, irrigation reuse increased under the coupled adaptations, reflecting a shift toward greater groundwater use for irrigation as surface water was diverted away from reservoirs for recharge operations.

Impact

Coupling the two adaptations mitigated the adverse effects of each applied individually; however, aquifer head was always affected more than downstream flow. Where surface water and groundwater are managed as a single resource (conjunctively), groundwater irrigators will likely experience the more significant shortfalls as both measures are implemented.

Summary

Careful allotment of water resources for irrigation is critical to ensuring the resiliency of agriculture in semi-arid regions, and modernizing irrigation technology to minimize inefficient losses is an important tool for farmers and agricultural economies. While modernizing irrigation technology can reduce non-beneficial uses of water such as bare-soil evaporation, it also reduces non-consumptive losses, the water returned to the landscape, often eliminating flowpaths that other users rely on. In basins using a combination of surface water and groundwater, replenishing aquifer storage through enhanced aquifer recharge (EAR) of seasonally available water can mitigate the aquifer drawdown that results from reduced recharge when irrigation efficiency is improved.

We examine the effects of EAR on the system-scale efficiency of modernizing irrigation technology, and the resulting changes to the reuse of non-consumptive losses, using a macro-scale hydrologic model applied to the semi-arid Upper Snake River Basin (USRB) of western Wyoming and southern Idaho, USA. Irrigation technologies were represented explicitly in the model, and available data informed the baseline parameterization of irrigation technology. We simulated a suite of parameterizations that updated existing technologies to be more efficient, each with and without sufficient EAR to stabilize the aquifer at its present-day head.

As expected, simulated changes to irrigation technology resulted in greater downstream export of pristine water and a higher rate of aquifer drawdown when EAR was not simulated. Under current water use and cropping patterns, we were unable to simulate aquifer stabilization while maintaining downstream discharge at any level of irrigation efficiency. We found support for the hypothesis that, as efficiency improves, less EAR is required to maintain a stable aquifer than return flows are reduced by the increased efficiency. To evaluate this hypothesis, we defined the management benefit as a metric comparing the change in irrigation's net recharge with the change in EAR required as irrigation technology became more efficient. The metric generally indicated that less EAR was needed than net recharge was lost, but only in the most efficient case did the management benefit exceed the EAR needed at baseline to stabilize the aquifer.

Increasing the efficiency of irrigation technology reduced reuse (the gross irrigation derived from prior non-consumptive losses), but simulating EAR increased reuse for a given parameterization, leading to higher effective irrigation efficiency. We find that the local groundwater storage users depend on is generally more sensitive to management decisions than downstream flows: drawdown of the aquifer without EAR always exceeded any decrease in discharge induced by EAR. Improving resource sufficiency in semi-arid systems like the USRB will require an array of solutions that weigh benefits to local users against those to downstream users.
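A minimal sketch of the management benefit metric, as we read it from the description above (the paper's exact formulation may differ; all names here are ours):

```python
def management_benefit(net_recharge_base, net_recharge_eff, ear_base, ear_eff):
    """One reading of the management benefit described above: the irrigation
    net recharge lost when moving from the baseline to a more efficient
    technology, minus the additional EAR required to keep the aquifer head
    stable. All arguments are volumes per year (e.g., km3/yr)."""
    recharge_lost = net_recharge_base - net_recharge_eff  # return flows forgone
    extra_ear = ear_eff - ear_base                        # added recharge burden
    return recharge_lost - extra_ear

# A positive value means less additional EAR was needed than net recharge was
# lost. The summary's strongest result corresponds to the benefit exceeding
# ear_base itself, which occurred only in the most efficient scenario.
```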

Point of Contact
John Weyant
Institution(s)
Stanford University
Funding Program Area(s)
Publication