Careful allotment of water resources for irrigation is critical for ensuring the resilience of agriculture in semiarid regions, and modernizing irrigation technology to minimize inefficient water losses is an important tool for farmers and agricultural economies. While modernizing irrigation technology can reduce the nonbeneficial use of water, such as bare soil evaporation and nonconsumptive losses, it also reduces the water returned to the landscape, often eliminating flow paths that other users rely on. In basins using a combination of surface water and groundwater, replenishing aquifer storage through the managed aquifer recharge (MAR) of seasonally available water can mitigate the aquifer drawdown that results from reduced recharge when irrigation efficiency is improved. We examine the effects of MAR on the system-scale efficiency of modernizing irrigation technology and the resulting changes in the reuse of nonconsumptive losses, using a macroscale hydrologic model applied to the semiarid upper Snake River basin (USRB) of western Wyoming and southern Idaho, USA. Irrigation technologies were represented explicitly in the model, and available data informed baseline parameterizations of the irrigation technology. A suite of parameterizations was simulated that updated the existing technologies to be more efficient, both with and without sufficient MAR to stabilize the aquifer at its present-day head. As expected, simulated changes in irrigation technology resulted in greater downstream export of pristine water and a higher rate of aquifer drawdown when MAR was not simulated. Under current water use and cropping patterns, we were not able to simulate aquifer stabilization while maintaining downstream discharge at any level of irrigation efficiency. We found support for the hypothesis that, as efficiency improves, the additional MAR required to maintain a stable aquifer is less than the net recharge lost when return flows are reduced by the increased efficiency.
To evaluate this hypothesis, we defined the management benefit, a metric comparing the change in irrigation's net recharge with the change in MAR required as irrigation technology became more efficient. The metric generally indicated that less MAR was needed than net recharge was lost, but only in the most efficient case did the management benefit exceed the MAR needed at the baseline to stabilize the aquifer. Increasing the efficiency of irrigation technology reduced the share of gross irrigation derived from the reuse of prior nonconsumptive losses, but simulating MAR increased reuse for a given parameterization, leading to higher effective irrigation efficiency. We find that the local groundwater storage users depend on is generally more sensitive to management decisions than downstream flows, and the drawdown of the aquifer without MAR always exceeded any decrease in discharge induced by MAR. Improving resource sufficiency in semiarid systems like the USRB will require an array of solutions that balance benefits to local and downstream users.
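One way to formalize the management benefit described above (the symbols here are illustrative and not necessarily the notation used in the full text): letting $\Delta R_{\mathrm{net}}$ be the change in irrigation's net recharge and $\Delta Q_{\mathrm{MAR}}$ the change in MAR required to stabilize the aquifer, both measured relative to the baseline parameterization as efficiency increases,

$$B = \left|\Delta R_{\mathrm{net}}\right| - \Delta Q_{\mathrm{MAR}},$$

so that $B > 0$ indicates that less additional MAR is needed than the net recharge lost to improved efficiency, consistent with the result reported here that the metric was generally positive.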