This study evaluates the accuracy of RRTMG_LW, a radiation code used in the CESM and other GCMs, when spectrally dependent surface emissivity is included in the radiative transfer calculation.
By benchmarking the code, we can assess the accuracy of RRTMG_LW in computing the atmospheric radiative cooling rate when surface spectral emissivity is taken into account.
Recent studies have shown that the representation of surface spectral emissivity has a non-negligible impact on the polar radiation budget and on simulated climate. This motivates including surface spectral emissivity in the radiative transfer schemes employed in climate models. Using a line-by-line radiative transfer model, LBLRTM, as a benchmark, we quantify the errors caused by the treatment of surface spectral emissivity in RRTMG_LW, a radiation scheme widely used in climate models and numerical forecast models. The errors are quantified using representative sounding profiles for the tropics, mid-latitude summer, subarctic winter, and the Sahara desert, each paired with either an ocean or a desert spectral emissivity. Although including surface spectral emissivity makes little difference to the broadband outgoing longwave radiation, it can noticeably change the longwave radiative cooling rate between the surface and 700 hPa. Compared to the benchmark, the largest deviation in radiative cooling rate usually occurs in the lowest atmospheric layers next to the surface, ranging from −0.23 K/day to 0.19 K/day, with the largest fractional difference being −16.22%. The discrepancies caused solely by the representation of surface emissivity are confined to the lowest atmospheric layer. Using the emissivity at the diffusivity angle or the hemispherically averaged emissivity can affect the accuracy of the calculation, especially for strongly anisotropic surface emission. The band-by-band contributions to the discrepancies between RRTMG_LW and the benchmark reveal compensating errors among the bands used in RRTMG_LW.
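For reference, the two emissivity treatments mentioned above can be sketched as follows (standard definitions, not specific formulas from this study). The hemispherically averaged emissivity weights the directional emissivity $\varepsilon(\theta)$ by the projected solid angle, while the diffusivity-angle approach evaluates $\varepsilon$ at a single representative zenith angle:

```latex
% Hemispherically averaged emissivity (flux-weighted over the upward hemisphere):
\bar{\varepsilon} \;=\; \frac{\int_{0}^{\pi/2} \varepsilon(\theta)\,\cos\theta\,\sin\theta\,d\theta}
                              {\int_{0}^{\pi/2} \cos\theta\,\sin\theta\,d\theta}
                 \;=\; 2\int_{0}^{\pi/2} \varepsilon(\theta)\,\cos\theta\,\sin\theta\,d\theta .

% Diffusivity-angle approximation: use the emissivity at the single angle
% theta_d with sec(theta_d) = 1.66 (theta_d ~ 53 deg), the standard
% diffusivity factor used in two-stream longwave schemes:
\varepsilon_{\mathrm{eff}} \;\approx\; \varepsilon(\theta_d), \qquad \sec\theta_d = 1.66 .
```

For an isotropic (Lambertian) surface the two treatments coincide; the differences noted above arise only when $\varepsilon(\theta)$ varies strongly with angle.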