Experimental Investigation of Thermal Dispersion in Saturated Soils with One-Dimensional Water Flow
Microscopic differences in soil pore water velocity cause hydrodynamic dispersion of solutes during transport. By analogy, thermal dispersion should be considered in soils where water and heat flow occur simultaneously, yet few published data are available regarding thermal dispersion in soils. In this study, we investigated the influence of thermal dispersion on heat transport in saturated soils with one-dimensional water flow. A new soil container was developed to eliminate wall-flow influences on saturated water flow. An inverse model was applied to obtain the thermal dispersion coefficient (λdis) by fitting the conduction–convection–dispersion (CCD) model to soil temperature change as a function of time from heat-pulse measurements. Thermal dispersion became significant when the water flux exceeded a threshold value. In the studied water flux range, the maximum contribution of hydrodynamic dispersion to effective heat conduction [λdis/(λdis + λ0), where λ0 is the bulk thermal conductivity] was 6, 9, and 12% in a sand, a silt loam, and a sandy clay loam, respectively. A power function relationship was established between λdis and the soil water flux density (Jw): λdis = kJw^0.9, where k is a coefficient related to soil texture. Experimental evaluation indicated that, compared with the conduction–convection model, the CCD model performed better in describing the temperature change vs. time data from heat-pulse measurements.
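The power-law relation and the dispersion contribution ratio stated above can be sketched in a few lines of code. This is a minimal illustration only: the parameter values for k (texture-dependent coefficient) and λ0 (bulk thermal conductivity) below are hypothetical placeholders, not the fitted values from this study, and units are assumed to be consistent with λdis in W m⁻¹ K⁻¹.

```python
def thermal_dispersion(jw, k=2.0e-4):
    """Thermal dispersion coefficient lambda_dis = k * Jw**0.9.

    jw : soil water flux density (units consistent with k).
    k  : texture-related coefficient (hypothetical value here).
    """
    return k * jw ** 0.9


def dispersion_fraction(jw, k=2.0e-4, lambda_0=1.5):
    """Contribution of dispersion to effective heat conduction:
    lambda_dis / (lambda_dis + lambda_0), where lambda_0 is the
    bulk thermal conductivity (hypothetical value here).
    """
    lam_dis = thermal_dispersion(jw, k)
    return lam_dis / (lam_dis + lambda_0)
```

Because the exponent (0.9) is positive, λdis grows monotonically with Jw, so the dispersion fraction also increases with water flux, consistent with dispersion becoming significant only above a threshold flux.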