Ultrasonic sensors are pretty cool. Transducers send and receive waves, measurements magically get spit out by the sensor, and nothing had to touch anything, right? You can choose sensors based on range, or type of transducer, or even safety ratings. Most sensors declare their range and frequency together, which seems kind of odd.
Why is the range of an ultrasonic sensor tied to its frequency? For the answer, we turn again to our old friend, high school physics.
Ultrasonic sensors use ultrasonic waves to measure distance; ultra being the fancy science word for “higher than” or “more than,” and sonic meaning sound (rather than a speedy, blue cartoon hedgehog or a drive-in/drive-through fast food chain). But you knew that.
The general idea behind ultrasonic sensors is Distance = Time * Velocity. For a given set of atmospheric conditions (air temperature, humidity, air pressure, etc.) the velocity of sound is known. So, the time required for a sound wave to travel from the transducer of the sensor to a target object or surface and back must be directly related to the distance between them. But again, you knew that.
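That Distance = Time * Velocity idea can be sketched in a few lines. This is a minimal illustration, not any particular sensor's firmware; the temperature-dependent speed-of-sound formula (331.3 + 0.606 * T °C, in m/s) is a common dry-air approximation, and the helper name is ours:

```python
def distance_m(round_trip_time_s, temp_c=20.0):
    """Distance to a target from an echo's round-trip time."""
    speed_of_sound = 331.3 + 0.606 * temp_c  # m/s, dry-air approximation
    # Halve the result: the wave travels out to the target and back.
    return round_trip_time_s * speed_of_sound / 2

# An echo returning after ~5.83 ms puts the target about 1 m away:
print(round(distance_m(5.83e-3), 2))
```

Note the divide-by-two: the measured time covers the trip out *and* back, so forgetting it doubles every reading.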
What we want to know is why higher frequency sensors have shorter effective detection ranges.
Attenuation is the culprit behind all kinds of signal loss. We blame attenuation for the accumulation of factors working against signal propagation the way we blame friction for the accumulation of factors working against motion.
This holds true for sound waves. Sound waves physically move through the air (and water, and physical objects, etc.), so properties of the air that oppose that motion get bundled together and called attenuation. As we noted above, atmospheric conditions affect the speed of sound; as such, they are a big part of the attenuation of sound in air.
A decent approximation for the maximum attenuation of sound waves between 50 kHz and 300 kHz is α(f) = 0.022 * f – 0.6, where α(f) is the maximum attenuation in dB/ft, and f is the frequency of the sound wave in kHz.
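As a quick sanity check, here is that approximation as a function. The guard clause reflects the stated 50–300 kHz validity window; everything else is a direct transcription of the equation above:

```python
def max_attenuation_db_per_ft(freq_khz):
    """Approximate maximum attenuation of sound in air, in dB/ft.

    Valid only for frequencies between 50 and 300 kHz.
    """
    if not 50 <= freq_khz <= 300:
        raise ValueError("approximation valid for 50-300 kHz only")
    return 0.022 * freq_khz - 0.6

print(round(max_attenuation_db_per_ft(50), 2))   # 0.5 dB/ft
print(round(max_attenuation_db_per_ft(150), 2))  # 2.7 dB/ft
```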
At first glance, this equation doesn’t seem to grant much wisdom to anyone about anything.
“A factor of 0.022? That’s pretty minuscule!” Yes, yes it is.
At 50 kHz, attenuation is 0.5 dB/ft. But at 150 kHz? Attenuation has jumped to 2.7 dB/ft. Considering that dB is measured logarithmically, the difference between 5 dB lost over 10 feet and 27 dB lost over the same 10 feet isn’t a factor of ~5 (27/5 = 5.4). In power terms, it’s closer to 160 (10^((27 − 5)/10) ≈ 158)!
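If the logarithm-juggling feels slippery, the arithmetic is short enough to run. This just replays the numbers above, converting the dB difference into a linear power ratio:

```python
# Attenuation over 10 ft, using alpha(f) = 0.022 * f - 0.6 (dB/ft):
loss_50_db = (0.022 * 50 - 0.6) * 10    # 5 dB at 50 kHz
loss_150_db = (0.022 * 150 - 0.6) * 10  # 27 dB at 150 kHz

# A difference in dB converts to a power ratio via 10^(dB/10):
power_ratio = 10 ** ((loss_150_db - loss_50_db) / 10)
print(round(power_ratio))  # 158
```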
And that’s where the signal goes! That logarithmic scale describes not a gradual decline but a sharp drop-off in sound propagation.
Granted, that’s not the end of the story. The equation above generalizes across the full range of humidity and doesn’t account for atmospheric pressure changes, but those effects are minimal. More importantly, remember that the equation measures dB lost per foot. It doesn’t know how sensitive your sensor is or how much power is behind your signal.
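To see how sensitivity and power enter the picture, here is a back-of-the-envelope sketch. The "link budget" figure is hypothetical: a single number bundling transmit power and receiver sensitivity, which the attenuation equation alone can't tell you. Real datasheets just specify range directly:

```python
def echo_range_ft(freq_khz, budget_db):
    """Rough detection range: the echo's round trip can lose at most
    budget_db before it drops below what the receiver can detect.

    budget_db is a made-up illustrative figure, not a datasheet value.
    """
    alpha_db_per_ft = 0.022 * freq_khz - 0.6  # attenuation approximation
    # The round trip doubles the path, so halve the one-way budget range.
    return budget_db / (2 * alpha_db_per_ft)

# Same hypothetical 40 dB budget, two frequencies:
print(round(echo_range_ft(50, 40), 1))   # 40.0 ft at 50 kHz
print(round(echo_range_ft(150, 40), 1))  # 7.4 ft at 150 kHz
```

Triple the frequency, and the same budget buys you roughly a fifth of the range. That’s the frequency-range tradeoff in miniature.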
And that’s why it’s important to remember that not all ultrasonic sensors are created equal. Some sensors are very basic, with little to no configurable options. Others let you configure ALL THE THINGS! In the end, you still need to choose the sensor that best fits your needs. Yes, the frequency is an important piece of the puzzle, but it’s still only one piece.