Consequences of oversampling

Courtesy of Applied Technologies, Inc. (ATI)

Oversampling is generally regarded as inefficient because of the redundancy it introduces into the data, but it is rarely considered detrimental. The added bandwidth seldom, if ever, provides new information, since the signal is usually badly attenuated, and therefore of little use, above a certain frequency. The decision to oversample thus rests on the assumption that sampling more frequently can do no real harm.

The problem arises when oversampling is applied to the output of a digital-to-analog (D/A) converter, where the signal is held at a constant voltage (equal to the last reading) between successive samples of the original signal. This is tantamount to repeating each reading n times in the interval between samples (see Figure 1), where n (an integer) is the factor by which the sampling rate is increased. Clearly, the variance of the time series is unaffected by this oversampling. In a spectral plot, this means the area under the curve remains the same but is redistributed over the added bandwidth, a process that is almost the reverse of aliasing.
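To make the effect concrete, here is a minimal Python sketch (NumPy and SciPy assumed available); the oversampling factor n = 5 and the white-noise stand-in for the original series are illustrative choices, not part of the original article:

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
fs = 10.0                          # original sampling rate, Hz
n = 5                              # illustrative oversampling factor
x = rng.standard_normal(4096)      # white-noise stand-in for the original series
x_held = np.repeat(x, n)           # hold each reading: repeat it n times

# The hold adds no variance to the time series.
print(np.var(x), np.var(x_held))   # identical

# The area under each spectrum still equals the variance, but the held
# series spreads it over n times the bandwidth (with a sinc^2 envelope).
f0, P0 = periodogram(x, fs)
f1, P1 = periodogram(x_held, fs * n)
print(np.trapz(P0, f0), np.trapz(P1, f1))   # both approximate var(x)
```

Both printed pairs agree, which is the whole point: the extra samples change where the variance sits in frequency, not how much of it there is.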

A case in point is the Applied Technologies (ATI) sonic anemometer output, which is updated every 0.1 s. In addition to providing serial digital readings of the wind measurements every 0.1 s, the manufacturer offers D/A outputs of the same readings, intended primarily for monitoring. Many investigators not only treat the latter as their primary signal source, but sample them at rates that are multiples of 10 Hz, sometimes even at rates chosen arbitrarily to suit the requirements of other sensors in the field. In some cases, investigators have noted a sharper than expected drop in spectral response at the high-frequency end without knowing its cause.
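A likely culprit is the hold itself: keeping each reading constant for 0.1 s multiplies the true spectrum by a sinc-squared envelope, so the measured response rolls off faster than the sensor's own. A short Python sketch of that attenuation, assuming only the 10 Hz update rate quoted above (np.sinc is the normalized sinc, sin(πx)/(πx)):

```python
import numpy as np

fs = 10.0                              # instrument update rate, Hz
f = np.linspace(0.5, fs / 2, 10)       # frequencies up to the 5 Hz Nyquist limit
H = np.sinc(f / fs)                    # amplitude response of a zero-order hold
atten_db = 20 * np.log10(np.abs(H))

for fi, a in zip(f, atten_db):
    print(f"{fi:4.1f} Hz  {a:6.2f} dB")
# At 5 Hz the hold alone costs about 3.9 dB, in addition to whatever
# attenuation the sensor itself imposes at the high-frequency end.
```

Sampling the held output faster than 10 Hz cannot recover this loss; it only adds images of the same spectrum at higher frequencies.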
