Gabor uncertainty

Gabor uncertainty is a limit to the accuracy of time-frequency analysis, analogous to Heisenberg's uncertainty principle.

Dennis Gabor (1946) was the first to realize that the uncertainty principle applies to information theory and signal processing. Because time and frequency form a Fourier-transform pair, just as position and momentum do in quantum mechanics, signals turn out to be exactly analogous to quantum systems. As a result, the exact time and frequency of a signal can never be known simultaneously: a signal cannot plot as a point on the time-frequency plane. This uncertainty is a property of signals themselves, not a limitation of our mathematics or instruments.

Heisenberg’s uncertainty principle is usually written in terms of the standard deviation σ<sub>x</sub> of position x, the standard deviation σ<sub>p</sub> of momentum p, and Planck’s constant h:


 * $$\sigma_x \sigma_p \geq \frac{h}{4\pi} \approx 5.3 \times 10^{-35}\ \mathrm{m}^2\,\mathrm{kg\,s}^{-1}$$

In other words, the product of the uncertainties of position and momentum can be no smaller than a non-zero constant. For signals, we do not need Planck’s constant to scale the relationship to quantum dimensions, but the form is the same. If the standard deviations of the time and frequency estimates are σ<sub>t</sub> and σ<sub>f</sub> respectively, then we can write Gabor’s uncertainty principle thus:


 * $$\sigma_t \sigma_f \geq \frac{1}{4\pi} \approx 0.08\ \mathrm{cycles}$$

So the product of the standard deviations of time, in milliseconds, and frequency, in hertz, must be at least about 80 (the units are ms·Hz, or millicycles).
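The limit can be checked numerically. The sketch below (an illustration, not from Gabor's paper; the pulse width `s` is an arbitrary choice) builds a Gaussian pulse, which is the signal that exactly achieves the bound, and computes σ<sub>t</sub> and σ<sub>f</sub> as energy-weighted standard deviations of its time series and amplitude spectrum:

```python
# A numerical check of Gabor's limit: a Gaussian pulse achieves
# sigma_t * sigma_f = 1 / (4 * pi). Pulse width s is an arbitrary choice.
import numpy as np

dt = 0.001                          # sample interval, s
t = np.arange(-2.0, 2.0, dt)       # time axis, s
s = 0.05                            # Gaussian pulse width, s
g = np.exp(-t**2 / (2 * s**2))      # Gaussian pulse, centred on t = 0

# Standard deviation of time, weighted by the signal's energy density |g|^2.
p_t = g**2 / np.sum(g**2)
sigma_t = np.sqrt(np.sum(p_t * t**2))

# Standard deviation of frequency, weighted by the spectral energy |G|^2.
G = np.fft.fft(g)
f = np.fft.fftfreq(t.size, d=dt)    # frequency axis, Hz
p_f = np.abs(G)**2 / np.sum(np.abs(G)**2)
sigma_f = np.sqrt(np.sum(p_f * f**2))

print(sigma_t * sigma_f)            # approximately 1/(4*pi) = 0.0796
```

Any narrower pulse (smaller σ<sub>t</sub>) necessarily has a broader spectrum (larger σ<sub>f</sub>), and vice versa; non-Gaussian signals give a product strictly greater than 1/4π.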