— Ch. 1 · Mathematical Foundations And History —
Deconvolution
Norbert Wiener, then at the Massachusetts Institute of Technology, published Extrapolation, Interpolation, and Smoothing of Stationary Time Series in 1949. The book laid the groundwork for deconvolution theory; the research behind it had been classified during World War II and was released to the public only afterward. Early applications emerged shortly after publication in weather forecasting and economics, fields that needed ways to recover original signals from measurements distorted by their instruments.
Core Techniques And Algorithms
Naive deconvolution reduces to simple inverse filtering when measurement noise is very low. In physical measurements, noise enters the recorded signal as an additive epsilon term that complicates recovery. If an estimate treats a noisy image as noise-free, its statistics are wrong and the inverse filter amplifies the noise; the lower the signal-to-noise ratio, the worse the final estimate becomes. Wiener deconvolution improves on naive inversion when the spectral character of the noise, such as an additive white noise assumption, is known for the data set.
Seismic Imaging Applications
Enders Robinson worked with Norbert Wiener and Norman Levinson at MIT in 1950. They developed the convolutional model for reflection seismograms used to map the Earth's structure: a seismic wavelet w(t) convolves with an Earth-reflectivity function e(t) to produce the recorded signal s(t). Seismologists assume the reflectivity is white, which simplifies the power-spectrum calculations. A Wiener filter is then designed to shape the estimated wavelet toward a Dirac delta spike, giving a clearer interpretation of the reflectivity.
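The convolutional model and the two deconvolution strategies above can be sketched in a few lines of NumPy. This is a minimal illustration, not Robinson's actual processing: the wavelet shape, trace length, spike count, noise level, and the noise-to-signal ratio `nsr` are all assumed values chosen for the demo.

```python
import numpy as np

# Sketch of the convolutional model s(t) = w(t) * e(t) + noise and of
# Wiener deconvolution. All numeric parameters here are illustrative
# assumptions, not values from the text.
rng = np.random.default_rng(0)
n = 256

# Sparse "white" reflectivity series e(t): a few random spikes.
e = np.zeros(n)
idx = rng.choice(n, size=12, replace=False)
e[idx] = rng.normal(0.0, 1.0, size=12)

# A simple band-limited, Ricker-like wavelet w(t).
t = np.arange(-32, 32)
w = (1.0 - 0.5 * (0.2 * t) ** 2) * np.exp(-0.25 * (0.2 * t) ** 2)

# Recorded trace: circular convolution of w and e, plus additive noise.
W = np.fft.fft(w, n)
E = np.fft.fft(e, n)
s = np.real(np.fft.ifft(W * E)) + rng.normal(0.0, 0.05, size=n)
S = np.fft.fft(s)

# Naive inverse filtering divides by W and blows up where |W| is small,
# amplifying the noise exactly as the text describes.
naive = np.real(np.fft.ifft(S / W))

# Wiener deconvolution regularizes the division with an assumed
# noise-to-signal power ratio, damping frequencies where noise dominates.
nsr = 0.01
wiener = np.real(np.fft.ifft(np.conj(W) * S / (np.abs(W) ** 2 + nsr)))

# Spiking: the same filter applied to the wavelet itself compresses it
# toward a Dirac delta at lag zero.
spiked = np.real(np.fft.ifft(np.abs(W) ** 2 / (np.abs(W) ** 2 + nsr)))

print("naive error: ", np.linalg.norm(naive - e))
print("wiener error:", np.linalg.norm(wiener - e))
```

Running the sketch shows the naive estimate dominated by amplified high-frequency noise, while the Wiener estimate stays close to the spike train e(t); the `spiked` array peaks sharply at lag zero, which is the shaping behavior the seismic application relies on.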