— Ch. 1 · Origins And Early Development —
Facial recognition system.
In the 1960s, Woody Bledsoe began a project that required human hands to map every face. An operator used a graphics tablet to mark coordinates such as pupil centers and eye corners on photographs. This manual process yielded a database of twenty distances between facial features for each face. The computer then compared these measurements to find matches, processing roughly forty photographs per hour. Takeo Kanade later demonstrated a system in 1970 that located anatomical features without human intervention; his machine automatically computed ratios of distances between the chin and other landmarks, making the measurements less sensitive to image scale. Despite early limitations, interest grew enough for Kanade to publish the first detailed book on the subject in 1977.
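The measure-and-compare approach described above can be sketched in a few lines. This is a minimal illustration, not Bledsoe's or Kanade's actual method: the landmark names, coordinates, and database entries below are hypothetical, and the normalization by inter-pupil distance stands in for the scale-invariant ratios the early systems used.

```python
import math

def signature(landmarks):
    """Turn named landmark points into a scale-invariant feature vector:
    all pairwise distances, normalized by the inter-pupil distance."""
    names = sorted(landmarks)
    dists = [math.dist(landmarks[a], landmarks[b])
             for i, a in enumerate(names) for b in names[i + 1:]]
    scale = math.dist(landmarks["pupil_l"], landmarks["pupil_r"])
    return [d / scale for d in dists]

def best_match(probe, database):
    """Return the name whose stored signature is closest (Euclidean)."""
    return min(database, key=lambda name: math.dist(probe, database[name]))

# Hypothetical enrollment database, as an operator might have built it.
db = {
    "alice": signature({"pupil_l": (30, 40), "pupil_r": (70, 41),
                        "nose": (50, 60), "mouth": (50, 80)}),
    "bob":   signature({"pupil_l": (28, 38), "pupil_r": (72, 39),
                        "nose": (50, 68), "mouth": (50, 92)}),
}

# A new photo of the same person, shifted in the frame: the normalized
# distances are unchanged, so the match still succeeds.
probe = signature({"pupil_l": (35, 45), "pupil_r": (75, 46),
                   "nose": (55, 65), "mouth": (55, 85)})
print(best_match(probe, db))  # alice
```

Normalizing every distance by a reference length is what lets the comparison survive changes in photograph scale, which was the point of Kanade's distance ratios.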
Algorithmic Evolution And Machine Learning
Matthew Turk and Alex Pentland developed Eigenfaces in the early 1990s using principal component analysis. Their method reduced the data needed per face by encoding each one as a weighted combination of a small set of global, orthogonal basis images. Pentland later extended this linear model to localized eigenfeatures such as eyes and mouths. By 1997, researchers had improved on Eigenfaces with Fisherfaces, which use linear discriminant analysis to better separate identities. Christoph von der Malsburg's Bochum system encoded faces as elastic grids of Gabor filter responses; it could identify subjects despite mustaches or glasses and was sold commercially as ZN-Face. In 2001, Paul Viola and Michael Jones applied AdaBoost to a cascade of simple image features, making real-time face detection in video practical. Modern systems employ deep learning neural networks with over 120 million connection weights to achieve high accuracy rates.
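The Eigenfaces idea, encoding a face as a handful of weights over orthogonal basis images, can be sketched with NumPy. This is an illustrative toy, not Turk and Pentland's implementation: the "faces" below are random placeholder vectors, and the image size and number of components are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "face images": 50 samples of 32x32 grayscale pixels,
# flattened into 1024-dimensional vectors (random placeholder data).
faces = rng.normal(size=(50, 32 * 32))

# PCA operates on deviations from the mean face, so center the data.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Principal components via SVD: the rows of vt are the "eigenfaces",
# an orthonormal basis ordered by explained variance.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
k = 10                       # keep only the top-k components
eigenfaces = vt[:k]

# Encode a face as k weights instead of 1024 pixel values...
weights = eigenfaces @ (faces[0] - mean_face)

# ...and reconstruct an approximation from those weights alone.
approx = mean_face + weights @ eigenfaces
print(weights.shape)         # (10,)
```

The compression is the point: matching can then compare 10-dimensional weight vectors rather than full images, which is why the method "reduced data processing" relative to pixel-level comparison.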