— Ch. 1 · Origins And Invention —
A* search algorithm
In 1968, three researchers at Stanford Research Institute published a paper that would change how computers navigate. Peter Hart, Nils Nilsson, and Bertram Raphael introduced the A* algorithm to solve pathfinding for Shakey the Robot, a mobile robot designed to plan its own actions in an uncertain world. The team needed a way to guide the machine around physical obstacles without human intervention. Nilsson initially proposed using the Graph Traverser algorithm, which relied on a heuristic function estimating the distance from a node to the goal while ignoring the cost already traveled. Raphael suggested combining both values into a single score: the actual cost from the start to the current node plus the estimated remaining distance to the goal. Hart then developed the mathematical concepts of admissibility and consistency for these heuristics. Their work proved that, given a consistent heuristic, no other algorithm guaranteed to find an optimal path could expand fewer nodes than A*; the original paper stated this optimality result as a theorem that holds under specific tie-breaking rules.
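Raphael's combined score is the value that still drives A* today: each node is prioritized by the cost already paid plus the heuristic estimate of what remains. The following is a minimal sketch of the resulting algorithm; the graph representation (nested dicts of edge costs) and all names here are illustrative, not from the original paper.

```python
import heapq

def a_star(graph, start, goal, h):
    """Minimal A* sketch. `graph` maps node -> {neighbor: edge_cost};
    `h(n)` estimates the remaining cost from n to `goal`."""
    # Priority queue ordered by f = g + h (Raphael's combined score).
    open_heap = [(h(start), 0, start, [start])]
    best_g = {start: 0}  # cheapest known cost from start to each node
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return g, path
        if g > best_g.get(node, float("inf")):
            continue  # stale queue entry; a cheaper route was found later
        for nbr, cost in graph[node].items():
            new_g = g + cost
            if new_g < best_g.get(nbr, float("inf")):
                best_g[nbr] = new_g
                heapq.heappush(
                    open_heap, (new_g + h(nbr), new_g, nbr, path + [nbr])
                )
    return None  # goal unreachable

# Tiny illustrative graph with a consistent heuristic.
graph = {"A": {"B": 1, "C": 4}, "B": {"C": 2, "D": 5}, "C": {"D": 1}, "D": {}}
h = {"A": 3, "B": 2, "C": 1, "D": 0}
result = a_star(graph, "A", "D", h.__getitem__)
# result is (4, ['A', 'B', 'C', 'D'])
```

On this graph the greedy route A→C→D costs 5, but the combined score steers the search to the cheaper A→B→C→D at cost 4.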
Mathematical Foundations
The core strength of A* lies in its guarantee of optimal solutions when certain conditions are met. A search algorithm is called admissible if it always returns an optimal solution, and a heuristic is admissible if it never overestimates the true cost of reaching the goal. When A* uses an admissible heuristic, the search will never miss the shortest path. Consistency adds another layer of reliability: if the heuristic value of any node never exceeds the cost of moving to a neighbor plus that neighbor's heuristic value, the heuristic is consistent (also called monotone). This property ensures that each node needs to be expanded only once during the search. In 1985, Rina Dechter and Judea Pearl published a definitive study showing that consistency is required for A* to be optimal in the number of nodes expanded. They analyzed cases where an admissible but inconsistent heuristic causes A* to re-expand nodes, in the worst case exponentially many times. Their research clarified that while admissibility guarantees finding the shortest path, consistency prevents this redundant processing; without it, Dijkstra's algorithm can significantly outperform A* in pathological scenarios. These mathematical properties form the theoretical backbone of modern pathfinding systems.