— Ch. 1 · Origins And Creation —
Kismet (robot)
In the 1990s, Dr. Cynthia Breazeal built a robot head at the Massachusetts Institute of Technology to explore affective computing. She named the machine Kismet, after a Turkish word meaning fate or luck. The project aimed to create a device that could recognize and simulate human emotions through physical interaction, bridging the gap between cold machinery and warm social connection. Breazeal wanted to see whether a simple head could learn to engage with people the way an infant does.
Hardware Design Specs
The physical build cost approximately US$25,000 in materials. Engineers installed four Motorola 68332 processors alongside nine 400 MHz PCs and one additional 500 MHz PC. These components powered the auditory and visual sensors required for basic interaction. The design included ears, eyebrows, eyelids, lips, jaw, and head mechanisms to generate facial expressions, and proprioception allowed the head to sense its own position relative to the environment.
Software Architecture Systems
A synthetic nervous system processed raw data from stereo cameras and microphones into emotional categories. The vision module detected eye contact, motion, and skin color, despite some controversy over the latter method. When Kismet moved its head, it temporarily disabled motion detection so that it would not register its own movement as external motion. Audio processing focused on identifying the affective speech patterns common in infant-directed communication, and the system classified vocal intent into five types: approval, prohibition, attention, comfort, and neutral.
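The published descriptions here do not include source code, but the classification step can be pictured as a small rule-based mapping from prosodic features of an utterance to the five intents. The sketch below is illustrative only: the feature names, thresholds, and rules are assumptions for clarity, not Kismet's actual algorithm.

```python
# Illustrative sketch (not Kismet's real code): sorting infant-directed speech
# into the five affective intents the system recognized, using two invented
# prosodic features (mean pitch and pitch range) plus loudness.
from dataclasses import dataclass


@dataclass
class Utterance:
    mean_pitch_hz: float   # average fundamental frequency of the utterance
    pitch_range_hz: float  # spread between highest and lowest pitch
    energy: float          # overall loudness, normalized to 0..1


def classify_affect(u: Utterance) -> str:
    """Toy rule-based classifier; thresholds are made up for illustration."""
    if u.mean_pitch_hz > 300 and u.pitch_range_hz > 150:
        return "approval"      # exaggerated, high, swooping pitch
    if u.energy > 0.8 and u.pitch_range_hz < 80:
        return "prohibition"   # loud, clipped, low-variation speech
    if u.pitch_range_hz > 150:
        return "attention"     # wide, attention-getting contours
    if u.energy < 0.3 and u.mean_pitch_hz < 220:
        return "comfort"       # soft, low, soothing speech
    return "neutral"


print(classify_affect(Utterance(mean_pitch_hz=320, pitch_range_hz=180, energy=0.6)))
# -> "approval"
```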
Social Interaction Models
Breazeal framed her relationship with the robot as an infant-caretaker dynamic in which she acted as the parent figure. She provided scaffolding for Kismet's development through a motivation system that dictated its behavioral responses. At any given moment, the machine could occupy only one emotional state, such as anger or happiness. Breazeal explicitly stated that Kismet was not conscious and possessed no actual feelings behind these displays. The robot communicated its motivational state through emotive facial expressions such as extreme anger, disgust, excitement, fear, interest, sadness, surprise, tiredness, and sleep.
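To make the single-active-state idea concrete, here is a minimal sketch of a winner-take-all motivation loop: internal drives accumulate and exactly one emotional state is active at a time. The drive names, values, and update rule are invented for illustration and are not Breazeal's implementation.

```python
# Illustrative sketch of a motivation system in which one emotional state
# is active at any moment, as the section describes. All values are invented.
from enum import Enum, auto


class Emotion(Enum):
    ANGER = auto()
    DISGUST = auto()
    EXCITEMENT = auto()
    FEAR = auto()
    INTEREST = auto()
    SADNESS = auto()
    SURPRISE = auto()
    TIREDNESS = auto()
    SLEEP = auto()


class MotivationSystem:
    def __init__(self) -> None:
        # Hypothetical drive levels; Kismet's real drives were tied to things
        # like social stimulation and fatigue, regulated by the caretaker.
        self.drive_levels = {e: 0.0 for e in Emotion}
        self.current_state = Emotion.INTEREST

    def update_drive(self, emotion: Emotion, delta: float) -> None:
        self.drive_levels[emotion] = max(0.0, self.drive_levels[emotion] + delta)

    def step(self) -> Emotion:
        # Winner-take-all: the single strongest drive becomes the active state.
        self.current_state = max(self.drive_levels, key=self.drive_levels.get)
        return self.current_state


system = MotivationSystem()
system.update_drive(Emotion.TIREDNESS, 0.9)  # a long interaction wears it out
system.update_drive(Emotion.INTEREST, 0.4)   # a toy enters its field of view
print(system.step())                         # -> Emotion.TIREDNESS
```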
Audio And Motor Output
The device spoke a proto-language of phonemes resembling human baby babbling. A DECtalk voice synthesizer handled the pitch, timing, and articulation shifts that conveyed different emotions, and intonation varied between question-like and statement-like utterances depending on context. Lip synchronization followed animation strategies that prioritize visual shorthand over perfect imitation of mouth movements, an approach that lets viewers accept the simulation without scrutinizing its simplicity.
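One way to picture this pipeline is a lookup from the current emotion to prosody settings applied before the babbled phonemes reach the synthesizer. Kismet drove DECtalk with its own control scheme; the parameter names and values below are hypothetical and serve only to show the emotion-to-prosody mapping.

```python
# Illustrative sketch: choosing prosody settings for a babbled "utterance"
# based on the active emotion. Field names and numbers are invented.
from dataclasses import dataclass


@dataclass
class Prosody:
    base_pitch_hz: int    # baseline fundamental frequency
    pitch_range_pct: int  # how widely pitch swings around the baseline
    rate_wpm: int         # speaking rate in words per minute


EMOTION_PROSODY = {
    "excitement": Prosody(base_pitch_hz=260, pitch_range_pct=180, rate_wpm=210),
    "sadness":    Prosody(base_pitch_hz=180, pitch_range_pct=60,  rate_wpm=120),
    "anger":      Prosody(base_pitch_hz=220, pitch_range_pct=90,  rate_wpm=190),
    "neutral":    Prosody(base_pitch_hz=200, pitch_range_pct=100, rate_wpm=160),
}


def shape_utterance(babble: str, emotion: str, question: bool = False) -> dict:
    """Return synthesizer settings for a string of babbled phonemes."""
    p = EMOTION_PROSODY.get(emotion, EMOTION_PROSODY["neutral"])
    return {
        "text": babble,
        "base_pitch_hz": p.base_pitch_hz,
        "pitch_range_pct": p.pitch_range_pct,
        "rate_wpm": p.rate_wpm,
        # A rising final contour reads as a question, a falling one as a
        # statement, echoing the intonation contrast the section mentions.
        "final_contour": "rising" if question else "falling",
    }


print(shape_utterance("ba da goo", "excitement", question=True))
```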
Legacy And Museum Display
Kismet now resides in the MIT Museum following the conclusion of its experimental period. The original head remains on display as an artifact of early robotics research and as evidence of 1990s efforts to make machines socially responsive. Visitors can observe the hardware that powered this pioneering affective computing project, and the museum preserves the legacy of Dr. Cynthia Breazeal's work in social robotics.