In a study at New York University, a baby named Sam wore a headcam in weekly sessions that captured his experience over 18 months. Scientists loaded the resulting 61 hours of sights and sounds into a simple AI program to investigate how children learn language. The experiment simultaneously demonstrated that AI is able to pick up some basic elements of language from a relatively small amount of input—and without prior knowledge of language structure. Considering that most well-known AI technologies—such as ChatGPT—“learn” their language from trillions of words of input, the software used in this study more closely mimicked how children acquire vocabulary (although the technology wasn’t able to acquire the child’s views on, say, his favorite foods).