According to a report from The Scientist, MIT researchers led by William Freeman have used high-speed video to capture a surprising amount of information from the tiny movements ordinary objects make when exposed to sound.
Not only did their recordings capture the movements of inanimate objects with extreme sensitivity; the researchers were also able to translate those movements back into the audio that created them.
“Mary Had a Little Lamb” was reconstructed from the vibrations the song created on a potted plant, and a human voice was reassembled from the vibrations on a bag of chips. Thanks to Freeman’s research, our environment can literally spy on us. Think carefully next time before ranting in front of your Lay’s.
Top image: Researchers analyzed the vibrations from a set of earbuds and were able to identify the music playing. (Credit: Abe Davis, MIT)
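The motion-to-audio idea behind the research can be illustrated with a toy sketch. This is not the MIT team's actual pipeline (which works on real video and recovers sub-pixel motion across many image regions); it is a minimal, self-contained stand-in where a synthetic 1-D "frame" vibrates by whole pixels, each frame's displacement from a reference frame is estimated by cross-correlation, and the resulting displacement sequence, sampled at the camera's frame rate, is itself the recovered sound. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the integer pixel shift between two 1-D intensity
    profiles by locating the peak of their cross-correlation."""
    corr = np.correlate(frame - frame.mean(), ref - ref.mean(), mode="full")
    return int(corr.argmax()) - (len(ref) - 1)

# Synthetic stand-in for real footage: a smooth intensity bump plays the
# role of an object's texture as seen by one scanline of a camera.
n = 256
x = np.arange(n)
ref = np.exp(-((x - n / 2) ** 2) / 200.0)

fps = 2000                  # assumed high-speed frame rate
t = np.arange(400) / fps
# A 50 Hz tone vibrates the object by a few pixels per frame (integer
# displacements, so this idealized recovery can be exact).
true_signal = np.round(3 * np.sin(2 * np.pi * 50 * t)).astype(int)

# "Film" the vibrating object, then recover the motion frame by frame.
recovered = np.array(
    [estimate_shift(ref, np.roll(ref, s)) for s in true_signal]
)

# Normalized, the recovered displacement sequence is a playable waveform
# sampled at the camera's frame rate.
audio = recovered / np.abs(recovered).max()
```

In this idealized setting the recovered sequence matches the driving motion exactly; with real video, the hard parts are measuring displacements far smaller than a pixel and combining noisy estimates from many regions of the frame.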