Sometimes even science fiction can’t keep up with technological developments…
Consider the excellent Person of Interest, a television show that over four seasons (the fifth and final season began last night) has had a long and fascinating story arc exploring the impact of artificial intelligences on our world, especially in the context of surveillance and law enforcement. In the final episode of season 4 (which aired around a year ago), one scene has the character ‘Control’ placing cell phones in a sound-proof box, to stop the artificial intelligence named ‘Samaritan’ from hearing the content of her discussion by hijacking the phones’ microphones.
Turns out that may have been a touch naive, especially coming from someone in her position. Because in real life, a couple of months earlier at TED 2015, researcher Abe Davis demoed software that allowed him and his team to reconstruct the audio of an event purely from video taken on an off-the-shelf camera – no audio recording required at all. They did so by creating algorithms that look for tiny movements of objects in the environment: movements on the order of micrometres, invisible to the human eye and amounting to only a fraction of a pixel in the video. As ‘sound’ is actually air vibration perceived by our ears, the micro-movements captured by the video can be used to reconstruct the air vibrations that caused them.
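Davis’s actual “visual microphone” pipeline is far more sophisticated (it analyses motion across scales and orientations in high-speed video), but the core idea – that a sub-pixel displacement driven by sound leaves a measurable trace in pixel intensities, from which the sound’s waveform can be recovered – can be sketched in a few lines. The following toy example, with every parameter invented for illustration, simulates a soft edge vibrating at 440 Hz by less than a thousandth of the scene width, then recovers that tone from the frame intensities alone:

```python
import numpy as np

# Invented parameters: a 440 Hz tone "filmed" at 2400 frames per second.
fps, tone_hz, n_frames = 2400, 440.0, 2048
x = np.linspace(0.0, 1.0, 256)

# Reference intensity profile of a static scene (a soft edge).
ref = 1.0 / (1.0 + np.exp(-40.0 * (x - 0.5)))
grad = np.gradient(ref, x)

t = np.arange(n_frames) / fps
# Sub-pixel displacement caused by the sound: ~1/1000 of the scene width.
disp = 1e-3 * np.sin(2.0 * np.pi * tone_hz * t)

# First-order Taylor expansion: I(x - d) ~ I(x) - d * I'(x),
# i.e. a tiny shift shows up as an intensity change proportional to the gradient.
frames = ref[None, :] - disp[:, None] * grad[None, :]

# Recover a per-frame motion signal by projecting each frame difference
# onto the spatial gradient (a standard linearized motion estimate).
signal = (frames - ref) @ grad / (grad @ grad)

# The dominant frequency of the recovered signal is the original tone.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(n_frames)))
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)
peak = freqs[np.argmax(spectrum)]
print(f"recovered tone: {peak:.1f} Hz")
```

The recovered peak lands within the FFT’s frequency resolution of 440 Hz. Note the frame rate must exceed twice the sound frequency to capture it at all, which is why the real work leaned on high-speed cameras for full-bandwidth audio.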
Watch the amazing demonstration below:
While Samaritan would surely employ such a system in its ‘sousveillance’ arsenal, Davis does point out that there are other applications as well, including creating an actual ‘3D model’ of an object and its natural movements, simply by capturing its subtle motion.