Neural networks are already beating us at games, organizing our smartphone photos, and answering our emails. Eventually, they could be filling jobs in Hollywood. Over at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), a team of six researchers has created a machine-learning system that matches sound effects to video clips. Before you get too excited, the CSAIL algorithm can't do its audio work on just any video, and the sound effects it produces are limited.

The team began by recording an initial batch of 1,000 videos of objects being struck and scraped with a drumstick, then fed those videos through its machine-learning algorithm. By analyzing the physical appearance of the objects in the videos, the movement of the drumstick, and the resulting sounds, the computer learned connections between physical objects and the sounds they make when struck. Then, by "watching" new videos of objects being whacked, tapped, and scraped by drumsticks, the system could calculate the appropriate pitch, volume, and other aural properties of the sound that should accompany each clip.

Learn more at http://www.wired.com/2016/06/mit-artificial-sound-effects/
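To make that recipe concrete, here is a minimal Python sketch of the general idea: fit a mapping from per-clip visual features to audio features on a batch of training clips, then retrieve the closest training sound for a new clip. Everything in it is an illustrative assumption, not CSAIL's actual system: the feature names, the random data, and the simple linear model all stand in for what would really be deep networks trained on video.

```python
# Toy sketch only: learn a map from visual features of a drumstick strike
# (all hypothetical) to audio features, then retrieve the nearest real
# training sound. The actual CSAIL system is far more sophisticated.
import numpy as np

rng = np.random.default_rng(0)

# "Training" data: one row per video clip.
# Hypothetical visual features: [object hardness proxy, strike speed, contact area]
visual_train = rng.random((1000, 3))

# Audio features to predict: [pitch_hz, volume], generated by toy rules here.
audio_train = np.column_stack([
    440 * (1 + visual_train[:, 0]),           # harder objects ring higher (toy rule)
    visual_train[:, 1] * visual_train[:, 2],  # faster, bigger strikes are louder
])

# Fit a linear least-squares map from visual to audio features.
# The real system would use a deep network; this is the simplest stand-in.
W, *_ = np.linalg.lstsq(visual_train, audio_train, rcond=None)

def predict_sound(visual_feats: np.ndarray) -> np.ndarray:
    """Predict [pitch_hz, volume] for a new clip's visual features."""
    predicted = visual_feats @ W
    # Retrieval step: snap the prediction to the nearest training sound,
    # so the output is always a plausible, real recorded sound rather
    # than synthesized audio.
    nearest = np.argmin(np.linalg.norm(audio_train - predicted, axis=1))
    return audio_train[nearest]

new_clip = rng.random(3)        # visual features of an unseen drumstick strike
print(predict_sound(new_clip))  # -> [pitch_hz, volume] of the best match
```

The retrieval step at the end reflects a common trick in this kind of work: when generating raw audio from scratch is too hard, pick the closest-matching real sound from the training set instead, so every output is something a listener would recognize as a plausible strike.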