
MIT’s Movie Magic Will Change the Future of Filmmaking

Michael Maher

The MIT artificial intelligence lab is about to change the way movies are made and viewed. Take a look at three revolutionary new programs that must be seen to be believed.

Top image via TED Talks

MIT’s Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) continues to make filmmaking headlines — so much so that we need to round up all the industry-changing advances they’ve already made.

Instant VFX — Touch and Move Objects in Post

GIF via MIT / Abe Davis

Have you ever needed to film a tree blowing in the wind, only to go out and shoot on a day when the air is as still as can be? With Interactive Dynamic Video (IDV), you could move an object after it was filmed. Let that sink in for a minute.

You could shoot the stationary tree, then, in post, grab the leaves and shake the tree. That’s a rather insane concept to consider. Much like a motion designer can manipulate shape layers however they want, filmmakers could do the same with backgrounds — and potentially actors.

GIF via MIT / Abe Davis

A computer algorithm searches the entire video clip for incredibly small vibrations or movements of an object. In the video below, you’ll see how researchers shot a five-second video of a wireframe character. By analyzing the character’s subtle movements, the algorithm creates the Interactive Dynamic Video — which allows users to push, pull, and otherwise interact with an object using the mouse cursor.

The algorithm creates “vibration modes” to determine realistic movements, fills in the blank space in the background when an object changes position, and works without even knowing the 3D shape of the object.
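The core idea behind those “vibration modes” — finding an object’s dominant vibration frequencies from tiny per-pixel motions over time — can be sketched in a few lines. This is a toy illustration on a synthetic “video,” not CSAIL’s actual implementation; the frame rate, array shapes, and frequency are all made up.

```python
import numpy as np

# Toy "video": 60 frames of an 8x8 patch oscillating at a known frequency.
# A conceptual sketch of the modal-analysis idea behind IDV: tiny motions
# are analyzed in the frequency domain to find dominant vibration modes.
fps = 30.0
n_frames = 60
t = np.arange(n_frames) / fps
true_freq = 3.0  # Hz — the object's dominant vibration mode (made up)
video = np.sin(2 * np.pi * true_freq * t)[:, None, None] * np.ones((n_frames, 8, 8))

# Per-pixel temporal FFT, then average the spectra over all pixels.
spectrum = np.abs(np.fft.rfft(video, axis=0)).mean(axis=(1, 2))
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)

dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component
print(f"Recovered dominant vibration mode: {dominant:.2f} Hz")
# prints: Recovered dominant vibration mode: 3.00 Hz
```

In the real system, those recovered modes become a basis for simulating new, physically plausible motion when you drag the object with the cursor.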

The study was conducted by Abe Davis, who previously revealed IDV at a 2015 TED Talk. It was there he also talked about previous projects, like studying the effects sound waves have on objects — the Visual Microphone.

Visit the Interactive Dynamic Video website to read all about the work being done by Davis and his fellow students.

AI-Generated Sound Effects


In June 2016, the first whispers of MIT’s technological filmmaking advances arose in a Washington Post article on computer sound design. It was such a breakthrough that we included it in an article titled Six Technologies Poised to Change the Future of Film.

In a project dubbed Visually Indicated Sounds, the researchers created an algorithm that could accurately predict the sound of one object hitting another.

From the MIT video below: “[S]ounds can tell us how objects behave when we physically interact with them. They can convey whether an object is hard or soft, smooth or rough, and rigid or deformable.”

For the study, researchers recorded themselves using a drumstick to hit various objects like walkways, bookshelves, glass, and bushes. They then loaded a series of predictive sounds into a computer. When a silent film of a person hitting an object with a drumstick appears on screen, the algorithm creates the appropriate sound effect to match the action.

From the researchers’ paper:

In this paper, we propose the task of predicting what sound an object makes when struck as a way of studying physical interactions within a visual scene. We present an algorithm that synthesizes sound from silent videos of people hitting and scratching objects with a drumstick. This algorithm uses a recurrent neural network to predict sound features from videos and then produces a waveform from these features with an example-based synthesis procedure. We show that the sounds predicted by our model are realistic enough to fool participants in a “real or fake” psychophysical experiment, and that they convey significant information about material properties and physical interactions.
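The retrieval step the abstract calls “example-based synthesis” can be sketched as a nearest-neighbor lookup: the predicted sound features select the closest recorded hit from a library, and that example’s waveform is reused. Everything here — the feature dimension, library size, and random stand-in “features” — is illustrative, not the paper’s actual code.

```python
import numpy as np

# Sketch of example-based synthesis: match predicted sound features to the
# nearest recorded example and reuse its waveform. Shapes are made up.
rng = np.random.default_rng(0)

n_examples, feat_dim, wav_len = 100, 16, 2205  # e.g. 50 ms at 44.1 kHz
library_features = rng.normal(size=(n_examples, feat_dim))   # from recorded hits
library_waveforms = rng.normal(size=(n_examples, wav_len))   # matching audio

def synthesize(predicted_features: np.ndarray) -> np.ndarray:
    """Return the waveform of the closest-matching recorded example."""
    dists = np.linalg.norm(library_features - predicted_features, axis=1)
    return library_waveforms[np.argmin(dists)]

# In the paper this feature vector comes from the RNN; here we fake one
# close to a known library entry.
pred = library_features[42] + 0.01 * rng.normal(size=feat_dim)
wav = synthesize(pred)
print(wav.shape)  # prints (2205,)
```

The appeal of retrieval over raw waveform generation is that every output is a real recording, so it can’t sound synthetic — the model only has to get the match right.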

Here’s a look at the algorithm in action. You’ll notice how it correlates movement with the material of the object being hit. It’s truly eye-opening stuff.

This technology has enough potential to become the definitive source for foley sounds. Studios would be able to record foley artists performing a variety of repetitive tasks, and the computer could determine the best and most realistic use of the sound — or potentially combine sounds to create the perfect sound effect.

Glasses-Free 3D Movie Theater Screens

Ahead of SIGGRAPH 2016, MIT announced yet another major advancement — this one targeted at movie-going audiences. Cinema 3D is a prototype lens display that projects 3D images to every seat in the theater without the need for 3D glasses. The astonishing feat comes from MIT CSAIL’s latest paper, produced in collaboration with Israel’s Weizmann Institute of Science.

Existing approaches to glasses-free 3-D require screens whose resolution requirements are so enormous that they are completely impractical… This is the first technical approach that allows for glasses-free 3-D on a large scale. – MIT professor Wojciech Matusik

Here’s a look at how Cinema 3D works.

You can read more about glasses-free 3D in our previous article.

Enjoy reading about future tech? Let us know if you’d like to see more articles like these.