A Brief History of RAW Footage in Video Production
RAW footage wasn’t a video production option until a little more than a decade ago. Let’s take a look at how the industry got to where it is today.
Most productions take the ability to shoot and work with RAW footage for granted. Many people forget that RAW wasn’t even a concept for video cameras until about a decade ago.
In this article, we’ll go through the fascinating story of the history of RAW and see what that might show us about the future of this kind of imaging.
Early History of Digital Cinema
Film is called “film” for a reason. While celluloid remained the primary medium for cinema for over a century, the history of electronic video cameras also dates back to the early 20th century. Early tube-based cameras were used extensively for television broadcast until the 1970s, when manufacturers like Panasonic and JVC introduced consumer magnetic videotape formats. This ushered in the era of tape recording, which would make its way into nearly every household with the advent of VHS technology.
Video technology continued to improve as the millennium drew to an end. With the rise of personal computing and increasing internet speed and connectivity, the possibility of all-digital motion picture capture and finishing started to become a reality.
The (much-maligned) Star Wars prequel trilogy, specifically the first two installments, brought digital cinema into the common consciousness. While earlier films had been produced entirely digitally, no production of Star Wars’ magnitude had taken the gamble of trading film stock for hard drives.
Between The Phantom Menace and Attack of the Clones, Tim Burton’s Planet of the Apes laid the foundation for the encoding and decoding of compressed MPEG-2 footage, and Jurassic Park III became the world’s first entirely digital premiere.
The success of these films drove rapid advancement of the technologies needed to deliver and project digital files visually equal to, or better than, the 35mm standard for cinema projection.
In 2005, Digital Cinema Initiatives (DCI) published the first “Digital Cinema System Specification.” DCI is an entity founded and chaired by members of the seven major motion picture studios, with the primary goal of setting standards for every process involved in digital imaging, editing, mastering, delivery, and projection. With best practices codified, industry innovation kept pace with the rapid adoption of digital formats.
It was a fascinating and exciting time, but RAW video recording hadn’t been thought of yet.
The Origins of RAW
In fact, RAW was being invented and adopted in digital photography at nearly the same time that digital cinema was gaining supporters in Hollywood.
In 2004, Adobe released the first version of its .DNG, or Digital Negative, file format, built on the TIFF/EP standard. DNG stored image data taken directly from the sensor, bypassing the heavy in-camera compression found in cameras of the time. The format saw near-immediate integration into cameras from Hasselblad and Leica, two of the biggest names in high-end photography.
A key advantage of building DNG on the TIFF/EP standard was that camera settings and processing information could be stored as metadata alongside the unprocessed image data, ready for reading, processing, and adjustment on a computer at any point.
RAW image capture quickly became the focus of the photography community at large.
Eventually, camera manufacturer-specific varieties of RAW largely replaced DNG.
During this time, the video and digital cinema industries were largely focused on the transition from SD to HD and the transition away from magnetic tape recording media to more reliable, higher-quality, solid-state recording.
The industry wasn’t prepared for what was on the way.
The Era of RAW Video
In 2007, an unknown company helmed by the founder and then-CEO of Oakley sunglasses announced a camera with a feature set that was nothing short of revolutionary, at a price orders of magnitude below the competition.
At the heart of the RED ONE’s groundbreaking feature set was its ability to record RAW video. Coupled with 4K recording at a time when the industry was still looking toward HD, and a price tag of roughly $20,000, the response was overwhelmingly positive: customers waited months, even years, for their pre-orders to be filled.
As RED worked out the kinks in its supply chain and ironed out software bugs, it became clear the company was here to stay. The major camera conglomerates had to scramble to respond or risk losing their professional customer base.
The industry was far from settled, however. In late 2008, Canon released the 5D Mark II, updating its mid-range flagship DSLR line with one notable item on the spec sheet: 1920×1080 video recording at 30fps. Previous cameras from both Canon and Nikon had offered “HD” recording only at 720p or at unusable frame rates.
The film and video world immediately began to buzz, imagining the possibilities of shooting with a full-frame sensor, far larger than a Super 35 motion-picture frame, inside a DSLR-sized package.
The following generation of DSLRs brought “Full HD” recording at usable frame rates for video use from nearly every major manufacturer.
For a while, the DSLR craze drew attention away from RAW video. Around this time, ProRes recording was becoming more prevalent through external recording devices, largely solving the primary drawback of shooting to the heavily compressed codecs inside the cameras.
In 2009, Trammell Hudson released the first version of his “Magic Lantern” firmware for the 5D Mark II. It allowed users to take the camera far beyond its advertised capabilities, enabling high-bitrate recording and, eventually, RAW video recording.
Panasonic’s GH line of Micro Four Thirds mirrorless cameras each had their own unofficial firmware hacks, but in a rather unprecedented move (by most corporate standards), Panasonic answered the modder market by adapting much of that community code into the next camera in the line.
Camera companies now had to answer to customers who knew exactly what their hardware was capable of, and who were demanding official support.
Around this time, Blackmagic Design announced its 2.5K Cinema Camera, which recorded the user’s choice of ProRes or RAW in-camera at a $3,000 price point. While Blackmagic’s first forays into the digital cinema market fell fairly flat, the prosumer market had gotten a taste for small, low-cost, RAW-capable cameras. Blackmagic worked out many of the kinks in its original cameras and eventually brought models to market that rivaled any other cinema camera, and, much like RED, it did so at a fraction of the competition’s cost.
For years, this desire for small, affordable RAW cameras went largely unmet outside of Blackmagic’s lineup. Advances in LOG profiles, in conjunction with higher-bitrate recording, granted RAW-like dynamic range to otherwise compressed footage. But while LOG profiles allow a great deal of flexibility in the color grade, they lack the flexibility true RAW offers for image fidelity and preservation.
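The fidelity gap comes down largely to bit depth: RAW keeps the sensor’s full precision (typically 12 to 14 bits per channel), while most compressed delivery codecs quantize down to 8 bits. As a rough sketch (a simplified model, not any real camera’s pipeline), here’s what happens to shadow detail when you push exposure two stops on 12-bit versus 8-bit captures of the same scene:

```python
# Hypothetical illustration: count how many distinct shadow levels survive
# a two-stop exposure push when a linear signal is stored at 12-bit
# (RAW-like) depth versus 8-bit (compressed delivery) depth.

def quantize(value, bits):
    """Round a 0.0-1.0 signal to the nearest representable code value."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

def push_two_stops(value):
    """Brighten by two stops: multiply linear light by 4, clip at white."""
    return min(value * 4.0, 1.0)

# A linear ramp covering only the darkest 1/16 of the sensor's range.
ramp = [i / 1023 * (1 / 16) for i in range(1024)]

raw_like = {push_two_stops(quantize(v, 12)) for v in ramp}    # 12-bit capture
compressed = {push_two_stops(quantize(v, 8)) for v in ramp}   # 8-bit capture

# The deeper capture retains far more distinct tonal steps in the shadows,
# which is why pushed RAW footage bands far less than pushed 8-bit footage.
print(len(raw_like), len(compressed))  # hundreds of steps vs. a handful
```

Real LOG profiles soften this by spending their 8 or 10 bits non-linearly, but they still discard precision at capture time; RAW defers that decision to the grade.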
With the announcement of ProRes RAW at NAB 2018, RAW is back in the spotlight once more. Apple appears squarely focused on eliminating the primary drawback of shooting RAW: the massive file sizes. ProRes RAW could very well be the technology that drives the next wave of smaller, more capable, RAW-enabled cameras.
RAW video has put image quality and control once reserved for high-end production houses and film labs into the hands of much of the film and video community. New advancements could see RAW become a standard format on even budget cameras.