


What Is CGI? How Reality and CGI Blend in Films

Annie St. Cyr

Embark on a journey through the evolution of CGI and explore its potential impact on the future landscape of cinema.

In the past, filmmakers solely depended on practical effects and optical tricks to shape audience perception, utilizing techniques such as camera angles, lens selection, and elaborately designed sets.

While we now take advantage of green screens and volumetric LED walls, in the past, set extensions would be meticulously painted by hand.

This method gradually transitioned to digital manipulation, where computers and various software enabled creators to sculpt and design anything imaginable. This shift towards digital creativity catalyzed the advancement of CGI technology.

Charlie Chaplin movie GIF

Now, it’s essential to make clear distinctions among the various areas of this work, particularly live action, practical effects, VFX, and CGI itself. At first glance, the differentiation might appear straightforward.

For example, the terms VFX and CGI are often lumped together as the same thing, yet they represent distinct concepts. Although CGI is a subset of VFX, the scope of VFX is broader, encompassing tasks like compositing matte paintings, erasing stunt cables, and artificially adding snow to scenes.

CGI, in contrast, exclusively involves generating visuals using computer graphics.

The question might arise: “Aren’t they essentially the same?”

Consider this illustrative example from a behind-the-scenes look at Marvel’s The Avengers. Mark Ruffalo is shown wearing a motion-capture suit, the basis for the CGI Hulk model. Meanwhile, the filming takes place in a studio against a green screen, with a prop taxi labeled “New York,” indicating where VFX artists will later insert a New York streetscape.

The transformation of Ruffalo into the Hulk relies on motion-capture data, which is CGI. The integration of a New York street scene behind the actors, achieved through green screen technology, exemplifies VFX’s work.

Actor Mark Ruffalo during a CGI scene in the movie The Avengers
Image via Marvel Studios.

Now more than ever, the two are being intertwined in mind-blowing ways. Most recently, the (extremely significant) arrival of virtual LED backgrounds, which can largely replace green screens while also lighting the scene, has changed how sets are built and lit. The technology is in its infancy and has received some criticism, but we’ll talk more about that shortly.

As technology and virtual production have progressed, actors can now see live previews of what a CGI character will look like interacting with them, making it easier to stay grounded in the scene and deliver a stellar performance.

It’s important to note that it hasn’t always been this way. Where we are today has quite a history of incredible technological advancements in every part of the production space.


The First Uses of CGI

While we often equate CGI to modern films, especially from the 2000s onward, CGI is nearly as old as most of Hollywood’s current legacy stars.

The early examples of computer-generated imagery (CGI) in significant film productions can be traced back to 1973’s Westworld and 1977’s Star Wars. Westworld was particularly notable for featuring the first instance of CGI in film, albeit briefly, for approximately 10 seconds.

This effect was achieved by John Whitney Jr. and Gary Demos, who utilized NASA technology to digitize an image and deconstruct it into numerical data. This allowed the computer to manipulate these numbers in various ways—altering colors, compressing, and expanding the imagery—resulting in a point-of-view (POV) shot from a robot’s perspective.
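
The Whitney/Demos pipeline itself was bespoke, but the underlying idea of treating a picture as numbers that can be recolored, compressed, and expanded is easy to demonstrate with modern tools. Below is a minimal, illustrative Python sketch (using NumPy and Pillow, which are assumptions on my part and obviously not the 1973 toolchain) that averages a frame into coarse blocks and shifts its colors, a toy version of that pixelated robot-POV look.

```python
# Toy illustration of image-as-numbers manipulation (not the original
# Whitney/Demos process): average a frame into coarse blocks, then
# recolor it for a crude robot-POV mosaic.
import numpy as np
from PIL import Image

def robot_pov(path: str, block: int = 16) -> Image.Image:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    h, w, _ = img.shape
    h, w = h - h % block, w - w % block        # crop to whole blocks
    img = img[:h, :w]

    # "Compress" the picture: collapse each block to its average value...
    blocks = img.reshape(h // block, block, w // block, block, 3).mean(axis=(1, 3))

    # ...then "expand" it back to full size, producing the chunky mosaic.
    mosaic = np.repeat(np.repeat(blocks, block, axis=0), block, axis=1)

    # Alter the colors numerically: boost red, suppress blue.
    mosaic[..., 0] *= 1.4
    mosaic[..., 2] *= 0.5
    return Image.fromarray(np.clip(mosaic, 0, 255).astype(np.uint8))

if __name__ == "__main__":
    # "frame.png" is a stand-in for any still image you want to process.
    robot_pov("frame.png", block=16).save("frame_robot_pov.png")
```

Swap in any still image and adjust the block size to see how coarse the mosaic becomes; the point is simply that once an image is digitized, every one of these manipulations is just arithmetic on numbers.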

Turning our attention to Star Wars, the film primarily relied on a mix of practical effects, miniature models, and lighting techniques to bring its universe to life.

However, one of the film’s landmark moments featured an innovative CGI sequence. It comes toward the film’s climax, as Luke Skywalker and the other X-wing pilots prepare for their assault on the Death Star.

We are shown a 3D tactical map that outlines their assault. This sequence was developed by the University of Illinois at Chicago’s Electronic Visualization Laboratory team. Despite its simplicity by today’s standards, the sequence was a groundbreaking moment in filmmaking at the time.

The innovation extended beyond merely creating images and objects through new means; it fundamentally altered the narrative capabilities within filmmaking. Suddenly, all genres had access to unprecedented storytelling possibilities.

Star Wars again took the lead in innovation by introducing the first entirely CGI character to hold a significant role in film history. This is, of course, none other than Jar Jar Binks in 1999’s Star Wars Episode I: The Phantom Menace.

“Dis Is Nutsen. Oh, Gooberfish!” – Jar Jar Binks

Despite becoming a subject of humor and meme culture, mainly due to CGI’s early state, creating an entirely CGI character like Jar Jar Binks, who could interact in real-time with live actors, was profoundly impactful.

Now, there is often debate over whether the stained-glass knight from 1985’s Young Sherlock Holmes is the first CGI character. It’s important to note that while the stained-glass knight precedes Jar Jar Binks, Jar Jar is the first to have spoken lines and dynamic interaction with live actors.


Iconic Uses of CGI

Regardless of genre or subject, the biggest blockbusters of all time have one thing in common: they’re big-budget productions with heavy CGI use throughout the entire film. Think big action films like Star Wars or the Avengers series, with CGI and VFX in nearly every scene.

Star Wars: A New Hope (1977)

While this wasn’t always the case, the norm had to start shifting somewhere. Look no further than George Lucas’ Star Wars: A New Hope, released in the summer of 1977. Its mind-blowing mix of CGI elements with practical and miniature effects ushered in a new age of filmmaking.

The Star Wars films changed the audience’s expectations for what a summer blockbuster could be. Looking at the years that followed, you see how this massive shift influenced movies like Superman, Tron, E.T., Aliens, and Predator.

Terminator 2: Judgment Day (1991)

There is a common saying in most film schools: Star Wars started the game, and then Jurassic Park changed it.

While I still think that sentiment is true, there’s one movie I’d like to add to that statement: Terminator 2: Judgment Day.

I would argue that this is one of the most influential films ever made when it comes to CGI. T2 gave the audience another giant leap in perspective on what CGI could be in a movie.

The T-1000 villain was sometimes a complete CGI liquid metal character that could morph and form into whatever shape it needed to. ILM pushed new CGI technology to the absolute brink. They achieved the effect by painting a 2×2″ grid on actor Robert Patrick and shooting reference footage of him walking. Then, they scanned his head for further modeling.

If you want to dive deeper into T2‘s brilliant CGI, check this out. It’s a great breakdown of how the team pulled off one of the most mind-blowing shots in the movie. It’s a first-hand account of how the artist created the sequence. T2 still holds up in terms of visual effects and should be appreciated.

Jurassic Park (1993)

Jurassic Park is another juggernaut that forever changed summer blockbuster expectations. Are you tired of hearing me say that? That’s because it’s true. Jurassic Park contributed to even more giant leaps in CGI’s history.

The film blends real-life animatronic dinosaur models with CGI dinosaurs, changing how the audience sees the dinosaurs interacting with the actors. They would use a practical dinosaur model when they needed a close-up of a dinosaur head next to a human. Then, for the wider shots, they could use CGI versions of the dinosaurs, which provided an ultra-realistic look.

This genius blend of perspective and mix of visual effects changed storytelling forever. Jurassic Park will always be known for bringing CGI in as an organic part of the story, in a way that heightens the sense of realism.

The Matrix (1999)

You already know the scene that blew audiences away. The bullet-time slow-motion dodge that took audiences and the filmmaking world by storm!

Using a 360-degree camera rig that captured individual photos of Keanu Reeves at its center, the Wachowskis knocked down barriers to what actors and CGI could do together.

This iconic and flawlessly executed sequence blurred the line more than ever between reality and filmmaking trickery.

Avatar (2009)

If the aforementioned films set the ball, James Cameron spiked it.

Avatar was a breathtaking blend of motion capture and CGI madness. With visual effects by Weta Digital, the film employed highly detailed face-mapping technology, using cameras that could detect even the slightest movement in the actors’ facial expressions. That data would then transfer to the computer-generated character models.

Roughly 60 percent of Avatar’s nearly three-hour runtime was CGI, which was unheard of in 2009. The results were stunning. The movie became the highest-grossing film of all time, up to that point.

Inception (2010)

Christopher Nolan‘s Inception is recognized for its innovative application of practical effects, CGI, and complex narrative structure. It explores the depths of the human psyche through dream manipulation. As such, Inception represents a milestone in the evolution of film.

The visual effects, particularly the folding city scene and the zero-gravity hallway fight, showcase an exceptional blend of CGI with practical effects. These methods culminated in creating a visually stunning and intellectually stimulating experience.

While CGI makes for an outstanding visual spectacle, it also enhances storytelling by challenging the viewer’s perception of reality and immersing the audience in its multidimensional world.

Avatar: The Way of Water (2022)

The long-awaited sequel to James Cameron’s Avatar, The Way of Water is another prime example of pushing cinematic technology and storytelling even further.

Building on the visual spectacle of the first film, The Way of Water explores the aquatic regions of Pandora and showcases an impressive combination of cutting-edge CGI and performance capture technology. Cameron and his team developed innovative underwater motion capture methods to achieve such mesmerizing visuals.

These techniques resulted in breathtaking scenes that immerse the audience in the detailed nature of the underwater world.

Furthermore, the sequel adds depth to the thematic exploration of environmentalism, family, and survival, intertwined with the well-established visual language from its predecessor.

This film represents the promising potential CGI has for future films, creating vast, immersive worlds that are visually remarkable.


Why Is CGI Good for Movies?

Over recent years, there has been a growing trend among studios and filmmakers to shun CGI and VFX.

For instance, Tom Cruise mentioned in relation to Top Gun: Maverick, “If I can figure it out, if all of us can figure it out, it’d be fun to do. I’d like to fly those jets again, but we got to do all the jets practical, no CGI on the jets.”

This statement became a significant part of the marketing campaign. However, it was later revealed that the film contained over 2,500 VFX shots, including many CG aircraft.

If you’re going to watch one referenced video within this article, make it this one.

This trend primarily stems from audiences growing tired of subpar CG work and believing they prefer practical effects. While practical effects can feel more organic, the issue boils down to poor CGI, not CGI itself.

A prime example of CGI’s “seamless integration” can be seen in the 2015 film Mad Max: Fury Road.

The movie is renowned for its practical effects and stunts that dominate the screen, blended skillfully with CGI elements to the point where distinguishing between reality and digital enhancement becomes challenging.

This ambiguity is precisely what effective CGI aims to achieve.

Take the case of Ex Machina, a film with a modest budget that earned an Oscar for Visual Effects. The film’s strategic use of CGI, particularly in its restrained approach, was exemplified by Alicia Vikander‘s portrayal of Ava. Vikander donned a form-fitting suit, parts of which were digitally replaced with CGI to depict robotic components.

Vikander’s suit also functioned as a motion capture tool for the visual effects team—a brilliant fusion of practical costume design and CGI. This blend of real-world elements with digital enhancements contributed significantly to the film’s immersive experience.

Effective CGI, when integrated seamlessly and logically into the narrative, can significantly enrich the viewing experience.

CGI also democratizes the filmmaking process through its accessibility. With software like After Effects, Blender (free), and Cinema 4D, virtually anyone can craft a 3D scene or even a complete project, provided they’re willing to learn these tools. It’s often said that “anything is possible” and, in CGI, this adage holds a lot of truth.

Educational resources for CGI and VFX are abundant, ranging from YouTube tutorials to specialized online courses offered by platforms like Learn Squared, VFX Apprentice, and School of Motion. These resources make it easier for beginners to dive into the world of digital creation.

Moreover, the impact of CGI extends to blockbuster entertainment, offering audiences thrilling experiences. Most filmmakers concur that a compelling script is paramount; it forms the foundation of a film’s success. However, with the advancements in CGI, the scope of storytelling has expanded significantly.

It’s also essential to address the challenges and criticisms associated with CGI, which we will explore next.


Why Is CGI Bad for Movies?

Closeup of a bad CGI example in the movie Thor
Image via Marvel Studios.

Okay, we noted that studios try to market their films as “CGI-free” because audiences’ patience and enthusiasm have worn thin. This weariness stems from film productions over-relying on CGI as a quick fix for challenges they either lack time to address or are unwilling to allocate budget to in other production facets.

Audiences can often detect when a production has cut corners, evidenced by unrealistic or incongruous visuals that disrupt the immersive experience of the film.

It was recently revealed that most Marvel films undergo extensive reshoots, often at the direction of Marvel Studios president Kevin Feige after he reviews dailies. Because of this oversight and the changes made throughout production, many elements of a film, such as locations, effects, and even costumes, are not finalized until the late stages.

Therefore, instead of visual effects artists having numerous months to refine their workflow, they are faced with a rushed crunch time of several weeks, which results in less refined visual effects.

CGI can sometimes introduce elements that appear odd or unnecessarily fabricated, such as superfluous objects, backgrounds, or other digital additions that do not advance the narrative.

A notable instance of this can be seen in a scene from Uncharted (the film adaptation featuring Tom Holland), where a character stands before a large window with the nighttime sky in the background. Despite the scene being lit by a full interior setup, the green-screen backdrop feels conspicuously out of place.

Poor green screen example from the movie Uncharted
Image via Sony.

With a reported budget of 120 million dollars, one wonders why such a film struggles with achieving a believable green-screen shot. Partly, this issue stems from an over-dependence on CGI and green screens for accommodating last-minute script alterations and scenes deemed less crucial.

When a film allocates a significant portion of its VFX budget to a few elaborate action sequences, other scenes may suffer due to budgetary and priority constraints. Achieving a realistic green-screen effect is feasible, yet issues arise when the scene isn’t lit appropriately or without a clear vision of the final effect.

This approach can promote a form of filmmaking that leans too heavily on post-production magic, assuming that any issue can be “fixed in post.” Decisions about CGI additions are often deferred until after filming, based on factors like audience feedback or studio demands, fostering a mindset of “we’ll adjust it later.”

Yet, CGI and post-production cannot remedy a story or plot’s fundamental flaws. This is increasingly evident in major films rushed to release.

A critical view of CGI’s impact is exemplified by Marvel’s recent projects, such as Thor: Love and Thunder and Doctor Strange in the Multiverse of Madness, where the quality of VFX work and the treatment of VFX artists have been subjects of controversy.

The industry’s push for quantity over quality, without consideration for the time and resources needed by VFX artists, is unsustainable.

This current trend in production and post-production practices needs re-evaluation. Ideally, the industry will learn from these experiences and revert to a model where CGI and VFX complement and enhance storytelling, rather than serve as a crutch for expedited production or narrative shortcomings.


How to Create CGI

The good news is it’s 2024, which means there are thousands of resources, programs, methods, and entries into the field of visual effects through CGI work. If you want to learn about or get into this field, it’s more than possible, and you don’t have to go to film school.

If you’re wondering where to start, look no further than the Blender guru Andrew Price. He’ll begin by walking you through making a computer-generated donut in this free program.
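
Price’s tutorial works entirely inside Blender’s interface, but the same free program also exposes a Python API (bpy), so even a rough donut scene can be sketched out by script. Here’s a minimal sketch of that idea, meant to be run from Blender’s Scripting workspace; the dimensions, colors, and output path are illustrative assumptions, not steps from the tutorial.

```python
# Minimal Blender (bpy) sketch: a rough "donut" scene built by script.
# Run inside Blender's Scripting workspace; all values are illustrative.
import bpy

# Clear whatever objects are in the default scene.
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# The donut: a torus with a plump profile, raised slightly off the floor.
bpy.ops.mesh.primitive_torus_add(major_radius=1.0, minor_radius=0.4, location=(0, 0, 0.4))
donut = bpy.context.active_object
bpy.ops.object.shade_smooth()

# A simple flat pink "icing" color (no node setup, just the basic material color).
icing = bpy.data.materials.new(name="Icing")
icing.diffuse_color = (0.9, 0.4, 0.6, 1.0)
donut.data.materials.append(icing)

# A floor plane, a light, and a camera so the render isn't empty.
bpy.ops.mesh.primitive_plane_add(size=10, location=(0, 0, 0))
bpy.ops.object.light_add(type='AREA', location=(2, -2, 3))
bpy.context.active_object.data.energy = 500
bpy.ops.object.camera_add(location=(0, -4, 2), rotation=(1.2, 0, 0))
bpy.context.scene.camera = bpy.context.active_object

# Render a single still next to the .blend file (output path is hypothetical).
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.filepath = "//donut.png"
bpy.ops.render.render(write_still=True)
```

Scripting is no substitute for learning the interface the way Price teaches it, but it shows how approachable Blender’s underlying scene data really is.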

Separate from Blender, there are a ton of great YouTube channels you can follow that teach, talk to, and train future CGI artists with all kinds of programs and types of CGI.

Check out some of these channels:


The Future of CGI

As we contemplate the trajectory of CGI and VFX in film and television, it’s evident that much of their future lies within the confines of the studio.

The era of lavish budgets permitting the practical execution of superb set pieces is dwindling as studios seek greater production efficiency. However, efficiency does not need to equate to a decline in quality. The advent of virtual sets featuring LED screens that display photorealistic backgrounds in real-time heralds an era of unprecedentedly high-quality CGI.

Cinematographers, such as Greig Fraser, are at the forefront, demonstrating the potential of this innovative technology in works like The Batman, Dune, and The Mandalorian.

While accessing such technology may seem daunting, the evolution of CGI over the years mirrors the broader trend toward democratization in filmmaking technology. It’s becoming increasingly feasible for creators across the spectrum of budget sizes to utilize advanced CGI tools.

Consider the example of deepfakes. With the right software and a knack for impersonation, creating photorealistic transformations is within reach.

Furthermore, companies like Rokoko offer reasonably priced motion capture suits, enabling filmmakers to animate characters based on an actor’s movements.

Resources such as the tutorial “Getting Started with Motion Capture” by Am I A Filmmaker? offer valuable guidance on leveraging such technology.


Summary

Since its early days in the 1970s, CGI has undergone transformative development, influencing countless groundbreaking films and expanding the boundaries of storytelling and visual creativity.

With each passing year, new techniques and innovations push the field in exciting new directions, making CGI tools more accessible and versatile. This evolution ensures that learning and mastering CGI is more attainable than ever before.

Embarking on the journey into the world of VFX offers limitless potential for creativity and innovation.

Good luck out there.


For more on CGI, check out these articles:

Cover image via Universal Pictures.
