How Avengers: Infinity War turned Josh Brolin into an eight-foot purple madman

Despite the title of Avengers: Infinity War, the main character is not Iron Man, Captain America, or any of the other Avengers who protect the world from evil. From start to finish, Joe and Anthony Russo's blockbuster is about Thanos, the hulking eight-foot intergalactic villain played by Josh Brolin. Thanos is as dominant and destructive as fans could expect, but he is not the one-dimensional villain he could so easily have been.

His manic plan to restore balance to the universe by arbitrarily killing off half of its inhabitants aside, Thanos is at times a strangely sympathetic character. He is able to feel compassion for a young Gamora, even after murdering her mother and half of her planet. He is legitimately hurt when she turns her back on him, and feels real grief when he sacrifices her to achieve his goals.

While Brolin's performance drives the character, none of those moments would have been possible without the effects magic of Digital Domain. The visual effects company, founded 25 years ago by James Cameron and the late Stan Winston, was tasked with bringing Thanos to life, and on the biggest possible stage. I jumped on the phone with visual effects supervisor Kelly Port to talk about the scope of Infinity War, the company's complex motion-capture system, and how machine learning played a crucial role in turning Josh Brolin into a Mad Titan.

This interview has been edited for clarity and brevity.





Photo: Marvel Studios

Several visual effects houses worked on different aspects of Infinity War at the same time. How do you break down something this big, logistically?

We were reflecting back on the Titanic days, in the late '90s, when a big effects movie would have been considered around 300 shots. This movie, on the other hand, has more than 3,000 visual effects shots. Almost every shot in the film has been touched by visual effects to some extent, whether it's a simple blue screen, a cosmetic improvement, or a wire removal, which would be at the simplest end of things, all the way up to a full CG shot with multiple characters, large crowds, environments, and what have you. That's how the spectrum of complexity runs. Over time, the definition of what's considered a big visual effects movie has changed. Obviously, this was as big as it gets.

On the Marvel side, we have Dan DeLeeuw and Jennifer Underdahl, the visual effects supervisor and visual effects producer for Marvel. They break down the script, and eventually they have previs to work with, so they have a better idea of what's needed in a particular shot as it's cut into a sequence. Then, finally, that makes its way into a shooting plan by the time we get to the live-action shoot. At that point, they make a decision about how they want to distribute the work, and I think that, in general, they didn't want to put all their eggs in one basket. I believe there were at least seven major visual effects studios, including Digital Domain, involved in the work.

Which areas did you and Digital Domain focus on?

We mainly ended up doing Thanos, and Weta also did Thanos, but only for one sequence, on Titan. So we ended up being responsible for a lot of the really heavy, depressing scenes, as you can imagine after watching the movie. It was often a joke: "Hey, give us something with Baby Groot or something, because we're so depressed over here." We were engaged about three or four months before the live-action shoot, along with Weta, to start doing some tests for Thanos. That involved having Josh Brolin meet with the Russo brothers to discuss the character casually, and then come in and perform the character while they discussed it.

He was playing it very casually and naturally, just throwing out ideas. But it was really a way to test the system and make sure the [effects design] was sound, because there was a lot of pressure on this. Thanos, as a character, has more than 40 minutes of screen time. He is the main character of the movie, and if he didn't succeed, it wouldn't succeed as a movie. I think it was very important that Thanos not only look as photorealistic as technically and artistically possible but, even more importantly, as Marvel and Dan conveyed to us, that Josh Brolin's performance, in all its subtleties, really come through as much as possible.

When we presented that test early on, literally the first day of filming, it was an interesting presentation, because we had Josh Brolin there, we had [Marvel Studios president] Kevin Feige and all the Marvel executives there, and we had the Russo brothers. My digital effects supervisor and I were hiding in the corner, trying not to make our presence known, waiting for Josh Brolin to check it out. Everyone else was excited about it, but it was really important to get Josh's reaction too, and he really loved it. I think what was really great about his reaction was his feeling that he, as an actor, did not need to exaggerate the performance. When we did this test, he played it as a fairly informal conversation. He wasn't trying to push the performance for technical reasons, thinking something like: "Maybe if I don't push hard enough, it won't come through, and what I'm trying to do won't appear." But what he saw in the test was that even in a very casual performance, a lot of subtle facial detail was coming through. So moving forward, he was able to play it as subtly as he wanted, or as intensely, or whatever. It wasn't being filtered out [in the motion-capture process]. I think there was a big sigh of relief in the room that we could, in fact, achieve this, that it would work. And obviously from that moment, through the shoot, and then a year and a half, two years later, when we finally finished the movie, a lot of it improved enormously. We are very happy with the results, and we worked very hard to make this happen technically and visually.




Photo: Marvel Studios

Thanos has been around for decades in the comics and has already appeared in the movies. How did the design change this time?

We certainly started with the historical design, then we got an updated digital sculpt from the Marvel art department. We took it from there, slightly modifying the proportions of the eyes and the relationship of the eyes to the mouth, for example, nudging them toward Brolin. There is a balance between Brolin and the historical Thanos character, no doubt, and we tried to find that balance, but we did introduce more of Brolin than in earlier movie versions of Thanos. I think that, in general, it helps the overall match with the performance, and gets you a little closer to feeling what Brolin was doing with the character as well. Then, of course, there are the costume details and the textural details of the skin; these are all things that took months, maybe even close to a year, of back and forth between Marvel and our teams.

What was the process of motion capture like in this movie?

What's really nice about this [project] is that when we do motion capture, historically, we usually have a "capture volume," an empty space with a lot of cameras around you. There usually aren't many set pieces or anything. In this case, they built the body-capture cameras in and around the set pieces created by [production designer] Charlie Wood and his team. That allowed the actors to really be on the set interacting with each other, and at any given time, you could have as many characters as you wanted. I think sometimes we had more than 10 characters in a scene.

Josh would wear a motion-capture suit, with tracking markers and the things you've seen before. He would also wear a helmet with two vertically arranged HD cameras running at 60 frames per second, and he would have tracking dots on his face. Once the [movie's] edit was compiled, we would get a timecode for the body-motion capture, a timecode for the facial capture, and then, of course, the associated images that go with that: the set, the live action, clean plates, reference passes, all kinds of things that help us do a better job down the line.
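For a concrete picture of what lining up those streams involves, here is a minimal sketch of timecode alignment. The frame rates, function names, and values are hypothetical illustrations, not Digital Domain's actual pipeline.

```python
# Hypothetical sketch: lining up the body-capture and facial-capture streams
# by SMPTE timecode. Frame rates, names, and values are illustrative only.

def timecode_to_seconds(tc: str, fps: int) -> float:
    """Convert an 'HH:MM:SS:FF' timecode string to absolute seconds."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) + ff / fps

def facial_offset(body_start_tc: str, face_start_tc: str,
                  body_fps: int = 24, face_fps: int = 60) -> float:
    """Seconds by which the facial stream starts after the body stream."""
    return (timecode_to_seconds(face_start_tc, face_fps)
            - timecode_to_seconds(body_start_tc, body_fps))

# Example: the 60 fps helmet cameras rolled one second after body capture.
offset = facial_offset("01:02:03:12", "01:02:04:30")
print(f"facial stream starts {offset:.3f}s after the body stream")
```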

Then we process the facial tracking. This is done in a two-step process. First we have a proprietary tool we call Masquerade, which takes the relatively low-resolution facial geometry from the helmet cameras. Then we have another step, where we work with Disney Research on a technology called Medusa, which obtains high-resolution captures of Brolin's facial shapes that basically go into a reference library. And it's not just "shape A" and "shape B." It's about 50 different shapes we captured, but also the transitions between those shapes. What happens to the muscles, the skin, the face, the bone as it transitions from shape A to shape B? All of that is in this massive database. Then everything goes through a machine learning algorithm we developed that says: "This is a low-resolution face, but we want the essentially equivalent high-resolution face." [At that point, the system builds the equivalent face, using the Medusa imagery as the reference point.]
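One way to picture that last step (purely as an illustration; Masquerade and Medusa are far more sophisticated) is as a learned mapping from sparse helmet-camera markers to blend weights over a library of high-resolution scanned shapes. A minimal linear version, with entirely made-up data sizes and names, might look like this:

```python
import numpy as np

# Hypothetical training data: for each captured frame, the sparse helmet-cam
# markers (e.g., 150 dots x 3 coords, flattened) and the blend weights over
# ~50 high-resolution library shapes that best reproduce that frame.
rng = np.random.default_rng(0)
n_frames, n_marker_coords, n_shapes = 1000, 450, 50
X = rng.normal(size=(n_frames, n_marker_coords))   # low-res capture
W = rng.uniform(0, 1, size=(n_frames, n_shapes))   # high-res blend weights

# Fit a least-squares mapping from marker positions to blend weights.
mapping, *_ = np.linalg.lstsq(X, W, rcond=None)

def high_res_face(markers: np.ndarray, neutral: np.ndarray,
                  shape_deltas: np.ndarray) -> np.ndarray:
    """Rebuild a dense mesh as neutral + weighted sum of shape deltas."""
    weights = markers @ mapping                    # (n_shapes,)
    return neutral + np.tensordot(weights, shape_deltas, axes=1)

# Hypothetical dense mesh: 40k vertices x 3 coords.
neutral = rng.normal(size=(40_000, 3))
shape_deltas = rng.normal(size=(n_shapes, 40_000, 3))
mesh = high_res_face(X[0], neutral, shape_deltas)
print(mesh.shape)  # (40000, 3)
```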




Photo: Marvel Studios

Then you run it through this automated process and it gives you a result. We look at that result and ask: "Is this on target?" If it's not, we make a small correction and feed it back into the system. Now it has been trained, essentially, to know that this is a better result. By doing this hundreds of times, the system learns from the process and effectively improves over time. Over the course of production, we were able to make fewer and fewer corrections.
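That review-and-correct cycle is essentially human-in-the-loop training. A toy sketch of the pattern, with hypothetical names and stand-in data, might look like this:

```python
import numpy as np
from typing import Optional

# Toy human-in-the-loop loop: the model predicts, an artist optionally
# corrects, and each correction becomes new training data. All names and
# data here are stand-ins, not Digital Domain's actual system.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 450))     # low-res captures seen so far
Y_train = rng.uniform(size=(200, 50))     # approved high-res blend weights

def fit(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Least-squares mapping from captures to blend weights."""
    solution, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return solution

model = fit(X_train, Y_train)

def artist_review(predicted: np.ndarray) -> Optional[np.ndarray]:
    """Stand-in for the human check: corrected weights, or None if on target."""
    return None  # pretend the artist approved this frame

for frame in rng.normal(size=(100, 450)):  # new frames arriving
    predicted = frame @ model
    corrected = artist_review(predicted)
    if corrected is not None:
        # Feed the correction back in and refit, so similar frames improve.
        X_train = np.vstack([X_train, frame])
        Y_train = np.vstack([Y_train, corrected])
        model = fit(X_train, Y_train)
```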

The next step is to move from the high-resolution actor [model] to the high-resolution character, in this case, Thanos. And we have the same process there as well. We do an automated transfer. We look at the result; you look at Brolin next to Thanos. Is it conveying the same emotional expression? Is it conveying the same emotion with the face? And we make a subjective call: "Yes, it is, this looks great." Or: "You know what, something isn't right." Maybe it's an element of surprise, or maybe the brow needs to rise a little more. So we just make a quick little adjustment, feed it back into the system, and it learns from that. It's training data, so the next time a similar expression comes up, it gets a bit more accurate. That finally ends up going to the animation department, which combines it with the body capture.
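The actor-to-character transfer can be sketched the same way: the solved weights drive a parallel set of Thanos shapes, with a correction term that artists nudge when an expression doesn't carry over cleanly. Again, a toy version under hypothetical assumptions:

```python
import numpy as np

# Toy retargeting sketch: reuse blend weights solved on the actor to drive
# the character's own shape library, through a correction matrix that is
# adjusted as artists fix off-target transfers. All data is hypothetical.
rng = np.random.default_rng(2)
n_shapes = 50
actor_weights = rng.uniform(size=n_shapes)            # solved on the actor

thanos_neutral = rng.normal(size=(60_000, 3))         # character mesh
thanos_deltas = rng.normal(size=(n_shapes, 60_000, 3))

# Starts as identity (weights pass straight through); individual entries get
# nudged over time, e.g. "the brow needs to rise a little more."
correction = np.eye(n_shapes)

def retarget(weights: np.ndarray) -> np.ndarray:
    """Map actor weights onto the character rig and rebuild the mesh."""
    character_weights = correction @ weights
    return thanos_neutral + np.tensordot(character_weights, thanos_deltas, axes=1)

mesh = retarget(actor_weights)
print(mesh.shape)  # (60000, 3)
```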

How long has Digital Domain been using machine learning?

Not that long. We've used it on a few projects now, and we presented a paper at SIGGRAPH a while back. But this is really the first project where it's been used in a comprehensive way, I would say, so it's relatively new technology.

A common weakness of CG characters is their eyes. In Infinity War, however, there are many moments when Thanos' eyes really drive home an emotional beat. What was the secret to your approach there?

The eyes are, as they say, the windows to the soul, and such a critical part of a facial expression and the emotion it conveys. But, ironically, that's the lowest-fidelity data we get from that pipeline, because you can't put tracking markers on people's eyes. In the Medusa scans, though, we get a really nice sense of the topology of the eye: what's happening with the eyelids, and when the eyelids close or squeeze shut, how the skin folds, and all of that.

We take that information, build a production-level model that represents our Thanos rig, and then animation does a ton of work to make sure those shapes hold up. There's a visual reference of what's happening with Brolin, and we have to match it. But having said all that, we really focus on the eyes at Digital Domain, and there's a lot of work and a lot of detail there: not only the geometry of the eyelids and the skin surrounding the eyes, but also the eyeball, the conjunctiva, and these very thin translucent layers. There are multiple semi-translucent layers of tissue that pass over the eye itself, and those are all things we've built into the system. Once they're there, they're there, but it's a lot of work to get to that point: how they move, how they look, the biology of an eye, too. All of that contributes to the overall final realism.

Were there other particular problems that arose with this character that needed to be solved?

The photorealism of the character. That part takes a lot of different departments working in concert. It's not just one thing. If the lighting isn't good, it won't look real. Or if the compositing isn't good. You can have all of that, but if the animation isn't good, because it's moving in a strange way, it's going to throw people off. So everything has to work, and it all has to come together at the same time.
