This interview contains mild Ex Machina spoilers.
The summer movie season is still some weeks off, but for fans who really love the genre, we probably already have the year’s best science fiction film. Indeed, Ex Machina marks the directorial debut of screenwriter Alex Garland (28 Days Later, Sunshine, Dredd), and it also stands as an indelibly creepy chamber piece about two men (Oscar Isaac and Domhnall Gleeson) and the artificially intelligent being they covet (Alicia Vikander).
Pulling from the great sci-fi traditions of Isaac Asimov and Philip K. Dick, with a little bit of gothic horror thrown in thanks to overt Frankenstein parallels (and perhaps some subtler Bram Stoker ones as well), Ex Machina creates a taut 108-minute thriller about the deceptively simple execution of a Turing test—a test designed to prove that artificial intelligence exists.
When Caleb (Gleeson) is invited by Nathan (Isaac), a search engine monopoly’s mercurial CEO, to a mysterious and unvisited country estate, Caleb discovers he is only allowed in certain rooms. But in one of them, he meets Ava (Vikander), a likely sentient robot. Nathan insists that Caleb is there to perform a Turing test on Ava, but it quickly becomes apparent that there is more to this experiment than meets the eye.
In preparation for Ex Machina’s limited release this weekend, we sat down for a roundtable interview with Isaac and Vikander to discuss the future of artificial intelligence, the preparation process in bringing it to the screen, and just where our relationship with technology is headed.
Oscar Isaac also had a few words about the villainous Apocalypse, a “creator” of sorts that he is gearing up to play in X-Men: Apocalypse.
Obviously this is a uniquely elaborate film. Alicia, could you describe how much time you spent in hair and make-up? How did you apply such an intricate look to both the robotic and flesh elements?
Alicia Vikander: I spent four and a half hours [in make-up]. I think we cut it down to [three hours and 45 minutes] at the end, but I think my pick-up call time was at 3:50 in the morning to be on set by 8am. They just did a mold of my body. A lot of people assume [there’s] green screen action going on, but you don’t apparently need that anymore. So, the silver mesh that you see is just a full bodysuit, so I looked like Spider-Man.
And then they continued and slicked my hair, and they had a bald cap and did the silver mesh on top of my head, and then they built my forehead on top of my skull. So the form you see for Ava in the film is actually me, and then they have taken away some of the parts.
What do you think is this continuing fascination with artificial intelligence?
Oscar Isaac: Well, you can trace it back even further to the idea of us creating a Frankenstein or something that we can’t control. I think because we know that we’re the top dogs on the planet, and we also know how shitty we are [Laughs], and so the idea that we could create something that we could not control or that would be imbued with some of our worst qualities, I think, is a real fear.
AV: Well, I think it’s the same thing. It’s all those questions about consciousness, about AIs, but mostly it came down to talking about human beings in the end, and the feeling of making something that has its own will.
OI: I also think it forces us to ask questions about the nature of our self-awareness or our consciousness, which is like what every religion is basically trying to figure out. This is just another way of talking about it. If your job is to reconstruct a human mind, where do you start? What is necessary? Is sexuality necessary? What kind of interaction is necessary? Is some sense of organic material [necessary]? All these kinds of things are what we are. Is consciousness just a byproduct of something else happening?
AV: Yeah, reading about the human brain and realizing that it all comes down to signals and hormones, and love. You can kind of find a chemical formula to try and describe whatever you’re feeling. Then [you] start to read it like that, and suddenly I started seeing my whole body as machinery. Then you start to just fantasize in your head: if it comes down to all those parts, could we just create those parts and put them all together?
I think Alex Garland had influence from the Murray Shanahan book [Embodiment and the Inner Life]. Did you guys happen to read that or was it just the script?
OI: There were so many different elements that he drew from that for me it was very important to sit and talk with him. So that actually was one. A lot of [Dan] Dennett’s material on consciousness, Noam Chomsky’s material on language was very fascinating, the idea, and it’s even echoed when Caleb says, some people believe that language is inherent in them, and it’s just about unlocking the tools to let it come out.
So all of that was interesting to at least get an understanding. I mean there’s no way I could ever really understand what that’s about.
How about Alan Turing, since you perform a Turing test in this one, and there has already been a movie about him?
OI: But what’s interesting is that it’s not a real Turing test, and it’s actually a ruse. The idea is “oh, you’re going to do this Turing test to see if she’s conscious.” Clearly she’s conscious from the second you meet her. So there’s actually another test that’s happening.
When it came to creating this character, did you talk with [Alex Garland] about raising a familiar question in science fiction of whether there is morality in science? Did you talk at all about if there’s a guiding principle or morality in Nathan’s intellectual curiosity?
OI: Yeah, I would describe it as more of an ethical question than necessarily a moral one, but I guess that’s semantics. The ethical question of when you know something is self-aware, then what is your responsibility toward it? Because Nathan finds himself in this interesting predicament of [developing a machine that wants to escape].
For him, I don’t think he has much empathy for human beings. So, why is he going to have much empathy for this creation, which he knows is going to come and eradicate—it’s the next evolution, anyway. You feel bad for her? Feel bad for us, because this is the end of us.
AV: Also that thing, which is one of the lines in the script: “Why?” Because I can. Like with all things, it’s evolution. Even if I don’t do it, if we can do it, someone else will do it. It’s like putting a red button in front of any human being. They will eventually push it.
Technology is certainly advancing, but do you think mankind is advancing with it? Where on the spectrum of pessimism versus optimism do you find yourself?
OI: I’m more on the pessimistic side of the spectrum for sure [Laughs]. Just because I think history has shown us that we tend to lose control very quickly over not only the machines, but the systems, be they economic or social. We immediately lose control over them, or very few end up getting control over what those things are. So, I don’t have any reason to believe that it would start to be different.
AV: I would say that I’m probably the same, except the difference is that everything we’ve made so far that’s beyond our control does not have consciousness. Maybe it’s a good thing. Suddenly, if you make something that’s actually conscious, maybe they’re greater–
OI: They’re better than us [Laughs]. I think Alex would say that he is an optimist. He’s pessimistic about humans but optimistic about the machines. Then you have people like [Ray] Kurzweil, who’s a futurist, and who’s a very optimistic futurist. He believes that as the machines get more sophisticated, we will actually become more machine ourselves, with nanotechnology and different ways of making us able to compete with the things that we create, which is a very optimistic thing.
As cerebral as the movie is, it also has a strong physicality. How did you develop that?
AV: I did spend a lot of time trying to find the physicality and voice of Ava, because it had to be something we haven’t seen before. But also, I tried to embody the fact that Nathan knows he’s already created something that has consciousness. So my aim was not to try to portray a robot; it was to make a girl. With this creature or thing that’s been made, if it’s her main will to become this girl and aim for that—the perfection of trying to walk like a human or talk like a human—it made her more robotic, because humans have flaws and are more inconsistent than maybe Ava is. We’ll think maybe she’s a bit offbeat.
OI: It’s great because it’s like acting self-awareness. She’s like an entity that’s hyper-self-aware. That’s a very hard thing to do so well.
AV: So being a bit more perfect, being something a bit more human, like the 2.0 Human is Ava and that made her different from us.
How did she evolve during the course of the film?
AV: I wanted it to be a bit of a journey, especially because we have Ava in a room, and for the first time ever she meets a human being apart from her maker. So, I think it’s a very human thing. She’s very curious, and I think she wants to read Caleb more than Nathan [realizes].
Oscar, can you talk about your physicality as well?
OI: One of the things I liked so much about the script is Alex created this very cerebral character who’s also incredibly physical. He’s constantly working out, gratuitously so. And I thought that was great, because not only is he Caleb’s intellectual superior, he’s his physical superior as well. He’s someone who’s seemingly insurmountable.
Your character builds a creation that he ends up having no control over. As a being of superior intellect to everyone you meet, your character is able to create anything. So why doesn’t he create a kill switch?
OI: His whole point is to create something that will be smart enough to escape. He’s not looking for control. He makes something that’s self-aware, and it immediately wants to escape. Interesting. But it can’t escape; it’s too stupid to. Let’s make the next one. Ah, this one is getting better, this one is getting better, this one is getting better. So finally, when he thinks he finds the one that can do it, he brings somebody else in and dangles a piece of cheese in front of her to see how smart [he has] made this one.
He says it’s only a matter of time. There’s this idea that it’s this God Complex, but the truth is he knows that one of them will be smart enough to escape, and when that happens we’re done. And why shouldn’t I be the one who does it?
You talk about his God Complex and his need to create. Do you see any similarities between that and the character of Apocalypse, such as creating Archangel?
OI: Ah, that’s interesting [Smiles]. This is a trap, right? [Laughs] Sure. The thing with Nathan is he doesn’t have a God Complex. All of the characters to a certain extent are playing roles in order for the experiment to go a certain way…in order for his plan to work. So, he needs to sell to Caleb that she is someone, it is someone, that needs to be saved from this horrible God Complex, drunk, violent, un-empathetic person, because he needs to make this very smart kid think [that] he wants to marry her or something. [Laughs]
AV: So, in one way, it’s one of those brilliant films or scripts where going back and watching it the second time, or reading it a second time, it’s a very, very different story. He says a lot of quite brilliant things, and I definitely think he’s playing the part.
OI: And the difference with Apocalypse is Apocalypse is God. [Laughs]
Well backing up, what are some of the basic elements that you think turn a sci-fi movie into a classic and what are some of the films that would make a great double feature with Ex Machina?
AV: Sci-fi, throughout its history, is normally about our own society, and ourselves as humans, and how we behave. That’s what sci-fi has been forever. That’s what the genre does: it brings up those elements and puts them in another world, but we can relate to it.
But what would make a great double feature?
OI: I’ll say it: 2001. I think what she’s able to achieve, and what Alex is able to achieve, in imbuing an artificially intelligent machine with real emotion while at the same time being completely otherworldly is comparable, and again it’s an interesting exploration, from two different time periods, of a very similar thing: how do you control it, and what rights does it have ultimately?
You know when HAL is saying, “Don’t David, don’t, I can feel it,” you’re like, “Ooooh, it’s horrible.” But clearly, he should kill HAL, because HAL’s horrible. But it really makes—it does this thing, which great sci-fi does, which is fill you with awe but also dread. Because what we can do is both amazing and horrible.