Television is Not Replacing Film

There is an exciting narrative going around about TV replacing film in popular culture.

If there truly is such a thing as “peak TV,” then we must surely be nearing it. On nearly every cable network, streaming service, and even occasionally a creaky broadcaster, there is a drama, anthology, or comedy—and often more than one—that is challenging our previous notion of what television as a medium represents, and what its future could be. In other words, we are undeniably in a golden age of serialized storytelling.

Unlike the most expensive Hollywood films at the moment, the TV platform offers storytellers—particularly writers and actors—outlets for creativity and risk-taking that are now almost alien to the modern studio system. While American cinema continues to constrict and reduce itself into a candy-colored haze of capes and CGI, television is taking similar genre conventions and using them to challenge viewers in captivating ways, to international acclaim, be it the familiar gritty crime saga expanded into beautiful long-form in AMC’s Breaking Bad, or sword-and-sorcery epics becoming a mirror for medieval power plays, as well as our own grim ones in 2017, a la HBO’s Game of Thrones.

It is all enough to say, in the shadow of this weekend’s Oscars, that television has supplanted movies as the dominant, most popular form of escapism in the 21st century, particularly for younger audiences. In fact, many have said it. For years.

Television is undeniably having a moment, a gilded age of success. When contrasted with the hegemonic mundanity that has crippled corporatized studios, which now seek to make mostly obscenely budgeted popcorn diversions fueled by adolescent fantasies, the narrative of television replacing film is an appealing one. And yet, we contend, one medium’s rise does not signal another’s downfall—it merely suggests that cinema needs to reinvent itself by leaning into its advantages. And it will.

After all, this is not the first time this sort of thing has happened.

The First Golden Age

There was once another era heralded as the living room box’s high point… and it was a lifetime ago. The original golden age of television is roughly designated as the years between 1947 and 1960. The first years of television exploded with smaller, younger, and cheaper content that had to fill the airwaves weekly. Live hour-long anthology dramas and plays would be staged every seven days or less on series like The Chevrolet Tele-Theatre, and future comedy and movie legends like Carl Reiner, Mel Brooks, and Woody Allen got their start writing and/or acting on Sid Caesar’s live variety series, Your Show of Shows. Rod Serling would pull audience heartstrings on the live Playhouse 90 with “Requiem for a Heavyweight,” and then challenge them every week with clever, anthological genre writing in The Twilight Zone.

The upstart television medium was forced to make up for its obvious deficits in visual and production value. Film studios reacted with ambivalence and then terror, countering the tube by drowning their celluloid fantasies in new types of Technicolor and in new camera techniques and exhibition styles, including CinemaScope, VistaVision, Superscope, stereophonic sound, and even that gimmick called 3D. Oh, it’ll never catch on!

The innovations gave us the widescreen format, but during that time, the movies which most utilized these elements were also generally the most formulaic and conventional: Westerns, musicals, biblical epics, and anything else folks today compare superhero movies to.

These trends did not last for either medium: By the 1970s, television had embraced the stable bottom lines of formulaic dramas and sitcoms while Hollywood underwent a Baby Boomer crisis after the newer generation rejected their parents’ entertainment, and the last vestiges of the classic studio system collapsed. From its ashes, filmmakers got younger and more creative, helping cement the French “auteur” theory while embarking on a new golden age of American moviemaking.

Obviously, comparisons between popular entertainment in the mid-20th century and now are not one-to-one. With the democratization of television and filmmaking (and the means to produce it on the internet), it is unlikely that such total conformity could occur again, particularly with cable dramas blowing past censors. And yet, the same basic advantages that allowed cinema to cling to its status as pop culture’s reigning medium remain in movies’ favor.

Point of Entry

On the most fundamental level, no matter the time period, film has the natural advantage of mostly offering standalone and complete experiences. A film by its very nature comprises a beginning, middle, and end. This can be presented in the most straightforward three-act structure imaginable, or it can be as notoriously standoffish as a David Lynch production (a major voice in both mediums). Either way, you know that traditionally entering a movie theater, or even firing up a film on Netflix or Amazon Prime, will give you a complete story.

The most notable exception to this at the moment is the increasingly bombastic and overloaded superhero movies and action films, which covet a serialized format, reminiscent of television. Character arcs in the bigger Hollywood films can be distributed amongst two, three, even six movies, and plot threads may not have actual cathartic resolution for years. But while this may hint at the muddying, disposable legacy of modern blockbusters in the years to come, it still does not undercut the natural edge enjoyed by closed system storytelling.

In other words, if one wants to revisit a film they’ve seen countless times, they can do so in a viewing experience that lasts anywhere between 90 minutes and three hours. Even the newest and most popular form of television consumption often treats three hours as the minimum for a “binge.” This means any television show is ultimately a larger commitment to view, and the longer the show, as well as the more seasons it has, the harder it is to jump on the bandwagon.

Currently, the 2017 Oscar season features two populist entertainments that have taken off with moviegoers: La La Land and Hidden Figures. Each has crossed $130 million in the U.S. alone based on glowing reviews and word of mouth. Neither a nostalgic musical nor a biopic about the first women of color to succeed at early 1960s NASA is a subject for modern blockbuster numbers, yet each has built a reputation that garnered larger and larger audiences, who could commit an evening or afternoon to either. This will also be those movies’ advantage as they are discovered and rediscovered in the years to come.

Similarly, if one does develop an interest in musicals after La La Land, it is easy to seek out Singin’ in the Rain (1952) or Top Hat (1935). Conversely, television consumption has a far higher threshold for point of entry, particularly for the small screen’s most influential forefathers that can clock in at hundreds of hours. And ironically, modern cable dramas are often chasing a traditionally more cinematic form. Breaking Bad, Game of Thrones, The Walking Dead, and Stranger Things all intentionally invoke and reference cinematic language to tell their stories.

This is primarily because film has arguably been more nuanced in its past vocabulary of genre and narrative, but it’s also because some of them strive to tell a complete and relatively succinct experience over the course of a season or series, particularly Bad and Thrones. Conversely, that means they must all be watched in order, from beginning to end. If you jumped on when they were new, this is fabulous… but as the episodes pile up every year, it becomes more and more daunting for the uninitiated.

Simply put, short-form storytelling will always be easier to consume than long-form. When television becomes a time-commitment comparable to reading novels, audiences tend to stick with what they’re “reading” now, and what is topical, as opposed to discovering the classics in the past, which leads to several more advantages for film…

Sum Totals

Unless one is watching a limited series or miniseries, television shows are not valued solely by the individual quality of their episodes or seasons—they’re measured, created, and usually consumed by their whole. The sum is always considered greater than its parts.

This is a roundabout way to say that a television series can (and often does) dip in quality, which usually affects its fans’ interest, as well as its longer-term legacy. After all, the term “jumping the shark” did not catch on because everyone considered Fonzie leaping over a Great White to be a rich moment, unique only to Happy Days.

Many shows are said to feature “jump the shark” moments, which television producers cringe at and try to avoid. However, by the sheer nature of television, it is almost impossible to maintain a permanent style and consistency to a series’ identity year after year, when casts change, actors leave or character arcs end, and the director’s eye behind the camera almost always must transform.

This becomes a feedback loop for TV writers, making it increasingly hard to maintain the energy that made a series a critical darling in its freshman year. While there are shows that maintain their status to the very end, there are many more defined in the pop culture memory by their decline: Dexter, Heroes, Homeland, anything created by Ryan Murphy.

There is no denying that the first season of Homeland is extraordinary, with compelling acting, writing, and an addictive plot. But convincing other people to jump aboard a series in its sixth season—or to start it from scratch—when its legacy is defined not by its best moments but its worst is a challenge.

It relates to the previous point about complete, standalone films having a lower point of entry, because a film is good, bad, or some grade in between on its own merit, and not usually judged by something written years later.

Generational Bias

All of the above concerns also inform why, even in current moments where popular television is overall superior to the most popular movies, film still has a certain allure and credibility—allowing the Oscars to continue to be a bigger cultural touchstone (and more watched event) than the Emmys, even as the latter awards the often more viewed entertainment. I would argue this is because the best movies continue to have an endurance that passes from each generation to the next. Just as music can be defined as the “soundtrack of your life,” with individual songs conjuring up specific memories, or feelings, certain films often mark lifetime mementos that are determined by when they are first seen or released. People hold on to their favorite movies and pass them with greater ease to children and grandchildren, in part because film production value tends to date slower than its television counterpart, but also because individual films can offer an emotional or sentimental time travel.

At this point, cinema is over a hundred years old and television is approaching its own centennial soon enough, but cinephiles and film buffs who explore the history of past “greats,” be they from the American studio system, the French New Wave, Italian Neo-realists, or any other nationality, far outnumber in influence and size the historic impact of most (read: not all) television series. In other words, it is easier to pass to young ones previous childhood classics like The Wizard of Oz (1939) and Sleeping Beauty (1959), just as it is simpler for parents now to also bequeath The Princess Bride (1987) and Beauty and the Beast (1991) than it is Saved by the Bell or Doug—and undoubtedly as children today may one day share Frozen (2013) or, for the hipper set, Queen of Katwe (2016) with their own rugrats.

Beyond this, the movies that had profound cultural impact are easier to explore in completeness through single and repeated viewings, be it Casablanca (1942), Breakfast at Tiffany’s (1961), The Godfather (1972), The Exorcist (1973), Star Wars (1977), Raiders of the Lost Ark (1981), Do the Right Thing (1989), Pulp Fiction (1994), or Lost in Translation (2003). All of these films, famous for their impact on audiences as well as usually some type of history with the Oscars (even if it’s due to a lack of recognition at the time), linger and are easier to access for those not there at the time.

While older television is now easier to discover in the age of streaming, with services like Netflix and Hulu even favoring binged TV content, it’s unclear whether that newer viewing habit will actually differ greatly from Generation X and Millennials being reared on syndicated reruns, from I Love Lucy to M*A*S*H to Friends. All of those series also had massive effects on the zeitgeist when they premiered, and all of them lived on to the next generation or two. But due to length and the other issues above, few are viewed in order or to completion, and fewer still popular shows from each decade seem to survive the test of time as well as the most influential films.

The reason television shows have a greater difficulty in developing larger cult followings decades on is they tend to be most impactful on the people who were there when the shows were new, even if they had not seen it until later. As time passes, it becomes easier to convince a younger generation to attempt to view films in a style they love, such as crime dramas like Scarface or Goodfellas, than it is to pass along Miami Vice and NYPD Blue from the same decades.

This could change the more focused, concise, and, well, cinematic television programs become. Still, I suspect something as rooted in post-9/11, Bush era paranoia as 24 will continue to find difficulty in getting those who don’t recall that actual devastating day to view the series in its entirety. Films like The Bourne Trilogy, Zero Dark Thirty, and even The Dark Knight, however, had a similar cultural impact and will be ready time capsules for viewers so intrigued.

The Auteur Signature

Traditionally, the motion picture is considered the director’s medium, just as the stage tends to be the actor’s purest wheelhouse and television strongly favors the influence of writers and producers. Lone filmic stories are thus most influenced by the impact of the moving eye.

This is nothing new: 1940s French film critics laid the groundwork for what American critic Andrew Sarris dubbed “the auteur theory” in 1962. Rooted in the French idea of the caméra-stylo, or “camera-pen,” it holds that the director who oversees all visual and sound influences on a movie can bend them to his or her own personal authorship. Not all filmmakers are auteurs, but the best ones leave a unique and noticeable stamp—and demand from the medium an individuality that the format of weekly serialization denies television, thereby giving cinema its greatest distinguishing characteristic.

Even the best TV shows are not planned by the episode, but usually by the season. Episode breakdowns happen in the writing room, and directors are frequently changed from hour to hour, hired to fulfill the showrunner’s vision. This gives TV its own advantages in developing characterization at a more nuanced pace. However, it prevents television from completely extracting itself from the writer’s pen and embracing a purely visual, aural experience, one far less reliant on words when the language can be purely cinematic.

This can be glimpsed in the measured and patient creative choices taken by films that grueling TV schedules could never allow, such as this year’s La La Land relying entirely on original music written for the picture and filmed in a deliberately designed wide CinemaScope frame—shot on celluloid at that, a luxury now almost exclusive to the occasional Hollywood film. The result is a nostalgic but singular look reflective of director Damien Chazelle’s own sensibility.

To take a step further in ambition, Alejandro G. Iñárritu has won the two previous Best Director Oscars. The first was for Birdman, which made the eccentric but ultimately hypnotic choice to tell its story in a series of elaborate Steadicam shots, in which the camera becomes an unseen extra cast member, an ensemble player in the ballet. It creates a wholly visual component so laborious in execution, given Iñárritu’s exacting demand for visual perfection, that television could never duplicate it.

The following year’s The Revenant is vastly different in subject and theme with its bloodthirsty, relentless primordial Western setting. But Iñárritu’s bizarre perfectionism and unhinged shooting conditions still inform a movie shot primarily on location and only with natural light, meaning the beauty (and suffering) you see onscreen is real—a visceral, raw viewing experience that could never be replicated on a weekly basis.

Of course, the true advantage of auteur visuals can be seen in the best (and worst) that movies have had to offer. Consider the near-silent sequences of Stanley Kubrick’s space station drifting in orbit to Johann Strauss’ “The Blue Danube Waltz” in 2001: A Space Odyssey (1968), a picture that is often near mute save for its music and the sound-design terrors of man vs. machine. There is also Gordon Willis’ black-and-white symphonic ode to New York vistas in Woody Allen’s Manhattan (1979), composed without a care in the world for character development or a narrative’s plot—it’s a visual poem, pulsating to the rhythms of George Gershwin.

From Alfred Hitchcock building an oversized coffee mug to get the intended effect of poisoned dread in Notorious (1946) to Spike Lee interrupting the flow of his story in Do The Right Thing to have his characters offer all their racial prejudices and barely hidden angers to a fourth-wall breaking camera, they’re singular choices that value mise en scène above a constantly building narrative or character-driven beats.

More unique to film is that it also allows a director to leave a filmography, an authorship that audiences can follow. Hitchcock, Hawks, Ford, Huston, Lean, Kubrick, Coppola, Spielberg, Scorsese, Kurosawa, Truffaut, Bergman, Fellini, Leone, De Palma, Tarantino, Fincher, Aronofsky, Nolan… all last names that do not need a first in order for the mind to conjure distinct techniques, camera set-ups, styles, and approaches that act as signatures.

While television has its own distinguishing voice emanating from showrunners and certain writers, the impact can be diluted or messy as a showrunner takes on several projects at once, or divests interest over the years, blurring his or her impact. Also, the reliance on more cooks in the kitchen makes it harder for a showrunner’s signature to be as pronounced as the symmetrical lines of an exquisitely composed Wes Anderson shot.

Symbiotic Mediums

In the end, television is a wonderful medium that grows more sophisticated and richer by the day. It has more than earned its place as a genuine alternative for popular culture influence next to film and the written word of literature. The most popular TV shows now—excluding the soul-crushing awfulness of reality television—are often more rewarding than the most popular films.

But not always, nor do these ebbs and flows in studio systems last. Because of their insurmountable differences, film retains a variety of advantages, as showcased above, that make it easier to last through the years and to be embraced in the present. It also offers an experience still unique to its format.

Television, of course, has its own perks, but for a medium that transports when the lights are low, cinema’s magic is still its own rare spell, cast everywhere from American summer escapism to the laurels and instant mainstreaming of “smaller” fare when it crosses that Oscar stage… and is seen around the world on fawning television screens.