Discussion of “What’s Ailing Hollywood?” has always been a dependable method of gobbling up column space in newspapers, magazines and, now, their digital counterparts. In 1964, Pauline Kael asked “Are the Movies Going to Pieces?” in The Atlantic Monthly, claiming that the younger generation’s embrace of crudely made films and the intelligentsia’s fondness for intentionally confusing ones were responsible for Hollywood’s decline. Twelve years earlier, Manny Farber had written “Blame the Audience,” although he held bad reviewers mainly responsible for the declining intelligence of the average movie fan.
In 2009, as journalism copes with reports of its own demise, reviewers are again pondering the decline of movies. This May, The New York Times’s Manohla Dargis and A.O. Scott presented a snarky “Memo to Hollywood” filled with a finger-pointing litany of reasons why they “don’t make ’em like they used to.”
Most of their suggestions have a familiar ring. Dargis complains about Hollywood’s taste for homophobia, violence and misogyny while worrying that digital filmmaking has marginalized the talents of the cinematographer. It’s no secret that Hollywood has never been much for catering to the sensitivities of any constituency other than action-loving straight white males. But if a taste for multiculturalism wasn’t much in evidence during previous ‘golden’ eras, why would a dose of it necessarily improve things now? And while digital effects can obviously be abused, fears of technological innovation have been a staple of movie criticism since the earliest critics claimed the talkies were ruining the so-called ‘artistry’ of the silent era.
For his part, Scott informs Martin Scorsese and Steven Spielberg that sky-high budgets are destroying their once “scrappy, personal” vision. He suggests they emulate Brian De Palma and Francis Ford Coppola, who’ve each made low-budget films on high-definition video (the disappointing Redacted and the just-released Tetro). Unsurprisingly, Scott fails to mention the string of big-budget flops that forced the pair into accepting steep pay cuts. Instead of asking the industry’s top directors to work for Hollywood’s equivalent of the minimum wage, Scott might have better luck encouraging budget-challenged young filmmakers to forsake Mumblecore, the independent movement notoriously short on entertainment value. In fact, Dargis offers the Mumblecore crowd some criticism, reminding “filmmakers under 40” that “the tripod is your friend.” Ironically, Scott praises these same “hungry directors with digital cameras, time on their hands and not much money.”
Scott also contends that, in the Obama era, audiences are craving weighty political dramas rather than the tongue-in-cheek ‘seriousness’ of State of Play. The mentality that equates verisimilitude and lofty subject matter with quality has often dominated film criticism. In the U.S., it went unchallenged until the sixties when Farber, Andrew Sarris and Kael wrote effusively of the charms of highly artificial pop masterpieces by the likes of the now-canonized Howard Hawks and Alfred Hitchcock.
Dargis and Scott are not oblivious to the limits of their stunted imaginations, summing things up flippantly:
“…enough with the serial killers (unless you’re David Fincher); period dramas; movies in which children die or are endangered; (bad) literary adaptations; superhero epics; tween-pop exploitation vehicles; scenes with bubble-breasted women working the pole in strip clubs; shady ladies with hearts of gold; Google Earth-like zoom-ins of the world; sensitive Nazis; sexy Nazis; Nazis period; dysfunctional families; dysfunctional families with guns; suburban ennui; suburban ennui with guns; wisecracking teenagers; loser dudes scoring with hot women who would never give them the time of day even if they were drunk out of their minds or too young to know any better (hello, Judd Apatow!); feature films that should have been sketch comedy routines; shopping montages; makeover montages; bromances (unless the guys get it on with each other); flopping penises; spray-on tans; Kate Hudson; PG-13 horror remakes; or anything that uses any of the “classic” songs that we are sick of hearing. What’s left? We don’t know. Isn’t that your job?”
In other words, they will cheerfully admit they don’t know what’s wrong with the movies or how to fix them. But they’re certain something’s rotten in Lotusland.
Amidst a deep recession, the movie industry remains strong when measured in terms of profitability. To date, box office returns are slightly ahead of 2008. Yet critics have long been enamored of implying Hollywood’s profits are inversely proportional to the quality of its product. In 1980, Kael wrote a New Yorker article titled “Why Are Movies So Bad? or, The Numbers,” explaining the harm done by marketing techniques adopted in the wake of Jaws and Star Wars. By the mid-eighties, Andrew Britton would write of “the virtual disappearance of significant work from the Hollywood cinema…and the audience’s rejection of such significant work as there is.” David Denby’s 1998 New Yorker article “Mourning the Movies” also expressed alarm about “younger moviegoers, reared on little but American movies” who could not appreciate the ‘significant work’ that Britton couldn’t find in theaters. The theory that audiences have, en masse, gotten less discriminating (like claims about the declining intelligence of the average student) is not especially convincing to anyone but the nostalgic minds given to churning out such articles.
A more convincing counter-argument suggests that it’s simply standards of merit which fluctuate historically. The corollary to this is the notion that, in all eras, most of what humanity produces is garbage. In 1964’s “I Can’t Get That Monster Out of My Mind,” Joan Didion mocked the sensibility which views greedy executives as responsible for the ‘decline’ of Hollywood, a place she once dubbed the “last extant stable society” amidst the countercultural chaos. Referencing the misguided notion that “most people, left to their own devices, think not in clichés but with originality and brilliance,” Didion wryly commented on a perceived lack of original films by observing “we would all agree a novel is nothing if not the expression of an individual voice…and how many good or even interesting novels, of the thousands published, appear each year?”
Hollywood has always known the economic advantage of the lowest common denominator, and the jaw-dropping success of Paul Blart: Mall Cop provides ample proof of its resiliency. Judd Apatow pragmatically explains the wisdom in this, commenting on his brief struggle with the industry’s focus-group mentality: “What I took from it is, the audience is supposed to like the film, as simple as that sounds.” No critic would grant that 40-Year-Old Virgin-style success is the same thing as artistic excellence. Today, there is widespread suspicion that many reviewers harbor the opposite view: popularity is suspect, difficulty is a virtue.
This tendency leads to the public’s often-voiced opinion of the critics: “They’re just a bunch of snobs paid to be critical.” Most movie fans couldn’t care less if the critics hate The Dark Knight since we all know it’s either A) a pretty cool flick or B) the second coming, depending on how big a superhero fanboy you are. As logic-challenged as journalists typically are on the state of Hollywood, they’re correct that something has changed. Several things, in fact.
Veteran critic Jonathan Rosenbaum addresses some of the changes in the surprisingly unsentimental memoir “My Filmgoing in 1968.” Disputing the idea that American moviegoers once embraced films by the likes of Fellini, Godard, and Antonioni, Rosenbaum points to six high-brow features he fell in love with that year which all flopped in the U.S. He also illuminates the larger context behind the shift in attitudes since the sixties.
As Rosenbaum notes, for much of the twentieth century, movies were the meat-and-potatoes of Americans’ entertainment diet. People viewed moviegoing as an everyday pastime, expecting the routine and being pleasantly surprised by the extraordinary. Today, that meat-and-potatoes role is filled by television and other home-delivery sources. To convince audiences to fork over the ticket price, movies must court the appeal of the ‘event film.’ This swing-for-the-fences approach can certainly limit the kind of fare that executives are willing to fund.
While independent theaters have consequently suffered from a lack of supply, they have also been squeezed competitively by the deregulatory mindset of the Reagan era which allowed studios to re-enter the exhibition business. As this sad history concludes, we confront the present dystopia: our cherished neighborhood theaters transformed into soul-sucking Levittown multiplexes, clogged with nothing but teen comedies, testosterone-fueled action flicks and, come December, a few morsels of Oscar-bait.
This simplified tale of paradise lost obscures the primary causes for the transformations it dimly perceives. While advertising plays a stunningly important role in contemporary Hollywood, producers, distributors and exhibitors have always been suspicious of quirky films that do not lend themselves to short taglines or eye-catching posters. If such reluctance is more prevalent than ever, it’s principally because the M&A mania of the last few decades only accelerated the corporate consolidation that began just prior to Hollywood’s last ‘golden age’ in the seventies. Today, Los Angeles-based studio heads steeped in the past glories of Hollywood-the-dream-factory are at the mercy of their Wall Street-based bosses schooled in the timeless efficiencies of ROI.
Squeezing Hollywood’s bottom line further, in 1976, Congress repealed a 1971 tax law that had briefly funded the release of many groundbreaking movies which doubled as tax shelters for the wealthy. Taking advantage of this loophole, Lester Persky’s Persky-Bright Services Corporation raised $150 million, or a staggering 20% of Hollywood’s total production costs, from 1973 to 1976. These kinds of loopholes funded offbeat fare like Two-Lane Blacktop, American Graffiti, The Last Detail, California Split, Shampoo and Taxi Driver. While the counterculture, Vietnam, and Watergate all influenced the vaunted Hollywood New Wave, it was economically enabled by rich investors who, for a time, were happy to accept losses on the balance sheet.
In some ways, we are in a new ‘golden’ age for movie fans given the increased accessibility of films of all types thanks to DVDs, the proliferation of movie-themed cable channels, and internet delivery. In addition, the thriving online film culture testifies to the fact that people remain as passionate as ever about the movies. But these changes also figure largely in the current appeal of the “What’s ailing the movies?” talk. Like most discriminating viewers, I’ll grant that there seems to be a kernel of truth to it all. To explain my pessimism, a blog-eared bit of personal history is necessary.
In the mid-nineties, I was a young cinéphile living in Manhattan. Like many movie enthusiasts, I regularly braved the multiplexes on opening weekend for scores of hits and misses. I haunted the independent theaters, keeping up on the thriving indie scene. I discovered forgotten masterpieces while sweating in stuffy art-houses. I splurged on tickets to the month-long movie orgies like the New York Film Festival. I had a subscription to Film Comment and wore out my membership cards for Kim’s Video on Bleecker Street and the Film Forum on West Houston. I wrote screenplays for never-filmed shorts and starred in one directed by a French trust-fund kid I knew slightly.
In 2009, I live in Los Angeles, getting my movie fix mainly by firing up the laptop, like most twenty-first century cinéphiles. I order forgotten gems and indie releases on Netflix (or just stream them online). I post about films on social networks and write for several blogs. I can still be talked into venturing out to the multiplex or the art-house, but these treks are fewer and farther between.
A few years ago, I was a glass-half-full optimist about online film culture. I was dimly aware that using the TV and laptop to feed my addiction meant I was experiencing movies at their most unflattering. But I blissfully preached the virtues of “Web 2.0” to Film Comment editor Gavin Smith in an e-mail rant about the magazine’s behind-the-times attitude. In retrospect, I was mainly registering my displeasure that their paltry website meant they weren’t giving in to the Net Generation’s now commonplace expectations of free content.
Responding to my criticisms, Smith observed that most bloggers are more interested in being writers than readers. Given that social networkers and bloggers are forever creating “Top Fives” and myopically debating the merits of the latest buzzworthy film, this seems indisputable. But it only hints at the limits of online communication.
Undoubtedly, all of the information I used to devour from film books, magazines and festival calendars is now at my fingertips via online search engines. But thanks to the internet’s strong self-selection component, like most people, I rarely engage online with those who hold opposing views. A world of like-minded users reinforcing their own opinions may make for potent flame wars, but it tends to inhibit the artistic appreciation of the unfamiliar.
The internet has also amplified the power of the anonymous misanthrope with no interest in being either a writer or reader. These sorts spread their invective in the comment sections of blogs, articles, and sites like YouTube. Before the rise of Web 2.0, there was always the opportunity for this sort to bitch aloud to friends or fire off angry letters to the editor. But as my fellow blogger Richard Wink puts it, these days “criticism as a guide or a compass is getting swamped by vicious bile.” The resulting cynicism has only increased the cultural-malaise factor that nags at critics and fans like me.
The pervasiveness of the online lifestyle turns us all into technological determinists, accepting the pixelated new world as an inevitable outcome rather than pondering alternatives. In the Web’s snark-perfumed corridors, it’s easy to accept the facile arguments of the mourning-the-movies brigade. But reports of the death of Hollywood, our “last extant stable society,” remain as exaggerated as ever.