It’s widely agreed that the ’80s were a pop culture golden age. Cinematically, nearly every genre took on a distinctive identity that would forever associate its films with that era, and horror was no exception. From 1980 to 1989, 104 horror movies landed on their respective year-end top 100 lists for box office receipts. By comparison, the ’90s would see nearly a 50% drop in that figure – a decline that would have been even worse without the saving grace of Wes Craven’s Scream in late 1996 and the revival it inspired. Despite the many masterpieces, fan favorites, and iconic characters that arose from the ’80s, the early ’90s would prove that it couldn’t last forever. Ironically, the very thing that provided the basis for all that terror in the cineplexes was likely what ultimately poisoned the well for years to follow.
As 1980 began, Hollywood – that monolithic mecca universally recognized as the entertainment capital of Western culture – had almost fully regained its stature after a period that had nearly killed it. Much as Millennials are now credited with ‘ruining’ any number of things, the Boomers were blamed for almost bankrupting the major American movie studios. Between the convenience of television and an appetite for the more explicit and challenging cinematic fare coming out of Europe, Hollywood was bleeding money on epic WWII movies and big-spectacle musicals that people weren’t buying tickets for anymore. The puritanical Hays Code (or Motion Picture Production Code), which had governed acceptable movie content since the 1920s, was obliterated when MGM ignored a code denial and defiantly released Michelangelo Antonioni’s Blow-Up – its nudity, sexual content, and drug use uncensored – to the tune of $120 million in adjusted-for-inflation (AFI) revenue. Studio executives, very much against their will, were forced to relinquish creative control to a generation of young libertine filmmakers to save their businesses. The MPAA rating system was introduced in 1968, and theater screens would never be the same again. The New Hollywood was born.
Upon its release in December of 1966, Blow-Up, a sexually provocative murder mystery set in the London fashion world, blew up the long-standing adherence to moral conservatism in Hollywood and prompted the creation of the much more permissive MPAA rating system.
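The adjusted-for-inflation (AFI) figures cited throughout follow a simple arithmetic: scale a nominal gross by the ratio of a price index at two points in time. A minimal sketch in Python – note the index values and the sample gross below are illustrative placeholders, not actual CPI data or real box office figures:

```python
# Scale a nominal box-office gross into present-day dollars using
# the ratio of a price index "now" versus "then". The index levels
# in the example are made up purely for illustration.

def adjust_for_inflation(nominal_gross: float, index_then: float, index_now: float) -> float:
    """Return the gross expressed in present-day dollars."""
    return nominal_gross * (index_now / index_then)

if __name__ == "__main__":
    # A hypothetical $27m gross in 1991 dollars, with placeholder index levels
    adjusted = adjust_for_inflation(27_000_000, 136.2, 251.1)
    print(f"${adjusted / 1_000_000:.1f}m AFI")
```

The same ratio method underlies most published inflation-adjusted box office comparisons, though real calculations typically use average ticket prices rather than general CPI.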
This time would prove to be a playground for budding auteurs and resourceful independent filmmakers alike. With the doors blown off of prior constraints on content, dark themes and mature material previously forbidden were now presented with a confrontational boldness to audiences that hadn’t realized how much they wanted all this grim realism. Novels that could never have graced the silver screen before – like Mario Puzo’s The Godfather ($719m AFI), Anthony Burgess’ A Clockwork Orange ($143m AFI), and most notably for horror, William Peter Blatty’s The Exorcist ($943m AFI) – became the kind of critically adored blockbusters that studio executives live for. The market was robust for aspiring indie directors too, with George A. Romero’s Night of the Living Dead (est. $73m AFI), Wes Craven’s The Last House on the Left (est. $17m AFI), Tobe Hooper’s The Texas Chainsaw Massacre (est. $125m AFI), and John Carpenter’s Halloween ($184m AFI) making millions on their minuscule production budgets and eventually propelling all 4 filmmakers to Hollywood stardom. None of these noteworthy successes, however, played to the wishes of the industry’s elite power players – the bankrollers, producers, and money managers. For them, relinquishing that much creative control to the creators themselves was far too risky and uncomfortable; they would much rather hold the proverbial reins themselves. Ironically, though, the daring films that proved popular would lead studios to rediscover the pulse of audience demand, and on the backs of filmmakers like Steven Spielberg and George Lucas, the age of the fine-tuned blockbuster would be reborn and studio control over finished products would be restored.
Pumping out what the people wanted certainly wasn’t a problem at first, as many beloved classics were made throughout the ’80s, but a trend developed that would eventually rot the genre from the inside out. Generally, the first entry in what would become a long-running franchise – Sean Cunningham’s Friday the 13th and Wes Craven’s A Nightmare on Elm Street being the most famous examples – stood quite well as a quality film, but cash-drunk producers and executives were enraptured with the money mills these titles could be transformed into. This often resulted in a string of hastily made sequels that typically abandoned the pretenses of story and atmosphere, along with input from the original creators, in favor of pure crowd-pleasing elements, even if the final product made no coherent sense.
In 1973, The Exorcist would forever alter the perception of horror in mainstream culture. When adjusting ticket prices for inflation, its success at the American box office is on par with Star Wars: The Force Awakens.
While plenty of favorites still managed to arise from this cynical gold rush, the utter lack of long-term vision in cultivating these box-office behemoths, and their engineered copycats, would result in both massive audience fatigue and a wholesale critical blacklisting of horror altogether. Factor in the impact of the emergent VHS and cable TV formats, which allowed for even greater market saturation of often lower-quality films, and the party was over. A golden age of horror had come to a close. 1990 was still a modestly successful year, but two troubling trends emerged. Flatliners was a solid hit, but despite its overt horror subject matter, its marketing campaign focused more heavily on its sci-fi thriller elements and its all-star cast, consciously distancing it from the troubled genre. Also, many of the other notable films of the year – Arachnophobia, Tremors, and Gremlins 2 (which flopped, failing to make back its budget during its theatrical run) – were firmly horror-comedies, sometimes bordering on farce, and audiences buying such obvious mockery of a thing doesn’t generally bode well for the thing itself. So, the stage was set for the bottom to truly drop out in ’91.
That may seem like an absurd statement to make, considering that horror hallmark The Silence of the Lambs was released in February of that year. Having gone on to sweep the Academy Awards’ ‘Big Five’ – picture, director, actor, actress, and screenplay – the following spring, this film alone should save ’91 from being considered a low-water mark, but therein lies a big problem: it’s not technically a horror movie. This, perhaps more than any other film, triggers the debate of ‘What is horror?’ Some posit that it’s a difference in focus between suspense and fear, others point to the presence or absence of gory violence or supernatural elements, and others still say it comes down to whether death is the central theme. Given the many inconsistencies in how the horror label has been applied over the decades, it appears to be largely a matter of how a thematically dark film is marketed. The Silence of the Lambs, modeling itself on the success Misery had enjoyed the previous year as a ‘dramatic thriller’, followed suit, and as much distance as possible was placed between the film and the horror genre by both the studio, Orion Pictures, and the A-grade Hollywood talent involved in the production. The tremendous success it enjoyed, both financially and critically, only exacerbated horror’s commercial struggles and cheap reputation while establishing a trend of disallowing the genre, at least initially, from being associated with other banner films like Interview with the Vampire, The Crow, and Se7en.
Despite becoming one of the most iconic characters in horror cinema, Hannibal Lecter was initially distanced as far as possible from the genre. The marketing framed The Silence of the Lambs as a dramatic suspense thriller that just happened to contain moments of graphic violence and genuine terror. It would win big at the Oscars, and no film since has swept ‘The Big Five’ at the Academy Awards.
’91 also saw studios begin vacating the prime dates on the release schedule for horror, relegating it largely to the dumping grounds of January-March and August-October. Only in recent years has this trend started to reverse, both through horror being released steadily across the calendar and through genre blockbusters emerging at non-traditional times of the year. 1991 saw 3 releases throughout the post-holiday winter season – Warlock (Jan. 11, $19.8m AFI), Popcorn (Feb. 1, $9.1m AFI), and The Unborn (Mar. 29, $2.6m AFI) – none of which managed to gain much attention, if they weren’t outright flops entirely. (As a point of reference, most horror films need to gross at least $20-25m to be considered modest successes, but that is by no means a universal figure.) The following week, LGBT horror film Poison, slapped with an NC-17 rating and condemned by an American senator, would go on to earn a paltry $1.2m AFI. This was not an era where notoriety could be reliably counted upon as profitable publicity, and Poison would hardly be the only casualty of that fact, as demonstrated by the next film.
It would be another 4 months before horror would get to the big screen again, but this one was supposed to reverse the tide. With a budget of $20m AFI, Body Parts was easily the most expensive original horror production of the year, and with an August 2nd release date, Paramount Pictures was gambling that this gory thriller about a man who receives a limb transplant possessed by its former psychopath owner would take advantage of the shortage of competition and make some decent bank. Then, Jeffrey Dahmer happened. Barely 10 days before its scheduled release, the story of the discovery and apprehension of the infamous cannibal serial killer and his ghastly collection of people pieces captivated a nation. Disgusted by the gruesome details yet driving the story’s exposure into the stratosphere through morbid curiosity, the public let Dahmer dominate the news cycle. Backlash against the upcoming release of Body Parts developed immediately, purely through irrational association with the coincidental title, and marketing for the film was quickly suppressed. With the opening so close at hand, Paramount went ahead anyway, and the film sputtered to a $20m AFI run, failing to turn a profit or cover its ancillary costs. Again, the very early ’90s were not a good proving ground for the idiom that any publicity is good publicity.
Paramount Pictures was expecting its original horror-thriller Body Parts to hack up the competition. Instead, it was eaten alive by the media fervor surrounding infamous serial killer Jeffrey Dahmer.
Up next was Child’s Play 3, hitting theaters four weeks later and less than 10 months after the release of Child’s Play 2. As the reliable horror money mills were steadily drying up, Chucky’s second outing had bucked the trend, pulling down $62m AFI at the box office, prompting Universal Pictures to place the third film on a turnaround fast track of ridiculous speed. Audiences noticed the rush job, and the film ended up grossing only $32m AFI on a budget of $24m AFI – bleeding nearly half the revenue of its predecessor. It is still regarded as one of the worst of the now 7-movie series – although it has been received a little more warmly in recent years – and there would not be another entry until 1998, which would also scuttle the Child’s Play franchise title in favor of the ‘of Chucky’ naming.
A mere two weeks later, and doing no favors for the fortunes of little Chucky, came Freddy’s Dead: The Final Nightmare. Financially, it was the biggest success for the genre that year, and by a sizable margin too, with a total haul of $75.7m AFI and an opening weekend that rivaled the Elm Street franchise’s biggest hit, The Dream Master. Improving 50% over the gross of the previous entry, The Dream Child, this would count as a win for New Line Cinema and a bright spot in an otherwise dreary stretch, but all was not well in Springwood. It should come as no surprise that the film was received poorly by critics, for at this point virtually all horror was dead on arrival with the film aficionado set, but the movie was not well regarded by fans either. If IMDb scores are to be trusted, Freddy’s Dead remains the lowest rated among all 9 entries in the series – including the tonally inconsistent Freddy’s Revenge, the shrug-worthy predecessor The Dream Child, and the frequently lambasted 2010 remake. It made for a thoroughly unsatisfying close to one of the definitive franchises of the 1980s (although, in ’93, Jason Goes to Hell would give it a run for every penny of its money in disappointing fans) and basically served as the sickly swan song for an entire generation of horror.
[Chart: Box office for the Elm Street franchise with total grosses and opening weekends (adjusted for inflation) – A Nightmare on Elm Street (1984), Freddy’s Revenge (1985), Dream Warriors (1987), The Dream Master (1988), The Dream Child (1989), Freddy’s Dead (1991), New Nightmare (1994), Freddy Vs. Jason (2003), A Nightmare on Elm Street (2010); figures not preserved in this text.]
While Freddy’s Dead: The Final Nightmare was a box office success, its damage to the franchise was so thorough that not even a return of series creator Wes Craven with an ambitious and critically well-received entry was enough to rehabilitate its image with weary fans.
Yet the death of an age is an opportunity for the birth of another, and seven weeks later came the release of a film that made a stride toward something new – from someone who was an architect of both the previous era and the one to follow. Wes Craven’s The People Under the Stairs was a subversion of what many had come to expect from mainstream horror. With its black inner-city protagonists, its sharp, biting satire of American society, and its tendency to mix moments of humorous levity in with genuine scares, it was almost a total departure from the formula that had led the genre into its derivative rut. Bafflingly released the day after Halloween (the only ‘horror’ release that October was Ernest Scared Stupid), the movie still went on to be a substantial hit, opening at #1 at the box office and grossing $52.5m AFI – more than quadrupling its production budget. Five years later, Craven would fully realize this blend of satirical themes and distinct meta self-awareness in a generally more playful horror tone with the genre-resuscitating blockbuster Scream.
Since that release of what is often recognized as the quintessential ’90s horror film, the genre has never collapsed quite like it did in 1991. While its cultural influence has fluctuated, there has been a virtually unbroken chain of successful overlapping trends in horror cinema – the brief slasher resurgence inspired by Scream, Japanese-influenced offerings (The Ring, The Grudge), films shaped by the New French Extremity movement (the Saw series, Hostel, Alexandre Aja’s remake of Wes Craven’s The Hills Have Eyes), a string of remakes of ’70s and ’80s classics, a return of the zombie, the found footage format, James Wan’s Conjuring and Insidious franchises, the ascent of Blumhouse, and a renaissance of slow-burn psychological terror. As 2019 approaches, there is more horror, and more quality horror, available at any one time than ever before. Technological advances have given independent filmmakers access to more sophisticated equipment and processes, while the proliferation of the internet has given them the same leaps forward in marketing and distribution. We are currently in an era where intrepid auteurs thrive alongside a Hollywood system that is getting increasingly better at balancing its profit-driven controls with more autonomy for creators – and this is the case for most every type of movie, not just horror. It would seem we are in the midst of a new golden age right now, and hopefully, there will be no recurrence of 1991 in the future.