Thoughts from my study of Horror, Media, and Narrative

Posts tagged “Daniel Boorstin”

The Most Important Product Is You!

“The Culture Industry” seems to be one of those seminal pieces in the canon of Cultural Studies that elicits a visceral (and often negative) reaction from modern scholars. Heavily influenced by the Birmingham School, generations of scholars have been encouraged to recognize agency in audiences, with the Frankfurt School often placed in direct opposition to the ideals of modern inquiry. Read one way, Horkheimer and Adorno appear elitist, privileging what has come to be known as “high culture” (e.g., classical music and fine art) over the entertainment of the masses. Horkheimer and Adorno argue that the culture industry creates a classification scheme in which man exists; whereas man previously struggled to figure out his place in the world, this job is done for him by the culture industry and its resultant structure of artificial stratification. Ultimately, then, because he does not have to think about his position in culture, man is not engaged in his world in the same way as he was before, which allows content to become formulaic and interchangeable.

Later echoed in Adorno’s “How to Look at Television,” “The Culture Industry” laments the predictable pattern of televisual media, with audiences knowing the ending of movies as soon as they begin. (Interestingly, there is some overlap between this and Horror, with audiences expecting that offerings follow convention—one might even argue that the “twist ending” has become its own sort of genre staple—and that a movie’s failure to meet these expectations leaves the audience disgruntled. This of course raises questions about whether modern audiences have been conditioned to expect certain things out of media or to engage in a particular type of relationship with their media, and whether plot progression, at least in part, defines the genre.) Horkheimer and Adorno’s attitude speaks to a privileging of original ideas (and the intellectual effort that surrounds them), but the modern context seems to suggest that the combination of preexisting ideas in a new way holds some sort of cultural value.

Adorno’s “How to Look at Television” also points out a degradation in our relationship to media by highlighting the transition from inward-facing to outward-facing stances, equating such migration with movement away from subtlety. Although the point itself may very well be valid, it does not include a robust discussion of print versus televisual media: Adorno’s footnote that mentions the different affordances of media (i.e., print allows for contemplation and mirrors introspection while television/movies rely on visual cues due to their nature as visual media) deserves further treatment, as the implications of these media forms likely have repercussions for audience interactions with them. Almost necessarily, then, we see a televisual viewing practice that does not typically rely on subtlety due to a different form of audience/media interaction. (It might also be noted that the Saw movies have an interesting take on this in that they pride themselves on leaving visual “breadcrumbs” for viewers to discover upon repeated viewings, although these efforts are rarely necessary for plot comprehension.)

To be fair, however, one might argue that Horkheimer and Adorno wrote in a radically different media context. Sixty years later, we might argue that there’s not that much left to discover and that prestige has now been shifted to recombinations of existent information. Moreover, Horkheimer and Adorno’s position also assumes a particular motivation of the audience (i.e., that the payoff is the conclusion instead of the journey) that may no longer be completely true for modern viewers.

Although Horkheimer and Adorno rightly raise concerns regarding a lack of independent thinking (or even the expectation of it!), we are perhaps seeing a reversal of this trend with transmedia and attempts at audience engagement. Shows now seem to want people to talk about them (on message boards, Twitter, etc.) in order to keep audiences invested, and although we might quibble about the quality of such discourse and whether it is genuine or reactionary, it seems that this practice must be reconciled with Horkheimer and Adorno’s original position. It should be noted, however, that the technology on which such interaction relies was not around when Horkheimer and Adorno wrote “The Culture Industry,” and the Internet has likely helped to encourage audience agency (or at least made it more visible).

Seeking to challenge the notion that Horkheimer and Adorno discounted audience agency, John Durham Peters argues for the presence of both industry and audience influence in the space of culture and, furthermore, that while audiences may be empowered, their actions serve to reinforce their submission to the dominant wishes of industry in a realization of hegemonic practice. Although Horkheimer and Adorno, writing in the shadow of World War II, were undoubtedly concerned with the potential undue influence of mass media as a vehicle for fascist ideology—as evidenced by quotes such as “The radio becomes the universal mouthpiece of the Führer” and “The gigantic fact that the speech penetrates everywhere replaces its content”—they were also concerned that the public had relinquished its ability to resist by choosing to pursue frivolous entertainment rather than freedom (Adorno, 1941). From this position, Peters extracts the argument that Horkheimer and Adorno did in fact recognize agency on the part of audiences, but also that such energies were misspent.

The notion of “the masses” has long been an area of interest for me as it manifests throughout suburban Gothic horror in the mid-20th century. In many ways, society was struggling to come to terms with new advances in technology and the changes these inventions would bring about in practice and structure. Below is an excerpt from a longer piece about a movie that also grappled with some of these issues.

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have an almost hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation (1962). For cultural critics still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed.

Although these underlying anxieties of mid-century American society could be examined via a wide range of anthropological lenses and frameworks, visual media has historically provided a particularly vivid manifestation of the fears latent in the people of the United States (Haskell, 2004). This is, of course, not to imply that visual media is necessarily the best or only means by which we can understand prevailing ideologies in the years after World War II, but merely one of the most visible. However, as a critical examination of the entire media landscape of the 1950s would be beyond the scope of a single paper of this magnitude, discussion shall be primarily concentrated around Elia Kazan’s 1957 movie A Face in the Crowd with particular attention paid to the contrasting channels of cinema and television.[1] This paper will seek to briefly position A Face in the Crowd in the larger context of paranoia-driven cinema of the 1950s before using the film as an entryway to discuss critiques of mass culture. Given the film’s apparent sustained resonance as indicated by its relatively recent mention (Vallis, 2008; Hoberman, 2008b; Franklin, 2009), the arguments of Critical Theory will then be applied to modern American culture in an attempt to ascertain their continued validity. Finally, an argument will be made that acknowledges the potential dangers facing mass culture in the 21st century but also attempts to understand the processes that underlie these pitfalls and provides a suggestion for recourse in the form of cultural and media literacy.

Paranoia, Paranoia, Everyone’s Coming to Get Me

The post-war prosperity of the 1950s caused rapid changes in America, literally altering the landscape as families began to flood into the newly-formed suburbs. With the dream and promise of upward social mobility firmly ensconced in their heads, families rushed to claim their piece of the American dream, replete with the now-iconic front yard and white picket fence. And yet, ironically, a new set of worries began to fester underneath the idyllic façade of the suburbs as the troubles of the city were merely traded for fears of paranoia and invasion; the very act of flight led to entrapment by an ethos that subtly precluded the possibility of escape.

As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia, there remained a stubborn fear that the quickly shifting nature of society might have had unanticipated and unforeseen effects (Murphy, 2009). Life in the suburbs, it seemed, was too good to be true and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety:  from threats of invasion to worries about conformity, and from dystopian futures to a current reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of being a society under siege. An overwhelming sense of doubt, and more specifically, paranoia, characterized the age and latent fears manifested in media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be. In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts, all of which filled the movie screens of the decade (Jensen, 1971; Murphy, 2009; Wolfe, 2002).[2] Following solidly in this tradition, Kazan’s A Face in the Crowd picks up on some of the latent strains of paranoia in American culture while simultaneously serving as a platform for a set of critiques regarding mass culture.

Somewhere, a Star Is Made

The storyline of A Face in the Crowd is rather straightforward and yet deceptively complex in its undertones:  on the surface, we experience a rather heavy-handed morality tale in the form of country bumpkin Larry “Lonesome” Rhodes, a relative nobody who is plucked from obscurity and made (and subsequently broken) through powers associated with television. Yet, it is only when we begin to connect the movie to a larger societal context that we begin to understand the ramifications of the film’s message; a careful examination of A Face in the Crowd reveals striking suspicions regarding the role that media plays (in this case, primarily television and cinema) in shaping American culture. Stars, director Elia Kazan argues, are not so much born as made, a distinction that portends dire consequences.

It is worth noting that Kazan’s film was made during a time when the concept of the “celebrity” was being renegotiated by America; for a large part of its history, the United States, firmly grounded in a Puritan work ethic, had honored heroes who exemplified ideals associated with a culture of production and was struggling to reconcile these notions in the presence of an environment whose emphasis was now focused on consumption. Although modern audiences might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status (Gamson, 1994; Benjamin, 1973).

Such is the case, it seems, with Larry Rhodes, who is discovered by roving reporter Marcia Jeffries in an Arkansas jail. Although it cannot be denied that Rhodes has some modicum of talent and a certain charisma that comes from being unafraid to speak one’s mind, Marcia ushers Rhodes onto the path of greatness by dubbing him “Lonesome” and thus creates a character that transforms Rhodes from a despondent drunk to a winsome drifter. This scene—the first major one in the movie—thus introduces the important notion that those involved in the media can be implicitly involved in the manipulation of the information that travels over the airwaves. Subtly adding to the insidious nature of the media, A Face in the Crowd portrays Marcia as a character who seems likable enough, but also a person who is, in a way, exploiting the people in jail as she rushes in with her tape recorder, intent on prying the stories from the characters she finds (or creates!), and does not exhibit much concern for truly understanding why these men are imprisoned in the first place. Taken to an extreme, we later come across the character of The General, who further perverts the connection between media and power as he conspires with Lonesome to remake the image of Senator Worthington Fuller as the congressman runs for President.

Yet, as Lonesome Rhodes grows in his role as a media personality, he quickly demonstrates that the power to manipulate does not lie solely with those who sit behind the cameras. In Memphis, Rhodes incites a riot against the Luffler mattress company and also solicits donations in order to help a Black woman rebuild her house. In light of this, we can see that while Kazan focuses on the negative implications of television and celebrity, the relative good or bad that comes from these actions is not necessarily the point—instead, the one constant in all of the depicted scenarios is a public that is manipulated into performing actions on behalf of others. Although the characters of Lonesome and The General are vilified throughout the film, it is the masses for whom Kazan demonstrates true disdain.

Extraordinary Popular Delusions

Perhaps nowhere is this contempt more apparent than at the end of the film where, in an attempt to offer a small moment of solace to Marcia after her unmasking of Lonesome, writer Mel Miller notes, “We get wise to them, that’s our strength” (Kazan, 1957). And Miller is not wrong:  Western tradition has long recognized the correlation between knowledge and power and Miller’s assertion touches upon the revelatory clout inherent in the realignment of perception and reality as noted by public relations guru Howard Bragman (2008). A more critical examination of the film’s closing scene, however, raises an important question:  Who is Miller’s “we”? Although one might be tempted to read this line as indicative of an egalitarian philosophical view, it is important to note that the only two characters in the shot represent the film’s arguably upper-middle class, and pointedly Eastern-educated, elite—nowhere to be seen are representatives of the small Arkansas town from the film’s opening or denizens of Memphis, both of whom serve to characterize the majority of Lonesome’s devoted viewers.[3] In fact, if we take time to reflect upon the movie, we realize that the majority of the audience was only alerted to Lonesome’s dual nature after Marcia flipped a control room switch and revealed the underlying deterioration; the masses oscillated from one position to the next without understanding how or why and once again adopted a passive stance in their relationship with media. Moreover, as Courtney Maloney points out, Kazan’s depiction of the agency of the masses is actually limited in scope:  despite a montage of audience members vehemently phoning in, sponsors are simultaneously shown to be acting independently as they withdraw their association with Lonesome (1999). Moreover, the subtext of the scene distances the rational decision-making of the truly powerful from the impassioned beseeching of the masses, likening the power of the latter to that of a mob. Knowledge and its associated authority, clearly, are afforded to a select group.

This idea, that the world can be divided between those who “get wise” and those who do not, serves to develop a rather sharp classist criticism against the medium of television and those who would watch it: moviegoers, by virtue of witnessing Kazan’s work, find themselves elevated in status and privy to “the man behind the curtain” (to borrow a phrase). In contrast, the malleable masses were considered to be pacified and placated by idealistic portrayals of life in the 1950s in the form of television programs like Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Clearly, Kazan creates a dichotomy imbued with a value judgment descended from the thoughts of prominent thinkers in the Frankfurt School who, as far as aesthetics were concerned, preferred the high culture of cinema to the conformity and manipulated tastes of television (Horkheimer & Adorno, 2002; Adorno, 1985; Quart, 1989). This distinction between high and low culture would be a crucial supporting idea for critics, as a prominent fear of mass culture was that it portended a collapse between concepts (e.g., fame, celebrity, or intellectual value) of objectively different quality, essentially rendering all manifestations the same and therefore all equally mundane (Boorstin, 1962; Hoberman, 2008b; Kierkegaard, 1962). Even worse for critics, perhaps, was the perception of the masses’ refusal to grow out of their immature interests, a behavior that was characterized as both childlike and stubborn (Adorno, 1985).

And the fears of such theorists, all of whom were reacting to recent and rapid advances in broadcast technology, were not unfounded. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force on society that spurred on the destruction of culture instead of enriching it.[4] For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, aggrandized by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the already present anxiety-ridden ethos of the United States during the 1950s.


[1] It should be noted, however, that the comics of this time—those that belong to the end of the Golden Age and beginning of the Silver Age—also provide an additional understanding of the ways in which Americans indirectly wrestled with their fears.

[2] For a more exhaustive list of movies that support this point, see Wolfe, 2002.

[3] Let us also not forget the fact that Lonesome exhibits a rather patronizing attitude toward his audience in his later career, instituting the Cracker Barrel show with its manufactured country lackeys (Yates, 1974). In contrast to his first stint in Memphis, Lonesome has begun to embrace his country image as a means (if an inauthentic one) to connect with his audience, a point of contention to which we will return.

[4] Curiously, however, we see that this relationship between presidential addresses (like the aforementioned fireside chats) and mass media did not elicit notable complaints from critics who were generally wary of the merging of politics and entertainment (Quart, 1989; Benjamin, 1973). Although a larger discussion is warranted regarding the subtleties of this distinction, I would suggest that part of the differentiation stems from a high-low culture dichotomy. Although critics linked the negative presence of television with corporate advertising, James Twitchell suggests that there has always been a rather intimate relationship between arts and commerce, most saliently exhibited by wealthy citizens or entities who act as patrons (Twitchell, 1996).

 

Works Cited

Adorno, T. (1941). On Popular Music. Studies in Philosophy and Social Science, 9, 17-48.

Adorno, T. (1985). On the Fetish Character in Music and the Regression of Listening. In A. Arato, & E. Gebhardt (Eds.), The Essential Frankfurt School Reader (pp. 270-299). New York, NY: Continuum.

Benjamin, W. (1973). The Work of Art in the Age of Mechanical Reproduction. In H. Arendt (Ed.), Illuminations (H. Zohn, Trans., pp. 217-242). London, England: Schocken.

Boorstin, D. (1962). The Image: A Guide to Pseudo-Events in America. New York, NY: Atheneum.

Bragman, H. (2008). Where’s My Fifteen Minutes?: Get Your Company, Your Cause, or Yourself the Recognition You Deserve. New York, NY: Portfolio.

Gamson, J. (1994). Claims to Fame: Celebrity in Contemporary America. Berkeley: University of California Press.

Haskell, M. (2004, August 8). Whatever the Public Fears Most, It’s Right Up There on the Big Screen. The New York Times, pp. 4-5.

Horkheimer, M., & Adorno, T. W. (2002). Dialectic of Enlightenment: Philosophical Fragments. Stanford, CA: Stanford University Press.

Jensen, P. (1971). The Return of Dr. Caligari. Film Comment, 7(4), 36-45.

Kazan, E. (Director). (1957). A Face in the Crowd [Motion Picture].

Maloney, C. (1999). The Faces in Lonesome’s Crowd: Imaging the Mass Audience in “A Face in the Crowd”. Journal of Narrative Theory, 29(3), 251-277.

Mattson, K. (2003). Mass Culture Revisited: Beyond Tail Fins and Jitterbuggers. Radical Society, 30(1), 87-93.

Murphy, B. M. (2009). The Suburban Gothic in American Popular Culture. Basingstoke, Hampshire, England: Palgrave Macmillan.

Quart, L. (1989). A Second Look. Cineaste, 17(2), 30-31.

Wolfe, G. K. (2002). Dr. Strangelove, Red Alert, and Patterns of Paranoia in the 1950s. Journal of Popular Film, 57-67.


Mutable Masses?

It’s the End of the World as We Know It (And I Feel Fine)

Notably, however, the fears associated with the masses have not been limited to one particular decade in American history: across cultures and times, we can witness examples akin to tulip mania where unruly mobs exhibited relatively irrational behavior. Given the recurring nature of this phenomenon, which receives additional credence from psychological studies exploring groupthink and conformity (Janis, 1972; Asch, 1956), we might choose to examine how, if at all, the cultural critiques of the 1950s apply to contemporary society.

Recast, the criticisms of mass culture presumably resonate today in a context where popular culture holds sway over a generally uncritical public; we might convincingly argue that media saturation has served to develop a modern society in which celebrities run wild while evidencing sexual exploits like badges of honor, traditional communities have collapsed, and the proverbial apocalypse appears closer than ever. Moreover, having lost sight of our moral center while further solidifying our position as a culture of consumption since the 1950s, the masses have repeatedly demonstrated their willingness to flash a credit card in response to advertising campaigns and to purchase unnecessary goods hawked by celebrity spokespeople, a process that demonstrates a marked fixation on appearance and the image, reminiscent of critiques drawn from A Face in the Crowd (Hoberman, 2008a; Ecksel, 2008). Primarily concerned with the melding of politics, news, and entertainment, which harkens back to Kierkegaard-inspired critiques of mass culture, current critics charge that the public has at long last become what we most feared: a mindless audience with sworn allegiances born out of fealty to the almighty image (Hoberman, 2008a).

Arguably the most striking (or memorable) recent expression of image, and the subsequent commingling between politics and entertainment, centered around Sarah Palin’s campaign for office in 2008. Indeed, much of the discussion regarding Palin centered around her image and colloquialisms rather than focusing solely on her abilities.[1] Throughout her run, Palin positioned herself as an everyman figure, summoning figures such as “Joe Six-Pack” and employing terms such as “hockey mom” in order to convey her relatability to her constituents.[2] In a piece on then-Vice-Presidential candidate Sarah Palin, columnist Jon Meacham questions this practice by writing: “Do we want leaders who are everyday folks, or do we want leaders who understand everyday folks?” (2008). Palin, it seemed to Meacham, represented much more of the former than the latter; this position then leads to the important suggestion that Palin was placed on the political bill in order to connect with voters (2008). Suddenly, a parallel between Palin and Lonesome Rhodes from A Face in the Crowd becomes almost self-evident.

At our most cynical, we could argue that Palin is a Lonesome-type figure, cleverly manipulating her image in order to connect with the disenfranchised and disenchanted. More realistically, however, we might consider how Palin could understand her strength in terms of her relatability instead of her political acumen; she swims against the current as a candidate of the people (in perhaps the truest sense of the term) and provides hope that she will represent the voice of the common man, in the process challenging the status quo in a government that has seemingly lost touch with its base. In some ways, this argument continues to hold valence in post-election actions that demonstrate increasing support of the Tea Party movement.

However, regardless of our personal political stances, the larger pertinent issue raised by A Face in the Crowd is the continued existence of an audience whose decision-making process remains heavily influenced by image—we actually need to exert effort in order to extract our opinion of Sarah Palin the politician from the overall persona of Sarah Palin. Author Mark Rowlands argues that a focus on image—and the reliance on the underlying ethereal quality described by Daniel Boorstin as being “well known for [one’s] well-knownness” (Boorstin, 1962, p. 221)—although admittedly powerful, is ultimately damning, as the public’s inability to distinguish between items of quality leads us to focus on the wrong questions (and, perhaps worse, to not even realize that we are asking the wrong questions) in ways that have very real consequences. Extrapolating from Rowlands, we might argue that, as a culture that is obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

Ever the Same?

So while the criticisms of the Frankfurt School might appear to hold true today, we also need to realize that modern audiences exist in a world that is, in some ways, starkly different from that of the 1950s. To be sure, the mainstream media continues to exist in a slightly expanded form, but new commentary on the state of American culture must account for the myriad ways in which current audiences interact with the world around them. For instance, work published after Theodor Adorno’s time has argued against the passive nature of audiences, recognizing the agency of individual actors (Mattson, 2003; Schudson, 1984).[3] Moreover, the new activity on the part of audiences has done much to commingle the once distinctly separate areas of high and low culture in a process that would have likely confounded members of the Frankfurt School. The current cultural landscape encompasses remix efforts such as Auto-Tune the News along with displays of street art in museum galleries; projects once firmly rooted in folk or pop art have transcended definitional boundaries to become more accepted—and even valued—in the lives of all citizens. While Adorno might be tempted to cite this as evidence of high culture’s debasement, we might instead argue that these new manifestations have challenged the long-held elitism surrounding the relative worth of particular forms of art.

Additionally, examples like Auto-Tune the News suggest that advances in technology have also had a large impact on the cultural landscape of America over the past half century, with exponential growth occurring after the widespread deployment of the Internet and the resulting World Wide Web. While the Internet certainly provided increased access to information, it also created the scaffolding for social media products that allowed new modes of participation for users. Viewed in the context of image, technology has helped to construct a world in which reputations are made and broken in an instant and we have more information circulating in the system than ever before; the advent of these technologies, then, has not only increased the velocity of the system but has also amplified it.

Although the media often showcases deleterious qualities of the masses’ relationship with these processes (the suicide of a student at Rutgers University being a recent and poignant example), we are not often exposed to the incredible pro-social benefits of a platform like Twitter or Facebook. While we might be tempted to associate such pursuits with online predators (a valid concern, to be sure) or to dismiss them as, at best, unproductive in regard to civic engagement (Gladwell, 2010), to do so would be to ignore the powerfully positive uses of this technology (Burnett, 2010; Lehrer, 2010; Johnston, 2010). Indeed, we need only look at a newer generation of activist groups who have built upon Howard Rheingold’s concept of “smart mobs” in order to leverage online technologies to their benefit (2002)—a recent example can be found in the efforts of groups like The Harry Potter Alliance, Invisible Children, and the Kristin Brooks Hope Center to win money in the Chase Community Giving competition (Business Wire, 2010). Clearly, if the masses can self-organize and contribute to society, critiques that cast them as nothing more than passive receptors of media messages need to be revised.

Reconsidering the Masses

If we accept the argument that audiences can play an active part in their relationship with media, we then need to look for a framework that begins to address media’s role in individuals’ lives and examines the motivations and intentions that underlie media consumption. Although we might still find that media is a corrosive force in society, we must also realize that, while potentially exploiting an existing flaw, it does not necessarily create the initial problem (MacGregor, 2000).

A fundamental building block in the understanding of media’s potential impact is the increased propensity for individuals (particularly youth) to focus on external indicators of self-worth, with the current cultural climate of consumerism causing individuals to focus on their inadequacies as they begin to concentrate on what they do not have (e.g., physical features, talent, clothes, etc.) as opposed to their strengths. Simultaneously both an exacerbation of this problem and an entity proffering solutions, constructs like advertising provide an easy way for youth to compensate for their feelings of anxiety by instilling brands as a substitute for value:  the right label can confer a superficial layer of prestige and esteem upon individuals, which can act as a temporary shield against criticism and self-doubt. In essence, one might argue that if people aren’t good at anything, they can still be associated with the right brands and be okay. Although we might be tempted to blame advertising for this situation, it actually merely serves to exploit our general unease about our relationship to the world, a process also reminiscent of narcissism (Lasch, 1979).

Historian Christopher Lasch goes on to argue that we have become generally disconnected from traditional anchors such as religion and have thus come to substitute media messages and morality tales for actual ethical and spiritual education (1979). The overlapping role of religion and advertising is noted by James Twitchell, who contends that, “Like religion, which has little to do with the actual delivery of salvation in the next world but everything to do with the ordering of life in this one, commercial speech has little to do with material objects per se but everything to do with how we perceive them” (1996, 110). Thus, we might classify religion, advertising, entertainment, and celebrity as examples of belief systems (i.e., certain ways of seeing the world, complete with their own sets of values) and use these paradigms to begin to understand their respective (and ultimately somewhat similar!) effects on the masses.

A Higher Power

Ideologies such as those found in popular culture, religion, or advertising tell believers, in their own ways, what is (and is not) important in society, something that Twitchell refers to as “magic” (1996, 29). Each manifestation also professes a particular point of view and attempts to integrate itself into everyday life, drawing on our desire to become part of something (e.g., an idea, a concept, or a movement) that is larger than ourselves. Perhaps most importantly, the forces of advertising, entertainment, religion, and art (as associated with high/pop/folk culture) play on this desire in order to allow humans to give their lives meaning and worth in terms of the external: God, works of art, and name brands all serve as tools of classification. While cynics might note that this stance bears some similarities to the carnival sideshows of P. T. Barnum—it does not matter what is behind the curtain as long as there is a line out front (Gamson, 1994; Lasch, 1979)—the terms survive because they continue to speak to a deep desire for structure; the myth of advertising works for the same reasons that we believe in high art, higher education, and higher powers. Twitchell supports this idea by mentioning that “the real force of [the culture of advertising] is felt where we least expect it: in our nervous system, in our shared myths, in our concepts of self, and in our marking of time” (1996, 124). Constructs like advertising or entertainment, it seems, not only allow us to assemble a framework through which we understand our world, but also continually inform us about who we are (or who we should be) through a collection of narratives that serves to influence the greater perceptions of individuals in a manner reminiscent of the role of television in Cultivation Theory (Gerbner & Gross, 1976). The process of ordering and imbuing value ultimately demonstrates how overarching ideologies can not only create culture but also act to shape it, a process evidenced by the ability of the aforementioned concepts to consume and/or reference previously shared cultural knowledge while simultaneously contributing to the cultural milieu.

Given our reconsideration of mid-century cultural critiques, it follows that we should necessarily reevaluate proposed solutions to the adverse issues present within mass culture. We recall the advice of A Face in the Crowd’s Mel Miller (i.e., “We get wise to them”) and reject its elitist overtones while remaining mindful of its core belief. We recognize that priding ourselves on being smart enough to see through the illusions present in mass culture, while pitying those who have yet to understand how they are being herded like so many sheep, makes us guilty of the narcissism we once ascribed to the masses—and perhaps even more dangerous than the uneducated because we are convinced that we know better. We see that aspects of mass culture address deeply embedded desires and that our best hope for improving culture is to satisfy these needs while educating audiences so that they can better understand how and why media affects them. Our job as critics is to encourage critical thinking on the part of audiences, dissecting media and presenting it to individuals so that they can make informed choices about their consumption patterns; our challenge is to convincingly demonstrate that engagement with media is a crucial and fundamental part of the process. If we subscribe to these principles, we can preserve the masses’ autonomy and not merely replace one dominant ideology with another.


[1] Certainly being a woman did not help, as American women are typically subject to a “halo effect” wherein their attractiveness (i.e., appearance) affects how they are perceived (Kaplan, 1978).

[2] Palin has continued the trend, currently employing the term “mama grizzlies,” a call-to-arms that hopes to rally the willingness of women to fight in order to protect things that they believe in. Interestingly, a term that reaffirms the traditional role of women as nurturing matriarchs has been linked to feminist movements, a move that seems to confuse the empowerment of women with a socially conservative construct of their role in American life (Dannenfelser, 2010).

[3] We can also see much work conducted in the realm of fan studies that supports the practice of subversive readings or “textual poaching,” a term coined by Henry Jenkins (1992), in order to discuss contemporary methods of meaning making and resistance by fans.


Love Me or Hate Me, Still an Obsession



Light Up the Sky Like a Flame

But what is reality television? Although the genre seems to defy firm definitions, we, like Justice Stewart, instinctually “know it when [we] see it.” The truth is that reality television spans a range of programming, from clip shows like America’s Funniest Home Videos, to do-it-yourself offerings on The Food Network, investigative reporting on newsmagazines like 60 Minutes, the docu-soap Cops, and many other sub-genres in between, including the reality survival competition that forms the basis for The Hunger Games. Although a complete dissection of the genre is beyond the scope of this chapter—indeed, entire books have been written on the subject—reality television and its implications will serve as a lens by which we can begin to understand how Katniss experiences the profound effects of image, celebrity, and authenticity throughout The Hunger Games.

She Hits Everyone in the Eye

For the residents of Panem, reality television is not just entertainment—it is a pervasive cultural entity that has become inseparable from citizens’ personal identity. Although fans of The Hunger Games can likely cite overt allusions to reality television throughout the series, the genre also invokes a cultural history rife with unease regarding the mediated image in the United States.

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have a hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation. Boorstin argued that although the mass reproduction of images might provide increased levels of access for the public, the individual significance of the images declined as a result of their replication; as the number of images increased, the importance they derived from their connection to the original subject became more diffuse. And, once divorced from their original context, the images became free to take on a meaning all their own. Employing the term “pseudo-event” to describe an aspect of this relationship, Boorstin endeavored to illuminate shifting cultural norms that had increasingly come to consider the representation of an event more significant than the event itself.

Katniss unwittingly touches upon Boorstin’s point early in The Hunger Games, noting that the Games exert their control by forcing Tributes from the various districts to kill one another while the rest of Panem looks on. Katniss’ assertion hints that The Hunger Games hold power primarily because they are watched, voluntarily or otherwise; in a way, without a public to witness the slaughter, none of the events in the Arena matter. Yet, what Katniss unsurprisingly fails to remark upon, given the seemingly ever-present nature of media in Panem, is that the events of The Hunger Games are largely experienced through a screen; although individuals may witness the Reaping or the Tributes’ parade in person, the majority of their experiences result from watching the Capitol’s transmissions. Without the reach of a broadcast medium like television (or, in modern culture, streaming Internet video), the ability of The Hunger Games to effect subjugation would be limited in scope, for although the Games’ influence would surely be felt by those who witnessed such an event in person, the intended impact would rapidly decline as it radiated outward. Furthermore, by formulating common referents, a medium like television facilitates the development of a mass culture, which, in the most pessimistic conceptualizations, represents a passive audience ripe for manipulation. For cultural critics of the Frankfurt School (1923–1950s), who were still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed. Although the exact nature of modern audiences is up for debate, with scholars increasingly championing viewers’ active participation with media, Panem has seemingly realized a deep-seated fear of the Frankfurt School. It would appear, then, that The Hunger Games function as an oppressive force precisely because of their status as a mediated spectacle of suffering.

But perhaps we should not be so hard on Katniss. Growing up in an environment that necessitated the cultivation of skills like hunting and foraging, Katniss’ initial perspective is firmly grounded in a world based on truth. Plants, for example, must be checked (and double-checked!) to ensure their genuineness, lest a false bite result in death. In order for Katniss to survive, not only must she be able to identify plants but must also trust in their authenticity; prior to her experience in the Arena, Katniss undoubtedly understands the world in rather literal terms, primarily concerned with objects’ functional or transactional value. However, as hinted by Boorstin, additional layers of meaning exist beyond an item’s utility—layers that Katniss has not yet been trained to see.

Echoing portions of Boorstin’s previous work, French philosopher Jean Baudrillard conceptualized four types of value that objects could possess in modern society: functional, transactional, symbolic, and sign. Although the theory is admittedly more complex than the description provided here, we can momentarily consider how Baudrillard’s value categories of “functional” and “transactional” might align with Boorstin’s previously introduced concept of the “real,” while “symbolic” and “sign” evidence an affinity toward “representation.” Whereas the functional and transactional value of items primarily relates to their usefulness, the categories of “symbolic” and “sign” are predominantly derived from the objects’ relationships to other objects (sign) or to actors (symbolic). Accordingly, being relatively weak in her comprehension of representation’s nuances, Katniss characteristically makes little comment on Madge’s gift of a mockingjay pin. However, unbeknownst to Katniss (and most likely Madge herself), Madge has introduced one of the story’s first symbols, in the process imbuing the pin with an additional layer of meaning. Not just symbolic in a literary sense, the mockingjay pin gains significance because it is attached to Katniss, an association that will later bear fruit, as fans well know.

Before moving on, let’s revisit the import of The Hunger Games in light of Baudrillard: what is the value of the Games? Although some might rightly argue that The Hunger Games perform a function for President Snow and the rest of the Capitol, this is not the same as saying the Games hold functional value in the framework outlined by Baudrillard. The deaths of the Tributes, while undeniably tragic, do not in and of themselves fully account for The Hunger Games’ locus of control. In order to supplement Boorstin’s explanation of how The Hunger Games act to repress the populace with an account of why, Baudrillard might point to the web of associations that stem from the event itself: in many ways, the lives and identities of Panem’s residents are defined in terms of a relationship with The Hunger Games, meaning that the Games possess an enormous amount of value as a sign. The residents of the Capitol, for example, evidence a fundamentally different association with The Hunger Games, viewing them as a form of entertainment or sport, while the denizens of the Districts perceive the event as a grim reminder of a failed rebellion. Holding a superficial understanding of The Hunger Games’ true import when we first meet her, Katniss could not possibly comprehend that her destiny is to become a symbol, for the nascent Katniss clearly does not deal in representations or images. Katniss, at this stage in her development, could not be the famed reality show starlet known as the “girl on fire” even if she wanted to.

By All Accounts, Unforgettable

Returning briefly to reality television, we see that Panem, like modern America, finds itself inundated with the genre, whose pervasive tropes, defined character (stereo)types, and ubiquitous catchphrases have indelibly affected us as we subtly react to what we see on screen. Although we might voice moral outrage at offerings like The Jersey Shore or decry the spate of shows glamorizing teen pregnancy, perhaps our most significant response to unscripted popular entertainment is a fundamental shift in our conceptualization of fame and celebrity. Advancing a premise that promotes the ravenous consumption of otherwise nondescript “real” people by a seemingly insatiable audience, reality television forwards the position that anyone—including us!—can gain renown if we merely manage to get in front of a camera. Although the hopeful might understand this change in celebrity as democratizing, the cynic might argue that fame’s newfound accessibility also indicates its relative worthlessness in the modern age; individuals today can, as the saying goes, simply be famous for being famous.

Encapsulated by Mark Rowlands’ term “vfame,” the relative ease of an unmerited rise in reputation indicates how fame in the current cultural climate has largely become divorced from its original association with distinguished achievement. Although traditional vestiges of fame have not necessarily disappeared, it would appear that vfame has become a prominent force in American culture—something Katniss surely would not agree with. Recalling, in part, Kierkegaard’s thoughts on nihilism, vfame’s appearance stems from an inability of people to distinguish quality (or perhaps a lack of concern with doing so), resulting in all things being equally valuable and hence equally unimportant. This, in rather negative terms, is the price that we pay for the democratization of celebrity: fame—or, more accurately, vfame—is uniformly available to all in a manner that mirrors a function of religion and yet promises a rather empty sort of transcendence. Although alluring, vfame is rather unstable as it is tied to notions of novelty and sensation, as opposed to fame, which is grounded by its association with real talent or achievement; individuals who achieve vfame typically cannot affect the longevity of their success in substantial terms as they were not instrumental in its creation to begin with. Stars in the current age, as it were, are not born so much as made. Moreover, the inability of the public to distinguish quality leads us to focus on the wrong questions (and, perhaps worse, to not even realize that we are asking the wrong questions) in ways that have very real consequences; although vfame and its associated lapse in thinking might be most obvious in the realm of celebrities, it also manifests in other institutions such as politics. As a culture that is obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

Born out of an early to mid-20th century society in which the concept of the “celebrity” was being renegotiated by America, concepts like vfame built upon an engrained cultural history of the United States that was firmly steeped in a Puritan work ethic. Americans, who had honored heroes exemplifying ideals associated with a culture of production, were struggling to reconcile these notions in the presence of an environment now focused on consumption. Although Katniss, as proxy for modern audiences, might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status. To this day, our relationship with celebrities is a tenuous and complex one at best, for although we celebrate the achievements of some, we continue to flock to the spectacle created by the public meltdown of others, unable or unwilling to help; we vacillate between positions of adulation, envy, contempt, and pity, ever poised for incensement but all too willing to forgive.

Perhaps it should come as no surprise that reality television puts us a little on edge, as the genre represents a fundamental blurring of fact and fiction. Celebrities, we see, are just like us—just like our neighbors, who, through the magic of reality television, can become stars! Ever-shifting classifications leave us on unstable ground. But also consider the aforementioned philosophy of Boorstin: stars are, among other things, individuals whose images are important enough to be reproduced, which causes “celebrity” to transition from a type of person to a description of how someone is represented in society. In other words, we witness a shift from a term that labels who someone is to a term that designates who someone seems to be. Celebrities, it might be argued, derive at least a portion of their power in modern culture because they embody a collection of images that has been imbued with some sort of significance. Ultimately, it seems that much of our unease with celebrity and fame centers on notions of authenticity.

All I Can Think of Are Hidden Things

Long before Katniss ever becomes a celebrity herself, she exhibits disdain for the Capitol and its residents, evidencing a particularly adverse reaction to things she considers artificial. As previously discussed, authenticity played a particular role in Katniss’ growth and her ability to survive: for Katniss, a false image literally represented a matter of life or death, as a lapse in judgment could have resulted in electrocution or poisoning. Accordingly, Katniss dismisses the strange colors of the Capitol along with the characteristic features of its citizens—stylists, in particular, are purported to be grotesque—because she is not readily able to reconcile these visuals with her established worldview. As Katniss operates on a literal level, directly associating identity with appearance, the self can only present in one way (in this case, relatively unadorned) and maintain its authenticity.

Like Katniss, we too may be tempted to summarily reject the unfamiliar; our modern anxieties might best be encapsulated by the question: What to do with a problem like Lady Gaga? Perhaps the strongest contemporary mass image that mirrors the visual impact of the stylists on Katniss (followed closely by New York socialite Jocelyn Wildenstein), Lady Gaga suffers continual criticism for her over-the-top theatrical presentations. With dresses made from meat and Hello Kitty heads, it is all too easy to dismiss Lady Gaga as “attention-starved,” simplifying her presence to the succinct “weird.” Yet, it seems rash to write off Lady Gaga and the world of fame as nothing more than frivolity and fluff, for pop culture is only as vapid as our disinclination to engage in it.

Consider, for example, how the Capitol and its residents (of whom a prominent one would undoubtedly be Lady Gaga) embody the spirit of Decadence, a particularly prominent theme in Victorian culture. A reaction to the 19th century movement of Romanticism, Decadence championed concepts like artifice, which served to demonstrate man’s ability to rebel against, and possibly tame, the natural order. Although this inclination toward the unnatural manifested in myriad ways, French poet and philosopher Charles Baudelaire viewed women’s use of cosmetics as a particular site of interest, for proper application did not just enhance a woman’s beauty but acted to transform her, allowing transcendence through artifice.

With this in mind, we begin to understand the innate control wielded by figures such as Cinna and Caesar Flickerman. Perceived as facile by some, these two men represent a class of individuals adept at understanding the power inherent in fame, reputation, celebrity, and appearance; in the Capitol, image mongers such as these hold sway. Although one reading of these characters plants them firmly in the realm of artifice, painting them as masters of emotional manipulation and spectacle, an alternate view might consider how these two have come to recognize a shift toward a new localized reality—one that Katniss must adapt to or perish.

And yet, despite their commonality, these two individuals also underscore fundamentally different approaches to image: Caesar (and, perhaps, by extension, the Capitol) wields his power in order to mask or redirect while Cinna endeavors to showcase a deep-seated quality through the management of reputation and representation. Coexisting simultaneously, these two properties of illusion mirror the complementary natures of Peeta and Katniss with regard to image. Peeta, skilled in physical camouflage, exhibits an emotional candidness that Katniss is initially unready, or unwilling, to match; Katniss, very much the inverse of Peeta, is characterized by traits associated with hunting, finding, and sight in the “real” world all while maintaining a level of emotive subterfuge. Over the course of the 74th Hunger Games, however, Katniss quickly learns to anticipate how her actions in the Arena will affect her representation and reputation beyond the battlefield. With the help of Haymitch, Katniss begins to better understand the link between a robust virtual self and a healthy physical one as she pauses for the cameras and plays up her affection for Peeta in exchange for much-needed rewards of food and medicine. As she matures, Katniss comes into alignment with Cinna and Caesar, individuals who, despite being participatory members of a system arguably deemed inauthentic, distinguish themselves from the majority of Panem by understanding how image works; Cinna and Caesar (and later Katniss) are not just powerful, but empowered and autonomous.

Herein lies the true import of Collins’ choice to weave the trope of reality television into the fabric of The Hunger Games: throughout the trilogy, the audience is continually called upon to question the nature of authenticity as it presents in the context of a media ecology. Ultimately, the question is not whether Katniss (or anyone else) maintains a sense of authenticity by participating in the games of the Capitol—trading a true self for a performed self—but rather how we might effect multiple presentations of self without being inauthentic. How does Katniss, in her quest to survive, embody Erving Goffman’s claims that we are constantly performing, altering our presentation as we attempt to cater to different audiences? Is Katniss truly being inauthentic or does she ask us to redefine the concept of authenticity and its evaluation? Struggling with these very questions, users of social media today constantly juggle notions of authenticity and self-presentation with platforms like Facebook and Twitter forming asynchronous time streams that seamlessly coexist alongside our real-life personas. Which one of these selves, if any, is authentic? Like Katniss, we are born into the world of the “real” without a ready ability to conceptualize the power latent in the virtual, consequently resenting what we do not understand.


On My OWN

 

Modern American culture finds itself infused with celebrities, typically thought of as Hollywood actors or reality show starlets. Increasingly, however, the moniker of “celebrity” is being applied to potentially unlikely individuals, giving rise to the “Celebrity CEO.” Beginning with a brief examination into the possible purpose and cultural function of the celebrity, this paper will then go on to focus on Oprah Winfrey as a particular type of celebrity CEO who has created, and subsequently embodied, a lifestyle brand. Throughout the course of the paper it will be argued that this strategy offers certain advantages to celebrity-endorsed endeavors while also introducing additional vulnerabilities. Finally, the implications of this status as celebrity CEO will be applied to the Oprah Winfrey Network.

 

Oprah Winfrey, an American media figure familiar the world over, certainly fulfills modern definitions of a celebrity:  her face prominently featured on streaming banners in Chicago’s O’Hare airport, Oprah is associated with events like “Oprah’s Favorite Things” along with projects like Oprah’s Book Club and the Angel Network. Ubiquitous as she is, should one doubt her celebrity status, one need only remember that Oprah has also managed to obtain the true mark of the modern star in American culture—the ability to drop her last name and still be recognized. Even Daniel Boorstin, who criticized the current state of celebrity as being devoid of meaning—in the process coining a term that has become colloquially referred to as “famous for being famous” (1962)—might have to reconsider his thoughts after encountering Oprah Winfrey. Ranging from stories of sexual abuse as a child to weight management issues played out in public, Oprah is quite literally known for being well-known:  part of her allure stems from her willingness to address the darkest parts of her life with her audience and part of her power comes from fans’ ability to connect with Oprah through these stories.

Beginning with a brief background into the nature of the celebrity CEO, this paper will explore the general effects of celebrity CEOs with particular respect to narrative before examining Oprah as a particular iteration of this process. Celebrity CEOs, it will be argued, are not entirely dissimilar from other types of stars when it comes to issues of brand management, although they necessarily possess additional economic and social considerations. Once the connection between a CEO’s dual identities as executive and individual is established, Oprah’s development of her lifestyle as brand will be briefly discussed as foundational context for an evaluation of the launch of OWN (i.e., the Oprah Winfrey Network).

 

There’s No Business Like Show Business?

In an increasingly industrialized world filled with sprawling organizations, CEOs have become somewhat sequestered from the majority of their employees, leading to isolation and alienation (Yalom, 1998). Although undoubtedly recognizable to boards of directors, CEOs appear to have largely disappeared from public view (with notable exceptions, as will be discussed below).

Directly addressing this issue, the CBS reality television show Undercover Boss facilitates the connection between roles of “CEO” and “person”—although the program likely provides CEOs an opportunity to learn about the inner workings of their own organizations, the arguably larger benefit is the humanization of a corporate suit. Although viewers might cite schadenfreude as a prominent theme, laughing as they see an administrator stumble over a seemingly “simple” task, the net effect (realized or not) is that they most likely begin to connect emotionally with the undercover boss; they become actively invested in the outcome of this somewhat contrived scenario and an unspoken desire to see that the CEO has learned a lesson indicates that they have come to care about this person and his or her company—provided that the CEO is at all likeable.[1] In the course of an hour, audiences are not only exposed to a company that they may or may not have heard of, but are also introduced to a CEO and a handful of employees and witness “behind-the-scenes” or “backstage” operations (which might also serve to increase our identification with the company)—all in all, not a bad public relations move for a corporation!

Alternatively, we can consider that an appearance on a show like Undercover Boss instantly transforms a CEO into a media figure. Thrust into the public eye, one becomes a minor celebrity through the power of television:  even if we had little to no prior interest in the featured boss, social cues prevalent in a mediated society indicate that we should pay attention—a major broadcast network surely would not have chosen to feature someone who was not worthy?—and the mere ability to command copious amounts of attention (momentarily at least) affords a CEO the ability to transcend mundaneness, potentially obtaining the status of a celebrity.

Moreover, the Undercover Boss example indicates that while CEOs could potentially demand or cultivate an audience themselves—as suggested by Lois Arbogast in reference to Best Buy CEO Brian Dunn (2010)—they can also be featured or promoted by journalists (Hayward, Rindova, & Pollock, 2004). Although we might ascribe the prominence of CEOs to their role as leaders, we can also consider how humans display a tendency to oversimplify situations in order to understand complex and nebulous narratives.

Take, for example, a study conducted by Jones and Harris demonstrating that the prevalent attitudes in a writing sample were attributed to its author:  this study represented the first time that the Fundamental Attribution Error had been observed, although it was not immediately labeled as such (1967). In short, the Fundamental Attribution Error posits that observers tend to ignore situational explanations in favor of personality- or dispositional-based ones. In turn, these perceptions of us, once established, can cause us to act in particular ways as we endeavor to maintain our public image. Although the connection between the Fundamental Attribution Error and the celebrity CEO might not seem apparent at first, we can understand how humans have learned to employ the Fundamental Attribution Error as a type of heuristic—a mental shortcut—in order to simplify an intricate situation into manageable (and readily understood) explanations. In the case of the Fundamental Attribution Error, we see an eschewing of situational/environmental factors as we focus on an individual. Similarly, we focus on the actions and exploits of a celebrity CEO, channeling the output of a multidimensional process through a figurehead.

As a specific example of this process, the origin story of non-profit group Invisible Children taps into the pervasive myth of Joseph Campbell’s Hero’s Journey with its depiction of young adventurers traveling into a foreign land on a quest to find and cultivate a narrative. Lured by a sense of mystery into East Africa, the filmmakers found their path altered by an unexpected assault by the Lord’s Resistance Army, which acted as the impetus to enter into a world fraught with danger and uncertainty:  the realm of the unknown (Russell, 2007). Prior to this point, Kenya and Sudan had represented a relatively unfortunate, physically demanding, and sometimes boring wilderness for the team but nothing substantial. With the assistance of various guides (one of these a literal guide tasked with driving the group to a nearby refugee camp), Jason Russell, Laren Poole, and Bobby Bailey began to glimpse the conflict that underscores the region as they asked a series of questions of the locals. Wholly consumed by their newfound situation, the filmmakers discovered a little-known world of night commuters and child soldiers in Northern Uganda. This alien setting, which “disgusted and inspired,” also presented an opportunity for transformation as the filmmakers shed their naïveté and were reborn as crusaders against witnessed injustice (Invisible Children, 2010). Having found their story—the ultimate prize sought at the outset of the journey—the founders of Invisible Children extricated themselves in order to return to their homeland as masters of the unknown and share their insights with their community. The documentarians themselves echo this sentiment in their first production, Invisible Children:  Rough Cut, through a voiceover that proclaims that the group came to Africa as novices but hoped to “leave as warriors” (Bailey, Poole, & Russell, 2004). While never explicitly acknowledged as a tool, it seems plausible that self-described storytellers such as Jason, Laren, and Bobby would have integrated successful elements of narrative into their production.

Although the real-life nature of Invisible Children’s origin precludes an exact overlay with the steps of Campbell’s monomyth, it is easy to imagine that the retelling of the tale draws some of its power (consciously or unconsciously) from this established structure. For some, the intertwining of narrative and Invisible Children might have seemed inevitable for an organization created by filmmakers/storytellers, born out of a documentary, and focused on recounting a tale of adversity in Uganda. Nevertheless, through the mythic nature of Invisible Children’s origin story, the organization’s founders are made into celebrity CEOs, performing a similar function as those individuals featured on Undercover Boss as the surrounding narrative is rewritten to feature a chosen few as its stars. Celebrity CEOs, then, can be understood to act as a focal point for the narratives that surround and pervade a company, locking the perceptions of the organization and individual into a symbiotic (or mutually destructive) relationship as sentiments accrued in one role migrate to another. In the case of Invisible Children, the organization’s founders were able to leverage the mystique associated with their experience into a full-fledged movement with their stories at its origin.

 


 

The Medium Is the Message

Structuring the message as a narrative helps to convey complex ideas in a relatable format, making sense out of a potentially overwhelming wave of information. Personal narratives, however, provide a relatively simple path that cuts through the chaos and allows audiences to focus. Preachers, for example, might utilize a parable to illustrate a point, giving audiences something familiar to relate to while simultaneously introducing a new idea. In a larger sense, we can also consider how the first iterations of narrative, myths and legends, informed the populace about the rules of a world (e.g., why the sun rose or how humans had come to be) in a process that mirrors functions of advertising or identity construction via celebrity culture; although many have now come to accept scientific explanations in lieu of (or possibly in conjunction with) these tales, the fact remains that stories can serve to develop cognitive scaffolding as we evaluate foreign concepts. This educational element, similar to the one existent in the concept of play, allows individuals to learn intricate lessons without any overt effort. Narrative structure provides a guide for people to follow as they absorb additional information, easing the progression of learning. However, when considering this process, it is important to realize that narrative, in choosing which facts to highlight, also chooses which facts to exclude from a story, which might be just as significant.

For some, the process of inclusion and exclusion might seem oddly similar to the creation (or recording) of history; certain facts become relevant and serve to shape our perception of an event while others fade into obscurity. If we were to take a second, however, and think about this notion, we would realize that narratives often served as the first oral histories for a given population. Individuals entrusted with this position in these societies were the “keepers of information,” whose ability to recount narrative shaped their community’s collective memory, and, thus, a key part of the community’s combined sense of identity (Eyerman, 2004; Williams, 2001). Performing a role similar to that of the oral historians of the past, branding, the commercial storytelling described by Twitchell (2004), can be understood to influence modern society’s sense of shared knowledge—this concept gains additional importance as we think about modern celebrities who are, along with handlers and public relations agents, in charge of their brand, and as we understand celebrity CEOs as an extension of this. The ramifications of branding’s ability to affect American culture in this manner are profound:  with its capacity to color perceptions, branding can influence the communal pool that forms the basis for social norms and cultural capital.

Stories, it seems, not only allow us to construct a framework through which we understand our world, but also afford us the ability to share our interpretations with others (Short, et al., 1994). Indeed, author Stephen Greenblatt mentions that a sort of compulsiveness exists that is intrinsic to storytelling (1991). The function, then, of narrative is not only to shape a community, but also to create (or at least maintain) it. The process of sharing not only relays information—an important function, to be sure—but also serves to cultivate the bonds between source and receiver. Sharing represents an important component of storytelling as it facilitates a sense of community with a successful story anchoring an individual’s commitment to a community, strengthening the overall cause.

Oprah as Celebrity CEO

As previously discussed, Oprah has managed to use the power of storytelling, often recounting stories of a deeply personal nature, in order to develop her brand and her audience (a form of community). For example, Oprah’s rather public weight battles offer one point of connection with her viewers:  due to the show’s longevity, audiences have been able to readily document Oprah’s weight gains and losses. Although the same sort of scrutiny has plagued female celebrities for years—Calista Flockhart, Jennifer Love Hewitt, Ricki Lake, Carnie Wilson, and Jessica Simpson come to mind—Oprah managed to benefit from the potentially negative discussion by addressing it directly. In addition to deflating the issue, Oprah’s weight struggles allowed her audience to sympathize with her, strengthening their connection to both Oprah and her brand as trainer Bob Greene was featured on The Oprah Winfrey Show and in books. Consistent with her overall message, Oprah did not advocate for a diet but instead argued for a fundamental change in lifestyle. Further strengthening the bonds between her brand and her personal life, Oprah also publicly trained for a marathon in 1994—in this scenario, the brand espoused by The Oprah Winfrey Show is literally embodied by Oprah herself. With this act, we see the synergy between goodwill accrued by Oprah as a media figure and the struggle of a real person to obtain a goal—cheering for Oprah in one capacity naturally led viewers to support her in her other endeavor.[2]

Given Oprah’s strong presence as a personality and as a media mogul, the talk show host seems ripe for consideration as a celebrity CEO. Even ignoring the connection between business and self latent in the name of Oprah’s production company, Harpo (i.e., Oprah spelled backward), Oprah appears to have carved out a niche for herself as a lifestyle brand that promotes self-transformation. Fitting neatly into the ongoing lives of its supporters, Oprah promotes a brand that is anchored to her public perception and that, despite presentation in multiple media channels (e.g., television talk show, online website and message boards, magazine, and self-help books), retains consistent messaging, which allows each experience to complement, but not compete with, the others.

As further evidence of the connection between Oprah’s personal lifestyle and her brand, we can reference the much ballyhooed “Favorite Things” episode of The Oprah Winfrey Show. Although possibly driven simply by a desire to share her favorite things, the episode has become a production unto itself, rooted in emotionality and vividness while circumventing logical and rational thinking. The spectacle of the “Favorite Things” episode uses vividness and sensationalism to indicate that the featured products are emotionally interesting, image provoking, and proximate (Sherer & Rogers, 1984; Nisbett & Ross, 1980)—cues that seem salient when discussing media-saturated audiences notorious for variable attention spans and interest. Over the years, in-studio audiences have been groomed into a carefully controlled state of histrionics as they gush about whatever objects are placed in front of them while lauding Oprah’s charity.[3] Although participants of these parties most likely do not stop to consider the processes at work, the creation and careful cultivation of affective ties helps to bind them to Oprah and her lifestyle. Ultimately, although the audience is given free gifts (ignoring the taxes that must be paid), one might argue that individuals do in fact pay a price for these goods:  in exchange for material gain, the audience offers up its ability as a consumer bloc to dictate trends and value.

Adding support to this idea, we can consider the successful implementation of Oprah’s Book Club as another way in which Oprah was able to largely influence American culture through her lifestyle as brand. Using The Oprah Winfrey Show as a platform, Oprah was able to express her approval of a wide range of books (and reading in general). Although Oprah’s Book Club likely sparked a number of book clubs around the country, one might question how many of these were simply waiting, with bated breath, for Oprah to announce her next selection—instead of seeking out books that were personally meaningful, viewers may have abdicated this power to Oprah as she assumed the role of cultural dictator.

Oprah’s Book Club also demonstrated one of the potential pitfalls of connecting one’s personal life to one’s professional presence:  in 2005, Oprah’s support of James Frey’s A Million Little Pieces caused her personal integrity to be questioned as the selection of the Book Club became suspect (Koehn, Helms, Miller, Wilcox, & Rachel, 2009). Although Oprah most likely could not have known that Frey’s work was a fabrication, her pick, and subsequent support on Larry King Live, caused minor damage to her reputation due to her personal involvement in the matter.[4]

 

Coming into Her OWN

Continuing the deployment of her lifestyle brand, Oprah plans to debut the Oprah Winfrey Network in 2011. Described by Winfrey as “A channel where people will see themselves…see who they are through the lives of others—in a real way that enriches them,” one can sense the immediate connection to her existing brand (ABC News, 2010). Building off of her phrase “Live your best life” (a sentiment remarkably similar to, but also strikingly different in tone from, the Army’s “Be all you can be”), the message is clear:  the Oprah Winfrey Network, like all of Oprah’s other media ventures, is about the power and process of self-transformation.

Plagued by delays, the Oprah Winfrey Network has also run afoul of controversy prior to its launch. In the run up to its opening, rumors swirled about the possible rigging of votes in the “Search for the Next TV Star” contest (Walker, 2010). Given Oprah’s very obvious connection to the new network, we can conjecture that the same negative publicity that applied to the James Frey incident would likely pertain to this example—even if executives were completely innocent of the allegations, charges of cheating or misconduct had to be addressed in order to avoid damage to the unborn network and Oprah.

Having chosen to create a brand that centers around herself, Oprah has inextricably tied herself to the fortunes of the new network; in exchange for using her name to lend the new channel credence, Oprah runs the risk of personal devaluation should the venture fail. Although Oprah might have accrued enough goodwill to survive even the most devastating blow, any sort of scandal will undoubtedly reflect poorly upon Oprah and any future ventures.


[1] In a somewhat less flattering light, the MTV show Punk’d performed a similar function for celebrities. Similar to the candid camera shows of generations prior, the Ashton Kutcher vehicle exposed the “true” face of stars in a process that could endear them to the public. More often, however, viewers were able to have a laugh at the celebrities’ expense (with Justin Timberlake being a memorable example), and the show occasionally exposed its targets, Frankie Muniz among them, as insufferable human beings.

[2] I would also add that Oprah’s choice to relay her story of success despite her trials growing up also affects culture in a couple of important ways. On the surface level, we can see how Oprah’s story can be considered inspirational for those who would wish to follow in her footsteps. Yet, at the same time, Oprah’s background also serves to raise the bar for suffering as audiences question their right to complain as they compare their personal stories to Oprah’s. Although Oprah’s personality lends itself to the aspiration/inspirational interpretation, a larger trend of celebrity/mediated suffering might be that individuals are less inclined to realize the significance of their own situation since it is “not as bad” as what they see on television.

[3] Oprah’s creation of the Angel Network, involvement with Oprah’s Big Give and the creation of the Leadership Academy also work to support this image of Oprah.

[4] Interestingly, Oprah was able to avert a major crisis by responding to the situation through public statements and a follow-up interview with author James Frey. Again possibly working as a spokesperson for larger sentiments, Oprah seemed to win back her audience by conveying her outrage at being duped—a stance likely held by many of the people who had picked up the book at Oprah’s recommendation. In some ways, Oprah became the champion of the people as she confronted the author and the publisher; audience members could rally around Oprah, and her power as a media personality allowed her to deliver results that individual viewers could not have hoped to achieve on their own. It might also be noted that Oprah’s Leadership Academy (see previous footnote) also suffered from allegations of misconduct that served to cast similar doubt on Oprah’s credibility.


Focus on the Family

This week, our class continued to explore ideas of gender in the world of Caprica. Focusing primarily on the women, students began to contemplate the ways in which sexuality and gender intersect. Although I study this particular overlap extensively in respect to Horror, our class evidenced some interesting ideas in this arena and I will leave it to them to carry on the discussion.

Before proceeding, I should take a quick second to differentiate the terms “sex” and “gender”:  I use “sex” in reference to a biological classification while I see “gender” as socially constructed. Although patriarchal/heteronormative stances have traditionally aligned the two concepts, positioning them along a static binary, scholarship in fields such as Gender Studies and Sociology has effectively demonstrated that the interaction between sex and gender is much more fluid and dynamic (Rowley, 2007). For example, in our current culture, we have metrosexuals coexisting alongside retrosexuals and movements to redefine female beauty (the Dove “Real Beauty” ads were mentioned in class and their relative merits, or lack thereof, deserve a much deeper treatment than I can provide here).

Although a number of students in our class focused on the sexuality of Amanda Graystone, Diane Winston poignantly noted that the character of Amanda also invoked the complex web of associations between motherhood, women, and gender. Motherhood, I would argue, plays an important part in the definition of female identity in America; our construction of the “female” continually assigns meaning to women’s lives based on their status as, or desire to be, mothers. (Again, drawing upon my history with gender and violence, I suggest that we can partially understand the pervasive nature of this concept by considering how society variously views murderers, female murderers, and mothers who murder their children.) In line with this idea, we see that almost every female featured in the episode was directly connected to motherhood in some fashion (with Evelyn perhaps being the weakest manifestation, although we know that she has just started down the path that will lead her into becoming the mother of young Willie).

Amanda, the easiest depiction to deconstruct, voices a struggle of modern career women as she feels the pressure to “have it all.” Although Amanda tells Mar-Beth that she suffered from Post-Partum Depression, and explains her general inability to connect with her daughter as a newborn (the ramifications of which we have already seen played out over the course of the series thus far), she later informs Agent Durham that she circumvented Mar-Beth’s suspicions by lying (we assume that she was referring to the aforementioned interaction, but this is not specified). For me, this moment was significant in that it made Amanda instantly more relatable—something that I have struggled with for a while now—as a woman who may have, in fact, tried desperately to connect with her daughter but simply could not.

Both Daniel and Amanda, it seems, had trouble fully understanding their daughter Zoe. While Amanda’s struggles play out on an emotional level, Daniel labors to decipher the secret behind Zoe’s resurrection program (a term charged with religious significance and also resonance within the world of Battlestar Galactica). Here we see a parallel to the female notion of motherhood–Daniel, in his own way, is giving birth to a new life (he hopes). Yet, as the title alludes to, Daniel experiences a false labor:  his baby is not quite ready to be let loose in the world. Moreover, like his wife, Daniel attempts to force something that should occur naturally, resulting in a less-than-desired outcome.

For Daniel, this product is a virtual Amanda, who was discussed by some of our class as they pointed out stark differences in sexuality and sexualization. Although the contrast between the real and virtual versions of Amanda holds mild interest, the larger question becomes one of the intrinsic value of “realness.” Despite Daniel’s best attempts, he continues to berate the virtual Amanda for not being real, much to her dismay as she, through no fault of her own, cannot understand that she is fundamentally broken. Although not necessarily appropriate for this course, we can think about the issues raised by virtual reality, identities, and reputations along with our constant drive for “authenticity” in a world forever affected by mediated representations. Popular culture has depicted dystopian scenarios like The Matrix that argue against our infatuation with the veneer—underneath a shiny exterior, some would argue, we are rotting. Images, according to critics like Daniel Boorstin and Walter Benjamin, leave something to be desired.

Sub-par copies also appear in Graystone Industries’ newest advertisement for “Grace,” the commercial deployment of Daniel’s efforts, along with a contestation over image. Daniel quibbles about his virtual image (which is admittedly similar to the one that Joe Adama saw the first time that he entered V world) but doesn’t balk at selling the bigger lie of reunification. (Exploring this, I think, tells us a lot about Daniel and his perception of the world.)

On one level, what Daniel offers is a sort of profane/perverted Grace that is situated firmly in the realm of the material; although it addresses notions of the afterlife and death, it attempts to exert control over them through science. Drawing again from my background in Horror and Science Fiction, we can see that while Daniel’s promise is appealing, we can come back “wrong” (Buffy) or degrade as we continue to be recycled (Aeon Flux). Media warnings aside, I would argue that the allure of Daniel’s Grace is the promise of eternal life but would ultimately be undermined by the program’s fulfillment. In a similar fashion, religion, I think, holds meaning for us because it offers a glimpse of the world beyond but does not force us to contemplate what it would actually be like to live forever without any hope of escaping the mundanity of our lives (Horror, on the other hand, firmly places us in the void of infinity and explores what happens to us once we’ve crossed over to the other side).

Perhaps more importantly, however, the reunited parties in the commercial for Grace reconstitute a family:  after panning over a torch bearing two triangles (which, if we subscribe to Dan Brown’s symbology lessons, could represent male/female), we see a husband returned to his wife and children. Needless to say, the similarities between the situation portrayed and Daniel’s own are obvious. On one level, the commercial has a certain poignancy when juxtaposed with Daniel’s low-grade avatar but also subtly reinforces the deeper narrative thread of the family within the episode.

Picking up on a different representation of the family, classmates also wrote about the contrasting depictions of motherhood as embodied in Mar-Beth and Clarice. Although some students focused on the connections between gender roles and parenting, others commented on the divergent views of Mar-Beth and Clarice concerning God and family. One student even mentioned parallels between Clarice and Abraham in order to explore the relationship between the self, the family, and God. Culminating in a post that considers the role of mothers and females in the structure of the family, this succession of blog entries examines family dynamics from the interpersonal level to the metaphysical.

Although we each inevitably respond to different things in these episodes, I believe that there is much to gain by looking at “False Labor” through the lens of the family. For example, what if we look back at a relatively minor (if creepy) scene where Ruth effectively tells Evelyn to sleep with her son? Much as Clarice (and arguably Mar-Beth) serves as the matriarch of her household, Ruth rules over the Adamas. Since we are exploring gender, let’s contrast these examples with that of the Guatrau, who holds sway over a different type of family—how does Clarice compare with Ruth? Ruth with the Guatrau? How does the organizational structure of the family in each case work with (or against) religion? We often talk about the ability of religion (organized or lived) to provide meaning, to tell us who/what we are, and to develop community—and yet these are also functions of family.

OTHER OBSERVATIONS

  • Hinted at by the inclusion of Atreus, whose story is firmly situated in family in a fashion that would give any modern soap opera a run for its money, we begin to see a pattern as the writers continually reinforce the connections between family and the divine. The short version of this saga is that Atreus’ grandfather cooked and served his son Pelops as a test to the gods (and you thought Clarice was ruthless), incurring their wrath and a curse. After Pelops causes the death of his father-in-law, Atreus and his brother Thyestes murder their half-brother and are banished. In their new home, Atreus becomes king and Thyestes wrests the throne away from Atreus (after previously starting an affair with his wife). In revenge, Atreus kills and cooks Thyestes’ son (and taunts him with parts of the body!) and Thyestes eventually has sex with his daughter (Pelopia) in order to produce a son (Aegisthus) who is fated to kill Atreus. Before Atreus dies, however, he fathers Agamemnon and Menelaus, two brothers with their own sordid history that includes marrying sisters (one of whom is the famous Helen). As most of you know, the Trojan War then ensues and Agamemnon sacrifices his daughter Iphigenia; although Iphigenia is happy to die for the war, her mother, Clytemnestra, holds a grudge, sleeps with Aegisthus (remember him?), and eventually kills Agamemnon out of anger. The son of Clytemnestra and Agamemnon, Orestes, kills his mother in order to avenge his father and, in so doing, becomes one of the first tragic heroes forced to choose between two evils. If we want to take this a step further, we can also examine the resonance between Orestes and Mal from Firefly, to bring it back full circle.
  • The name of Mar-Beth may be an allusion to Macbeth (although it is entirely possible that I am reading too much into this), which is also a story about power, kings, and family. Although I am most familiar with Lady Macbeth and her OCD (obsessed with her guilt, she is compelled to wash invisible blood off of her hands), I would also suggest that Lady Macbeth overlaps with Clarice and that the relationship between the Macbeths is similar to that of Clarice and her husbands.
  • As much as our class does not focus on institutional religion, a background in the Christian concept of Grace provides some interesting insight into Daniel’s project. Although I am not an expert in the subject—I very much defer to Diane—I think that we could make a strong argument for the role of Grace in Christianity and its links to salvation as thematic elements in “False Labor.” Building off of my reaction post, we might think about the role that Grace plays in Daniel’s life and how Joe’s words to Daniel on the landing of the Graystone building speak to exactly this concept.
  • There seems to be an interesting distinction developing between notions of the earth/soil and the air/sky. The Taurons/Halatha, as we have seen before and continue to see in this episode, evidence a strong spiritual connection with the soil (and are also called “Dirteaters”), as Sam utters a prayer before he is about to be executed. We also see the Halatha grumble when the figure of Phaulkon, whose name can be associated with flying and the sky, appears on a television screen. Moreover, in their own ways, Daniel and Joe embody this duality as they both show concern for their families but attempt to resolve their issues in different ways–Joe, as is his wont, concentrates on the material while Daniel looks toward the intangible.

Love Me or Hate Me, Still an Obsession

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have an almost hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation (1962). For cultural critics still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed.

Although these underlying anxieties of mid-century American society could be examined via a wide range of anthropological lenses and frameworks, visual media has historically provided a particularly vivid manifestation of the fears latent in the people of the United States (Haskell, 2004). This is, of course, not to imply that visual media is necessarily the best or only means by which we can understand prevailing ideologies in the years after World War II, but merely one of the most visible. However, as a critical examination of the entire media landscape of the 1950s would be beyond the scope of a single paper of this magnitude, discussion shall be primarily concentrated around Elia Kazan’s 1957 movie A Face in the Crowd with particular attention paid to the contrasting channels of cinema and television.[1] This paper will seek to briefly position A Face in the Crowd in the larger context of paranoia-driven cinema of the 1950s before using the film as an entryway to discuss critiques of mass culture. Given the film’s apparent sustained resonance as indicated by its relatively recent mention (Vallis, 2008; Hoberman, 2008b; Franklin, 2009), the arguments of Critical Theory will then be applied to modern American culture in an attempt to ascertain their continued validity. Finally, an argument will be made that acknowledges the potential dangers facing mass culture in the 21st century but also attempts to understand the processes that underlie these pitfalls and provides a suggestion for recourse in the form of cultural and media literacy.

 

Paranoia, Paranoia, Everyone’s Coming to Get Me

The post-war prosperity of the 1950s caused rapid changes in America, literally altering the landscape as families began to flood into the newly-formed suburbs. With the dream and promise of upward social mobility firmly ensconced in their heads, families rushed to claim their piece of the American dream, replete with the now-iconic front yard and white picket fence. And yet, ironically, a new set of worries began to fester underneath the idyllic façade of the suburbs as the troubles of the city were merely traded for fears of paranoia and invasion; the very act of flight led to entrapment by an ethos that subtly precluded the possibility of escape.

As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia, there remained a stubborn fear that the quickly shifting nature of society might have had unanticipated and unforeseen effects (Murphy, 2009). Life in the suburbs, it seemed, was too good to be true and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety:  from threats of invasion to worries about conformity, and from dystopian futures to a current reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of being a society under siege. An overwhelming sense of doubt, more specifically paranoia, characterized the age, and latent fears manifested in media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be. In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts, all of which filled the movie screens of the decade (Jensen, 1971; Murphy, 2009; Wolfe, 2002).[2] Following solidly in this tradition, Kazan’s A Face in the Crowd picks up on some of the latent strains of paranoia in American culture while simultaneously serving as a platform for a set of critiques regarding mass culture.

 

Somewhere, a Star Is Made

The storyline of A Face in the Crowd is rather straightforward and yet deceptively complex in its undertones:  on the surface, we experience a rather heavy-handed morality tale in the form of country bumpkin Larry “Lonesome” Rhodes, a relative nobody who is plucked from obscurity and made (and subsequently broken) through powers associated with television. Yet, it is only when we begin to connect the movie to a larger societal context that we begin to understand the ramifications of the film’s message; a careful examination of A Face in the Crowd reveals striking suspicions regarding the role that media plays (in this case, primarily television and cinema) in shaping American culture. Stars, director Elia Kazan argues, are not so much born as made, a distinction that portends dire consequences.

It is worth noting that Kazan’s film was made during a time when the concept of the “celebrity” was being renegotiated by America; for a large part of its history, the United States, firmly grounded in a Puritan work ethic, had honored heroes who exemplified the ideals of a culture of production and was struggling to reconcile these notions with an environment whose emphasis was now focused on consumption. Although modern audiences might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today:  in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system could only result in individuals who were lacking and unworthy of their status (Gamson, 1994; Benjamin, 1973).

Such is the case, it seems, with Larry Rhodes, who is discovered by roving reporter Marcia Jeffries in an Arkansas jail. Although it cannot be denied that Rhodes has some modicum of talent and a certain charisma that comes from being unafraid to speak one’s mind, Marcia ushers Rhodes onto the path of greatness by dubbing him “Lonesome” and thus creates a character that transforms Rhodes from a despondent drunk to a winsome drifter. This scene—the first major one in the movie—thus introduces the important notion that those involved in the media can be implicitly involved in the manipulation of the information that travels over the airwaves. Subtly adding to the insidious nature of the media, A Face in the Crowd portrays Marcia as a character who seems likable enough, but also as a person who is, in a way, exploiting the people in jail as she rushes in with her tape recorder, intent on prying the stories from the characters she finds (or creates!), without exhibiting much concern for truly understanding why these men are imprisoned in the first place. This dynamic is taken to an extreme in the character of The General, who further perverts the connection between media and power as he conspires with Lonesome to remake the image of Senator Worthington Fuller as the senator runs for President.

Yet, as Lonesome Rhodes grows in his role as a media personality, he quickly demonstrates that the power to manipulate does not lie solely with those who sit behind the cameras. In Memphis, Rhodes incites a riot against the Luffler mattress company and also solicits donations in order to help a Black woman rebuild her house. In light of this, we can see that while Kazan focuses on the negative implications of television and celebrity, the relative good or bad that comes from these actions is not necessarily the point—instead, the one constant in all of the depicted scenarios is a public that is manipulated into performing actions on behalf of others. Although the characters of Lonesome and The General are vilified throughout the film, it is the masses for whom Kazan demonstrates true disdain.

 

Extraordinary Popular Delusions

Perhaps nowhere is this contempt more apparent than at the end of the film where, in an attempt to offer a small moment of solace to Marcia after her unmasking of Lonesome, writer Mel Miller notes, “We get wise to them, that’s our strength” (Kazan, 1957). And Miller is not wrong:  Western tradition has long recognized the correlation between knowledge and power, and Miller’s assertion touches upon the revelatory clout inherent in the realignment of perception and reality, as noted by public relations guru Howard Bragman (2008). A more critical examination of the film’s closing scene, however, raises an important question:  Who is Miller’s “we”? Although one might be tempted to read this line as indicative of an egalitarian philosophical view, it is important to note that the only two characters in the shot represent the film’s arguably upper-middle-class, and pointedly Eastern-educated, elite—nowhere to be seen are representatives of the small Arkansas town from the film’s opening or denizens of Memphis, both of whom serve to characterize the majority of Lonesome’s devoted viewers.[3] In fact, if we take time to reflect upon the movie, we realize that the majority of the audience was only alerted to Lonesome’s dual nature after Marcia flipped a control room switch and revealed the underlying deterioration; the masses oscillated from one position to the next without understanding how or why and once again adopted a passive stance in their relationship with media. Moreover, as Courtney Maloney points out, Kazan’s depiction of the agency of the masses is actually limited in scope:  despite a montage of audience members vehemently phoning in, sponsors are simultaneously shown to be acting independently as they withdraw their association with Lonesome (1999). Further, the subtext of the scene distances the rational decision-making of the truly powerful from the impassioned beseeching of the masses, likening the power of the latter to that of a mob. Knowledge and its associated authority, clearly, are afforded to a select group.

This idea, that the world can be divided between those who “get wise” and those who do not, serves to develop a rather sharp classist criticism of the medium of television and those who would watch it:  moviegoers, by virtue of witnessing Kazan’s work, find themselves elevated in status and privy to “the man behind the curtain” (to borrow a phrase). In contrast, the malleable masses were considered to be pacified and placated by idealistic portrayals of life in the 1950s in the form of television programs like Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Clearly, Kazan creates a dichotomy imbued with a value judgment descended from the thoughts of prominent thinkers in the Frankfurt School who, as far as aesthetics were concerned, preferred the high culture of cinema to the conformity and manipulated tastes of television (Horkheimer & Adorno, 2002; Adorno, 1985; Quart, 1989). This distinction between high and low culture would be a crucial supporting idea for critics, as a prominent fear of mass culture was that it portended a collapse of the distinctions between concepts (e.g., fame, celebrity, or intellectual value) of objectively different quality, essentially rendering all manifestations the same and therefore all equally mundane (Boorstin, 1962; Hoberman, 2008b; Kierkegaard, 1962). Even worse for critics, perhaps, was the perception of the masses’ refusal to grow out of their immature interests, a behavior that was characterized as both childlike and stubborn (Adorno, 1985).

And the fears of such theorists, all of whom were reacting to recent and rapid advances in broadcast technology, were not unfounded. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were seen as decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force that spurred on the destruction of culture instead of enriching it.[4] For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to those of Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, amplified by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the already anxiety-ridden ethos of the United States during the 1950s.

 

It’s the End of the World as We Know It (And I Feel Fine)

Notably, however, the fears associated with the masses have not been limited to one particular decade in American history:  across cultures and times, we can witness examples akin to tulip mania in which unruly crowds exhibited strikingly irrational behavior. Given the recurring nature of this phenomenon, which receives additional credence from psychological studies exploring groupthink and conformity (Janis, 1972; Asch, 1956), we might choose to examine how, if at all, the cultural critiques of the 1950s apply to contemporary society.

Recast, the criticisms of mass culture presumably resonate today in a context where popular culture holds sway over a generally uncritical public; we might convincingly argue that media saturation has served to develop a modern society in which celebrities run wild while flaunting sexual exploits like badges of honor, traditional communities have collapsed, and the proverbial apocalypse appears closer than ever. Moreover, having lost sight of our moral center while further solidifying our position as a culture of consumption since the 1950s, the masses have repeatedly demonstrated their willingness to flash a credit card in response to advertising campaigns and to purchase unnecessary goods hawked by celebrity spokespeople, a process that demonstrates a marked fixation on appearance and the image reminiscent of critiques drawn from A Face in the Crowd (Hoberman, 2008a; Ecksel, 2008). Primarily concerned with the melding of politics, news, and entertainment, which harkens back to Kierkegaard-inspired critiques of mass culture, current critics charge that the public has at long last become what we most feared:  a mindless audience with sworn allegiances born out of fealty to the almighty image (Hoberman, 2008a).

Arguably the most striking (or memorable) recent expression of image, and the attendant commingling of politics and entertainment, centered on Sarah Palin’s campaign for office in 2008. Indeed, much of the discussion regarding Palin focused on her image and colloquialisms rather than solely on her abilities.[5] Throughout her run, Palin positioned herself as an everyman figure, summoning figures such as “Joe Six-Pack” and employing terms such as “hockey mom” in order to convey her relatability to her constituents.[6] In a piece on the then-Vice-Presidential candidate, columnist Jon Meacham questions this practice by writing:  “Do we want leaders who are everyday folks, or do we want leaders who understand everyday folks?” (2008). Palin, it seemed to Meacham, represented much more of the former than the latter; this position then leads to the important suggestion that Palin was placed on the political bill in order to connect with voters (2008). Suddenly, a corollary between Palin and Lonesome Rhodes from A Face in the Crowd becomes almost self-evident.

At our most cynical, we could argue that Palin is a Lonesome-type figure, cleverly manipulating her image in order to connect with the disenfranchised and disenchanted. More realistically, however, we might consider how Palin could understand her strength in terms of her relatability instead of her political acumen; she swims against the current as a candidate of the people (in perhaps the truest sense of the term) and provides hope that she will represent the voice of the common man, in the process challenging the status quo in a government that has seemingly lost touch with its base. In some ways, this argument continues to hold weight in post-election developments that demonstrate increasing support for the Tea Party movement.

However, regardless of our personal political stances, the larger pertinent issue raised by A Face in the Crowd is the continued existence of an audience whose decision-making process remains heavily influenced by image—we actually need to exert effort in order to extract our opinion of Sarah Palin the politician from the overall persona of Sarah Palin. Author Mark Rowlands argues that a focus on image—and a reliance on the underlying ethereal quality described by Daniel Boorstin as being “well known for [one’s] well-knownness” (Boorstin, 1962, p. 221)—although admittedly powerful, is ultimately damning, as the public’s inability to distinguish between items of quality leads it to focus on the wrong questions (and, perhaps worse, not even to realize that the wrong questions are being asked) in ways that have very real consequences. Extrapolating from Rowlands, we might argue that, as a culture obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

 

Ever the Same?

So while the criticisms of the Frankfurt School might appear to hold true today, we also need to realize that modern audiences exist in a world that is, in some ways, starkly different from that of the 1950s. To be sure, the mainstream media continues to exist in a slightly expanded form, but new commentary on the state of American culture must account for the myriad ways in which current audiences interact with the world around them. For instance, work published after Theodor Adorno’s time has argued against the passive nature of audiences, recognizing the agency of individual actors (Mattson, 2003; Schudson, 1984).[7] Moreover, this new activity on the part of audiences has done much to commingle the once distinctly separate areas of high and low culture in a process that would have likely confounded members of the Frankfurt School. The current cultural landscape encompasses remix efforts such as Auto-Tune the News along with displays of street art in museum galleries; projects once firmly rooted in folk or pop art have transcended definitional boundaries to become more accepted—and even valued—in the lives of all citizens. While Adorno might be tempted to cite this as evidence of high culture’s debasement, we might instead argue that these new manifestations have challenged the long-held elitism surrounding the relative worth of particular forms of art.

Additionally, examples like Auto-Tune the News suggest that advances in technology have also had a large impact on the cultural landscape of America over the past half century, with exponential growth occurring after the widespread deployment of the Internet and the resulting World Wide Web. While the Internet certainly provided increased access to information, it also created the scaffolding for social media products that allowed new modes of participation for users. Viewed in the context of image, technology has helped to construct a world in which reputations are made and broken in an instant and more information circulates through the system than ever before; the advent of these technologies, then, has not only increased the velocity of the system but also amplified its volume.

Although the media often showcases deleterious qualities of the masses’ relationship with these processes (the suicide of a student at Rutgers University being a recent and poignant example), we are not often exposed to the incredible pro-social benefits of a platform like Twitter or Facebook. While we might be tempted to associate such platforms with online predators (a valid concern, to be sure) or to dismiss them as, at best, unproductive in regard to civic engagement (Gladwell, 2010), to do so would be to ignore the powerfully positive uses of this technology (Burnett, 2010; Lehrer, 2010; Johnston, 2010). Indeed, we need only look at a newer generation of activist groups who have built upon Howard Rheingold’s concept of “smart mobs” in order to leverage online technologies to their benefit (2002)—a recent example can be found in the efforts of groups like The Harry Potter Alliance, Invisible Children, and the Kristin Brooks Hope Center to win money in the Chase Community Giving competition (Business Wire, 2010). Clearly, if the masses can self-organize and contribute to society, the critique of the masses as nothing more than passive receptors of media messages needs to be revised.

 

Reconsidering the Masses

If we accept the argument that audiences can play an active part in their relationship with media, we then need to look for a framework that begins to address media’s role in individuals’ lives and examines the motivations and intentions that underlie media consumption. Although we might still find that media is a corrosive force in society, we must also realize that, while potentially exploiting an existing flaw, it does not necessarily create the initial problem (MacGregor, 2000).

A fundamental building block in the understanding of media’s potential impact is the increased propensity for individuals (particularly youth) to focus on external indicators of self-worth, with the current cultural climate of consumerism causing individuals to dwell on their inadequacies as they concentrate on what they do not have (e.g., physical features, talent, or clothes) as opposed to their strengths. Simultaneously an exacerbation of this problem and a purveyor of solutions, constructs like advertising provide an easy way for youth to compensate for their feelings of anxiety by instilling brands as a substitute for value:  the right label can confer a superficial layer of prestige and esteem upon individuals, which can act as a temporary shield against criticism and self-doubt. In essence, one might argue that if people aren’t good at anything, they can still be associated with the right brands and be okay. Although we might be tempted to blame advertising for this situation, it merely serves to exploit our general unease about our relationship to the world, a process also reminiscent of narcissism (Lasch, 1979).

Historian Christopher Lasch goes on to argue that we have become generally disconnected from traditional anchors such as religion and thus have come to substitute media messages and morality tales for actual ethical and spiritual education (1979). The overlapping role of religion and advertising is noted by James Twitchell, who contends that, “Like religion, which has little to do with the actual delivery of salvation in the next world but everything to do with the ordering of life in this one, commercial speech has little to do with material objects per se but everything to do with how we perceive them” (1996, p. 110). Thus, we might classify religion, advertising, entertainment, and celebrity as examples of belief systems (i.e., ways of seeing the world, each complete with its own set of values) and use these paradigms to begin to understand their respective (and ultimately somewhat similar!) effects on the masses.

 

A Higher Power

Ideologies such as those found in popular culture, religion, or advertising tell believers, in their own ways, what is (and is not) important in society, something that Twitchell refers to as “magic” (1996, p. 29). Each manifestation also professes a particular point of view and attempts to integrate itself into everyday life, drawing on our desire to become part of something (e.g., an idea, a concept, or a movement) that is larger than ourselves. Perhaps most importantly, the forces of advertising, entertainment, religion, and art (as associated with high/pop/folk culture) play on this desire in order to allow humans to give their lives meaning and worth in terms of the external:  God, works of art, and name brands all serve as tools of classification. While cynics might note that this stance bears some similarities to the carnival sideshows of P. T. Barnum—it does not matter what is behind the curtain as long as there is a line out front (Gamson, 1994; Lasch, 1979)—these belief systems survive because they continue to speak to a deep desire for structure; the myth of advertising works for the same reasons that we believe in high art, higher education, and higher powers. Twitchell supports this idea by mentioning that “the real force of [the culture of advertising] is felt where we least expect it:  in our nervous system, in our shared myths, in our concepts of self, and in our marking of time” (1996, p. 124). Constructs like advertising or entertainment, it seems, not only allow us to assemble a framework through which we understand our world, but also continually inform us about who we are (or who we should be) through a collection of narratives that influences the greater perceptions of individuals in a manner reminiscent of the role of television in Cultivation Theory (Gerbner & Gross, 1976). This process of ordering and imbuing value ultimately demonstrates how overarching ideologies can not only create culture but also act to shape it, as evidenced by the ability of the aforementioned concepts to consume and/or reference previously shared cultural knowledge while simultaneously contributing to the cultural milieu.

Given our reconsideration of mid-century cultural critiques, it follows that we should also reevaluate proposed solutions to the adverse issues present within mass culture. We recall the advice of A Face in the Crowd’s Mel Miller (i.e., “We get wise to them”) and reject its elitist overtones while remaining mindful of its core belief. We recognize that priding ourselves on being smart enough to see through the illusions present in mass culture, while pitying those who have yet to understand how they are being herded like so many sheep, makes us guilty of the narcissism we once ascribed to the masses—and perhaps even more dangerous than the uneducated because we are convinced that we know better. We see that aspects of mass culture address deeply embedded desires and that our best hope for improving culture is to satisfy these needs while educating audiences so that they can better understand how and why media affects them. Our job as critics is to encourage critical thinking on the part of audiences, dissecting media and presenting it to individuals so that they can make informed choices about their consumption patterns; our challenge is to convincingly demonstrate that engagement with media is a crucial and fundamental part of the process. If we subscribe to these principles, we can preserve the masses’ autonomy and not merely replace one dominant ideology with another.


[1] It should be noted, however, that the comics of this time—those that belong to the end of the Golden Age and beginning of the Silver Age—also provide an additional understanding of the ways in which Americans indirectly wrestled with their fears.

[2] For a more exhaustive list of movies that support this point, see Wolfe, 2002.

[3] Let us also not forget the fact that Lonesome exhibits a rather patronizing attitude toward his audience in his later career, instituting the Cracker Barrel show with its manufactured country lackeys (Yates, 1974). In contrast to his first stint in Memphis, Lonesome has begun to embrace his country image as a means (if an inauthentic one) to connect with his audience, a point of contention to which we will return.

[4] Curiously, however, we see that this relationship between presidential addresses (like the aforementioned fireside chats) and mass media did not elicit notable complaints from critics who were generally wary of the merging of politics and entertainment (Quart, 1989; Benjamin, 1973). Although a larger discussion is warranted regarding the subtleties of this distinction, I would suggest that part of the differentiation stems from a high-low culture dichotomy. Although critics linked the negative presence of television with corporate advertising, James Twitchell suggests that there has always been a rather intimate relationship between arts and commerce, most saliently exhibited by wealthy citizens or entities who act as patrons (Twitchell, 1996).

[5] Certainly being a woman did not help here, as American women are typically subject to a “halo effect” wherein their attractiveness (i.e., appearance) affects how they are perceived (Kaplan, 1978).

[6] Palin has continued the trend, currently employing the term “mama grizzlies,” a call to arms that hopes to rally women to fight to protect the things they believe in. Interestingly, a term that reaffirms the traditional role of women as nurturing matriarchs has been linked to feminist movements, a move that seems to confuse the empowerment of women with a socially conservative construct of their role in American life (Dannenfelser, 2010).

[7] We can also see much work conducted in the realm of fan studies that supports the practice of subversive readings or “textual poaching,” a term coined by Henry Jenkins (1992), in order to discuss contemporary methods of meaning making and resistance by fans.