Thoughts from my study of Horror, Media, and Narrative

Posts tagged “Celebrity”

The Most Important Product Is You!

“The Culture Industry” seems to be one of those seminal pieces in the canon of Cultural Studies that elicits a visceral (and often negative) reaction from modern scholars. Heavily influenced by the Birmingham School, generations of scholars have been encouraged to recognize agency in audiences, with the Frankfurt School often placed in direct opposition to the ideals of modern inquiry. Read one way, Horkheimer and Adorno appear elitist, privileging what has come to be known as “high culture” (e.g., classical music and fine art) over the entertainment of the masses. Horkheimer and Adorno argue that the culture industry creates a classification scheme in which man exists; whereas man previously struggled to figure out his place in the world, this job is done for him by the culture industry and its resultant structure of artificial stratification. Ultimately, then, because he does not have to think about his position in culture, man is not engaged in his world in the same way as he was before, which therefore allows content to become formulaic and interchangeable.

Later echoed in Adorno’s “How to Look at Television,” “The Culture Industry” laments the predictable pattern of televisual media, with audiences knowing the ending of movies as soon as they begin. (Interestingly, there is some overlap between this and Horror, with audiences expecting that offerings follow a convention—one might even argue that the “twist ending” has become its own sort of genre staple—and that a movie’s failure to follow their expectations leaves the audience disgruntled. This of course raises questions about whether modern audiences have been conditioned to expect certain things out of media or to engage in a particular type of relationship with their media, and whether plot progression, at least in part, defines the genre.) Horkheimer and Adorno’s attitude speaks to a privileging of original ideas (and the intellectual effort that surrounds them), but the modern context seems to suggest that the combination of preexisting ideas in a new way holds some sort of cultural value.

Adorno’s “How to Look at Television” also points out a degradation in our relationship to media by highlighting the transition from inward-facing to outward-facing stances, equating such migration with movement away from subtlety. Although the point itself may very well be valid, it does not include a robust discussion of print versus televisual media: Adorno’s footnote that mentions the different affordances of media (i.e., print allows for contemplation and mirrors introspection while television/movies rely on visual cues due to their nature as visual media) deserves further treatment, as the implications of these media forms likely have repercussions on audience interactions with them. Almost necessarily, then, we see a televisual viewing practice that does not typically rely on subtlety due to a different form of audience/media interaction. (It might also be noted that the Saw movies have an interesting take on this in that they pride themselves on leaving visual “breadcrumbs” for viewers to discover upon repeated viewings, although these efforts are rarely necessary for plot comprehension.)

To be fair, however, one might argue that Horkheimer and Adorno wrote in a radically different media context. Sixty years later, we might argue that there’s not that much left to discover and that prestige has now been shifted to recombinations of existent information. Moreover, Horkheimer and Adorno’s position also assumes a particular motivation of the audience (i.e., that the payoff is the conclusion instead of the journey) that may no longer be completely true for modern viewers.

Although Horkheimer and Adorno rightly raise concerns regarding a lack of independent thinking (or even the expectation of it!), we are perhaps seeing a reversal of this trend with transmedia and attempts at audience engagement. Shows now seem designed to get people talking about them (message boards, Twitter, etc.) in order to keep audiences invested, and although we might quibble about the quality of such discourse and whether it is genuine or reactionary, it seems that this practice must be reconciled with Horkheimer and Adorno’s original position. It should be noted, however, that the technology on which such interaction relies was not around when Horkheimer and Adorno wrote “The Culture Industry” and the Internet has likely helped to encourage audience agency (or at least made it more visible).

Seeking to challenge the notion that Horkheimer and Adorno discounted audience agency, John Durham Peters argues for the presence of both industry and audience influence in the space of culture and, furthermore, that while audiences may be empowered, their actions serve to reinforce their submission to the dominant wishes of industry in a realization of hegemonic practice. Although Horkheimer and Adorno, writing in the shadow of World War II, were undoubtedly concerned with the potential undue influence of mass media as a vehicle for fascist ideology—as evidenced by quotes such as “The radio becomes the universal mouthpiece of the Führer” and “The gigantic fact that the speech penetrates everywhere replaces its content”—they were also concerned that the public had relinquished its ability to resist by choosing to pursue frivolous entertainment rather than freedom (Adorno, 1941). From this position, Peters extracts the argument that Horkheimer and Adorno did in fact recognize agency on the part of audiences, but also that such energies were misspent.

The notion of “the masses” has long been an area of interest for me as it manifests throughout suburban Gothic horror in the mid-20th century. In many ways, society was struggling to come to terms with new advances in technology and with how these inventions would bring about changes in practice and structure. Below is an excerpt from a longer piece about a movie that also grappled with some of these issues.

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have an almost hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation (1962). For cultural critics still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed.

Although these underlying anxieties of mid-century American society could be examined via a wide range of anthropological lenses and frameworks, visual media has historically provided a particularly vivid manifestation of the fears latent in the people of the United States (Haskell, 2004). This is, of course, not to imply that visual media is necessarily the best or only means by which we can understand prevailing ideologies in the years after World War II, but merely one of the most visible. However, as a critical examination of the entire media landscape of the 1950s would be beyond the scope of a single paper, discussion shall be primarily concentrated around Elia Kazan’s 1957 movie A Face in the Crowd with particular attention paid to the contrasting channels of cinema and television.[1] This paper will seek to briefly position A Face in the Crowd in the larger context of paranoia-driven cinema of the 1950s before using the film as an entryway to discuss critiques of mass culture. Given the film’s apparent sustained resonance as indicated by its relatively recent mention (Vallis, 2008; Hoberman, 2008b; Franklin, 2009), the arguments of Critical Theory will then be applied to modern American culture in an attempt to ascertain their continued validity. Finally, an argument will be made that acknowledges the potential dangers facing mass culture in the 21st century but also attempts to understand the processes that underlie these pitfalls and provides a suggestion for recourse in the form of cultural and media literacy.

Paranoia, Paranoia, Everyone’s Coming to Get Me

The post-war prosperity of the 1950s caused rapid changes in America, literally altering the landscape as families began to flood into the newly-formed suburbs. With the dream and promise of upward social mobility firmly ensconced in their heads, families rushed to claim their piece of the American dream, replete with the now-iconic front yard and white picket fence. And yet, ironically, a new set of worries began to fester underneath the idyllic façade of the suburbs as the troubles of the city were merely traded for paranoia and fears of invasion; the very act of flight led to entrapment by an ethos that subtly precluded the possibility of escape.

As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia, there remained a stubborn fear that the quickly shifting nature of society might have had unanticipated and unforeseen effects (Murphy, 2009). Life in the suburbs, it seemed, was too good to be true and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety:  from threats of invasion to worries about conformity, and from dystopian futures to a current reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of being a society under siege. An overwhelming sense of doubt, and more specifically, paranoia, characterized the age and latent fears manifested in media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be. In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts, all of which filled the movie screens of the decade (Jensen, 1971; Murphy, 2009; Wolfe, 2002).[2] Following solidly in this tradition, Kazan’s A Face in the Crowd picks up on some of the latent strains of paranoia in American culture while simultaneously serving as a platform for a set of critiques regarding mass culture.

Somewhere, a Star Is Made

The storyline of A Face in the Crowd is rather straightforward and yet deceptively complex in its undertones:  on the surface, we experience a rather heavy-handed morality tale in the form of country bumpkin Larry “Lonesome” Rhodes, a relative nobody who is plucked from obscurity and made (and subsequently broken) through powers associated with television. Yet, it is only when we begin to connect the movie to a larger societal context that we begin to understand the ramifications of the film’s message; a careful examination of A Face in the Crowd reveals striking suspicions regarding the role that media plays (in this case, primarily television and cinema) in shaping American culture. Stars, director Elia Kazan argues, are not so much born as made, a distinction that portends dire consequences.

It is worth noting that Kazan’s film was made during a time when the concept of the “celebrity” was being renegotiated by America; for a large part of its history, the United States, firmly grounded in a Puritan work ethic, had honored heroes who exemplified ideals associated with a culture of production and was struggling to reconcile these notions in the presence of an environment whose emphasis was now focused on consumption. Although modern audiences might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status (Gamson, 1994; Benjamin, 1973).

Such is the case, it seems, with Larry Rhodes, who is discovered by roving reporter Marcia Jeffries in an Arkansas jail. Although it cannot be denied that Rhodes has some modicum of talent and a certain charisma that comes from being unafraid to speak one’s mind, Marcia ushers Rhodes onto the path of greatness by dubbing him “Lonesome” and thus creates a character that transforms Rhodes from a despondent drunk to a winsome drifter. This scene—the first major one in the movie—thus introduces the important notion that those involved in the media can be implicitly involved in the manipulation of the information that travels over the airwaves. Subtly adding to the insidious nature of the media, A Face in the Crowd portrays Marcia as a character who seems likable enough, but also a person who is, in a way, exploiting the people in jail as she rushes in with her tape recorder intent on prying the stories from the characters she finds (or creates!) and does not exhibit much concern in truly understanding why these men are imprisoned in the first place. Taken to an extreme, we later come across the character of The General, who further perverts the connection between media and power as he conspires with Lonesome to remake the image of Senator Worthington Fuller as the congressman runs for President.

Yet, as Lonesome Rhodes grows in his role as a media personality, he quickly demonstrates that the power to manipulate does not lie solely with those who sit behind the cameras. In Memphis, Rhodes incites a riot against the Luffler mattress company and also solicits donations in order to help a Black woman rebuild her house. In light of this, we can see that while Kazan focuses on the negative implications of television and celebrity, the relative good or bad that comes from these actions is not necessarily the point—instead, the one constant in all of the depicted scenarios is a public who is manipulated into performing actions on the behalf of others. Although the characters of Lonesome and The General are vilified throughout the film, it is the masses for whom Kazan demonstrates true disdain.

Extraordinary Popular Delusions

Perhaps nowhere is this contempt more apparent than at the end of the film where, in an attempt to offer a small moment of solace to Marcia after her unmasking of Lonesome, writer Mel Miller notes, “We get wise to them, that’s our strength” (Kazan, 1957). And Miller is not wrong: Western tradition has long recognized the correlation between knowledge and power, and Miller’s assertion touches upon the revelatory clout inherent in the realignment of perception and reality as noted by public relations guru Howard Bragman (2008). A more critical examination of the film’s closing scene, however, raises an important question: Who is Miller’s “we”? Although one might be tempted to read this line as indicative of an egalitarian philosophical view, it is important to note that the only two characters in the shot represent the film’s arguably upper-middle class, and pointedly Eastern-educated, elite—nowhere to be seen are representatives of the small Arkansas town from the film’s opening or denizens of Memphis, both of whom serve to characterize the majority of Lonesome’s devoted viewers.[3] In fact, if we take time to reflect upon the movie, we realize that the majority of the audience was only alerted to Lonesome’s dual nature after Marcia flipped a control room switch and revealed the underlying deterioration; the masses oscillated from one position to the next without understanding how or why and once again adopted a passive stance in their relationship with media. Moreover, as Courtney Maloney points out, Kazan’s depiction of the agency of the masses is actually limited in scope: despite a montage of audience members vehemently phoning in, sponsors are simultaneously shown to be acting independently as they withdraw their association with Lonesome (1999). Furthermore, the subtext of the scene distances the rational decision-making of the truly powerful from the impassioned beseeching of the masses, likening the power of the latter to that of a mob.
Knowledge and its associated authority, clearly, are afforded to a select group.

This idea, that the world can be divided between those who “get wise” and those who do not, serves to develop a rather sharp classist criticism against the medium of television and those who would watch it: moviegoers, by virtue of witnessing Kazan’s work, find themselves elevated in status and privy to “the man behind the curtain” (to borrow a phrase). In contrast, the malleable masses were considered to be pacified and placated by idealistic portrayals of life in the 1950s in the form of television programs like Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Clearly, Kazan creates a dichotomy imbued with a value judgment descended from the thoughts of prominent thinkers in the Frankfurt School who, as far as aesthetics were concerned, preferred the high culture of cinema to the conformity and manipulated tastes of television (Horkheimer & Adorno, 2002; Adorno, 1985; Quart, 1989). This distinction between high and low culture would be a crucial supporting idea for critics, as a prominent fear of mass culture was that it portended a collapse between concepts (e.g., fame, celebrity, or intellectual value) of objectively different quality, essentially rendering all manifestations the same and therefore all equally mundane (Boorstin, 1962; Hoberman, 2008b; Kierkegaard, 1962). Even worse for critics, perhaps, was the perception of the masses’ refusal to grow out of their immature interests, a behavior that was characterized as both childlike and stubborn (Adorno, 1985).

And the fears of such theorists, all of whom were reacting to recent and rapid advances in broadcast technology, were not unfounded. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force on society that spurred on the destruction of culture instead of enriching it.[4] For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, aggrandized by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the already present anxiety-ridden ethos of the United States during the 1950s.


[1] It should be noted, however, that the comics of this time—those that belong to the end of the Golden Age and beginning of the Silver Age—also provide an additional understanding of the ways in which Americans indirectly wrestled with their fears.

[2] For a more exhaustive list of movies that support this point, see Wolfe, 2002.

[3] Let us also not forget the fact that Lonesome exhibits a rather patronizing attitude toward his audience in his later career, instituting the Cracker Barrel show with its manufactured country lackeys (Yates, 1974). In contrast to his first stint in Memphis, Lonesome has begun to embrace his country image as a means (if an inauthentic one) to connect with his audience, a point of contention to which we will return.

[4] Curiously, however, we see that this relationship between presidential addresses (like the aforementioned fireside chats) and mass media did not elicit notable complaints from critics who were generally wary of the merging of politics and entertainment (Quart, 1989; Benjamin, 1973). Although a larger discussion is warranted regarding the subtleties of this distinction, I would suggest that part of the differentiation stems from a high-low culture dichotomy. Although critics linked the negative presence of television with corporate advertising, James Twitchell suggests that there has always been a rather intimate relationship between arts and commerce, most saliently exhibited by wealthy citizens or entities who act as patrons (Twitchell, 1996).

 

Works Cited

Adorno, T. (1941). On Popular Music. Studies in Philosophy and Social Science, 9, 17-48.

Adorno, T. (1985). On the Fetish Character in Music and the Regression of Listening. In A. Arato, & E. Gebhardt (Eds.), The Essential Frankfurt School Reader (pp. 270-299). New York, NY: Continuum.

Benjamin, W. (1973). The Work of Art in the Age of Mechanical Reproduction. In H. Arendt (Ed.), Illuminations (H. Zohn, Trans., pp. 217-242). London, England: Schocken.

Boorstin, D. (1962). The Image: A Guide to Pseudo-Events in America. New York, NY: Atheneum.

Bragman, H. (2008). Where’s My Fifteen Minutes?: Get Your Company, Your Cause, or Yourself the Recognition You Deserve. New York, NY: Portfolio.

Gamson, J. (1994). Claims to Fame: Celebrity in Contemporary America. Berkeley: University of California Press.

Haskell, M. (2004, August 8). Whatever the Public Fears Most, It’s Right Up There on the Big Screen. The New York Times, pp. 4-5.

Horkheimer, M., & Adorno, T. W. (2002). Dialectic of Enlightenment: Philosophical Fragments. Stanford, CA: Stanford University Press.

Jensen, P. (1971). The Return of Dr. Caligari. Film Comment, 7(4), 36-45.

Kazan, E. (Director). (1957). A Face in the Crowd [Motion Picture].

Maloney, C. (1999). The Faces in Lonesome’s Crowd: Imaging the Mass Audience in “A Face in the Crowd”. Journal of Narrative Theory, 29(3), 251-277.

Mattson, K. (2003). Mass Culture Revisited: Beyond Tail Fins and Jitterbuggers. Radical Society, 30(1), 87-93.

Murphy, B. M. (2009). The Suburban Gothic in American Popular Culture. Basingstoke, Hampshire, England: Palgrave Macmillan.

Quart, L. (1989). A Second Look. Cineaste, 17(2), 30-31.

Wolfe, G. K. (2002). Dr. Strangelove, Red Alert, and Patterns of Paranoia in the 1950s. Journal of Popular Film, 57-67.


Love Me or Hate Me, Still an Obsession

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have an almost hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation (1962). For cultural critics still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed.

Although these underlying anxieties of mid-century American society could be examined via a wide range of anthropological lenses and frameworks, visual media has historically provided a particularly vivid manifestation of the fears latent in the people of the United States (Haskell, 2004). This is, of course, not to imply that visual media is necessarily the best or only means by which we can understand prevailing ideologies in the years after World War II, but merely one of the most visible. However, as a critical examination of the entire media landscape of the 1950s would be beyond the scope of a single paper of this magnitude, discussion shall be primarily concentrated around Elia Kazan’s 1957 movie A Face in the Crowd with particular attention paid to the contrasting channels of cinema and television.[1] This paper will seek to briefly position A Face in the Crowd in the larger context of paranoia-driven cinema of the 1950s before using the film as an entryway to discuss critiques of mass culture. Given the film’s apparent sustained resonance as indicated by its relatively recent mention (Vallis, 2008; Hoberman, 2008b; Franklin, 2009), the arguments of Critical Theory will then be applied to modern American culture in an attempt to ascertain their continued validity. Finally, an argument will be made that acknowledges the potential dangers facing mass culture in the 21st century but also attempts to understand the processes that underlie these pitfalls and provides a suggestion for recourse in the form of cultural and media literacy.

Paranoia, Paranoia, Everyone’s Coming to Get Me

The post-war prosperity of the 1950s caused rapid changes in America, literally altering the landscape as families began to flood into the newly-formed suburbs. With the dream and promise of upward social mobility firmly ensconced in their heads, families rushed to claim their piece of the American dream, replete with the now-iconic front yard and white picket fence. And yet, ironically, a new set of worries began to fester underneath the idyllic façade of the suburbs as the troubles of the city were merely traded for fears of paranoia and invasion; the very act of flight led to entrapment by an ethos that subtly precluded the possibility of escape.

As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia, there remained a stubborn fear that the quickly shifting nature of society might have had unanticipated and unforeseen effects (Murphy, 2009). Life in the suburbs, it seemed, was too good to be true and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety:  from threats of invasion to worries about conformity, and from dystopian futures to a current reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of being a society under siege. An overwhelming sense of doubt, and more specifically, paranoia, characterized the age and latent fears manifested in media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be. In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts, all of which filled the movie screens of the decade (Jensen, 1971; Murphy, 2009; Wolfe, 2002).[2] Following solidly in this tradition, Kazan’s A Face in the Crowd picks up on some of the latent strains of paranoia in American culture while simultaneously serving as a platform for a set of critiques regarding mass culture.

Somewhere, a Star Is Made

The storyline of A Face in the Crowd is rather straightforward and yet deceptively complex in its undertones:  on the surface, we experience a rather heavy-handed morality tale in the form of country bumpkin Larry “Lonesome” Rhodes, a relative nobody who is plucked from obscurity and made (and subsequently broken) through powers associated with television. Yet, it is only when we begin to connect the movie to a larger societal context that we begin to understand the ramifications of the film’s message; a careful examination of A Face in the Crowd reveals striking suspicions regarding the role that media plays (in this case, primarily television and cinema) in shaping American culture. Stars, director Elia Kazan argues, are not so much born as made, a distinction that portends dire consequences.

It is worth noting that Kazan’s film was made during a time when the concept of the “celebrity” was being renegotiated by America; for a large part of its history, the United States, firmly grounded in a Puritan work ethic, had honored heroes who exemplified ideals associated with a culture of production and was struggling to reconcile these notions in the presence of an environment whose emphasis was now focused on consumption. Although modern audiences might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today:  in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system could would only result in individuals who were lacking and unworthy of their status (Gamson, 1994; Benjamin, 1973)

Such is the case it seems, with Larry Rhodes, who is discovered by roving reporter Marcia Jeffries in an Arkansas jail. Although it cannot be denied that Rhodes has some modicum of talent and a certain charisma that comes from being unafraid to speak one’s mind, Marcia ushers Rhodes onto the path of greatness by dubbing him “Lonesome” and thus creates a character that transforms Rhodes from a despondent drunk to a winsome drifter. This scene—the first major one in the movie—thusly introduces the important notion that those involved in the media can be implicitly involved in the manipulation of the information that travels over the airwaves. Subtly adding to the insidious nature of the media, A Face in the Crowd portrays Marcia as a character that seems likable enough, but also a person who is, in a way, exploiting the people in jail as she rushes in with her tape recorder intent on prying the stories from the characters she finds (or creates!) and does not exhibit much concern in truly understanding why these men are imprisoned in the first place. Taken to an extreme, we later come across the character of The General, who further perverts the connection between media and power as he conspires with Lonesome to remake the image of Senator Worthington Fuller as the congressman runs for President.

Yet, as Lonesome Rhodes grows in his role as a media personality, he quickly demonstrates that the power to manipulate does not lie solely with those who sit behind the cameras. In Memphis, Rhodes incites a riot against the Luffler mattress company and also solicits donations in order to help a Black woman rebuild her house. In light of this, we can see that while Kazan focuses on the negative implications of television and celebrity, that the relative good or bad that comes from these actions is not necessarily the point—instead, the one constant in all of the depicted scenarios is a public who is manipulated into performing actions on the behalf of others. Although the characters of Lonesome and The General are vilified throughout the film, it is the masses for which Kazan demonstrates true disdain.

Extraordinary Popular Delusions

Perhaps nowhere is this contempt more apparent than at the end of the film where, in an attempt to offer a small moment of solace to Marcia after her unmasking of Lonesome, writer Mel Miller notes, “We get wise to them, that’s our strength” (Kazan, 1957). And Miller is not wrong:  Western tradition has long recognized the correlation between knowledge and power, and Miller’s assertion touches upon the revelatory clout inherent in the realignment of perception and reality as noted by public relations guru Howard Bragman (2008). A more critical examination of the film’s closing scene, however, raises an important question:  Who is Miller’s “we”? Although one might be tempted to read this line as indicative of an egalitarian philosophical view, it is important to note that the only two characters in the shot represent the film’s arguably upper-middle-class, and pointedly Eastern-educated, elite—nowhere to be seen are representatives of the small Arkansas town from the film’s opening or denizens of Memphis, both of whom serve to characterize the majority of Lonesome’s devoted viewers.[3] In fact, if we take time to reflect upon the movie, we realize that the majority of the audience was only alerted to Lonesome’s dual nature after Marcia flipped a control room switch and revealed the underlying deterioration; the masses oscillated from one position to the next without understanding how or why and once again adopted a passive stance in their relationship with media. Moreover, as Courtney Maloney points out, Kazan’s depiction of the agency of the masses is actually limited in scope:  despite a montage of audience members vehemently phoning in, sponsors are simultaneously shown to be acting independently as they withdraw their association with Lonesome (1999). Furthermore, the subtext of the scene distances the rational decision-making of the truly powerful from the impassioned beseeching of the masses, likening the power of the latter to that of a mob.
Knowledge and its associated authority, clearly, are afforded to a select group.

This idea, that the world can be divided between those who “get wise” and those who do not, serves to develop a rather sharp classist criticism against the medium of television and those who would watch it:  moviegoers, by virtue of witnessing Kazan’s work, find themselves elevated in status and privy to “the man behind the curtain” (to borrow a phrase). In contrast, the malleable masses were considered to be pacified and placated by idealistic portrayals of life in the 1950s in the form of television programs like Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Clearly, Kazan creates a dichotomy imbued with a value judgment descended from the thoughts of prominent thinkers in the Frankfurt School who, as far as aesthetics were concerned, preferred the high culture of cinema to the conformity and manipulated tastes of television (Horkheimer & Adorno, 2002; Adorno, 1985; Quart, 1989). This distinction between high and low culture would be a crucial supporting idea for critics as a prominent fear of mass culture was that it portended a collapse between concepts (e.g., fame, celebrity, or intellectual value) of objectively different quality, essentially rendering all manifestations the same and therefore all equally mundane (Boorstin, 1962; Hoberman, 2008b; Kierkegaard, 1962).  Even worse for critics, perhaps, was the perception of the masses’ refusal to grow out of its immature interests, a behavior that was characterized as both childlike and stubborn (Adorno, 1985).

And the fears of such theorists, all of whom were reacting to recent and rapid advances in broadcast technology, were not unfounded. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force on society that spurred on the destruction of culture instead of enriching it.[4] For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to those of Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, magnified by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the anxiety-ridden ethos already present in the United States during the 1950s.


[1] It should be noted, however, that the comics of this time—those that belong to the end of the Golden Age and beginning of the Silver Age—also provide an additional understanding of the ways in which Americans indirectly wrestled with their fears.

[2] For a more exhaustive list of movies that support this point, see Wolfe, 2002.

[3] Let us also not forget the fact that Lonesome exhibits a rather patronizing attitude toward his audience in his later career, instituting the Cracker Barrel show with its manufactured country lackeys (Yates, 1974). In contrast to his first stint in Memphis, Lonesome has begun to embrace his country image as a means (if an inauthentic one) to connect with his audience, a point of contention to which we will return.

[4] Curiously, however, we see that this relationship between presidential addresses (like the aforementioned fireside chats) and mass media did not elicit notable complaints from critics who were generally wary of the merging of politics and entertainment (Quart, 1989; Benjamin, 1973). Although a larger discussion is warranted regarding the subtleties of this distinction, I would suggest that part of the differentiation stems from a high-low culture dichotomy. Although critics linked the negative presence of television with corporate advertising, James Twitchell suggests that there has always been a rather intimate relationship between arts and commerce, most saliently exhibited by wealthy citizens or entities who act as patrons (Twitchell, 1996).


Light Up the Sky Like a Flame

But what is reality television? Although the genre seems to defy firm definition, we, like Justice Stewart, instinctively “know it when [we] see it.” The truth is that reality television spans a range of programming, from clip shows like America’s Funniest Home Videos, to do-it-yourself offerings on The Food Network, investigative reporting on newsmagazines like 60 Minutes, the docu-soap Cops, and many other sub-genres in between, including the reality survival competition that forms the basis for The Hunger Games. Although a complete dissection of the genre is beyond the scope of this chapter—indeed, entire books have been written on the subject—reality television and its implications will serve as a lens through which we can begin to understand how Katniss experiences the profound effects of image, celebrity, and authenticity throughout The Hunger Games.

She Hits Everyone in the Eye

For the residents of Panem, reality television is not just entertainment—it is a pervasive cultural entity that has become inseparable from citizens’ personal identity. Although fans of The Hunger Games can likely cite overt allusions to reality television throughout the series, the genre also invokes a cultural history rife with unease regarding the mediated image in the United States.

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have a hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, had become increasingly prominent throughout the 19th century as part of the Graphic Revolution, replete with the power to dissociate the real from its representation. Boorstin argued that although the mass reproduction of images might provide increased levels of access for the public, the individual significance of the images declined as a result of their replication; as the number of images increased, the importance they derived from their connection to the original subject became more diffuse. And, once divorced from their original context, the images became free to take on a meaning all their own. Employing the term “pseudo-event” to describe an aspect of this relationship, Boorstin endeavored to illuminate shifting cultural norms that had increasingly come to consider the representation of an event more significant than the event itself.

Katniss unwittingly touches upon Boorstin’s point early in The Hunger Games, noting that the Games exert their control by forcing Tributes from the various districts to kill one another while the rest of Panem looks on. Katniss’ assertion hints that The Hunger Games hold power primarily because they are watched, voluntarily or otherwise; in a way, without a public to witness the slaughter, none of the events in the Arena matter. Yet, what Katniss unsurprisingly fails to remark upon, given the seemingly ever-present nature of media in Panem, is that the events of The Hunger Games are largely experienced through a screen; although individuals may witness the Reaping or the Tributes’ parade in person, the majority of their experiences result from watching the Capitol’s transmissions. Without the reach of a broadcast medium like television (or, in modern culture, streaming Internet video), the ability of The Hunger Games to effect subjugation would be limited in scope, for although the Games’ influence would surely be felt by those who witnessed such an event in person, the intended impact would rapidly decline as it radiated outward. Furthermore, by formulating common referents, a medium like television facilitates the development of a mass culture, which, in the most pessimistic conceptualizations, represents a passive audience ripe for manipulation. For cultural critics of the Frankfurt School (1923-1950s), who were still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed. Although the exact nature of modern audiences is up for debate, with scholars increasingly championing viewers’ active participation with media, Panem has seemingly realized a deep-seated fear of the Frankfurt School. It would appear, then, that The Hunger Games function as an oppressive force precisely because of their status as a mediated spectacle of suffering.

But perhaps we should not be so hard on Katniss. Having grown up in an environment that necessitated the cultivation of skills like hunting and foraging, Katniss begins with a perspective firmly grounded in a world based on truth. Plants, for example, must be checked (and double-checked!) to ensure their genuineness, lest a false bite result in death. In order for Katniss to survive, not only must she be able to identify plants but she must also trust in their authenticity; prior to her experience in the Arena, Katniss undoubtedly understands the world in rather literal terms, primarily concerned with objects’ functional or transactional value. However, as hinted by Boorstin, additional layers of meaning exist beyond an item’s utility—layers that Katniss has not yet been trained to see.

Echoing portions of Boorstin’s previous work, French philosopher Jean Baudrillard conceptualized four types of value that objects could possess in modern society: functional, transactional, symbolic, and sign. Admittedly a more complex theory than the description provided herein, we can momentarily consider how Baudrillard’s value categories of “functional” and “transactional” might align with Boorstin’s previously introduced concept of the “real,” while “symbolic” and “sign” evidence an affinity toward “representation.” Whereas the functional and transactional value of items primarily relates to their usefulness, the categories of “symbolic” and “sign” are predominantly derived as a result of the objects’ relationship to other objects (sign) or to actors (symbolic). Accordingly, being relatively weak in her comprehension of representation’s nuances, Katniss characteristically makes little comment on Madge’s gift of a mockingjay pin. However, unbeknownst to Katniss (and most likely Madge herself), Madge has introduced one of the story’s first symbols, in the process imbuing the pin with an additional layer of meaning. Not just symbolic in a literary sense, the mockingjay pin gains significance because it is attached to Katniss, an association that will later bear fruit as fans well know.

Before moving on, let’s revisit the import of The Hunger Games in light of Baudrillard: what is the value of the Games? Although some might rightly argue that The Hunger Games perform a function for President Snow and the rest of the Capitol, this is not the same as saying the Games hold functional value in the framework outlined by Baudrillard. The deaths of the Tributes, while undeniably tragic, do not in and of themselves fully account for The Hunger Games’ locus of control. In order to supplement Boorstin’s explanation of how The Hunger Games act to repress the populace with the why, Baudrillard might point to the web of associations that stem from the event itself: in many ways, the lives and identities of Panem’s residents are defined in terms of a relationship with The Hunger Games, meaning that the Games possess an enormous amount of value as a sign. The residents of the Capitol, for example, evidence a fundamentally different association with The Hunger Games, viewing it as a form of entertainment or sport, while the denizens of the Districts perceive the event as a grim reminder of a failed rebellion. Holding a superficial understanding of The Hunger Games’ true import when we first meet her, Katniss could not possibly comprehend that her destiny is to become a symbol, for the nascent Katniss clearly does not deal in representations or images. Katniss, at this stage in her development, could not be the famed reality show starlet known as the “girl on fire” even if she wanted to.

By All Accounts, Unforgettable

Returning briefly to reality television, we see that Panem, like modern America, finds itself inundated with the genre, whose pervasive tropes, defined character (stereo)types, and ubiquitous catchphrases have indelibly affected us as we subtly react to what we see on screen. Although we might voice moral outrage at offerings like The Jersey Shore or decry the spate of shows glamorizing teen pregnancy, perhaps our most significant response to unscripted popular entertainment is a fundamental shift in our conceptualization of fame and celebrity. Advancing a premise that promotes the ravenous consumption of otherwise nondescript “real” people by a seemingly insatiable audience, reality television forwards the position that anyone—including us!—can gain renown if we merely manage to get in front of a camera. Although the hopeful might understand this change in celebrity as democratizing, the cynic might also argue that fame’s newfound accessibility indicates its relative worthlessness in the modern age; individuals today can, as the saying goes, simply be famous for being famous.

Encapsulated by Mark Rowlands’ term “vfame,” the relative ease of an unmerited rise in reputation indicates how fame in the current cultural climate has become largely divorced from its original association with distinguished achievement. Although traditional vestiges of fame have not necessarily disappeared, it would appear that vfame has become a prominent force in American culture—a development of which Katniss would surely disapprove. Recalling, in part, Kierkegaard’s thoughts on nihilism, vfame arises from an inability of people to distinguish quality (or perhaps a lack of concern in doing so), resulting in all things being equally valuable and hence equally unimportant. This, in rather negative terms, is the price that we pay for the democratization of celebrity: fame—or, more accurately, vfame—is uniformly available to all in a manner that mirrors a function of religion and yet promises a rather empty sort of transcendence. Although alluring, vfame is rather unstable, as it is tied to notions of novelty and sensation as opposed to fame, which is grounded by its association with real talent or achievement; individuals who achieve vfame typically cannot affect the longevity of their success in substantial terms, as they were not instrumental in its creation to begin with. Stars in the current age, as it were, are not born so much as made. Moreover, the inability of the public to distinguish quality leads us to focus on the wrong questions (and, perhaps worse, to not even realize that we are asking the wrong questions) in ways that have very real consequences; although vfame and its associated lapse in thinking might be most obvious in the realm of celebrities, it also manifests in other institutions such as politics. As a culture obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

Born out of an early to mid-20th century society in which the concept of the “celebrity” was being renegotiated by America, concepts like vfame built upon an ingrained cultural history of the United States that was firmly steeped in a Puritan work ethic. Americans, who had honored heroes exemplifying ideals associated with a culture of production, were struggling to reconcile these notions in the presence of an environment now focused on consumption. Although Katniss, as proxy for modern audiences, might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status. To this day, our relationship with celebrities is a tenuous and complex one at best, for although we celebrate the achievements of some, we continue to flock to the spectacle created by the public meltdown of others, unable or unwilling to help; we vacillate between positions of adulation, envy, contempt, and pity, ever poised for outrage but all too willing to forgive.

Perhaps it should come as no surprise that reality television puts us a little on edge, as the genre represents a fundamental blurring of fact and fiction. Celebrities, we see, are just like us—just like our neighbors, who, through the magic of reality television, can become stars! Ever-shifting classifications leave us on unstable ground. But also consider the aforementioned philosophy of Boorstin: stars are, among other things, individuals whose images are important enough to be reproduced, which causes “celebrity” to transition from a type of person to a description of how someone is represented in society. In other words, we witness a shift from a term that labels who someone is to a term that designates who someone seems to be. Celebrities, it might be argued, derive at least a portion of their power in modern culture because they embody a collection of images that has been imbued with some sort of significance. Ultimately, it seems that much of our unease with celebrity and fame centers on notions of authenticity.

All I Can Think of Are Hidden Things

Long before Katniss ever becomes a celebrity herself, she exhibits disdain for the Capitol and its residents, evidencing a particularly adverse reaction to things she considers artificial. As previously discussed, authenticity played a particular role in Katniss’ growth and her ability to survive: for Katniss, a false image literally represented a matter of life or death, for a lapse in judgment could have resulted in electrocution or poisoning. Accordingly, Katniss dismisses the strange colors of the Capitol along with the characteristic features of its citizens—stylists, in particular, strike her as grotesque—because she is not readily able to reconcile these visuals with her established worldview. As Katniss operates on a literal level, directly associating identity with appearance, the self can only present in one way (in this case, relatively unadorned) and maintain its authenticity.

Like Katniss, we too may be tempted to summarily reject the unfamiliar; our modern anxieties might best be encapsulated by the question: What to do with a problem like Lady Gaga? Perhaps the strongest contemporary mass image that mirrors the visual impact of the stylists on Katniss (followed closely by New York socialite Jocelyn Wildenstein), Lady Gaga suffers continual criticism for her over-the-top theatrical presentations. With dresses made from meat and Hello Kitty heads, it is all too easy to dismiss Lady Gaga as “attention-starved,” reducing her presence to the succinct “weird.” Yet, it seems rash to write off Lady Gaga and the world of fame as nothing more than frivolity and fluff, for pop culture is only as vapid as our disinclination to engage with it.

Consider, for example, how the Capitol and its residents (of whom a prominent one would undoubtedly be Lady Gaga) embody the spirit of Decadence, a particularly prominent theme in Victorian culture. A reaction to the 19th century movement of Romanticism, Decadence championed concepts like artifice, which served to demonstrate man’s ability to rebel against, and possibly tame, the natural order. Although this inclination toward the unnatural manifested in myriad ways, French poet and philosopher Charles Baudelaire viewed women’s use of cosmetics as a particular site of interest, for proper application did not just enhance a woman’s beauty but acted to transform her, allowing transcendence through artifice.

With this in mind, we begin to understand the innate control wielded by figures such as Cinna and Caesar Flickerman. Perceived as facile by some, these two men represent a class of individuals adept at understanding the power inherent in fame, reputation, celebrity, and appearance; in the Capitol, image mongers such as these hold sway. Although one reading of these characters plants them firmly in the realm of artifice, painting them as masters of emotional manipulation and spectacle, an alternate view might consider how these two have come to recognize a shift toward a new localized reality—one that Katniss must adapt to or perish.

And yet, despite their commonality, these two individuals also underscore fundamentally different approaches to image: Caesar (and, perhaps, by extension, the Capitol) wields his power in order to mask or redirect while Cinna endeavors to showcase a deep-seated quality through the management of reputation and representation. Coexisting simultaneously, these two properties of illusion mirror the complementary natures of Peeta and Katniss with regard to image. Peeta, skilled in physical camouflage, exhibits an emotional candidness that Katniss is initially unready, or unwilling, to match; Katniss, very much the inverse of Peeta, is characterized by traits associated with hunting, finding, and sight in the “real” world all while maintaining a level of emotive subterfuge. Over the course of the 74th Hunger Games, however, Katniss quickly learns to anticipate how her actions in the Arena will affect her representation and reputation beyond the battlefield. With the help of Haymitch, Katniss begins to better understand the link between a robust virtual self and a healthy physical one as she pauses for the cameras and plays up her affection for Peeta in exchange for much-needed rewards of food and medicine. As she matures, Katniss comes into alignment with Cinna and Caesar, individuals who, despite being participatory members of a system arguably deemed inauthentic, distinguish themselves from the majority of Panem by understanding how image works; Cinna and Caesar (and later Katniss) are not just powerful, but empowered and autonomous.

Herein lies the true import of Collins’ choice to weave the trope of reality television into the fabric of The Hunger Games: throughout the trilogy, the audience is continually called upon to question the nature of authenticity as it presents in the context of a media ecology. Ultimately, the question is not whether Katniss (or anyone else) maintains a sense of authenticity by participating in the games of the Capitol—trading a true self for a performed self—but rather how we might effect multiple presentations of self without being inauthentic. How does Katniss, in her quest to survive, embody Erving Goffman’s claims that we are constantly performing, altering our presentation as we attempt to cater to different audiences? Is Katniss truly being inauthentic, or does she ask us to redefine the concept of authenticity and its evaluation? Struggling with these very questions, users of social media today constantly juggle notions of authenticity and self-presentation, with platforms like Facebook and Twitter forming asynchronous time streams that seamlessly coexist alongside our real-life personas. Which one of these selves, if any, is authentic? Like Katniss, we are born into the world of the “real” without a ready ability to conceptualize the power latent in the virtual, consequently resenting what we do not understand.


On My OWN

 

Modern American culture finds itself infused with celebrities, typically thought of as Hollywood actors or reality show starlets. Increasingly, however, the moniker of “celebrity” is being applied to potentially unlikely individuals, giving rise to the “Celebrity CEO.” Beginning with a brief examination of the possible purpose and cultural function of the celebrity, this paper will then focus on Oprah Winfrey as a particular type of celebrity CEO who has created, and subsequently embodied, a lifestyle brand. Throughout the course of the paper, it will be argued that this strategy offers certain advantages to celebrity-endorsed endeavors while also introducing additional vulnerabilities. Finally, the implications of this status as celebrity CEO will be applied to the Oprah Winfrey Network.

 

Oprah Winfrey, an American media figure familiar the world over, certainly fulfills modern definitions of a celebrity:  face prominently featured on streaming banners in Chicago’s O’Hare airport, Oprah is associated with events like “Oprah’s Favorite Things” along with projects like Oprah’s Book Club and the Angel Network. Should one doubt her celebrity status despite this ubiquity, one need only remember that Oprah has also managed to obtain the true mark of the modern star in American culture—the ability to drop her last name and still be recognized. Even Daniel Boorstin, who criticized the current state of celebrity as being devoid of meaning—in the process coining a term that has become colloquially rendered as “famous for being famous” (1962)—might have to reconsider his thoughts after encountering Oprah Winfrey. From stories of sexual abuse as a child to weight management issues played out in public, Oprah is quite literally known for being well-known:  part of her allure stems from her willingness to address the darkest parts of her life with her audience and part of her power comes from fans’ ability to connect with Oprah through these stories.

Beginning with a brief background into the nature of the celebrity CEO, this paper will explore the general effects of celebrity CEOs with particular respect to narrative before examining Oprah as a particular iteration of this process. Celebrity CEOs, it will be argued, are not entirely dissimilar from other types of stars when it comes to issues of brand management, although they necessarily possess additional economic and social considerations. Once the connection between a CEO’s dual identities as executive and individual is established, Oprah’s development of her lifestyle as brand will be briefly discussed as foundational context for an evaluation of the launch of OWN (i.e., the Oprah Winfrey Network).

 

There’s No Business Like Show Business?

In an increasingly industrialized world filled with sprawling organizations, CEOs have become somewhat sequestered from the majority of their employees, leading to isolation and alienation (Yalom, 1998). Although undoubtedly recognizable to boards of directors, CEOs have largely disappeared from public view (with notable exceptions, as will be discussed below).

Directly addressing this issue, the CBS reality television show Undercover Boss facilitates the connection between the roles of “CEO” and “person”—although the program likely provides participating executives an opportunity to learn about the inner workings of their organizations, the arguably larger benefit is the humanization of a corporate suit. Although viewers might cite schadenfreude as a prominent theme, laughing as they see an administrator stumble over a seemingly “simple” task, the net effect (realized or not) is that they most likely begin to connect emotionally with the undercover boss; they become actively invested in the outcome of this somewhat contrived scenario, and an unspoken desire to see that the CEO has learned a lesson indicates that they have come to care about this person and his or her company—provided that the CEO is at all likeable.[1] In the course of an hour, audiences are not only exposed to a company that they may or may not have heard of but also introduced to a CEO and a handful of employees and given a glimpse of “behind-the-scenes” or “backstage” operations (which might also serve to increase our identification with the company)—all in all, not a bad public relations move for a corporation!

Alternatively, we can consider that an appearance on a show like Undercover Boss instantly transforms a CEO into a media figure. Thrust into the public eye, one becomes a minor celebrity through the power of television:  even if we had little to no prior interest in the featured boss, social cues prevalent in a mediated society indicate that we should pay attention—a major broadcast network surely would not have chosen to feature someone who was not worthy?—and the mere ability to command copious amounts of attention (momentarily at least) affords a CEO the ability to transcend mundaneness, potentially obtaining the status of a celebrity.

Moreover, the Undercover Boss example indicates that while CEOs could potentially demand or cultivate an audience themselves—as suggested by Lois Arbogast in reference to Best Buy CEO Brian Dunn (2010)—they can also be featured or promoted by journalists (Hayward, Rindova, & Pollock, 2004). Although we might ascribe the prominence of CEOs to their role as leaders, we can also consider how humans display a tendency to oversimplify situations in order to understand complex and nebulous narratives.

Take, for example, a study conducted by Jones and Harris demonstrating that the prevalent attitudes in a writing sample were attributed to its author:  this study represented the first time that the Fundamental Attribution Error had been observed, although it was not immediately labeled as such (1967). In short, the Fundamental Attribution Error posits that observers tend to ignore situational explanations in favor of personality- or dispositional-based ones. In turn, these perceptions of us, once established, can cause us to act in particular ways as we endeavor to maintain our public image. Although the connection between the Fundamental Attribution Error and the celebrity CEO might not seem apparent at first, we can understand how humans have learned to employ the Fundamental Attribution Error as a type of heuristic—a mental shortcut—in order to simplify an intricate situation into manageable (and readily understood) explanations. In the case of the Fundamental Attribution Error, we see an eschewing of situational/environmental factors as we focus on an individual. Similarly, we focus on the actions and exploits of a celebrity CEO, channeling the output of a multidimensional process through a figurehead.

As a specific example of this process, the origin story of the non-profit group Invisible Children taps into the pervasive myth of Joseph Campbell’s Hero’s Journey with its depiction of young adventurers traveling into a foreign land on a quest to find and cultivate a narrative. Lured into East Africa by a sense of mystery, the filmmakers had their path altered by an unexpected assault by the Lord’s Resistance Army, which acted as the impetus to enter a world fraught with danger and uncertainty:  the realm of the unknown (Russell, 2007). Prior to this point, Kenya and Sudan had represented a relatively unfortunate, physically demanding, and sometimes boring wilderness for the team, but nothing substantial. With the assistance of various guides (one of them a literal guide tasked with driving the group to a nearby refugee camp), Jason Russell, Laren Poole, and Bobby Bailey began to glimpse the conflict underlying the region as they asked a series of questions of the locals. Wholly consumed by their newfound situation, the filmmakers discovered a little-known world of night commuters and child soldiers in Northern Uganda. This alien setting, which “disgusted and inspired,” also presented an opportunity for transformation as the filmmakers shed their naïveté and were reborn as crusaders against witnessed injustice (Invisible Children, 2010). Having found their story—the ultimate prize sought at the outset of the journey—the founders of Invisible Children extricated themselves in order to return to their homeland as masters of the unknown and share their insights with their community. The documentarians themselves echo this sentiment in their first production, Invisible Children:  Rough Cut, through a voiceover that proclaims that the group came to Africa as novices but hoped to “leave as warriors” (Bailey, Poole, & Russell, 2004).
While never explicitly acknowledged as a tool, it seems plausible that self-described storytellers such as Jason, Laren, and Bobby would have integrated successful elements of narrative into their production.

Although the real-life nature of Invisible Children’s origin precludes an exact overlay with the steps of Campbell’s monomyth, it is easy to imagine that the retelling of the tale draws some of its power (consciously or unconsciously) from this established structure. For some, the intertwining of narrative and Invisible Children might have seemed inevitable for an organization created by filmmakers/storytellers, born out of a documentary, and focused on recounting a tale of adversity in Uganda. Nevertheless, through the mythic nature of Invisible Children’s origin story, the organization’s founders are made into celebrity CEOs, performing a function similar to that of the individuals featured on Undercover Boss as the surrounding narrative is rewritten to feature a chosen few as its stars. Celebrity CEOs, then, can be understood to act as a focal point for the narratives that surround and pervade a company, locking the perceptions of the organization and individual into a symbiotic (or mutually destructive) relationship as sentiments accrued in one role migrate to another. In the case of Invisible Children, the organization’s founders were able to leverage the mystique associated with their experience into a full-fledged movement with their stories at its origin.

The Medium Is the Message

Structuring the message as a narrative helps to convey complex ideas in a relatable format, making sense out of a potentially overwhelming wave of information. Personal narratives in particular provide a relatively simple path that cuts through the chaos and allows audiences to focus. Preachers, for example, might utilize a parable to illustrate a point, giving audiences something familiar to relate to while simultaneously introducing a new idea. In a larger sense, we can also consider how the first iterations of narrative, myths and legends, informed the populace about the rules of a world (e.g., why the sun rose or how humans had come to be) in a process that mirrors functions of advertising or identity construction via celebrity culture; although many have now come to accept scientific explanations in lieu of (or possibly in conjunction with) these tales, the fact remains that stories can serve to develop cognitive scaffolding as we evaluate foreign concepts. This educational element, similar to the one present in the concept of play, allows individuals to learn intricate lessons without any overt effort. Narrative structure provides a guide for people to follow as they absorb additional information, easing the progression of learning. However, when considering this process, it is important to realize that narrative, in choosing which facts to highlight, also chooses which facts to exclude from a story—and the exclusions might be just as significant.

For some, the process of inclusion and exclusion might seem oddly similar to the creation (or recording) of history; certain facts become relevant and serve to shape our perception of an event while others fade into obscurity. On reflection, however, we would realize that narratives often served as the first oral histories for a given population. Individuals entrusted with this position in these societies were the “keepers of information,” whose ability to recount narrative shaped their community’s collective memory and, thus, a key part of the community’s combined sense of identity (Eyerman, 2004; Williams, 2001). Performing a role similar to that of the oral historians of the past, the commercial storytelling that is branding can be understood to influence modern society’s sense of shared knowledge (Twitchell, 2004)—this concept gains additional importance as we think about modern celebrities who are, along with handlers and public relations agents, in charge of their brand, and as we understand celebrity CEOs as an extension of this. The ramifications of branding’s ability to affect American culture in this manner are profound:  with its capacity to color perceptions, branding can influence the communal pool that forms the basis for social norms and cultural capital.

Stories, it seems, not only allow us to construct a framework through which we understand our world, but also afford us the ability to share our interpretations with others (Short et al., 1994). Indeed, author Stephen Greenblatt mentions that a sort of compulsiveness exists that is intrinsic to storytelling (1991). The function, then, of narrative is not only to shape a community, but also to create (or at least maintain) it. The process of sharing not only relays information—an important function, to be sure—but also serves to cultivate the bonds between source and receiver. Sharing represents an important component of storytelling because it facilitates a sense of community, with a successful story anchoring an individual’s commitment to the group and strengthening the overall cause.

Oprah as Celebrity CEO

As previously discussed, Oprah has managed to use the power of storytelling, often recounting stories of a deeply personal nature, in order to develop her brand and her audience (a form of community). For example, Oprah’s rather public weight battles offer one point of connection with her viewers:  due to the show’s longevity, audiences have been able to readily document Oprah’s weight gains and losses. Although the same sort of scrutiny has plagued female celebrities for years—Calista Flockhart, Jennifer Love Hewitt, Ricki Lake, Carnie Wilson, and Jessica Simpson come to mind—Oprah managed to benefit from the potentially negative discussion by addressing it directly. In addition to deflating the issue, Oprah’s weight struggles allowed her audience to sympathize with her, strengthening their connection to both Oprah and her brand as trainer Bob Greene was featured on The Oprah Winfrey Show and in books. Consistent with her overall message, Oprah did not advocate for a diet but instead argued for a fundamental change in lifestyle. Further strengthening the bonds between her brand and her personal life, Oprah also publicly trained for a marathon in 1994—in this scenario, the brand espoused by The Oprah Winfrey Show is literally embodied by Oprah herself. With this act, we see the synergy between goodwill accrued by Oprah as a media figure and the struggle of a real person to attain a goal—cheering for Oprah in one capacity naturally led viewers to support her in her other endeavor.[2]

Given Oprah’s strong presence as a personality and as a media mogul, the talk show host seems ripe for consideration as a celebrity CEO. Even ignoring the connection between business and self latent in the name of Oprah’s production company, Harpo (i.e., Oprah spelled backward), Oprah appears to have carved out a niche for herself as a lifestyle brand that promotes self-transformation. Fitting neatly into the ongoing lives of its supporters, the brand is anchored to Oprah’s public persona and, despite presentation in multiple media channels (e.g., television talk show, online website and message boards, magazine, and self-help books), retains consistent messaging, which allows each experience to complement, but not compete with, the others.

As further evidence of the connection between Oprah’s personal life and her brand, we can reference the much ballyhooed “Favorite Things” episode of The Oprah Winfrey Show. Although possibly driven simply by a desire to share her favorite things, the episode has become a production unto itself, rooted in emotionality and vividness while circumventing logical and rational thinking. The spectacle of the “Favorite Things” episode uses vividness and sensationalism to indicate that the featured products are emotionally interesting, image provoking, and proximate (Sherer & Rogers, 1984; Nisbett & Ross, 1980)—cues that seem salient when discussing media-saturated audiences notorious for variable attention spans and interest. Over the years, in-studio audiences have been groomed into a carefully controlled state of histrionics as they gush about whatever objects are placed in front of them while lauding Oprah’s charity.[3] Although participants of these parties most likely do not stop to consider the processes at work, the creation and careful cultivation of affective ties helps to bind them to Oprah and her lifestyle. Ultimately, although the audience is given free gifts (ignoring the taxes that must be paid), one might argue that individuals do in fact pay a price for these goods:  in exchange for material gain, the audience offers up its ability as a consumer bloc to dictate trends and value.

Adding support to this idea, we can consider the successful implementation of Oprah’s Book Club as another way in which Oprah was able to broadly influence American culture through her lifestyle as brand. Using The Oprah Winfrey Show as a platform, Oprah was able to express her approval of a wide range of books (and reading in general). Although Oprah’s Book Club likely sparked a number of book clubs around the country, one might question how many of these were simply waiting, with bated breath, for Oprah to announce her next selection—instead of seeking out books that were personally meaningful, viewers may have abdicated this power to Oprah as she assumed the role of cultural dictator.

Oprah’s Book Club also demonstrated one of the potential pitfalls of connecting one’s personal life to one’s professional presence:  in 2005, Oprah’s support of James Frey’s A Million Little Pieces caused her personal integrity to be questioned as the Book Club’s selections became suspect (Koehn, Helms, Miller, Wilcox, & Rachel, 2009). Although Oprah most likely could not have known that Frey’s work was a fabrication, her pick, and subsequent support on Larry King Live, caused minor damage to her reputation due to her personal involvement in the matter.[4]


Coming into Her OWN

Continuing the deployment of her lifestyle brand, Oprah plans to debut the Oprah Winfrey Network in 2011. Described by Winfrey as “A channel where people will see themselves…see who they are through the lives of others—in a real way that enriches them,” one can sense the immediate connection to her existing brand (ABC News, 2010). Building off of her phrase “Live your best life” (a sentiment remarkably similar to, but also strikingly different in tone from, the Army’s “Be all you can be”), the message is clear:  the Oprah Winfrey Network, like all of Oprah’s other media ventures, is about the power and process of self-transformation.

Plagued by delays, the Oprah Winfrey Network has also run afoul of controversy prior to its launch. In the run-up to its opening, rumors swirled about the possible rigging of votes in the “Search for the Next TV Star” contest (Walker, 2010). Given Oprah’s very obvious connection to the new network, we can conjecture that the same negative publicity that applied to the James Frey incident would likely pertain to this example—even if executives were completely innocent of the allegations, charges of cheating or misconduct had to be addressed in order to avoid damage to the unborn network and Oprah.

Having chosen to create a brand that centers around herself, Oprah has inextricably tied herself to the fortunes of the new network; in exchange for using her name to lend the new channel credence, Oprah runs the risk of personal devaluation should the venture fail. Although Oprah might have accrued enough goodwill to survive even the most devastating blow, any sort of scandal will undoubtedly reflect poorly upon Oprah and any future ventures.


[1] In a somewhat less flattering light, the MTV show Punk’d performed a similar function for celebrities. Like the candid camera shows of generations prior, the Ashton Kutcher vehicle exposed the “true” face of stars in a process that could endear them to the public. More often, however, viewers were able to have a laugh at the celebrities’ expense (with Justin Timberlake being a memorable example), with some stars, as in Frankie Muniz’s case, exposed as insufferable human beings.

[2] I would also add that Oprah’s choice to relay her story of success despite her trials growing up also affects culture in a couple of important ways. On the surface level, we can see how Oprah’s story can be considered inspirational for those who would wish to follow in her footsteps. Yet, at the same time, Oprah’s background also serves to raise the bar for suffering as audiences question their right to complain as they compare their personal stories to Oprah’s. Although Oprah’s personality lends itself to the aspirational/inspirational interpretation, a larger trend of celebrity/mediated suffering might be that individuals are less inclined to realize the significance of their own situation since it is “not as bad” as what they see on television.

[3] Oprah’s creation of the Angel Network, her involvement with Oprah’s Big Give, and her creation of the Leadership Academy also work to support this image of Oprah.

[4] Interestingly, Oprah was able to avert a major crisis by responding to the situation through public statements and a follow-up interview with author James Frey. Again possibly working as a spokesperson for larger sentiments, Oprah seemed to win back her audience by conveying her outrage at being duped—a stance likely held by many of the people who had picked up the book at her recommendation. In some ways, Oprah became the champion of the people as she confronted the author and the publisher; audience members could rally around Oprah, and her power as a media personality allowed her to deliver results that individual viewers could not have hoped to achieve on their own. It might also be noted that Oprah’s Leadership Academy (see previous footnote) suffered from allegations of misconduct that cast similar doubt on Oprah’s credibility.


Believing One’s Own Press

In their article, “Believing One’s Own Press:  The Causes and Consequences of CEO Celebrity,” Hayward et al. discuss journalists’ conflation of companies and their celebrity leaders.[1] Although the authors focus primarily on journalists, we can extend their observation to a broader human tendency to oversimplify situations in order to understand complex and nebulous narratives.

Take, for example, a study conducted by Jones and Harris in 1967 demonstrating that the prevalent attitudes in a writing sample were attributed to its author:  this study represented the first time that the Fundamental Attribution Error had been observed (although it was not immediately labeled as such). In short, the Fundamental Attribution Error posits that observers tend to ignore situational explanations in favor of personality- or dispositional-based ones. In turn, these perceptions of us, once established, can cause us to act in particular ways as we endeavor to maintain our public image. Although the connection between the Fundamental Attribution Error and the celebrity CEO might not seem apparent at first, we can understand how humans have learned to employ the Fundamental Attribution Error as a type of heuristic—a mental shortcut—in order to simplify an intricate situation into manageable (and readily understood) explanations. In the case of the Fundamental Attribution Error, we see an eschewing of situational/environmental factors as we focus on an individual. Similarly, we focus on the actions and exploits of a celebrity CEO, channeling the output of a multidimensional process through a figurehead. Moreover, we can also use the lens provided by the Fundamental Attribution Error to better understand Hayward et al.’s connections between the hubristic actions of celebrity CEOs, image maintenance, and ego.


[1] Hayward, M. L., Rindova, V. P., & Pollock, T. G. (2004). Believing One’s Own Press: The Causes and Consequences of CEO Celebrity. Strategic Management Journal, 637-653.


Becoming Extra-Textual

It seems hard to dispute that modern celebrity has coexisted with a sense of public interest—indeed, the very definition of celebrity, and its associated notions of fame, can be understood in terms of the willingness of strangers to engage with a particular entity. In our examination of celebrity, we have explored the public/private dichotomy and how audiences attempt to align a star’s “real” life with his or her persona (Bragman, 2008). Supporting this sentiment, Richard deCordova argues that the basis of our interest in celebrities could in fact result from our limited access to stars’ private lives (Spohrer, 2007; deCordova, 1990). Although the early movie studio system (arguably the birthplace of the first modern celebrities) strove to integrate the public and private representations of their stars, Joshua Gamson notes a pushback from audiences who began to question the authenticity of performers (1994).

And, as Spohrer argues through an invocation of Paul Robeson, the lack of perceived authenticity can lead to scandal but also to the quality of extra-textuality, wherein a performer becomes more than the composite of his performances (2007). John Fiske adds to this discussion, also drawing upon Baudrillard’s notions of simulacra and the real, in his examination of the confluence of Murphy Brown (a fictional character), Candice Bergen (the actress portraying Brown), and Diane English (the creator of the character)—a quality denoted as hyperreal (Fiske, 1994). While this juxtaposition serves to create an entirely new entity, one way to dissect it is to understand it in terms of a foreground/background distinction:  we can selectively choose to mentally focus on one aspect of the construct at a time, looking at each manifestation in isolation and then as part of a larger picture. For example, Fiske brings up the notion of the agency afforded to an individual body (in this case, the bodies of Rodney King and Anita Hill) in order to demonstrate that while an individual body can possess varying amounts of agency, it also figures in a larger societal context that informs our understanding of the body’s importance in an ongoing story (1994).

Fiske’s work raises the important notion that cultural scholars need to understand the significance of the site of contestation when attempting to explore societal issues:  realizing whether we are fighting over a physical body, an image, or a hyperreal figure provides much insight into the nature of the arguments in question. Supporting this concept, we see Spohrer discuss the implications of Paul Robeson’s movement from singer to singer/actor to singer/activist (2007); as Robeson participated in increasingly expanded spheres of influence, his growing extra-textuality afforded us a greater number of lenses through which to examine his impact.