Thoughts from my study of Horror, Media, and Narrative

Posts tagged “Journalism”

Merry Christmas. War is…Over?

I’m not sure when it started, but I am told that there is most definitely a “War on Christmas” occurring in America at the present moment. Well, recurring, to be more accurate. In response, we see comedian Jon Stewart formally declaring a (mock) “War on Christmas” even as other media outlets attempt to understand what the fuss is all about. What is a journalist to do? To ignore the situation seems neglectful, but to acknowledge the dispute is to grant it a measure of legitimacy. In short, what is one’s responsibility when it comes to reporting something like the “War on Christmas”? Is there a way to contextualize the issue without engaging it directly, perhaps seeing the fervor as the product of a channel that must find news (or create it)?

Part of the struggle, I think, lies in the notion that the commercialized Christmas has become inextricably woven into the traditional Christmas story in America. Even children’s specials support this notion, for despite Charlie Brown’s speech that commercialism will not ruin his Christmas, it is only after his sapling is transformed through the trappings of commercial celebration that it is allowed to bring the children together. This is not to suggest that we confuse Santa Claus with Jesus, but that both figures have become central to the Christian understanding of the season. If there indeed is a “War on Christmas,” are we even fighting the right enemy?

But beyond the rabble-rousing, I am incredibly interested in the sensationalist deployment of “Nazi” to challenge those who are seen as hindering Christians’ ability to celebrate Christmas, for one of Nazism’s goals was the unification of the German people through the perceived threat posed by the Polish. In line with a position heavily influenced by Social Darwinism, Nazism construed this assault in terms of a mortal danger—in this model, there would be only one type of winner, and Nazis most definitely intended to ensure their own survival.

What happens, then, if we remove Nazis from the discussion (and, along with them, the inherent negative connotations) and instead consider the process employed by the National Socialist party? We must recognize that this sort of community building is not inherently evil nor is it a product of Nazism—to label it as such would be to ignore the larger ideological developments at play and to fail to understand its relative import to us today. This, to me, seems to be the real story at the heart of all of this rhetoric:  what are the ways that dominant ideology paints itself as a victim in order to garner support?

As others have noted, the “War on Christmas” is perhaps really just a battle in the larger “War on Christianity.” Viewed in this way, we can situate the current discussions alongside Lowe’s decision to remove its ads from All-American Muslim and concern over religious iconography on military bases. For me, the larger area of concern is not whether Christianity is in fact jeopardized but rather how such a war is being constructed in the first place. If we look back at tactics employed during the Nazi assertion of power, we can better understand the potential benefits of fostering an environment that finds itself in a perpetual state of battle with the secular Other threatening imminent demise.

Chris Tokuhama is a doctoral student at the USC Annenberg School for Communication and Journalism where he studies cultural anxieties that surround the social construction of the body. Throughout his work, Chris attempts to use Post/Transhumanism, Early Modern Science, Gothic Horror, and religion to answer the question, “How do we become more than our bodies?” 

Read up on Chris’ pop culture musings or follow him on Twitter as he endeavors not to eat his weight in holiday snacks.


The Majority Report

The charge of media’s liberal bias is not a new one.

From Sarah Palin’s cry of “gotcha questions” to Jon Stewart’s arguments against inflammatory rhetoric, we see a wide range of individuals in America expressing discontentment with the status quo.

And those critical of mainstream media also have a point.

But when we consider our demands for mainstream media, are we calling for reforms in reporting or asking journalism to be something that it’s not? We see individuals from different political positions calling for change in the media and in reporting, but how realistic are our demands given the structure of the media industry itself? I believe that we can challenge the system, but are we focusing on the branches instead of the roots?

Taking the recent News Corporation “hacking scandal” as an example, we simultaneously see multiple ways in which journalistic outlets failed citizens and how the problem cannot simply be solved by asking reporters/editors to “do better.”

On the ground level, we of course have the unethical behavior evidenced by the News of the World staff that formed the basis for the story. However, given that this was not just an isolated incident (i.e., a “rogue reporter,” as initially stated), we must also examine the institutional and structural supports that may have served to foster a culture in which the aforementioned scandal could occur. As the story developed, the public began to gain insight into a newsroom that deemed information more valuable than people; a mogul who, although not directly involved, nevertheless shirked responsibility for his employees; and a media that seemed content to fixate on “hacking” rather than the larger issues of ethical practice and invasion of privacy.[1]

This, of course, raises the question of just who comprises journalism’s constituency. Although it seems like the straightforward answer would be that the fourth estate ideally serves the people, this stance may in fact not be correct in practice. The propaganda model, put forth by Herman and Chomsky (1988), suggests that a number of intervening factors—what the authors call “filters”—exist in mass media and serve to subvert journalism, making it beholden to entities other than the public. Concentration of ownership, along with reliance on advertisers and reliable sources, suggests that any problems evidenced by the media are, in fact, much more complex than many initially realize; while criticism of the media might be warranted, focusing all of our attention solely on the media will never effect any real change.

If we accept the validity of Herman and Chomsky’s arguments, we see that mainstream media might actually contain strains of conservative bias. Such an argument should not suggest that media outlets cannot also contain a liberal bias (to wit, Herbert J. Gans paraphrases Stephen Colbert’s assertion that life itself tends to lean liberal) but merely argues against the notion that media inherently and/or necessarily contains an all-consuming bias toward the liberal.


[1] This should not suggest that hacking is not a legitimate social concern, as we have witnessed large-scale attacks against government and corporations that have definite potential for harm. However, in this case, the discussion surrounding this particular story seemed to play on the fears (and popular preconceptions) of the public in order to make a somewhat sensationalist argument. Put another way, I would suggest that this was a “scandal that involved hacking” and not a “hacking scandal.” Although I think that the first conceptualization is more accurate, I can also see how the second phrase is easier to sell and why mainstream media outlets—beholden to advertisers and conscious of time/space—would choose the latter.


Why the “Cult” of Mormonism Misses the Mark

The question of Mormonism’s role in this election cycle refuses to die.

Over the weekend, much ado was made regarding Reverend Robert Jeffress’ assertion that Mormonism was a cult, with editorials and articles appearing across media outlets. Although I recognize that the dispute supposedly at the heart of this matter is whether or not Mormonism is, in fact, a form of Christianity, I also suspect that this entire discussion is being overplayed because of its proximity to the Republican nomination process. I, for one, have not seen many (if any) crusades to dissuade Mormons from calling themselves Christians in other contexts. For that matter, this is not the first time that America has broached the subject, but we seem to have forgotten that Mitt Romney had to defend his religion the last time we went through all of this four years ago. We could go back and forth over the distinction between religion and cult—see other discussions regarding the nature of Scientology or the perception of early Christianity in a Jewish society—but I believe that this would be time spent unwisely.

Instead, the more problematic line from Jeffress at the Values Voter Summit was, “Do we want a candidate who is a good moral person, or do we want a candidate who is a born-again follower of Jesus Christ?” Putting aside the false dichotomy between a “good moral person” and a “born-again follower of Jesus Christ”—which incidentally suggests that a candidate who identifies as a born-again Christian is not a good moral person—the underlying message subtly urges supporting Christians over good moral people. Of course the two categories are not mutually exclusive, but I think that reporters missed a great opportunity to disentangle emotionally charged words from thoughtful political action. Even when the topic was mentioned, discussion quickly moved on to another distraction: the Constitutional injunction against religious tests for public office. Instead of publishing headlines like “Cantor Doesn’t Believe Religion Should be Factor in 2012,” which, besides being misleading, does not truly reflect the article’s body, news media have an obligation to explain to voters why religion does matter in the political process. Values do matter, and religion undoubtedly speaks to a portion of that—just not all of it. We know from reports like those of the Pew Research Center for the People and the Press that religion does impact voting, so why pretend otherwise? The opportunity that the press has, however, is to challenge pundits, politicians, and the public not to use “religion” to mean more than it should.

Moreover, another missed opportunity for the media was Jeffress’ assertion that Romney was a “fine family person” but still not a Christian, given that he was speaking to a crowd ostensibly gathered in support of family values. Shouldn’t this statement, particularly at this function, cause reporters to question exactly what types of values are being upheld? Doesn’t Jeffress’ statement call for an examination of exactly what is meant by terms like “Christian” and “Mormon”? Ultimately it is these values that will determine a potential President’s policy, not the moniker of a religion.

_________________________________

Chris Tokuhama is a doctoral student at the USC Annenberg School for Communication and Journalism where he studies the relationship of personal identity to the body. Employing lenses that range from Posthumanism (with forays into Early Modern Science and Gothic Horror), the intersection of technological and community in Transhumanism, and the transcendent potential of the body contained in religion, Chris examines how changing bodies portrayed in media reflect or demand a renegotiation in the sense of self, acting as visual shorthand for shared anxieties.

Read up on Chris’ pop culture musings or follow him on Twitter as he tries to avoid the Flavor Aid.


Iowa’s Straw Poll Helps Us Take Stock, Not Just of Candidates But of Ourselves

In a move long suspected by many, Texas governor Rick Perry officially declared his intention to seek the office of President this past Saturday. Perry, who garnered national attention with his rally, The Response, once again invokes—or at least should cause one to question—the manner in which religion has been interwoven into a political climate that has, of late, seemed to fixate largely on the economic issues of budgets, debt, and unemployment.

Without diminishing the importance of these topics or their coverage, the recent debates in Iowa suggest that understanding the potential impact of religion in the various GOP campaigns is of value whether one identifies as Republican or not. Beyond the gaffe of news anchor Ainsley Earhardt, and the larger discussion (and negotiation) of Mormonism that it references, religion’s presence seems to have manifested in subtle, but potentially significant, ways throughout this campaign season.

Responding, perhaps, to a recent poll that indicated Americans’ preference for a strongly religious President (despite not being able to correctly identify the specific beliefs of major candidates), Fox News displayed a graphic during the Iowa debates that indicated three pieces of information:  religion, marital status, and number of children. Interestingly, this graphic was paired with another image showcasing each individual candidate’s political experience, perhaps suggesting that Fox News considered these two sets of information equally important for viewers.

And, in a way, maybe they are.

During the debates on Thursday, Byron York asked candidate Michele Bachmann how her religious beliefs—specifically her belief in the virtue of submissiveness—might affect her behavior, citing her prior decision to become a tax lawyer as a result of her understanding of God’s desire as channeled through her husband. Although the audience loudly registered its displeasure, apparently deeming the inquiry extraneous or unfair, the question seemed designed to probe Bachmann’s decision-making process in the past as well as what might shape her choices in the future if she were to become President.

So maybe the relevant concerns aren’t necessarily what religion a person is or isn’t (although this does not excuse the propagation of misinformation), but rather specifically how these beliefs influence a candidate’s perception of the world and the behavioral responses that those filters elicit. Undoubtedly, religion plays a role in shaping our understanding of the world and the range of perceived actions that is available to us at any given moment.

But beliefs aren’t exclusive to the religious community:  if the recent skirmishes over the federal debt ceiling have taught us anything, it is that we demonstrate a potential aversion to complexity or are perhaps slightly overwhelmed by the enormity of problems posed by the modern world. Our own response to these looming presences is to streamline the world, tending to engage with our environment in the specific, and limited, ways that align with our mental picture of the world.

So, before we criticize Rick Perry’s drive to ask God to fix America—as tempting as it might be for atheists and secularists—we need to examine the human desire to seek out, and subscribe to, simple answers that are readily available in times of crisis. This impulse, which seems to have largely assumed the form of religious rhetoric in the current round of Republican campaigns (and one might even argue that the content itself is not necessarily spiritual in nature if we look at the reverence given to the invocation of Reagan), seems to be the real, and often under-discussed, issue at play. Although a more arduous task, I believe that appreciating the power and presence of religion in this process will afford us a richer understanding of the American people and their relationship to contemporary politics.

Chris Tokuhama is a doctoral student at the USC Annenberg School for Communication and Journalism where he studies the relationship of personal identity to the body. Employing lenses that range from Posthumanism (with forays into Early Modern Science and Gothic Horror), the intersection of technological and community in Transhumanism, and the transcendent potential of the body contained in religion, Chris examines how changing bodies portrayed in media reflect or demand a renegotiation in the sense of self, acting as visual shorthand for shared anxieties.

Read up on Chris’ pop culture musings or follow him on Twitter as he searches for LA’s best iced coffee.


Man of Science, Man of Faith

In today’s world, it seems that “secularization” is all too often matched with a sense of loss: whether it be the decline in institutional religion or the dissipation of enchantment, we seem to employ the term to suggest that we are moving away from something that was once valued. And, to be fair, this is true. Since the Enlightenment, the modern age has been shifting, in fits and starts, away from a life infused with religion. But I think that “secularization” can also speak to something larger, and more significant, than that.

Unfortunately, it appears as though “secularization” has become synonymous with Science and been placed in opposition to Religion: atheists rigidly adhere to a rather static ideology that denounces aspects of religion, preferring the explanations proffered by experiments and equations. Yet are we simply trading one set of dogma for another as we move between extremes? For me, Science works best when it challenges Religion (and vice versa) to keep pace with the developing world. The sense of awe, mystery, and wonder inherent in religion keeps scientists humble, and science reminds us that some holy laws must be reconciled with modern culture.

One of the most welcome and quoted new books on the subject is Charles Taylor’s A Secular Age, an 896-page opus that argues that secularization has been largely positive — as long as it leaves open a “window on the transcendent.”

The spiritual and religious impulse in humans will never die, says Taylor. Even if religion doesn’t dominate a society, as it once unfortunately did in Europe and elsewhere, people will always seek the transcendent; something ultimate, larger than themselves.

The great sociologist of religion, Robert Bellah, author of Habits of the Heart, says what is needed most now is new forms of religion that work in a secular age, where they are subject to analysis and don’t rely on political endorsement.

We are seeing this today. Many open-minded forms of Christianity, Judaism, Buddhism and of smaller spiritual movements, including meditation, yoga and healing, are maintaining a sense of the transcendent in some secular, pluralistic societies.

We can partly thank the Enlightenment for the rise of secularism, with the era’s emphasis on freethinking, rationality and science. But many thinkers, including 19th-century sociologist Max Weber, also credit the advance of secularism to Protestantism.

The Protestant Reformation rejected the absolute authority claimed by the Roman Catholic church of the time.

It brought a new wave of reform, choice and intellectual questioning to Christianity. By the 19th century, Protestants were critically analyzing the Bible and trying to discern the difference between the “historical Jesus” and the Christ of unquestioned mythology.

This so-called “critical method” wasn’t an attack on the faith, as some traditionalist Christians continue to argue today. But it was what many consider a valid attempt to challenge the taboos that surrounded Christian orthodoxy.

Seeing the synthesis of these two areas is what makes studying modern religion so fascinating. Despite formal training in the Natural Sciences, I have gradually come to appreciate the power inherent in religion and am quite excited to be in the company of some other great minds at the USC Knight Chair in Media and Religion blog.


Under a Microscope, Politicians’ Private Lives Become Public

“There are some genuinely bad people who would like to infiltrate our country and we have got to have the guts to stand up and say ‘No.’”

By now, Newt Gingrich’s comments at the first New Hampshire Republican Presidential Debates have made the rounds, spreading across blogs, mainstream news outlets, and, of course, The Daily Show. Positioning Muslims alongside Nazis and Communists as those who would infiltrate our country, Gingrich has once again invoked anti-Muslim sentiment in the name of patriotism.

Although Gingrich’s polemic likely raised a few eyebrows, it was admittedly not all that surprising given his recent stance on the subject; highly visible in a movement that would label American Muslims as forever foreigners, Gingrich seems to have crafted himself into a candidate who is willing to engage with the popular topic of American Muslims. Despite the recent spate of coverage, Justin Elliott notes that the mainstream media has generally shied away from what might very well be the real story: the evolution of this particular brand of rhetoric by Gingrich.

Perhaps the American public is partly at fault as it clamors for briefs primed to incite moral outrage and hungers for stories that whet an appetite for spectacle. Yet, as we know, journalism also has a role to play, and it is perhaps neglecting its duties in this regard. The issue is larger than a simple lack of coverage: there seems to be a fundamental gap in the training of many journalists who would cover religion.

And yet religion continues to have a large presence in the current state of politics, manifesting concerns separate from the intricacies of traditional voting demographic blocs. With Rick Perry hosting an event for governors named The Response, and Reverend Gaddy of the Interfaith Alliance calling for a reduction in religion’s political presence, it appears as though this upcoming race will see the resurgence of a negotiation between the public and private aspects of religion that was recently highlighted by the Ground Zero mosque.

But it’s not only politicians who struggle to understand how religion figures into the everyday, with salvation to be had at venues as unlikely as cowboy church. However, despite the potential collapse of the private/public dichotomy, are we really encouraging people to think about the role that religion plays in both of these spheres? Has our news coverage been affected by an upswing in atheism’s popularity? Religion, faith, and spirituality all bridge the gap, with values formed in private undoubtedly affecting actions displayed in public. Why, then, do we hesitate to engage in meaningful discussion of religion’s potential political impact, focusing more on what a particular individual’s religion is rather than attempting to understand how and why that particular philosophy permeates a candidate’s positions? If we are content to simplify our interest to buzzwords like “pro-life” or “against gay marriage,” never challenging ourselves to understand the root causes of the issues we hold dear, how can we ever hope to convince the other side that we may in fact have a point? We insist that others see it our way and never take the time to talk to them in words to which they might actually be receptive. Rather than avoiding the issue entirely, perhaps we should encourage people to make the discussion of religion a routine practice—and provide them with the information and rhetorical tools they need in order to facilitate intelligent discussion.

Chris Tokuhama is a doctoral student in the USC Annenberg School for Communication and Journalism where he is pursuing media/cultural studies with a concentration in Gothic Horror as an articulator of cultural anxiety. A biologist by training, Chris currently endeavors to understand transformative bodies through lenses as varied as narrative studies, media, and religion, a process that has resulted in an upcoming chapter in The Hunger Games and Philosophy focusing on issues of authenticity in celebrity. Follow his quest to find the perfect cup of coffee on Twitter at @TrojanTopher.


Just More of That “He Said, She Said”

“We live in a land that you can choose one or the other, same-sex marriage or opposite marriage…and you know what, in my country, and in my family, I believe that a marriage should be between a man and a woman. No offense to anybody out there, but that’s how I was raised…”


-Carrie Prejean, Former Miss California 2009


These words, spoken in response to a question posed by blogger (and then acting judge) Perez Hilton, reignited simmering tensions as the issue of gay marriage was again thrust into the national spotlight during the 2009 Miss USA pageant. Although he had hoped for an answer from Miss Utah (Denizet-Lewis, 2009), Hilton nevertheless took advantage of his opportunity, forcing national attention toward the subject of gay marriage legislation; outspoken, media-savvy, and an unapologetic gay man, Hilton had capitalized on his moment, engaging mass audiences in what had become an embroiled topic of conversation. Particularly poignant was the fact that Perez Hilton resided in California, which had just narrowly passed Proposition 8 (otherwise known as the California Marriage Protection Act) and was, at the time, in the throes of a back-and-forth battle of escalating appeals. Although questionably worded—the choice of the term “opposite marriage” with its non-normative connotations would come to haunt her in the coming months—Carrie Prejean’s response represented a fairly standard beauty pageant answer to a relevant and noteworthy current issue. Hilton, however, did not seem content with Prejean’s reply and expressed his displeasure in a video blog, calling her a “dumb bitch” (vasquezama, 2009),[1] a catalytic move that helped vault the incident to the status of a media event.

Based in part on the work of sociologist Simon Cottle, this paper will present a background of mediatized rituals and, as a subset, media events in order to contextualize the Carrie Prejean/Perez Hilton controversy. Concerned more with the unfolding of this particular story, and less with value judgments of “right” and “wrong,” I will also draw upon French philosopher Jean Baudrillard and media theorist John Fiske in order to argue toward a position that seeks to understand how and why discussion of gay marriage came to involve the figures of Carrie Prejean and Perez Hilton; I will also strive to demonstrate that although much discussion centered around these two figures for a period of time in 2009, the much ballyhooed incident was in fact indicative of a much larger set of concerns.

Figuring It Out

In some ways, the controversy stemming from the 2009 Miss USA pageant seems somewhat surprising as both Carrie Prejean and Perez Hilton appear incredibly unqualified to spearhead discussion of gay marriage; prior to this incident, neither seemed to be respected as a particular expert on the issue of gay rights or identified as a pundit with any sort of political acumen. Yet, despite an arguable lack of obvious credentials, Prejean and Hilton had managed to meet one important criterion:  they were on national television. Although the viewership of the 2009 Miss USA pageant hit a record low (Keveney, 2010), the simple fact that the controversy occurred on a mediated large-scale platform indicated two noteworthy (and interrelated) factors:  (1) the reach of television as a broadcast medium is widespread and singular in its presentation; (2) the only way to experience the event for most people was through media.

The first factor—which is more readily apparent but ultimately less important—came about as a result of developments in communication technologies that allowed for a global system of satellites and near-instantaneous transmission of news and information (Friedman, 1999). Building upon a model that had its roots in the radio and television culture of the early 20th century, mass communication throughout the late 20th and early 21st centuries allowed an increasingly large proportion of people to simultaneously experience an occurrence; this idea is significant because it develops a common reference point that then serves as the seed for the germination of a mediatized ritual or a media event. Although recent developments in online culture have increasingly supported divergent points of view, broadcast media, by its nature, continues to provide a central communal narrative. Additionally, the scale of exposure is an important factor to bear in mind, as broadcast media can make the difference between niche market and national scope.

More important, however, was the notion that, for most people, the incident only existed in its mediated form. According to media scholar John Fiske, this fact meant that audiences could only operate on and conceptualize what Baudrillard terms “hyperreality,” as opposed to the “real” (Fiske, 1994; Baudrillard, 1994). For people living in a post-modern world, the representation of the exchange mixed with its reality, causing the two levels to become effectively indistinguishable from one another—for viewers, everything about this particular media event was, in short, hyperreal. Moreover, for the majority of the audience, both Carrie Prejean and Perez Hilton did not exist as actual people per se, but instead as media personalities; our entire construction of these individuals’ identities stemmed from their portrayals in and through the media.

And, in some ways, the “real” Carrie Prejean and Perez Hilton are somewhat immaterial for our purposes, as most people involved in the ensuing discussion will never come to know either of these individuals directly—for most of us, the representation is much more powerful and salient; who we perceive these two to be is more important than who they actually are. Speaking to this point, Fiske introduces the concept of the “figure” as an embodiment of deeply seated conflicts, emotions, and/or feelings within a society (1994).[2] Although Fiske uses individuals like O.J. Simpson and Clarence Thomas to make a series of points about figures and racial tensions, we can perhaps employ his thought process to draw similar analogies with Carrie Prejean and Perez Hilton regarding the issue of gay marriage.

The Voice of the People

In line with Fiske’s description of figures as manifestations of underlying contestations, the response from Americans (to both Prejean and Hilton) was swift and vocal; having been provided with a tangible focal point for their perhaps previously unarticulated and unfocused sentiment, individuals on both sides of the debate began to write letters to newspaper editors in order to express their opinions (Rubin, 2009; Morris, 2009). Combing through opinion pieces from the time of the incident, one notices a stark trend:  authors seem less concerned with debating the relative merits of the situation at hand and instead tend to express outrage that others do not see the world as they do.

Eventually, as the months continued, the narrative surrounding Carrie Prejean would grow as Prejean and her supporters began to cite the contentious answer as the reason—notably, not one reason of many possible factors, but the reason—she had placed second in the Miss USA pageant (The Chicago Tribune, 2009); individual citizens like Judith Martin would go a step further and attempt to contextualize the negative response to Prejean’s answer as part of a larger disruptive pro-gay marriage movement (2009). Prejean, it seems, was the victim in all of this, being vilified by a left-leaning minority public who was hypocritically intolerant.

It is at this point that we begin to see the breakdown in communication between opposing perspectives, in conjunction with a general unwillingness to understand the other side of the issue: those supporting Prejean felt justified in their counter-critique of gay marriage supporters, but were in effect calling for advocates of gay marriage to tolerate an ideology that was perceived to violate their civil rights. From their vantage points, both sides had a valid argument and were not going to back down.

With supporters of Perez Hilton losing much of their moral high ground thanks to the blogger’s aforementioned “dumb bitch” comment, both sides of this issue were rapidly enmeshed in emotional mudslinging as they attempted to shout down the other side. In retrospect, the rapid escalation of the argument (and perhaps our personal investments in the outcome), caused us to forgo a rational discussion of the real issues alluded to by the incident; as academics and professionals, we have learned that we live and die by our ability to argue a point—rhetoric and intelligent discourse are our hallmarks—and we have also come to understand that criticizing ideas is acceptable and appropriate but assailing character is uncouth. Yet, by responding to Prejean’s answer with a personal attack (and simultaneously showcasing the danger of “lay journalism”), Hilton instantaneously altered the course of the conversation and changed the focus of the gay marriage debate as it pertained to this particular case.

Placed on the defensive, Carrie Prejean positioned herself on the side of truth, stating that she had given her honest opinion in response to Perez Hilton’s question and simultaneously invoked faith, becoming, in essence, a martyr figure (The Staff at wowOwow, 2009; Foreign Mail Service, 2009). As a result of this development, popular readings of the First Amendment were also invoked as Prejean’s supporters questioned the preservation of free speech, not seeming to understand that Prejean’s rights were never threatened (Sullivan, 2009). Here again we see that Prejean fulfilled the definition of a figure, serving as a focal point for discontent in America; although the incident itself had little to do with Constitutional rights, the perception that Prejean’s speech was being impinged upon allowed a certain subset of Americans to adopt the event as their own banner moment. Writing a response to the incident later that year, one author noted that Prejean “all too quickly became a heroine for those who are sick and tired of Hollywood and the thought police” (Hagelin, 2009)—clearly, then, Prejean was thought to stand in as champion for all Americans who had grown disenchanted with the (arguably) corrosive factors represented by celebrity culture and the stifling adherence to political correctness. Regardless of our own stance on the issue of gay marriage, the dissent characterized by Prejean indicates that we have, as a country, failed to promote an environment that fosters rational discourse; those on the right feel as though they are unable to adequately express their opinions and this frustration has developed into outright anger as we near the mid-term elections of 2010.

Additionally, casting her experience as a test from God further entrenched Prejean and her supporters as she became infallible—when framed as a choice between lying to win a beauty competition or pleasing God, how could Christians not support Prejean’s choice (offensive as it might or might not be)? Elevating the discussion to the next level, Prejean also sued the operators of Miss California USA for alleged religious discrimination (Business Insurance, 2009). Suddenly, a personal religious trial had become an assault on Christianity; Prejean, no longer a mere defender of personal integrity, became a crusader for Christianity and all it represented (Homan, 2009). Seemingly all too happy to embrace this new direction, the public began to more closely identify Carrie Prejean with traditional Christian values and morals as she became affiliated with conservative groups (Family Research Council Action, 2009).

The repercussions of Carrie Prejean’s new stance were swift and graphic:  within a few weeks, a variety of scandals surfaced—ranging from rumors of breast enhancement surgery to semi-nude photos, bad behavior, and a sex tape—possibly in order to discredit Prejean’s position as a blameless and righteous victim (Coutts, 2009; Abrahamson, 2009; Gensler, 2009). Again raising the notion of Prejean as a figure in the Fiskian sense, we might argue that while it is doubtful that many cared about Prejean’s sex tape per se (i.e., the backlash did not censure Prejean for having/producing a sex tape but rather for being duplicitous), the revelation of the artifact’s existence mattered immensely in regard to the public perception of Prejean’s character. Whether he had intended it or not, by attacking Prejean personally, Perez Hilton had opened the doorway to a moral absolutism that ran counter to his originally stated goal of gay marriage as a legal issue (vasquezama, 2009); instead of being productive, the discussion had become focused on media figures and again fragmented into the prevalent left/right talking points that have propagated throughout the nation in recent years.

Although the memory of Carrie Prejean and Perez Hilton has somewhat faded in the present, we have continued to see a surge in the disconnect between left- and rightwing politics as the mid-term elections approach:  the rise of the Tea Party (admittedly a diverse group of individuals who are interested in a range of issues) demonstrates the growing separation between competing ideologies in America. While figures like Christine O’Donnell have replaced Prejean in the national spotlight, we continue to see similar themes of God, country, and Constitution reflected in the talking points of the Republican Party. As the issues raised by the figures of Prejean and Hilton in 2009 have not been adequately addressed or resolved, they continue to manifest in the public sphere as points of contention. Having firmly established that Prejean and Hilton reflected the Fiskian conceptualization of the figure, we now turn to work by Simon Cottle in order to further understand how such representations function at the intersection of media and life.

Mediatized Rituals as Disruption

Although some might consider the controversy embodied by Carrie Prejean and Perez Hilton to only be suitable for display on infotainment outlets like Entertainment Tonight and Access Hollywood, we have seen that the back-and-forth pop-culture-based battle evidences very real political issues; although mainstream media might become caught up in discussion of Prejean and Hilton as representations, we can also conceptualize the emergent discourse as an example of a mediatized ritual. Despite a historical resistance to its study (Scannell, 2001), scholars have recently reintroduced the importance of spectacle in everyday political processes, arguing that to delegitimize spectacle is to discount the possible role it plays in people’s lives (Duncombe, 2006; Cottle, 2006).

Employing sociologist Simon Cottle’s argument that mediatized rituals “open up productive spaces for social reflexivity and critique,” we can gain a theoretical perspective on the Prejean/Hilton incident as we see Americans contemplate the discrepancy between how society is and how society should be (2006, p. 411). Although Cottle describes six different classes of mediatized rituals, the most valuable framework comes from the notion of mediatized public crises.[3]

In contrast to a media scandal, which represents a fairly isolated transgression, the story of Carrie Prejean and Perez Hilton morphed throughout its deployment to encompass a range of issues, as previously demonstrated. Reflective of deeply personal issues (and highly contestable ones!), the Prejean/Hilton controversy embodied a mediatized public crisis as the event “exhibit[ed] narrative progression, unfolded over an extended period of time, and [was] theorized in relation to discernible phases” (Cottle, 2006, p. 424). Once conceptualized as a mediatized public crisis, we can plot the milestones of the Prejean/Hilton saga in a trajectory that showcases a struggle for validation, legitimacy, acceptance, and ultimately power. Moreover, understanding the incident in the context of an ongoing, and constant, debate over gay marriage and gay rights, we see that the issue was never really about Prejean or Hilton—sooner or later two opposing figures would have said similar things that would have sparked the tinderbox of controversy. Correlation, as they say, is not the same as causation.


[1] The clip of Hilton’s response to Prejean appears on YouTube and was uploaded by user vasquezama, which accounts for the use of lower case in the citation. Attempts to find the original video blog by Hilton referencing the event were unsuccessful.

[2] Although off-topic for this particular paper, I am much more familiar with this same idea in regard to the genre of Horror and the conceptualization of monsters. A particular fan of American Gothic, I see the continued resonance of vampires, zombies, and werewolves as indicative of the fact that we, as Americans, have not yet come to terms with what these figures represent (e.g., death, paranoia, etc.). It is my position that we create monsters in order to grapple with the underlying issues as we are generally less likely to confront concepts like our mortality head on due to their associated cognitive duress. I would also add that a similar function is performed by Science Fiction and its creatures as we attempt to reconcile our feelings toward the integration of technology and scientific advances into our society. For me, Horror touches on our desire to explore these sorts of fears along with other states of liminality, pushing the boundaries as we attempt to expand the extent of the known. We find fascination in Gothic figures of vampires and zombies for they represent a transgression of the norm and find exhilaration in Horror’s potent blend of sex and violence as a means of experiencing violations of the cultural standard without suffering the real life repercussions. Underneath the morality pleas of many horror films lies a valid method of exploration for audiences. Even scenes of torture, which most definitely assume a different meaning in a post-9/11 world, can be understood as a method of exploring what humanity is like at its extremes; both assailant and victim are at limits (albeit very different ones) of the human condition and Horror provides us with a voyeuristic window that allows us to vicariously experience these scenes.

[3] There is admittedly some overlap between categories as noted in Cottle’s paper, with the Prejean/Hilton incident reflecting elements of media scandal and moral panics at various points in the chronology of the controversy. I have focused here on mediatized public crises due to the narrative/unfolding elements of the case study.


Love Me or Hate Me, Still an Obsession

Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have an almost hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, were becoming increasingly prominent throughout the 19th century as part of the Graphic Revolution replete with the power to disassociate the real from its representation (1962). For cultural critics still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed.

Although these underlying anxieties of mid-century American society could be examined via a wide range of anthropological lenses and frameworks, visual media has historically provided a particularly vivid manifestation of the fears latent in the people of the United States (Haskell, 2004). This is, of course, not to imply that visual media is necessarily the best or only means by which we can understand prevailing ideologies in the years after World War II, but merely one of the most visible. However, as a critical examination of the entire media landscape of the 1950s would be beyond the scope of a single paper, discussion shall be concentrated primarily on Elia Kazan’s 1957 movie A Face in the Crowd, with particular attention paid to the contrasting channels of cinema and television.[1] This paper will seek to briefly position A Face in the Crowd in the larger context of paranoia-driven cinema of the 1950s before using the film as an entryway to discuss critiques of mass culture. Given the film’s apparent sustained resonance, as indicated by its relatively recent mention (Vallis, 2008; Hoberman, 2008b; Franklin, 2009), the arguments of Critical Theory will then be applied to modern American culture in an attempt to ascertain their continued validity. Finally, an argument will be made that acknowledges the potential dangers facing mass culture in the 21st century but also attempts to understand the processes that underlie these pitfalls and provides a suggestion for recourse in the form of cultural and media literacy.


Paranoia, Paranoia, Everyone’s Coming to Get Me

The post-war prosperity of the 1950s caused rapid changes in America, literally altering the landscape as families began to flood into the newly-formed suburbs. With the dream and promise of upward social mobility firmly ensconced in their heads, families rushed to claim their piece of the American dream, replete with the now-iconic front yard and white picket fence. And yet, ironically, a new set of worries began to fester underneath the idyllic façade of the suburbs as the troubles of the city were merely traded for fears of paranoia and invasion; the very act of flight led to entrapment by an ethos that subtly precluded the possibility of escape.

As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia, there remained a stubborn fear that the quickly shifting nature of society might have had unanticipated and unforeseen effects (Murphy, 2009). Life in the suburbs, it seemed, was too good to be true and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety:  from threats of invasion to worries about conformity, and from dystopian futures to a current reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of being a society under siege. An overwhelming sense of doubt, and more specifically, paranoia, characterized the age and latent fears manifested in media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be. In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts, all of which filled the movie screens of the decade (Jensen, 1971; Murphy, 2009; Wolfe, 2002).[2] Following solidly in this tradition, Kazan’s A Face in the Crowd picks up on some of the latent strains of paranoia in American culture while simultaneously serving as a platform for a set of critiques regarding mass culture.


Somewhere, a Star Is Made

The storyline of A Face in the Crowd is rather straightforward and yet deceptively complex in its undertones:  on the surface, we experience a rather heavy-handed morality tale in the form of country bumpkin Larry “Lonesome” Rhodes, a relative nobody who is plucked from obscurity and made (and subsequently broken) through powers associated with television. Yet, it is only when we begin to connect the movie to a larger societal context that we begin to understand the ramifications of the film’s message; a careful examination of A Face in the Crowd reveals striking suspicions regarding the role that media plays (in this case, primarily television and cinema) in shaping American culture. Stars, director Elia Kazan argues, are not so much born as made, a distinction that portends dire consequences.

It is worth noting that Kazan’s film was made during a time when the concept of the “celebrity” was being renegotiated by America; for a large part of its history, the United States, firmly grounded in a Puritan work ethic, had honored heroes who exemplified ideals associated with a culture of production and was struggling to reconcile these notions in an environment whose emphasis was now focused on consumption. Although modern audiences might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status (Gamson, 1994; Benjamin, 1973).

Such is the case, it seems, with Larry Rhodes, who is discovered by roving reporter Marcia Jeffries in an Arkansas jail. Although it cannot be denied that Rhodes has some modicum of talent and a certain charisma that comes from being unafraid to speak one’s mind, Marcia ushers Rhodes onto the path of greatness by dubbing him “Lonesome” and thus creates a character that transforms Rhodes from a despondent drunk to a winsome drifter. This scene—the first major one in the movie—thus introduces the important notion that those involved in the media can be implicitly involved in the manipulation of the information that travels over the airwaves. Subtly adding to the insidious nature of the media, A Face in the Crowd portrays Marcia as a character who seems likable enough but who is also, in a way, exploiting the people in jail: she rushes in with her tape recorder, intent on prying stories from the characters she finds (or creates!), and exhibits little concern for truly understanding why these men are imprisoned in the first place. Taken to an extreme, we later come across the character of The General, who further perverts the connection between media and power as he conspires with Lonesome to remake the image of Senator Worthington Fuller as the senator runs for President.

Yet, as Lonesome Rhodes grows in his role as a media personality, he quickly demonstrates that the power to manipulate does not lie solely with those who sit behind the cameras. In Memphis, Rhodes incites a riot against the Luffler mattress company and also solicits donations in order to help a Black woman rebuild her house. In light of this, we can see that while Kazan focuses on the negative implications of television and celebrity, the relative good or bad that comes from these actions is not necessarily the point—instead, the one constant in all of the depicted scenarios is a public that is manipulated into performing actions on behalf of others. Although the characters of Lonesome and The General are vilified throughout the film, it is the masses for which Kazan demonstrates true disdain.


Extraordinary Popular Delusions

Perhaps nowhere is this contempt more apparent than at the end of the film where, in an attempt to offer a small moment of solace to Marcia after her unmasking of Lonesome, writer Mel Miller notes, “We get wise to them, that’s our strength” (Kazan, 1957). And Miller is not wrong:  Western tradition has long recognized the correlation between knowledge and power and Miller’s assertion touches upon the revelatory clout inherent in the realignment of perception and reality as noted by public relations guru Howard Bragman (2008). A more critical examination of the film’s closing scene, however, raises an important question:  Who is Miller’s “we”? Although one might be tempted to read this line as indicative of an egalitarian philosophical view, it is important to note that the only two characters in the shot represent the film’s arguably upper-middle class, and pointedly Eastern-educated, elite—nowhere to be seen are representatives of the small Arkansas town from the film’s opening or denizens of Memphis, both of whom serve to characterize the majority of Lonesome’s devoted viewers.[3] In fact, if we take time to reflect upon the movie, we realize that the majority of the audience was only alerted to Lonesome’s dual nature after Marcia flipped a control room switch and revealed the underlying deterioration; the masses oscillated from one position to the next without understanding how or why and once again adopted a passive stance in their relationship with media. Moreover, as Courtney Maloney points out, Kazan’s depiction of the agency of the masses is actually limited in scope:  despite a montage of audience members vehemently phoning in, sponsors are simultaneously shown to be acting independently as they withdraw their association with Lonesome (1999). Moreover, the subtext of the scene distances the rational decision-making of the truly powerful from the impassioned beseeching of the masses, likening the power of the latter to that of a mob. Knowledge and its associated authority, clearly, are afforded to a select group.

This idea, that the world can be divided between those who “get wise” and those who do not, serves to develop a rather sharp classist criticism against the medium of television and those who would watch it:  moviegoers, by virtue of witnessing Kazan’s work, find themselves elevated in status and privy to “the man behind the curtain” (to borrow a phrase). In contrast, the malleable masses were considered to be pacified and placated by idealistic portrayals of life in the 1950s in the form of television programs like Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Clearly, Kazan creates a dichotomy imbued with a value judgment descended from the thoughts of prominent thinkers in the Frankfurt School who, as far as aesthetics were concerned, preferred the high culture of cinema to the conformity and manipulated tastes of television (Horkheimer & Adorno, 2002; Adorno, 1985; Quart, 1989). This distinction between high and low culture would be a crucial supporting idea for critics as a prominent fear of mass culture was that it portended a collapse between concepts (e.g., fame, celebrity, or intellectual value) of objectively different quality, essentially rendering all manifestations the same and therefore all equally mundane (Boorstin, 1962; Hoberman, 2008b; Kierkegaard, 1962).  Even worse for critics, perhaps, was the perception of the masses’ refusal to grow out of its immature interests, a behavior that was characterized as both childlike and stubborn (Adorno, 1985).

And the fears of such theorists, all of whom were reacting to recent and rapid advances in broadcast technology, were not unfounded. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force on society that spurred on the destruction of culture instead of enriching it.[4] For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, aggrandized by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the already present anxiety-ridden ethos of the United States during the 1950s.


It’s the End of the World as We Know It (And I Feel Fine)

Notably, however, the fears associated with the masses have not been limited to one particular decade in American history: across cultures and times, we can witness examples akin to tulip mania, where unruly mobs exhibited relatively irrational behavior. Given the recurring nature of this phenomenon, which receives additional credence from psychological studies exploring groupthink and conformity (Janis, 1972; Asch, 1956), we might choose to examine how, if at all, the cultural critiques of the 1950s apply to contemporary society.

Recast, the criticisms of mass culture presumably resonate today in a context where popular culture holds sway over a generally uncritical public; we might convincingly argue that media saturation has served to develop a modern society in which celebrities run wild while evidencing sexual exploits like badges of honor, traditional communities have collapsed, and the proverbial apocalypse appears closer than ever. Moreover, having lost sight of our moral center while further solidifying our position as a culture of consumption since the 1950s, the masses have repeatedly demonstrated their willingness to flash a credit card in response to advertising campaigns and to purchase unnecessary goods hawked by celebrity spokespeople, a process that demonstrates a marked fixation on appearance and the image reminiscent of critiques drawn from A Face in the Crowd (Hoberman, 2008a; Ecksel, 2008). Primarily concerned with the melding of politics, news, and entertainment, which harkens back to Kierkegaard-inspired critiques of mass culture, current critics charge that the public has at long last become what we most feared:  a mindless audience with sworn allegiances born out of fealty to the almighty image (Hoberman, 2008a).

Arguably the most striking (or memorable) recent expression of image, and the subsequent commingling between politics and entertainment, centered on Sarah Palin’s campaign for office in 2008. Indeed, much of the discussion regarding Palin centered on her image and colloquialisms rather than focusing solely on her abilities.[5] Throughout her run, Palin positioned herself as an everyman figure, summoning figures such as “Joe Six-Pack” and employing terms such as “hockey mom” in order to convey her relatability to her constituents.[6] In a piece on then-Vice-Presidential candidate Sarah Palin, columnist Jon Meacham questions this practice by writing:  “Do we want leaders who are everyday folks, or do we want leaders who understand everyday folks?” (2008). Palin, it seemed to Meacham, represented much more of the former than the latter; this position then leads to the important suggestion that Palin was placed on the political bill in order to connect with voters (2008). Suddenly, a parallel between Palin and Lonesome Rhodes from A Face in the Crowd becomes almost self-evident.

At our most cynical, we could argue that Palin is a Lonesome-type figure, cleverly manipulating her image in order to connect with the disenfranchised and disenchanted. More realistically, however, we might consider how Palin could understand her strength in terms of her relatability instead of her political acumen; she swims against the current as a candidate of the people (in perhaps the truest sense of the term) and provides hope that she will represent the voice of the common man, in the process challenging the status quo in a government that has seemingly lost touch with its base. In some ways, this argument continues to hold valence in post-election actions that demonstrate increasing support of the Tea Party movement.

However, regardless of our personal political stances, the larger pertinent issue raised by A Face in the Crowd is the continued existence of an audience whose decision-making process remains heavily influenced by image—we actually need to exert effort in order to extract our opinion of Sarah Palin the politician from the overall persona of Sarah Palin. Author Mark Rowlands argues that a focus on image—and the reliance on the underlying ethereal quality described by Daniel Boorstin as being “well known for [one’s] well-knownness” (Boorstin, 1962, p. 221)—although admittedly powerful, is ultimately damning, as the public’s inability to distinguish between items of quality leads it to focus on the wrong questions (and, perhaps worse, to not even realize that the wrong questions are being asked) in ways that have very real consequences. Extrapolating from Rowlands, we might argue that, as a culture that is obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

 

Ever the Same?

So while the critiques of the Frankfurt School might appear to hold true today, we also need to realize that modern audiences exist in a world that is, in some ways, starkly different from that of the 1950s. To be sure, the mainstream media continues to exist in a slightly expanded form, but new commentary on the state of American culture must account for the myriad ways in which current audiences interact with the world around them. For instance, work published after Theodor Adorno’s time has argued against the passive nature of audiences, recognizing the agency of individual actors (Mattson, 2003; Schudson, 1984).[7] Moreover, this new activity on the part of audiences has done much to commingle the once distinctly separate areas of high and low culture in a process that would have likely confounded members of the Frankfurt School. The current cultural landscape encompasses remix efforts such as Auto-Tune the News along with displays of street art in museum galleries; projects once firmly rooted in folk or pop art have transcended definitional boundaries to become more accepted—and even valued—in the lives of all citizens. While Adorno might be tempted to cite this as evidence of high culture’s debasement, we might instead argue that these new manifestations have challenged the long-held elitism surrounding the relative worth of particular forms of art.

Additionally, examples like Auto-Tune the News suggest that advances in technology have also had a large impact on the cultural landscape of America over the past half century, with exponential growth occurring after the widespread deployment of the Internet and the resulting World Wide Web. While the Internet certainly provided increased access to information, it also created the scaffolding for social media products that allowed new modes of participation for users. Viewed in the context of image, technology has helped to construct a world in which reputations are made and broken in an instant and more information circulates in the system than ever before; the advent of these technologies, then, has not only increased the velocity of the system but has also amplified it.

Although the media often showcases the deleterious qualities of the masses’ relationship with these processes (the suicide of a student at Rutgers University being a recent and poignant example), we are not often exposed to the incredible pro-social benefits of a platform like Twitter or Facebook. While we might be tempted to associate such pursuits with online predators (a valid concern, to be sure) or to dismiss them, at best, as unproductive in regard to civic engagement (Gladwell, 2010), to do so would be to ignore the powerfully positive uses of this technology (Burnett, 2010; Lehrer, 2010; Johnston, 2010). Indeed, we need only look at a newer generation of activist groups who have built upon Howard Rheingold’s concept of “smart mobs” in order to leverage online technologies to their benefit (2002)—a recent example can be found in the efforts of groups like The Harry Potter Alliance, Invisible Children, and the Kristin Brooks Hope Center to win money in the Chase Community Giving competition (Business Wire, 2010). Clearly, if the masses can self-organize and contribute to society, the critique of the masses as nothing more than passive receptors of media messages needs to be revised.

 

Reconsidering the Masses

If we accept the argument that audiences can play an active part in their relationship with media, we then need to look for a framework that begins to address media’s role in individuals’ lives and to examine the motivations and intentions that underlie media consumption. Although we might still find that media is a corrosive force in society, we must also realize that, while potentially exploiting an existing flaw, it does not necessarily create the initial problem (MacGregor, 2000).

A fundamental building block in the understanding of media’s potential impact is the increased propensity for individuals (particularly youth) to focus on external indicators of self-worth, with the current cultural climate of consumerism causing individuals to concentrate on what they do not have (e.g., physical features, talent, or clothes) as opposed to their strengths. Simultaneously an exacerbation of this problem and an entity proffering solutions, constructs like advertising provide an easy way for youth to compensate for their feelings of anxiety by instilling brands as a substitute for value:  the right label can confer a superficial layer of prestige and esteem upon individuals, which can act as a temporary shield against criticism and self-doubt. In essence, one might argue that if people aren’t good at anything, they can still be associated with the right brands and be okay. Although we might be tempted to blame advertising for this situation, it merely serves to exploit our general unease about our relationship to the world, a process also reminiscent of narcissism (Lasch, 1979).

Historian Christopher Lasch goes on to argue that we have become generally disconnected from traditional anchors such as religion and thus have come to substitute media messages and morality tales for actual ethical and spiritual education (1979). The overlapping role of religion and advertising is noted by James Twitchell, who contends that, “Like religion, which has little to do with the actual delivery of salvation in the next world but everything to do with the ordering of life in this one, commercial speech has little to do with material objects per se but everything to do with how we perceive them” (1996, p. 110). Thus, we might classify religion, advertising, entertainment, and celebrity as examples of belief systems (i.e., certain ways of seeing the world, complete with their own sets of values) and use these paradigms to begin to understand their respective (and ultimately somewhat similar!) effects on the masses.

 

A Higher Power

Ideologies such as those found in popular culture, religion, or advertising tell believers, in their own ways, what is (and is not) important in society, something that Twitchell refers to as “magic” (1996, p. 29). Each manifestation also professes a particular point of view and attempts to integrate itself into everyday life, drawing on our desire to become part of something (e.g., an idea, a concept, or a movement) that is larger than ourselves. Perhaps most importantly, the forces of advertising, entertainment, religion, and art (as associated with high/pop/folk culture) play on this desire in order to allow humans to give their lives meaning and worth in terms of the external:  God, works of art, and name brands all serve as tools of classification. While cynics might note that this stance bears some similarities to the carnival sideshows of P. T. Barnum—it does not matter what is behind the curtain as long as there is a line out front (Gamson, 1994; Lasch, 1979)—these systems survive because they continue to speak to a deep desire for structure; the myth of advertising works for the same reasons that we believe in high art, higher education, and higher powers. Twitchell supports this idea by mentioning that “the real force of [the culture of advertising] is felt where we least expect it:  in our nervous system, in our shared myths, in our concepts of self, and in our marking of time” (1996, p. 124). Constructs like advertising or entertainment, it seems, not only allow us to assemble a framework through which we understand our world but also continually inform us about who we are (or who we should be) through a collection of narratives that serves to influence the greater perceptions of individuals in a manner reminiscent of the role of television in Cultivation Theory (Gerbner & Gross, 1976). The process of ordering and imbuing value ultimately demonstrates how overarching ideologies can not only create culture but also act to shape it, a process evidenced by the ability of the aforementioned concepts to consume and/or reference previously shared cultural knowledge while simultaneously contributing to the cultural milieu.

Given our reconsideration of mid-century cultural critiques, it follows that we should necessarily reevaluate proposed solutions to the adverse issues present within mass culture. We recall the advice of A Face in the Crowd’s Mel Miller (i.e., “We get wise to them”) and reject its elitist overtones while remaining mindful of its core belief. We recognize that priding ourselves on being smart enough to see through the illusions present in mass culture, while pitying those who have yet to understand how they are being herded like so many sheep, makes us guilty of the narcissism we once ascribed to the masses—and perhaps even more dangerous than the uneducated because we are convinced that we know better. We see that aspects of mass culture address deeply embedded desires and that our best hope for improving culture is to satisfy these needs while educating audiences so that they can better understand how and why media affects them. Our job as critics is to encourage critical thinking on the part of audiences, dissecting media and presenting it to individuals so that they can make informed choices about their consumption patterns; our challenge is to convincingly demonstrate that engagement with media is a crucial and fundamental part of the process. If we subscribe to these principles, we can preserve the masses’ autonomy and not merely replace one dominant ideology with another.


[1] It should be noted, however, that the comics of this time—those that belong to the end of the Golden Age and beginning of the Silver Age—also provide an additional understanding of the ways in which Americans indirectly wrestled with their fears.

[2] For a more exhaustive list of movies that support this point, see Wolfe, 2002.

[3] Let us also not forget the fact that Lonesome exhibits a rather patronizing attitude toward his audience in his later career, instituting the Cracker Barrel show with its manufactured country lackeys (Yates, 1974). In contrast to his first stint in Memphis, Lonesome has begun to embrace his country image as a means (if an inauthentic one) to connect with his audience, a point of contention to which we will return.

[4] Curiously, however, we see that this relationship between presidential addresses (like the aforementioned fireside chats) and mass media did not elicit notable complaints from critics who were generally wary of the merging of politics and entertainment (Quart, 1989; Benjamin, 1973). Although a larger discussion is warranted regarding the subtleties of this distinction, I would suggest that part of the differentiation stems from a high-low culture dichotomy. Although critics linked the negative presence of television with corporate advertising, James Twitchell suggests that there has always been a rather intimate relationship between arts and commerce, most saliently exhibited by wealthy citizens or entities who act as patrons (Twitchell, 1996).

[5] Certainly being female did not help, as American women are typically subject to a “halo effect” wherein their attractiveness (i.e., appearance) affects how they are perceived (Kaplan, 1978).

[6] Palin has continued the trend, currently employing the term “mama grizzlies,” a call-to-arms that hopes to rally the willingness of women to fight in order to protect things that they believe in. Interestingly, a term that reaffirms the traditional role of women as nurturing matriarchs has been linked to feminist movements, a move that seems to confuse the empowerment of women with a socially conservative construct of their role in American life (Dannenfelser, 2010).

[7] We can also see much work conducted in the realm of fan studies that supports the practice of subversive readings or “textual poaching,” a term coined by Henry Jenkins (1992), in order to discuss contemporary methods of meaning making and resistance by fans.


Zen and the Art of Journalism

If a tree falls in an empty forest, the inquiry goes, will it make a sound? Budding philosophers and scientists have struggled over this question for years—empirical evidence suggests no reason why a falling tree would not produce a crash, but we can never be positive if we cannot measure the reverberations. While some adults might dismiss this scenario as something suitable for idle chatter or meditation, we can recast the same question in terms of political or social movements:  if a group takes a stand but is not witnessed, do they make a sound? Do they have a voice? Do they even matter?

Questions such as these signal the importance of journalism in our political process. As a democracy, we purport that all citizens have the ability and the right to engage in the government’s decision-making process, but the reality is that news outlets play a large role in mediating our ability to be heard. Recognizing that information transmission and journalism have a rather unique (and profound) power to affect the public, authors Bill Kovach and Tom Rosenstiel argue for operational guidelines in their book, The Elements of Journalism.

Soliciting extensive feedback through interviews, meetings, and surveys, Kovach and Rosenstiel began to identify foundational principles that professional journalists deemed central to their craft. The Elements of Journalism represents the summation of the authors’ work and clearly articulates Americans’ long-held, and deep-seated, beliefs regarding the function of journalism in society. Recent news stories pepper the pages of The Elements of Journalism, vividly demonstrating deficiencies in our current system and urging readers to demand better. However, lest one think that this book represents a publication by journalists for journalists, Kovach and Rosenstiel also include relevant points regarding the role and participation of everyday citizens in the journalistic process.

Free to Be You and Me

The Elements of Journalism opens with the assertion that, in recent years, journalism has lost its way; the latter half of the 20th century has seen a decline in critical analysis regarding the role of journalism in society along with a loss of faith in the profession (CCJ Forum, 1997; CCJ Forum, 1998). Kovach and Rosenstiel argue that although journalists and laypeople continue to recognize the existence of journalism, not many have stopped to question why it exists in the first place.

Building upon an earlier work by Rosenstiel, which suggests that equitable distribution of information forms the foundation of democracy (Rosenstiel, 1990), the authors state, “the primary purpose of journalism is to provide citizens with the information they need to be free and self-governing” (Kovach & Rosenstiel, 2007, p. 12). In practice, this information could range from facts that allow individuals to hold elected officials accountable to notices about which roads are closed for repair; in ways large and small, journalism provides the information that people use to formulate their decisions and thus govern their lives.

Notably, the authors do not argue that journalism represents the only means for obtaining information but limit their position to argue that journalism, as a conduit for information, has a particular responsibility to its constituents in a democratic society. This information, provided by journalists, has the ability to shape citizens’ identities by producing, and defining, a collective conscious/unconscious; to a large degree, journalists are responsible for developing our knowledge of the world outside of our individual experiences. Moreover, in this new era, journalists are not necessarily limited to individuals possessing press credentials—contemporary society has created a number of ways that individuals can participate in the news making process. The once clearly defined boundaries between media and audience have begun to disappear, making the process of standardization even more difficult.

I’ll Be Watching You

Despite the apparent murkiness present in society, Kovach and Rosenstiel suggest a definite desire for journalism to act as an “independent monitor of power,” perpetuating the idea of checks and balances present in our government (Kovach & Rosenstiel, 2007, p. 140). Specifically examining this idea in the realm of politics, the role of media as a watchdog further allows the public to hold their elected officials accountable for their actions; without the press, ordinary citizens would be in jeopardy of becoming even more marginalized in the political process.

In 2004, America faced yet another highly contested race for President; four years earlier, the 2000 election and its infamous “hanging chads” had left a sour taste in the mouths of many. Despite the passage of the Help America Vote Act of 2002, states around the country continued to experience problems with voting, particularly with electronic voting machines. The months leading up to the 2004 election unearthed numerous accounts from districts that showcased problems with voter records, an inability to audit ballots cast, and missing votes (Anonymous, 2004).

Given this milieu, one would have expected mainstream media to be primed to respond to the aftermath of the 2004 elections, but the days after the event saw no meaningful immediate response from major news outlets. Once again, localities reported issues with voter fraud, voter intimidation, and electronic voting machines, but mainstream media refused to entertain the notion that the election was “stolen.” In fact, despite a sworn affidavit and testimony by Clint Curtis, a computer programmer and former employee of Yang Enterprises, indicating that he had developed software that potentially allowed Congressman Tom Feeney to flip votes, virtually no relevant mention of “Clint Curtis” (or “Clinton Curtis”) appears in the archives of the New York Times or the Washington Post. Compounding the issue, an article by Paul McLeary published in the Columbia Journalism Review states that “no major national publications…have seriously investigated how these very electronic machines were used to help steal the presidential election in Ohio 2004” (2006). Cynics might argue that the press had moved on to more important issues at the time or had no interest in upsetting the status quo, but the questions continued to linger and individual citizens began to step up to address the lack of information.

In what he calls an “exclusive,” Brad Friedman reported the story of Curtis on his website http://www.bradblog.com, complete with an animated picture of a siren to serve as a visual alarm (2004). As the story gained momentum, local papers and Wired began to report on the implications of the software mentioned in Clint Curtis’ affidavit but even these publications did not elicit a response from mainstream media. Although there seemed to be a flurry of discussion in smaller circles, the ramifications of the Curtis story did not seem to transition to the larger stage of national politics.

For some, this incident represents the breakdown of the press’ watchdog function in society—what good is mainstream media if it does not bring situations like Curtis’ to light and frame them in an appropriate context? On the other hand, this case demonstrates individuals’ commitment to upholding the ideal of independent monitoring set forth by Kovach and Rosenstiel.

Power to the People

Indeed, independence seems to be an increasingly difficult quality for mainstream media to maintain as conglomerates begin to absorb media outlets. Giants like AOL-Time Warner and GE/NBC raise questions about the impartiality of news outlets that reside under the umbrella of corporations with vested interests in various sectors of business. Kovach and Rosenstiel address this conflict of interest by asserting that journalism’s “practitioners must maintain an independence from those they cover” (Kovach & Rosenstiel, 2007, p. 118). Bloggers help address this need; while not completely (or necessarily) free of outside influence themselves, bloggers enjoy a degree of independence due to their grassroots nature, since individuals are not beholden to others in the same way that companies are.

Originally stemming from the blogging movements of the late 1990s, Citizen Journalism provided a way for the average person to interact with, and analyze, information coming from major outlets. Currently, a number of traditional news outlets have developed ways to tap into input from a mass market, using methods ranging from editorial selection to weighted measures of importance to determine which items are actually newsworthy:  iReport, uReport, FirstPerson, and i-Caught all represent outgrowths of “brand name” news sources. Additionally, increased access to information (largely due to the Internet) has resulted in the ability of individual people to check articles’ accuracy and provide informed commentary on a wide variety of subjects, which in turn has helped to engage the public in news. Blogging and Citizen Journalism, then, reflect an additional guideline created by Kovach and Rosenstiel:  “The essence of journalism is a discipline of verification” (Kovach & Rosenstiel, 2007, p. 79).

Due Process

In some ways, the rigor implied by verification seems lost on some media outlets (both mainstream and independent) today. Advances in technology and consumers’ expectations have fostered a culture that demands news instantaneously, with accuracy becoming something of a secondary concern. No longer are mainstream outlets competing only against each other to avoid being “scooped”—now, the amateur journalist has entered the picture and made competition even fiercer and more frantic.

The rush to publish material has come into conflict with the principle of authentication:  verification, after all, takes time. Mark Bowden relates a recent story regarding the news coverage of Judge Sonia Sotomayor’s nomination to the Supreme Court:  in the minutes after her nomination, multiple news channels displayed identical clips of Sotomayor that portrayed her in a less than flattering light (2009). Subsequent opinion polls continued to show the resonating effects of these clips, with individuals specifically citing comments about being a Latina judge or making policy as reasons for disapproval (Pew Research Center, 2009). Had the sound bites come from press conferences or major public speaking engagements, one might have been able to write off the simultaneous broadcast as mere coincidence—the snippets, however, were from rather obscure talks at Duke and Berkeley. Further investigation by Bowden revealed that a single person, Morgen Richmond, was responsible for the packages seen on the air. In and of themselves, the clips seem largely unobjectionable; the segments are not altered and accurately depict a section of a talk given by Judge Sotomayor. Instead, the problem resides in the willingness of major media outlets to air the videos without first fully understanding what they were unleashing onto society.

Building upon ideas outlined by Walter Lippmann, Kovach and Rosenstiel articulate some guidelines for the science of reporting that provide a systematic way for journalists to consider their material thoroughly while developing a story (Lippmann, 1995; Kovach & Rosenstiel, 2007). Looking at Kovach and Rosenstiel’s list, one finds remarkable similarity to the procedure for a high school laboratory report:  journalists are told to list resources, divulge methods such that their efforts can be replicated, do their own work, draw conclusions from their research, and expand upon the importance of their findings. Recalling the purpose of journalism as a conduit for foundational information, one can easily see that scientific research represents a type of journalism; although daily news certainly faces additional time constraints, why shouldn’t it employ the same rigorous approach as science?

The authors also note that this practice of verification differs from the similar, and somewhat more sinister, practice of assertion in that verification stresses the idea of transparency. Interestingly, the practice of asserting facts to bolster one’s case seems to have a curious place in society, as competitive debate in high schools and colleges often employs the same tactic; the prevalence of this technique prompted Kovach and Rosenstiel to write, “barely do people in journalism, even on the opinion end of the craft, market themselves as better arguers, but instead as more accurate” (2007, p. 84). However, as the Sotomayor case has shown, accuracy alone is not always sufficient to tell a story.

Truth Be Told

Fist in the air, a Navy lawyer engages with a witness:  “I want the truth!” he shouts. This line, uttered in the 1992 movie, “A Few Good Men,” has managed to etch itself into the minds of a certain generation of people but also seems to voice the desires of many citizens in a democratic society. For Kovach and Rosenstiel, the concept of truth encompasses much more than factual information:  “it is a sorting-out process that takes place between the initial story and the interaction among the public, newsmakers, and journalists…This is what journalism is after—a practical or functional form of truth” (2007, pp. 41-42). Indeed, looking once again at the Sotomayor case, we can see that understanding the truth of her words required an appreciation of context and nuance.

Furthering the discussion, Harry Frankfurt argues in his publication, On Truth, that the concept of truth has intrinsic value. According to Frankfurt, “Individuals require truths in order to navigate their way effectively through the thicket of hazards and opportunities that all people invariably confront in going about their lives”—a statement similar in spirit to Kovach and Rosenstiel’s assertion regarding the primary purpose of journalism (Frankfurt, 2006, pp. 34-35). Truth, then, forms the basis for our decision-making process and journalism’s duty is to convey these truths to the public so that they can accomplish this task; in other words, as stated by Kovach and Rosenstiel, “journalism’s first obligation is to the truth” (2007, p. 36).

On occasion, individuals like Jayson Blair or Maureen Dowd will appear, challenging the profession’s responsibility to its citizens and the truth. We might be tempted to dismiss people like Blair as anomalies—surely we can trust our press to discourage this sort of behavior? What happens, however, when a relative disregard for the truth is institutionalized?

Media Matters for America, a non-profit media watchdog, released a compilation of records indicating that the “news” provided on Fox News’ broadcasts echoed the opinionated or editorialized rhetoric that has become the channel’s hallmark (Levin, 2009). While this revelation might cause a sense of unease to develop, the real danger resides in a consistent and sanctioned ideology that diminishes the importance of truth in our society. Newspapers have also been found to engage in dishonest acts, which can only serve to call their integrity into question (Project for Excellence in Journalism, 2005). As Kovach and Rosenstiel note, “Oppressive societies tend to belittle literal definitions of truthfulness and accuracy” (2007, p. 37). To make matters worse, a wealth of information and a culture of affirmation have also allowed polarizing argumentation to flourish, in effect creating silos of knowledge that attract fragmented communities. Now, instead of weighing all sides of an issue, citizens aim to aggregate as much support as they can, thinking that this somehow makes them more “right.”

This bleak picture seems to underscore the need for individuals and companies to reaffirm their commitment to the ideals of truth. However, while we demand much of our journalists, we should also note that the process of being truthful is not always easy. Sometimes journalists find themselves pitted against entire corporations or industries, as Mike Wallace did in a 1995 case involving former Brown & Williamson executive Jeffrey Wigand and the tobacco industry.

Although Wigand’s name might not seem familiar, many will recognize the incident as the first time that a representative of Big Tobacco admitted knowledge about nicotine’s addictive properties. While Wigand’s actions were seemingly in the public interest—didn’t America deserve to know that tobacco companies were willfully causing their consumers to become addicted to a product that has proven links to detrimental health effects?—Wigand was not unilaterally lauded. Despite some measure of protection afforded to whistleblowers (e.g., the Whistleblower Protection Act), rocking the boat continues to have negative repercussions, possibly resulting in a hostile work environment or harassment (Colby, 2006).

60 Minutes, a televised news program, began to promote a story that it had produced surrounding Wigand but then backed off as fears regarding the release of confidential information started to mount (McLeary, 2005). An article released by CBS reports that management was not afraid that they would get caught in a lie—it seemed as though nobody disputed the truth in Wigand’s statements—but instead was worried that they would be seen as responsible for the breach of a confidentiality contract (Leung, 2005). The truth, Brown & Williamson said, could be told, but for a very steep price. Thus, in addition to highlighting complications inherent in reporting the “truth,” the Wigand story demonstrates a further constraint that limits media’s ability to operate independently:  a struggle to reconcile proprietary information with the latitude granted by the First Amendment.

It’s a Start

It seems difficult to argue with the points laid out in Kovach and Rosenstiel’s book—none of their guidelines contains any inherent flaws. However, the book represents a primer of sorts; Kovach and Rosenstiel manage to give a report on the current shortcomings in journalism for individuals unaware of media’s present state, but not much more. The ten elements identified by the authors represent a thorough investigation of journalism and a deep understanding of the craft, and if journalists adhered to the standards set by Kovach and Rosenstiel, and consumers kept them in mind, a new standard for journalism could be set. In particular, the information seems to be of value for individuals who have not previously spent a significant amount of time thinking about the role and mission of journalism in society (democratic or otherwise). Ultimately, however, Kovach and Rosenstiel do not seem to bring any new information to light, but instead serve to organize and articulate knowledge already held within the journalistic (and, to some extent, general) community. The problem with this lies in the lack of solutions to the problems depicted in the work—to be fair, the book does not purport to offer any. Kovach and Rosenstiel present a Bill of Rights to citizens that might help order their thinking and expectations but do not offer a plan of action to effect meaningful change in journalism or the news media. In particular, the authors indicate the dangers of letting business interfere with journalistic standards, but do not teach citizens how to stave off this infection or lessen its impact.

Kovach and Rosenstiel also address the developing cultures of affirmation and infotainment throughout the course of their book, deftly noting their flaws along the way, but do little to cause practitioners to see the value in a more rigorous form of journalism. Although the authors note the deleterious effects of such reporting (including the fragmentation of society!), their words do not seem sufficient to persuade others to reform their ways. Entertainment television and gossip magazines fill our collective consciousness, and The Elements of Journalism does not address the root causes of our fascination with these, instead focusing solely on their aftereffects. Furthermore, the book does not seem geared toward the consumer in general:  it seems as though it would only attract the attention of journalists or those already disgruntled with the current state of journalism. Again, while the book does not purport to be all things to all people, it appears as though the authors missed an opportunity to spread their message further.