Although utopia—and perhaps more commonly, dystopia—has come to be regularly associated with the genre of Science Fiction (SF), it seems prudent to assert that utopia is not necessarily a subgenre of SF. Instead, as a result of the Enlightenment's shift toward secular and rational thinking, the modern notions of progress and idealism inherent in Western utopian thought find themselves intimately connected to science and technology in various forms. Early twentieth-century American characters like Tom Swift, for example, articulated the optimism and energy associated with youth inventors, highlighting the promise of youth and new technology.
After Robert A. Heinlein's partnership with Scribner's helped to legitimize science fiction in the late 1940s through the publication of Rocket Ship Galileo, the genre began to flourish and, like other contemporary works of fiction, increasingly reflected concerns of the day. Still reeling from the aftereffects of World War II, American culture juggled the potential destruction and utility of atomic energy while simultaneously grappling with a pervasive sense of paranoia that manifested during the Cold War. As with many other major cultural shifts, the rapid change in the years following World War II caused Americans to muse over the direction in which they were now headed; despite a strong current of optimism that bolstered dreams of a not-far-off utopia (see Tomorrowland in Disneyland), there remained a stubborn fear that the quickly shifting nature of society might have unanticipated effects. Very much grounded in an anxiety-filled relationship with developing technology, this new ideological conflict undercut the optimism afforded by consumer technology's newfound modern conveniences. Life in the suburbs, it seemed, was too good to be true, and inhabitants felt a constant tension as they imagined challenges to their newly rediscovered safety: from threats of invasion to worries about conformity, from dystopian futures to a present reality that could now be obliterated with nuclear weapons, people of the 1950s continually felt the weight of living in a society under siege. An overwhelming sense of doubt, and more specifically paranoia, characterized the age, with latent fears manifesting in literature and media as the public began to struggle with the realization that the suburbs did not fully represent the picturesque spaces that they had been conceived to be.
In fact, inhabitants were assaulted on a variety of levels as they became disenchanted with authority figures, feared assimilation and mind control (particularly through science and/or technology), began to distrust their neighbors (who could easily turn out to be Communists, spies, or even aliens!), and felt haunted by their pasts. In short, the utopia promised by access to cars, microwave dinners, and cities of the future only served to breed frustration in the 1960s as life did not turn out to be as idyllic as advertised.
Suggesting that utopian and dystopian notions were not intrinsically linked to technology, this pattern would repeat itself in the 1980s after the promises of the Civil Rights, environmental, Women’s Liberation, and other counter-cultural movements like the Vietnam War protests faltered. To be sure, gains were made in each of these arenas, reflected in an increase in utopian science fiction during the 1970s, but stalling momentum—and a stagnating economy—caused pessimism and disillusionment to follow a once burgeoning sense of optimism during the 1980s. On a grander scale, bolstered by realizations that societies built upon the once-utopian ideals of fascism and communism had failed, the 1980s became a dark time for American political sentiment and fiction, allowing for the development of dystopian genres like cyberpunk that mused on the collapse of the State as an effective, beneficial regulating entity. Reflected in films like The Terminator, a larger travesty manifested during the decade through an inability to devise systemic solutions to society’s problems, as we instead invested our hopes in romanticized rebel groups with individualist leaders. Writing in 1985, author John Berger opined that repeated promises from Progressive movements in the past had contributed to society’s growing sense of impatience. Berger’s sentiment holds powerful resonance today: we can see reflections of his statement in President Obama’s campaign slogan and the backlash that followed his election to office—“Hope,” it seems, capitalized upon our expectations for a future filled with change but also sowed the seeds of discontent as the American public failed to witness instantaneous transformation. For many in the United States, a lack of significant, tangible, and/or immediate returns caused fractured utopian dreams to quickly assume the guise of dystopian nightmares.
Furthermore, these cycles set a precedent for the current cultural climate: the promises of new developments in communication technologies like the Internet—particularly relevant is its ability to lower the barriers of access to information—have turned dark as we have come to recognize the dangers of online predators and question the appropriateness of sexting. Moreover, technological advances that allow for the manipulation of the genetic code—itself a type of information—have allowed us to imagine a future that foresees the elimination of disease while simultaneously raising issues of eugenics and bioethics. Shifting our focus from the void of space to the expanses of the mind, utopian and dystopian fiction appears to be musing on the intersection of information (broadly defined) and identity. Spanning topics that feature cybernetic implants, issues of surveillance and privacy, or even the simple knowledge that a life unencumbered by technology is best, ultimately it is access to, and our relationship with, information that links many of the current offerings in utopian/dystopian Science Fiction.
 Francis J. Molson, “American Technological Fiction for Youth: 1900-1940.” In Young Adult Science Fiction, edited by C. W. Sullivan III. (Westport: Greenwood Press, 1999).
 Comics at this time, for example, also spoke to cultural negotiations of science and progress. For more about the establishment of Science Fiction as a genre, see C. W. Sullivan III, “American Young Adult Science Fiction Since 1947.” In Young Adult Science Fiction, edited by C. W. Sullivan III. (Westport: Greenwood Press, 1999).
 Bernice M. Murphy, The Suburban Gothic in American Popular Culture. (Basingstoke: Palgrave Macmillan, 2009).
 See Paul Jensen, “The Return of Dr. Caligari.” Film Comment 7, no. 4 (1971): 36-45, or Gary K. Wolfe, “Dr. Strangelove, Red Alert, and Patterns of Paranoia in the 1950s.” Journal of Popular Film (2002): 57-67, for further discussion.
 Peter Fitting, “A Short History of Utopian Studies.” Science Fiction Studies 36, no. 1 (March 2009): 121-131.
 Lyman Tower Sargent, “In Defense of Utopia.” Diogenes 209 (2006): 11-17.
 Constance Penley, “Time Travel, Primal Scene, and the Critical Dystopia.” Camera Obscura 5, no. 3 15 (1986): 66-85.
 John Berger, The White Bird. (London: Chatto, 1985).
Although it seems doubtful that viewers of the 2011 Video Music Awards (MTV) would conceptualize their actions in such a fashion, they were, in part, observing a celebration of the image and of semiotics. Having become literate in the “language” of music videos, audiences undoubtedly learned to extract meaning from the images paired with songs downloaded on mp3 players. And yet, although each of the featured music videos could be deconstructed in productive and informative ways, one of the most interesting moments, for me, was the tribute to Amy Winehouse.
The ’50s aesthetic of Bruno Mars aside, the choice to pay homage to a fallen singer through the use of a motif directly derived from Andy Warhol’s pop art was a curious one, as Warhol’s work was born out of a response to a viewership inundated with the mechanically reproduced image. Repetition, for Warhol, spoke to the relative meaninglessness of the image in a culture saturated by media but also challenged viewers to then reconcile slight variations in the image, begging audiences to develop a discerning eye. In some ways, examples like the Winehouse tribute cause me to wonder about the modern visual sensibility, for it seems as though the very process that Warhol spoke out against has come back to haunt him; the very notion of an image’s repetition rendering it meaningless has come to apply to Warhol’s own work!
And perhaps there was something to the concerns of Warhol (along with a slew of cultural theorists in the ’50s), who was reacting to recent and rapid advances in broadcast technology. Consider, for example, that radio had been popularized a scant fifty years prior and had vastly altered critics’ understanding of media’s potential impact, creating a precedent as it proliferated across the country and began to develop a platform for solidarity and nationalism. Yet, while the effects of radio were decidedly pro-social, due in part to its propagation of orchestral music and transmission of fireside chats, television was viewed as a corrosive force on society that spurred on the destruction of culture instead of enriching it. For the critics of the Frankfurt School, television was indicative of an entrenched sentiment that regarded mass-produced culture as formulaic and perfectly suitable for a generation of passive consumers who sat enraptured in front of the glowing set. Associating the potential dissemination of propagandist ideology with television as a form of mass broadcast, cultural theorists evoked notions of totalitarian regimes akin to those of Hitler and Stalin in an effort to illustrate the potential subjugation of individual thought (Mattson, 2003). These simmering fears, aggrandized by their concurrence with the rising threat of Communism and collectivist cultures, found fertile soil in the already present anxiety-ridden ethos of the United States during the 1950s.
Although the cultural climate has changed somewhat since the mid-20th century, modern Americans continue to find themselves assaulted by images, particularly in the form of advertising.
The advertisement above, for example, represents a fairly straightforward—and yet beguilingly complex—image to promote a sound dubbing service. Undoubtedly played for a laugh, the image nonetheless rewards careful analysis with a potentially more disturbing reading of the situation.
The most immediate reading is likely that of a threesome (undoubtedly playing off of the cliché that sex sells), but closer inspection suggests that this is a particular type of engagement and most definitely geared toward a particular type of audience. Without much effort, we can clearly see that this ad portrays an encounter between two women and a man (it is unlikely that any of the participants are transsexual, as nothing in the ad suggests as much to the reader) in fulfillment of the stereotypical straight male fantasy. Furthermore, the man is actively engaged in the act of viewing the sexual encounter between the two women, suggesting an element of voyeurism and the male gaze; in contrast to the women, who are embracing with their eyes closed, the man is not shown to have any physical contact (his body posture actually suggests that he is pulling away, creating space for the women to “do their thing”). Women, in this scenario, exist to be watched and/or perform for the man as they reinforce the notion that lesbianism (temporary or otherwise) exists to provide males with sexual pleasure. In some ways, we might also think about how this image reaffirms a heteronormative stance (with a small exception for lipstick lesbians who perform not for their own pleasure, but for a man’s) with traditional gender representations.
The copy also serves to reinforce a male-dominated view of sexuality, with the laugh coming from the disconnect between the line’s attribution to a man versus a woman. Humor in this scene derives from the fact that we, the viewers, are supposed to chuckle that a woman is uttering the phrase “If my dad could see me now, he’d be so proud” because we know that no woman would ever say such a thing. We are, then, operating under a social/family structure that speaks to the notion of the good girl (or possibly “daddy’s girl”) and assume that if this woman’s father actually saw her, he would be disappointed rather than proud. Moreover, we see that the “correct” attribution of the phrase is its attachment to the male, which then positions him as a member of an all-boys club that he and his father can participate in. The underlying suggestion is that father and son can share a fond story over the son’s sexual conquest of two women (perhaps invoking a similar story from the father’s youth) and that the father, like the son, upholds the devaluation of women. Looked at in another way, we can see Williamson’s referent systems at play in this ad: man is asserting his ability to “tame” the wild/sexual nature of women, moving them from “raw” to “cooked.”
Interestingly, this ad says nothing about the ability of the company in question (Herbert Richers Sound Dubbing) to fulfill the function for which they would be hired, instead relying on goodwill generated by humor to translate into positive affect regarding the brand. Given the nature of the advertisement, the company must rely on a visual representation of the syncing process (in this case, showcasing something that is out of sync), and the purpose of the ad seems to be to create name recognition, for no contact information is given.
In order to study this image further, it would be helpful to know more about the context in which it appeared. As Herbert Richers is a Brazilian dubber, it stands to reason that this ad was displayed in Brazil (although the copy is in English and not Portuguese), but the question remains whether it appeared on a website, in a magazine, or on a billboard. Was it part of a campaign that played on similar themes (and how might it affect our reading if we knew that there was a mirroring ad that featured two men and a woman)?
Mattson, K. (2003). Mass Culture Revisited: Beyond Tail Fins and Jitterbuggers. Radical Society, 30(1), 87-93.
A twenty-something coming to terms with being an adult in Southern California, Chris Tokuhama is interested in attempts to transcend the human body. With research interests that range from Early Modern Science to Gothic horror and transhumanism (with a bit of religion sprinkled in), Chris hopes to increase media literacy in youth so that, in future threesomes, all partners can participate equally. Read more about Chris’ take on pop culture on his blog or follow him on Twitter.
In a flash of light it happens.
It makes sense that Sookie’s faerie powers help to set Eric right, for his reclaimed memory is conveyed through a set of remembered images. Faeries, as masters of light and image, allow us to realize what has been in front of us but just out of sight. And, what does it mean that the show conflates Wicca and witchcraft? Is magic grounded in Nature and, if so, what does it mean to break that spell?
Nan, with her head in the game, reminds us that image is all important.
In so many ways, this episode is about collapsing the structures that we have erected around us: physical, ideological, and of course those associated with personal identity. (It’s a more direct way to show that everyone’s starting to wake up from their various dreams.) Tommy, who gave up on himself a long time ago, killing himself time and time again as he became anything he could to escape, finally finds release.
And as heartbreaking as Tommy is, we have Marnie, who becomes grosser (and more dangerous) than we could have imagined, for she is the kid who’s so beaten down by the world that she has become vengeful. She will bring the world down just to see the pain end. It’s a dangerous proposition at best and one that will only get uglier before it gets better.
Manipulation seems like such a dirty word.
And yet, as a Social Psychology student, my books were filled with terms like “influence,” “persuasion,” “schema activation,” and “behavior modification”—apparently, my undergraduate years were spent learning how to constrain the range of salient choices available to others. Over the years, my work has evolved, but I often think back to my initial interest in the subject and how it was closely linked to the work of Goffman, although I was not able to articulate the connection at the time.
As a former admission officer, the implications of self-presentation are often glaringly obvious for anyone who has set foot inside of a high school: from stereotypical cliques to personal dress and demeanor, students broadcast an incredible number of messages as they attempt to manufacture, consolidate, express, and identify their senses of self. Goffman suggests that this exchange of information can occur through processes that the actor willfully controls (or is perceived to) and those that are communicated without conscious knowledge. Further exploring the relationship between audience, message, and source, we encounter the work of Gross who labels sign-events either “natural” or “symbolic.” Although natural sign-events undoubtedly have a measure of usefulness when it comes to interacting with the physical world around us, the symbolic seem most relevant to Goffman’s discussion of interpersonal interaction and the nature of socially-constructed reality.
Although we might discuss the potential of fashion to figure into this process, I tend to think about it more broadly in terms of information and power; as Goffman mentions, a constant tension exists between actor and audience as both parties attempt to ascertain who knows what about whom. And, in many ways, possessing a more complete picture of the situation allows one to better dictate the nature of the interaction, for understanding the rules of the game (i.e., Goffman’s “working consensus”)—or even knowing that you are playing one in the first place!—leads to more desirable outcomes, particularly when we are trying to deceive others about who we are.
There seems to be a biological imperative for deception, as the act, in its many forms, can serve to reduce the cost of obtaining something of value (e.g., goods, services, protection, contentment, etc.), but while animals have traditionally employed this tactic for self-preservation (e.g., mimicry), human beings have taken the practice to more complex levels. Perhaps unsurprisingly, as we slowly exit the Age of Information, many current deceptive practices revolve around the manipulation of knowledge. Online, we might “fudge” our profile pictures in an attempt to lessen the rejection that we so desperately seek to avoid in real life or we might alter a personal characteristic in order to test the waters of a new identity in an environment that allows us to process anxiety and judgment from the safety of our homes. I often wonder how those of us who use social media cultivate our profiles, tending to them like gardens: to what extent do we fetishize our online presence, letting it define us instead of the other way around? It would seem that while the relative ease of online deception confers us some cognitive defense, it also threatens to overwhelm us with delusion.
We lie to others and, perhaps even worse, lie to ourselves. We look outward for acceptance and affirmation instead of delving inward to confront the deepest parts of ourselves. Technology has allowed us, as individuals, to connect over vast differences and afforded us many opportunities that we might not otherwise have; yet, in some ways, it has also left us disconnected from the things that (arguably) matter the most.
 Although this is a separate topic, I am incredibly interested in the ways in which we perform acts of self-deception, as I think that these have the potential to harm us in spectacular ways. Using Goffman’s theater metaphor, I think it is fascinating to consider what happens when the “backstage” is really just the “front stage” for our inner psyche.
2005 was the year that I became a fan. It was a year full of turmoil for me: within a few months I had graduated from college, learned how to bribe Mexican police, started a serious relationship, started a new job, and moved. Quite a lot for a 22-year-old with plenty of education but no real life skills.
And then, PostSecret happened.
PostSecret started out as an art project in 2005 that simply asked people to submit a secret anonymously on a decorated postcard. These secrets were then published on the Internet through the PostSecret blog or in a series of books. No real guidelines were given to participants—they merely had to reveal something about themselves that they had never shared before.
I soon found myself anxious as I waited for the weekend to roll around, specifically Sunday morning, bringing with it a new batch of secrets. I was fascinated with the statements on these cards. Many of the images reflected my own fears and hopes (some I didn’t even know I had) and ranged in topic from secret crushes to quiet crises to guilty consciences.
Over the years, I have been exposed to hundreds of images, but, in particular, there is probably one card that I will never forget. It said, “I wish I had been a better sister than you were a brother. Yours was not the only life you took. I miss you. I hate you. I love you. I am sorry.”
Despite its humble beginnings, PostSecret has developed into a full-fledged community with over 300 million site visits, eventually resulting in a movement with an interest in suicide prevention. Although some members surely embrace this philosophy more than others, the founder, Frank Warren, has occasionally mobilized the group in support of the Kristin Brooks Hope Center and its teen crisis line.
We can measure civic engagement by the number of dollars raised for suicide prevention, but we can also think about how this particular community has allowed for the development of an individual’s empathy. This group has permitted members to feel what it means to be a part of a community and surely that has some value in developing the skills necessary for participatory culture. Movements like PostSecret have the power to allow our young people to realize how they relate to others in society.
Frank put forth this idea in the form of a statement followed by a question: “Everyone, no matter how long you’ve known them, probably has a secret, that, if he or she told you, would break your heart. How does this change the way that you interact with those around you?” I ask, “How does this change the way that you think about those around you?”
A year after I saw this image, my friend’s brother killed himself. This card, written by an anonymous stranger, allowed me to connect with my friend in a way that had previously seemed impossible. An only child, I had no idea what it was like to lose a sibling, much less to have a loved one commit suicide. But this card—this image—elicited feelings that are the closest that I will get to understanding my friend’s pain in that moment.
Seeing that card allowed me to engage in a dialogue instead of being scared or overwhelmed. Seeing that card pushed me to work through a script in a space where I felt safe to explore my emotional limits. Being a fan of this community had equipped me to handle this unforeseen situation and gave me tools that afforded action. This, on a very personal level, is the potential power of fandom.
Participation in this community can cause members to undergo a paradigm shift as they see the world through entirely different lenses. The inherent cynic and former college admission officer in me wonders if these sorts of affiliations result from mystique or ambiance rather than inherent virtue. I’d argue that, in some ways, young people have become much better at manipulating their image and that the joining of these clubs can represent a form of brand management; students leverage their brand in order to gain social capital.
Communities of participation like PostSecret develop their own knowledge base but also a common language and a shared understanding. Projects like these brand themselves and their participants as part of a movement—affiliation with the group implies that you have incorporated certain things into your identity.
Influenced by Linguistics, Kenneth Pike constructs a set of terms (etic and emic) that help describe different observational stances. However, although the classification utility of Pike’s terms seems without question, the major contribution of Pike’s work seems to be a shift from either/or thinking to one that is described as both/and. Although Pike describes the merits of both etic and emic lenses, he also importantly suggests that a unit of information can assume varied—but reconcilable!—layers of meaning. In fact, Pike alludes to the interconnected nature of etic and emic positions, noting that etic work might provide general theories that are refined by emic inquiry, which in turn might generate newer working hypotheses that are etic in nature. Furthermore, Pike implies that a more sophisticated analysis can occur when a researcher is able to see/understand both layers simultaneously—in other words, being able to conceptualize how data fits into a broad/horizontal array while also appreciating its role in a closed/vertical system.
In a similar fashion, James Carey’s chapter in Communication as Culture speaks to a paradigmatic model of Communication that also features two complementary—though distinct—views. Throughout this work Carey endeavors to expand the perception of communication, challenging readers to look past the more commonplace understanding of communication as a means of information transmission. Invoking concepts that highlight religion and ritual as a community of practice, Carey’s words overlap with Robert Orsi’s thoughts on lived religion. Using religion as a lens, Carey comments on religion’s ability to reflect culture even as it acts to create it: a critical study of communication allows one to glimpse the ordering structures that serve to imbue symbolic meaning in culture while our interaction with media (e.g., checking a website for news as part of a daily routine) becomes a ritual whose practice is then incorporated into culture! In fact, in some ways, Carey’s ideas speak to communication as a force that can act to maintain or change; it is in this dual nature that we see resonance with Pike, for communication seems to be able to assume these roles without contradiction. Moreover, etic and emic lenses can be applied to Carey’s newspaper analogy, for while one can see that the same general categories of news continue to exist (e.g., Politics, Economics, Entertainment, etc.), the specifics of those topics (e.g., amount of space/attention given, tonality, order in the newspaper) are often particular to a given culture in a particular temporal context.
The major contribution of Carey, however, seems to be a recognition of communication’s ability to superimpose an additional layer of meaning upon the world: in effect, communication transforms the reality of “what is” into a reality of “what is understood” in a process that mirrors the distinction between sensation and perception in Psychology. Although the presentations can assume various forms or modes, the important point seems to be that reality is realized through communication and it might be argued that, in some ways, the symbolic reality matters more than whatever the objective reality might be. In short, the reality formulated by communication constitutes a reality of meaning.
In this regard, Carey’s work builds upon work by Gross, who speaks to the symbolic power of communication and reminds readers (through Piaget) that communication contains an ability to structure reality. Furthermore, although some data may have inherent value, it becomes useful knowledge only after it is construed in relation to other bits of information—in other words, information becomes valuable largely because it is structured. Challenging readers to look beyond the American privileging of verbal symbols, Gross speaks to other modes in a process that foreshadows Carey’s example of maps that can be pictorial, oral, or kinesthetic. In some ways, we continue to suffer from the culture lamented by Gross in modern scholarship with American Higher Education largely privileging lexical skills while generally dismissing multi-modal production; although the model seems to be changing slowly, most educational settings appear to ask that students continually funnel symbolic knowledge through speech or writing.
Also speaking to the confluence of education and communication, Edward Hall points out the pervasive influence that learning styles can have on culture. Although it may seem obvious to any who would study culture in the modern age, Hall points out that ways of seeing are shaped by the learning process; cultural misunderstandings, then, result from an inability of people to transcend their own learning styles and to assume the perspective of others. Applying this to Communication, it reminds us that we must not only study content but also endeavor to understand how a particular piece (or set) of information was acquired.
Extrication is the name of the game.
What happens when you’re in over your head and up against things that you couldn’t possibly understand? What happens when the adage holds true and “too much of a good thing” becomes your reality? When the thing you wanted becomes the thing you fear?
The only way out isn’t up—the only way out is through. Debbie, Tommy, and Lafayette get it. Jessica’s finally got the tools. Jesus and Jason got it because they’ve lived through this before. Eric, Sookie, and Bill will get it in the end.
And Marnie? She’ll get it just before she dies. Again.
It’s hard to come back from an episode that ends with a fade to white and a gasp. Taking a bit of a breather (was there ever any doubt that Jessica was in real danger?), this episode gives us a chance to reflect on times when the one thing we want—the thing that we burn for—is the very thing that will kill us. This story has been told time and again throughout history, with varying levels of moral shading, but, in some ways, it’s one of the things this show has always been about. It’s really kind of amazing, when you think about it—in this season, there’s the idea that the repressed part of you is going to destroy you, the notion that we will kill ourselves in order to save or protect the ones we love, and this last bit about the death drive.
And maybe this is particular to me, or the way that I see the world, but my favorite episodes of this show, Caprica, BSG, or Six Feet Under are always ones where things are crumbling down everywhere you look. I suppose that part of it is that I trust these shows and know that the breakdown is delicious because it helps the characters prioritize and realize what is really important and what is really worth fighting for. I am always interested in the choice to become hard or to become strong, and episodes penned by Alan Ball do that so well.
Knowing, for example, that Eric will eventually get his memory back only makes it sweeter when Sookie allows herself to believe that Eric will never betray her. Sookie being happy and/or in love are somewhat surface issues for me—the real question is how, when, and why we choose to pursue a path that we know is going to come back to bite us in the end. The pain is going to be that much worse for all that we put into it. I don’t think the show comes down on either side but hopefully causes viewers to think about which choice is right in their own lives.
In a move long-suspected by many, Texas governor Rick Perry officially declared his intention to seek the office of President this past Saturday. Perry, who garnered national attention with his rally, The Response, once again invokes—or at least should cause one to question—the manner in which religion has been interwoven into a political climate that has, of late, seemed to largely fixate on the economic issues of budgets, debt, and unemployment.
Without diminishing the importance of these topics or their coverage, the recent debates in Iowa suggest that understanding the potential impact of religion in the various GOP campaigns is of value whether one identifies as Republican or not. Beyond the gaffe of news anchor Ainsley Earhardt, and the larger discussion (and negotiation) of Mormonism that it references, religion’s presence seems to have manifested in subtle, but potentially significant, ways throughout this campaign season.
Responding, perhaps, to a recent poll that indicated Americans’ preference for a strongly religious President (despite not being able to correctly identify the specific beliefs of major candidates), Fox News displayed a graphic during the Iowa debates that indicated three pieces of information: religion, marital status, and number of children. Interestingly, this graphic was paired with another image showcasing each individual candidate’s political experience, perhaps suggesting that Fox News considered these two sets of information equally important for viewers.
And, in a way, maybe they are.
During the debates on Thursday, Byron York asked candidate Michele Bachmann about how her religious beliefs—specifically her belief in the virtue of submissiveness—might affect her behavior, citing her prior decision to become a tax lawyer as a result of her understanding of God’s desire as channeled through her husband. Although this inquiry elicited a strong display of displeasure from the audience as extraneous or unfair, the question seemed designed to probe Bachmann’s decision-making process in the past as well as what might shape her choices in the future if she were to become President.
So maybe the relevant concerns aren’t necessarily what religion a person is or isn’t (although this does not excuse the propagation of misinformation), but rather specifically how these beliefs influence a candidate’s perception of the world and the behavioral responses that those filters elicit. Undoubtedly, religion plays a role in shaping our understanding of the world and the range of perceived actions that is available to us at any given moment.
But beliefs aren’t exclusive to the religious community: if the recent skirmishes over the federal debt ceiling have taught us anything, it is that we demonstrate a potential aversion to complexity or are perhaps slightly overwhelmed by the enormity of problems posed by the modern world. Our own response to these looming presences is to streamline the world, tending to engage with our environment in the specific, and limited, ways that align with our mental picture of the world.
So, before we criticize Rick Perry’s drive to ask God to fix America—as tempting as it might be for atheists and secularists—we need to examine the human desire to seek out, and subscribe to, simple answers that are readily available in times of crisis. This impulse, which seems to have largely assumed the form of religious rhetoric in the current round of Republican campaigns (and one might even argue that the content itself is not necessarily spiritual in nature if we look at the reverence given to the invocation of Reagan), seems to be the real, and often under-discussed, issue at play. Although a more arduous task, I believe that appreciating the power and presence of religion in this process will afford us a richer understanding of the American people and their relationship to contemporary politics.
Chris Tokuhama is a doctoral student at the USC Annenberg School for Communication and Journalism where he studies the relationship of personal identity to the body. Employing lenses that range from Posthumanism (with forays into Early Modern Science and Gothic Horror), the intersection of technological and community in Transhumanism, and the transcendent potential of the body contained in religion, Chris examines how changing bodies portrayed in media reflect or demand a renegotiation in the sense of self, acting as visual shorthand for shared anxieties.
The world is a dangerous place not because of those who do evil but because of those who look on and do nothing.
These words, known to most who have ever encountered a course on Ethics, set the tone for HBO’s documentary Superheroes, which profiles individuals involved in the Real Life Superhero movement. Although there is truth to Einstein’s words—for failing to stop injustice can represent a form of evil—the question becomes one of perspective: in a simplified system that includes three perceived parties (victim, perpetrator, and savior), the solution seems rather obvious for we know what we should do, regardless of whether we actually intercede. But what happens when we situate the same concept in the context of a community or society? Vigilante justice leads to societal breakdown as we each enforce our own moral codes.
It is for this reason that I’m often suspicious of these individuals, who undoubtedly have good intentions. Although it is easy to judge and express disdain for grown adults who seem out of touch with society, the fringe often worries me. I’m not so much concerned that they do not look like I do or that they “dress up,” but rather that they do not play by my rules. And, ultimately, isn’t it just a short hop from there to a defining characteristic of a villain? Regardless of whether the individual in question is a nuisance, helpful, or a menace, the fact that he or she has checked out of the system that you live in should raise some red flags.
The invocation of Kitty Genovese is also perhaps unsettling because it misses the forest for the trees. Again, on an individual level, the story is perhaps one that inspires you to action, but donning your gear does nothing to address the larger issue of apathy, the bystander effect, or the diffusion of responsibility (whatever you choose to call it).
And although it has not yet come to this, what if super villain groups formed in opposition to these real life superheroes? Not just gangs or the mafia, but groups of individuals who preferentially targeted those who would do good? The streets would devolve into a sort of war zone with casual citizens and public property caught in the crossfire. I suspect that these superheroes only survive because nobody is actively trying to hunt them.
This of course raises the notion of who should be a superhero.
I certainly get some of the impulse to become a superhero, transforming yourself into a powerful figure as you draw out the traits in yourself that you most admire. In many ways, I am all for that. And I also recognize the power and prestige that comes from donning a mask or a cape or a costume—these accouterments are symbols of your office and confer power, status, and meaning.
My constant struggle, however, is to walk the fine line between seeing the everyday as worthy of superhero status while fighting the impulse to disconnect from the system. In the best possible world, everyone is a superhero and everyone works together (using the particular talents that we each have) to contribute to a whole. In some ways, perhaps, similar to Communism (and we know how that went), but never with the expectation that you are going to necessarily get something for your efforts. You work because you believe in the system and because you place hope in your community.
And is this also just another symptom of a society that has become desensitized to the extreme and the spectacular? That we believe that, in order to be empowered, we have to become superheroes? Has the groundwork been laid by Stan Lee’s shows, which have “trained” real life superheroes? Where do we draw the line between being a good citizen (one who is, perhaps, civically engaged!) and having to create a superhero persona? Ultimately, why can’t we integrate the actions that we would undertake as a superhero into our everyday sense of self?
The old questions that were first raised with virtual selves and MMORPGs continue to haunt me: are these people who are compensating for a feeling of powerlessness elsewhere in their lives? In some ways, their actions seem to be a direct response to the inefficiency of law enforcement but I suspect that it runs deeper than that. If you still believed in the system, wouldn’t you fight for reform instead of doing it on your own? Is there something just viscerally satisfying about putting yourself in potential danger that adds to the equation?
I saw it there, unmistakable; it was unlike anything I had seen before (or would ever see again).
Often existing just on the edge of familiarity—there exists here a certain resonance with Freud’s “uncanny”—the realm of Science Fiction (SF) might be seen to possess an intuitive relationship with design, with the distinctive look and feel of a crafted world often our first clue that we have transcended everyday reality. On one level, the connection between SF and design seems rather banal, with repeated exposure to depictions of outer space or post-apocalyptic visions of the Earth—we have been there and done that (figuratively, if not literally).
Yet, upon reflection, I think that it’s not only natural for SF to be concerned with the concept of design, but a part of the process itself for both concepts ask the same basic questions of how things could be and how things should be. Science Fiction, then, like design, is concerned with contemplation and speculation, a point echoed by Brian David Johnson.
And contemplation and speculation in SF often take the form of artistic expression that is largely driven by the realization of relationships that do not yet exist: if the job of a writer is to commit unexplored connections to paper—or perhaps to see established links in a new and/or unexpected light—then the SF writer might tend to focus on relationships as they intersect with technology. In other words, one possible function of SF writers is to explore the interaction between us (as individuals or collectively) and the world around us, highlighting technology as a salient subject; SF provides a creative space that allows authors to probe the consequences of permutations latent in the near future.
The term “technology,” however, should not merely imply gadgets or machines (although it certainly includes them), but rather a whole host of tools (e.g., paper) and apparatuses that comprise the tangible world. We might even broaden the scope of our inquiry, asking whether “technology” is a product, a process, or both. We see, for example, that Minority Report pushes the envelope by proffering new conceptualizations of tools used for imaging and data storage, but John Anderton’s interaction with information surely suggests a rethinking in process as well. Does this practice, on some level, constitute a new technology? Or, perhaps we return our gaze back to futuristic buildings and structures: advances in construction materials certainly represent a new type of technology (in the traditional sense), but architecture as a form also underscores a kind of social interface, its affordant qualities subtly hinting at directions for movement, observation, or interaction. How, then, might the design of something also be considered a type of technology?
So if elements of technology infuse design, and a quick mental survey indicates that design is largely concerned with technology, we might argue that Science Fiction possesses the potential to intersect with design on several levels.
One such implementation, as John Underkoffler points out in his TED talk, is the development of the user interface (UI), an incredibly important milestone in our relationship with computers as it translated esoteric programming syntax into a type of language that the average person could understand. Indeed, as our abilities become more sophisticated, we seem to be making computers more accessible (and also intuitive, although this is a separate issue) to even the most basic users as we build interfaces that respond to touch, gestures, and brain waves.
Alternatively, one might also suggest that “we are not in Kansas anymore” as a nod to the transformational properties of the third of three related genres: Horror, alluded to by the uncanny, Science Fiction, and Fantasy.
It hung there, slightly faded and more than a little wrinkled. Nestled among bright advertisements for football and spirit rallies lay a humble flyer, no larger than a quarter of a page, that caught my attention.
My ascent up the stairs to the college counseling office slowed as I reached out to touch the rough surface of the advertisement. So humble, so easy to miss, but yet the most powerful thing on the bulletin board—this was the thing that mattered the most (yes, even more than football).
During my presentations, I often make it a point to bring up things like “To Write Love on Her Arms” or “PostSecret” as I feel that these are important tools that allow me to connect with my audience. Through these websites and stories, I remember the stresses and the pressures of friends, of parents, and of school. Admittedly, I am a young professional and while I can generally relate to being a high school student (I was one at one point in my life), things like PostSecret vividly remind me of what it is like to be a junior or senior in high school. If nothing else, PostSecret has taught me that everyone has a secret that, if told to me, would break my heart.
This has changed the way that I look at the world.
One of the things that I have learned in my years of college admission is that an increasing number of students are suffering from something that I call “floating duck syndrome”—on the surface, students are serene and perfect but, underneath the water, their legs are churning. Needless to say, students have some issues. I don’t mean to imply that students will not be able to overcome these things, but I must admit that I was shocked to learn about what they were dealing with.
For this reason, I find myself absolutely thrilled when high schools have groups like “To Write Love on Her Arms” because I think that so many of our students can use an outlet. I am certain that individuals are dealing with varying amounts of baggage (or perhaps none at all) and I am so glad that St. Margaret’s has taken it upon itself to offer support for peers in need; whether the situation revolves around academic pressure or thoughts of self-harm, I see clubs like “To Write Love on Her Arms” as an invaluable part of the school community.
However, lest one become depressed, I should mention that I am incredibly hopeful for the generation of students that is following in my footsteps. I am hopeful that students will learn to brave the dark places of themselves, secure in the knowledge that friends and family will always be there to draw them back. I am hopeful that students will come to understand who they are and accept themselves for that. And, I am hopeful that students will learn to step outside of themselves in order to offer their help to those in need. I am lucky to be in a situation where I can empower future students to realize that, although occasionally overwhelmed by adversity, they are all survivors in some respect: any person who has ever been teased, ridiculed, outcast, or made to simply feel less than is a survivor and can embrace that. And, because you are a survivor, you have been imbued with the power to tell your story to others in similar situations in order to pull them through. Ultimately, I am also hopeful because I have learned that young people are incredibly resilient and innovative—they can accomplish some amazing things if given half a chance.
Applicants to the University sometimes want to get inside my head and to gain insight about the college admission process. Often, people want to know how to get in, how to make an impression, and how to stand out from their peers. I will be honest and say that a clever title on your essay or photos of you in USC garb are not the most effective means; tell me a compelling story, however, and I will be hooked. I understand that this process is difficult, particularly for young writers, but one of the things that sets you apart from all other applicants is the truth of your story. Believe it or not, I want to learn more about you as an individual and these sorts of stories are the ones that I love to hear. These tales do not always have to be tragic or morose—I love the stories of triumph as well—but I would encourage all of you to dig down deep and figure out your narrative. I am fully aware that this process of self-discovery is quite scary (who knows what you might find?) but rest assured that college admission officers are not in the position to judge you and we are not laughing at you behind your back; instead, I believe that students who are brave enough to open up should be rewarded.
I am hopeful that this piece has given you a sense of not only what matters to me but, perhaps more importantly, why these things matter. I want to convey that the admission process is human and that we care more deeply for you than you might realize. We, along with your college counselors, are fighting for you to realize your potential and it is my profound hope that we can make this inherently frightening process less scary; I hope that we have made it easier for you to venture out into this sometimes daunting landscape of college admission and shine as though you never had any doubt.
Chris is in his fifth year with the Office of Admission at USC and also studies the intersection of popular culture, media, and online communities as a Masters student in Annenberg. When not on campus, Chris spends his time blogging or volunteering for 826LA in an effort to use his meager writing talents for social good. Chris is excited to write for the St. Margaret’s community and always welcomes any invitation to drink overpriced coffee while discussing the cultural merits of Six Feet Under, True Blood, or Gossip Girl.
I may not be proud of what I did, but I am proud of who I am.
My forearm still bears the mark–you can see it if you look closely enough–the little half-moon, a result of the one time I couldn’t stop. Things were so much simpler then: dig in, hold on, and focus on the pain. Focus on the pain not because you are a masochist, but because this pain–this pain–is at least definable; this pain is tangible, real, and quantifiable. I refused to pick up anything sharper, lest I turn into one of those tragic emo kids splayed out in the tub with one arm crooked over the edge–or maybe the truth was that I was just too much of a coward–so this pain was, for now, all I had.
I ran my finger over my arm in lazy figure eights, writing my name so many times over in those angry scarlet letters.
Now I write my names many times over so I will not forget: I put ink to skin so that I will not forget who I am, what I am, and, most importantly, what I am worth. I write so that I will never hear someone say, “I knew you when.”
In today’s world, it seems that “secularization” is all too often matched with a sense of loss: whether it be the decline in institutional religion or the dissipation of enchantment, we seem to employ the term in order to forward the idea that we are moving away from something that was once valued. And, to be fair, this is true. The modern age has, since the Enlightenment, been shifting, in fits and starts, away from a life infused with religion. But I also think that “secularization” can speak to something larger, and more significant, than that.
Unfortunately, it appears as though “secularization” has become synonymous with Science and been placed in opposition to Religion–atheists rigidly adhere to a rather static ideology that denounces aspects of religion, preferring the explanations proffered by experiments and equations. Yet, are we simply trading one set of dogma for another as we move between extremes? For me, Science works best when it challenges Religion (and vice versa) to keep pace with the developing world. The sense of awe, mystery, and wonder inherent in religion keeps scientists humble and science reminds us that some holy laws must be reconciled with modern culture.
One of the most welcome and quoted new books on the subject is Taylor’s A Secular Age, an 896-page opus that argues that secularization has been largely positive — as long as it leaves open a “window on the transcendent.”
The spiritual and religious impulse in humans will never die, says Taylor. Even if religion doesn’t dominate a society, as it once unfortunately did in Europe and elsewhere, people will always seek the transcendent: something ultimate, larger than themselves.
The great sociologist of religion, Robert Bellah, author of Habits of the Heart, says what is needed most now is new forms of religion that work in a secular age, where they are subject to analysis and don’t rely on political endorsement.
We are seeing this today. Many open-minded forms of Christianity, Judaism, Buddhism and of smaller spiritual movements, including meditation, yoga and healing, are maintaining a sense of the transcendent in some secular, pluralistic societies.
We can partly thank the Enlightenment for the rise of secularism, with the era’s emphasis on freethinking, rationality and science. But many thinkers, including 19th-century sociologist Max Weber, also credit the advance of secularism to Protestantism.
The Protestant Reformation rejected the absolute authority claimed by the Roman Catholic church of the time.
It brought a new wave of reform, choice and intellectual questioning to Christianity. By the 19th century, Protestants were critically analyzing the Bible and trying to discern the difference between the “historical Jesus” and the Christ of unquestioned mythology.
This so-called “critical method” wasn’t an attack on the faith, as some traditionalistic Christians continue to argue today. But it was what many consider a valid attempt to challenge the taboos that surrounded Christian orthodoxy.
Seeing the synthesis of these two areas is what makes studying modern religion so fascinating. Despite a formal training in the Natural Sciences, I have gradually come to appreciate the power inherent in religion and am quite excited to join some other great minds at the USC Knight Chair in Media and Religion blog.
“This implies, does it not, that in order to raise a generation of children who can reach their full potential, we must find a way to make their lives interesting. And the question I have for you, Mr. Hackworth, is this: Do you think that schools accomplish that? Or are they like the schools Wordsworth complained of?”
–Neal Stephenson, The Diamond Age
Fifteen years after these words were written, we are still struggling to answer the question posed by Science Fiction author Neal Stephenson. Increasingly, we are finding that our American educational system does not raise a generation of children to reach their full potential; arguments about mental acuity aside, we seem to suffer from a generation of college applicants that is, well, rather uninteresting. This is not to say that there aren’t amazing students out there–there definitely are some–but they are more the exception than the rule.
To combat this, we have seen a rise in adult-driven initiatives that aim to cultivate interesting children. Although I don’t disagree with the sentiment, I do disagree with the practice. Fantastic trips and summer camps are not, in and of themselves, the problem. (Certainly, I think we have come to adopt a rather distorted view of what’s important and, on some level, we’ve all heard these arguments before. Bigger is better, theater audiences want to see their money on stage, news headlines scream at us, spectacle is rampant, etc.) Rather, I take issue with the idea that many applicants try to substitute someone else’s story for their own: time and time again, I have come across students who traveled to poor villages, or did research, or spent the summer living in European hostels and they typically tell me the same story. These students tell me the central narrative of what they were supposed to have learned or experienced on these adventures and, sometimes, force themselves to have those experiences whether they are genuine or not. Without realizing it, many have subscribed to the notion that there is a typical experience one is supposed to have in the Costa Rican jungles and they recount this like it was the most magical awakening. And, to be fair, it might have been, but I would argue that the shift in perspective is only part of it–everyone goes through an awakening at some point in his or her life–what I want from students is to understand what this change wrought in them. How did you learn something that forever changed the way that you saw the world, such that you couldn’t ever go back?
Or we extol the virtues of Boredom as a provider of quiet spaces free from stimulation, forgetting that, along with the incredible, restless youth have also managed to enact incredible amounts of destruction. The practices of contemplation, introspection, and awareness can result from boredom, but we are mistaken if we consider boredom to be a prerequisite.
Ultimately, I think that teaching kids to cultivate a passion is not the same as demanding mastery–sure, passion may lead to mastery and I’m not trying to stifle that process–but all I really want is for a student to want to be smarter, to be braver, to be more inquisitive. Simply put, all I really want is for a student to want to be more. If this is our goal, the trips and the flashy photos and the houses built all melt away for we see that we can have–that we do have–meaningful experiences every day. We don’t need to “discover” hidden truths but we do need to reconsider what’s happening around, to, and in us. I think we need to train kids how to understand the import of their “normal” lives and, perhaps more importantly, how to translate these lessons learned into purposeful action.
Homeostasis. Nature has a way of correcting itself, resetting the scales and maintaining a kind of chaotic order. Tumbling, turning, Halloween, shifting, and inversion–this season is all about seeing the same old things in the cold grey light of dawn.
Or maybe it’s really just seeing things as they really are (for the first time?). To live in a post-Edgington world is to live in a world that is constantly under surveillance. We work ourselves into a frenzy over issues of privacy and security, not realizing just how hard we have bought into the system. We have, collectively, become Big Brother (something anyone from Gossip Girl could tell you if he or she just thought about it hard enough). Social paranoia is the name of the game as we look for the first thing that’ll confirm our suspicions. We see things, then, not just as they are (to us) but as they have always been–and always will be. We are deaf to protestations, because, after all, that’s exactly what a zombie would say (and we knew it all along, anyway). To live in a post-Edgington world is to live in our world.
Or maybe it’s seeing the evident truths of others long before they do? Our gaze, focused at a distance, loses perspective on who and what is in front of us. We struggle to see what we’ve already lost. Older, wiser, we see the long view and just how far away we are from where we want to go.
Or maybe it’s seeing the truths that are all too evident to us. Driven by the spirit of Mab, we fixate on revenge, redemption, absolution, forgiveness, or our maker. We cling, we claw, and we scrape by because, for us, there’s only one way out, one way forward, one way through.
Sight. Seeing. Being seen. It’s what a part of this season is all about.
In different ways, we deal with the fracture of our selves, forgetting that we, as creatures of Nature, will also be set right by the cold grey light of dawn.
This raises interesting notions about what/who is a superhero. It sort of saddens me that we have gotten to the stage where we can’t see ourselves–our everyday selves–as superheroes in some fashion and attempt to grasp the feeling by (temporarily) transforming into something/someone else. Sure, symbols have power, but I can’t help but think that, if we tried, we could, too.
As children, we are taught that we can be anything we want to be, whether it be a President, an astronaut, or a firefighter. We can become, it seems, anything–as long as that thing is not ourselves.
We become the thing we hate because the thing we love is too much to bear; it’s too raw and too real, too fragile to survive. Defeated, we strive to be our worst, confident that, if nothing else, we can be that. Despite our best intentions, we become the person that we swore we’d never be, finding out that everything is just a matter of perspective once it’s too late. In order to protect the parts of ourselves we hold most sacred, we offer up our best parts of ourselves; in order to become who we want to be, we give up who we are.
We become the thing we love because the thing we hate is too much to bear; cobbled together with spent wishes and worn-out prayers, we cling to the thing we hope to be because it beats the hell out of who we really are. We focus on who we can be–who we could be–or who we were because who we are is too much and never enough. A shine, a sheen, a glamour–one day we’ll forget that we’ve slipped into another skin and it’ll all be for real.
Looking back, it would have seemed quite obvious: although I’ve now grown into someone studying for a graduate degree in Communication, I have always been enthralled by the power of narrative. As a child, mythology was my go-to, with stories of ancient cultures giving me—a kid with a short cultural history in America—a sense of place. I’ve since grown into someone who has embraced storytelling as a means of information transmission, learning to see identity as a complicated real-time narrative infused with performance. I think about the world in terms of stories being written, by ourselves as well as by others.
And perhaps this is why I tend to take issue with It Gets Better. Although a valuable message, the project has always sort of rubbed me the wrong way as it seems to suggest that others will write the story of your life for you. Things will get better, it says, somewhere and someday (that’s not here). Things will get better, but you will not. My gut is always to flip that and say that things will get better because you will make them better. You get to write the story of your life and, in so doing, learn the hard lesson that the story is never about you. Well, not just you, anyway. Your story intersects with millions of others and while you are the center of your story, you are a bit character in many others. You learn humility, but also that your presence makes a difference. Given my affinity for storytelling, it makes sense, then, that projects like PostSecret and To Write Love on Her Arms hit home for me.
I am particularly in love with TWLOHA’s newest project that asks people to define their greatest fear and hope. In so many ways, this is exactly what I hope to accomplish by studying horror—although the two aren’t always directly connected, I do believe that they stem from the core of our beings. Articulating both of those concepts is the first step on a journey that can lead to nothing but goodness. Articulating both of those is how you become a fighter, an activist, and a healer.
The video puts forth a series of statements:
This world needs you.
Your family needs you.
Your friends need you.
Your children—maybe someday, maybe now—need you.
But, to that, I would add: You need you.
There’s still time to make up for my sins. Or at least that’s what I tell myself before I go to sleep. I was young and I was doing the best that I could, because nobody ever asked anything more of me.
As I enter into a new phase of my growth, I think back on my participation in an admission process and find myself desperately hoping that, in the end, I did more good than harm.
I think about the messages that I was tasked with conveying and the ones that I unwittingly helped to perpetuate. Early in my career, I worked to break down specific stereotypes of USC, but, looking back, I sort of wonder if I was focused on the wrong objectives all along. Listening to faculty and other intelligent discussions about the skills college students need today, I can’t help but think that we’re shooting ourselves in the collective foot by not really taking stock of the effects of our practices. This is not to suggest that there isn’t merit to the system that’s currently in place—it does its job in a number of different ways—but this also does not mean that it can’t be better.
I currently wonder about the more diffuse skills of creativity, remix, and critical thinking, and how all of these intersect with media use by youth. I think about the charges of apathy and disengagement and how games, comics, and play can complicate the equation. I consider how the root of “academic inquiry” lies in a sense of joy that is systematically squeezed out of the grooming process—even though we know that this is what we need, does admission systematically work against the cultivation of the sentiment in youth? Instead of teaching students that their energies and passions are valued, do we irreparably damage youth by forcing them into a range of approved activities? Admittedly, the scope of what we recognize is broadening, but we will always be behind students. How powerful could it be to tell a student that he or she, exactly as he or she is, is valued? But also to challenge that student, saying that it’s not enough to stay there? To teach youth that they have a responsibility to use their passions to reshape the world? We talk about authenticity and genuineness with our applicants, and I can’t help but think that we’re going about it all wrong: if we valued who they already were, they wouldn’t feel the need to tell us what we want to hear. If we can reshape the discussion surrounding admission and get students to go after these things but also think critically about them, we can change the type of applicant who sits in our classrooms.
In some ways, you want to tell kids to just soar and so much of what we do as admission officers seems to work against that. We teach youth, whether we realize it or not, that the safe bet is valued (and sure, it’s safe for a reason) but not to think about why it’s valued in the first place and if there are in fact alternative routes to reach the same destination.
For me, the disconnect centers around the notion that kids aren’t given the tools to think about the things that they already do for fun in a critical manner. There’s certainly nothing wrong with traditional or established activities—and these should be encouraged as well—but I do think that we need to radically rethink the process by which our youth are developing skills that will prepare them for college and beyond. There’s something powerful inherent in really looking at what youth are already into—how they spend their time naturally—and using that; there’s something to the idea of showing students how their actions can serve as scaffolding for other things that we value.
While I doubt that any admission person would ever place a large amount of value in a student who competitively stacks cups, I would argue that there’s some skill in that and the trick is to flip that into something. In this process, we have to be partners with students: youth need to be able to articulate what such an activity means to them and we have to be receptive to that. Because, at the end of the day, admission officers are people and who can’t get on board with the simple joy that comes from something like that? Cup stacking might not be our favorite thing in the world, but we’ve all known that expression of joy (at least I hope so) and teaching a student how to parlay that sense of exuberance is what’s going to get him or her to the next level.
Ultimately, I want more kids to be unafraid to express some of that unadulterated passion on the application because knowing that, for possibly one second in your life, you simply shined is something so powerful. The trick is teaching kids humility and that their light isn’t better than or more special than anyone else’s…but if you don’t have a spark, you can’t shine.
But what is reality television? Although the genre seems to defy firm definitions, we, like Justice Stewart, instinctually “know it when [we] see it.” The truth is that reality television spans a range of programming, from clip shows like America’s Funniest Home Videos, to do-it-yourself offerings on The Food Network, investigative reporting on newsmagazines like 60 Minutes, the docu-soap Cops, and many other sub-genres in between, including the reality survival competition that forms the basis for The Hunger Games. Although a complete dissection of the genre is beyond the scope of this chapter—indeed, entire books have been written on the subject—reality television and its implications will serve as a lens through which we can begin to understand how Katniss experiences the profound effects of image, celebrity, and authenticity throughout The Hunger Games.
She Hits Everyone in the Eye
For the residents of Panem, reality television is not just entertainment—it is a pervasive cultural entity that has become inseparable from citizens’ personal identity. Although fans of The Hunger Games can likely cite overt allusions to reality television throughout the series, the genre also invokes a cultural history rife with unease regarding the mediated image in the United States.
Reacting to atrocities witnessed throughout the course of World War II, Americans in the 1950s became obsessed with notions of power and control, fearing that they would be subsumed by the invisible hand of a totalitarian regime. In particular, the relatively young medium of television became suspect as it represented a major broadcast system that seemed to have a hypnotic pull on its audience, leaving viewers entranced by its images. And images, according to author and historian Daniel Boorstin, had become increasingly prominent throughout the 19th century as part of the Graphic Revolution, replete with the power to dissociate the real from its representation. Boorstin argued that although the mass reproduction of images might provide increased levels of access for the public, the individual significance of the images declined as a result of their replication; as the number of images increased, the importance they derived from their connection to the original subject became more diffuse. And, once divorced from their original context, the images became free to take on a meaning all their own. Employing the term “pseudo-event” to describe an aspect of this relationship, Boorstin endeavored to illuminate shifting cultural norms that had increasingly come to consider the representation of an event more significant than the event itself.
Katniss unwittingly touches upon Boorstin’s point early in The Hunger Games, noting that the Games exert their control by forcing Tributes from the various districts to kill one another while the rest of Panem looks on. Katniss’ assertion hints that The Hunger Games hold power primarily because they are watched, voluntarily or otherwise; in a way, without a public to witness the slaughter, none of the events in the Arena matter. Yet, what Katniss unsurprisingly fails to remark upon, given the seemingly ever-present nature of media in Panem, is that the events of The Hunger Games are largely experienced through a screen; although individuals may witness the Reaping or the Tributes’ parade in person, the majority of their experiences result from watching the Capitol’s transmissions. Without the reach of a broadcast medium like television (or, in modern culture, streaming Internet video), the ability of The Hunger Games to effect subjugation would be limited in scope, for although the Games’ influence would surely be felt by those who witnessed such an event in person, the intended impact would rapidly decline as it radiated outward. Furthermore, by formulating common referents, a medium like television facilitates the development of a mass culture, which, in the most pessimistic conceptualizations, represents a passive audience ripe for manipulation. For cultural critics of the Frankfurt School (1923-1950s), who were still reeling from the aftereffects of Fascism and totalitarianism, this was a dangerous proposition indeed. Although the exact nature of modern audiences is up for debate, with scholars increasingly championing viewers’ active participation with media, Panem has seemingly realized a deep-seated fear of the Frankfurt School. It would appear, then, that The Hunger Games function as an oppressive force precisely because of their status as a mediated spectacle of suffering.
But perhaps we should not be so hard on Katniss. Growing up in an environment that necessitated the cultivation of skills like hunting and foraging, Katniss begins with a perspective firmly grounded in a world based on truth. Plants, for example, must be checked (and double-checked!) to ensure their genuineness, lest a false bite result in death. In order for Katniss to survive, not only must she be able to identify plants but she must also trust in their authenticity; prior to her experience in the Arena, Katniss undoubtedly understands the world in rather literal terms, primarily concerned with objects’ functional or transactional value. However, as hinted by Boorstin, additional layers of meaning exist beyond an item’s utility—layers that Katniss has not yet been trained to see.
Echoing portions of Boorstin’s previous work, French philosopher Jean Baudrillard conceptualized four types of value that objects could possess in modern society: functional, transactional, symbolic, and sign. Admittedly a more complex theory than the description provided herein, we can momentarily consider how Baudrillard’s value categories of “functional” and “transactional” might align with Boorstin’s previously introduced concept of the “real,” while “symbolic” and “sign” evidence an affinity toward “representation.” Whereas the functional and transactional value of items primarily relates to their usefulness, the categories of “symbolic” and “sign” are predominantly derived as a result of the objects’ relationship to other objects (sign) or to actors (symbolic). Accordingly, being relatively weak in her comprehension of representation’s nuances, Katniss characteristically makes little comment on Madge’s gift of a mockingjay pin. However, unbeknownst to Katniss (and most likely Madge herself), Madge has introduced one of the story’s first symbols, in the process imbuing the pin with an additional layer of meaning. Not just symbolic in a literary sense, the mockingjay pin gains significance because it is attached to Katniss, an association that will later bear fruit as fans well know.
Before moving on, let’s revisit the import of The Hunger Games in light of Baudrillard: what is the value of the Games? Although some might rightly argue that The Hunger Games perform a function for President Snow and the rest of the Capitol, this is not the same as saying the Games hold functional value in the framework outlined by Baudrillard. The deaths of the Tributes, while undeniably tragic, do not in and of themselves fully account for The Hunger Games’ locus of control. In order to supplement Boorstin’s explanation of how The Hunger Games act to repress the populace with the why, Baudrillard might point to the web of associations that stem from the event itself: in many ways, the lives and identities of Panem’s residents are defined in terms of a relationship with The Hunger Games, meaning that the Games possess an enormous amount of value as a sign. The residents of the Capitol, for example, evidence a fundamentally different association with The Hunger Games, viewing it as a form of entertainment or sport, while the denizens of the Districts perceive the event as a grim reminder of a failed rebellion. Holding a superficial understanding of The Hunger Games’ true import when we first meet her, Katniss could not possibly comprehend that her destiny is to become a symbol, for the nascent Katniss clearly does not deal in representations or images. Katniss, at this stage in her development, could not be the famed reality show starlet known as the “girl on fire” even if she wanted to.
By All Accounts, Unforgettable
Returning briefly to reality television, we see that Panem, like modern America, finds itself inundated with the genre, whose pervasive tropes, defined character (stereo)types, and ubiquitous catchphrases have indelibly affected us as we subtly react to what we see on screen. Although we might voice moral outrage at offerings like Jersey Shore or decry the spate of shows glamorizing teen pregnancy, perhaps our most significant response to unscripted popular entertainment is a fundamental shift in our conceptualization of fame and celebrity. Advancing a premise that promotes the ravenous consumption of otherwise nondescript “real” people by a seemingly insatiable audience, reality television forwards the position that anyone—including us!—can gain renown if we merely manage to get in front of a camera. Although the hopeful might understand this change in celebrity as democratizing, the cynic might also argue that fame’s newfound accessibility indicates its relative worthlessness in the modern age; individuals today can, as the saying goes, simply be famous for being famous.
Encapsulated by Mark Rowlands’ term “vfame,” the relative ease of an unmerited rise in reputation indicates how fame in the current cultural climate has become largely divorced from its original association with distinguished achievement. Although traditional vestiges of fame have not necessarily disappeared, it would appear that vfame has become a prominent force in American culture—something Katniss surely would not agree with. Recalling, in part, Kierkegaard’s thoughts on nihilism, vfame’s appearance stems from an inability of people to distinguish quality (or perhaps a lack of concern in doing so), resulting in all things being equally valuable and hence equally unimportant. This, in rather negative terms, is the price that we pay for the democratization of celebrity: fame—or, more accurately, vfame—is uniformly available to all in a manner that mirrors a function of religion and yet promises a rather empty sort of transcendence. Although alluring, vfame is rather unstable as it is tied to notions of novelty and sensation, as opposed to fame, which is grounded by its association with real talent or achievement; individuals who achieve vfame typically cannot affect the longevity of their success in substantial terms, as they were not instrumental in its creation to begin with. Stars in the current age, as it were, are not born so much as made. Moreover, the inability of the public to distinguish quality leads us to focus on the wrong questions (and, perhaps worse, to not even realize that we are asking the wrong questions) in ways that have very real consequences; although vfame and its associated lapse in thinking might be most obvious in the realm of celebrities, it also manifests in other institutions such as politics. As a culture that is obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.
Born out of an early to mid-20th century society in which the concept of the “celebrity” was being renegotiated by America, concepts like vfame built upon an ingrained cultural history of the United States that was firmly steeped in a Puritan work ethic. Americans, who had honored heroes exemplifying ideals associated with a culture of production, were struggling to reconcile these notions in an environment now focused on consumption. Although Katniss, as proxy for modern audiences, might initially find this shift difficult to appreciate, one need only consider that the premium placed on production is so central to American ideology that it continues to linger today: in a culture that exhibits rampant consumerism, we still value the “self-made man” and sell the myth of America as a place where anyone can achieve success through hard work. To abandon these ideas would necessitate that we reinterpret the very meaning of “America.” Thus, we become more sympathetic to the critics of the day who lamented the loss of the greatness of man and bristled against the notion that fame or celebrity could be manufactured—such a system would only result in individuals who were lacking and unworthy of their status. To this day, our relationship with celebrities is a tenuous and complex one at best, for although we celebrate the achievements of some, we continue to flock to the spectacle created by the public meltdown of others, unable or unwilling to help; we vacillate between positions of adulation, envy, contempt, and pity, ever poised for outrage but all too willing to forgive.
Perhaps it should come as no surprise that reality television puts us a little on edge, as the genre represents a fundamental blurring of fact and fiction. Celebrities, we see, are just like us—just like our neighbors, who, through the magic of reality television, can become stars! Ever-shifting classifications leave us on unstable ground. But also consider the aforementioned philosophy of Boorstin: stars are, among other things, individuals whose images are important enough to be reproduced, which causes “celebrity” to transition from a type of person to a description of how someone is represented in society. In other words, we witness a shift from a term that labels who someone is to a term that designates who someone seems to be. Celebrities, it might be argued, derive at least a portion of their power in modern culture because they embody a collection of images that has been imbued with some sort of significance. Ultimately, it seems that much of our unease with celebrity and fame centers on notions of authenticity.
All I Can Think of Are Hidden Things
Long before Katniss ever becomes a celebrity herself, she exhibits disdain for the Capitol and its residents, evidencing a particularly adverse reaction to things she considers artificial. As previously discussed, authenticity played a particular role in Katniss’ growth and her ability to survive: for Katniss, a false image literally represented an affront on the level of life or death, for a lapse in judgment could have resulted in possible electrocution or poisoning. Concordantly, Katniss dismisses the strange colors of the Capitol along with the characteristic features of its citizens—stylists, in particular, are purported to be grotesque—because she is not readily able to reconcile these visuals with her established worldview. As Katniss operates on a literal level, directly associating identity with appearance, the self can only present in one way (in this case, relatively unadorned) and maintain its authenticity.
Like Katniss, we too may be tempted to summarily reject the unfamiliar; our modern anxieties might best be encapsulated by the question: What to do with a problem like Lady Gaga? Perhaps the strongest contemporary mass image that mirrors the visual impact of the stylists on Katniss (followed closely by New York socialite Jocelyn Wildenstein), Lady Gaga suffers continual criticism for her over-the-top theatrical presentations. With dresses made from meat and Hello Kitty heads, it is all too easy to write Lady Gaga off as “attention-starved,” simplifying her presence to the succinct “weird.” Yet, it seems rash to dismiss Lady Gaga and the world of fame as nothing more than frivolity and fluff, for pop culture is only as vapid as our disinclination to engage with it.
Consider, for example, how the Capitol and its residents (of whom a prominent one would undoubtedly be Lady Gaga) embody the spirit of Decadence, a particularly prominent theme in Victorian culture. A reaction to the 19th century movement of Romanticism, Decadence championed concepts like artifice, which served to demonstrate man’s ability to rebel against, and possibly tame, the natural order. Although this inclination toward the unnatural manifested in myriad ways, French poet and philosopher Charles Baudelaire viewed women’s use of cosmetics as a particular site of interest, for proper application did not just enhance a woman’s beauty but acted to transform her, allowing transcendence through artifice.
With this in mind, we begin to understand the innate control wielded by figures such as Cinna and Caesar Flickerman. Perceived as facile by some, these two men represent a class of individuals adept at understanding the power inherent in fame, reputation, celebrity, and appearance; in the Capitol, image mongers such as these hold sway. Although one reading of these characters plants them firmly in the realm of artifice, painting them as masters of emotional manipulation and spectacle, an alternate view might consider how these two have come to recognize a shift toward a new localized reality—one that Katniss must adapt to or perish.
And yet, despite their commonality, these two individuals also underscore fundamentally different approaches to image: Caesar (and, perhaps, by extension, the Capitol) wields his power in order to mask or redirect, while Cinna endeavors to showcase a deep-seated quality through the management of reputation and representation. Coexisting simultaneously, these two properties of illusion mirror the complementary natures of Peeta and Katniss with regard to image. Peeta, skilled in physical camouflage, exhibits an emotional candidness that Katniss is initially unready, or unwilling, to match; Katniss, very much the inverse of Peeta, is characterized by traits associated with hunting, finding, and sight in the “real” world, all while maintaining a level of emotive subterfuge. Over the course of the 74th Hunger Games, however, Katniss quickly learns to anticipate how her actions in the Arena will affect her representation and reputation beyond the battlefield. With the help of Haymitch, Katniss begins to better understand the link between a robust virtual self and a healthy physical one as she pauses for the cameras and plays up her affection for Peeta in exchange for much-needed rewards of food and medicine. As she matures, Katniss comes into alignment with Cinna and Caesar, individuals who, despite being participatory members of a system arguably deemed inauthentic, distinguish themselves from the majority of Panem by understanding how image works; Cinna and Caesar (and later Katniss) are not just powerful, but empowered and autonomous.
Herein lies the true import of Collins’ choice to weave the trope of reality television into the fabric of The Hunger Games: throughout the trilogy, the audience is continually called upon to question the nature of authenticity as it presents in the context of a media ecology. Ultimately, the question is not whether Katniss (or anyone else) maintains a sense of authenticity by participating in the games of the Capitol—trading a true self for a performed self—but rather a question of how we might effect multiple presentations of self without being inauthentic. How does Katniss, in her quest to survive, embody Erving Goffman’s claims that we are constantly performing, altering our presentation as we attempt to cater to different audiences? Is Katniss truly being inauthentic, or does she ask us to redefine the concept of authenticity and its evaluation? Struggling with these very questions, users of social media today constantly juggle notions of authenticity and self-presentation, with platforms like Facebook and Twitter forming asynchronous time streams that seamlessly coexist alongside our real-life personas. Which one of these selves, if any, is authentic? Like Katniss, we are born into the world of the “real” without a ready ability to conceptualize the power latent in the virtual, consequently resenting what we do not understand.
“There are some genuinely bad people who would like to infiltrate our country and we have got to have the guts to stand up and say ‘No.’”
By now, Newt Gingrich’s comments at the first New Hampshire Republican Presidential Debates have made the rounds, spreading across blogs, mainstream news outlets, and, of course, The Daily Show. Positioning Muslims alongside Nazis and Communists as those who would infiltrate our country, Gingrich has once again invoked anti-Muslim sentiment in the name of patriotism.
Although Gingrich’s polemic likely raised a few eyebrows, it was admittedly not all that surprising given his recent stance on the subject; highly visible in a movement that would label American Muslims as forever foreigners, Gingrich seems to have crafted himself into a candidate who is willing to engage with the popular topic of American Muslims. Despite the recent spate of coverage, Justin Elliott notes that the mainstream media has generally shied away from what might very well be the real story: the evolution of this particular brand of rhetoric by Gingrich.
Perhaps the American public is partly at fault as it clamors for briefs primed to incite moral outrage and hungers for stories that whet an appetite for spectacle. Yet, as we know, journalism also has a role to play and it is perhaps neglecting its duties in this regard. An issue larger than a simple lack of coverage, there seems to be a fundamental absence in the training of many journalists who would cover religion.
And yet religion continues to have a large presence in the current state of politics, manifesting concerns separate from the intricacies of traditional voting demographic blocs. With Rick Perry hosting an event for governors named The Response, and Reverend Gaddy of the Interfaith Alliance calling for a reduction in religion’s political presence, it appears as though this upcoming race will see the resurgence of a negotiation between the public and private aspects of religion that was recently highlighted by the Ground Zero mosque.
But it’s not only politicians who struggle to understand how religion figures into the everyday, with salvation to be had at venues as unlikely as cowboy church. However, despite the potential collapse of the private/public dichotomy, are we really encouraging people to think about the role that religion plays in both of these spheres? Has our news coverage been affected by an upswing in atheism’s popularity? Religion, faith, and spirituality all bridge the gap, with values formed in private undoubtedly affecting actions displayed in public. Why, then, do we hesitate to engage in meaningful discussion of religion’s potential political impact, focusing more on what a particular individual’s religion is rather than attempting to understand how and why that particular philosophy permeates a candidate’s positions? If we are content to simplify our interest to buzzwords like “pro-life” or “against gay marriage,” never challenging ourselves to understand the root causes of the issues we hold dear, how can we ever hope to convince the other side that we may in fact have a point? We insist that others see it our way and never take the time to talk to them in words that they might actually be receptive to. Rather than avoiding the issue entirely, perhaps we should encourage people to make the discussion of religion a routine practice—and provide them with the information and rhetorical tools they need in order to facilitate intelligent discussion.
Chris Tokuhama is a doctoral student in the USC Annenberg School for Communication and Journalism where he is pursuing media/cultural studies with a concentration in Gothic Horror as an articulator of cultural anxiety. A biologist by training, Chris currently endeavors to understand transformative bodies through lenses as varied as narrative studies, media, and religion, a process that has resulted in an upcoming chapter in The Hunger Games and Philosophy focusing on issues of authenticity in celebrity. Follow his quest to find the perfect cup of coffee on Twitter at @TrojanTopher.
Flailing, lingering, drifting; in a word: restless. Horror is filled with those who cannot sleep. Ghosts, perhaps our first association, are ultimately the least helpful, for they have one thing that all other undead envy: a purpose. Conversely, modern day vampires struggle to reconcile their “true natures,” cyborgs wrestle with post-humanism, and despite zombies’ evident drive, they are still miles away from truly possessing purpose. In their own ways, members of the undead horde toil without rest. Although we continue to tell tales, huddled in the dark, perhaps the decline in ghost stories means that we are no longer haunted by our pasts but are instead unsure about our futures.
Narcissistic, apathetic, bored; in a word: restless. Modern youth have increasingly been painted in negative terms, each indicative of declines in the current generation. Yet, instead of castigating youth, how might we use the undead as a lens to sympathize with teenagers’ search for meaning? Both groups exist in worlds that have begun to move away from institutional and overt aspects of religion—how does each endeavor to fill the void? In a post-modern world, where all paths are equal (and hence, equally unhelpful), how do the undead and youth both fight to inscribe meaning through lived religion?