Thoughts from my study of Horror, Media, and Narrative

Youth

He Chose Poorly, You Have Chosen Wisely


In April the rhetoric surrounding college admission inevitably comes to embrace the notion of choice as many students weigh their options prior to the National Candidates Reply Date on May 1. To be sure, the month of April is a portentous period for college-bound students and their parents as families transition and necessarily begin to contemplate a change in their configuration. Emotions run high as excitement and hope twine with a nervous energy that, in part, stems from a fear of missing out (FOMO, if you must). Here it would seem that much of the work of counselors lies in our ability to help students and families retain perspective through this process, validating their concerns even as we try to reframe them. On a practical level, individuals are often concerned with selecting the “right” college to attend, and yet I cannot help but think about the invocation of “choice,” its assumed power, and its relationship to young people.

Back in 2012 I taught an undergraduate class on marketing ethics for USC’s Marshall School of Business, and I caused a minor fuss over a recent Old Navy commercial that I had seen. The spot was ostensibly aimed at a demographic my age or older, and I launched into a rant over how a company had come to appropriate the notion of choice in order to sell jeans that were functional but not necessarily noteworthy (which, I mean, is the whole purpose of commercial advertising). Although my students were generally too young to understand the original reference, I explained to them that I was upset because this move represented a blatant effort to wrest agency away from persons, presenting them with the illusion of choice while absconding with its power.

Taking a different approach, I had the students read the Beverly Hills, 90210 “I choose me” scene against the love triangle presented by Twilight—and the relative impossibility of readers aligning with Team Bella—to think about how the presentation of choice has changed in recent years. Young Adult (YA) fiction, in particular, has become an interesting place to explore the concept of choice due to its relatively recent mainstream popularity and the rise of dystopian settings, which are often concerned with people making difficult choices and their ability to do so. Although some of my thoughts regarding the theme of choice in YA are reflected in a piece for Slate that reads Harry Potter against Divergent with respect to clans/identity/choice, what is more directly salient is not just the relative importance of choice but the implications of being asked to declare a binding allegiance as a young person. While useful as entry points, I also encouraged students to push back against the presentation of choice in Harry Potter and Divergent, as both tend to focus on the connection between innate qualities and group identity, legitimizing individualism and genetics over context, environment, and existing social structures.

In spite of my occasional quibbles with them, I still think that there is a way to use YA texts like Divergent and Harry Potter (or, if you like, Enclave and Quarantine) to think about what it means for young people to make a decision that is, for them at least, often perceived as life-changing and somewhat irreversible. At its core, what does it actually mean to make a choice—in this case perhaps the choice of which institution to attend—in the college admission process? Is there merit to reframing the discussion in order to deemphasize the name of a school in favor of highlighting what the experience itself might bring? How can we change the college-going culture so that young people (and their parents) feel better equipped to make choices when they are presented? How do we get students and parents to complicate the notion of choice, considering that while an individual decision might be their own, the range of choices available to them is often shaped by external factors—for example, even if an institution’s ranking does not determine which school a student ultimately attends, a larger view considers how the logic of ranking infiltrates education and makes particular avenues more salient for particular students in the first place—all while remaining cognizant of the information overload that already exists? In what ways must we be self-reflexive as we guide students and parents toward making particular choices? Put another way, how do we, like Old Navy, encourage students to make particular kinds of choices in favor of others, and what are the potential implications of our actions?

For me, the answer begins with a critical examination of choice:  I want to support the ability of students and families to make informed choices but the question always remains, “A choice to do what?”

I want students to choose wisely.


Like So Much Processed Meat

“The hacker mystique posits power through anonymity. One does not log on to the system through authorized paths of entry; one sneaks in, dropping through trap doors in the security program, hiding one’s tracks, immune to the audit trails that we put there to make the perceiver part of the data perceived. It is a dream of recovering power and wholeness by seeing wonders and not by being seen.”

—Pam Rosenthal


Flesh Made Data:  Part I

This quote, which comes from a chapter in Wendy Hui Kyong Chun’s Control and Freedom on the Orientalization of cyberspace, gestures toward the values embedded in the Internet as a construct. Reading it, I found myself wondering about the ways in which identity, users, and the Internet intersect in the present age. Although we certainly witness remnants of the hacker/cyberpunk ethic in movements like Anonymous, it would seem that many Americans exist in a curious tension between the competing impulses for privacy and visibility.

Looking deeper, however, there seems to be an extension of cyberpunk’s ethic, rather than an outright refusal or reversal: if cyberpunk viewed the body as nothing more than a meat sac to be shed as one uploaded to the Net, the modern American seems, in some ways, hyperaware of the body’s ability to interface with the cloud in the pursuit of peak efficiency. Perhaps as the product of a self-help culture that has incorporated the technology at hand, we are now able to track our calories, sleep patterns, medical records, and moods through wearable devices like Jawbone’s UP, but all of this raises the question of whether we are controlling our data or our data is controlling us. Organizations like Quantified Self promise to help consumers “know themselves through numbers,” but I am not entirely convinced. Aren’t we just learning to surveil ourselves without understanding the overarching values that guide/manage our gaze?

Returning to Rosenthal’s quote, there is a rather interesting way in which the hacker ethic has become perverted (in my opinion): the “dream of recovering power” is no longer about systemic change but self-transformation; one is no longer humbled by the possibilities of the Internet but instead strives to become a transformed wonder visible for all to see.


Flesh Made Data:  Part II

A spin-off of, and prequel to, Battlestar Galactica (2004-2009), Caprica (2010-2011) transported viewers to a world filled with futuristic technology, arguably the most prevalent of which was the holoband. Operating on basic notions of virtual reality and presence, the holoband allowed users to, in Matrix parlance, “jack into” an alternate computer-generated space, fittingly labeled by users as “V world.”[1] But despite its prominent place in the vocabulary of the show, the program itself never seemed to be overly concerned with the gadget; instead of spending an inordinate amount of time explaining how the device worked, Caprica chose to explore the effect that it had on society.

Calling forth a tradition steeped in teenage hacker protagonists (or, at the very least, ones belonging to the “younger” generation), our first exposure to V world—and to the series itself—comes in the form of an introduction to an underground space created by teenagers as an escape from the real world. Featuring graphic sex, violence, and murder, this iteration does not appear to align with traditional notions of a utopia but might represent the manifestation of Caprican teenagers’ desires for a world that is both something and somewhere else. And although immersive virtual environments are not necessarily a new feature in Science Fiction television, with references stretching from Star Trek’s holodeck to Virtuality, Caprica’s real contribution to the field was its choice to foreground the process of V world’s creation and the implications of this construct for the show’s inhabitants.

Seen one way, the very foundation of virtual reality and software—programming—is itself the language and act of world creation, with code serving as architecture. If we accept Lawrence Lessig’s maxim that “code is law,” we begin to see that cyberspace, as a construct, is infinitely malleable, and the question then becomes not one of “What can we do?” but “What should we do?” In other words, if given the basic tools, what kind of existence will we create and why?
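
To make the maxim concrete, here is a minimal sketch of “code as architecture” (entirely hypothetical, drawn from neither the show nor Lessig; all names are illustrative). In a simulated world the rules are not enforced after the fact; they define what can happen at all:

```python
# Hypothetical illustration of "code is law": two worlds built on the same
# engine differ only in their coded rules, and those rules ARE the law.

class World:
    def __init__(self, rules):
        # rules maps an action name to a predicate deciding who may do it
        self.rules = rules
        self.log = []

    def attempt(self, actor, action):
        # An action outside the coded rules is not "illegal" -- it is
        # simply impossible, as if forbidden by the world's physics.
        rule = self.rules.get(action)
        if rule is None or not rule(actor):
            return False
        self.log.append((actor, action))
        return True

open_world = World({"fly": lambda actor: True})
walled_world = World({"fly": lambda actor: actor == "admin"})

print(open_world.attempt("guest", "fly"))    # True: the code permits it
print(walled_world.attempt("guest", "fly"))  # False: the code itself forbids it
```

The design point is that changing the rules dictionary changes the world itself, which is the sense in which the question shifts from what we can do to what we should do.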

Running with this theme, the show’s overarching plot concerns an attempt to achieve apotheosis through the uploading of physical bodies/selves into the virtual world. I found this series particularly interesting to dwell on because here again we had something that recalls the cyberpunk notion of transcendence through data but, at the same time, the show asked viewers to consider why a virtual paradise was more desirable than one constructed in the real world. Put another way, the show forces the question, “To what extent do hacker ethics hold true in the physical world?”


[1] Although the show is generally quite smart about displaying the right kind of content for the medium of television (e.g., fleshing out the world through channel surfing, which not only gives viewers glimpses of the world of Caprica but also reinforces the notion that Capricans experience their world through technology), the ability to visualize V world (and the transitions into it) is certainly an element unique to an audio-visual presentation. One of the strengths of the show, I think, is its ability to add layers of information through visuals that do not call attention to themselves. These details, which are not crucial to the story, flesh out the world of Caprica in a way that a book could not, for while a book must generally mention items (or at least allude to them) in order to bring them into existence, the show never has to name aspects of the world or actively acknowledge that they exist.


Admission + Confession

If I were feeling generous, I might be inclined to argue that the conflicted nature of Admission (Weitz, 2013) is a purposeful gesture designed to comment on the turmoil present in the process of admission (in both senses of the word). Unfortunately, however, I suspect that the movie simply lacked a clear understanding about its core story, relying instead on the well-worn structure of the American romantic comedy for support. Based on a 2009 book by Jean Hanff Korelitz, the movie adaptation focuses on the trajectory of Princeton admission officer Portia Nathan (Tina Fey) after the Head of School for the alternative school Quest, John Pressman (Paul Rudd), informs her that one of his students, Jeremiah (Nat Wolff), might be her son. Confused as the movie might have been, it was startlingly clear in its reflection of current cultural themes; evidencing a focus on the individual in a neoliberal environment and various manifestations of the sensibility of the post-, Admission remains a movie worth discussing.

 

Individualism and Neoliberal Thought

Although the decision to anchor the story in the character of Portia makes a certain amount of narrative sense, the focus on the individual at the expense of the process represents the first indication that Admission is driven by a worldview that has placed the self at the center of the universe. But, to be fair, I would readily argue that the college admission process itself is one that is driven by individualistic impulses as high school students learn to turn themselves into brands or products that are then “sold” to colleges and universities around the country. In large and small ways, college admission in its present form demands that American youth mold themselves into a somewhat elusive model of excellence. (Let’s be honest, we all know parents who teach their toddlers French or insist on lessons of various kinds in the hopes that these skills will place children on track for a “good” school.) In short, college admission sets the rather impossible task for students to, as Oprah would say, “Be your best self” while remaining authentic and not presenting as packaged (although that is secretly what is desired). The danger here, I think, is failing to realize that what is deemed “authentic” is, by its very nature, a self that has been groomed to meet invisible expectations and therefore is understood as natural.

Tracing one factor in the development of the current primacy of individualism, Janice Peck performs a close analysis of Oprah’s Book Club in her book The Age of Oprah: Cultural Icon for the Neoliberal Era, illustrating how Winfrey’s continual insistence on the self-enriching power of literature reflects the positioning of the self as the most relevant construct for individuals immersed in a culture of neoliberalism (186). Through her examination of Oprah’s Book Club, Peck suggests a manner in which culture has reinforced the adoption of particular values consistent with those of neoliberalism. Admission is not exempt from this reflection of a larger sensibility that judges worth in relationship to self-relevance, as we see the character of Portia only really advocate for a student once she believes that he is the son she gave up for adoption. Although I am willing to give Portia the benefit of the doubt and believe that she has been an advocate for other applicants in the past, the movie’s choice to conflate Portia’s professional and personal outreach grossly undercuts the character’s ability to effectively challenge a system that systematically promotes a particular range of students to its upper echelon.

Moreover, having previously established the influence of the 1980s recovery movement (7), Peck suggests that for those who subscribe to the ideals of neoliberalism the therapeutic self—the self that is able to be transformed, redeemed, rehabilitated, or recovered—is of utmost importance. As an example of this sentiment’s pervasiveness, consider that in discussing the merits of her applicant son (although doing so would appear to be a clear conflict of interest) Portia stresses the way in which Jeremiah has blossomed in the right environment and thus exemplifies the American ethic of pulling oneself up by one’s bootstraps. Here Portia urges her colleagues to overlook the first three years of high school riddled with Ds and Fs and to focus on Jeremiah’s transformative capacity.

 

The Manifestation of the Post-

And yet perhaps Portia’s insistence on the power of change makes a certain amount of sense given that she is the female lead of a romantic comedy and embodies transformation herself. Initially portrayed as a bookish middle-aged woman whose life is characterized by resigned acceptance, Portia inevitably has her world shaken by the introduction of a new male presence and proceeds to undergo the transformation typical of female leads in this scenario. Indicative of a postfeminist sensibility, Portia’s inner growth manifests as a bodily makeover in a fashion that mirrors Rosalind Gill’s reading of Bridget Jones’s Diary (2007).

The most telling manifestation of the logic of the post- in Admission is, however, the film’s express desire to “have it both ways” with regard to attitudes on female identity/sexuality and race. In her article “Postfeminist Media Culture: Elements of a Sensibility,” Gill argues that the deployment of irony to comment on social issues is a central feature of the post- mentality and a practice that is ultimately damaging, as it reinforces inequalities through its insistence that difference has become innocuous enough to be rendered the subject of a joke (2007). In this vein, Admission introduces Portia’s mother, Susannah (Lily Tomlin), as a second-wave feminist only to undercut the power of the message that she represents. Although it is not expressly stated, the presentation of Susannah is suggestive of a radical feminist, and yet the film also features a scene in which Susannah exemplifies postfeminism’s connection between the body and femininity by electing for reconstructive surgery after a double mastectomy; Susannah later admits that Portia’s conception was not an act of defiance but rather simply a mistake made by a young woman.

Admission also demonstrates ambivalence towards issues of race, not broaching the topic unless it is specifically the focus of the scene. To wit, John’s mother is a one-dimensional stereotype of a New England WASP whose articulations of racism (despite having a Ugandan grandchild) ostensibly indicate that she is not a “good white liberal.” This scene is indicative of the way in which irony has infiltrated popular media, going for the easy joke as it winks to the audience, “We all know that racism is awful, right?” Insultingly, Admission then fails to comment on the way in which John’s son Nelson (Travaris Spears) perpetuates a very specific presentation of young black males in popular culture as rascals and/or the way in which issues of race continue to be a very real point of contention for the admission process as a whole. As with issues of feminism, Admission exemplifies the sensibility of the post- in that it expresses a desire to gain approval for acknowledging social issues while not actually saying anything meaningful about them.

 

Problematizing Irony as Social Critique

How, then, do we go about unseating irony as a prevalent form of social critique when the response to challenges is often, “Can’t you take a joke?” I was surprised to see, for example, a response to Seth MacFarlane’s opening Oscar bit that argued that the feminist backlash was misplaced—according to Victoria Brownworth, MacFarlane was using satire to point out the inequalities in the Hollywood system. Although Brownworth fails to recognize that acknowledging a phenomenon without providing critique or an alternate vision only serves to reinforce the present, her reaction was not an isolated one.

One of the things that I have learned thus far in my life is that it is almost impossible to explain privilege to a person who is actively feeling the effects of that position, and so a head-on confrontation is not always the best strategy. (This is, of course, not to say that one should allow things to pass without objection but merely that trying to break down the advantages that a party is experiencing in the moment is incredibly difficult.) If we recognize that the logic of neoliberalism constructs individuals who primarily understand importance in relationship to the relevance to the self—or, worse yet, do not think about interpersonal and structural forces at all—and that irony can be used as a distancing tactic, how do we go about encouraging people to reengage and reconnect in a meaningful way?


Admission People Problems

This isn’t a new thing but I have to say that the Admission Problems tumblr (http://admissionsproblems.tumblr.com/) makes me so incredibly sad. As someone who used to work in the profession I have to admit that I get the jokes and I completely understand blowing off steam–a lot is asked of you as a professional and it is, at times, hard to remember why you do what you do. That is, if you even love it in the first place. I sympathize with the frustration of being continually misunderstood and seeing the same perceived shortcomings appear over and over again in students and parents but the thing is, I think, that we need to remember that the stakes look so different from the other side of the college fair table.

Our profession already struggles with an image issue and the danger of the tumblr is that outsiders are going to read it and judge all of us for what a few of us do. Outsiders are not going to understand the way that we might grumble but do so because we have so much hope for students and, perhaps unfairly, want them all to be as great as we know they can be. What does the blog do for students and families who are already nervous about navigating the college-going process? How many students will get the idea that they just aren’t good enough or that we don’t really care about them because of the tumblr’s vibe?

I get that a lot of the admission counselors who are on the ground are young but I also think that we should challenge ourselves to be better. This doesn’t mean that we don’t have faults and that we are immune from the occasional grumble session. We should be honest with our students and our families about how we are, just like them, human and we have human emotions that include frustration. But we should also be honest with them and let them know that this is not our dominant state of being–we are (with luck) not jaded and cynical and completely distanced from what it was like to apply to college. We should be honest and admit that sometimes we DO forget that this is, in many ways, the first time that these students can fail at something big and that an entire educational system has coached them to present themselves in ways that we occasionally find tiring. We need to be honest and tell people that our outbursts don’t mean that we love students or support their goals any less.

We talk about how “students these days” can be narcissistic, individualistic, and needy. We talk about how our students aren’t smart about social media use. And maybe those arguments can be made. But we should consider how something like this Admission Problems tumblr implicates us in the very things that we think we are above. The tumblr talks about growth and how people can “learn” from the examples provided but makes evident that it knows nothing about what it actually means to be an educator. Is the information helpful? Maybe. But people should definitely be offended, because the goal of Admission Problems is not to teach nor is it to truly understand. Admission Problems exists solely to critique and to judge, and the notion that this is productive is severely misguided. There are many things about the culture of college admission that I want to work to change but I also, at times, get angry enough to shout at these anonymous people, “Get out if you don’t love what you do. This work is too important to be done by people who don’t care.”

In so many ways I want to revise the tumblr’s subtitle and tell students that they ARE special in so many ways and sometimes we just can’t see that. But I also want to remind them that special doesn’t mean better than. I want to remind students that they are the protagonists of their stories but, at the same time, they are bit players in the stories of others and that being able to reconcile those two ideas is going to take them far in life.




On Obsession with Choice

A couple of weeks ago I found myself leading an exercise on marketing ethics for an introductory marketing class in the Marshall School of Business. Structured more as a provocation than a lecture, we covered basic concepts of persuasion and manipulation before proceeding to engage in a discussion about whether particular marketing practices were considered ethical (and how such a determination was ultimately made). During the course of our discussion many of these students expressed an opinion that it was, generally speaking, the responsibility of the consumer to know that he or she was 1) being marketed to and 2) potentially being tricked. I recorded this sentiment on a whiteboard in the room but didn’t comment much on it at the time. However, toward the end of the session I presented the class with a thought experiment that was designed to force the students to struggle with the concepts that they had just encountered and to push their thinking a bit about ethics.

Case (A):  Smith, a saleswoman, invites clients to her office and secretly dissolves a pill in their drinks.  The pill subconsciously inclines clients to purchase 30% more product than they would have had they not taken it but otherwise has no effect.

Case (B):  Smith, a saleswoman, hires a marketing firm to design her office.  The combination of colors, scents, etc., inclines clients to purchase 30% more product than they would in the old office but otherwise has no effect.

Question:  Are these two scenarios equally ethical and, if not, which one is more ethical?

After running this session multiple times a clear pattern began to emerge in students’ responses: the initial reaction was typically that Case B was more ethical than Case A and, when pushed, students typically reported that their decision resulted from the notion that individuals in Case B had a measure of choice (i.e., they could leave the room) while individuals in Case A did not.[1]

Although I didn’t think about it as such at the time, the notion of choice situates itself nicely alongside the empowerment of the self that Sarah Banet-Weiser writes about in Authentic. The takeaway that I had from working with students in this exercise was a profound realization about how choice was construed for them and how, generally, marketing was considered unethical only when it impinged upon an individual’s ability to make a choice.

Linking this back to the earlier statement that the burden of responsibility largely rested upon the consumer, I tried to incorporate examples from popular culture to suggest to the students that, for me, the most insidious effects of marketing are exemplified by its ability to limit or remove choices that you didn’t even know you had.

Because I am old, I invoked a scene from The Matrix Reloaded but drove the point home with a discussion of The Cabin in the Woods, a movie that, among other things, prominently evidenced philosophical questions of agency and free will.

Without spoiling anything, there is an interesting line in the movie where a character essentially argues that the free will of potential victims is preserved because outside forces can lead individuals to an open door but cannot ultimately force them to walk through it. Reflecting the idea that an individual is ultimately responsible for his or her fate, The Cabin in the Woods was particularly helpful for urging students to consider that they tended to focus on choice as an individual transaction instead of taking a step back to look at how behavior was permitted/controlled within a larger system of actions.

After the exercise concluded I found myself talking to the professor of the course about how I was slightly nervous for the future of business if these students held onto their mentality that consumers always acted rationally and were largely responsible for their own fates (to the exclusion of marketers taking responsibility for their campaigns). Now, as I muse on the prominence of the individual and the self in this cohort, I am reminded of an essay written by Kathryn Schulz about the prominence of self-help culture in America and the development of the concept of the self. As I reread the Schulz piece, I found myself revisiting Authentic’s chapters on consumer citizens and religion as I thought through the examples in terms of self-help rhetoric.


[1] For the record, I initially considered both of these cases to be equivalent in nature and suggested to students that part of their aversion to Case A had to do with perceived influence crossing the body/skin boundary and becoming physically incorporated into the self. Invariably students raised the notion of the pill causing some sort of change in brain chemistry, and the thought experiment is designed to suggest that marketing’s true power does not lie in the realm of the directly observable.


Morphology of the Folktale

Morphology of the Folktale

Vladimir Propp

 

 

The fairy tale, on the other hand, is very much the result of common conscious and unconscious content having been shaped by the conscious mind, not of one particular person, but the consensus of many in regard to what they view as universal human problems, and what they accept as desirable solutions.

—Bruno Bettelheim

 

Bibliography

Propp, V. (1968). Morphology of the Folktale (2nd ed.). (L. Scott, Trans.) Austin: University of Texas Press.

 

 

Biography

Vladimir Propp was trained as a philologist, meaning that he studied the historical development of language. A Formalist, Propp is perhaps most famous for Morphology of the Folktale, his attempt to identify the fundamental components of Russian fairy/folktales and the relationship of these elements to each other. In this, Propp responded to Antti Aarne, who focused on motifs (i.e., repeated story elements) and developed the Aarne–Thompson tale type index, by arguing that Aarne identified patterns but ignored the function(s) of these elements. Propp’s work would go on to influence Roland Barthes and Claude Lévi-Strauss, individuals working in mythology and folkloric studies. Because Propp worked from written language, his study of folklore has been criticized for its emphasis on the written form, disregarding that folklore had traditionally been transmitted orally.[1]

 

Summary

In accordance with Russian Formalism, Propp believed that literature was composed of discrete identifiable units and that appropriate analysis would result from the description of these elements and their relationship to both one another and the story as a whole. In order to tackle a study of the breadth of Russian folktales, Propp endeavored to create a comprehensive morphology that listed key elements or constants.

 

For Propp, the appropriate unit of analysis was the function of dramatis personae (i.e., character plus action), which differed from Aarne’s use of motifs in that Aarne placed more emphasis on the action itself whereas Propp argued that the action must be contextualized by an understanding of its actor (in this case the subject of the action is considered part of the action itself and not its own independent element).

 

Indeed, the “who” is not particularly important in fairy/folktales as the characters are relatively unambiguous and often derive their names from their social relationship or occupation (which itself hints at a type of social relationship). The individuals who inhabit a fairy/folktale are simple and largely free from internal conflict:  characters who seem good are good and those who seem bad are bad. In a sense, these characters are not really people so much as they are devices.

 

Furthermore, due to the lack of internal psychology, Propp did not use dimensions like motivation in his analysis. Although modern storytelling in America would seem to place a premium on the “why” (see, for example, the way in which intent is central to the legal system), the straightforward nature of the characters in fairy/folktales made this type of analysis unnecessary.

 

Propp identified 31 unique functions (see below) in Russian folktales, a morphology that bears a certain similarity to Joseph Campbell’s monomyth. However, looking at the elements in both schemas, one can immediately see that fairy/folktales are much more straightforward in that they consist of a specific actor/action while the elements of the monomyth speak much more to a process akin to character development.
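
One of Propp’s stronger claims is that, while a given tale uses only a subset of the 31 functions, the functions it does use appear in a fixed canonical order. A minimal sketch of that idea follows (hypothetical code; the labels are my own shorthand for the functions listed in the table below):

```python
# Hypothetical sketch of Propp's ordering claim: a tale draws a subset of
# the 31 functions but preserves their canonical sequence. The labels are
# shorthand for the functions listed in the table below.

CANONICAL = [
    "absentation", "interdiction", "violation", "reconnaissance",
    "delivery", "trickery", "complicity", "villainy_or_lack",
    "mediation", "counteraction", "departure", "testing",
    "reaction", "magical_agent", "transference", "struggle",
    "branding", "victory", "liquidation", "return",
    "pursuit", "rescue", "unrecognized_arrival", "unfounded_claims",
    "difficult_task", "solution", "recognition", "exposure",
    "transfiguration", "punishment", "wedding",
]

def is_well_formed(tale):
    """True if the tale's functions form a subsequence of CANONICAL."""
    remaining = iter(CANONICAL)
    # Membership tests consume the iterator, so order must be preserved.
    return all(fn in remaining for fn in tale)

print(is_well_formed(["interdiction", "violation", "villainy_or_lack"]))  # True
print(is_well_formed(["violation", "interdiction"]))  # False: functions reordered
```

Campbell’s monomyth, by contrast, resists this kind of flat encoding precisely because its stages describe an internal process closer to character development than discrete actor/action pairs.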

 

It should also be noted that folklore is different from postmodern storytelling, which may use some of these familiar elements but will often combine them in new ways or otherwise play with conventions. For an example of how Propp’s attempt at morphology might be applied in a modern context, see The Periodic Table of Storytelling below.

 

 


[1] See, for example, Philip Pullman’s assertion that “A fairy tale is not a text … It’s a transcription made on one or more occasions of the words spoken by one of many people who have told this tale. And all sorts of things, of course, affect the words that are finally written down. A storyteller might tell the tale more richly, more extravagantly, one day than the next, when he’s tired or not in the mood. A transcriber might find her own equipment failing: a cold in the head might make hearing more difficult, or cause the writing-down to be interrupted by sneezes or coughs.” (Fairy Tales from the Brothers Grimm, xviii)

 

 

Fairy/Folktales vs. (Mono)Myth

 

| # | Fairytale | Monomyth |
| --- | --- | --- |
| 1 | One of the members of a family absents himself from home | The call to adventure |
| 2 | An interdiction (limitation) is addressed to the hero | Refusal of the call |
| 3 | The interdiction is violated | Supernatural aid |
| 4 | The villain makes an attempt at reconnaissance | The crossing of the first threshold |
| 5 | The villain receives information about his victim | Belly of the whale |
| 6 | The villain attempts to deceive his victim in order to take possession of him | The road of trials |
| 7 | The victim unknowingly helps the villain by being deceived or influenced by the villain | The meeting with the goddess |
| 8 | The villain harms a member of the family or a member of the family lacks/desires something | Woman as temptress |
| 9 | The lack or misfortune is made known; the hero is given a request or a command; the hero goes on a mission/quest | Atonement with the father |
| 10 | The seeker plans action against the villain | Apotheosis |
| 11 | The hero leaves home | The ultimate boon |
| 12 | The hero is tested | The refusal of the return |
| 13 | The hero reacts to the actions of the future donor | The magic flight |
| 14 | The hero uses the magical agent | Rescue from without |
| 15 | The hero is transferred to the general location of the object of his mission/quest | The crossing of the return threshold |
| 16 | The hero and villain join in direct combat | Master of two worlds |
| 17 | The hero is branded | Freedom to live |
| 18 | The villain is defeated | |
| 19 | The initial misfortune or lack is set right | |
| 20 | The hero returns home | |
| 21 | The hero is pursued | |
| 22 | The hero is rescued from pursuit | |
| 23 | The hero arrives home or elsewhere and is not recognized | |
| 24 | A false hero makes false claims | |
| 25 | A difficult task is set for the hero | |
| 26 | The task is accomplished | |
| 27 | The hero is recognized | |
| 28 | The false hero is exposed | |
| 29 | The false hero is transformed | |
| 30 | The villain is punished | |
| 31 | The hero is married and crowned | |

 

Periodic Table of Storytelling


True Women and Fruitful Femininity: Evangelical ideology and women’s bodies

The rhetoric of war has become somewhat commonplace in the contemporary American political sphere, used by pundits and journalists to describe everything from the ongoing physical conflict abroad in Afghanistan to contestations over domestic ideology manifested via the War on Christmas. War, it seems, has become the de facto term used to label conflict on a national scale, and the casual use of the phrase is rather indicative of the heightened political rhetoric of our time. Noting the prevalence of this existing sentiment, it makes a certain amount of sense that a phrase introduced via Tanya Melich’s The Republican War against Women in 1998 would be resurrected during the 2010 campaign season and popularized during the elections of 2012. Primarily used to describe the deluge of legislation related to women’s healthcare on both national and state levels—for example, the restriction or elimination of funding for Planned Parenthood, the institution of “right-to-know” laws and waiting periods for abortions, limits on access to birth control, and mandated transvaginal ultrasounds—“the war on women” was coined in order to signal a new round in the ongoing efforts of socially conservative politicians to institute control over women’s bodies.

            It is against this backdrop that Ann Romney took to the stage during the 2012 Republican National Convention to announce, “I love you women!” Regardless of Romney’s personal feelings on the subject, her declarative statement served as a recuperative effort to address the Democratic Party’s accusations that Republicans were engaged in an assault on women. Described as a “myth” by conservative sources (Merkel, 2012), or alternatively addressed by a tu quoque argument about conditions facing women elsewhere (Van Susteren, 2012), the Republican assertion that the party was women-friendly stood in contrast to the very real ways in which Republicans, as a generalized political bloc, had systematically attempted to curb the rights of women in the 20th and 21st centuries. Although novel in their wording, the movements encapsulated by “the war on women” are not radical in their position; best understood in a context of Republican legislation reaching back to the 1970s, the war on women can be seen as an ongoing battle. Without diminishing the important potential implications of current bills like House Bill 290 in Ohio, which would deprioritize Planned Parenthood clinics for funding in a manner that effectively eliminates federal support, these acts must be located within a broader socio-historical context in order to gain a fuller understanding of the situation at hand.

            In order to help situate the aforementioned war on women, this article will attempt to look at the intersection of conservative politics and religion as they pertain to the discipline and surveillance of the female body. Although an initial correlation can be readily made between these two categories, the relationship is not one of simple causation; rather, it will be argued that a deeper ideology about the body, one that springs from Protestantism, has coevolved with American concerns about the body in order to inform the current legislation that comprises the war on women. Through explorations of issues surrounding recent mentions of rape and abortion, this article hopes to illustrate how ambivalence over the body arising from the Protestant tradition results in conflicting views over the regulation and management of women’s bodies, and how the resurgence of the Evangelical movement in America has helped to transmit these ideas to a new generation of Christian youth through the creation of a lifestyle that successfully integrates politics and religion into everyday practices. One important limitation to note in this endeavor, however, is the way in which discussion of groups like women, evangelicals, and politics demonstrates a sensibility that is decidedly white and middle class. Although there are undoubtedly ways in which segments of the populations mentioned in this article reflect an experience that deviates from what is described, these minority positions derive their identities from their oppositional stance to the white male ideology that dominates evangelical Christian culture and, thus, the exploration of this phenomenon through such a lens remains valid if admittedly incomplete in its scope. Additionally, a longer paper would benefit from analysis of different forms of feminism, paying particular attention to the way in which modern American bodies are defined in part through practices of consumption on literal and metaphoric levels. Ultimately, the article aims to argue that feminists should situate events like “the war on women” in a broader socio-historical context that recognizes the importance of deeply-rooted and seemingly unrelated beliefs.

The Rape Thing

            The months leading up to the 2012 election seemed to be rife with socially conservative politicians on all levels of government voicing a series of positions on rape that became highly publicized:  Linda McMahon’s mention of “emergency rape” (Vigdor, 2012), Ron Paul’s use of “honest rape” (Benen, 2012) and John Koster’s employment of the phrase “the rape thing”[1] (Kaminsky, 2012) all helped to illustrate the various ways in which the issue of rape is understood and deployed in American culture at the present moment. Perhaps the most memorable story from this series of events, however, was Representative Todd Akin’s invocation of the now infamous term “legitimate rape” during a televised interview (Moore, 2012). Although Akin would later claim that he used the word “legitimate” in order to distinguish between true and false reports of rape, the context of the phrase made such a reading rather unlikely. To quote Akin from his appearance on The Jaco Report, “It seems to be, first of all, from what I understand from doctors, it’s really rare. If it’s a legitimate rape, the female body has ways to try to shut the whole thing down” (2012).

            In response to the outrage that followed his comments, Akin claimed that he “misspoke” in a move that essentially deflected attention away from the ideology underlying the original statement. Suggesting that Akin’s position was not merely a poor choice of words, Orange County Superior Court Judge Derek G. Johnson reportedly made the following statement during the sentencing of a convicted rapist in 2008: “If someone doesn’t want to have sexual intercourse, the body shuts down. The body will not permit that to happen unless a lot of damage is inflicted.” (Goffard & Marble, 2012; Moxley, 2008). Although Johnson did not use the term “legitimate rape,” the choice of language here is eerily similar to that of Akin, replete with the notion that the (female) body somehow “shuts down” in order to prevent unwanted and/or unsanctioned sexual intercourse.

            Although the comments of Representative Todd Akin and Judge Derek G. Johnson suggest a way in which science has been commandeered to support inaccurate medical positions, they also raise an important point regarding the way in which rape is popularly conceptualized: rape is something that only happens to women and is perpetrated by men. Before castigating Akin and his conservative colleagues, however, we should consider the way in which this view of rape is enshrined within the American legal system as a whole: according to the FBI’s Uniform Crime Reporting program, “forcible rape” had been defined as “carnal knowledge of a female [emphasis added] forcibly and against her will” since 1927 and was only revised in 2012 to read as “the penetration, no matter how slight, of the vagina or anus with any body part or object, or oral penetration by a sex organ of another person, without the consent of the victim” (Federal Bureau of Investigation, 2011; Federal Bureau of Investigation, 2012). Here, the modifier “forcible” is employed in order to differentiate this particular type of rape from statutory rape, which is, by definition, excluded from this particular category.

            Consistent with this differentiation, and demonstrating that a firm definition of rape is not just a problem ascribed to socially conservative individuals, Whoopi Goldberg asserted on the television talk show The View that director Roman Polanski was guilty of statutory rape, but not “rape rape” (Kennedy, 2009). On some level, viewers may have understood that Goldberg was trying to differentiate between degrees or types of acts based on use of force or violence, but the statement revealed an underlying assumption that a form of “true” rape exists; to put it another way, Goldberg’s phrasing suggests that although “legitimate rape” may not exist, particular categories of rape are indeed legitimized.

            Indeed, the issue of rape has only become more confused in recent years with terms like “gray rape” appearing in Cosmopolitan to describe, as the author puts it, “A New Form of Date Rape” (Stepp, 2007). In her article, Stepp points to the apparent gray zone that exists when consent is unclear and effectively introduces a measure of doubt designed to attack the popular understanding of what constitutes rape. Here, it should be noted, Stepp’s words reflect an established position regarding consent given by women that is rendered ambiguous by intoxication and enacted as part of a hookup culture. This stance, encapsulated by individuals like Katie Roiphe (1994), who suggested in The Morning After: Fear, Sex, and Feminism that “If a woman’s ‘judgment is impaired’ and she has sex, it isn’t necessarily the man’s fault; it isn’t necessarily always rape,” remains entrenched in a moralizing and apologist discourse. Yet, aside from reaffirming the notion that rape is something that happens solely to women at the hands of men, perhaps the most damaging aspect of this article is the way in which Stepp comingles the language of empowerment for women with restrictive gender roles in a manner that garners approval: it avoids blaming the victim even as it proffers a solution reminiscent of the arguments that stemmed from the backlash to Second Wave Feminism.

            In her article, Stepp tells the story of Alicia,[2] who is hesitant to describe her post-hookup experience as rape because Alicia considers herself to be a strong woman and sexually independent (2007). Here, the insistence on understanding the categories of “strong woman” and “rape victim” as mutually exclusive is particularly problematic, as it not only prevents the reporting of a crime but also reinforces a good-bad binary: under this false construction, to declare oneself a victim of rape is to necessarily disempower oneself. The solution that Stepp provides to this dilemma is decidedly anti-feminist, as she states that “A generation ago, it was easier for men and women to understand what constituted rape because the social rules were clearer. Men were supposed to be the ones coming on to women, and women were said to be looking for relationships, not casual sex” (2007). The emphasis on the good-bad girl dichotomy is clear, with a desire for casual sex (as a stand-in for poor judgment in general) being associated with negative consequences. Undoubtedly influenced by social conservatism and postfeminism, Stepp’s clever choice of words asks readers, who are ostensibly women, to align with the perspective of Alicia as an independent and sexually powerful person while attributing the root cause of gray rape to the ambiguity that stems from modern gender roles; the paradoxical problem, then, is women as a whole but not women as individuals.

            On one level, this debate over rape would appear to be about the issue of consent: what it is, whether it is revocable, and who can give it.[3] While further exploration of this concept is certainly warranted, we can draw upon work by feminists like Catharine MacKinnon and Andrea Dworkin to consider the larger framework in which sexuality and choice are framed. What this discussion ultimately points to is the way in which rape has yet to be singularly defined in American legal and social spheres, which, in turn, stems from varying views on who should be in control of a woman’s body. In contrast, consider that domestic violence, an issue that has historically predominantly affected women, has come to be seen as utterly abhorrent due in part to the 1994 campaign, “There’s no excuse for domestic violence.” Although the campaign is subject to criticism for its overrepresentation of white middle-class women, the series of public service announcements ardently worked to establish a common definition for what constituted domestic violence (The Ad Council, 2003). Stepp’s elaboration on her article in a panel discussion at the John Jay College of Criminal Justice on the topic of gray rape reinforced apparent themes of vagueness and confusion while opponents responded with the finality of “rape is rape” (Chan, 2007). The note of uncertainty in Stepp’s position and the corresponding desire to find reassurance in retreat are important to note, however, as they speak to the way that, in a world of ambiguity, the female body is the thing that we return to as that which we can control.

The Cult(ure) of Life

            In order to more fully understand the themes of retreat and uncertainty, it is helpful to remember the context in which the discussion of rape was placed during the 2012 election season:  in most cases, discussion of rape was nested within a larger ongoing discussion about the Republicans’ positions on abortion, a political issue that becomes almost inseparable from religious beliefs in contemporary debates. Abortion, very much a Catholic concern in 1973 when the Supreme Court decided Roe v. Wade, became an Evangelical issue partially through the work of Francis Schaeffer, who produced a book and film both entitled Whatever Happened to the Human Race? On some level, the idea that religion influences abortion policy seems rather obvious with suppositions made about the pro-life leanings of conservative Christians and indeed, as a rule of thumb, such assumptions may not be incorrect. However, a deeper examination helps to illuminate how elements of Christianity, in addressing questions of ambiguity and uncertainty, support the particular policies that are currently manifesting. In this, it is particularly instructive to situate the current political and religious climate within a larger history of American religious activities.

            Awakenings, movements born during times of upheaval and uncertainty, characteristically began with an appeal to traditional values as large numbers of people converted to, or reaffirmed their faith in, Christianity. Although a detailed discussion of America’s Great Awakenings is beyond the scope of this paper, consider that the First Great Awakening occurred roughly between 1730 and 1760 while the Second appeared between approximately 1800 and 1830; both of these movements foreshadowed the most pivotal domestic wars in the history of the United States and were indicative of periods of civil unrest that precipitated conflict on a massive scale. A Third Great Awakening then came at the start of the 20th century as concerns over modernity and industrialization once again introduced ambivalence about the future and man’s place in the world.[4] Understanding that the notion of uncertainty is vital to the appearance of Great Awakenings, we might consider how current developments in science and technology have once again worked to displace mankind from the center of the universe, causing us to engage in an ongoing renegotiation of our senses of self. In this context, the intellectual retreatism that manifests around issues like climate change and the body makes a certain amount of sense; whether or not we are ready to label the current project as the Fourth Great Awakening, it is difficult to deny that the framework of the Awakening provides a possible lens through which we can attempt to understand the phenomena of recent history, with the late 20th and early 21st centuries playing host to a number of interrelated issues, ranging from abortion to stem cell research and artificial life support, that are united through their exaltation of life.

            Popularized by Pope John Paul II in the late 20th century, the “culture of life” was rapidly adopted by American evangelicals in order to connect a set of theological beliefs about life to public policy (1995). The culture of life assumes, in a manner reminiscent of the Great Chain of Being, that life is fundamentally different from inert matter and, furthermore, that human life is substantially different from all other forms of life. For those who subscribe to this particular philosophy, there is a particular way in which life evidences a measure of agency and self-direction, with human life (as opposed to animal life) being distinguished by a unique animating principle. Although this specific view of life descends from a vitalist tradition that may or may not have considered the unique spark to be the soul, the “culture of life” as a product of Catholic theology unapologetically described this essential life essence in terminology that references the soul. Consequently, the culture of life positions this human exceptionalism as a direct result of divine will, meaning that God has implanted a soul within each individual body. Given that this differentiation between forms of being is what structures the universe, challenges that threaten to upend this order take on increased significance; the fight for any one individual life, then, is a fight to preserve the sanctity of all life.

            Exemplifying the attitude of the culture of life in this matter was the case of Terri Schiavo, who was at the center of a protracted legal battle over the ability of Schiavo’s husband, Michael, to remove Schiavo’s feeding tube and thus end her life. Schiavo’s case was notable in that it garnered national attention and resulted in the passing of health legislation—the Palm Sunday Compromise—designed solely to benefit a single person. The president at the time, George W. Bush, rushed back to Washington D.C. from a vacation in Texas in order to sign a bill designed to move Schiavo’s case from state to federal court and issued this statement of support: “It should be our goal as a nation to build a culture of life, where all Americans are valued, welcomed, and protected—and that culture of life must extend to individuals with disabilities” (2005). A few months later, President Bush would go on to declare his opposition to embryonic stem cell research while simultaneously supporting an ongoing war in Iraq that is estimated to have killed between tens of thousands and hundreds of thousands of Iraqis (Iraq Body Count, 2012). The culture of life, then, would appear to have an inherent ambivalence about the concept of life or, at the very least, about which lives are of value.

            Returning to the larger framework from which the culture of life derives, however, we see that any notion of ambiguity is addressed through the hierarchical structure of life that orders the universe. The underlying structure of a hierarchy—along with the presumption that white American males sit at the top of the heap—legitimates policy that works to support systemic social inequality and would otherwise appear unjust. This drive to fight for life at the expense of lesser forms slides readily into a justification for the domination of everything else under the guise of protection; a worldview informed by the hierarchical nature inherent in the culture of life is reflected in policy that covers everything from universal health care to enhanced interrogation techniques and the environment.

The Issue of Women and Their Bodies

            One group, in particular, that the culture of life’s hierarchical structure often works to subjugate is women and, in this, the issue of abortion presents a fruitful subject of inquiry as it resides at the nexus of issues regarding theology, politics, gender, and the body. Bodies in general, and women’s bodies in particular, have traditionally represented an additional source of ambivalence and anxiety for socially conservative Christians. In fact, the concept of the body was used throughout early Christianity to reinforce the hierarchy established by constructs like the Great Chain of Being. Church doctrine formalized a gendered hierarchy that designated the man of the house as the “head,” the center of reason and logic, while woman was associated with the body.[5] From Ephesians chapter 5, verses 22-24:

Wives, submit to your own husbands, as to the Lord. For the husband is the head of the wife even as Christ is the head of the church, his body, and is himself its Savior. Now as the church submits to Christ, so also wives should submit in everything to their husbands.

            For evangelicals, who believe in the inerrancy of the Bible, this particular passage is key as it establishes the basis of female submission and lays the groundwork for the belief that men have not only the right but the divine duty to control women and their bodies. This is not to suggest, of course, that this particular passage is cited as justification for legislation designed to restrict women’s health but rather to argue that evangelicalism forms part of an underlying ethic that then serves to inform such policy.

            Addressing this very issue, radical feminism worked to point out the way in which women’s identity has historically been defined in relationship to that of men. Here, in contrast to previous iterations of feminism that understood inequality in terms of legal and class systems (i.e., liberal and Marxist feminism), we witness a movement that calls the legitimacy of patriarchy into question and, with it, the primacy of heterosexuality’s influence in society. Radical feminism’s opposition to the ethic of evangelicalism is important to note because the strains of thought established by radical feminism are precisely what socially conservative Christian culture continues to battle today. To quote conservative evangelist Pat Robertson, “[feminism] is about a socialist, anti-family political movement that encourages women to leave their husbands, kill their children, practice witchcraft, destroy capitalism and become lesbians” (1992). Admittedly extreme in its view, Robertson’s quote nevertheless speaks to the way in which contemporary forms of feminism are associated with radical feminism and, as such, are subject to an incredible backlash. The danger here is that the disparaging of radical feminism and its core ideals allows patriarchy to solidify its hold and further entrench the legitimacy of men over women.[6]

            But it is not just women as a category that is addressed by Ephesians, for the passage also speaks to the subjugation of the body, and it is the linking of the two that has historically been a feminist concern. By associating women with the body and the material—as opposed to the idealism and rationality represented by men—women’s bodies, and women by extension, have historically come to be regarded as objects. Successes from liberal feminism have helped to ensure that women’s bodies are no longer considered property, but contemporary forms of feminism continue to struggle with the ways in which control and surveillance of women’s bodies have become integrated into culture.

            As a site of investigation the body holds particular importance, for it was through the body that anxieties about the world and one’s place in it were addressed:  early Christianity seized upon the desire for order and used the body to physically manifest notions of morality. The body, following a tradition established throughout medieval practice and ushered into the early modern era via Calvinism, became a barometer for the condition of the soul, and fitter bodies indicated fitter souls. For many, efforts to secure salvation were enacted through the disciplining of one’s body as asceticism expanded to guard against excesses of food, sex, and the body. One consequence of this is the rise of Christian fitness culture, a theme that is explored in R. Marie Griffith’s Born Again Bodies. For Griffith (2004), there is a key distinction to be made regarding the way in which the body is configured in American evangelicalism:  the American disciplining of the body is removed from earlier practices of penitence or identification with Christ’s suffering. The body has become a site of ambivalence, the entity responsible for the promulgation of sin while simultaneously acting as the conduit through which one demonstrates devotion to God. For American evangelicals, controlling the body is an end in and of itself.

The Bodies of the Future

            Evangelical youth in particular have renewed this effort to avoid excess, with movements ranging from modesty clubs to straight edge culture and participation in programs like The Silver Ring Thing. For evangelicalism, popular culture has, in a broad sense, been seized as a medium to transmit the messages and values of the movement, and nowhere is this more apparent than among youth. This is not to imply, of course, that evangelicals believe that all instances of pop culture are performing the work of God but rather to suggest that popular culture—as the culture of the people—has been appropriated by evangelical movements and successfully integrated into a lifestyle for its followers.[7] There is a powerful community forming in this next generation of evangelical youth, united by their love for God and increasingly supported through an ever-widening network of rock concerts, skate parks, megachurches, prosperity gospels, and youth ministries that understand the importance of tapping into an ethos driven by a profound need to belong. It is here that we see how the current movement of evangelical youth has adopted lessons from the countercultural movements of the 1960s; employing the language of difference feminism for very different ends, young women understand sisterhood as a bond forged through the celebration of traditional social roles as devotion to God.

            If radical feminism coined the phrase “the personal is political” in order to argue that the everyday experiences of women were inextricably tied to political processes, the evangelical youth movement, in denying that it is about politics, performs a rather ingenious countermove:  it has cast the political as the everyday and thus makes itself more accessible to the next generation of activists. Although they may be hesitant to articulate it as such, politics, in the view of evangelical youth, has become a powerful combination of what you do, what you believe, and who you are. The political, in other words, has become personal.

            Even the very process of coalition building, championed by prominent feminist scholars like Bernice Johnson Reagon, has been assimilated into the toolkit of evangelism, but unlike the feminist movement, this generation of evangelical activists has not been challenged to critically consider the implications of difference, instead focusing on messages of acceptance and cohesion through God’s love. The formation of cultural identity has become dependent on definition through disidentification with the Other, while the incorporation of substantial difference is ignored. In a way, movements like Mars Hill Church in Seattle represent the inversion of coalition politics, for they champion the very sense of nationalism that Reagon warns is insufficient to survive in a modern world full of diversity (1983).

            Looking back to look forward, it is precisely this sense of retreatism that makes evangelical youth a population worthy of study, for we can turn to our nation’s history to understand what happens when deep cleavages are allowed to persist. The goal here is not to castigate evangelical youth movements but rather to issue a call to the corresponding members of the next generation of progressive activists:  if you are truly interested in forwarding the cause of feminism, remember the words of Bernice Johnson Reagon and push yourselves to see the linkages between seemingly disparate issues. By turning politics into a lifestyle, evangelical youth movements have developed a structure that makes it almost impossible for a believer to be a single-issue voter and, although there are assuredly differences between individuals, the sense of collective action that arises from this group remains one of their biggest successes.

Works Cited

Akin, T. (2012, August 19). The Jaco Report. (C. Jaco, Interviewer)

Benen, S. (2012, February 6). Ron Paul and “Honest Rape”. Retrieved from The Maddow Blog: http://maddowblog.msnbc.com/_news/2012/02/06/10331008-ron-paul-and-honest-rape?lite

Bush, G. W. (2005, March 17). President’s Statement on Terri Schiavo. Retrieved from The White House: http://georgewbush-whitehouse.archives.gov/news/releases/2005/03/20050317-7.html

Chan, S. (2007, October 15). ‘Gray Rape’: A New Form of Date Rape? Retrieved from The New York Times: http://cityroom.blogs.nytimes.com/2007/10/15/gray-rape-a-new-form-of-date-rape/

Federal Bureau of Investigation. (2011, September). Forcible Rape. Retrieved from Crime in the United States: http://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2010/crime-in-the-u.s.-2010/violent-crime/rapemain

Federal Bureau of Investigation. (2012, January 6). Attorney General Eric Holder Announces Revisions to the Uniform Crime Report’s Definition of Rape. Retrieved from National Press Releases: http://www.fbi.gov/news/pressrel/press-releases/attorney-general-eric-holder-announces-revisions-to-the-uniform-crime-reports-definition-of-rape

Goffard, C., & Marble, S. (2012, December 13). Judge Who Said Rape Victim “Didn’t Put Up a Fight” Later Apologizes. Retrieved from The Los Angeles Times: http://latimesblogs.latimes.com/lanow/2012/12/judge-who-said-rape-victim-didnt-put-up-a-fight-later-apologized.html?utm_source=dlvr.it&utm_medium=twitter&dlvrit=649324

Griffith, R. M. (2004). Born Again Bodies: Flesh and Spirit in American Christianity. Berkeley: University of California Press.

Iraq Body Count. (2012, December 10). Iraq Body Count Database. Retrieved from Iraq Body Count: http://www.iraqbodycount.org/database/

John Paul II. (1995, March 25). Evangelium Vitae. Retrieved from The Vatican: http://www.vatican.va/edocs/ENG0141/_INDEX.HTM

Kaminsky, J. (2012, November 1). Republican Candidate Calls Aborting Rapist’s Child “More Violence on Woman’s Body”. Retrieved from Reuters: http://www.reuters.com/article/2012/11/01/us-usa-campaign-abortion-idUSBRE8A006A20121101

Kennedy, M. (2009, September 29). Polanski Was Not Guilty of ‘Rape-Rape’, Says Whoopi Goldberg. Retrieved from The Guardian: http://www.guardian.co.uk/film/2009/sep/29/roman-polanski-whoopi-goldberg

Merkel, J. (2012, April). War on Women is a Myth: Nikki Haley and the Top 5 Republican Women. Retrieved from PolicyMic: http://www.policymic.com/articles/5958/war-on-women-is-a-myth-nikki-haley-and-the-top-5-republican-women

Moore, L. (2012, August 20). Rep. Todd Akin: The Statement and the Reaction. Retrieved from The New York Times: http://www.nytimes.com/2012/08/21/us/politics/rep-todd-akin-legitimate-rape-statement-and-reaction.html?_r=0

Moxley, R. S. (2008, October 30). The DA’s Office Reacts to a Naughty Episode of Prosecutorial Misconduct. Retrieved from Orange County Weekly: http://www.ocweekly.com/2008-10-30/news/moxley-confidential/

Reagon, B. J. (1983). Coalition Politics: Turning the Century. In B. Smith (Ed.), Home Girls: A Black Feminist Anthology (pp. 356-368). Boston: Kitchen Table: Women of Color Press.

Roiphe, K. (1994). The Morning After: Fear, Sex, and Feminism. New York: Back Bay Books.

Stepp, L. S. (2007, September). A New Kind of Date Rape. Retrieved from Cosmopolitan: http://www.cosmopolitan.com/sex-love/tips-moves/new-kind-of-date-rape

The Ad Council. (2003). Domestic Violence Prevention (1994-Present). Retrieved from The Ad Council: http://www.aef.com/exhibits/social_responsibility/ad_council/2472

The New York Times. (1992, August 26). Robertson Letter Attacks Feminists. Retrieved from The New York Times: http://www.nytimes.com/1992/08/26/us/robertson-letter-attacks-feminists.html

Van Susteren, G. (2012, November 14). The Real “War on Women” – The One We Do Not Hear About! And Has Facebook Joined the War on Women? On the Wrong Side? Retrieved from GretaWire: http://gretawire.foxnewsinsider.com/2012/11/14/the-real-war-on-women-the-one-we-do-not-hear-about/

Vigdor, N. (2012, October 17). McMahon Reverses Stance on Hospital Birth Control Mandate. Retrieved from Connecticut Post: http://www.ctpost.com/news/article/McMahon-reverses-stance-on-hospital-birth-control-3954682.php


[1] Here Koster was attempting to elucidate his position on abortion, indicating that he would support an allowance if a mother’s life was in danger but not in cases of incest or rape. According to Koster, incest occurred with such minor frequency that it was not worth including in legislation. Rape, however, was referenced repeatedly as “the rape thing,” which at best could be read as shorthand for the point of discussion (that is, rape) but at worst could be taken as a phrase that indicates a dismissive and casual attitude toward rape.

[2] Stepp notes that this is a pseudonym, which is understandable given the nature of the incident being reported. There is, however, an interesting discussion to be had regarding the way in which the use of a pseudonym can speak to the differences between empowerment as an abstract concept and embodied action.

[3] As an example, it was only in 2008 that the state of Maryland overturned an existing law that prevented an individual from revoking consent once he or she had given it (see Maouloud Baby v. State of Maryland, 2008), meaning that, until that time, individuals could not be convicted of post-penetration rape in Maryland. Here we see rape’s definition tied solely to the initial act of penetration, meaning that once consent was given to enter the body, rape could not happen even if the penetrated party changed his or her mind at a later point in time.

[4] As a side note, the Third Great Awakening happens to occur before and during World War I, but this Awakening does not maintain the same connection to war as its predecessors. Looking at the Revolutionary War (First Great Awakening) and the Civil War (Second Great Awakening), we can see that conflict was the result of an ongoing negotiation over national identity that was not present as a motivation for World War I.

[5] In a somewhat complicated extended metaphor, St. Augustine would go on to suggest that, mirroring the relationship between men and women, all of mankind constituted a type of body to the “head” of God.

[6] Progressive evangelical feminists have argued for a rereading of Ephesians in light of 5:21 (“Submit yourselves one to another in the fear of God”), suggesting that the passage actually speaks to the humbling of all humans in the face of God and calls for a renewed understanding of submission. Despite the popularity of biblical feminism in the 1970s and 1980s, groups like the Evangelical and Ecumenical Women’s Caucus and the Evangelical Women’s Caucus have declined in stature within the evangelical community, suggesting that progressive evangelical feminist discourse is not currently widely circulated.

[7] An analog for the Left might be President Obama’s understanding and deployment of social media during his election campaigns, indicative of the way in which politics commingles with the everyday practices of individuals.


Seen but Not Heard: Feminist Narratives of Girlhood


Caitlin Moran, How to Be a Woman, New York: Harper Perennial, 2011, 320 pp., $15.99 (paperback).

Peggy Orenstein, Cinderella Ate My Daughter:  Dispatches from the Front Lines of the New Girlie-Girl Culture, New York: Harper, 2011, 256 pp., $25.99 (hardcover).

Reviewed by

Chris Tokuhama

University of Southern California

           The first thing one needs to know about Caitlin Moran’s How to Be a Woman (Harper Perennial, 2011) is that it is not an academic book, nor does it claim to be. Moran, a columnist for the London paper The Times, rightly asserts that the movement of feminism is too important to be discussed solely by academics and endeavors to use vignettes from her life to illustrate particular ways in which the question of feminism meaningfully infiltrates the everyday lives of ordinary individuals. In and of itself, this effort represents a perfectly admirable attempt to reintroduce notions of feminism into mainstream culture, but good intentions can only carry one so far.

            Ultimately, when boiled down to its purest essence, Moran’s assertion that she has “stuff to say” (12) is really what this book is all about. Moran has assembled a collection of shorter pieces loosely linked by the fact that they all derive their thrust from a moment in which an experience has given her some insight into the condition of being a woman—and a pointedly white and heterosexual one at that—in the United Kingdom. Given Moran’s background as a columnist, one is not surprised that her book should take this form and, indeed, one might be inclined to deem the project successful if the book were conceived simply as a memoir of sorts. Instead, however, Moran positions her book in the tradition of the feminist practice of consciousness raising and readers must question what sorts of insights are gained from perusing this particular text.

            “But wait!” Moran might argue, “I’m not a feminist academic!” (12) And she would be correct in that assertion. What the caveat does not excuse, however, is a demonstrated lack of rigor in thought or practice. As one example, Moran cites an “Amnesty International survey that found that 25 percent of people believe a woman is still to blame for being raped if she dresses ‘provocatively’” (203), which might very well be true, but Moran does not provide any means to verify such a statement. It is precisely because feminism is such an important issue that Moran should do her due diligence and not allow her position to be undermined by an easy attack; Moran should force her detractors to confront her ideas and not her evidence, which is frustrating since Moran has some really good ideas.

            For example, one of the themes that runs throughout Moran’s book is the way in which being a woman (i.e., female identity) is manifested through, and displayed on, the body and that women’s internalized sense of how to appropriately discipline their bodies plays a key part in becoming a woman in the United Kingdom and the United States. Pubic hair, in particular, occupies a bit of Moran’s attention as its initial appearance and subsequent removal remain closely linked to conceptualizations of womanhood and femininity. A notable section of Moran’s second chapter discusses how technical considerations of shooting pornography marketed to heterosexual men—again we must be wary that this constitutes conjecture, for Moran provides no sources—have been imbued with a layer of cultural meaning that consequently influences women’s grooming habits. Women, in short, are affected by a cultural product that likely does not have their best interests in mind and it is precisely this type of revelation that illustrates the continued relevance of feminism. And yet it is also interesting to note when and where Moran draws arbitrary lines:  pubic hair, for example, should be trimmed but not waxed. But why, you might ask? Here Moran misses an opportunity to discuss the larger implications of the way in which women (and men) have been socialized to relate to women’s bodies and although Moran correctly notes that pubic hair is different from other forms of ancillary hair in that it is sexualized, she fails to touch on the broader issue of how hair management (of which trimming would surely be included) is related to perceptions and enactment of femininity.

            And of course it would seem rather impossible to discuss female bodies and femininity without broaching the subject of the vagina. Moran muses on a conversation with her younger sister, “So now, in 1989, we have no word for ‘vagina’ at all—and with all the stuff that’s going on down there, we feel we need one.” (56) Although Moran goes on to talk about the various euphemisms that women have for their vagina, she does not touch upon the way in which this practice points to the crucial role that language plays in configuring, maintaining, and enacting the relative subjugation of women. Moran notes, for example, that terms describing the vagina have the ability to cause discomfort but ultimately portrays this phenomenon as empowering—she notes that the most offensive male counterpart is “dick”—and does not discuss the ways in which this difference is actually indicative of a problem. What does it mean, for example, that we are much more familiar with, and accepting of, dicks? Nothing particularly good for women, most likely. Compounding the problem, Moran makes a critique about how “pussy” evidences a disconnect between women and their vaginas but does not comment on the way in which a refusal to embrace “vagina” ultimately leads to the same conclusion. Here, instead of making a compelling argument about the way in which language can be used to excavate relationships, Moran merely produces a polemic about the vagina’s various names that ultimately boils down to a description of her personal taste without investigating how her taste—and here one might certainly nod toward Bourdieu—was cultivated in the first place.

And the vagina performs a key function for Moran as she provides a catchy, if unhelpful, survey on page 75 to determine if one is in fact a feminist:  Do you have a vagina? Do you want to be in charge of it? (Feminists, by the way, should answer “yes” to both.) Like Naomi Wolf’s Vagina, we see a way in which the vagina is made to stand in for the entirety of womanhood, essentially reducing the meaningful elements of a woman to her vagina. Surely, this is a provocative question but not incredibly feminist in the long run. Moreover, what about those who do not have a vagina (e.g., men)? How are they supposed to figure out if they are feminists or not? Compounding the problem, we must investigate what it means to be in charge of one’s vagina:  in the abstract, one might state that being in charge means that one should be able to do whatever one likes with one’s vagina but we are left to question how such a practice manifests in the real world. Here, Moran’s ambiguity allows her to assume a position that is difficult to counter for who would argue that women should not have control over their own bodies in theory? Moran provides a good sound bite that is ultimately meaningless, however, for there are many ways in which the actions of men (and women) do not evidence a belief that total control of the vagina belongs to the women who bear them.

And yet perhaps the most problematic way in which Moran’s ambiguity affects her writing rests in the rather casual way she employs the term “the patriarchy.” On one hand the term is easy enough to define, but where Moran fails is in her refusal to explain exactly what “the patriarchy” encompasses; patriarchy manifests in a variety of forms and through myriad agents as it operates on individual, interpersonal, and institutional levels. Here one senses another distinct limitation in her work:  How to Be a Woman is written by Moran to other women like Moran. For Moran, “the patriarchy” does not need to be defined because its meaning is assumed. Ultimately Moran’s overly simplistic attempts to define feminism and patriarchy also do a larger disservice as Moran fails to address the notion that individuals may benefit from feminism without ever being feminist themselves. Moran’s assumptions about feminism occlude the nuanced ways in which individuals can work to support both feminism and patriarchal hegemony in a manner that does not produce internal conflict.

In contrast to Moran’s efforts, one feels compelled to laud a work like Peggy Orenstein’s Cinderella Ate My Daughter:  Dispatches from the Front Lines of the New Girlie-Girl Culture (HarperCollins, 2011) for its ability to use personal narrative as an entrée to discuss the way in which female gender roles are configured and interpreted on a variety of levels. Using her experiences with her daughter as a narrative backbone, Orenstein carefully develops a series of thoughts about the effect that princess culture has on contemporary children.

Primarily focused on the influence of markets, Orenstein shows how economic concerns have played a large part in shaping the world that girls experience today. From the concept of Disney Princesses as an effort to revitalize a flagging corporate consumer products division to the way in which American Girl dolls promote intergenerational female bonding through consumption to the mapping of a family’s aspirations for social mobility onto child beauty pageant contestants, Orenstein illustrates how disparate aspects of girlhood are connected to each other and to a larger system of meaning. It is precisely because of the influence of marketing, Orenstein argues, that the transgressive core of “girl power” has been eschewed for the faux empowerment of “girlz.” The insidious bargain that girls strike is to gain claims toward empowerment by using consumption to reaffirm traditional gender roles. Even as fewer opportunities become salient for young girls—here reference is made to a classroom exercise in which young girls chose to imagine themselves as a princess, a fairy, a butterfly, or a ballerina in contrast to boys who assumed a variety of roles—Orenstein explores how performance of gender has become increasingly divorced from notions of female pleasure. Particularly notable are the ways in which Orenstein uses new communications technologies like social media and picture messaging to showcase how young girls’ identities have become, in part, more externally focused with the cultivation of the self as a kind of real-time performance piece that lies parallel to one’s physical existence. Sexting, for example, is not a post-feminist celebration of the body but rather constitutes a functional practice where girls demonstrate their ability to use their bodies as means toward particular ends (e.g., keeping a boyfriend). Orenstein also suggests that young girls develop a form of internalized self-surveillance as they learn to see themselves and their bodies as others do. Connecting this to chapters on body image and princess gowns, Orenstein builds a case for how body, femininity, and self are intimately related for girls; for many girls, how one feels is related to how one perceives one’s body to look. Ultimately, Orenstein challenges readers to question exactly what kind of practical power is provided by an empowerment that continues to be grounded in perceptions of the female body.

In contrast to the ambiguity that Moran displays about being a woman at the end of her book, Orenstein develops a clear plan of action that asks individuals to consider how they participate in the maintenance of a culture that might be detrimental for girls. Although both authors ground their analysis in the trappings of everyday life, the key to Orenstein’s success is the way in which she calls for a type of engagement that extends beyond Moran’s request that readers get up on a chair and proclaim “I am a feminist!” In the end, it is a shame, really, for frank discussion of feminism’s importance is sorely needed in today’s society and Caitlin Moran owes the awkward thirteen-year-olds of the world—including the one who forms the core of her story, herself—better.


You’re Already Real — And 6 Other Things I Know At (Almost) 30

by Margaret Wheeler Johnson

My 25th birthday was one of the most magical nights of my life. It was at the sort of restaurant that isn’t supposed to exist in New York City, with a nondescript storefront on an especially barren block of 14th Street and a walled courtyard in the back that feels like France. Almost all of the people important to me came. I wore the blue and lavender party dress I wore to everything, but it seemed to fit me better that night. The whole thing was one of those occasions so perfect you think you must be dead, or bound to die in your sleep that night. What could there be after this? That’s a wrap, you think. Scene.

If only.

That night was precious to me in part because it in no way reflected what was going on the day before or after, which was that I was a total mess, and knew it. I was making mostly wrong choices and couldn’t seem to stop myself and didn’t understand why. On an almost primal level, I was certain that this was going to end very, very badly, or, even worse, it would not end. It would go on forever.

Five years later, I’m relieved to say it hasn’t. In fact, it’s ending tomorrow. Given my extensive catalog of missteps (read on), I expected my strongest emotion the day before my 30th birthday to be deep remorse for the ways in which I stalled and backtracked, for the things left undone. But very strangely, that isn’t how I feel. Instead I’m mostly astonished. Not only have I emerged from a decade that often felt like some dark, elaborate joke, with a code name like “Sisyphus” or perhaps just “New York,” but I feel like I learned some things along the way, and that they might actually be useful to someone else. Here they are, seven things I’ve learned since I turned 25:

1. You are already Real.

One of my biggest problems in my 20s, as far as I can tell, was that I had what I’ll call Velveteen Rabbit Syndrome (editors of the DSM-V, take note).

“Real isn’t how you are made… It’s a thing that happens to you,” the Skin Horse tells the stuffed hare. And just like that sawdust-filled bunny, I was waiting in my party dress, expecting someone or something outside myself to declare me real, started, begun, on my way.

But not yet. When I was ready. Which would apparently be never.

I was already real, of course — just denying it. As the Skin Horse also informs the rabbit, correctly this time, being real hurts. Accepting that you’re real means accepting that you’re going to disappoint people, and yourself, and it might go on your Permanent Record. (I am still afraid of things going on my Permanent Record, not the kind kept by law-enforcement entities or academic institutions but in the minds of others, the running tab of things that might make them ultimately decide I am not worth their time.)

It means allowing that you’re going to spill things on yourself and break someone’s heart and get yours stomped on and treat a friend or two very, very badly and accidentally reply all and be unforgivably late and have terrible sex. Things are going to be unmemorable and unphotogenic, and you won’t be able to delete or edit them, and some of it could even end up on the Internet, which is the ultimate Permanent Record.

You can pretend you’re still in the nursery, nurturing some vision of what your life will be when you decide to participate, but in the meantime you’re paying rent and bills and taxes for and on the life you’re missing, all the while knowing that no one gets away with not getting on with it, and feeling constant paranoia about that. It’s easier just to get on with it.

2. It’s not all your fault.

I believe that your 20s are supposed to be hard. But it is not helpful to have to tackle the hard stuff while being told repeatedly that you belong to a unique generation of wastrels.

Throughout the last 10 years, journalists and psychologists have raised the alarm about that mutant creature in our midst, the stalled, entitled, myopic 20-something Millennial. See: “The Narcissism Epidemic,” “Emerging Adulthood” and “Not Quite Adults,” “Meet The Twixters,” “What Is It About 20-Somethings?”

My favorite part of this obsession with the 20-somethings is the insistence that despite our laziness and navel-gazing, we are, as Jeffrey Arnett, author of “Emerging Adulthood,” told the New York Times in 2010, “extraordinarily optimistic that life will work out.” Really?

I can’t recall a single time between 20 and 28 when I had a sense that it would work out, and I evidently wasn’t alone. In “The Kids Are Actually Sort Of All Right,” a piece that looked at Millennials through their own eyes, New York magazine’s Noreen Malone quoted a 24-year-old college dropout, “Definitely, if you don’t do something, it’s not going to happen. But if you do do something, it’s still probably not going to happen.”

When you take one of the most funded, doted-on generations of Americans ever and find them saying things like that, you have problems as a society that extend far beyond a tendency toward self-absorption. You have a recession, downward mobility, a disappearing middle class. None of this is the fault of people in their 20s, but reading such extensive coverage of your shortcomings is enough to make anyone feel hopeless, and, in my case, guilty for not being more “extraordinarily optimistic.”

I want back the time in my 20s that I spent feeling bad about feeling bad.

3. You’re going to do it the way you’re going to do it.

The latest addition to all of the concerned Millennial watching is clinical psychologist Meg Jay’s book “The Defining Decade: Why Your Twenties Matter — and How to Make the Most of Them Now,” which informs 20-somethings that, supposedly contrary to all they have been told, now is the time to get a career, an apartment, a significant other, a spouse, a baby (or at least a fertility plan), and above all, a grip.

I saw her point. In many cases, I tried to make myself make the choices she recommends. And in almost every case, I walked straight into the opposite, bad decision. I, in chronological order,

-Quit my first job out of college because I knew I was bad at it (Chapter 12)
-Wrote about that job after the fact (Jay doesn’t warn against this, but I do.)
-Spent large swaths of time “underemployed” (very-part-time copy editor, real estate assistant, interior design shop girl; Chapter 1)
-Moved back in with my parents
-Felt like every error I made at work would result in immediate firing (Chapter 12)
-Dated people with whom I saw absolutely no future (Chapter 9)
-Cohabited with someone I didn’t plan to marry (Chapter 8)

For whatever reason I insisted on living most of the last decade against my better judgment, and apparently I’m not alone. (Wasn’t this at one time referred to as “being young”?) As much sense as the path Jay outlines makes, if you don’t take it, if you take the hand-to-the-flame approach instead, you will still, for better or worse, make it to 30.

4. You (probably) know what you want to do.

I spent at least three of the last nine years resisting what I wanted to do professionally, not because I didn’t know I wanted to do it but because it seemed impossible. Where did I get off thinking I could do the hard, notoriously un-lucrative thing (write, in some fashion) that I had always wanted to do? Who did I think I was?

So I pretended I didn’t know, and tried for a while to do other things, and all that did was make me feel displaced and paralyzed and like I was somehow cheating on myself.

I’m not saying to quit your day job. For god’s sake, do not quit your day job. But once you admit what you know you want, it’s easier to go after it. In fact, you no longer have any excuse not to.

5. Don’t settle.

This is a tricky one. The under-30 set is notorious for the opposite urge, to constantly scan the horizon for something better than where you are and what you have. Many of us openly admit to an all-consuming FOMO (Fear Of Missing Out). But I think safety is equally alluring in a decade when virtually nothing seems stable, especially when it comes to relationships. It’s tempting to latch on to the person who seems right enough with the thought that you will finally have an anchor in the chaos and will never again have to date or worry that you’re never going to have babies. Some people even encourage this option. I don’t.

If you’re even remotely considering moving in with or marrying someone because you’re afraid you can’t do better, find and talk to a friend who’s had a near miss. Corner someone who got out of a relationship that had everything going for it except the most important things. Have coffee with a runaway bride. While that person is telling you the story, inhale. That’s the scent of pure relief.

Then wait to meet the person you’re sure about — not because they’re there but because they’re them.

6. It is all about the semipermeable membrane.

This is the hardest thing I know the day before 30. Life, my ninth-grade biology teacher instructed me, can be defined as genetic code surrounded by a semipermeable membrane. For a long time I did my best to ignore this fact, or at least its broader application. I didn’t have a membrane, I had a wall. When I was a teenager it was academics. School-related, in. Everything else, out. Later it was an eating disorder. Food, sex, adulthood, assorted other scary things, out. Inside, as little as possible.

But when you accept that you’re real, that your life cannot ever be this fixed, finished, perfect thing because that’s not what lives are, when you start to be intrigued by the messiness and actually want to let a little of it in, semipermeability becomes inevitable.

Here is what being semipermeable means: It means you’re your own gatekeeper. It means no one else is guarding your moat.

I’m not saying this isn’t terrifying. It is, especially if at some point earlier in your life someone whose job it was to protect you or listen when you said what you needed didn’t protect you and didn’t listen. It’s even more frightening if you were a girl raised to be nice and likeable, and watched women who spoke up for themselves get called difficult, selfish, needy in a tone that indicated their Permanent Record had become just a lit-tle too full.

If you don’t actively say no to some things and yes to others, though, you end up so overwhelmed that you make an even bigger scene, either outwardly or in your head (neither one’s a party). You can prevent all of that by simply owning that you are semipermeable, and not everybody and everything gets to come in, at least not all at once. Deciding what does and doesn’t is the most important thing you can do for yourself every day.

7. It will be okay.

Most prosaic four words ever, right? And what do I know? I could be hit by a bus tomorrow, and that would decidedly not be okay.

I can say, though, that as I worried through my later 20s, convinced that I had many times over Ruined Everything, something bizarre began to happen. The disillusionment lifted enough to let a little bit of ambition back in, and a sense of possibility. I started to have the strange sense that I could fail without being a failure. Maybe this was my irrational Millennial optimism finally kicking in, but for the first time in my life — literally, the first time — I found myself imagining the person I might be at 35 and 40 and 60.

I didn’t arrive at that in a way Meg Jay, Ph.D. or I, for that matter, approved of in the moment, and yet here I am, 30, with a sense of a future. So I can tell you: It will be okay. It will be okay. It will be. Okay.


To Be Free Is Free to Be

My provocation is this:  utopia is not the place to go looking for freedom. At least not the right kind of freedom. Ironically, I think, we should examine that which is so often associated with oppression, submission, and silence—dystopia.

 The idea for this paper came to me a year ago while watching an episode of Caprica, a spin-off of Battlestar Galactica. Here, Tad (gamertag:  Hercules) turns to Tamara and says:

“Look, I know this must seem really random to you, but this game—it really does mean something to me. It actually allows me to be something.”

Without pausing she fires back:

“Maybe if you weren’t in here playing this game you could be something out there, too.”

I think this exchange points to an interesting way in which the relationship between youth and the world is often cast:  youth are dreamers and cultivate their online selves at the expense of their real lives. But I think that this distinction between virtual and real is growing false and that the development of youth’s relationship with the intangible has everything to do with their relationship to the real.

Truth be told, this is actually my favorite episode of the series and it takes its name from a poem, “There Is Another Sky”:

There is another sky

Ever serene and fair

And there is another sunshine

Though it be darkness there

Never mind faded forests, Austin

Never mind silent fields

Here is a little forest

Whose leaf is evergreen

Here is a brighter garden

Where not a frost has been

In its unfading flowers

I hear the bright bee hum

Prithee, my brother

Into my garden come!

All of this from a woman who would never see the garden for herself.

But that’s sort of exactly the point, right? I mean, Dickinson and Tad are my people—they are the ones who are mired in the dark and they are the ones searching for a light, something more, something better. Something like a utopia.

And what is Dickinson’s garden, really, other than a form of utopia? Hearing those words, we picture a pastoral safe haven that is admittedly different from the technological utopias that we’ve been discussing in class but definitely a vision for a world that is better.

The trouble is that our utopias rarely come alone:  utopias are born out of dystopias, slide into dystopia, and maintain a healthy tension by threatening to turn into dystopias. As I’ve thought about this over the course of the semester, I have come to wonder if all utopias are in fact false, for one person’s utopia is easily another’s dystopia. So we have this back and forth that is, as we have seen, instructive, but I’m most interested in scenarios like those in Nineteen Eighty-Four and Brave New World wherein an established utopia sets the scene for what has become a dystopian nightmare.

Somewhat like the life of a teenager. Tyler Clementi was perhaps the most high-profile case in a string of gay teen suicides that occurred in the fall of last year. At the time, I can remember being incredibly upset—not at Dharun, Clementi’s roommate—but at myself and my colleagues. “This death is, in part, on all of us,” I remember telling my peers, for these are the kids that we are supposed to be advocating for and we have failed to change the culture that causes these things to happen. We’ve known about bullying in schools for a long time and we can take steps to alter that, but we can also work to make youth more resilient.

Looking to do just that, columnist Dan Savage started a project called “It Gets Better” that attempted to convince gay youth to stick around because, well, “it gets better.” Once the initial goodwill wore off, I began to get increasingly upset at the project—not because the intent was unworthy but rather because the project showed a certain lack of understanding and compassion for those it was actually trying to help.

Telling a teenager that things will get better somehow, someday is like telling him that things will get better in an eternity because every day is like a million years. Telling a teenager your story means that you are not listening to theirs. And what about all those youth who don’t feel like they can tough it out until they can leave? They feel like failures. What you’re really after with this whole thing is hope, but I think that the efforts are misguided.

I was frustrated because this position caused youth to be passive bystanders in their own lives—that one day, they’d wake up or go off to college and things would magically get better. There might be some truth to that but what about all of the challenges that youth have yet to face? Life is hard—for everyone—and it’ll kick you while you’re down; but we need to teach our youth not to be afraid to get back up because the wrong lesson to learn from all of this is to become closed off and cynical.

So what are some of the ways that we can take a look at young adult culture and reexamine the activities that youth are already engaged in, in order to tell young people that they are valued just as they are?

For me, Young Adult fiction provides a great space in which to talk about themes of utopia/dystopia, depression, and bullying. Young Adult fiction is so much more than Twilight:  over this past summer there was a discussion on Twitter with participants employing the hashtag #YASaves. The topic was sparked in response to claims that the material in Young Adult fiction was too dark. Case in point, The Hunger Games centers on an event wherein 24 teenagers fight to the death in an arena. And I say this with the caveat that I am not a parent, but I get that position—I really do. Years of interacting with parents and their children in the arena of college admission have convinced me that many parents want the best for their kids—they want to protect them from harm—but simply approach the process in a way that I do not find helpful.

Although “freedom from” represents a necessary pre-condition, it would seem that a true(r) sense of agency is the province of “freedom to.” And yet much of the rhetoric surrounding the current state of politics seems to center on the former as we talk fervently about liberation from dictatorships in the Middle East during the spring of 2011 or freedom from oppressive government in the United States. And these sound like good things, right? But here those dystopias born out of utopias are instructive, for they show us what happens when “freedom from” collapses. Like “It Gets Better,” which forwards its own vision of a life free from bullying, the dream rots because “freedom from” leads to a utopia—a space that, by its very nature, has no exit plan.

But, to be fair, perhaps “freedom to” has a stigma, one that Dan Savage is likely familiar with.

I imagine that there is a certain amount of disillusionment with this, for “Free to Be…You and Me” has not really succeeded in convincing us that boys can have dolls or that it’s okay to cry. We are not yet truly free to be. But I would argue that it is not the concept of “freedom to” that is the issue here, it is the way in which it is defined—according to the song, it is a land where children and rivers run free in the green country.

In short, a utopia.

What if we applied what we learned from this course and, instead of a place, recast utopia as a process of becoming? A dream of perpetual motion, if you will. What if we taught youth to think about how “freedom from” mirrors the language of colonialism and instead suggested that the more pertinent issue is that of “freedom to”? Not just freedom from censorship but freedom to protest, freedom to information and access to it, freedom to be visible, freedom to be anonymous, freedom to wonder, freedom to dream, and freedom to become. We are quickly seeing that virtual spaces are becoming hotbeds for these sorts of fights and the results of those skirmishes have a very real impact on the everyday lives of young adults. If there are teens who view high school as a war zone, shouldn’t we arm them with better tactics? What if “utopian” described not a place but a type of person? Someone who fought accepted notions of the future and did not just wait for it to get better but challenged it, and us, to be better. Just maybe someone like a poet.

I opened with Emily Dickinson and I will return to her to close.

We never know how high we are

Till we are called to rise

And then, if we are true to plan

Our statures touch the skies

Take what you’ve learned from this class and encourage youth to struggle with these notions of “freedom from” and “freedom to.” Help them rise.


Not Just for Suckers

Queen Mab, faerie queen from Season 4 of True Blood

As I’ve grown older, I have increasingly come to appreciate the ways in which I have managed to pursue an academic discipline that affords me the ability to watch copious amounts of television. Who would have thought that I could go to school to watch vampires on TV? And yet here I am.

But as much as I watch television for fun, I also constantly find myself turning a critical eye to the subject at hand. A long-time fan of mythology and the power of narrative, I often think about how characters and tropes in television shows reflect, articulate, and create new aspects of culture. Very much in alignment with Stuart Hall’s notion of encoding/decoding, I believe that television is dissected by viewers and the pieces are shuffled around to enact new forms of meaning.

As such, I’m quite intrigued by the viewers of shows like True Blood (HBO, 2008-present). Over the years, vampires have been theorized to embody issues of gender (e.g., Nina Auerbach’s Our Vampires, Ourselves), sexuality (Carmilla and the lesbian vampire or James Twitchell’s work on the fears of male heterosexuality), and medicine and the body (Ludmilla Jordanova’s Sexual Visions:  Images of Gender and Medicine and Science between the Eighteenth and Twentieth Centuries). And while these themes are still relevant to contemporary culture, I think that it would be interesting to investigate issues of authenticity and representation in a show like True Blood.[1]

Sookie as a Faerie, Season 4 of True Blood

Although this past season featured a number of references to illusion, appearance, and authenticity (ranging from the introduction of faeries—long known to be notorious visual tricksters—to politicians and amnesia), the series itself has also wrestled with “realness” over its run. Whether it is vampires struggling with their “true nature,” the duplicity of organized religion, or relationships wherein one is cruel to be kind, I’m curious to examine how viewers interpret themes of authenticity and employ these incidents as references or models of behavior. How, for example, do viewers navigate the multiple layers of reality that exist on the show? How do stereotypes (i.e., “this is what you say I am”) meld with religious themes (with an underlying current of “this body/life is not all that I am”) and the lingering accusation of “Are you now or have you ever been a vampire?” Are the contemporary interpretations of vampires consistent with previous ways of thinking? Who watches the show and with whom? Do viewers watch a show multiple times (and does their understanding of the show evolve)?

Admittedly, one might be able to develop a rich body of work as a result of conducting a media ethnography on a show like True Blood, but one should also be mindful of who is left out of this type of investigation; namely, one might miss the effects that a particular program has outside of its primary viewership. Obviously researchers must eventually decide whom to examine in the process of ethnography as resources are not unlimited; this reality does not, however, excuse researchers from clearly labeling the bounds of their inquiry and articulating the limitations of their work. But, if we follow the argument that media can constitute culture, we can see how individuals may interact with a particular property at the level of culture without ever viewing the source material.

True Blood (top)/True Mud (below)

Take, for example, the Sesame Street short “True Mud,” which is roughly based on HBO’s True Blood. Here we see the potential for a wonderfully rich and complex set of meanings as Sesame Street appropriates a popular (and very adult) television show in order to wink at parents who might have seen True Blood. Although a portion of parents watching the “True Mud” skit might think back to an episode of True Blood, there are also assuredly parents who understand the reference but have never seen the show or parents who have no idea that “True Mud” is a parody of anything. These parents would most likely not consider themselves viewers of True Blood but their voices might tell researchers something about how True Blood fits into a larger media ecology.

Watching the clip, one immediately begins to develop a host of questions. What is the importance of the Southern setting and what does such an environment evoke for viewers of “True Mud” and True Blood? How does this contrast in setting relate to the environment of Sesame Street, which is urban? What is the demographic makeup of audiences for “True Mud” and True Blood and how does this constitution affect the way that the South is understood in relation to the property? How and why does the concept of a vampire map onto a grouch from Sesame Street? How does this presentation of a grouch differ from Oscar?

Ultimately, interviewing viewers of a property allows researchers to develop a complex understanding of the ways in which a piece of media might influence individuals but we must also recognize that the impact of media does not just stop with those who watch it directly. References made in pop culture, interpersonal interactions, or even children’s shows indicate that media can exhibit echoes as it permeates our lives.


[1] As a side note, this theme has been building up steam for a couple of years as I am curious about the seeming need for characters who can see through the veil or otherwise ascertain a measure of “objective” truth. We’ve seen shows like The Mentalist and Psych that feature incredibly observant individuals; Lie to Me, which concerned itself with Paul Ekman’s micro-expressions and truth telling; Ringer and The Vampire Diaries, which both feature doppelgängers; and Once Upon a Time and Grimm, which both prominently feature a character who can see things that others can’t.


Twinkle, Twinkle, All the Night

There are times, I think, when all of this just seems overwhelming. With a new section each week, we are asking each of you to grapple with things that you may not have encountered before and I completely understand that this may not be easy.

But, then again, who said it was going to be?

Although you have to find the balance with this, some part of me believes that, as investigators, we should be a little overwhelmed, for it is in this moment that we begin to grasp just how large the problem really is. What was once so clear becomes infinitely murky and we struggle to find a foothold. The issues that Asian Americans face are complex and seemingly never-ending. How do I go about dismantling the myriad problems that we encounter every day? Will I even make a difference? Should I even try?

It’s taken me a few years to get to where I am now, but I have come to believe that the answer is a resounding “Yes.” I get called out for being impractical because I’m not as interested in deliverables, action items, and long-range plans; instead, I’m interested in the transformation that occurs on an individual level when one decides that he or she is capable of making a difference.

And the thing that they never tell you in school is that you don’t have to change the world in a grand way on your first go. Making a difference isn’t about spectacle and scale so much as it is about intent and meaning. There are a million ways in which one can change the world on an everyday basis that have profound and lasting implications, and it is these sorts of actions that I often think about when we come to issues of sexuality and gender in CIRCLE.

By now, all of you have gone through the exercise where we attempted to place ourselves in the mindset of someone who does not identify as straight. Although our session exhibited moments of laughter and sympathy, I hope that the exercise also went beyond this to generate a feeling of empathy. I get that it’s a bit heavy to think about some of these things on a night when you are coming off of class and looking forward to homework, but I would challenge my session to think about how they would react if they couldn’t just go home after all was said and done. How might you feel if you really had to tear off the corners of your star?

The thing that we strive to teach our students in CIRCLE is that all of these issues are linked (and, yes, messy) but that you can also apply what you’ve learned from one week to another. What if you thought about sexuality like you think about ethnicity? Students in our session can’t just stop being Asian American—just like other students can’t stop being GLBTIQ. How can you map your need to justify your worth as an Asian onto things like gender or sexuality?

But even if that’s a bit too heavy for you, I do want to mention something that I brought up at the conclusion of our session. Issues of gender and sexuality figure heavily into what I do, along with my experiences in college admission and psychology. I spend a lot of time thinking about self-harm/mutilation, eating disorders, depression, and restlessness, and about projects like It Gets Better (whose faults I can happily discuss). I spend a good deal of my time trying to think about ways to change educational policy to help students recognize and feel their own worth; I think about bullying in schools but also bullying on Perez Hilton, on TMZ, and even by Dan Savage.

One of the things that I have learned in my years of college admission is that an increasing number of students are suffering from something that I call “floating duck syndrome”—on the surface, students are serene and perfect but, underneath the water, their legs are churning. Needless to say, students have some issues. I don’t mean to imply that students will not be able to overcome these things, but I must admit that I was shocked to learn about what they were dealing with.

However, I should also mention that I am incredibly hopeful for the generation of students that is following in my footsteps. I am hopeful that students will learn to brave the dark places of themselves, secure in the knowledge that friends and family will always be there to draw them back. I am hopeful that students will come to understand who they are and accept themselves for that. And, I am hopeful that students will learn to step outside of themselves in order to offer their help to those in need. I am lucky to be in a situation where I can empower future students to realize that, although occasionally overwhelmed by adversity, they are all survivors in some respect:  any person who has ever been teased, ridiculed, outcast, or made to simply feel less than is a survivor and can embrace that. And, because you are a survivor, you have been imbued with the power to tell your story to others in similar situations in order to pull them through. Ultimately, I am also hopeful because I have learned that young people are incredibly resilient and innovative—you can accomplish some amazing things if given half a chance.

And one of those amazing things is to realize just how much power you have. As I mentioned before, you don’t have to change the world overnight, but I challenge you to realize that, just by being yourself, you possess an incredible amount of agency: each and every one of you has the power to keep at least one point of that star intact. If you so choose, you have the power to potentially save a life—and how amazing is that?

This week you were all given stars, but the thing that you need to realize—as cliché as it might sound—is that you are all, in your own way, stars. Go out there and burn bright. Shine like you’ve never had any doubt.


Watching You Watch Me Watching You

As an admission officer, you have to be quick on your feet. More often than not, you’re on your own in front of an audience that is scrutinizing your every move. What you say, how you say it, what you don’t say—these are all things that are examined for hidden meanings. Grace, poise, and enthusiasm (not unlike a beauty pageant contestant’s?) are attributes that the job demands, especially when you are trying to put out fires without breaking a sweat.

One of the most challenging experiences I ever had took place at USC’s satellite campus in Orange County. Current USC undergraduates were on hand to give local college counselors a taste of life at USC and were performing admirably until I heard those words float across the room:

“It was great to come to USC because you really got to see how the other half lives.”

I will fully admit that I hadn’t been entirely attentive to the conversation, but, with that, my attention snapped back into focus. How do you fix something like that without drawing overt attention to it? Do you just hope that people didn’t notice? Is it worse that they didn’t? How do you come back from that?

Eventually everything worked out all right and, in the long run, that moment was much more instructive for me than it was damaging:  it’s something that I’ve carried with me throughout my career and something that I think about when we come to the topic of ethnography.

It’s easy, I think, to claim that you are interested in understanding the mindset of others, but it is another thing entirely to be open to such a practice. Even if we momentarily ignore issues of assimilation and the fear of losing oneself in or to a project (as if self-identity were ever static), it is still incredibly difficult to work against a process that automatically filters perceptions through layers of developed experiences. Despite our stated intent, it may take us longer than we expected to truly begin to understand those we wish to study.

I’m looking at you, Tyra Banks.

Needless to say, Tyra Banks going “undercover” as a homeless person for a day is not a form of ethnography (although I do not think that Tyra herself would ever employ such a word). Being made up to look homeless for a day undoubtedly fails to convey the sense of hopelessness that some homeless people feel or, for that matter, even a very real sense of the pervasiveness of the issue. In fact, at their worst, Tyra’s undercover episodes are a form of stunt journalism that seeks to profit off of the very groups that she is purporting to help; entering with all of the trappings of privilege, she treats it as her duty and her prerogative to expose injustice, wrongdoing, and prejudice. This is, of course, not to suggest that the objects of her inquiry (e.g., strippers, homelessness, sexism) do not deserve inquiry, but the danger lies in individuals like Tyra believing that their investigative experiences are more meaningful than they actually are. Across instances as varied as Tyra’s episodes, colonialist literature, and It Gets Better, we see a common theme: the story of the investigators is elevated above the tale(s) of the community.

Here we understand an opportunity for ethnography to redress the situation as it reasserts the relationship of the observer to those that he or she would study. Rather than striving to remove all traces of the observer (which is probably impossible anyway), I think that good ethnography acknowledges the impact of the observer and clearly outlines ways in which the observer’s presence might alter outcomes and how the observer’s perception of events is framed by personal history.

So as I sat in an after-school tutoring session, I found myself racing to take four sets of notes:  observations, possible meanings of what I saw, implications of those actions, and a running account that attempted to explain why I perceived things in the way that I did. In essence, I made a series of passes, adding additional layers of information each time I revisited my notes. Although this process would have ideally been aided by audio/visual recording, I think my mini-ethnography was quite instructive as I began to think about what things were worth recording (and which I had to let go because I just couldn’t keep up) and also how to rapidly shift between different layers of analysis. After three hours I found myself exhausted but with an interesting record of how the students in this center interacted with one another and their tutors; in my hands I held a formal record of what educators learn to do instinctively as they evaluate and assess each of their students (and themselves). Some students were easily distracted but amazing when focused, some were great motivators but not great leaders, some were bored when working with tutors but animated when teaching their peers, and some just seemed to feel uncomfortable in larger groups. To their credit, tutors seemed to have picked up on many of these traits (and undoubtedly more that I couldn’t even begin to see) and adjusted their mannerisms as they moved back and forth between students:  to some they were kind, others stern, still others saw a stern exterior interrupted with sly smiles. Although I didn’t have the opportunity to interview the tutors after I observed them, I wondered how much of this process was automatic for them. Did they consciously consider how to best handle a student or did they just seem to “know” what to do? Had they, as teachers, done an exercise like this before? Did this sort of self-reflexivity make them better teachers? How had these volunteers grown into their jobs as educators? Did the skills exhibited in the tutoring center translate to a classroom?

I suppose there’s always next time…


How to Save a Life

I fully admit that this is not mine, but I think it raises many good points about the nature of the project and its dialogue. While I certainly don’t think that the project comes from a place of ill will, it may be somewhat misguided. Or, more accurately, I think that the scope of what this whole thing is trying to do is limited and the project is unable to recognize its own bounds.

Fuck No, Dan Savage!

queerwatch:

“Why I don’t like Dan Savage’s “It Gets Better” project as a response to bullying

(Ten Points, in order of appearance)

1. The video promotes metro-centric and anti-religious sentiment. By aligning their bullying with the religiosity and “small-town mentality,” Dan and Terry tacitly reinforce the belief (especially rampant in queer communities) that the religious and the rural are more bigoted.

2. The message is wrong. Sometimes it gets better– but a lot of times it doesn’t get any better. Emphasizing that things will improve upon graduation is misleading both to young folks struggling and also to people with privilege who are looking on (or looking away).

3. Telling people that they have to wait for their life to get amazing–to tough it out so that they can be around when life gets amazing– is a violent reassignment of guilt. Dan Savage telling kids that if they don’t survive their teenage years they’re depriving themselves? What kind of ageist garbage is that? This quietly but forcefully suggests that if you don’t survive, if you don’t make it, it’s your own fault. It blames the queer for not being strong enough to get to the rosy, privileged, fantasy.

4. Stories of how your mom finally came around, over-write the present realities of youth. Arguing that in the future, the parts that hurt will be fixed, not only suggests that folks shouldn’t actually inhabit their own suffering but it also suggests that the future is more important. For a lot of folks, it doesn’t matter if your mother might come to love you and your spouse. It matters that right now she does not love you at all.

5. The rhetoric about being accepted by family, encourages folks to come out– even when coming out isn’t a safe idea. There is no infrastructure to catch you when your family reacts poorly. There is no truly benevolent queer family, waiting to catch you, ready to sacrifice so you can thrive. For a lot of folks, coming out doesn’t only mean that your parents will promise to hate your lovers– it means violence, homelessness, abuse.

6. Bar story: vomit. It’s no coincidence that this is the first place where Dan and Terry mention queer space. Codified queer-space, restricted to 21+, w alcohol? Try again.

7. We shouldn’t be talking, we should be listening. Telling our own stories from our incredibly privileged positions, overwrites youth experience.

8. Stories of over-coming adversity: no thank you. Narratives of how life was hard but now is good belittle lived pain, imply that a good ending is inevitable, and also undermine the joy and happiness in even bullied kids’ lives.

9. There is actually no path to change in this vision. Promoting the illusion that things just “get better” enables privileged folks to do nothing and just rely on the imaginary mechanics of the American Dream to fix the world. Fuck that. How can you tell kids it gets better without having the guts to say how?

10. Then we get a baby and go to Paris? WTF? This is a video for rich kids for whom the only violent part of their life is high school. It’s a video for classist, privileged gay folks who think that telling their stories is the best way to help others. Telling folks that their suffering is normal doesn’t reassure them– it homogenizes their experience. It doesn’t make them feel like part of a bigger community, it makes them feel irrelevant.

Plus three (with a little help from my friends)

1. When we treat campaigns like this like they’re revolutionary, they undermine all the really amazing work that the youth already does for itself. Too often in the LGBT world, we are asked to thank our brave queer activist ancestors who made the world safe for us. That does have its place. But queer youth take care of themselves. They nurture and organize and love in order to save themselves and each other. Making famous messages legible as THE messages makes youth-work look minor, haphazard, or unofficial.

2. Campaigns like this lump everyone together. They don’t honor or respect the individuals. They turn them into icons. They send confusing messages that we only attend to folks when they’re dead– when giving care doesn’t actually take anything out of us.

3. Broadcasting your story into the world, or congratulating others for broadcasting theirs is an anesthetized, misguided approach to connecting. We should help folks feel seen— by trying our hardest to see them.

It has been my experience that people are ashamed to help the folks they see as destitute. They are willing to let someone crash on their sofa for a night if they know that they have a back-up bed, somewhere else. They are happy to provide dinner, so long as they know you would be eating even without their generosity. It seems that if you’ve never been homeless or lost or hungry, if you don’t know what that feels like, it is too embarrassing to give things to people who might die without them– it is humiliating to hand someone the only food they’ve had all week.

No one is skittish about giving things up so that others can live comfortably. But they are unspeakably afraid of giving away something so someone can merely live. Campaigns like this exacerbate these realities by dehumanizing the people they address, turning them into a depressing mass, ready to be farmed for beautiful tragedies, and transformed into class-passing, successful adults.

How about instead of hope: change. Even if it’s really small change. Even if it doesn’t inspire anyone and no one is grateful and no one even notices. How about doing the kind of work that makes differences in people’s lives without holding them responsible—without turning them into an icon of suffering or of hope, without using their story for a soundbite, without using their life as your proof of goodness, or of how the world is so liberal, or how it’s great to be gay. I mean money. I mean listening. I mean time. I mean giving people space that we respect and don’t enter. I mean listening to needs and finding ways to fill them.

How about instead of honoring the bravery of youth and the sadness of our times: respecting queer youth for all the incredible work they do– despite the fact that it is so rarely recognized as work, or as adequate work.

Instead of jettisoning our religion, our upbringing, our origins: a cohesive self.

Instead of narratives of suffering and then, finally, success: a celebration of the pain and pleasure throughout.

And listening– way more listening. Because telling your personal story of adversity from a place of privilege, might have a lot of applications, might be asked of you perpetually, might seem alluring because it’s so often milked from us. But it’s not the way. Saying, “I know how you feel, because I used to feel that way, and let me tell you, I don’t feel that way anymore,” doesn’t help, it hurts. You’re dwelling in the present. Don’t insist that those in pain relocate themselves to the future.”

——————————————————————————————-

I really relate to the critical commentary on the It Gets Better project. I feel like my rural upbringing was in many ways a product of the gay rights movement settling down in urban areas and abandoning the rest of the country, leaving it without safe spaces or infrastructure and promoting a binary attitude: be closeted and rural, or run away to the city and the university to have rights, be happy, and function. When we don’t return to our origins, to the communities we come from, we deprive those we leave behind of the richness of diversity and wisdom that comes from experience; moreover, those who leave fail to see the beautiful possibility of queer and trans rural youth who live, survive, and thrive, and they remain blissfully ignorant of the continual struggles of populations who deal with even more barriers and bigotry.

My town is a three-hour drive from San Francisco. I read the following on Wikipedia under the entry for Trannyshack, a SF-based drag venue regarding a tour they took: “Trannyshack also holds the annual Trannyshack Reno bus trip. Hosted by Trannyshack veteran Peaches Christ and held over Easter Weekend, participants are encouraged to dress and act as outrageously and/or provocatively as possible and imbibe alcohol heartily over the course of the weekend. During the ride from San Francisco to Reno, ***the tour bus makes several pit stops in relatively conservative places such as Placerville and Donner Pass, designed partially to get a rise out of small-town locals and unsuspecting travelers, all in real life scenes reminiscent of Priscilla, Queen of the Desert***. The culmination of the event is a special Trannyshack show at a Reno nightclub, followed by Easter Sunday brunch the next day at a local casino.”

I was so sad that I had missed this group of fabulous queens and kings…but also frustrated that they only came by my rural town of Placerville, in order to “get a rise” out of the ‘conservative’ population—what about actually networking with the rural town’s queer, trans, and allied populations, WHO EXIST, and are generally without resources and lack fabulously queer entertainment??? I would have loved them to perform for us, to have the opportunity to speak with them. To show them that queers exist beyond the city limits.

Beyond this note, I think that the argument that Dan Savage and crew are making about how queer life improves linearly with time ignores the experiences, past and present, of queer and trans elders/seniors, whose needs are not part of the mainstream gay rights movement’s agenda—are they really “better off” because they are no longer queer youth???

And for all the awesome power of the online video platform he uses, the self-replicating-ness of the video testimonial doesn’t really do much beyond go in a circle like a dog chasing its tail—what kind of policy change, structural change, cultural shift is he advocating? How do Dan Savage’s friends from similarly privileged backgrounds telling a similar story mobilize and organize the viewers to act?

—Zoe Melisa


Points for Trying?

The obvious answer is that if early Science Fiction was about exploring outer space, the writings of the late 20th century were largely about exploring inner space. More than just adventure tales filled with sensation or exploration (or cyberpunk thrill), the offerings that I encountered also spoke to, in a way, the colonizing of emotion. Thinking about Science Fiction in the late 20th and early 21st centuries, I wondered how some works spoke to our desire for a new form of exploration. We seek to reclaim a sense of that which is lost, for we are explorers, yes—a new form of adventurer who seeks out the raw feeling that has been largely absent from our lives. Jaded, we long to be moved; jaded, we have set the bar so high for emotion that the spectacular has become nothing more than a nighttime attraction at Disney World.

At our most cynical, it would be easy to blame Disney for forcing us to experience wonder in scripted terms with false emotion constructed through tricks of architectural scale and smells only achievable through chemical sleight of hand. But “force” seems like the wrong word, for doesn’t a part of us—perhaps a part that we didn’t even know that we had—want all of this? We crave a Main Street that most of us have never (and will never) know because it, in some fashion, speaks to the deeply ingrained notion of what it means to be an American who has lived in the 20th and 21st centuries.

For me, there are glaring overlaps with this practice and emotional branding, but what keeps me up at night is looking at how this process may have infiltrated education through gamification.

Over the past few years, after reading thousands of applications for the USC Office of Undergraduate Admission, I began to wonder how the college application structures students’ activities and identities. On one hand, I heard admission colleagues complaining about how they just wanted applicants to exhibit a sense of passion and authenticity; on the other, I saw students stressing out over their applications and their resumes. The activities that I was seeing were impressive and students seemed to devote large amounts of time to them, but I often wondered, “Are they having any fun?”

Were students just getting sucked into a culture that put a premium on achievement and not really stopping to think about what they were doing or why? We can talk about the positive aspects of gamification, leveling, and badges, but as the years wore on, I really began to see titles on activity summaries as things that were fetishized, obsessed over, and coveted. Students had learned the wrong lesson—not to suggest in the slightest that they are primarily or solely responsible for this movement—going from a race to accumulate experience to merely aggregating the appearance of having done so. How could I convince them that, as an admission officer, it was never really about the experience in the first place but instead about how a particular activity provided an opportunity for growth? It was—and is—about the process and not the product.

But, that being said, I try not to fault students, for the very actions that frustrated me as a reader are reinforced daily in all aspects of education (and life in general). Processes are messy, vague, and fluid while products are not. How would one even go about conceiving a badge for emotional maturity? Would one even want to try?

Perhaps I am clinging to notions of experience that will become outdated in the future. Science Fiction challenges us to consider worlds where experiences and memory can be saved, uploaded, and imprinted and, really, what are recreational drugs other than our clumsy attempt to achieve altered experiences through physiological change? I don’t know what the future will bring, but I do know that my former colleagues in admission are likely not thinking about the coming changes and will struggle to recalibrate their metrics as we move forward.


Always Starting Over But Somehow I Always Know Where to Begin

As students in my section undoubtedly were aware, the Critical Analysis of Social Issues (CASI) model is one that I struggle with—mostly because, I think, of the word “context.” The trouble is that the word is much too broad to mean much of anything for me: I can talk about unequal power structures or socio-historical background…but aren’t these all forms of context? I understand events like the Irvine 11 as situated in a number of overlapping contexts: political, economic, social, historical, geographic, and temporal. Moreover, the way in which I choose to examine any particular issue also brings with it a certain set of affordances and limitations—I must remember that I too am a sort of context, for the event is being interpreted through a series of lenses and filters that have developed out of my personal combination of experiences.

But I do not mean to imply that this effort is unworthy just because it is limited or because it is difficult. I think of critical thinking as a series of skills or tools that one can employ in order to contemplate an issue from multiple angles. The biggest challenge for our group seemed to be where to begin: with so many questions floating in the air, how does one even begin unpacking it all? Every answer is necessarily connected to another and it seems like a ball of string that folds back in on itself, offering no place upon which to perch. The answer, for me, is to begin analyzing something along one line of inquiry, knowing that your work will be incomplete, but moving along anyway—you can, after all, always go back and add to what you have uncovered. Only through practice does the plodding turn into instinct.


Stay Classy

So although class and immigration are not necessarily my areas of expertise, I’m going to go ahead and give this one a shot with the caveat that I have not done extensive amounts of outside research.

In and of themselves, class and immigration exist as two fairly large and complicated issues in contemporary America. Looking at the current state of politics, it seems hard to ignore either, with proclamations of “class warfare” flying, Occupy Wall Street (not to mention parallel movements in major cities around the world, on Sesame Street, and in Education), the 99%, the 53%, the DREAM Act, and immigration legislation…the list goes on and on. We can employ the CASI model from last week to begin analyzing the question in terms of economics and politics, but I also notice that students in our session spoke to notions of cultural capital.

Although there is a rich history on the subject, I encourage students to think about how cultural capital represents one of the ways in which one can compare differences in class/immigration status.

Stolen from Wikipedia

Cultural capital (French: le capital culturel) is a sociological concept that has gained widespread popularity since it was first articulated by Pierre Bourdieu. Bourdieu and Jean-Claude Passeron first used the term in “Cultural Reproduction and Social Reproduction” (1973). In this work he attempted to explain differences in children’s outcomes in France during the 1960s. It has since been elaborated and developed in terms of other types of capital in The Forms of Capital (1986), and in terms of higher education, for instance, in The State Nobility (1996). For Bourdieu, capital acts as a social relation within a system of exchange, and the term is extended ‘to all the goods material and symbolic, without distinction, that present themselves as rare and worthy of being sought after in a particular social formation’ (cited in Harker, 1990:13), and cultural capital acts as a social relation within a system of exchange that includes the accumulated cultural knowledge that confers power and status.

Those researchers and theorists who explore or employ Bourdieu’s theory use it in a similar way as it was articulated by Bourdieu. They usually apply it uncritically, and depending on the measurable indicators of cultural capital and the fields within which they measure it, Bourdieu’s theory either works to support their argument totally, or in a qualified way. These works help to portray the usefulness of Bourdieu’s concept in analysing (mainly educational) inequality but they do not add anything to the theory.

One work which does employ Bourdieu’s work in an enlightening way is that of Emirbayer & Williams (2005) who use Bourdieu’s notion of fields and capital to examine the power relations in the field of social services, particularly homeless shelters. The authors talk of the two separate fields that operate in the same geographic location (the shelter) and the types of capital that are legitimate and valued in each. Specifically they show how homeless people can possess “staff-sanctioned capital” or “client-sanctioned capital” (2005:92) and show how in the shelter, they are both at the same time, desirable and undesirable, valued and disparaged, depending on which of the two fields they are operating in. Although the authors do not clearly define staff-sanctioned and client-sanctioned capital as cultural capital, and state that usually the resources that form these two capitals are gathered from a person’s life as opposed to their family, it can be seen how Bourdieu’s theory of cultural capital can be a valuable theory in analysing inequality in any social setting.

In many ways, cultural capital is encapsulated in the types of things that one just knows as a result of one’s upbringing. Knowing how to voice one’s political opinion, how to navigate city government, and how to blend into the public are all forms of cultural capital, and I would suggest that it is fruitful for students to contemplate how their sense of accrued cultural capital intersects with power.


Hunger Games ID

QR code for Hunger Games ARG…! Scan, please, if you have a chance!


Love out of Nothing at All?: A re-examination of popular culture’s presence in the college application

Key phrases:

College application essay, identity as narrative, popular culture, digital media literacy, self-branding

Session type:

Structured talk (30 minutes), discussion (30 minutes)

Target audience:

Secondary school counselors, CBOs

Abstract:

Harry Potter. Twilight. Video games. Twitter.

 The media environment that surrounds today’s applicants seems rife with topics that likely sit high atop lists that solemnly declare, “Bad Essay Ideas.” And, perhaps, not without reason, for the typical college application essay is one that often treats these subjects (along with more traditional ones like leadership, sports, or community service) lightly, evidencing a cursory understanding of the material at best. Students seem to struggle to infuse meaning into activities that appear on resumes, attempting to convince admission officers—and perhaps themselves—that these pursuits constituted time well spent.

 But what if we could encourage students to rethink their engagement in these activities, while also challenging them to respond to the question, “Why does this matter?” Instead of asking students to conform to a process that privileges particular activities over others, how might we inspire young people to cultivate genuine interests while simultaneously thinking critically about the implications of their actions? Similarly, how might we encourage adults to recognize the potential nascent political themes of Harry Potter, see young people negotiating family structures and gender roles through Twilight, witness creativity and collaboration through video games, and understand how Twitter can develop the skill of curation? Instead of promoting the chasm between digital media/popular culture and education, how can we use the space to promote the skills that our students will need to be competitive in the 21st century?

Description:

College attendance and completion (at a four-year institution) has come to represent a significant demarcation in American society, with studies showing a positive correlation between attainment of a bachelor’s degree and total lifetime income. But more so than a mere economic advantage, higher education represents an opportunity for social mobility and the accumulation of social/cultural capital. If we accept that college attendance represents at least a partially transformative experience, we realize that understanding who is accepted is important.

Informal reports from educators (and opinion pieces in The Chronicle of Higher Education) have hinted that the current generation of college students displays a wide range of skills and intelligences but also appears to be distracted by social media platforms such as Facebook and Twitter while in class, suggesting that digital media is generally seen as inhabiting a space separate from education (although this might be changing, albeit slowly).

However, I suggest that some of the types of skills professors desire (e.g., critical thinking, academic inquiry, engagement, and risk-taking) can be, and are, cultivated through pop culture and digital media use/production, but it is my belief that, as a whole, the undergraduate admission process systematically devalues participation in such spaces, privileging more traditional—and readily understood—activities. There seems to be a potential disconnect, then, between selection criteria and the skills that schools hope to attract; if an institution values traits like proactivity, are admission officers fully sensitive to the range of ways in which such a trait might present or manifest? Or have we become overly reliant on quantitative measures like GPA and test scores and the relative stability they purport to provide? If such a bias exists, a possible effect of the college application structure (and the American educational system) is to cause those involved in the admission process to internalize a mental barrier between digital media and education.

It seems evident that the admission selection process (as reflective of an institution’s values) plays a large part in shaping who is able to attend a given school. Highly-selective schools, however, seem to have a disproportionate amount of influence in American culture as their practices create a stance that other colleges and universities either aspire to or react against. Therefore the position that highly-selective institutions take on the integration of digital media and education likely has a trickle-down effect on the admission profession as a whole and is likely internalized by college counselors and high school students who aim to be accepted by these schools.

Ultimately, I hope to foster discussion among high school students, high school college counselors, and admission officers that examines how we collectively conceptualize and articulate the value of the connection between pop culture, digital media, and education. I argue that higher-order skills can be cultivated by youth practices such as remix but that incongruent language employed by youth and adults makes recognition of this process difficult. After giving a short talk that explores the ways in which the everyday practices of youth can be seen as valuable, I will ask participants to join in a discussion that seeks to uncover strategies to enable youth to articulate their process and how we can challenge our peers to become more sensitive to the manifestation of traits that mark a “successful student.”

Biography:

A 6-year veteran of undergraduate admission at the University of Southern California (Los Angeles, CA), Chris Tokuhama was responsible for coordinating the University’s merit-based scholarship process and 8-year combined Baccalaureate/M.D. program. Working closely with high school populations, Chris became interested in issues that ranged from self-harm to educational access and equity, which has helped to inform his current research interests in digital media literacy, learning, and youth cultures. In addition to his role as an advocate for youth in Education, which included a Journal of College Admission publication on the effects of branding in the admission process, Chris studies the relationship of personal identity to the body as a doctoral student in USC’s Annenberg School for Communication and Journalism. Employing lenses that range from Posthumanism (with forays into Early Modern Science and Gothic Horror), to the intersection of technology and community in Transhumanism, to the transcendent potential of the body contained in religion, Chris examines how changing bodies portrayed in media reflect or demand a renegotiation of the sense of self, acting as visual shorthand for shared anxieties. When not pursuing his studies, Chris enjoys working with 826LA and drinking over-priced coffee.


In the Affirmative

Race is one of those things that immediately causes most people to take a position. We have all grown up in a world that is still struggling with racial equality and we have all been exposed to the racial profiling that took place after 9/11. Outwardly, we all recognize that it is no longer PC to call someone by a racial slur or to discriminate in an overt manner—and this is where we begin to enter dangerous territory.

Many of my students have grown up in an environment that shuns racism; we all profess to believe in equality. We think the lack of lynch mobs or ethnic cleansing in our surroundings means that we’ve somehow moved past all of this. But we still have Minutemen, we still have genocide, we still have the KKK, and we still have people dragged behind pickup trucks with their faces melting against asphalt. We exist in a country that is becoming more polarized than ever and it is frankly a little frightening. We are learning to turn our backs on each other and to form communities that subscribe to the same beliefs that we do.

Racial issues affect all of you.

If you think that this statement is untrue, look at the world around you. Think about your place in your community and the niche that is carved out for you by others. Where does society tell you that you can exist? What is it safe for you to be? How much of this is determined by your physical features?

On a related note, the concept of Affirmative Action was explored by Thursday’s session—something that I happen to know a little about. Some students voiced concerns over the practice while others stated that they did not support it. Let me start off by saying that I get where these students were coming from as I was no different in college. Like it or not, however, all of you have been affected by Affirmative Action. USC as an institution values diversity and practices Affirmative Action; the term, however, does not mean what most people assume it to. In our eyes, Affirmative Action is about providing equity and access to education. You might think that such programs lend a helping hand to indigents at the expense of “more qualified” individuals; I would challenge you, however, to think about what makes one student more qualified. Is it test scores? Is it GPA? Is it the fact that you went to a fancy prep school and deserve to be at USC? Do you think that this somehow makes you better than someone else?

Now think about how many other people are just like you.

Affirmative Action aims to recognize the strengths that different individuals can bring to the table. Do Latinos and Blacks who have had to struggle to finish high school have a different perspective on the world than Asians (who might have benefited from positive aspects of the Model Minority myth)? Do these students see things in a way that you don’t? Is there a benefit to interacting with them and learning how other people think?

Affirmative Action doesn’t just apply to Blacks and Latinos, however. Are you Southeast Asian? Are you first generation? Are you from a low socioeconomic class? Did you have to work in high school to help your family? Are you from a state that does not typically send a lot of students to USC? Are you from a minority religion? Do you hold atypical political beliefs? Are you a female interested in Math or Science? Are you a male interested in Communication? If any of the above are true, then you have benefitted from the type of thinking that supports Affirmative Action.

Moreover, you all benefit from the diversity that Affirmative Action creates. The depth of experiences that you have at USC is in part due to the voices that we bring in. Every student has value.

And, to turn things a bit, if you think that Affirmative Action is wrong, let’s think about football. Many of the students on the team were individuals who may have scored lower than you on SATs or received lower GPAs. Why aren’t as many people upset that these “lesser qualified” people were admitted? Is it because you enjoy going to football games? Do you only extract value from people when it suits you? My point is that the entire USC community benefits from the presence of gifted athletes (who manage to graduate just fine, by the way) and that these individuals—analogous to ethnic minorities—can bring something invaluable to the table.

I think that the reaction against Affirmative Action stems from fear: we instinctively lash out in order to protect ourselves when we feel threatened by the encroachment of undesirables. We want to secure our hard-earned victories and may feel that our achievements are cheapened by the acceptance of people whom we do not respect.

Fight it.

Fight to see the similarities that you have with others; fight to see their worth. Think about how important it is for other people to see you and fight to feel the same way about others. Fight against the indoctrination that you’ve suffered for so long that has ingrained these patterns of thinking into your minds. Fight the urge to think that you’re more important than you are. Fight the need to feel comfortable and fight the urge to judge. Fight for your life and fight for your life to be the way that it should be. Fight to understand the things that we’ve been talking about this semester; fight to find meaning in our discussions. Fight to make the world better for your children, for your friends, and for yourself. Fight for people who don’t have a voice. Fight in whatever way you can…but just fight.

It has been my pleasure to work with you this semester and there’s no real way to convey how hopeful I am that this will be a turning point in your lives. I don’t expect that you’ll all become crusaders for API rights (nor should you feel compelled to), but I do hope that we’ve been able to get you to see things for the first time or to feel empowered to make change happen. Take the critical thinking skills that you’ve learned from CIRCLE and go out and find your cause. We’ve got a long way to go, but you’ve already taken the first steps.


The Real-Life Implications of Virtual Selves

“The end is nigh!”—the plethora of words, phrases, and warnings associated with the impending apocalypse has saturated American culture to the point that we have become jaded, as picketing figures bearing signs have become a fixture of political cartoons and echoes of the Book of Revelation appear in popular media like Legion and the short-lived television series Revelations. On a secular level, we grapple with the notion that our existence is a fragile one at best, with doom portended by natural disasters (e.g., Floodland and The Day After Tomorrow), rogue asteroids (e.g., Life as We Knew It and Armageddon), nuclear fallout (e.g., Z for Zachariah and The Terminator), biological malfunction (e.g., The Chrysalids and Children of Men), and the increasingly-visible zombie apocalypse (e.g., Rot and Ruin and The Walking Dead). Clearly, recent popular media offerings manifest the strain evident in our ongoing relationship with the end of days; to be an American in the modern age is to realize that everything under—and including—the sun will kill us if given half a chance. Given the prevalence of themes like death and destruction in the current entertainment environment, it comes as no surprise that we turn to fiction to craft a kind of saving grace; although these impulses do not necessarily take the form of traditional utopias, our current culture definitely seems to yearn for something—or, more accurately, somewhere—better.

In particular, teenagers, as the subjects of Young Adult (YA) fiction, have long been a focus for this kind of exploration, with contemporary authors like Cory Doctorow, Paolo Bacigalupi, and M. T. Anderson exploring the myriad issues that American teenagers face as they build upon a trend that includes foundational works by Madeleine L’Engle, Lois Lowry, and Robert C. O’Brien. Arguably darker in tone than previous iterations, modern YA dystopia now wrestles with the dangers of depression, purposelessness, self-harm, sexual trauma, and suicide. For American teenagers, psychological collapse can be just as damning as physical decay. Yet, rather than ascribe this shift to an increasingly rebellious, moody, or distraught teenage demographic, we might consider the cultural factors that contribute to the appeal of YA fiction in general—and themes of utopia/dystopia in particular—as these manifestations spill beyond the confines of YA fiction, presenting through teenage characters in programming ostensibly designed for adult audiences, as evidenced by television shows like Caprica (2009-2010).

Transcendence through Technology

A spin-off of, and prequel to, Battlestar Galactica (2004-2009), Caprica transported viewers to a world filled with futuristic technology, arguably the most prevalent of which was the holoband. Operating on basic notions of virtual reality and presence, the holoband allowed users to, in Matrix parlance, “jack into” an alternate computer-generated space, fittingly labeled by users as “V world.”[1] But despite its prominent place in the vocabulary of the show, the program itself never seemed to be overly concerned with the gadget; instead of spending an inordinate amount of time explaining how the device worked, Caprica chose to explore the effect that it had on society.

Calling forth a tradition steeped in teenage hacker protagonists (or, at the very least, ones that belonged to the “younger” generation), our first exposure to V world—and to the series itself—comes in the form of an introduction to an underground space created by teenagers as an escape from the real world. Featuring graphic sex[2], violence, and murder, this iteration does not appear to align with traditional notions of a utopia but does represent the manifestation of Caprican teenagers’ desires for a world that is both something and somewhere else. And although immersive virtual environments are not necessarily a new feature in Science Fiction television,[3] with references stretching from Star Trek’s holodeck to Virtuality, Caprica’s real contribution to the field was its choice to foreground the process of V world’s creation and the implications of this construct for the show’s inhabitants.

Taken at face value, shards like the one shown in Caprica’s first scene might appear to be nothing more than virtual parlors, the near-future extension of chat rooms[4] for a host of bored teenagers. And in some ways, we’d be justified in this reading as many, if not most, of the inhabitants of Caprica likely conceptualize the space in this fashion. Cultural critics might readily identify V world as a proxy for modern entertainment outlets, blaming media forms for increases in the expression of uncouth urges. Understood in this fashion, V world represents the worst of humanity as it provides an unreal (and surreal) existence that is without responsibilities or consequences. But Caprica also pushes beyond a surface understanding of virtuality, continually arguing for the importance of creation through one of its main characters, Zoe.[5]

Seen one way, the very foundation of virtual reality and software—programming—is itself the language and act of world creation, with code serving as architecture (Pesce, 1999). If we accept Lawrence Lessig’s maxim that “code is law” (2006), we begin to see that cyberspace, as a construct, is infinitely malleable and the question then becomes not one of “What can we do?” but “What should we do?” In other words, if given the basic tools, what kind of existence will we create and why?

One answer to this presents in the form of Zoe, who creates an avatar that is not just a representation of herself but is, in effect, a type of virtual clone that is imbued with all of Zoe’s memories. Here we invoke a deep lineage of creation stories in Science Fiction that exhibit resonance with Frankenstein and even the Judeo-Christian God who creates man in his image. In effect, Zoe has not just created a piece of software but has, in fact, created life!—a discovery whose implications are immediate and pervasive in the world of Caprica. Although Zoe has not created a physical copy of her “self” (which would raise an entirely different set of issues), she has achieved two important milestones through her development of artificial sentience: the cyberpunk dream of integrating oneself into a large-scale computer network and the manufacture of a form of eternal life.[6]

Despite Caprica’s status as Science Fiction, we see glimpses of Zoe’s process in modern day culture as we increasingly upload bits of our identities onto the Internet, creating a type of personal information databank as we cultivate our digital selves.[7] Although these bits of information have not been constructed into a cohesive persona (much less one that is capable of achieving consciousness), we already sense that our online presence will likely outlive our physical bodies—long after we are dust, our photos, tweets, and blogs will most likely persist in some form, even if it is just on the dusty backup server of a search engine company—and, if we look closely, Caprica causes us to ruminate on how our data lives on after we’re gone. With no one to tend to it, does our data run amok? Take on a life of its own? Or does it adhere to the vision that we once had for it?

Proposing an entirely different type of transcendence, another character in Caprica, Sister Clarice, hopes to use Zoe’s work in service of a project called “apotheosis.” Representing a more traditional type of utopia in that it offers a paradisiacal space offset from the normal, Clarice aims to construct a type of virtual heaven for believers of the One True God,[8] offering an eternal virtual life at the cost of one’s physical existence. Perhaps speaking to a sense of disengagement with the existent world, Clarice’s vision also reflects a tradition that conceptualizes cyberspace as a chance for humanity to try again, a blank slate where society can be re-engineered. Using the same principles that are available to Zoe, Clarice sees a chance to not only upload copies of existent human beings but to bring forth an entire world through code. Throughout the series, Clarice strives to realize her vision, culminating in a confrontation with Zoe’s avatar, who has, by this time, obtained a measure of mastery over the virtual domain. Suggesting that apotheosis cannot be granted, only earned, Clarice’s dream literally crumbles around her as her followers give up their lives in vain.

Although it is unlikely that we will see a version of Clarice’s apotheosis anytime in the near future, the notion of constructed immersive virtual worlds does not seem so far off. At its core, Caprica asks us, as a society, to think carefully about the types of spaces that we endeavor to realize and the ideologies that drive such efforts. If we understand religion as a set of beliefs that structure and order this world through our belief in the next, we can see the overlap between traditional forms of religion and the efforts of technologists like hackers, computer scientists, and engineers. As noted by Mark Pesce, Vernor Vinge’s novella True Names spoke to a measure of apotheosis and offered a new way of understanding the relationship between the present and the future—what Vinge offered to hackers was, in fact, a new form of religion (Pesce, 1999). Furthermore, aren’t we, as creators of these virtual worlds, fulfilling one of the functions of God? Revisiting the overlap between doomsday/apocalyptic/dystopian fiction as noted in the paper’s opening and Science Fiction, we see a rather seamless integration of ideas that challenges the traditional notion of a profane/sacred divide; in their own ways, the writings of religion and science both concern themselves with some of the same themes, although they may, at times, use seemingly incompatible language.

Ultimately, however, the most powerful statement made by Caprica comes about as a result of the extension of arguments made on screen: by invoking virtual reality, the series begs viewers to consider the overlay of an entirely subjective reality onto a more objective one.[9] Not only presenting the coexistence of multiple realities as a fact, Caprica asks us to understand how actions undertaken in one world affect the other. On a literal level, we see that the rail line of New Cap City (a virtual analogue of Caprica City, the capital of the planet of Caprica)[10] is degraded (i.e., “updated”) to reflect a destroyed offline train, but, more significantly, the efforts of Zoe and Clarice speak to the ways in which our faith in virtual worlds can have a profound impact on “real” ones. How, then, do our own beliefs about alternate realities (be it heaven, spirits, string theory, or media-generated fiction) shape actions that greatly affect our current existence? What does our vision of the future make startlingly clear to us and what does it occlude? What will happen as future developments in technology increase our sense of presence and further blur the line between fiction and reality? What will we do if the presence of eternal virtual life means that “life” loses its meaning? Will we reinscribe rules onto the world to bring mortality back (and with it, a sense of urgency and finality) like Capricans did in New Cap City? Will there come a day when we choose a virtual existence over a physical one, participating in a mass exodus to cyberspace as we initiate a type of secular rapture?

As we have seen, online environments have allowed for incredible amounts of innovation and, on some days, the future seems inexplicably bright. Shows like Caprica are valuable for us as they provide a framework through which the average viewer can discuss issues of presence and virtuality without getting overly bogged down by technospeak. On some level, we surely understand the issues we see on screen as dilemmas that are playing out in a very human drama and Science Fiction offerings like Caprica provide us with a way to talk about subjects that we will confront in the future although we may not even realize that we are doing so at the time. Without a doubt, we should nurture this potential while remaining mindful of our actions; we should strive to attain apotheosis but never forget why we wanted to get there in the first place.

Works Cited

Lessig, L. (2006, January). Socialtext. Retrieved September 10, 2011, from Code 2.0: https://www.socialtext.net/codev2/

Pesce, M. (1999, December 19). MIT Communications Forum. Retrieved September 12, 2011, from Magic Mirror: The Novel as a Software Development Platform: http://web.mit.edu/comm-forum/papers/pesce.html


[1] Although the show is generally quite smart about displaying the right kind of content for the medium of television (e.g., fleshing out the world through channel surfing, which not only gives viewers glimpses of the world of Caprica but also reinforces the notion that Capricans experience their world through technology), the ability to visualize V world (and the transitions into it) is certainly an element unique to an audio-visual presentation. One of the strengths of the show, I think, is its ability to add layers of information through visuals that do not call attention to themselves. These details, which are not crucial to the story, flesh out the world of Caprica in a way that a book could not, for while a book must generally mention items (or at least allude to them) in order to bring them into existence, the show does not have to ever name aspects of the world or actively acknowledge that they exist. Moreover, I think that there is something rather interesting about presenting a heavily visual concept through a visual medium that allows viewers to identify with the material in a way that they could not if it were presented through text (or even a comic book). Likewise, reading Neal Stephenson’s The Diamond Age (which prominently features a book) allows one to reflect on one’s own interaction with the book itself—an opportunity that would not be afforded to you if you watched a television or movie adaptation.

[2] By American cable television standards, with the unrated and extended pilot featuring some nudity.

[3] Much less Science Fiction as a genre!

[4] One could equally make the case that V world also represents a logical extension of MUDs, MOOs, and MMORPGs. The closest modern analogy might, in fact, be a type of Second Life space where users interact in a variety of ways through avatars that represent users’ virtual selves.

[5] Although beyond the scope of this paper, Zoe also represents an interesting figure as both the daughter of the founder of holoband technology and a hacker who actively worked to subvert her father’s creation. Representing a certain type of stability and structure through her blood relation, Zoe also introduced an incredible amount of instability into the system. Building upon the aforementioned hacker tradition, which itself incorporates ideas about youth movements from the 1960s and the lone tinkerer/inventor motifs of early 20th-century Science Fiction, Zoe embodies teenage rebellion even as she figures in a father-daughter relationship that speaks to a particular type of familial bond of protection and perhaps stability.

[6] Although the link is not directly made, fans of Battlestar Galactica might see this as the start of resurrection, a process that allows consciousness to be recycled after a body dies.

[7] In addition, of course, is the data that is collected about us involuntarily or without our express consent.

[8] As background context for those who are unfamiliar with the show, the majority of Capricans worship a pantheon of gods, with monotheism looked upon negatively as it is associated with a fundamentalist terrorist organization called Soldiers of The One.

[9] One might in fact argue that there is no such thing as an “objective” reality, as all experiences are filtered in various ways through culture, personal history, memory, and context. What I hope to indicate here, however, is that the reality experienced in the V world is almost entirely divorced from the physical world of its users (with the possible exception of avatars that resemble one’s “real” appearance) and that virtual interactions, while still very real, are, in a way, less grounded than their offline counterparts.

[10] Readers unfamiliar with the show should note that “Caprica” refers to both the name of the series and a planet that is part of a set of colonies. Throughout the paper, italicized versions of the word have been used to refer to the television show while an unaltered font has been employed to refer to the planet.


Mutable Masses?

It’s the End of the World as We Know It (And I Feel Fine)

Notably, however, the fears associated with the masses have not been limited to one particular decade in American history:  across cultures and eras, we can find episodes akin to tulip mania in which unruly mobs exhibited markedly irrational behavior. Given the recurring nature of this phenomenon, which receives additional support from psychological studies of groupthink and conformity (Janis, 1972; Asch, 1956), we might choose to examine how, if at all, the cultural critiques of the 1950s apply to contemporary society.

Recast, the criticisms of mass culture presumably resonate today in a context where popular culture holds sway over a generally uncritical public; we might convincingly argue that media saturation has produced a modern society in which celebrities run wild, flaunting sexual exploits like badges of honor, traditional communities have collapsed, and the proverbial apocalypse appears closer than ever. Moreover, having lost sight of our moral center while further solidifying our position as a culture of consumption since the 1950s, the masses have repeatedly demonstrated their willingness to flash a credit card in response to advertising campaigns and to purchase unnecessary goods hawked by celebrity spokespeople—a fixation on appearance and the image reminiscent of critiques drawn from A Face in the Crowd (Hoberman, 2008a; Ecksel, 2008). Primarily concerned with the melding of politics, news, and entertainment, which harkens back to Kierkegaard-inspired critiques of mass culture, current critics charge that the public has at long last become what we most feared:  a mindless audience with sworn allegiances born out of fealty to the almighty image (Hoberman, 2008a).

Arguably the most striking (or memorable) recent expression of image, and the subsequent commingling of politics and entertainment, centered around Sarah Palin’s campaign for office in 2008. Indeed, much of the discussion regarding Palin centered on her image and colloquialisms rather than focusing solely on her abilities.[1] Throughout her run, Palin positioned herself as an everyman figure, summoning figures such as “Joe Six-Pack” and employing terms such as “hockey mom” in order to convey her relatability to her constituents.[2] In a piece on then-Vice-Presidential candidate Sarah Palin, columnist Jon Meacham questions this practice by writing:  “Do we want leaders who are everyday folks, or do we want leaders who understand everyday folks?” (2008). Palin, it seemed to Meacham, represented much more of the former than the latter; this position then leads to the important suggestion that Palin was placed on the political bill in order to connect with voters (2008). Suddenly, a parallel between Palin and Lonesome Rhodes from A Face in the Crowd becomes almost self-evident.

At our most cynical, we could argue that Palin is a Lonesome-type figure, cleverly manipulating her image in order to connect with the disenfranchised and disenchanted. More realistically, however, we might consider how Palin could understand her strength in terms of her relatability instead of her political acumen; she swims against the current as a candidate of the people (in perhaps the truest sense of the term) and offers hope that she will represent the voice of the common man, in the process challenging the status quo in a government that has seemingly lost touch with its base. In some ways, this argument continues to carry weight in post-election developments that demonstrate increasing support for the Tea Party movement.

However, regardless of our personal political stances, the larger pertinent issue raised by A Face in the Crowd is the continued existence of an audience whose decision-making process remains heavily influenced by image—we must actually exert effort in order to extract our opinion of Sarah Palin the politician from the overall persona of Sarah Palin. Although image is admittedly powerful, author Mark Rowlands argues that a focus on it—and a reliance on the underlying ethereal quality described by Daniel Boorstin as being “well known for [one’s] well-knownness” (Boorstin, 1962, p. 221)—is ultimately damning, as the public’s inability to distinguish between items of quality leads us to focus on the wrong questions (and, perhaps worse, to not even realize that we are asking the wrong questions) in ways that have very real consequences. Extrapolating from Rowlands, we might argue that, as a culture obsessed with image and reputation, we have, in some ways, forgotten how to judge the things that really matter because we have lost a sense of what our standards should be.

Ever the Same?

So while the criticisms of the Frankfurt School might appear to hold true today, we must also recognize that modern audiences exist in a world that is, in some ways, starkly different from that of the 1950s. To be sure, the mainstream media continues to exist in a slightly expanded form, but new commentary on the state of American culture must account for the myriad ways in which current audiences interact with the world around them. For instance, work published after Theodor Adorno’s time has argued against the passive nature of audiences, recognizing the agency of individual actors (Mattson, 2003; Schudson, 1984).[3] Moreover, this new activity on the part of audiences has done much to commingle the once distinctly separate realms of high and low culture in a process that would likely have confounded members of the Frankfurt School. The current cultural landscape encompasses remix efforts such as Auto-Tune the News along with displays of street art in museum galleries; projects once firmly rooted in folk or pop art have transcended definitional boundaries to become more accepted—and even valued—in the lives of all citizens. While Adorno might be tempted to cite this as evidence of high culture’s debasement, we might instead argue that these new manifestations have challenged the long-held elitism surrounding the relative worth of particular forms of art.

Additionally, examples like Auto-Tune the News suggest that advances in technology have had a large impact on the cultural landscape of America over the past half century, with exponential growth occurring after the widespread deployment of the Internet and the resulting World Wide Web. While the Internet certainly provided increased access to information, it also created the scaffolding for social media products that allowed users new modes of participation. Viewed in the context of image, technology has helped to construct a world in which reputations are made and broken in an instant and more information circulates through the system than ever before; the advent of these technologies, then, has not only increased the velocity of the system but also amplified its volume.

Although the media often showcases the deleterious qualities of the masses’ relationship with these processes (the suicide of a student at Rutgers University being a recent and poignant example), we are not often exposed to the incredible pro-social benefits of a platform like Twitter or Facebook. While we might be tempted to associate such pursuits with online predators (a valid concern, to be sure) or, at best, to dismiss them as unproductive in regard to civic engagement (Gladwell, 2010), to do so would be to ignore the powerfully positive uses of this technology (Burnett, 2010; Lehrer, 2010; Johnston, 2010). Indeed, we need only look at a newer generation of activist groups who have built upon Howard Rheingold’s concept of “smart mobs” in order to leverage online technologies to their benefit (2002)—a recent example can be found in the efforts of groups like The Harry Potter Alliance, Invisible Children, and the Kristin Brooks Hope Center to win money in the Chase Community Giving competition (Business Wire, 2010). Clearly, if the masses can self-organize and contribute to society, critiques that cast them as nothing more than passive receptors of media messages need to be revised.

Reconsidering the Masses

If we accept the argument that audiences can play an active part in their relationship with media, we then need to look for a framework that begins to address media’s role in individuals’ lives and to examine the motivations and intentions that underlie media consumption. Although we might still find that media is a corrosive force in society, we must also realize that, while it may exploit an existing flaw, it does not necessarily create the initial problem (MacGregor, 2000).

A fundamental building block in understanding media’s potential impact is the increased propensity of individuals (particularly youth) to focus on external indicators of self-worth, with the current cultural climate of consumerism leading individuals to dwell on their inadequacies—on what they do not have (e.g., physical features, talent, or clothes)—as opposed to their strengths. Simultaneously an exacerbation of this problem and an entity proffering solutions, constructs like advertising provide an easy way for youth to compensate for their feelings of anxiety by instilling brands as a substitute for value:  the right label can confer a superficial layer of prestige and esteem upon individuals, which can act as a temporary shield against criticism and self-doubt. In essence, one might argue that even if people aren’t good at anything, they can still be associated with the right brands and be okay. Although we might be tempted to blame advertising for this situation, it merely exploits our general unease about our relationship to the world, a process also reminiscent of narcissism (Lasch, 1979).

Historian Christopher Lasch goes on to argue that we have become generally disconnected from traditional anchors such as religion and have thus come to substitute media messages and morality tales for actual ethical and spiritual education (1979). The overlapping role of religion and advertising is noted by James Twitchell, who contends that, “Like religion, which has little to do with the actual delivery of salvation in the next world but everything to do with the ordering of life in this one, commercial speech has little to do with material objects per se but everything to do with how we perceive them” (1996, 110). Thus, we might classify religion, advertising, entertainment, and celebrity as examples of belief systems (i.e., particular ways of seeing the world, complete with their own sets of values) and use these paradigms to begin to understand their respective (and ultimately somewhat similar!) effects on the masses.

A Higher Power

Ideologies such as those found in popular culture, religion, or advertising tell believers, in their own ways, what is (and is not) important in society, something that Twitchell refers to as “magic” (1996, 29). Each manifestation also professes a particular point of view and attempts to integrate itself into everyday life, drawing on our desire to become part of something (e.g., an idea, a concept, or a movement) that is larger than ourselves. Perhaps most importantly, the forces of advertising, entertainment, religion, and art (as associated with high/pop/folk culture) play on this desire in order to allow humans to give their lives meaning and worth in terms of the external:  God, works of art, and name brands all serve as tools of classification. While cynics might note that this stance bears some similarities to the carnival sideshows of P. T. Barnum—it does not matter what is behind the curtain as long as there is a line out front (Gamson, 1994; Lasch, 1979)—these belief systems survive because they continue to speak to a deep desire for structure; the myth of advertising works for the same reasons that we believe in high art, higher education, and higher powers. Twitchell supports this idea by noting that “the real force of [the culture of advertising] is felt where we least expect it:  in our nervous system, in our shared myths, in our concepts of self, and in our marking of time” (1996, 124). Constructs like advertising and entertainment, it seems, not only allow us to assemble a framework through which we understand our world but also continually inform us about who we are (or who we should be), acting as a collection of narratives that influences the greater perceptions of individuals in a manner reminiscent of the role of television in Cultivation Theory (Gerbner & Gross, 1976). This process of ordering and imbuing value ultimately demonstrates how overarching ideologies can not only create culture but also act to shape it, as evidenced by the ability of the aforementioned concepts to consume and/or reference previously shared cultural knowledge while simultaneously contributing to the cultural milieu.

Given our reconsideration of mid-century cultural critiques, it follows that we should necessarily reevaluate proposed solutions to the adverse issues present within mass culture. We recall the advice of A Face in the Crowd’s Mel Miller (i.e., “We get wise to them”) and reject its elitist overtones while remaining mindful of its core belief. We recognize that priding ourselves on being smart enough to see through the illusions present in mass culture, while pitying those who have yet to understand how they are being herded like so many sheep, makes us guilty of the narcissism we once ascribed to the masses—and perhaps even more dangerous than the uneducated because we are convinced that we know better. We see that aspects of mass culture address deeply embedded desires and that our best hope for improving culture is to satisfy these needs while educating audiences so that they can better understand how and why media affects them. Our job as critics is to encourage critical thinking on the part of audiences, dissecting media and presenting it to individuals so that they can make informed choices about their consumption patterns; our challenge is to convincingly demonstrate that engagement with media is a crucial and fundamental part of the process. If we subscribe to these principles, we can preserve the masses’ autonomy and not merely replace one dominant ideology with another.


[1] Certainly being female did not help matters, as American women are typically subject to a “halo effect” wherein their attractiveness (i.e., appearance) colors how they are perceived (Kaplan, 1978).

[2] Palin has continued the trend, currently employing the term “mama grizzlies,” a call to arms that hopes to rally women to fight in order to protect the things that they believe in. Interestingly, a term that reaffirms the traditional role of women as nurturing matriarchs has been linked to feminist movements, a move that seems to confuse the empowerment of women with a socially conservative construct of their role in American life (Dannenfelser, 2010).

[3] We can also see much work conducted in the realm of fan studies that supports the practice of subversive readings or “textual poaching,” a term coined by Henry Jenkins (1992), in order to discuss contemporary methods of meaning making and resistance by fans.


The Truth Shall Set You Free?

Young people handle dystopia every day:  in their lives, their dysfunctional families, their violence-ridden schools.

—Lois Lowry[1]

The Age of Information.

Today, more than ever, individuals are awash in a sea of information that swirls around us, as invisible as it is inescapable. In many ways, we are still grappling with the concept as we struggle to sort, filter, and conceptualize that which surrounds us. We complain about the overbearing nature of algorithms—or, perhaps more frighteningly, do not comment at all—but this is not the first time that Western society has pondered the role and influence of information in our lives.

Access to information provides an important thematic lens through which we can view dystopic fiction. Although it does not account for the entirety of the genre’s appeal in and of itself (or, for that matter, the increase in its popularity), we will see that understanding the attraction of dystopia provides some insight into the societies that produce it and elucidates the ways in which the genre allows individuals to reflect on themes present in the world around them—themes that are ultimately intimately connected with the access and flow of information. My interest here lies specifically in YA dystopic fiction and its resonance with the developmental process of teenagers.

Lois Lowry’s quote suggests that today’s youth might be familiar with tangible aspects of dystopia even if they do not necessarily exist in a state of dystopia themselves; dystopia, then, is fundamentally relatable to youth.[2] Interpersonal violence in schools—on both the physical and virtual levels—has become a growing problem and can be seen as a real-life analogue to the war-torn wastelands of YA dystopia; although the physical destruction present in fiction might not manifest in the everyday, youth may identify with the emotional states of those who struggle to survive.[3] And, given the recent and high-profile nature of bullying, issues of survival are likely salient for modern youth.[4]

It should come as no surprise that Lowry, a writer, describes the concept of dystopia primarily in literary terms, much as literary critic Darko Suvin does; while valid, this limited perspective does not preclude the term from also possessing socio-political implications, and one might argue that the relatable nature of dystopia extends far beyond the iterations outlined by Lowry into the realm of ideology.[5] On a basic level, dystopia often asks protagonists to perform a type of self-assessment while simultaneously evaluating preexisting hierarchical structures and systems of authority.[6] Given that this process asks individuals to contrast themselves with the society that surrounds them, one might make the argument that the themes of utopia and dystopia possess an implicit political element, regardless of authors’ intentions.

Moreover, consider the prevalent construct of the secret as a defining characteristic of dystopian societies like those presented in the classic works of Brave New World and Nineteen Eighty-Four.[7] Often located in the cultural history of the dystopia (e.g., “What events caused us to reach this point?”) or the sustained lies of the present (e.g., “This is for your protection”), the acquisition of new (hidden) knowledge represents a fundamental part of the protagonist’s—and, by extension, the reader’s—journey. For young adults, this literary progression can mirror the development occurring in real life as individuals challenge established notions during the coming-of-age process; viewed through the lens of anthropology, dystopian fiction represents a liminal space for both the protagonist and the reader in which old assumptions and knowledge are questioned during a metaphorical rite of passage.[8],[9] And although the journey itself provides a crucial model trajectory for youth, perhaps more important is the nature of the secret being kept:  as Lowry alludes, modern youth undoubtedly realize that their world—our world—like that of any dystopia, contains elements of ugliness. The real secret, then, is not the presence of a corrupted underbelly but rather why the rot exists in the first place.

Aside from the type of knowledge, or even the issues latent in its accessibility, we can see that modern culture is undergoing a rather radical reconfiguration of the social structures surrounding information flow. Although we still struggle with the sometimes antagonistic relationship between citizens and the State mirrored in classic and YA dystopia, we have also developed another dimension:  citizen versus citizen. Spurred on by innovations in technology that have made mobile gadgetry increasingly affordable and accessible to the public, on-location reporting has grown from the relatively useful process of crowdsourcing information into a practice that includes surveillance, documentation, and vigilante justice as we display our moral outrage over someone else’s ungodly behavior through paparazzi photos, tweets of overheard conversations, and the ever-popular blog—we, in effect, have assumed the mantle of Big Brother. It would seem that, like Dr. Moreau, we have been granted knowledge and ability without wisdom.

Let us also consider how youth currently exist in a culture of confession that was not apparent during previous cycles of utopia/dystopia. Fueled in part by daytime talk shows, reality television, press-conference apologies, and websites like PostSecret, the current environment is suffused with secrets and with those willing to share their intimate stories for a price. Somewhat in opposition to confession’s traditional role in Catholicism, secrets now play an active role in public life despite their private nature, a process that mirrors the juxtaposition of personal and public histories by protagonists in YA dystopia.[10],[11] The relevance of this trend only increases when we consider how individuals, groups, organizations, and societies have begun to define themselves in terms of the secrets that they hold about others and themselves. The prevalence of corporate espionage, copyright infringement lawsuits, and breakdowns in communication between youth and parents points to entities that wish to contain and restrict information flow. If being an American in the 20th century meant being defined by material possessions, being an American in the 21st century is to be defined by information and secrets. And, if this is indeed the case, how might we view our existence as one that occurs in a series of ever-expanding dystopias? As it turns out, Lowry may have been more correct than she realized when she noted young people’s familiarity with dystopia.

But perhaps this development is not so surprising if we consider the increasing commodification of knowledge in postmodern culture. If we subscribe to Jean-Francois Lyotard’s argument regarding the closely intertwined relationship between knowledge and production—specifically, that new knowledge is cultivated in order to further production and that information sets are therefore a means to an end rather than an end in and of themselves—we witness a startling change in the relationship between society and knowledge.[12] In opposition to the idealistic pursuit that occurred during the Enlightenment period, modern conceptualizations seem to understand knowledge in terms of leverage—in other words, we, like all good consumers, perennially ask the question, “What can you do for me?” Furthermore, the influence of commercialism on Education (i.e., the institution charged with conveying information from one generation to the next) has been probed, with some conjecturing that educational priorities might be dictated by concerns of the market.[13] Notably, these cultural shifts have not disavowed the value of knowledge but have changed how such worth is determined and classified.

The Frankfurt School’s pessimistic views of mass culture’s relationship with economic influences and independent thought aside, Lyotard also points to the danger posed by the (then) newly formed entity of the multinational corporation as a body that could potentially supersede or subvert the authority of the nation-state.[14] Businesses like Facebook and Google accumulate enormous amounts of information (often with our willing, if unwitting, participation) and therefore amass incredible power, with the genius of these organizations residing in their ability to facilitate access to our own information! Without castigating such companies—although some assuredly do—we can glimpse similarities between these establishments’ penchant for controlling the dissemination of information and the totalitarian dictatorships prevalent in so many dystopian societies. In spite of the current fervor surrounding the defense of rights outlined in the Constitution, we largely continue to ignore how companies like Google and Facebook have gained the potential to impact concepts like freedom of assembly, freedom of speech, and freedom of information; algorithms designed to act as filters allow us to cut through the noise but also severely reduce our ability to conceptualize what is missing. These potential problems, combined with current debates over issues like privacy, piracy, and Net Neutrality, indicate that power no longer resides solely in knowledge but increasingly in access to it.


[1] Lois Lowry, quoted in Carrie Hintz and Elaine Ostry, Utopian and Dystopian Writing for Children and Young Adults (New York: Routledge, 2003).

[2] One might even argue that those who read dystopian fiction most likely do not inhabit a dystopian world, for they would not have the leisure time to consume such fiction.

[3] This point, of course, should not be taken in a manner that discounts the legitimate struggles of children who grow up in conflict states.

[4] See Ken Rigby, New Perspectives on Bullying (London: Jessica Kingsley Publishers, 2002); and Marilyn A. Campbell, “Cyber Bullying: An Old Problem in a New Guise?” Australian Journal of Guidance and Counseling 15, no. 1 (2005): 68-76.

[5] Clare Archer-Lean, “Revisiting Literary Utopias and Dystopias: Some New Genres.” Social Alternatives 28, no. 3 (2009): 3-7.

[6] Patricia Kennon, “‘Belonging’ in Young Adult Dystopian Fiction: New Communities Created by Children.” Papers: Explorations into Children’s Literature 15, no. 2 (2005): 40-49.

[7] Patrick Parrinder, “Entering Dystopia, Entering Erewhon.” Critical Survey 17, no. 1 (2005): 6-21.

[8] Hintz and Ostry, Utopian and Dystopian. 2003.

[9] Parrinder, “Entering Dystopia, Entering Erewhon.” 2005.

[10] Shannon McHugh and Chris Tokuhama, “PostSecret: These Are My Confessions.” The Norman Lear Center. June 10, 2010. http://blog.learcenter.org/2010/06/postsecret_these_are_my_confes.html

[11] John Stephens, “Post-Disaster Fiction: The Problematics of a Genre.” Papers: Explorations into Children’s Literature 3, no. 3 (1992): 126-130.

[12] Jean-Francois Lyotard, The Postmodern Condition: A Report on Knowledge. (Manchester: Manchester University Press, 1979).

[13] Suzanne de Castell and Mary Bryson, “Retooling Play: Dystopia, Dysphoria, and Difference.” In From Barbie to Mortal Kombat, edited by Justine Cassell and Henry Jenkins (Cambridge: The MIT Press, 1998).

[14] Lyotard, The Postmodern Condition. 1979.