Teaching Popular Cultural Semiotics

Jack Solomon is professor of English at California State University, Northridge, where he teaches literature and critical theory. He is often interviewed by the California media for analysis of current events and trends. He is co-author, with Sonia Maasik, of Signs of Life in the U.S.A.: Readings on Popular Culture for Writers and of California Dreams and Realities: Readings for Critical Thinkers and Writers, and is the author of The Signs of Our Time and Discourse and Reference in the Nuclear Age.

A Digital Canary in the Coal Mine?

posted: 10.16.14 by Jack Solomon

Recently I received a student journalist’s request to comment on a phenomenon that she identified as a decline in traditional dating practices among millennials.  More specifically, she wanted to know what I think about certain “practice dating” groups that are forming to guide young people in how to behave during actual face-to-face dates.  “Why,” she asked me, “is there a growing need for practice dates, and why are millennials finding it harder to communicate face to face?”

Wow.  Sometimes the signifiers just leap out at you.

After all, one of the more nagging questions that have emerged in the age of digital communication is just what might happen to human interpersonal skills when so much socializing is conducted via virtual social networks.  The notorious prevalence of vile (and even violent) commentary on the Net is one indicator that digital communication may not be conducive to the development of basic social skills, but that alone is not sufficient evidence from which to draw any conclusions.  One could always persuasively argue, for example, that Internet bile is simply the expression of bad feeling that was always prevalent anyway but now is far easier to express to a far wider audience.  But this practice dating thing opens up whole new vistas of semiotic possibility.

Consider: have you ever observed a group of people (or simply a couple) sitting together and obviously associated, but rather than looking at or addressing each other everyone is staring into a smart phone?  The scene is so common that it is difficult not to have observed it.

Now, try that sort of behavior on a date.

But, wait a minute, that must be exactly what is happening in today’s dating scene, or else why would young people be forming “practice date” events to help each other learn how to interact with someone face-to-face without constantly diving back into the social network?  Somehow, millennials themselves are becoming aware that their social instincts are being reshaped by technology (throw in the growing phenomenon of “sexting” and you can see how even Eros is being affected), and they are struggling to do something about it.  I can imagine sessions devoted to learning how to stare into someone’s eyes, rather than into your iPhone, or learning how just to talk with someone without tweeting or posting Instagram selfies.

Now, interpreting such a cultural signifier as the practice date scene is not the same thing as criticizing anyone.  After all, we Baby Boomers are accused of having had our attention spans shortened by another technological intervention—TV—and I believe that the accusation is very likely true.  The effects of technology on psychological, and perhaps even biological, evolution are profound, and as the world is swept by the digital revolution, it behooves us to pay attention to the canaries twittering around us.  And when young folks need self-help sessions in dealing face-to-face with young folks, that is a very profound tweet.


It Ain’t Over When the Hashtag Sings

posted: 10.2.14 by Jack Solomon

Well, the two-year-long campaign is over, the votes have been counted, and the Scots have voted to remain in the United Kingdom. The vote was both decisive and a bit of a surprise in light of the eve-of-election polls, which predicted a much closer outcome—so close that many who campaigned for independence appear to have been genuinely confident of victory.

If one had been going by the trending analytics of the #YesScotland movement, which led the #BetterTogether movement by a good three-to-one margin, according to the BBC, the outcome of the referendum would have been even more surprising. And if social media analytics were the means by which democracies make their decisions, Scotland would probably be an independent nation today.

Which takes me to the point of my analysis. From reading a lot of online commentary, even at supposedly staid sites like Inside Higher Ed and The Chronicle of Higher Education, I often get the impression that a lot of participants in the “comments” sections believe that if they can get the most posts in on their side of any particularly controversial topic, then, somehow, they have won something.  Similarly, if your “side” can get in more tweets with the right hashtags than the other side, then, for many people, you’ve won.  I can’t help but think that this sort of thing has been encouraged by the cultures of Facebook and Twitter, whereby one accumulates “likes,” “friends,” and “followers” that are taken as genuine signifiers of popularity and/or importance.  Reality TV shows like American Idol, with their mass-media simulacra of actual election-based voting, have probably also influenced this phenomenon.

But as the Scottish vote can remind us, when all is said and done and the actual (not virtual) votes are counted, social media are still just that: social media, not voting platforms.  For all the glamor, money, and attention that social media enjoy in the world today (indeed, it could be argued with little difficulty that social media are the most dominant expressions of popular culture in our time), we are not at the point where democratic decision making is going to be a matter of winning the hashtag wars.  While it is not impossible to imagine a time when social media platforms may actually become venues for real-world voting outcomes, we’re not there yet.


The Ice Bucket Challenge

posted: 9.18.14 by Jack Solomon

No, I’m not going to post a YouTube video of myself getting doused in ice water, and, indeed, by the time this posts, the ice bucket challenge will probably have morphed into something else anyway—most likely a series of parodies.  Rather, I wish to submit this latest of virally initiated fads to a semiotic analysis, seeking what it says about the culture that has so enthusiastically embraced it.

As always in a semiotic analysis, we begin with a system of associations and differences, and with some history.  The actual act—dousing someone with a large bucket of ice water—of course refers back to a once spontaneous, and then institutionalized, end-of-Super Bowl ritual by which the winning coach is sloshed with the melted remains of the Gatorade barrel.  That is part of the system in which we can locate the current fad, but already we find a significant difference.  That difference lies in the fact that the Super Bowl-related ice bucket prank is not only an act of celebration but one celebrated by a highly elite masculine club (in fact there is a faint aura of hazing about it), while the ice bucket challenge is an act of pure populism.  Not only can anyone participate, but it is, by definition, a mass activity through which individuals are “called out” to participate (indeed, there is a certain whiff of coercion about the matter, a trick-or-treat vibe that caused even Barack Obama to say “no thank you, I’ll just make a monetary contribution”).  Thus, the ice bucket challenge can be associated with such medical research fund-raising activities as wearing yellow Livestrong bracelets or participating in walkathons, but it is also a reflection of a hetero-directed society whereby (in this case benignly and for a good cause) individual behavior is dictated by group pressure.

America, which prides itself on its tradition of individualism (this is one of our chief mythologies), has a hetero-directed tradition as well, one that goes all the way back to the founding of the Massachusetts Bay Colony.  For the people that we know as the “Puritans” (they called themselves Congregationalists) had a very group-oriented worldview, one that compelled every individual member of the Congregation to demonstrate to his or her co-religionists the signs of salvation, or face expulsion.

The tug-of-war between staunch individualism and hetero-directedness is one of the most enduring contradictions in American history and culture.  In some decades (the fifties are notorious for this), hetero-directedness weighs more heavily (it isn’t called “hetero-directedness,” of course: we know it as “conformity”); in other decades, anti-conformist individualism is dominant (the sixties generation at least viewed itself as anti-conformist).

The tug-of-war at present is especially complex.  On the one hand, digital communications technology has been a tremendous nurturer of hetero-directedness.  From the sudden viral explosions that produce flash mobs, zombie walks, and, yes, the ice bucket challenge, to the constant sharing of individual experience on the World Wide Web, digitality has created a global hive that is always abuzz with Netizens caught up in a network of constant group behavior.  But on the other hand, we are also living in an era of intense libertarianism, a hyper-individualism often expressed, paradoxically enough, by way of the same social media behind the global hive.

It is this sort of non-dialectical mixture of individualism and hetero-directedness that makes America such a culturally complicated, and, well, paradoxical place.  While revealing such paradoxes does not resolve them, it at least helps us to understand ourselves as a society a bit better.


A Sign of the Times

posted: 8.14.14 by Jack Solomon

One wouldn’t ordinarily consider an opinion piece by Robert J. Samuelson—The Washington Post’s top economics columnist—a candidate for semiotic analysis.  But a recent column of Samuelson’s reveals so much about the current state of American consciousness that it is quite useful for illuminating an important part of the background needed to construct any system for cultural analysis.  So I will be looking at it here.

Samuelson’s brief essay is entitled “The (millennial) parent trap,” and in it he bemoans (this is not too strong a term for it) the precarious economic prospects not only of his own three “20-something” children but also of all the parents like him. The opening sentences of his op-ed piece pretty much sum it all up: “You could hear the tension in his voice. His 20-something daughter was living at home. She had a graduate degree from a good university that, in times past, would have led to a solid job. But she had no job and no prospect of one. He worried and wondered how long this would last.  He has plenty of company.”

What is most striking about Samuelson’s piece is not the raft of economic statistics that he brings to bear upon the well-known economic woes of millennials in the wake of the Great Recession, but the emotion that he displays over the matter.  Samuelson is usually a pretty low-key writer, an economist more at home with the logic of numerical analysis than with emotive expression.  But when such a man writes words like “The unwritten social contract of . . . [our] . . . era presumed that the economy would be strong enough so that when children reached a certain age, they could be ‘launched’ into the adult world and would not crash. It’s this contract that has now broken down,” you know that something is really happening.  A famous economist and journalist who presumably belongs to the upper-middle class, Samuelson would seem to be immune from such worries about his children.  The fact that he is demonstrably not immune shows just how deep the problem is.

And here is my semiotic point.  The impact of the Great Recession just may be the great game changer in American history, disrupting America’s fondest mythology, the one we call “the American dream.”  Signals of this disruption appear throughout popular culture (especially in the hit HBO series Girls), but as Samuelson’s lament indicates, it is not simply a matter for story lines.  The story line of America itself is being rewritten, and if we want to understand much of what is going on in the country today (especially its intractable divisiveness and ideological polarization), we need to take into consideration the fact that more and more Americans are seeing their country as a land of “betrayal,” not “opportunity.”

A final disclaimer: since I have no children of my own and survived the economic turmoil in perfectly good shape, my analysis is not a reflection of my own worries or emotions.  But when an unemotional fellow like Robert J. Samuelson lets his hair down in The Washington Post in this way, you can be pretty confident that the times they are a-changin’.


Back to Critical Thinking

posted: 7.31.14 by Jack Solomon

One of the most common demands made upon colleges and universities today is that they must teach “critical thinking.”  As a great believer in the teaching of critical thinking, however, I feel that it is incumbent upon all of us who teach it to be very clear about just what we think critical thinking is.  I have offered my own semiotics-based take on the matter in this blog before and will not repeat it now.  My focus this time will be on the sorts of standardized multiple-choice tests that have been offered on critical thinking for assessment purposes.  Having looked at some of these tests, I can conclude that while they do contain some of the elements of critical thinking (specifically, the ability to distinguish logical fallacies from sound logic, and pseudo-argument from valid argument), they are still very incomplete in their approach to the subject and need to be supplemented by what I will call the empirical side of critical reasoning.

Here’s why.  It is perfectly possible to construct a logically valid argument on the basis of false information.  For example, if it were true that there is no global warming going on in the world, no climate change, and no increase in atmospheric greenhouse gases, then it would be logical to argue that nothing needs to be done about the problem because it doesn’t exist.  This argument is being made right now, and I presume that my readers will see what’s wrong with it, but I’ll spell it out: the empirical facts as determined by virtually every reputable climate scientist on earth dispute its grounding premise. In other words, to think critically about global climate change, one has to study the science of the matter, and only then can a valid and logical argument be made.  (It is worth pointing out that when one of the last holdouts among prominent climate scientists finally conceded that the scientific evidence indeed pointed to anthropogenically induced climate change, he was denounced by climate change deniers on personal grounds, not logical or scientific ones.  See how the Christian Science Monitor reported the story in 2012.)

To generalize: critical thinking includes logical and rhetorical skills (they are necessary), but such skills are not sufficient.  Every problem in critical thinking requires knowledge of the relevant facts.  These facts can be scientific, or historical, or mathematical, or based in any number of other knowledge disciplines, but without knowledge of the facts (call it “content”), there cannot be adequate reasoning.  This is why “reasoning skills” cannot be disassociated from content-based education in science, history, and so on and so forth.

I am perfectly aware of the postmodern and/or poststructural objection to my position, an objection based in a deconstruction both of reason itself and of the existence of any facts apart from values.  Having written an entire book contesting this point of view (Discourse and Reference in the Nuclear Age, 1988), I am not going to attempt to refute it here.  I’ll only say this (echoing something Bruno Latour has written): if you don’t accept scientific (or other forms of) factuality, then you have no basis on which to challenge climate change denial.  And, more to the point: while you may have a basis for “critique,” you do not have a firm basis for critical thinking.

This is why the critical thinking apparatus of Signs of Life in the U.S.A. is grounded in Peircean rather than structuralist or poststructuralist semiotics.  Charles Peirce was a philosophical and scientific realist.  He acknowledged the mediational role of signs but wrote that semiotic systems are grounded in reality.  I will concede that no one can finally prove the truth of this perspective, but from a Pragmatist point of view it offers a far more effective basis for the teaching of critical thinking than one that offers no answer to those whose arguments are founded in made-up “facts,” or in no facts at all.


The Whirled Cup

posted: 7.17.14 by Jack Solomon

With the World Cup standing as the globe’s most prominent popular cultural event of the moment, I think it is appropriate for me to take a cultural semiotic look at it, especially in the wake of all the commentary that has followed Brazil’s rather epic loss to Germany in the semi-finals.  As I write this blog, Holland is playing Argentina in the second semi-final, but since neither the outcome of that game nor the final to follow is of any significance from a semiotic point of view, I will not concern myself here with the ultimate outcome of the games but will focus instead on the non-player reactions to the entire phenomenon.

Let me first observe that while I am myself not a fan of the game that the rest of the world calls football (I’m not a fan of the game that Americans call football either), I am fully aware that to much of that world the prestige of the World Cup roughly equals the value to us Americans of the World Series, the Super Bowl, the NCAA Final Four, the NBA Finals, and the BCS championship combined. I have also been surprised to learn that the Olympic gold medal for football has hardly a fraction of the significance of the World Cup for the rest of the world, as signified by Argentina’s attitude towards Lionel Messi (currently the world’s greatest scorer, and perhaps the greatest of all time), who brought home Olympic gold in 2008 but is still regarded as a lesser man than Diego Maradona, who, in spite of a controversial career that boasts no Olympic gold medals, did bring home the Cup in 1986.  (Perhaps lesser “man” is the wrong term: Argentines simply regard Maradona as “God.”)

So I get the point that football is a very big deal in the rest of the world, so big that it may not be possible for most Americans to grasp just how big a deal it is.

Which takes me to the semiotic question: why is football such a big deal?  What is going on when a reporter from the Brazilian newspaper O Tempo can remark, in the wake of the 1-7 defeat at the hands (or feet) of Germany: “It is the worst fail in Brazil’s history. No-one thought this possible. Not here. Not in Brazil.  People are already angry and embarrassed. In a moment like this, when so desperate, people can do anything because football means so much to people in Brazil”?

To answer this question I should perhaps begin by clearing the decks and noting that I don’t think that Ann Coulter has the answer.  I mean, American football, basketball, and baseball (our most passionately followed sports) are team sports too.  (Coulter appears to think that soccer-football is morally inferior because it is too team-oriented and insufficiently individualistic, which is odd when one considers that names like Maradona, Pele, Bobby Charlton—and let’s throw in Georgie Best for good measure—are at least as magical in Argentina, Brazil, and Great Britain as Babe Ruth, Joe Montana, and LeBron James are in America, and probably a lot more so.)

So how can it be explained?  As always, there is no single explanation: this question is highly overdetermined.  But let’s start with the sheer variety of sporting choices in America.  The list of easily available spectator and participant sports here is so long that there really isn’t much point in trying to enumerate them.  America has them all, and so the appeal of any given sport must always be taken in the context of a lot of other sports competing for attention (which is why Los Angeles, the second largest metropolitan market in America, can get along perfectly well year after year without an NFL franchise).  On the other hand, in much of the rest of the world, while football isn’t precisely the only game in town, it is often practically so (let me except those African nations wherein long distance running is practically the only game in town: which is why Africans—in men’s competitions, not women’s—win most of the important marathons).  A game that doesn’t require much in the way of expensive equipment, football can be played by all classes, and of course it offers a fantasy pathway to fame, glory, and riches for impoverished football dreamers.  In other words, for the rest of the world, football is the big basket into which nations put most of their sports eggs.

But who cares anyway?  Whether someone is carrying a ball over a line, kicking a ball into a net, throwing a ball into a basket, or hitting a ball onto the grass or into the bleachers (and so on and so forth), what difference does it make?  Why is Brazil in despair?  Why do people die at soccer-football games?  What gives with British soccer hooligans?

Here things get complicated.  Perhaps the most important point to raise is that sporting events have served as sublimated alternatives to war since ancient times.  The original Olympics, for example, featured events that were explicitly battle oriented—today’s javelin event at the modern Olympics recalls the days of spear throwing and a foot race run while carrying a shield—and the role of international sport in modern times continues to be that of a symbolic substitute for more lethal conflict (consider the passionate competitions between the USA and the USSR during the Cold War, with the 1972 Olympic basketball final and the 1980 hockey “miracle on ice” looming especially large in memory).  While I could go on much further here, suffice it to say that the significance of the World Cup is intimately tied up with nationalism and international conflict.  So when the Brazilian “side” fails to kick as many balls into a net as the German side, the emotional feel is akin to having lost a war.  This is not rational, but human beings are not invariably rational animals.  Signs and symbols can be quite as important as substantial things.

Americans right now are trying to get into the game when it comes to the passions of global football, but in spite of decades of youth football competition and legions of soccer moms, it really hasn’t happened yet.  All in all, American sport is still rather isolationist (I do not say this as a criticism): though we call the World Series, well, the World Series, only North American teams play in it, and the Super Bowl is only super on our shores.  But while there may be something parochial about our sporting attitude, at least it isn’t a matter for a national crisis if “our” team loses.  That’s not a bad thing.

Personally (and not semiotically), I believe that people should only get passionate about their own exercise programs (I feel awful if I miss a day of running), but, consistent with the mores of a consumer society, sport in America is increasingly a spectator affair, something to watch others do for us as a form of entertainment.  It isn’t good for the national waistline, but at least we aren’t in a state of existential angst because a handful of guys with tricky feet just lost in the semi-finals.

By the way: Argentina just went into the final.  Maybe Messi will be God.

(Alas.)


The Beat Goes Off

posted: 7.3.14 by Jack Solomon

I confess to a certain fascination with the Beat generation.  Not because I belonged to it, mind you (I’m getting old, but I’m not that old: the Beats belonged to my parents’ generation), but because of their profound influence on America’s cultural revolution, a revolution that continues to roil, and divide, Americans to this day.  In other words, if you want to understand what is happening in our society now, knowing something about the history of the Beats is a good place to start.

Please understand that when I say this, my purpose is semiotic, not celebratory.  In fact, as far as I am concerned, the Beats, and their Boomer descendants, all too often equated personal freedom with hedonistic pleasure, leading America not away from materialism (as the counterculture originally claimed to do) but to today’s brand-obsessed, hyper-capitalistic consumerism.  What Marx, and after him the Frankfurt School, called “commodity fetishism” has morphed into what Thomas Frank has called the “commodification of dissent” (you can find his essay on the phenomenon in Chapter 1 of Signs of Life in the U.S.A.), wherein even anti-consumerist gestures are sold as fashionable commodities, while money and what it can buy dominate our social agenda and consciousness.

But what interests me for the purposes of this blog is the fate of three recent movies that brought the Beats to the big screen.  The first is Walter Salles’ production of Jack Kerouac’s signature Beat novel, On the Road (2012), a story that had been awaiting a cinematic treatment ever since Marlon Brando expressed an interest in it in 1957.  Another is John Krokidas’ Kill Your Darlings (2013), a treatment of the real-life killing of David Kammerer by Lucien Carr—a seminal figure in the early days of the Beats and a close friend of Allen Ginsberg, William Burroughs, and Jack Kerouac.  And the third is Big Sur (2013), a dramatization of Kerouac’s novel of the same title.

What is most interesting about these movies is their box office: though On the Road enjoyed a great deal of pre-release publicity and starred such high profile talent as Kristen Stewart, Viggo Mortensen, Kirsten Dunst, and Garrett Hedlund, its U.S. gross was $717,753, on an estimated budget of $25,000,000 (according to IMDb).  International proceeds were somewhat better (about eight and a half million dollars), but all in all, this was a major flop.

Kill Your Darlings did even worse.  Starring the likes of Daniel Radcliffe (as Allen Ginsberg?!) and Michael C. Hall, it grossed just $1,029,949 total (IMDb).

Big Sur, for its part, grossed . . . wait for it . . . $33,621 (IMDb).  Even Kate Bosworth couldn’t save this one.

Can you spell “epic fail”?

As I ponder these high-profile commercial failures, I am reminded of another recent literary-historical movie set in a similar era, which, in spite of an even higher level of star appeal, flopped at the box office: Steven Zaillian’s 2006 version of Robert Penn Warren’s classic novel All the King’s Men.  Resituating the action from the 1930s to the 1950s, and boasting an all-star cast including such luminaries as Sean Penn, Jude Law, Anthony Hopkins, Kate Winslet, Mark Ruffalo, and the late James Gandolfini, the movie grossed $7,221,458 on an estimated $55,000,000 budget (IMDb).

Now, it is always possible to attribute commercial failures like these to aesthetics: that is, they simply could be badly executed movies.  And it is true that All the King’s Men got bad reviews, while On the Road’s reception was somewhat mixed (Wikipedia).  Kill Your Darlings, on the other hand, actually did pretty well with the reviewers and won a few awards (again according to Wikipedia).  But the key statistic for me is the fact that Jackass Number Two was released in the same weekend as All the King’s Men and grossed $28.1 million (Wikipedia): four times as much as King’s, twenty-eight times as much as Darlings, and about forty times as much as Road (US box office).  I don’t even want to calculate its relation to Big Sur.  So I don’t think that aesthetics explains these failures entirely.

Especially when one considers how just about any movie featuring superheroes, princesses, pirates, Pandorans, Maleficents, and minions (not to mention zombies and vampires) draws in the real crowds.  Such movies have an appeal that goes well beyond the parents-with-children market and includes a large number of the sort of viewers that one would expect to be interested in films starring Kristen Stewart, Daniel Radcliffe, and Jude Law.  But unlike the literary-historical dramas that failed, these successful films share not only a lot of special effects and spectacle but fantasy as well; and this, I think, is the key to the picture.

Indeed, you have to go back to the 1970s to find an era when fantasy was not the dominant film genre at the American box office, and since the turn of the millennium fantasy has ruled virtually supreme.  While it is not impossible to attain commercial success with a serious drama (literary-historical or otherwise), it is very difficult.

The success of movies like Glory, The Butler, and The Help demonstrates that movies tackling racial-historical themes resonate with American audiences, so I do not think that the failure of these Beat films can be attributed simply to America’s notorious indifference to history.  And, after all, The Great Gatsby (2013 version) did well enough.  Perhaps it is nothing more than a lack of interest in movies that are made by directors who are so personally enamored of their material that they forget that they have to work hard to make it just as attractive to audiences (I get this impression from some Amazon reviews of the DVD of Kill Your Darlings).  Artistic types tend to identify with the Beats (the original hipsters), but apparently today’s hipsters aren’t interested in hipster history.  Given the failure of On the Road, Kill Your Darlings, and Big Sur (not to mention All the King’s Men), I would be surprised to see any future efforts in this direction.  If nothing else, today’s youth generation appears to be uninterested in the youthful experiences of their grandparents—spiritual and actual.  In all fairness, I suppose that one cannot blame them.


Girls

posted: 5.22.14 by Jack Solomon

Having just read two large classes’ worth of student papers whose purpose is to semiotically analyze the HBO hit series Girls, I am learning a great deal about this popular program.  There are many striking things about the show that I could write about, but for the purposes of this brief blog I will choose only one: the notoriously high level of nudity in Girls.  Indeed, as my students tell me, it appears that Hannah “gets naked” about twice per episode, and that this phenomenon is a much discussed feature of the show.  So, if you will, I will join the discussion here.

The debate over Hannah’s nudity seems to hinge on the physical appearance of the actress/writer who both portrays her and created her in the first place.  In a manner reminiscent of the Dove “Real Beauty” campaign, defenders of Lena Dunham’s nakedness celebrate this display of an ordinary female body, countering complaints about it with the retort that no one seems to be complaining about all of the nudity in Game of Thrones—another show highly popular among millennials, and one featuring more conventionally beautiful actresses.  And so, from this perspective, Hannah’s nudity strikes a blow for women’s liberation.

I don’t think that this is all there is to the matter, however; for when we situate Girls in the system of contemporary television, we can see that there is a whole lot of such nudity and sexuality to be found—in Mad Men, for instance, with its sexual threesomes (two women to a man, not the other way around)—as well as a lot of rape in shows like Sons of Anarchy and, notoriously, Game of Thrones.  Indeed, reading my student papers is a bit of a jolting experience as I see the way they seem to take it for granted that of course television is going to be filled with rape scenes and not-so-soft porn.

The explanation (or excuse) for all this nudity, sexuality, and rape is often that it makes contemporary television more “realistic,” and, in an era when campus sexual assault has become a matter of national concern all the way up to the White House, this explanation is certainly true enough.  But there is a difference between a story that tells of such things and one that graphically shows them, and there, for me, lies the crux of the matter.

Let’s get back to Hannah’s nudity.  It generally occurs during decidedly unpleasant sex scenes, scenes in which Hannah is not only not experiencing much pleasure but is being humiliated in one way or another.  Marnie (another Girls regular) is also willing to humiliate herself sexually to hold on to her “boyfriend” Charlie.  Indeed, sex in Girls seems to bring almost nothing but humiliation, or worse.

This is quite different from shows like Friends, which bears a number of similarities to Girls.  In Friends, too, young people struggled to make it in New York during a down economy, and there was plenty of sexuality in that show too.  But the sex in Friends (while often rather puerile) was neither so explicit nor so painful as it so often is in Girls.  No, something has changed.  The sky has darkened.

Thus, I am unable to accept the Third Wave feminist argument that the sex and nudity in Girls (and contemporary television in general) is an expression of female empowerment, and that what counts is that women can choose what to do with their bodies.  There is simply too much of an appearance that such “choices” are really responses to what is expected of these women.  Instead, I see something of a vicious circle: television shows depict young women being sexually humiliated in order to satisfy their viewers’ demands for “realism,” while young women, seeing such humiliation on so many of their favorite programs, come to expect it in their lives and behave accordingly.  Art here doesn’t only reflect reality; it helps shape it.

Perhaps that is the most striking sign of all here: that it is a dismal time to be young in America, and the young know it.  Whether economically or romantically, the world, shows like Girls are saying, is off kilter.  Whether or not things are really that bad, the responses to Girls indicate that people think that they are, and enjoy the dark humor that the program delivers.  There are worse things than laughing at the darkness, however, and Girls, after all, is a comedy, sort of.


Entrepreneurs in Toy Land

posted: 5.9.14 by Jack Solomon

A brief news item in the Chronicle of Higher Education reports that 89% of the business leaders polled in a Northeastern University survey believe that “colleges should increase teaching about entrepreneurship.”  Given the fact that such corporate thinking has come to dominate current discourse on higher education and its purposes, it is worthy of a semiotic analysis, and I will sketch one out accordingly here.

First let’s be clear on the meaning of the word “entrepreneurship” itself so there isn’t any confusion.  Entrepreneurship is the defining quality of an entrepreneur, and an entrepreneur is someone who founds and directs new business ventures.  Of course, when thinking of such people, names like Steve Jobs, Bill Gates, Elon Musk, and Mark Zuckerberg always come to mind, and that, presumably, is what the business leaders surveyed in the Northeastern poll have in mind.

Fine, I do not mean to challenge this ethos of corporate creativity.  It is consistent with a number of traditional American mythologies, including our valuing of individualism, self-reliance, and what used to be called the Protestant Work Ethic.  But the problem lies in two major contradictions that the survey does not mention, contradictions that become clear when we look at the larger cultural system within which entrepreneurship functions.

The first contradiction lies in the fact that even as the entrepreneur is celebrated in corporate and educational discourse, the reality is that contemporary capitalism is becoming increasingly monopolistic.  For every successful entrepreneur there are countless entrepreneurs whose efforts have been wiped out by the giant companies that have already made it (just consider what Microsoft did to Netscape, or what Facebook did to MySpace, or, for that matter, what Sebastian Thrun once predicted about the fate of American universities in the era of the MOOC).  As the rules that were written in the Progressive Era to stem the monopoly capitalism of the late nineteenth century are loosened ever further (just look at the current controversy over FCC rules for the regulation of Internet service providers for an example of this trend), the odds against successful entrepreneurship are lengthening.  So for currently successful business leaders to urge today’s students to be more entrepreneurial seems more than a little problematic.  It’s like urging students to pursue an information technology education and then sending a significant portion of our IT jobs offshore.  This is a contradiction so deep that it could be called a betrayal.

But, as I say, there is a second contradiction.  Let’s recall that an entrepreneur is a hard-working, self-reliant individualist.  But at the very same time that American business leaders are calling for more entrepreneurial education, it is they, bolstered by billions of advertising and marketing dollars, who have created a society of passive consumerism and pleasure-seeking hedonism.  Those of us in education who would like to see our students work hard and think critically are swimming upstream against an always-on entertainment society wherein instant gratification and 24/7 join-the-crowd social networking are significant obstacles to student success.  You can’t work effectively and individually when you are constantly posting selfies to Instagram, listening to music, texting, updating your Facebook page, downloading TV programs, doing some online shopping, tweeting, “following,” “friending,” and otherwise multitasking on your smart phone.  But that is exactly what American business is working so hard to get our students to do.

So what I, as an educator, want to say to the business leaders who want me to teach entrepreneurship is, “please get out of my way.”  Stop pushing my students to believe that the instant gratification of every pleasure is far more important than time spent in study and personal effort.  If, as another part of the same Northeastern survey reports, fifty-four percent of the same business leaders believe that “the American higher-education system is falling behind developing and emerging countries in preparing students for the work force,” then it is time for those leaders to clean up their own house and stop treating students as consumers.

But I have no expectation at all that anything of the sort will happen.  After all, their own entrepreneurial success is grounded in wiping out the competition and treating human beings as markets: in short, in destroying the conditions that foster entrepreneurship.


The Colbert Report

posted: 4.24.14 by Jack Solomon

How do you know when an entertainment event is a cultural signifier?

Easy: it’s when Rush Limbaugh asserts that “it has just declared war on the heartland of America.”

That, anyway, is what Limbaugh has been widely reported as saying upon word that CBS has hired Stephen Colbert to replace David Letterman on CBS’s Late Show.  And while Limbaugh’s declaration may be the most spectacular of responses to Colbert’s hiring, it is but one of a virtually endless stream of comments about what is, from one perspective, merely a corporate personnel decision on the part of CBS.  But when such a decision gets this sort of attention, you can be quite confident that it is more than it appears: it is, in short, a sign, and therefore worthy of a semiotic analysis.

In conducting such an analysis, we don’t need to judge either Colbert or any particular response to his new television role.  The point is to analyze the significance both of his hiring and of the reaction to it.  And, as always, we need to begin with some contextualization.

Let’s look first at the system of late-night talk show hosts.  There have been quite a lot of them, but none looms larger than the late Johnny Carson.  As the now legendary host of The Tonight Show, Carson turned that program—and late night TV in general—into an institution, and what that institution once represented in the Carson era (I’m thinking here of the 1960s) was what I can best describe as a kind of laid-back Eisenhower-Republican ethos: a mild-mannered, good-humored Middle Americanism that, in the current political climate, would probably be denounced as “socialistic.”  Carson himself wasn’t political—at least not in any overtly partisan fashion—but in an era when a cultural revolution was convulsing the nation, his cheerfully bland monologues, banter, and let’s-kick-everything-off-with-a-golf-swing manner were a kind of haven for Middle American adults: the parents of those hell-raising baby boomers who regarded The Tonight Show as just another “plastic” signifier of an “Establishment” that they wished either to escape or transform.

That’s why it was so significant that, when Carson retired, such edgier hosts as Jay Leno (who got Carson’s job) and David Letterman (who essentially took Carson’s place, though on a different channel) took over the late night watch.  These hosts—especially Letterman—were chosen precisely because they appealed to those edgier young baby boomers who, during the 1970s, had ceased to disdain late night talk shows and had become a coveted audience for them.  Though hardly an earth-shaking development, the ascendancy of Letterman was but one of many signifiers by the early 1980s of a certain mainstreaming of the cultural revolution.  The conservative backlash to the 1960s may have captured the White House through twelve years of Reagan-Bush, but within popular culture, at least, one of the last bastions of Middle Americanism had fallen, so to speak, to the left.

Today’s choice of an even edgier entertainer—in fact, America’s most wickedly funny satirist of right-wing media—to replace Letterman thus signifies an intensification of the trend.  The Colbert Report is one of the current youth generation’s favorite television programs, and CBS’s choice of Colbert among so many other powerful contenders (Tina Fey, anyone? Amy Poehler?) most certainly appears intended to capture the millennials as an audience for its post-Letterman Late Show.

I think that Limbaugh, then, is wrong to attribute either cultural or political motivations to CBS’s choice.  CBS’s motivations, like any motivation in a commercial popular culture, are simple: the company wants to make money, and it has judged that by appealing to millennials it will do just that.  In other words, CBS isn’t “declaring war” on anyone.  Still, I would agree with Limbaugh (did I just say that?!) that the choice of Colbert has political significance.  After all, Colbert’s whole shtick lies in a very partisan (much more partisan than Letterman, and even more so than Jon Stewart) ridiculing of some of the icons of conservative media, and that’s political.

For me, the fact that CBS apparently feels that it can safely choose an entertainer as potentially divisive as Stephen Colbert to rule the castle that Carson built (albeit on another network) is what is most significant here.  The cultural revolution has rolled along to a new stage.  In fact, it may well be one of the most potent signs today of the changing demography that has been worrying Republican strategists as they seek ways of recapturing the White House.  Whether or not the Colbert-hosted Late Show becomes a popular hit remains to be seen (as I have hinted above, a woman host might have been a better idea, and, after all, there is still a sizable Middle American component to the audience for late night talk TV and it might not take to Colbert), but the significance of this choice will remain.  The Democrats, so to speak, have won this round, and they never even had to enter the ring.

So hail to the new satirist-in-chief.  I wonder how long his term will run.
