Converse ‘Till It Hurts

Contentious, formerly an e-newsletter devoted to issues of content development and delivery, went blog last week after a year-long hiatus. And this reader just isn’t interested. The reasons are several: the hurried, unstructured prose that characterizes many blogs; the technical hurdles that RSS and blogging software still represent; the difficulty of searching for and finding content easily (although I wonder how wrong it is to expect websurfers to know their software, at least enough to use “find in page”); and so on. What strikes me most, though — and it’s a complaint I’ve seen in other forms frequently around the web — is the following comment:

I find blogs to be generally unstructured and difficult to find things in, especially with unchecked and/or unmoderated comments sections. How many megabytes of, “Ooo, me too!” should I be expected to look through to find someone who may have made a useful addition to the conversation?

OK, there’s no accounting for matters of personal taste, but it seems to me that this kind of sentiment is less about taste than about a deep-seated fear of conversations. I’m the first to admit that comments can turn into mob scenes, but they can also turn into remarkably cogent discussions, bringing to bear expertise that the blog-owner didn’t have or didn’t know how to bring into play. For instance, an incredible discussion was launched at Alas, a Blog in response to a post (which, alas, seems to be no longer at the URL I bookmarked at the time) on transsexuality, feminism, and gender construction. A real conversation developed, with commenters drawing on their experiences as transvestites, transsexuals, homosexuals, and heterosexuals from all over the political spectrum in dealing with one of the fundamental — and thus most uncomfortable — ambiguities we are confronted with in modern life.

Conversations scare people, I think, for the same reason transvestism and transsexuality do — conversations are ambiguous, shifting, never quite “fixed” in place, maybe — like the Internet — always already broken. There is no predicting just what will happen in even the most banal conversation. All over the world, marginalized groups — African-American urban teenagers, Yemeni tribesmen, Brazilian working men, and others — have elevated to an art the practice of raising the stakes in a conversation higher and higher until just before violence erupts. A tricky proposition, to be sure, and if you play the game wrong, you end up with a busted jaw, or worse.

Conversations are, by their nature, not only unpredictable but uncontrollable. We may fool ourselves with etiquette and political correctness, but we know that even within the lines we define as “proper” a lot of damage can be done — and we also know that those lines are more “suggestions” than real barriers. This unpredictability and uncontrollability are part and parcel of conversation. Although anthropologists recognize a “phatic” function of speech (the exchange of formalities being a good example, where following the form is far more important than the content of the exchange, the intent being to recognize and reinforce social bonds rather than to communicate knowledge; the “me toos” the letter-writer above cites are also an example of phatic speech), most conversation is meant to work through or towards the unknown — whether that’s what we should get for dinner or what kind of society we should live in. Our partners introduce even more unknowns — we never know precisely what someone might choose to take offense at (try it with your significant other some time…) or exactly what they know or don’t know. So conversations involve risks, and danger, and for that they are pretty scary. But consider the alternative — the one-way flow of information that characterizes most media communication. Is there any less risk or danger? Well, given the results of PIPA’s recent study, “Misperceptions, The Media and The Iraq War”, showing a positive correlation between watching Fox News and misunderstanding basic details about the war in Iraq, perhaps the risks are just as great — maybe greater.

More to the point, though, conversation is what makes us us. It is at the same time the source of information about acting in the world and action in the world. Conversation is where we draw the raw material of our selves and is the field in which we shape ourselves from that raw material. This fear of conversation, of “what they might say”, seems to me a rejection of our humanity itself — but that’s what scares me. Because as far-fetched as it may sound, it seems to me that there is a faction in modern society that would like us to be a little less human, a little less active, a little more like machines or robots. And, yes, a little less conversational.

Blogging Into the New Age

The Columbia Journalism Review offers a special report on The New Age of Alternative Media this month, with several articles on blogs and blogging. Highlights include:

  • From “Blogworld and Its Gravity”:

    [T]hese amateurs, especially the ones focusing on news and current events, are doing some fascinating things. Many are connecting intimately with readers in a way reminiscent of old-style metro columnists or the liveliest of the New Journalists. Others are staking the narrowest of editorial claims as their own — appellate court rulings, new media proliferation in Tehran, the intersection of hip-hop and libertarianism — and covering them like no one else. They are forever fact-checking the daylights out of truth-fudging ideologues like Ann Coulter and Michael Moore, and sifting through the biases of the BBC and Bill O’Reilly, often while cheerfully acknowledging and/or demonstrating their own lopsided political sympathies. At this instant, all over the world, bloggers are busy popularizing underappreciated print journalists (like Chicago Sun-Times columnist Mark Steyn), pumping up stories that should be getting more attention (like the Trent Lott debacle), and perhaps most excitingly of all, committing impressive, spontaneous acts of decentralized journalism.

  • “The Media Go Blogging”, a short guide to blogs associated with major media outlets.
  • “Killer Apps”, a too-short examination of the technological forces driving blogging (and vice versa). “In 1999 there were dozens of blogs. Now there are millions. What happened?”
  • From “Terms of Authority”, a look at the fate of authority in these days of distributed intelligence, comes this description of “the public” as an idea:

    The public is an idea because it takes imagination to conceive of such a thing – the great mass of people spread out over the nation but in touch with the same events, leading private lives but paying public matters some attention. It becomes more than an idea when people act on it, as Jay Leno does in his nightly monologue on the day’s news: “You all saw this, right? . . .”

I’m not one of these people who salivates over every mention of blogging in the press, or who even cares whether blogging ever achieves “mainstream success” (whatever that is). But the CJR is a smart magazine, and the pieces on alternative media included in this special report show it. Blogging — and publishing alternative weeklies, and writing ‘zines, and broadcasting commercial radio or television — isn’t important as a demonstration of technological proficiency in and of itself. Media is, in our modern mass society, our way of being society. It is literally and redundantly the medium in which we live and the mediator between us as individuals and us as millions of people sharing the world-space we live in. Blogging is exciting right now, in some cases because it’s new, but also because it is, finally, and after much, much hype, a way for individuals to actively engage media, to control, even in a small way, the medium of their identities. And that’s no small thing.

Men Are the New Children

The infantilization of women is well-enough known to be a cliché, from “protective” male chauvinists to the sexual standard set by Britney Spears and Christina Aguilera. In the interest of fair play, apparently, a German bar has opened a “Männergarten” — a place for women to drop their men-children off while mommy-wife goes shopping:

The living room-sized space at one end of the Nox bar in central Hamburg looks like it’s been perfectly equipped as a children’s daycare center. There are comic books spread out on tables, comfortable couches, a remote-controlled car, plastic toys and even a playpen of sorts with a construction set. It’s only when you catch a glance at the copies of Penthouse and Playboy scattered about that you realize this is not your average kids’ area.

In fact, children aren’t allowed here; women aren’t either. The Nox bar has set aside this room for men only. More precisely, for men who have no desire to tag along with their wives or girlfriends while they look for skirts, scarves and handbags in the designer boutiques in Hamburg’s premier shopping district.

For 10 Euros (about $11.65 US), the “boys” get a meal, 2 beers, and all the laddish entertainment their tiny little heads can absorb before they have to go nappy-time.

Bars in Cologne, Munich, and Berlin are planning their own Männergartens.

What Happened to Me?

To those who have visited repeatedly over the past month, I must apologize for my long absence. I’ve taken on two jobs, one part-time for the local library district (scheduling conference room and theater usage) and the other at the local community college (teaching “Intro. to Cultural Anthropology”). The first pays the bills, the second is my first big step towards a career as an academic. Between class preparation and grading papers for one job and the other job’s irregular schedule (especially at the moment, as my colleague in the department is out on eight weeks of sick leave recovering from an operation), I’ve had little time or, more importantly, energy to sit down and even surf the Internet, let alone write for this site. Even when I have come across something I wanted to talk about (Johnny Cash’s death, for instance), so much time has elapsed before I could set aside an hour or two to write that the air of immediacy disappears — I might as well write about how deeply saddened I was by the death of U.S. Grant at this point. (OK, Grant was a dick, and Cash far from it, but you get the point — it was a looooong time ago, now.)

I can’t really say things will improve, not in the short term anyway. I’m likely to be busier and busier as the semester progresses. In my mind there’s the sense that I could at least keep up somewhat if I kept my posts to a more reasonable length, but there’s also the realization that this is unlikely to happen. So I’m left with the unsatisfying position of announcing that OneMansOpinion.org will remain open and active, but with a much reduced level of activity on my part. Over the long term, I hope that this will continue to be a productive outlet for me — I’m pretty proud of some of the work I’ve done here, and hope to have the chance to be proud of future work as well.

Are Martin Sheen and Rob Reiner My Friends?

In the past couple of days, I’ve received e-mails from Martin Sheen and Rob Reiner asking me to support Howard Dean’s campaign for the presidency. OK, not directly from these illustrious personages — both e-mails were sent with the same return e-mail address, info@deanforamerica; only the “Sender’s Name” field was changed to reflect the messages’ origination from on high. And I’m pretty sure that text like the following originated in committee rather than from Martin Sheen’s fevered imaginings:

We need Howard Dean’s bold leadership in the White House. Strong fundraising keeps our campaign’s momentum going; it pays for much-needed media buys in key battleground states and it attracts new supporters to Howard Dean. And continuing strong fundraising will also help with the campaign’s latest bold move: hiring a coordinator for each of Iowa’s 99 counties.

(Incidentally, do we need another “bold” president?)

What happened with Dean’s campaign? I gave kudos some time ago to Dean’s campaign for “getting it”, Internet-wise — for using the Internet in smart and creative ways to reach out to American voters and help not just to promote a presidential candidate but to actively realize a community of democratically-minded citizens. So it saddens me to see these previously clever and progressive (in every sense of the word) campaigners stooping to the level of spam — bombarding my in-box with pleas for support, altering reply-to information to increase the likelihood of my reading their latest missives (which wouldn’t be an issue if they weren’t already sending me so much e-mail that I don’t have the time, let alone the inclination, to read it all).

Maybe the Cluetrain has left Dean’s station for destinations unknown. What I’m beginning to see is not the logical next step in a campaign to use the Internet in new and exciting ways to contribute to the ongoing construction of a democratic civil society, but a retreat into corporate-speak and corporate-think, the replacement of innovation with marketing. Instead of an attempt to involve me in Dean’s campaign, or in progressive politics as a whole, I am just a consumer of ideology at the end of a long, impersonal e-mail pipe. “Give ’em some Martin Sheen — polls show that Americans trust Sheen, he transmits a presidential image that we can associate with our product, er, we mean, candidate.” Quite frankly, I’m being broadcast to, and I don’t like it!

As the predatory manhunter Helen says in the movie Singles, “Desperation is the worst perfume.” Is the stink rolling off these latest campaign efforts from the Dean camp a sign of a campaign that’s already losing steam, maybe shaken by the entry of another center-oriented liberal candidate with greater name recognition and 4 shiny stars on his shoulder bars? We’re still 13 months off from the election, 11 or so from the nominating convention — have progressive Democrats used all the tricks they had up their sleeves already? More importantly, having opened up the channel of communication (even while hiding behind the deanforamerica.com e-mail address), can I now consider Sheen and Reiner my friends? Could I borrow money from them? How about their cars?

We Are All Postmodern

I am often surprised by the scorn that the term "postmodern" (and its variations) meets with, both in academia and in the general population. I find that "postmodern" is a term that is "bandied about" quite a bit without much substance or conviction behind it, in much the same way that a secularist like myself will yell out "Damn you!" without actually considering him- or her- or myself to be imploring a vengeful deity to consign one’s interlocutor to Hell. Although I don’t consider myself much of an expert on the topic — I’m rather an interested follower — I figured I’d take a crack at, if not nailing it down (a decidedly non-postmodern thing to do), at least injecting some meaning into the word(s).

"Postmodernism" is hard to define–made all the harder by postmodernism’s rejection of fixed definitions. We might start by at least sketching out some rough borders, by laying out a few things that postmodernism is not. It’s not a time period, despite the "post-" prefix. Although certain of its trends or tendencies date to the early ’70s, postmodernism and modernism exist since then more or less side by side, and elements of postmodernism can be found as far back as the mid-19th century (many of the features of capitalism outlined by Marx we today see as hallmarks of postmodernism). Its also not "outside" of modernity or modernism — postmodernism is more fruitfully thought of as a "phase" of modernity, or even as a "quality" of the modern. It is not "anti-modern" or "non-modern" or anything else–it is part of the modern.

Because the "postmodern" label was first applied to architecture, from which the label spread to other domains, it bears examining just what postmodern architecture is. The postmodern architects rejected the unadorned glass-and-steel boxes of the modernists in favor of an architecture that was more sensitive to concerns of place and use. Unlike the modernists who viewed their sleek celebrations of the latest building techniques as suitable to every milieu, the postmodernists designed buildings that explicitly referenced local traditions, local history, and the local landscape. They also rejected the modernists strict separation of habitable space from the functioning "innards" of the building, exposing air conditioning and heat ducts, plumbing, light fixtures, elevator cables, and other elements that the modernists had hidden from casual view. The "Discovery" in 2001 is modernist; the "Alexi Leonov" in 2010 is postmodernist. The difficulty in clearly delineating "modernism" and "postmodernism" is clearly apparent in the postmodernists appropriation of local landscapes, referencing the work of ur-modernist Frank Lloyd Wright.

Postmodernism soon began to "pop up" in domains far removed from architecture. In literature, post-structuralists like Derrida and Foucault set the stage for postmodernism in their challenges to narrative and discursive authority in the text. Where the New Criticism had separated the text from the life and times of its author, Foucault insisted that the text be understood not simply as influenced by the culture around it, but as a part of that culture, capable of exerting its own influence in the world. Derrida went a step further, privileging the multiplex readings any text can give rise to over the author’s intentions. In challenging the idea that a narrative should give rise to a limited set of meanings reflecting an author’s intentions, Derrida came up against the larger narratives that (attempt to) give meaning to entire societies, the so-called Master Narratives. If the internal structure of a narrative like Lévi-Strauss’s Tristes Tropiques could be upset or "put into play," as Derrida showed in Of Grammatology, so that meanings that seemed apparent were turned upside down and inside out until they seemed to mean the opposite of what the author intended, so too could the structure of society and of the narratives — the myths, histories, and "common sense" ideas — that structured it.

In political economy, the term was best invoked (in my opinion) in David Harvey’s The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Harvey uses "postmodern" to describe the economic system characterized by flexible accumulation, and in opposition to "modern" Fordism (the expansion of Taylorist production methods into a full-blown social order). Fordism represents not just the technical model of production perfected by Ford (the assembly line, the fine-grained deskilling of manufacturing jobs, the Taylorist disciplining of individual labor, products aimed at a single mass market, and so on) but the creation of a consumer market in which Fordist production could flourish. Ford’s belief in a body of workers who could afford the products they made, his erection of worker housing complete with company-paid recreation directors, his purchase of and editorship over the newspapers his workers would read, and other acts worked to ensure that off-hours Ford workers spent their leisure time within the enveloping culture of the company itself; all of this recognized that workers were also consumers. In conjunction with a Keynesian state that carried the burden of the surplus labor force needed to accommodate American industry’s rapid but uneven growth following WWII, Ford’s model of an affluent worker/consumer society dominated American economic life until well into the ’60s.

With the development of a host of new technologies, though — computers to handle inventory and order fulfillment, automated production machinery to reduce the "changeover" time needed to gear up for a new product, television advertising and a sophisticated marketing industry to help target products more efficiently to the people most likely to buy them — this model began to be eroded by a new model for organizing labor and consumption, flexible accumulation. The new technologies greatly reduced the economies of scale that Ford and his cohort had exploited to build their empires. Instead of offering one model in one color to the entire population, it was now profitable to offer several models, or even several different kinds of products, to several smaller "niche markets". On-demand production not only made it easier to retool quickly to produce another product but also reduced the risks associated with production by filling orders as they came in, rather than producing a massive inventory up front, before demand could be assessed. The most successful companies no longer produce anything at all — companies like Nike don’t own a single factory, instead controlling a web of contractors and sub-contractors. The everyday tasks of doing business — payroll, order fulfillment, procurement, maintenance, and marketing — are increasingly outsourced. The effect on labor has been marked: where the "model" worker of the ’50s and ’60s might have looked forward to life-long employment, union representation, and a fixed pension, the "model" postmodern worker is a temp or independent contractor with (if s/he’s lucky) a 401k or IRA which (again, if s/he’s lucky) might be worth something when s/he reaches retirement age (and, if not, there’s always room for another part-time greeter at the local Wal-Mart).

The change in production techniques was paralleled by a change in consumption, too. With the social fragmentation of the ’60s, both the rise of ethnic identity politics and the nascent Me Generation’s clamor for ways to stand out from "the rest of the crowd", there no longer existed any single market. Instead of "keeping up with the Joneses", consumers of the early ’70s sought ways to distinguish themselves from the Joneses. Ethnic groups, having long subscribed to modernist universalism as the ticket to equality, began to celebrate their differences from the white majority, and producers stepped in to meet this growing need. Rather than a single mass market, there emerged a multitude of niche markets, smallish "clumps" of demand often formed in direct opposition to other niches.

As in architecture, the arts also experienced a postmodern opposition movement, set off in the early ’70s by the feminist movement and pop art and later expressed through the incorporation of folk art and other "outsider art". In something of a reversal of postmodernism in literature, in the arts postmodernism entailed a return to narrative following the modernists’ total rejection of narrative (along with figuration). If the color field paintings of Rothko or the drip paintings of Pollock stand as the height of modernism, the paintings of Lichtenstein stand explicitly as "slices" of a hinted-at narrative. The total negation of self in the finished work of the modernists (one critic, though which one I can’t remember at the moment — maybe Greenberg — defined modernism exclusively in terms of the diminishing visibility of brushstrokes, the mark of the painter’s presence preserved in the surface of the finished work — prompting Lichtenstein to reply with a series of paintings depicting brushstrokes) was rejected by women and minority painters who explicitly drew on their own life histories and ethnic identities for source material, as in Basquiat‘s incorporation of urban street graffiti, Black history, and personal life events. Another difference between modernist and postmodernist art lay in the attitude towards the medium. Modernist painters like Pollock reveled in the sheer "mediumness" of their medium, essentially making paintings of paint. Their postmodern descendants include photorealists like Chuck Close and Richard Estes, whose work so closely emulates the world around them that it is hardly recognizable as painting at all.

Irony is a common thread running through all domains of postmodern life (and rumors of its death have been, I fear, greatly exaggerated). A sense of ironic detachment is a central survival adaptation in a world dominated by mass media, market-speak, and profit-driven newsrooms. The postmodern citizen lives in a world constructed almost wholly of lies and image: 4 out of 5 dentists do not recommend sugarless gum for their patients who chew gum (nobody ever asked them), the newest toy advertised on Saturday morning will not do half the things it is shown doing, drinking beer will not make you more attractive, whatever product is mentioned in the 11 o’clock news program’s scare segment will not kill you or scar your children, neither Cosmo nor Maxim will make you a better lover, neither "the leading brand" nor its competitor will get grass stains out of your pants, and no matter what you buy you will generally not be a happier or healthier person because of it. We live in a world in which the only choices are products and brands that have made ridiculous claims that we know to be false, but we also have to eat, drink, wash our clothes, and so on. How else but ironically can we buy Palmolive dish soap when we know, absolutely know, that it will not make our hands softer?

This goes somewhat deeper than just our response to consumer products. The rise of the self-help movement, for instance, and its cousins in New Age philosophy and Oprahaic therapy has filled our everyday vocabulary with a host of platitudes and clichés that we know are simply not true but which are all we have with which to construct our verbal responses to the tragedies that befall ourselves and others. How can we loudly and repeatedly profess our disdain of money — "Money isn’t the only thing in life", "Money can’t buy happiness", "Money is the root of all evil", "The best things in life are free", and so on — when almost all our actions in life are focused on money? Irony. Irony allows us to refrain from laughing when a supervisor describes our workplace as a "family" and speaks of loyalty to a company — a company that, we know, will lay us off without the slightest hesitation if a reduction of "redundancies" (i.e. workers) will boost the next quarter’s profit margin. In the postmodern world, earnestness is lethal — to truly believe in something, without the slightest reservation, is to make oneself vulnerable to all sorts of disappointment, from the pain of the "painless" Epilady to the shock of finding one’s insurance cancelled as one faces a grave illness.

The thing that comes to most people’s minds in response to postmodernism is the jargon. Although postmodernism describes a much wider swath of social life than just the limited academic movement, it is the academic work struggling to comprehend and explain the postmodern that has become most associated with postmodernism. And that work is, by and large, admittedly difficult reading. A host of "hegemonies" and "always-alreadies" and "subjectivities" and other big-dollar words tend to scare off all but the most committed readers. Alan Sokal’s postmodern hoax, in which a completely fabricated scientific paper heavy with postmodernisms was accepted and published, stands as the model of how postmodernist jargon is used to hide a lack of substance and insight — which is not a totally fair assessment. Although surely a lot of hot air is made intellectually acceptable by postmodernist language, I would venture that this is no more common than it has been in any other field of endeavor, from medieval scholasticism to Enlightenment-era scientism to 19th century scientific racism to high modernism. Tom Wolfe’s The Painted Word mocks the pretentious language of the high modernist art critics (and patrons of Pollock, Rothko, Jasper Johns, and the other great modernist painters) Harold Rosenberg, Clement Greenberg, and Leo Steinberg, accusing the modernists, as Sokal did the postmodernists, of using fancy language to hide the total lack of content in their favored art movements.

The denseness of postmodern theory comes from a number of sources. Of course, it’s theory, and theory can be dense, no matter what the field. The same problems we might have reading, say, Baudrillard we would also have reading advanced mathematical theory, or physics. It’s just not always possible to describe complex systems like society, culture, or economy in a simple, straightforward manner. But I think a lot of the trouble comes from other factors. First, a lot of postmodern theory comes from specific fields — philosophy, literary criticism, social science — each of which has its own specialized style and vocabulary. Second, a great deal comes from France, which presents two problems for English-speaking readers. The French intellectual system is very different from that of the British, American, Canadian, and most other English-speaking peoples (with India being a notable exception, and Indian theory is, likewise, rather dense for non-Indian tastes). More important, I think, is that complex ideas in French tend not to translate well into English, and translators tend to make choices that act as barriers to English speakers. Third, there’s the politics of academia — like it or not, for better or worse, knowingly or unknowingly, academics erect boundaries around their tiny domains of expertise, opening the gates only to those with the determination and talent to learn the "secret language" of the masters. Fourth, in a field founded on Derrida’s distrust of narrative and Foucault’s distrust of discourse, postmodern theorists are prone to experiment with language, to write in ways that often intentionally deny any fixed interpretation, to leave much of their argument implicit and open to interpretation. Sometimes this is pretentious — "Let’s see what they make of this!" — but it can as easily be a product of humility — "How can I pretend to have the answers in a world where there are no answers?".

Postmodernism is, by and large, a reaction to the universalizing tendencies of modernism. In its attention to the particularities of place, identity, and context, postmodernism denies the basic modernist orientation towards "Man" (or "humanity"). When Hollywood made The Diary of Anne Frank into a movie in the ’50s, it expunged or downplayed nearly all the "Jewish" details in favor of a story with a universal message. The Holocaust was not bad because of what it did to Jews, and especially not for what it did to the specific young woman on whose diary the movie was based, but because Anne Frank could be any of us (as could her oppressors, as Hannah Arendt showed in another modernist work, Eichmann in Jerusalem). Today, Jewish people such as Elie Wiesel jealously protect the Jewish specificity of the Holocaust (and the Broadway version of The Diary of Anne Frank restores Anne’s Jewish specificity to the tale). At the root of postmodern society is the growing recognition that what is right in one place at one particular time may not be right in another place or time. This is, I propose, deeper than the easy (and all-too-often dismissive) label "relativism" suggests, implying a kind of deeply-situated pragmatism in our relations with the world. In a globalized world without the benefit of a clear-cut path to "salvation" (or "success", for that matter), we find ourselves forced to reckon with all the varied ways of doing things — with the full knowledge that, though they may differ from our own, they may well be better, or at least as effective as the ways that are familiar to us. Thus a kind of ironic detachment emerges even towards our own thoughts and actions — we must act as if we knew our way was the "right" way of doing something, or risk a total paralysis. But we do so always knowing that others might do things far differently, and that in another context our own ways might not only appear strange but become, in fact, useless.

Finally, I conclude this piece with some random thoughts that occurred to me as I thought all this through, offered as illustrations of a sort. I haven’t worked through all of the implications of these thoughts; for the most part, they just "feel" right:

  • For modernists, the medium is the message; for postmodernists, the message is the medium.
  • Modernists believe that racism will end when humanity accepts its similarities; postmodernists believe that racism will come to an end when humanity accepts its differences.
  • Chock Full O’ Nuts is modern, Starbucks Blue Mountain Blend is postmodern.
  • James Joyce’s Ulysses is both modern and postmodern. Go figure!
  • "Leave it to Beaver" is modern; "Sesame Street" is postmodern.
  • The American Melting Pot is modern; the American Salad Bowl is postmodern. Ironically, though, most initiatives that claim to celebrate the American Salad Bowl are entirely modern.
  • Al Jolson’s Jazz Singer is modern; John Zorn’s Masada is postmodern.
  • Free Market ideology, with its focus on individual choice and competition, tends to be postmodern, but the corporations and politicians that advocate the Free Market tend to be very, very modern.
  • Jazz is modern; Hip-Hop is postmodern.
  • Doctor’s orders are modern; support group-driven research is postmodern.

We Are All Postmodern

I am often surprised by the scorn that the term "postmodern" (and its variations) meets with, both in academia and in the general population. I find that "postmodern" is a term that is "bandied about" quite a bit without much substance or conviction behind it, in much the same way that a secularist like myself will yell out "Damn you!" without actually considering him- or her- or myself to be imploring a vengeful deity to consign one’s interlocutor to Hell. Although I don’t consider myself much of an expert on the topic — I’m rather an interested follower — I figured I’d take a crack at, if not nailing it down (a decidedly non-postmodern thing to do), at least injecting some meaning into the word(s).

"Postmodernism" is hard to define — made all the harder by postmodernism’s rejection of fixed definitions. We might start by at least sketching out some rough borders, laying out a few things that postmodernism is not. It’s not a time period, despite the "post-" prefix. Although certain of its trends or tendencies date to the early ’70s, postmodernism and modernism have existed since then more or less side by side, and elements of postmodernism can be found as far back as the mid-19th century (many of the features of capitalism outlined by Marx are today seen as hallmarks of postmodernism). It’s also not "outside" of modernity or modernism — postmodernism is more fruitfully thought of as a "phase" of modernity, or even as a "quality" of the modern. It is not "anti-modern" or "non-modern" or anything else — it is part of the modern.

Because the "postmodern" label was first applied to architecture, from which it spread to other domains, it bears examining just what postmodern architecture is. The postmodern architects rejected the unadorned glass-and-steel boxes of the modernists in favor of an architecture that was more sensitive to concerns of place and use. Unlike the modernists, who viewed their sleek celebrations of the latest building techniques as suitable to every milieu, the postmodernists designed buildings that explicitly referenced local traditions, local history, and the local landscape. They also rejected the modernists’ strict separation of habitable space from the functioning "innards" of the building, exposing air conditioning and heat ducts, plumbing, light fixtures, elevator cables, and other elements that the modernists had hidden from casual view. The "Discovery" in 2001 is modernist; the "Alexei Leonov" in 2010 is postmodernist. The difficulty in clearly delineating "modernism" and "postmodernism" is readily apparent in the postmodernists’ appropriation of local landscapes, which references the work of ur-modernist Frank Lloyd Wright.

Postmodernism soon began to "pop up" in domains far removed from architecture. In literature, post-structuralists like Derrida and Foucault set the stage for postmodernism in their challenges to narrative and discursive authority in the text. Where the New Criticism had separated the text from the life and times of its author, Foucault insisted that the text be understood not simply as influenced by the culture around it, but as a part of that culture, capable of exerting its own influence in the world. Derrida went a step further, privileging the multiplex readings any text can give rise to over the author’s intentions. In challenging the idea that a narrative should give rise to a limited set of meanings reflecting an author’s intentions, Derrida came up against the larger narratives that (attempt to) give meaning to entire societies, the so-called Master Narratives. If the internal structure of a narrative like Lévi-Strauss’ Tristes Tropiques could be upset or "put into play", as Derrida showed in Of Grammatology, so that meanings that seemed apparent were turned upside down and inside out until they seemed to mean the opposite of what the author intended, so too could the structure of society and of the narratives — the myths, histories, and "common sense" ideas — that structured it.

In political economy, the term was best invoked (in my opinion) in David Harvey’s The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Harvey uses "postmodern" to describe the economic system characterized by flexible accumulation, in opposition to "modern" Fordism (the expansion of Taylorist production methods into a full-blown social order). Fordism represents not just the technical model of production perfected by Ford (the assembly line, the fine-grained deskilling of manufacturing jobs, the Taylorist disciplining of individual labor, products aimed at a single mass market, and so on) but the creation of a consumer market in which Fordist production could flourish. Ford’s belief in a body of workers who could afford the products they made, his erection of worker housing complete with company-paid recreation directors, his purchase of and editorship over the newspapers his workers would read, and other acts worked to ensure that off-hours Ford workers spent their leisure time within the enveloping culture of the company itself; all of this reflected his recognition that workers were also consumers. In conjunction with a Keynesian state that carried the burden of the surplus labor force needed to accommodate American industry’s rapid but uneven growth following WWII, Ford’s model of an affluent worker/consumer society dominated American economic life until well into the ’60s.

With the development of a host of new technologies, though — computers to handle inventory and order fulfillment, automated production machinery to reduce the "changeover" time needed to gear up for a new product, television advertising and a sophisticated marketing industry to help target products more efficiently to the people most likely to buy them — this model began to be eroded by a new model for organizing labor and consumption, flexible accumulation. The new technologies greatly reduced the economies of scale that Ford and his cohort had exploited to build their empires. Instead of offering one model in one color to the entire population, it was now profitable to offer several models or even several different kinds of products, to several smaller "niche markets". On-demand production not only made it easier to retool quickly for another product, but also reduced the risks associated with production by filling orders as they came in, rather than producing a massive inventory up front, before demand could be assessed. The most successful companies no longer produce anything at all — companies like Nike don’t own a single factory, instead controlling a web of contractors and sub-contractors. The everyday tasks of doing business — payroll, order fulfillment, procurement, maintenance, and marketing — are increasingly outsourced. The effect on labor has been marked: where the "model" worker of the ’50s and ’60s might have looked forward to life-long employment, union representation, and a fixed pension, the "model" postmodern worker is a temp or independent contractor with (if s/he’s lucky) a 401k or IRA which (again, if s/he’s lucky) might be worth something when s/he reaches retirement age (and, if not, there’s always room for another part-time greeter at the local Wal-Mart).

The change in production techniques was paralleled by a change in consumption. With the social fragmentation of the ’60s — both the rise of ethnic identity politics and the nascent Me Generation’s clamor for ways to stand out from "the rest of the crowd" — there no longer existed any single market. Instead of "keeping up with the Joneses", consumers of the early ’70s sought ways to distinguish themselves from the Joneses. Ethnic groups, having long subscribed to modernist universalism as the ticket to equality, began to celebrate their differences from the white majority, and producers stepped in to meet this growing need. Rather than a single mass market, there emerged a multitude of niche markets, smallish "clumps" of demand often formed in direct opposition to other niches.

As in architecture, the arts also experienced a postmodern opposition movement, set off in the early ’70s by the feminist movement and pop art and later expressed through the incorporation of folk art and other "outsider art". In something of a reversal of postmodernism in literature, in the arts postmodernism entailed a return to narrative following the modernists’ total rejection of narrative (along with figurativity). If the color field paintings of Rothko or the drip paintings of Pollock stand as the height of modernism, the paintings of Lichtenstein stand explicitly as "slices" of a hinted-at narrative. The total negation of self in the finished work of the modernists (one critic — though which one I can’t remember at the moment, maybe Greenberg — defined modernism exclusively in terms of the diminishing visibility of brushstrokes, the mark of the painter’s presence preserved in the surface of the finished work, prompting Lichtenstein to reply with a series of paintings depicting brushstrokes) was rejected by women and minority painters who explicitly drew on their own life histories and ethnic identities for source material, as in Basquiat’s incorporation of urban street graffiti, Black history, and personal life events. Another difference between modernist and postmodernist art lay in the attitude towards the medium. Modernist painters like Pollock reveled in the sheer "mediumness" of their medium, essentially making paintings of paint. Their postmodern descendants include photorealists like Chuck Close and Richard Estes, whose work so closely emulates the world around it that it is hardly recognizable as painting at all.

Irony is a common thread running through all domains of postmodern life (and rumours of its death have been, I fear, greatly exaggerated). A sense of ironic detachment is a central survival adaptation in a world dominated by mass media, market-speak, and profit-driven newsrooms. The postmodern citizen lives in a world constructed almost wholly of lies and image: 4 out of 5 dentists do not recommend sugarless gum for their patients who chew gum (nobody ever asked them), the newest toy advertised on Saturday morning will not do half the things it is shown doing, drinking beer will not make you more attractive, whatever product is mentioned in the 11 o’clock news program’s scare segment will not kill you or scar your children, neither Cosmo nor Maxim will make you a better lover, neither "the leading brand" nor its competitor will get grass stains out of your pants, and no matter what you buy you will generally not be a happier or healthier person because of it. We live in a world in which the only choices are products and brands that have made ridiculous claims that we know to be false, but we also have to eat, drink, wash our clothes, and so on. How else but ironically can we buy Palmolive dish soap when we know, absolutely know, that it will not make our hands softer?

This goes somewhat deeper than just our response to consumer products. The rise of the self-help movement, for instance, and its cousins in New Age philosophy and Oprahaic therapy has filled our everyday vocabulary with a host of platitudes and cliches that we know are simply not true but which are all we have with which to construct our verbal responses to the tragedies that befall us and others. How can we loudly and repeatedly profess our disdain for money — "Money isn’t the only thing in life", "Money can’t buy happiness", "Money is the root of all evil", "The best things in life are free", and so on — when almost all our actions in life are focused around money? Irony. Irony allows us to refrain from laughing when a supervisor describes our workplace as a "family" and speaks of loyalty to a company — a company that, we know, will lay us off without the slightest hesitation if a reduction of "redundancies" (i.e. workers) will boost the next quarter’s profit margin. In the postmodern world, earnestness is lethal — to truly believe in something, without the slightest reservation, is to make oneself vulnerable to all sorts of disappointment, from the pain of the "painless" Epilady to the shock of finding one’s insurance cancelled as one faces a grave illness.

The thing that comes to most people’s mind in response to postmodernism is the jargon. Although postmodernism describes a much wider swath of social life than just the limited academic movement, it is the academic work struggling to comprehend and explain the postmodern that has become most associated with postmodernism. And that work is, by and large, admittedly difficult reading. A host of "hegemonies" and "always-alreadies" and "subjectivities" and other big-dollar words tend to scare off all but the most committed readers. Alan Sokal’s postmodern hoax, in which a deliberately nonsensical paper heavy with postmodernisms was accepted and published by the journal Social Text, stands as the model of how postmodernist jargon is used to hide a lack of substance and insight — which is not a totally fair assessment. Although surely a lot of hot air is made intellectually acceptable by postmodernist language, I would venture that this is no more common than it has been in any other field of endeavour, from medieval scholasticism to Enlightenment-era scientism to 19th-century scientific racism to high modernism. Tom Wolfe’s The Painted Word mocks the pretentious language of the high modernist art critics (and patrons of Pollock, Rothko, Jasper Johns, and the other great modernist painters) Harold Rosenberg, Clement Greenberg, and Leo Steinberg, accusing the modernists, as Sokal did the postmodernists, of using fancy language to hide the total lack of content in their favored art movements.

The denseness of postmodern theory comes from a number of sources. Of course, it’s theory, and theory can be dense, no matter what the field. The same problems we might have reading, say, Baudrillard we would also have reading advanced mathematical theory, or physics. It’s just not always possible to describe complex systems like society, culture, or economy in a simple, straightforward manner. But I think a lot of the trouble comes from other factors. First, a lot of postmodern theory comes from specific fields — philosophy, literary criticism, social science — each of which has its own specialized style and vocabulary. Second, a great deal comes from France, which presents two problems for English-speaking readers. The French intellectual system is very different from that of the British, American, Canadian, and most other English-speaking peoples (with India being a notable exception, and Indian theory is, likewise, rather dense for non-Indian tastes). More important, I think, is that complex ideas in French tend not to translate well into English, and translators tend to make choices that act as barriers to English-speakers. Third, there’s the politics of academia — like it or not, for better or worse, knowingly or unknowingly, academics erect boundaries around their tiny domains of expertise, opening the gates only to those with the determination and talent to learn the "secret language" of the masters. Fourth, in a field founded on Derrida’s distrust of narrative and Foucault’s distrust of discourse, postmodern theorists are prone to experiment with language, to write in ways that often intentionally deny any fixed interpretation, to leave much of their argument implicit and open to interpretation. Sometimes this is pretentious — "Let’s see what they make of this!" — but it can as easily be a product of humility — "How can I pretend to have the answers in a world where there are no answers?"

Postmodernism is, by and large, a reaction to the universalizing tendencies of modernism. In its attention to the particularities of place, identity, and context, postmodernism denies the basic modernist orientation towards "Man" (or "humanity"). When Hollywood made The Diary of Anne Frank into a movie in the ’50s, it expunged or downplayed nearly all the "Jewish" details in favor of a story with a universal message. In the film’s telling, the Holocaust was bad not because of what it did to Jews, and especially not because of what it did to the specific young woman on whose diary the movie was based, but because Anne Frank could be any of us (as could her oppressors, as Hannah Arendt showed in another modernist work, Eichmann in Jerusalem). Today, Jewish people such as Elie Wiesel jealously protect the Jewish specificity of the Holocaust (and the Broadway version of Diary of Anne Frank restores Anne’s Jewish specificity to the tale). At the root of postmodern society is the growing recognition that what is right in one place at one particular time may not be right in another place or time. This is, I propose, deeper than the easy (and all-too-often dismissive) label "relativism" suggests, implying a kind of deeply-situated pragmatism in our relations with the world. In a globalized world without the benefit of a clear-cut path to "salvation" (or "success", for that matter), we find ourselves forced to reckon with all the varied ways of doing things — with the full knowledge that, though they may differ from our own, they may well be better, or at least as effective as the ways that are familiar to us. Thus a kind of ironic detachment emerges even towards our own thoughts and actions — we must act as if we knew our way was the "right" way of doing something, or risk a total paralysis. But we do so always knowing that others might do things far differently, and that in another context our own ways might not only appear strange but become, in fact, useless.

Finally, I conclude this piece with some random thoughts that occurred to me as I thought all this through, offered as illustrations of a sort. I haven’t worked through all of the implications of these thoughts; for the most part, they just "feel" right:

  • For modernists, the medium is the message; for postmodernists, the message is the medium.
  • Modernists believe that racism will end when humanity accepts its similarities; postmodernists believe that racism will come to an end when humanity accepts its differences.
  • Chock Full O’ Nuts is modern; Starbucks Blue Mountain Blend is postmodern.
  • James Joyce’s Ulysses is both modern and postmodern. Go figure!
  • "Leave it to Beaver" is modern; "Sesame Street" is postmodern.
  • The American Melting Pot is modern; the American Salad Bowl is postmodern. Ironically, though, most initiatives that claim to celebrate the American Salad Bowl are entirely modern.
  • Al Jolson’s Jazz Singer is modern; John Zorn’s Masada is postmodern.
  • Free Market ideology, with its focus on individual choice and competition, tends to be postmodern, but the corporations and politicians that advocate the Free Market tend to be very, very modern.
  • Jazz is modern; Hip-Hop is postmodern.
  • Doctor’s orders are modern; support group-driven research is postmodern.

They Report (Finally…)

According to an internal EPA report released Thursday evening, “White House officials pressured the agency to prematurely assure the public that the air was safe to breathe a week after the World Trade Center collapse.” Apparently, all EPA statements were vetted through the National Security Council, which is chaired by Mr. Bush, and which “convinced EPA to add reassuring statements and delete cautionary ones.”

For those of us who live (or lived, in my case) in New York City during and after the 9/11 attacks, this is hardly news. WBAI, even in its temporarily eviscerated state, reported almost daily on air quality issues and the EPA’s complicity in covering up the dangers to NYC residents. For weeks, a chemical tang saturated the air, making it difficult to breathe (or, indeed, knowing the source of the stench, to want to). Once WBAI’s news team was restored, Juan Gonzalez made regular reports on the cover-up, covering it as well in the Daily News, such as this article from almost a year ago. People were suffering from a wide range of afflictions, from asthma attacks to nosebleeds to nagging headaches — and being told, over and over, that the air was fine.

Now it’s two years later, and the EPA — moments after alleged conspirator Christine Todd Whitman, the administration’s voice from inside the agency, has left the stage — is finally starting to open up and claim some responsibility for its failure. But what next? Heads should roll — but won’t, not in this administration. People like me and my 10 million fellow New Yorkers have already spent months snorting down this tainted air — it’s too late to change that now. And next time there’s a disaster? Who’s going to believe the air is safe when the President (or George Bush, if he’s still in office) pushes us to get out and shop?

I Report.

“There are hard cases and there are easy cases,” the judge said. “This is an easy case. This case is wholly without merit, both factually and legally.”

So saith U.S. District Judge Denny Chin in dismissing Fox’s request for an injunction against Al Franken.

In an unexpected move, Fox News also carried the story, exactly as written by the Associated Press. Which suggests to me that either a) Fox has realized how embarrassing this whole thing has been for them and feels that running the story is less embarrassing than not running the story, or b) Fox News doesn’t have anyone reading these stories when they come off the wire.

You Decide.

What is it with People?

Over the last couple of days, the daily number of visitors to this site has roughly tripled. Which would normally be nice, but all the new traffic is coming from web searches for information on Ghyslain Raza, the “Star Wars Kid”, whom I’ve written about a couple of times now. 67 of my last 100 hits were searches of this sort.

What’s the big deal all of a sudden? Back when the videos were at the peak of their popularity, I only got a couple hits a day like this–now all of a sudden I’m getting dozens?

Anyway, I think that’s strange. And now you know I think it’s strange. Of course, by writing about this, I am just increasing the likelihood that someone searching for Ghyslain Raza will visit my site. Soon, I hope to be the Internet’s premier clearinghouse for Raza information, all without actually ever posting anything about him. Isn’t the ’Net grand?