Thursday, November 12, 2009

Camille Paglia Bashes Claude Levi-Strauss

In her Salon column this week Camille Paglia spared a few column inches to consider and then completely trash the entire career of Claude Levi-Strauss:

"Continuing on the theme of overrated male writers, I was appalled at the sentimental rubbish filling the air about Claude Lévi-Strauss after his death was announced last week. The New York Times, for example, first posted an alert calling him "the father of modern anthropology" (a claim demonstrating breathtaking obliviousness to the roots of anthropology in the late 19th and early 20th centuries) and then published a lengthy, laudatory obituary that was a string of misleading, inaccurate or incomplete statements. It is ludicrous to claim that Lévi-Strauss single-handedly transformed our ideas about the "primitive" or that before him there had been no concern with universals or abstract ideas in anthropology.

Beyond that, Lévi-Strauss' binary formulations (like "the raw and the cooked") were a simplistic cookie-cutter device borrowed from the dated linguistics of Ferdinand de Saussure, the granddaddy of now mercifully moribund post-structuralism, which destroyed American humanities departments in the 1980s. Lévi-Strauss' work was as much a fanciful, showy mishmash as that of Joseph Campbell, who at least had the erudite and intuitive Carl Jung behind him. When as a Yale graduate student I ransacked that great temple, Sterling Library, in search of paradigms for reintegrating literary criticism with history, I found literally nothing in Lévi-Strauss that I felt had scholarly solidity.

In contrast, the 12 volumes of Sir James George Frazer's "The Golden Bough" (1890-1915), interweaving European antiquity with tribal societies, was a model of intriguing specificity wed to speculative imagination. Though many details in Frazer have been contradicted or superseded, the work of his Cambridge school of classical anthropology (another of whose ornaments was the great Jane Harrison) will remain inspirational for enterprising students seeking escape from today's sterile academic climate."


Now you know I couldn't let that go unanswered! I posted the following comment:

Bashing Levi-Strauss? Really?

As someone who made your academic bones explicating ad nauseam the opposition between Apollonian and Dionysian, I am surprised that you so blithely dismiss Claude Levi-Strauss. To reduce his massive career to a few-sentence caricature implies that you haven't read him carefully or completely.

Even if it's granted that his structural armature was a bit overwrought; even if you discount his visionary explication of Amerindian mythology; even if you deduct from his oeuvre all writings from the 1960’s onwards, at least you can grant him some props for the sense and sensibility of his Tristes Tropiques and let him rest in peace. Just sayin’.

Friday, November 6, 2009

Executive Severance is a Textnovel.Com Editor's Pick!

I recently submitted my in-progress Twitter novel Executive Severance to a site called Textnovel.com which helps fledgling authors like myself get noticed. I'm happy to announce that my story has become an "Editor's Pick"!

Please go to Textnovel.com and vote for my story.

Wednesday, November 4, 2009

Claude Lévi-Strauss and Media Ecology

I recommend two excellent obituaries about Claude Lévi-Strauss, the father of Structural Anthropology, who died this past weekend. The New York Times does a good job summarizing his life and times.

The Guardian does a better job explaining the roots of Levi-Strauss' Structural Anthropology and, I believe, underscoring its importance to Media Ecology. In particular, Maurice Bloch of The Guardian writes:

The basis of the structural anthropology of Lévi-Strauss is the idea that the human brain systematically processes organised, that is to say structured, units of information that combine and recombine to create models that sometimes explain the world we live in, sometimes suggest imaginary alternatives, and sometimes give tools with which to operate in it. The task of the anthropologist, for Lévi-Strauss, is not to account for why a culture takes a particular form, but to understand and illustrate the principles of organisation that underlie the onward process of transformation that occurs as carriers of the culture solve problems that are either practical or purely intellectual.

It seems to me that there is an unspoken assumption in Media Ecology that there are no differences in the intellectual capabilities of peoples of different ages or technological achievement. By this I don't mean differences in sensory balances, which may be determined by the particular technologies or media of communication available, but rather differences in the basic structure and capacity of the human mind.

When we use the terms "oral" or "literate" or "post-literate" in lieu of "primitive" or "modern", we are not referring to intellectual complexity or intelligence, but rather to the modes of thought, the uses of systems of symbols and the religious, social and psychological outlooks encouraged or discouraged by a media environment. In refusing to see the people of cultures without writing (as he called them) as "primitive" or somehow inferior to Western white races, Lévi-Strauss provided the philosophical foundation for McLuhan, Postman and Ong. In a letter to the journal Technology and Culture in 1975, McLuhan acknowledged his debt to Lévi-Strauss' structural methodology for his own Laws of the Media.

If it is possible to distinguish a "primitive" mind from our own then how could we apply Marshall McLuhan's Laws of the Media universally across all cultures and time periods? We can talk about the sensory impact of different types of communication media in different eras only if we accept that the basic mental equipment and the capacity for intellectual activity we are born with has been the same throughout all human history and everywhere in the world. In his exhaustive analysis of Amerindian mythology, Levi-Strauss put the study of human culture on a scientific basis and his work belongs in our Media Ecology foundational canon along with Lewis Mumford, John Dewey and Edmund Carpenter.

Lévi-Strauss wrote:
I therefore claim to show, not how men think in myths, but how myths operate in men's minds without their being aware of the fact.

Isn't this what we Media Ecologists claim in our own studies of how symbol systems and technologies affect human beliefs and activities? Lévi-Strauss discovered and demonstrated connections between seemingly disparate mythic stories, and offered explanations for seemingly random elements of those stories. His methodology can be used as a model for ways to interpret the products of our contemporary culture, which, while seeming to be unrelated, actually constitute our system (or systems) of symbolic meanings.

Rest in peace Professor Lévi-Strauss, and thank you for your life and your work.

Wednesday, September 9, 2009

New New Media by Paul Levinson


Allyn & Bacon, 2009. 240 Pages


As an experienced media ecologist and communication scholar, Paul Levinson brings to his new work, New New Media, a keen insight into the effects of computer-based communication forms. Levinson documents his encounters with various contemporary forms including blogging, wikis, podcasts and social sites like Facebook and MySpace. Along with a multitude of examples from actual web experience, Levinson compares and contrasts the “new new” media with traditional media and suggests how widespread adoption of these new forms will affect existing social institutions and attitudes.

Levinson sets the phenomenon of blogging in both an historical and a media ecological context. To properly understand what is happening on the web today, it is necessary to understand the way differing media have influenced information transmittal over human history. Thus the nature of blogging is comprehensible if we understand the pluses and minuses of oral, print and mass media communication and the impact the various stages of communication development have had on social mores and cultural and political movements.

Levinson distinguishes the “new new” media from previous forms (including the “old” new media) by the relative ease of entry for non-professional content producers and the absence of gatekeepers. Anyone with a keyboard, a monitor and a web connection can become a movie mogul, a music megastar, a political pundit, an investigative journalist or a widely-read novelist. If Levinson is right, the various internet based media are dramatically altering our notions of professionalism, consumerism, artistry and performance.

Expertly conversant on the mechanics of blogging, Levinson presents not just a scholarly survey, but also a how-to for aspiring bloggers. He discusses individual and group blogging, the influence (or lack thereof) of blogging gatekeepers, and the monetization of blogging content. In comparing blogs to books, Levinson provides an easy reference point to which both Millennials and Baby-boomers can relate.

Blogging’s influence on our social institutions is still in the state of becoming. For example, as the traditional print and mass media news outlets decline, whether blog-based investigative journalists can fill the void remains to be seen. Levinson’s discussion of bloggers’ 1st Amendment rights is on target, and I’m sure would inspire some interesting online discussions.

This very immediacy may be the only shortcoming of Levinson’s book. The relevance of many of Levinson’s examples, while appropriate for this current edition, may quickly pass out of the public sphere, and therefore out of contextual significance. While we may still be talking about the “Obama Girl” during the next election cycle, other references may not be familiar to readers in 2012. This is both a strength and weakness of Levinson’s use of hyper-current examples. The references illustrate his points well, but their possible fleeting nature may be a hindrance in the long term. Things change so fast that each new edition of the book may require significant re-writing, or perhaps a migration from the printed page to a hyper-text online wiki edition. This may be unavoidable given the nature of the topic.

Today’s twenty-somethings and younger, members of the so-called “Millennial Generation,” inhabit the world depicted by New New Media. They live in a world where texting, tweeting, blogging, Facebook, MySpace and a myriad of other social media are taken for granted and become the tools used for their interactions with their peers and the outside world. As a member of the “Baby Boomer” generation, I found myself continually checking out Levinson's references to these various social media on my computer. Levinson is deeply involved in many actual aspects of the “new new” media and documents this in his book. So I have viewed his blog pages, read his tweets, listened to some of his podcasts, etc. Though this may seem to non-millennials like an introduction to a disorienting brave new world, Levinson’s down-to-earth discussion of the “new new” media is an effective introduction to the impact of cyberspace structures and institutions on our current media environment.

Monday, July 27, 2009

Today's Rainbow Report

Yesterday's thunderstorms brought a bit o' luck to members of the Forest Hills Tennis Club. As documented in these unretouched (honestly!) photos taken from my apartment window, it is clear that the pot o' gold was to be found at the club!





Friday, July 17, 2009

A Literary Experiment: Twitstery on Twitter

Since May 6th I have been running a literary experiment on Twitter. I've published, 140 characters at a time, a comic murder mystery. The purpose of the experiment is to see if persistent creation of content leads to an increase in "followers" -- and to have fun while doing it.

For those of you who have just joined us, I am re-publishing here the entire Twitstery story to date. Remember that each of these entries was limited to 140 characters or less (including the #Twitstery hashtag), that they appeared on average twice daily (so there is a time factor involved) and that I'm no Mickey Spillane. Feel free to continue following me by searching for "rblechman" on Twitter.
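For the curious, the constraint is simple enough to sketch in code. Here is a minimal Python illustration of packing a story into tweet-sized installments with the hashtag counted against the limit (my own sketch for illustration, not a tool I actually used; the function name and sample text are invented):

```python
def split_into_tweets(text, hashtag="#Twitstery", limit=140):
    """Greedily pack whole words into installments of at most `limit`
    characters, reserving room for a trailing hashtag on each one."""
    budget = limit - len(hashtag) - 1  # one space before the hashtag
    tweets, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= budget:
            current = candidate
        else:
            tweets.append(f"{current} {hashtag}")
            current = word
    if current:
        tweets.append(f"{current} {hashtag}")
    return tweets

# Invented sample text, stands in for the actual story.
story = "He was dead. Of that I was certain. The rest was guesswork. " * 5
for t in split_into_tweets(story):
    assert len(t) <= 140 and t.endswith("#Twitstery")
```

Of course the real discipline isn't the character count; it's making each installment land a beat on its own, twice a day.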

My hunch is that this sort of narrative owes more to the funny pages of classical print newspapers than to the long form fiction narratives of Charles Dickens. As such, my guide has been more Al Capp and less Raymond Chandler. And yes, there's a little bit of Inspector Clouseau and of "Police Squad" thrown in there.

Comments, questions, guesses as to who the murderer is are welcome.

Tuesday, June 16, 2009

Will Revolutionary Geeks and User Generated Content Topple the Ayatollah?


During the 2008 United States presidential election we experienced the first indication of a previously unknown political media ecology. Driven by social media such as YouTube, Facebook, MySpace and Twitter and propagated via computer, cellphone and MP3 player, these elements of what Fordham University professor Paul Levinson has called the “new new media” changed our national political landscape and are now working globally to transform political balances around the world. At home, grassroots organizers for Barack Obama were able to bypass the mainstream media, speak directly to potential voters and orchestrate small-donation fundraising drives on an unprecedented scale. Off-the-cuff comments from candidates captured by portable devices drove news cycles for weeks at a time and changed political fortunes. For example, one instance of George Allen’s career-ending “macaca” video has currently been viewed on YouTube almost 400,000 times. As Levinson notes in his upcoming book, New New Media: “the true or fully empowered new new media user also has the option of producing content, and consuming content produced by hundreds of millions of other new new media consumer-producers.”

Now, with the current election fiasco in Iran, we are seeing the true potential of the new new media. The obviously fraudulent Iranian election outcome might have gone unnoticed and unchallenged in previous political media environments. At the very least, the Iranian ruling powers would have been able to clamp down on information flow by shutting down media outlets and controlling reporters’ access to the events.

Not anymore. Cell phone videos and snapshots of demonstrations and reprisals, “Tweets” with tactical and other organizing information and other new new media reporting have completely trumped Iranian efforts to control the public perception of their election. As Richard Engel noted on the Rachel Maddow Show last night, to control the user generated content of civil protest the Iranian rulers would have to shut down the entire country:

“What the Iranian crackdown is, it’s very old fashioned. They want to control the media so they’re cutting off phones and they’re kicking out established reporters and harassing reporters. That’s very 1980’s, 1990’s way of a media crackdown. It has not helped them control the information war.”

In the 1980’s Neil Postman argued that any new technology disseminated to the populace by our electronic conglomerates constituted an uncontrolled social experiment on society. Every new medium or device presents a Faustian bargain, creating winners and losers within the population based solely on the characteristics of the technology. The new new media change the flow of information from the one-to-many of traditional media outlets to the many-to-many of the internet. Without single chokepoints to block the flow of information, would-be tyrants are finding it difficult to control the narrative of their national political events and the word gets out from multiple sources, with pictures!

The upside of the new new media is that democratic inclinations gain new traction against entrenched despotic institutions. The downside is that turmoil is inevitable as current power holders seek to retain their positions. In our own country this turmoil is played out by the decline and fall of the Republican Party and the not coincidental individual incidents of right-wing violence that accompany that collapse. Overseas, the chaos and destruction may be more pronounced as entire societies react to the potentialities of the new new media and the violence spills out into the streets.

Sunday, May 17, 2009

Maureen Dowd on Cheney's Saturnine Policies

I don't usually post on political events, leaving that task to my betters. Less frequently do I quote Maureen Dowd, whom I find generally writes snark without substance. However in today's New York Times Dowd has hit the nail on the head concerning Republican reactions to Nancy Pelosi's involvement in George Bush and Dick Cheney's lawbreaking:
Nancy Pelosi’s bad week of blithering responses about why she did nothing after being briefed on torture has given Republicans one of their happiest — and harpy-est — weeks in a long time. They relished casting Pelosi as contemptible for not fighting harder to stop their contemptible depredations against the Constitution. That’s Cheneyesque chutzpah.
and
Besides, the question of what Pelosi knew or didn’t, or when she did or didn’t know, is irrelevant to how W. and Cheney broke the law and authorized torture.
President Obama wants to avoid the national gut-wrenching that a full accounting (and the resulting prosecutions) would subject the country to. I submit that without a full accounting and without holding responsible those who committed these crimes in our names, there is no moving forward.

Tuesday, May 12, 2009

In Memoriam - Dr. Leonard Shlain

I've just learned that Leonard Shlain died yesterday after a long struggle with brain cancer. I was first introduced to Dr. Shlain at the 2002 Media Ecology Conference at Marymount Manhattan College in New York City. His talk on The Alphabet Versus the Goddess (website here) was remarkable and I immediately purchased both the book and a video of his lecture. Since that time Dr. Shlain has been a regular attendee at Media Ecology Association and related functions and his contributions to the field are substantial.

Here is a YouTube video of an interview with Dr. Shlain as part of the University of California "Conversations with History" series concerning Art and Science:



From his website:

A celebration of Leonard’s life will be held on Friday, May 15th at 1:00 PM at Sherith Israel Synagogue, 2266 California Street at Webster, San Francisco, CA 94115.

In lieu of flowers, contributions may be made to the Leonard Shlain Scholarship Fund at The Saybrook Graduate School and mailed as follows:

Att: Ed Patuto, Shlain Scholarship Fund
Saybrook Graduate and Research Center
747 Front Street
San Francisco, CA 94111
415.394.5675


My condolences to his family. His death is a great loss to us all.

Thursday, May 7, 2009

Saturday, April 25, 2009

Susan Boyle's Transformation: We Have Met The Ugly Duckling And He Is Us

Turnabout is fair play as Susan Boyle turns Les Mis into Les Millions in her now famous YouTube fairy tale.

The Susan Boyle video clip that currently is reaching new viewer heights on YouTube exhibits signs of post-production tinkering (or at least extensive pre-production planning) that move it from the realm of real-time cinéma vérité to preconceived narrative.
The way Boyle's stunning performance is preceded with shots of her in the waiting room, the contrast of her plebeian appearance with the glamour and celebrity of the judges, even her song choice all create a specific effect. Is it a coincidence that this would-be ugly duckling chose as her performance piece Fantine's swan song from Les Miserables?

I had a dream my life would be
So different from this hell I’m living
So different now from what it seemed
Now life has killed the dream I dreamed.

Imagine after that lengthy and somewhat embarrassing introduction, Ms. Boyle had begun singing "Oklahoma!" or "Luck Be A Lady Tonight!" The audience reaction might have been quite different.
The presentation and contemplation of transformation is a key characteristic of mythology, properly understood. Myths and fairy tales use a magical transformation as a standard narrative device. The ugly duckling transforms into the beautiful swan. The kitchen drudge transforms into the beautiful princess. The frog transforms into the handsome prince. What is different about the Boyle YouTube video, which might be called a Twitter fairy tale, is that it is we, the audience, who are transformed, not the protagonist.
Using multiple shots of the Britain's Got Talent judges, hosts, and audience, this video narrative clearly documents their (and by extension our) transformation from ugly critics to enthusiastic supporters. By contrast, Susan Boyle herself remains unchanged, except in our eyes. This reversal of transformational aspect as a narrative device is what makes this video so compelling, and I believe it could only happen in our television-weaned, computer-enhanced, social networking era.
Particular storytelling techniques shape themselves to the available medium. In distinguishing the "light through" aspect of the video image vs. the "light on" nature of movies, Marshall McLuhan observed that with television (and by extension the computer monitor) the viewer is the screen. New media present opportunities to tell old stories in a new way, and from a different vantage point. The salient feature of this Twitter-Tale is that it replaces the protagonist with the audience as the object of transformation. We have met the ugly duckling and he is us.

Monday, April 6, 2009

A Model Media Ecologist

Under the tutelage of professors Neil Postman, Terry Moran and Christine Nystrom, it was the practice in the 1970's of New York University's Program in Media Ecology to hold annual graduate student conferences where each doctoral class picked one member to deliver a "State of the Class" address.

At the fall 1976 Conference, my own Class of 1977 decided to do something different. I had access to a Sony reel-to-reel black and white Betamax recorder and a camera, and so instead of one class member giving a 30-minute address, each of us in the Class of '77 prepared up to five minutes on videotape of our own personal metaphor for What is Media Ecology? A Model Media Ecologist was my contribution. (I still have the complete video of the Class of '77 if anyone is interested.)

I sang it to the tune of Gilbert & Sullivan's A Modern Major General. I also used a lot of props to add visual humor to the comic lyrics. For instance, when I sang the line "I also know the difference 'tween me and a theologist" I put on a clerical collar. It is worthwhile to click on the link to view the original recording:



I'm proud to say that Casey M.K. Lum has included A Model Media Ecologist at the beginning of his history of Media Ecology, Perspectives on Culture, Technology and Communication: The Media Ecology Tradition published by Hampton Press. No, I don't get any royalties, although I think I should.*

For those of you who haven't already downloaded A Model Media Ecologist from iTunes, here are the lyrics (modified slightly to bring them into the 21st Century):

A Model Media Ecologist

I am the very model of a Media Ecologist
I also sense the difference 'tween me and a theologist
I've read a bit of Mumford and a little of McLuhan
I also have a fair idea what Watzlawick is doing.

Of Levi-Strauss and Jacques Ellul I seem to have a smattering.
The work of Ames and Cantril I am very often flattering.
I'm versed in Systems Theory and in models mathematical
Which I'll dispute with you until the start of my sabbatical.


I know how Shannon-Weaver strove to overcome their channel noise.
I'm well aware that Hayakawa hung out with the Senate boys.
Although it would be better to have been an anthropologist
I am the very model of a Media Ecologist!
I can recite the history of radio and telephone.
As well as why it is Korzybski's ghost is never left alone.
I've studied silent language and the biases of media,
Of Structuralistic notions I'm a real encyclopedia.

I've learned proxemics, kinesics, linguistics, styles polemical.
I know why Greeks were oral and why monks were academical.
Then I'll recite five verses from a Bible made by Gutenberg,
And guess the probability you know the work of Heisenberg.

Why TV is immediate, massaging your right hemisphere,
While functioning discursively is bound someday to disappear.
Although it would be better to have been an ichthyologist,
I am the very model of a Media Ecologist!

When I can tell the difference 'tween "dub" and "dupe" and "master tape";
When I can tell a hot film splicer from a waffle plate;
When showing films or video no longer gets the best of me;
When I can show awareness of the workings of 'lectricity;

When laser beams and holograms no longer seem so magical;
When my attempt to splice a tape does not turn out so tragical;
In short when I've a smattering of modern day technology,
Then I'll feel better saying I know Media Ecology!


For my modern hardware training, though I'm plucky and adventury,
Has only been brought down to the beginning of last century!
Although it would be better to have been a gynecologist,
I am the very model of a Media Ecologist!


*BTW. As a published poet (see above), in 2005 I claimed the title of Media Ecology Association Poet Laureate. However, after reading Lance Strate's body of work, as published at his own MySpace blog "Lance Strate's Blogversed" (available here) I hereby abdicate in his favor!

Wednesday, March 25, 2009

The Battlestar Galactica Guide to Great Literature

As they contemplated Season Four, Battlestar Galactica's writers confronted the narrative mess of the previous three years and exclaimed "There must be some way out of here!"

Would-be screenwriters, novelists and playwrights can learn an important lesson from this past week’s Battlestar Galactica finale. For those not tuned into the BSG universe, the series finale revealed that Starbuck, the plucky fighter pilot who died and came back to life a few seasons back, was not quite human. You may think that BSG’s writers mixed up coffee brands in their minds, Starbuck’s Incorporated with Chock Full o’ Nuts (that heavenly coffee), when they reincarnated Starbuck not as an android or a clone or some other high SciFi concept, but rather as a true angel. In fact, Angel Starbuck allowed the writers to conveniently tie up a number of loose ends, contradictory story arcs and mythological red herrings that kept viewers coming back for more Human/Cylon action week after week and season after interrupted season.

In true Deus ex Machina fashion, Angel Starbuck leads the wandering BSG survivors to Earth, not the cinder Earth they previously visited, but our own true Earth of 150,000 years ago where the primitive native inhabitants sat around their campfires humming Bob Dylan tunes. The various BSG humans, Cylons and hybrids disembark, toss their advanced technology into the nearest convenient fusion recycler, scatter themselves to the Earth’s four corners and presumably become fruitful and multiply. Having completed her angelic mission, Angel Starbuck simply vanishes, leaving Lee Adama ("Apollo") to wonder at God's inscrutability.

Flash forward to our present-day world on the verge of creating its own Cylons thanks to Japanese robotics advances, and we witness two angels in America. They appear in the guise of Cylon Caprica 6 and Human Gaius Baltar strolling arm-in-arm through the streets of Manhattan, and go about wryly commenting on our civilization’s chance to get the cybernetics thing right this time.

So Battlestar Galactica turns out to have been about angels, not robots, divine intervention, not binary interpolation. A better title for the series might have been "Cylons In The Hands of An Angry God." This is where the other arts can learn a lesson from television in general and Battlestar Galactica in particular. No matter how dire the circumstances, how severe the situation, how irreconcilable the protagonists, there is no conceivable story line that can’t be resolved by supernatural agency.

A survey of the great literature of the world reveals that, with the exception of The Bible, The Koran, John Milton's Paradise Lost and possibly James Joyce's Finnegans Wake, no writer of note has hit upon this simple device to resolve the dramatic crises of their writings. In tale after tale, protagonists suffer the slings and arrows of outrageous fortune without the benefit of divine intervention.

Imagine a Shakespeare's Hamlet, Act V, where an Angel prince Hamlet exchanges the poison drinks and weapons for less lethal alternatives and convinces usurper Claudius to voluntarily abdicate his throne to a newly heroic Prince of Denmark.

Or an Arthur Miller's Death of a Salesman where a reincarnated Angel Ben Loman appears bearing a new, lucrative sales route to bestow on his father.

How about an update of Margaret Mitchell's Gone With The Wind where Angel Melanie reappears and leads the South to victory, saves baby Bonnie from her equestrian mishap and convinces Rhett and Scarlett that they were truly meant for each other.

And of course, there would be a Herman Melville's Moby Dick where another angelic Starbuck finally nails the great white whale for Captain Ahab with a propitious cast of his harpoon.

You can see the possibilities.

Post-modern critics may argue that dramatic art isn’t like that. In our poetry, our plays, our books and our movies, bad things happen to good people all the time and recently deceased revenants with heavenly bodies don’t always appear to make things right.

Aristotle taught us that art imitates nature. Isn't it about time that art imitate the supernatural?

Friday, March 20, 2009

Columbia Journalism Professor Says "F*** New Media!"

An homage to the guidance and sagacity of Columbia University Journalism Professor Ari Goldman.


According to New York Magazine, when addressing his "Reporting and Writing I" class, Columbia Journalism Professor Ari Goldman is reported to have said "Fuck new media!" and to have described online media training as "playing with toys." His print-centric approach to journalism joins a chorus of practicing newsgatherers contemplating the end of the newspaper business as we know it.

It might appear a bit self-serving or conflicted when bastions of the mainstream media publish article after article bemoaning the death of newspapers, or claiming that only their business model for the collection and dissemination of information will save the American republic. Thus there are Walter Isaacson over at Time Magazine, David Lazarus of the LA Times and David Carr of the New York Times (among many others) who insist that readers pay for their news or suffer an increase in corruption or the end of the Republic. According to these sources, if news dissemination moves to the Internet, it must adopt a new, lucrative business model that will generate revenue sufficient to support their extensive news operations.

At least L. Gordon Crovitz over at the Wall Street Journal is upfront about his perceived need to feather journalists' nests. Under the heading "Information Wants to Be Expensive" he writes:


“People are happy to pay for news and information however it's delivered, but only if it has real, differentiated value. Traders must have their Bloomberg or Thomson Reuters terminal. Lawyers wouldn't go to court without accessing the Lexis or West online service.”


I wouldn't say I'm happy to pay for my news, especially when the traditional newsgathering operations set much of the agenda of what is worth investigating and knowing about and what isn't.


What traditionalists contemplating the future of news on the internet don't mention is that the need to charge their readers is a result of the hyperlink structure of the World Wide Web itself where banner ads have not yet (and may never) replace the revenue generated by print advertising.


Under the current business model in newspapers, the amount of news that is "fit to print" is determined by the number of column inches of advertising sold. The money I pay for my personal copy of the paper largely goes to support the newsstand where I make my purchase.


Of course, setting up pay tiers for information automatically creates text-based information "haves" and broadcast media-based information "have-nots", not exactly what the Founding Fathers envisioned when they drew up the First Amendment. Those who can pay will get the full value of the Internet; the rest of us news seekers will have to make do with broadcast headlines.


There are alternatives already in production on the web. Blogs, Wikis, Facebook groups, Twitter cabals and many other information sharing operations are still in the process of becoming, but may have the potential to replace the key functions of mainstream media with free, open access to just the information each of us needs. As David Bollier notes in The Huffington Post, a myriad of below-the-radar activities on the Web are undermining corporate gatekeeping and control of news content:


“There are now countless online communities dedicated to generating their own content. It turns out that the joys of shopping pale in comparison to the pleasures of sharing and curating information with a community of peers.”


One can easily imagine a near future without newsprint:


Well, it’s been two years since the last printed newspaper shut down and I’ve finally settled into the newspaperless media ecology. My day started with a two-way tweet to President Obama concerning the latest stimulus package, protesting the inclusion of yet another bailout for NBC, CBS and ABC. The President agrees that network broadcasting is obsolete, but we can’t afford to let the three majors fail. Meanwhile, over at Fox, the “all reality programming all the time” former network, Bill O’Reilly was voted off “Debating with the Stars.”


I pulled out my handheld to review this hour’s digital news headlines, some of which I had contributed, when I noticed that our new puppy, Rush, had had another accident on the new carpet. “Bad boy!” I scolded him, tapping him lightly on the nose with my PDA. I completed my other chores, cycling out the old disposable laptop from the bottom of the budgie cage and lining the bottom of the garbage pail with old thumbdrive detritus. I wonder what we used before they came up with that solution?


As usual to start my work day, I exchanged text messages with my congressman, my senator and my friend in the Middle East who keeps my Facebook group up to date on the Palestine-Israeli détente. I noted that my YouTube video has achieved 100,000+ views and surveyed some of the response videos. I considered starting a new group, “Media Ecologists against the use of sepia tone videos” but put it off until later.


Later I set up a three way video conference with my SO who is away on business in Chicago and my daughter, who is on a mid-term break trip to Africa. We finalized plans for our family vacation this summer to one of the new National Tree Farm Parks that recently opened while the country gives the older national parks a few years fallow time to complete recovery from the ravages of the Bush years.


My daughter is researching and shooting a school report on the history of newspapers and had some questions:


  • Is it true that the first toy airplanes were made out of something called "paper"?
  • Did opinion columns and editorials once only go one way?
  • What is papier maché?

As printed newspapers go the way of buggy whips, antimacassars and Republicans, it is comforting to know that the traditions and the triumphs of the age of newspaper journalism are being preserved by the Newseum in Washington, D.C. (which bills itself as "the world's most interactive museum") and online. Someday I’ll take my daughter there to see it in person.

So, Professor Goldman, perhaps the better message to your students (and would-be future journalists) would be: "Make love to the new media, not war!"

_______________________

I don't often post an entire article written by someone else on my blog, but this overview of the new "Digital Republic" by David Bollier is so germane to the continuing news of newspaper decline that I think it's worth a read.

From The Huffington Post, March 19, 2009:


How the Commoners Built a Digital Republic of Their Own

The Bush Administration achieved a virtual lockdown of American political culture for eight years, bringing policy innovation to an utter standstill. So consider this improbable fact: one of the most significant achievements in open, participatory democracy in history burst forth during the Bush years.

Working in the parallel universe of the Internet, a loosely coordinated, global federation of digital tribes built a new kind of democratic culture. This culture is embodied in free and open source software, the blogosphere and hundreds of wikis on specialized topics. It can be seen in remix music and amateur videos, the flourishing social networking sites, and new types of "open business" models.

These innovations are not primarily creatures of government or the marketplace. They represent a new "commons sector" -- a realm of collective wealth generated by ordinary people through their own resourcefulness and sharing, largely outside of the money economy.

Although the tech world gets a lot of attention, few people appreciate how the new commons sector is achieving a slow-mo political revolution. As I put it in the subtitle of my new book Viral Spiral, the commoners have built a digital republic of their own. Using software code, free public licenses authorizing sharing and their own imaginations, the commoners have built an impressive civic, economic and cultural infrastructure that belongs to them. It is a world based on open access, decentralized creativity, collaborative intelligence and cheap and easy sharing.

The established order, meanwhile -- the world of centralized control, strict intellectual property rights and hierarchies of credentialed experts -- is under siege. Broadcast networks, daily newspapers, government agencies and politicians are still nominally "in control" -- but with each passing day, the new culture of the commons asserts its powers and out-maneuvers the old order.

The influence of this new sector -- law professor Lawrence Lessig has dubbed it "free culture" -- is large and growing. There are, for example, thousands of free software and open source software programs that power Web sites and blogs, information archives and social networking communities. Where would we be without GNU Linux (operating system), Mozilla (web-browsing), Thunderbird (email), bitTorrent (file-sharing) and BIND, Perl and Apache, which are central to many Internet functions? Linux alone -- a free program created by a vast commons of programmers -- is estimated to have spawned some $30 billion in economic activity.

More than 150 million Web objects now use Creative Commons licenses, an ingenious "hack" around copyright law that lets people allow the legal sharing, copying and distribution of their works. Online sharing and collaboration have become so popular that companies now base their business models around them.

Yet the real story is the power of the commons itself. There are now countless online communities dedicated to generating their own content. It turns out that the joys of shopping pale in comparison to the pleasures of sharing and curating information with a community of peers.

For every name-brand commons like Craigslist, Flickr and Wikipedia, there are thousands of impressive niche sites like Flu Wiki (decentralized tabulation of flu outbreaks), Wikitravel (user-generated travel guides) and Jamendo (music sharing). Sometimes these commons actually serve as "staging areas" for commercial startups. The Internet Movie Database, now the leading database of film facts and credits, was started by two film buffs. Gracenote, the database that looks up information about audio CDs, was started by a community of music fans. This is a new macroeconomic reality -- the commons as an incubator for market innovation.

To date, the commons sector has largely eluded mainstream attention because it is so fragmented and decentralized. It doesn't necessarily make money and it is run by self-organized amateurs. Neither government nor corporations are "in charge" of this eclectic, unorganized realm. It's supposedly a world of bloggers in their pajamas and teenagers exchanging silly videos via YouTube. How can we take it seriously?

Not surprisingly, powerful people from President Obama to corporate executives to newspaper publishers use the commons sector as a convenient foil. They try to dismiss it as a way to show that they remain in control -- and that the insurgent digital republic can be safely ignored.

The commoners know better.

After centuries of being victimized by market forces, the commoners now have powerful tools to protect and advance their interests. They no longer have to put up with the privatization and commodification of their shared inheritance and collective work -- a process known as "enclosure."

The commoners now have their own software infrastructures and open platforms. They have their own legal licenses to prevent anyone from "taking private" their content. One need only recall how Disney appropriated fairy tales and literary classics to build its corporate empire. Or how commercial broadcasters have used the public's airwaves for decades, for free. Or how Big Pharma pays a pittance (if anything) for exclusive rights to federal drug research -- which is then sold back to us as expensive proprietary drugs.

But in the online world, the commoners are asserting their control. Think how the mainstream media are often two steps behind the blogosphere, and how GNU Linux has taken huge market share away from Microsoft. Consider how YouTube is stealing audiences from the broadcast networks....and how the music industry has now eliminated "digital rights management" encryption from most recorded music.

Remember how Barack Obama's candidacy was borne aloft by the commoners acting on their own -- and think how Obama and Congress now face a mobilized public that is more actively engaged in our national political life than ever.

While centralized media continue their sad decline, remix artists and indie musicians and filmmakers are producing some of the most daring new works around. Newcomers with style and vision are using the Web to reach audiences cheaply and directly, without having to get the approval of stodgy, risk-averse Hollywood gatekeepers.

In education and science, there are strong movements underway to reclaim control over knowledge. In the face of soaring subscription rates for academic journals, academics have created more than 3,900 "open access" journals that are free to everyone, in perpetuity. M.I.T. and dozens of other universities have put their curricula up on the Web for free, spurring a new "open courseware" movement.

Students frustrated by exorbitant textbook prices are starting to develop "open textbook" projects, in the style of a wiki, so that they can pay $25 for a print-on-demand textbook with the latest scholarship, rather than $125 for a standard commercial textbook that may be outdated.

An open culture, a sharing economy and a digital republic: the foundations for this new world actually matured during the nightmarish Bush years, beneath its contemptuous gaze. Now that such radical ideas as participation, transparency and accountability have a stable home on the Internet (provided Net Neutrality can be assured), the challenge will be to safeguard this world -- and build it out even further.

R. Buckminster Fuller once said, "You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete." That's exactly what the commons sector is doing. For all the thrashing about that will surely occur in coming years, somehow I think I know who will prevail.

David Bollier is an editor of OntheCommons.org and the author of Viral Spiral: How the Commoners Built a Digital Republic of Their Own (New Press). For several short video interviews with Bollier on the "viral spiral," visit here, here and here.

Friday, March 6, 2009

The Decline and Fall of the Times Roman Empire

What we know about the death of newspapers
-or-
Do 400,000 Twitters = 1 New York Times?


If you still read newspapers and magazines, or watch network television for that matter, you are probably aware that times are tough for the mainstream media. Latest casualty: The Rocky Mountain News, which folded after 150 years in the press. Newsrooms across the country are laying off staff and cutting costs. Even the venerable New York Times has been forced to sell and lease back its headquarters to stay afloat. Of course, the journalistic consensus is that the fault lies not in themselves but in their competition. In a recent issue of Time Magazine Walter Isaacson blames the Internet for print journalism’s decline:

“The problem is that fewer of these consumers are paying. Instead, news organizations are merrily giving away their news. According to a Pew Research Center study, a tipping point occurred last year: more people in the U.S. got their news online for free than paid for it by buying newspapers and magazines.”

His solution? Micropayment charges that would allow newspapers to collect revenue from web browsers:

“Under a micropayment system, a newspaper might decide to charge a nickel for an article or a dime for that day's full edition or $2 for a month's worth of Web access. Some surfers would balk, but I suspect most would merrily click through if it were cheap and easy enough.”
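Isaacson's tiers amount to a simple break-even calculation. Here is a toy sketch using the prices from his example; the function, its name, and the reader habits are my own hypothetical illustration, not anything he proposes:

```python
# Toy sketch of Isaacson's micropayment tiers. Prices come from his
# example; the break-even logic is hypothetical illustration only.

ARTICLE_PRICE = 0.05   # a nickel per article
EDITION_PRICE = 0.10   # a dime for a day's full edition
MONTHLY_PRICE = 2.00   # $2 for a month's Web access

def cheapest_tier(articles_per_day: float, days_per_month: int = 30) -> str:
    """Return the cheapest tier for a given reading habit."""
    costs = {
        "per-article": ARTICLE_PRICE * articles_per_day * days_per_month,
        "daily-edition": EDITION_PRICE * days_per_month,
        "monthly-pass": MONTHLY_PRICE,
    }
    return min(costs, key=costs.get)

print(cheapest_tier(1))   # per-article: $1.50 vs $3.00 vs $2.00
print(cheapest_tier(2))   # monthly-pass: $3.00 vs $3.00 vs $2.00
```

The arithmetic makes Isaacson's point for him: at these prices, anyone reading more than a handful of articles a day lands on the cheap monthly pass, which is just a subscription by another name.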

Over at the LA Times, David Lazarus rejects the iTunes model of revenue generation, an “iNews” as it were, in favor of a subscription approach that would fund expensive news gathering:

“But unless we want digital newsrooms staffed by skeleton crews of a dozen or so reporters and editors, we have to accept that it costs money to cover news events, perform investigations and tell yarns.”

Bad times affect not only print but also broadcast television. Under the headline “Broadcast TV Faces Struggle to Stay Viable,” Tim Arango at the New York Times quotes Jeff Zucker of NBC Universal:

“…broadcast television is in a time of tremendous transition, and if we don’t attempt to change the model now, we could be in danger of becoming the automobile industry or the newspaper industry.”

Ouch! As newspapers go the way of the buggy whip, it is appropriate to examine where the defenders of the press have got it wrong, and where they are right. I have included a mention of broadcast television because the news organizations of broadcast media have often adopted the poses and nomenclature of print journalism even though their now digital-based product is quite a different animal.

Defenders of the press as it stands mistake the physical medium of print for the function of the Press in a democratic society as envisioned by our nation’s founders. Indeed, the rationale behind including "freedom of the press" in the First Amendment was detailed by Alexander Hamilton in Federalist #84. Answering the objection that a large central government would be too far away to be effectively monitored and controlled, he wrote:

“The public papers will be expeditious messengers of intelligence to the most remote inhabitants of the Union.”

This is necessary because

“Of personal observation they can have no benefit. This is confined to the citizens on the spot. They must therefore depend on the information of intelligent men, in whom they confide; and how must these men obtain their information? Evidently from the complexion of public measures, from the public prints, from correspondences with their representatives, and with other persons who reside at the place of their deliberations.”

The remedy for the dangers of a remote, central government is a system of communication which, of necessity in that colonial era, relied on the printing press to carry word to citizens at a distance from the seats of government. Living at the height of the print era, Hamilton would naturally rely on the printing press as the medium of choice to preserve the transparency of government he deemed essential to a democracy. If the internet had existed in his time, he might have deferred to any number of Internet blogs rather than to the printing press.

Paul Starr’s epitaph “Goodbye to the Age of Newspapers (Hello to the New Era of Corruption)” in the March 4 issue of The New Republic is one of dozens of recent laments that mistake the medium for the message. Starr assumes that newspapers have everywhere and always lived up to Hamilton’s ideals, or that only through the medium of ink pressed on newsprint can the "Truth" be revealed and corruption curtailed. He notes that:

“Although the rise of broadcast journalism changed the newspaper business, radio and television did not kill it because newspapers retained their local advantages in providing information to readers and connecting advertisers and consumers in a city.”

I was at CBS News in the 1980s when the decision was made to convert the entire news operation from a cost center to a profit center. Salaries of top news stars were increased. News support operations deemed not essential to the primary goal of maximizing ratings were abandoned. For example, CBS News used to employ a staff of full time research librarians and a facility in-house for news staffers to use in the development of their stories. This was among the first things to go, with a resulting decline in the quality and quantity of fact checking for news productions.

This decision to extract the monetary value of CBS’s crown jewel was not based on ideological or editorial criteria, but on a purely financial one. That such a criterion would ultimately lead to the tarnishing of the jewel never seemed to have occurred to them. The resulting, inevitable degradation of the broadcast news product has also tainted print journalism as newspapers struggled to maintain relevance in the face of sound bite news delivery.

The new information environment of broadcasting required a subtle (or perhaps not so subtle) change in journalistic practices and created a gap between what the public wanted to know and what the public needed to know. This gap, being environmental, was largely invisible until the advent of the Internet. The “amateurs” of this new media environment have brought this gap to the foreground, focusing our attention on unquestioned compromises of mainstream media news gathering and reporting that have little to do with real journalism.

Newspapers’ reliance on advertising and classified revenues has always left them vulnerable in economic downturns. This vulnerability has become critical in the face of a simultaneous assault on eyes and minds by a competing medium, the Internet. Had print journalism really fulfilled Hamilton’s vision of the Fourth Estate, large-scale newspapers might still be viable. If collectively newspapers were still the source the distant public could turn to for information important to their lives and well-being, we might not be witnessing newsprint’s end. The problem is that often they did just the opposite. I won’t go into the shortcomings of the obviously biased papers like Rupert Murdoch’s New York Post, the Washington Times or the Chicago Tribune. Even the so-called liberal papers of record like the Washington Post and the New York Times have fallen short of the mark more often than not.

For too long most of the press has gone along with the Washington establishment to get along. Publishers and editors alike mistook the physical ownership of the printing press for the spiritual ownership of Hamilton’s function of the Press.

On both economic and political fronts, the mainstream media often have failed to keep the public informed. Where was the press during the Madoff Ponzi scandal? More than twenty years in the making, with numerous warnings from whistleblowers like Harry Markopolos, yet no financial reporting organization picked up the lead. For that matter, where were the warnings of the current Great Recession? Not only did the mainstream media fail to call the Bush Administration to account during the lead-up to the Iraq War, most of them actively enabled that catastrophic misdirection, including the New York Times, whose own Judith Miller helped cheerlead the war.

But you don’t have to look only to the most recent events to see the shortcomings of the press. During much of the Vietnam Era, if you wanted the straight facts on the War, you had to seek out a tiny independent weekly newsletter published by I.F. Stone.

Having consolidated their smaller competitors out of existence, the declining newspapers can’t use the same trick that they used in the face of broadcast journalism, that is exploiting “local advantages in providing information to readers and connecting advertisers and consumers in a city.” This opportunity has been sucked away by the Internet.

In other times of media change, old media found new, albeit smaller, niches in which they thrived. When video killed the radio star, radio said “I shrink, therefore FM.” In a similar manner, newspapers must reinvent themselves to survive. By this I don’t mean finding new business models or sources of revenue to continue doing the same old thing. To retain the mantle of the “Fourth Estate,” the old guard media must rediscover what reporting is really about. Maybe the example of I.F. Stone’s Weekly from forty years ago can serve as a model. Stone suggested that if you can’t compete with the media, go small, go independent:

"Reporters tend to be absorbed by the bureaucracies they cover; they take on the habits, attitudes, and even accents of the military or the diplomatic corps. Should a reporter resist the pressure, there are many ways to get rid of him... But a reporter covering the whole capital on his own — particularly if he is his own employer — is immune from these pressures."

Monday, March 2, 2009

Stopping by the Woulds on a Snowy Day

Being snowed in would be much better if the weather weren't so terrible and we could do something with the free time. After all, I spent Sunday soaking up the requisite 15 hours of television to make up for viewing time lost during the week to office hours. And, to be honest, I have not yet reached the stage where weekday television programming (Rachael Ray? Seriously?) is appealing. No bowl games, no playoffs, no golf tournaments in sunny climes, no marathon airings of Lord of the Rings or Star Wars, no Bear Grylls exposing his digestive system to something abhorrent, not even a commercial showing Lewis Black in Aruba.

Sure there are carpets to be vacuumed, floors to be washed, other household chores to be deferred, but is that really the best use of this bonus time? I could write another inspiring blog about media ecology, but I haven't been able to come up with a good tetrad about snow. Is a snowstorm an extension of our senses anyway?

So I sit watching the snow on my neighbor's air conditioner pile up and check the Accutrak Radar every so often. Oh wait, they're plowing the rooftop parking garage across the way. See you later.

Tuesday, February 17, 2009

I'm #26! Yay!

Ranking blogs is so "old media".

There was a time when Time Magazine was a truly national magazine without an obvious political agenda and, as such, was a valuable resource for newshounds. I think that time was a year or two in the mid-1970s.

Alas, those days are gone. What better evidence for this than their current list of the top 25 blog sites for 2009? I was discouraged and outraged when I discovered that my own blog, A Model Media Ecologist, is not considered by Time Magazine to be one of the top 25 blogs (in the country? the world? the blogosphere?). Time seems to be completely unaware that A Model Media Ecologist won the Blogger of the Year Award (see here).

Here's a Media Ecology question. Why aren't magazines in trouble like newspapers? Specifically, why isn't Time in jeopardy?

I realize that such sour grapes are unbecoming a model media ecologist. Time Magazine has as much right to exist as the next mainstream media publication, biases and all. Of course, you may wonder why any of the mainstream media should be judging and ranking any of the new media in the first place. Isn't there a conflict of interest there?

I wrote an imaginary letter of complaint to Time and received this imaginary reply:

Dear (your blog name here):

We appreciate your concern about the many worthy blog sites that did not make it into our pantheon of the Top 25. Perhaps it will be of some solace to you to be told (and this is completely off the record) that your own blog missed it by only that much, that is, (your blog name here) came in at 26!

Congratulations, and better luck next year!

Sincerely,

Time Magazine

It is some solace, but not much. I might feel better if my imaginary Time Magazine correspondent had taken the trouble to actually type the name of my blog into his boilerplate.

Oh well, it's on to the Peabody Awards!

Monday, February 2, 2009

Un-Buttoning Cinema

Film criticism prospers when time flies like a boomerang.

In the current issue of The New Yorker, David Denby bemoans the decline of the Academy Awards selection process, focusing on the paltry quality of this year’s Oscar picks compared to previous years. In toting up the golden votes present vs. past, he notes:

“The total of thirteen nominations for ‘Benjamin Button’ has to be some sort of scandal. ‘Citizen Kane’ received nine nominations, ‘The Godfather: Part II’ eleven, and this movie, so smooth and mellow that it seems to have been dipped in bourbon aging since the Civil War, is nowhere close to those two.”

If we accept Denby’s premise that the Academy of Motion Picture Arts and Sciences Awards were ever about quality rather than commerce, and concentrate on the seeming vacuity of Benjamin Button, it becomes clear that Denby misses the true message of the film: it is both a savage rebuttal of traditional linear modes of film viewing and a call to arms for all of us to revisit and re-evaluate our cinema favorites by viewing them in reverse.

I have obtained a bootleg DVD of Benjamin Button and have discovered that the film plays much better backward than forward. By means of the reverse button on my remote control, I can watch the marvel of the film’s protagonist growing old while everyone and everything around him rejuvenates. For example, Cate Blanchett transforms from a crotchety middle-aged woman with a truly Medean mother complex into a vibrant young woman with an alarming Oedipal complex.

In forward time Benjamin Button takes us through a veritable IMDb of cinematic classics as the protagonist de-matures through a Grapes of Wrath New Orleans, an End of the Affair Moscow, a Guns of Navarone war era, a Harold and Maude ’50s romance, a decidedly non-Brando-esque Wild One road trip, until finally settling into a perverse version of Look Who’s Talking. In reverse, Benjamin Button’s United States of America progresses from its present Bush-ian chaos back to a golden age when robber barons, racial apartheid and the absence of women’s or workers’ rights characterized the century.

Benjamin Button is beyond post-modern and therefore deserves something beyond post-modern criticism. Looking backwards, as it were, may be the new technique for looking forwards. For example, run Citizen Kane in reverse, and the Wellesian morality tale takes on a new patina. Rather than summing up or explaining a man's entire life experience by means of a lost childhood toy, getting that "Rosebud" discovery out of the way at the beginning of the film enables us to see that Kane is really about growing young gracefully. In the end (former beginning) of the film, young Kane is joyfully reunited with his beloved sled and returns to a life where happiness is determined by the ups and downs of snow accumulation.

The Godfather: Part II in reverse chronicles the ascent of Michael Corleone from the depths of mafia moral corruption of America circa 1955 to the heights of the mafia moral corruption left over from The Godfather: Part I, setting us up for his ultimate regeneration in the backwards viewing of that previous epic.

This new approach to film criticism, which might be called pre-post-modernism, works for film after film, whether it’s Tom Joad and his family triumphantly returning to their Oklahoma farm, in effect putting the wine back into the grapes, or Moses closing the Red Sea as the Israelites reconsider the relative merits of being the “chosen people.”

I’ve run dozens of films through this reverse critical process and the result has been a deeper understanding of the human condition and the art of filmmaking. The sole exception so far, for some reason, is Memento, which makes no sense no matter which way it’s viewed.

Thursday, January 22, 2009

What We Know About Battlestar Galactica

Any sufficiently advanced technology is indistinguishable from magic

The final season of Battlestar Galactica has begun and the teaser commercials have posed the question: Who is the 5th secret Cylon? While this will be the focus of the final ten BSG episodes, there are a number of other questions the series has presented that may not be resolved by the final curtain.

1. Why do Cylons’ spines glow red when they make love?
It would seem that such an obvious sexual tell would be counterproductive for a cadre of seductive simulacrums. In all the years of sexual subversion, did no human ever wonder why their incredibly attractive partners insisted on the missionary position?

We do know that Cylons like sex as much as the next automaton and that they are genetically compatible with humans. They claim to experience “Love” and they purport, at times, to have free will. One can only conclude that the glowing red spine was a feature meant to be included only in Christmas Cylons, but someone slipped up in production.

2. How did Cylons develop monotheism?

BSG humans are portrayed as generally secular and polytheistic. Neither Greek nor Hebrew, but rather both and more, human BSG characters sport names or appellations like "Adama," "Apollo," "D'Anna" and for the coffee worshippers amongst us "Starbuck." Their twelve colonial worlds correspond to the twelve signs of the zodiac. They say things like "Thank the Gods" and "Gods help us."

The robotic Cylons are monotheistic, fanatical and proselytizing. Despite their claim that their one god is “love,” or perhaps because of it, they bring about the destruction of the twelve human colonies, killing billions of people and then zealously pursue the few survivors. There is one chilling scene from the first season where the Cylon attack is imminent and Number 6 bends over a carriage to kill an infant. It is unclear whether this is an act of mercy or a preemptive strike.

The odd thing is that the Cylons, being robots, have already achieved eternal life. They literally cannot be killed. Or rather, we see them continually dying and then being reborn. Their reincarnation factory vessels are even called “resurrection ships.” A reborn Cylon is not a type or a clone. It is a recreation of the dead individual Cylon, downloaded from the original with memories and emotions intact. In other words, one of the core motivators of many of Earth’s religions is already an integral part of Cylon existence. The only exception to the rule is if a Cylon dies out of range of a resurrection complex. Then they truly die.

If, in spite of being formed in the image of their creators, Cylons reject polytheism, how did they stumble across monotheism?

There is a school of cultural evolutionary thought suggesting that a precondition for the adoption of monotheism is a phonetic alphabet and some degree of literacy. In a 1977 issue of ETC: A Review of General Semantics, in an article titled "Alphabet, Mother of Invention," Marshall McLuhan (yes, that Marshall McLuhan) and Robert K. Logan speculate on the possible origin of monotheism:


"Western thought patterns are highly abstract, compared with Eastern. There developed in the West, and only in the West, a group of innovations that constitute the basis of Western thought. These include (in addition to the alphabet) codified law, monotheism, abstract science, formal logic, and individualism. All of these innovations, including the alphabet, arose within the very narrow geographic zone between the Tigris-Euphrates river system and the Aegean Sea, and within the very narrow time frame between 2000 B.C. and 500 B.C. We do not consider this to be an accident. While not suggesting a direct causal connection between the alphabet and the other innovations, we would claim, however, that the phonetic alphabet played a particularly dynamic role within this constellation of events and provided the ground or framework for the mutual development of these innovations." (Emphasis added)

While the final verdict on this Media Ecological interpretation of religious thought is still out, there surely is some confusion over how the artificial intelligence products of the polytheistic human culture of BSG could arrive at the notion of one God. Religious robots, while intriguing, remain a problem, especially self-ordained monotheistic robots.

Computer processing, as we understand it, requires at least binary notation, which would imply a minimum of two gods. I believe that the depiction of Cylons as monotheistic in the absence of human mortality or alphabetic literacy can only be seen as a true leap of faith on the part of BSG's creators.

3. Why didn’t the Cylons make their “skin jobs” better than they are?
Humanoid Cylons are stronger, arguably smarter and definitely sexier than their human counterparts. However, given the range of possibilities presented by human/android genetics, one wonders why the Cylons didn’t do more.

How about x-ray vision or invulnerability? Is spider-like precognition out of the question? At the very least, all Cylons could have been equipped with metallic claws that pop out of their knuckles on command.

When you compare humanity’s current evolutionary state to our closest monkey’s uncle, it is clear we are far superior. Our brains are so large we only need to use 10% and often use much less. Every year some Olympian or Marathoner runs faster, jumps higher, or swims more synchronously. To your average orangutan, we must seem like the types of Super Hominid into which they’d all like to evolve. For those of us already at the summit of Earth’s evolutionary trail, where is there to go but up, as in “Up, Up and Away!”?

It may be that Cylons, while clearly superior to humans in every conceivable way, lack the ability to imagine the next great steps in humanoid evolution and the amount of spandex required.

4. Why do the Cylons want to breed half-human/half-Cylon children? Why have children at all?
Any parent who has been on the receiving end while changing a diaper, who has been involved in any school science project or who has attempted to sound moderately coherent while explaining the facts of life to a pre-pubescent human child would wonder why Cylons wouldn’t design their offspring to skip right to adulthood. Would-be Cylon parents will soon discover that it is not possible to annihilate the remainder of the human race while coordinating a schedule of after-school activities.

5. Finally, who is the fifth Cylon? What’s the deal with Starbuck? And what about Earth-that-was?

I personally believe that Starbuck is Amelia Earhart and Colonel Tigh is Jimmy Hoffa. The fifth Cylon is not Ellen Tigh; it's Howard Hughes. Or maybe Walt Disney's head. Wait. Didn't he invent animatronics?

That's it! The Cylons are Disney World Character refugees, who fled Earth when Lawrence Lessig finally got Congress to approve term limits on corporate copyrights! First they evolved from singing bears and cavorting pirates into “toaster-heads.” Now they have come full circle, back to humanoid approximations of perfection, but they have not been able to completely eliminate the desire to slavishly cater to the pre-adult offspring of their creators.

If this “Magic Kingdom Galactica” hypothesis is true, we should be on the lookout for an upcoming Battlestar Galactica episode that would be a dead giveaway: “Cylons On Ice”.

Corporate Tax Liabilities

Has anyone ever done a cost/benefit analysis of corporate taxes? I'm sure there are accounting tools which would let us estimate how much Federal, State and Local value a particular corporation realizes each year versus how much they pay in taxes. This would include a pro rata share of usage of public utilities, public services and public spaces. It would give us a yardstick to determine who is paying their fair share and how much their taxes should be increased.

Does anyone know of anything like this?
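I haven't found an off-the-shelf tool, but the yardstick itself is simple arithmetic. Here is a minimal sketch of the comparison described above; all of the spending categories, the company name, and every dollar figure are hypothetical placeholders, not real data.

```python
# Sketch of a corporate tax cost/benefit yardstick: compare taxes paid
# against a pro rata share of the public value a corporation realizes.
# Categories and figures below are invented for illustration only.

def public_value_realized(shares: dict) -> float:
    """Sum a corporation's estimated pro rata shares of public spending
    (utilities, services, spaces) that it draws on in a year."""
    return sum(shares.values())

def tax_balance(taxes_paid: float, shares: dict) -> float:
    """Positive result: the corporation pays more than the public value
    it realizes. Negative result: it realizes more value than it pays."""
    return taxes_paid - public_value_realized(shares)

# Hypothetical example for a fictional "Acme Corp."
acme_shares = {
    "roads_and_utilities": 1_200_000.0,
    "public_safety": 300_000.0,
    "courts_and_records": 150_000.0,
}
print(tax_balance(taxes_paid=1_000_000.0, shares=acme_shares))  # -650000.0
```

The hard part, of course, is not the subtraction but estimating the pro rata shares in the first place, which is exactly the accounting question posed above.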

Wednesday, January 14, 2009

DWUI-CP: Driving While Under the Influence of a Cell Phone

I was on Thom Hartmann's radio program yesterday, discussing the use of cell phones while driving. Thom's position was that regulating cell phone usage in cars was an imposition and probably unenforceable anyway. I suggested that, like drunk driving, driving while cell-phoning impairs responses and may result in hazardous driving. Here is an edited clip of the transcript:

Hartmann: The Safety Council... now they want to tell you that when you are driving your car you cannot use a hands-free cell phone. You know the little Bluetooth things that most cars have where you've got the phone in your pocket and your car answers the phone and you're just talking at the windshield wipers or the dashboard. You don't even have to take your phone out of your pocket. Hands-free phones. And they're saying we should ban this. It turns out 2,600 deaths, 12,000 serious injuries per year, this is 6% of vehicle crashes, are attributed to people using phones. Now that's not hands-free phones, that's just phones. But included in that is some small percentage of people using hands-free phones, and they say "Let's make that illegal."

I say enough of the conservatives telling us what we can do or not do anywhere on earth. Or maybe we should just ban talking in the car, because their argument is that... it's not just what you're doing with your hands, it's that your head is in the conversation. OK, so let's ban talking to people in the car. In fact, let's even take it a step beyond that. Let's make it required that people have duct tape in their car, and that when they get in and sit down, not only do you have to fasten your seat belt but you also have to tear off a three-inch strip of duct tape and put it over your mouth. Because how are you going to know? I mean, if a police officer's driving by and they see somebody talking, they see their mouth moving, how do they know that that person is not talking to the person next to them or is not talking on a hands-free phone?
....
Blechman:
There are researchers who have shown that using a low-fidelity instrument like a cell phone, as compared to a high-fidelity radio or even talking to a person in the car, takes more brain processing power. It's what McLuhan called a cool medium, and you actually develop a form of tunnel vision while you're doing that.

Hartmann:
Right, but if it was Bluetooth and it was running through the speakers in your car that would not be the case, would it?

Blechman:
It's still low fidelity. The signal is coming from the cell carrier, not from the FM radio station.

Hartmann:
Oh, so it's the bandwidth. Now that's interesting. Now let me extend the logic on this. You're saying that because the bandwidth is narrow, because there's not such a broad spectrum of frequency, the brain has to focus on it more intently. You know AM radio has a narrower bandwidth than FM radio. Should we ban AM radio?

Blechman:
Well, I guess you have to find the degree of involvement.

Hartmann:
There's some critical threshold there. That's interesting, Bob! I tell you, I've got the smartest listeners in the world! I thought I was bringing some science to the table here and Bob trumps me! Well done Bob!


It's not every day that I can trump Thom Hartmann! And while I did get the McLuhan reference in, I'm sorry I wasn't able to work in Media Ecology or my blog site as well.

I think the fact that using the cell phone can be demonstrated to affect perception is an excellent example of McLuhan's "Cool Medium" terminology. Like connect-the-dot drawings as compared to photographs, cartoons as compared to paintings or traditional television as compared to cinema, a cool medium requires that we "fill in the blanks."

We experience a voice we hear over the phone as the same as normal conversation because our brains are filling in the gaps. It's only when we hear a recording of a phone conversation that we become aware of the low fidelity. That the process of talking on a cell phone can result in tunnel vision is truly interesting.

So thank you Thom for having me on and for recognizing how smart McLuhan was!

Sunday, January 11, 2009

Revisited: What We Know About the Joker

In honor of Heath Ledger winning the Golden Globe award for best supporting actor in The Dark Knight, I am reposting my piece on The Joker from this past July:


Ruminations about the man behind the masque.


Though this past weekend’s top-performing movie is titled The Dark Knight, it might easily have been called "The Clown Prince." Heath Ledger’s portrayal of the Joker, already hailed as Oscar-worthy, owes more to Michael Keaton’s Beetlejuice than it does to previous Batman malefactors. Ironically, Keaton was the first modern film Batman and could have played off against himself as both the Caped Crusader and the Prince of Fools.


Like Keaton's Beetlejuice, The Joker in this latest Batman-iteration is the ultimate trickster: a destroyer of worlds and a slayer of men, whose word cannot be trusted and whose motives cannot be divined.


The Joker’s wild success throughout The Dark Knight's dark nights depends on a script that constitutes a stacked deck in his favor. For most of the two hours of this latest Batman saga, everything goes the Joker’s way. He knows where mob kingpins will be meeting and gains access with impunity. He easily defeats the defense mechanisms of a high-security bank. He cannily manipulates good guys and bad guys alike, seeking both a higher class of criminal and a lower class of law enforcer. He survives high-speed truck flips, Kevlar-armored right crosses and high-rise bungee jumps.


Though he is painted up to be an enigma wrapped in a riddle (or was that someone else?), based on evidence from The Dark Knight, we do know the following things about The Joker:
  • He is a munitions expert. He is equally at home with C4 suppositories and oil barrel chemical peels.

  • Though he is an expert project manager, at least in the bank-robbery field, he is prone to waste his resources.

  • He is empathic. He knows just what to say to push anyone over the edge of madness, and then leap in after him.

  • He is a man ahead of his Timex. The Joker can take a lickin’ from Batman and keep on tickin’. He may once have belonged to a fight club.

  • He obviously was involved in covert ops in the past. He knows how to anticipate scenarios and plan alternatives.

  • He knows where to acquire esoteric weapons and how to use them.

  • He moonlights as a Mary Kay agent.

  • He has had access to Jack Benny’s joke vault.
Omigod! The Joker is Jason Bourne!