Wordyard

Hand-forged posts since 2002


Gore for president? He should aim higher

May 31, 2006 by Scott Rosenberg

The most interesting aspect of hearing Al Gore talk tonight here at the D Conference is that I went into the hour-and-a-half session hoping that Gore would run in 2008, and by the end I was hoping he wouldn’t.

Oh, he’s definitely in good form — impassioned and funny. Kara Swisher kicked off by asking him “Are you not not running?” and he parried, “That completely dismantles my defenses. I guess I have to resort to full candor now.”

He talked, of course, about global warming. He also talked at length about Current.tv, the cable network he started that focuses on videos submitted by the public. He delivered a mini-lecture about “information ecology and the structure of the marketplace of ideas” from the medieval monastery through Gutenberg and on to Tom Paine and the Founding Fathers, and argued that the broadcast TV era was an aberration, a throwback to a one-way media universe in which “the individual could not join the conversation,” and then pointed to the Internet as the next turn of the wheel, back towards the individual.

Of course it would be a refreshing, even astonishing thing to elect a president who actually understood all this and was capable of explaining it to people.

But as Gore talked more and began answering questions from the crowd it became clear that his analysis of today’s political mediascape is even deeper and angrier. Someone asked him why we couldn’t just kill the canard that “there’s still scientific debate about global warming” by getting the science faculties at 100 universities to sign a letter expressing their consensus. With weary determination, Gore explained that there have been lots of letters, including one signed by dozens of Nobel Prize winners, but few in the room would have heard of them, because they didn’t get covered. They didn’t matter — because truth (or what we might call consensus reality) in the Bush era has ceased to be a product of rational discourse and instead come under the sway of political propaganda.

Gore went on: On the eve of the Iraq war, something like 70 percent of American voters believed that Saddam Hussein was responsible for the 9/11 attacks. And when Sen. Byrd delivered his jeremiad on the Senate floor at that time, few of his colleagues were even in the chamber. Why? Because, Gore declared, no one pays attention any more to what’s said on the floor of the Senate — except for each senator’s political opponents, who might find some quotation to use against the incumbent. Meanwhile, the senators were out at cocktail parties raising checks to build war chests so they could purchase TV commercials during the next election cycle. Our reality is then shaped not by the deliberations of our elected officials, but by these TV barrages — “short emotional messages that are repeated over and over again by those who have enough money to purchase the time.”

I found Gore’s acid-sharp anatomy of this devastation of the political landscape even more terrifying than his now-familiar arguments about the environment. Because it’s this legislative paralysis and political bankruptcy that has left us utterly unable to respond to the warming crisis. How can we make smart choices when reality itself is a target of political subversion? What’s the point in repeating that there is overwhelming scientific consensus about global warming when we remain stuck with a media that’s still willing to publish nonsense like today’s Holman Jenkins column in the Wall Street Journal?

Jenkins says “it wouldn’t be too surprising if tomorrow’s consensus were that CO2 is cooling, or neutral, or warming here and cooling there.” That, Gore said, is like saying, “Gravity may repel us from the earth’s surface; it may repel us in some places and hold us down in other places. It’s an open question.”

Gore argues that the challenge of responding to global warming is this generation’s version of the World War II generation’s challenge of defeating fascism — and that we can, as they did, earn moral authority and find our strength by meeting it. “What I have on my side here is reality,” he said. In our denial of the evidence on warming, “we have been living in a bubble of unreality.”

Gore’s fierce dedication to his quest, which he rightly defines as a moral and spiritual issue rather than a political one, left me thinking that a run for president on his part would be a waste. Gore should take his anger and his understanding and dedicate it not just to the specific, overwhelmingly important environmental cause he has chosen to champion, but also to changing the very structure of our media landscape so that it can support a “reality-based community” once more. He’ll need to do the latter, anyway, if he is to get anywhere with the former.

Filed Under: Events, Media, Politics, Technology

If everyone has the same privilege, is it still a privilege?

May 27, 2006 by Scott Rosenberg

More on the Apple v. Does decision:
Denise Howell dissects the decision. Dave Winer takes Apple to task: “It’s unwise and hypocritical of Apple Computer, to profit from the expansion of the online community — the latest Mac comes with promotional material touting its ability to write blogs and create podcasts — and at the same time trying to control it to suit its corporate purposes.”

This court has now declared that anyone “doing journalism” on the Net is entitled to the protections the law provides journalists. That’s a great decision. But don’t expect the old-school journalism establishment to cheer in unison (despite the participation of some of its members as amici curiae on behalf of the online journalists). The next phase of this discussion will inevitably include the sound of hand-wringing: Where do we draw the line? If anyone publishing on the Net — and that means almost everyone these days — can be protected by a shield law, won’t the shield laws erode?

Extending a basic privilege — the right to ask questions and publish answers — to the broad public doesn’t come without cost to someone. In this case, a lot of traditional journalists are going to fret about the erosion of their own existing privileges. Don’t be surprised if there are more absurd proposals for things like “journalism certifications” and Official Journalist Membership Cards.

Filed Under: Blogging, Media

California court: shield law applies to anyone who gathers and disseminates news

May 26, 2006 by Scott Rosenberg

The decision in the Apple v. Does case, in which I am proud to have participated in a tiny way (as signatory to an amicus brief), just came down, and it is a win for the wider universe of bloggers and other Internet-based writers and self-publishers.

See Lauren Gelman’s report. Here’s the ruling (PDF). Here’s a release from EFF. More after I’ve had a chance to read in full.

This appears to be one key passage:

  We decline the implicit invitation to embroil ourselves in questions of what constitutes “legitimate journalis[m].” The shield law is intended to protect the gathering and dissemination of news, and that is what petitioners did here. We can think of no workable test or principle that would distinguish “legitimate” from “illegitimate” news. Any attempt by courts to draw such a distinction would imperil a fundamental purpose of the First Amendment, which is to identify the best, most important, and most valuable ideas not by any sociological or economic formula, rule of law, or process of government, but through the rough and tumble competition of the memetic marketplace.

Any judge who uses the phrase “memetic marketplace” seems to have immersed himself fully in the subject!

Filed Under: Blogging, Media, Personal, Technology

Gregg Easterbrook’s global warming alarm — too little, too late

May 25, 2006 by Scott Rosenberg

Gregg Easterbrook has long been a foot-dragger in the global warming debate. He’s positioned himself as an optimist and a pragmatist and a non-alarmist. In practice that has meant knocking the Kyoto Treaty and, as an ostensibly liberal or at least centrist global-warming skeptic, providing cover for anti-environmentalists — sort of like how Joe Lieberman’s support of the Iraq war has provided the Bush administration with a fig-leaf of bipartisanship.

So on one level we should applaud Easterbrook’s piece on yesterday’s New York Times op-ed page declaring that, yes, he is finally now persuaded that global warming really is a problem, that all the returns are finally in and the weight of scientific evidence now overwhelmingly points to human activities as a major factor in the climate change we’ve begun to witness. “Based on the data, I’m now switching sides regarding global warming, from skeptic to convert,” he wrote.

I’m glad Easterbrook has chosen to declare his change of heart so publicly. But, you know, one thing we expect from pundits is that they be just a little bit ahead of the curve. Easterbrook’s 11th-hour conversion may provide some useful fodder in the propaganda battle against right-wing ostriches. In my book, it also discredits his further pronouncements on this topic.

His early call on global warming — don’t worry yet, things will probably work out okay, there’s still hope it’s all just statistical noise — was dead wrong. The people he derided as alarmists were right. So pardon me for suggesting that it is now time for Easterbrook to hang up his hat as an expert on this subject. I don’t want to hear his latest recommendations against Kyoto or his endorsement of carbon-trading schemes as the only solution to the problem. There are other people I trust a lot more, because they made the right call on this issue when it was a lot harder to make.

Pundits make risky guesses all the time. Those that guess right over time gain credibility; those who guess wrong ought to lose it. To express this in terms the market-loving Easterbrook can understand, it’s the risk/reward mechanism as applied to information. For example: Saying “Google is important!” today, or any time over the last several years, doesn’t win you any points. Those of us who said it back in 1998 — when the conventional wisdom of the bubble-dazzled industry was that search engines didn’t matter anymore — perhaps earned a little extra credibility when the prediction proved correct. Observers who accurately predicted the likely outcome of the invasion of Iraq — like Thomas Powers — are going to get a fuller hearing from me than those who cheered the ludicrous “cakewalk” talk and pooh-poohed the difficulty of rebuilding the nation post-Saddam.

So, as we struggle to figure out how to deal with global warming, I will continue to ask, “Why should I listen to Gregg Easterbrook?”, and place my bets on the observers who put their careers on the line to sound an early alarm — people like Bill McKibben, and, yes, Al Gore.

Filed Under: Media, Politics

Reluctance to give credit

May 12, 2006 by Scott Rosenberg

In the early days of the Web, when we were just getting Salon off the ground, we noted with amused snorts how big media outlets were unwilling to credit anyone doing original work online — they’d prefer, when they bothered to acknowledge a source at all, to use vague attributions like “a Web site” or “on the Web.”

These days Salon gets somewhat more respect. Hey, it’s only been ten years we’ve been doing our independent journalism thing — not long enough to belong to any clubs, assuming we’d even want that, but enough to warrant a named-attribution tip of the hat, some of the time.

The rise of blogs has occasioned another turn of this same wheel. Josh Marshall, whose Talking Points Memo (and new TPMMuckraker spinoff) regularly breaks news on the stories it focuses on, notes with amusement today that the New York Times won’t actually credit his site for a scoop about documents relating to bribery in the Dusty Foggo/CIA/Cunningham/Wilkes imbroglio.

The documents simply “appeared on the Internet Tuesday,” the Times story says. Apparently they simply materialized.

This is like writing about the Times’ scoop on NSA spying by writing, “News of the program appeared on paper last month.”

Filed Under: Blogging, Media, Salon

In case you missed it: Poniewozik on Colbert

May 8, 2006 by Scott Rosenberg

Amid the swarm of post-Colbert commentary, James Poniewozik’s acute observation on his blog at Time stands out:

  Colbert wasn’t playing to the room, I suspect, but to the wide audience of people who would later watch on the Internet. If anything, he was playing against the room — part of the frisson of his performance was the discomfort he generated in the audience…
  What anyone who said Colbert bombed because he didn’t win over the room fails to get is: the room no longer matters. Not the way it used to. The room, which once would have received and filtered the ritual performance for the rest of us, is now just another subject to be dissected online.

Filed Under: Media

Colbert’s critics should put away their laugh meters

May 3, 2006 by Scott Rosenberg

Today the agenda for discussing Colbert at the White House Correspondents’ Dinner is, “Was he, or wasn’t he, funny?”

As any performer knows, humor is intensely subjective; it is brittle, circumstantial; it depends on the moment, what came before, who’s in the room, how much they drank. I wasn’t there in that banquet room. It seems that Antonin Scalia found Colbert’s jokes hilarious; President Bush, along with much of the crowd, apparently did not. Viewing the video after the fact, I happened to find much of it funny. So have millions of downloaders and Bittorrent-ers and Youtube-sters.

But none of that really matters. Evaluating this event on laugh-meter scores is absurd — it’s just one more way of marginalizing and dismissing what actually happened that night. Just for a moment, Colbert brought a heavily sheltered President Bush face to face with the outrage and revulsion that large swathes of the American public feel for him and what he has done to our country. He did so at an event in which a certain level of jovial kidding is sanctioned, but he stepped far beyond. His caricature of a right-wing media toady relied on irony, and irony rarely elicits belly laughs, but at its best, it provokes doubt and incites questions. The ultimate goal of Colbert’s routine was not to make you laugh but to make you think; it aimed not to tickle but to puncture.

In that sense, those observers who have criticized Colbert for being rude to the president are absolutely right. As I wrote yesterday, the performance was a deliberate act of lese majeste. That means it was meant to pop the balloon of protective ritual around Bush and let reality in, so we can see him — along with those in the press who have been complicit with him — for what he is.

Inside the Beltway, humor is supposed to be disarming, “humanizing.” Ever since Richard Nixon appeared on “Laugh-in” and said “Sock it to me!,” suggesting that he was not quite the conservative gorgon that he seemed to be, politicians have wanted to use comedy as a prop in their own campaigns of self-promotion. But that’s a late-20th-century degradation of comedy. There’s an older tradition — stretching back to the commedia dell’arte and beyond, into the medieval court and its “all-licensed” fools — in which the comic seeks the discomfiture of the powerful.

Colbert’s act had less in common with cable-channel comedy shows than with the work of Dario Fo, the Italian iconoclast who specializes in lese majeste (he likes to poke fun at the Pope). In this it resembled Michael Moore’s Fahrenheit 9/11, but it was smarter than that propagandistic montage, and braver — delivered live, as it was, in the belly of the press-corps beast it was skewering.

So now we have the sad spectacle of the media desperately puffing air back into the popped balloon of the president’s dignity, pretending that nothing happened. The Bush impersonator was funnier! cry the pundits. Colbert bombed! Well, they can sneer all they want about whether or not he slayed ’em in D.C. Out here in the reality-based community that increasingly encompasses the American electorate, Colbert hit his targets. And they will never look quite the same.

Filed Under: Culture, Media, Politics

Stephen Colbert and the Beltway disconnect

May 2, 2006 by Scott Rosenberg

Sunday and Monday the Net was abuzz with word of Stephen Colbert’s bracing, revelatory acts of lese majeste at the White House Correspondents’ dinner. Videos were posted. Emails were exchanged. Word spread. This was, or at least felt like, a watershed event, an emperor’s-new-clothes sort of moment.

That, apparently, is not how it seemed from inside the Beltway bubble. Colbert’s highwire irony apparently left the D.C. press corps cold. It didn’t even merit a mention in the New York Times coverage of the event. Colbert “fell flat because he ignored the cardinal rule of Washington humor: Make fun of yourself, not the other guy,” the Washington Post told us. It seemed that a silly routine that President Bush concocted with a Bush impersonator went over better with this crowd.

At Salon we’re well accustomed to this disconnection between the D.C. consensus and the view from beyond the Beltway. We felt it keenly during the mad Monica days, when capital insiders and mainstream media boffins puffed themselves up with outrage at an inconsequential presidential transgression while a significant portion of the rest of the nation sat there thinking, “Get over it — move on, and get back to work on the real problems we face.” Today, this dynamic is inverted: the outrage lies beyond the Beltway, where it’s almost impossible to believe how badly the nation has been run into the ground by the current administration and its allies.

In Washington, it seems, the emperor’s nudity remains a verboten topic, and our leader is to be feted with business-as-usual niceties. Meanwhile, beyond the corridors of power, the clothes vanished a long time ago, the folly is transparent, and we can’t believe the ugliness of the resulting spectacle. Our young people are dying in a war based on a lie, our national leadership reeks of corruption, our economic well-being has been sold out for a mess of tax-break pottage, the global environment is being wrecked for our children, the absence of a smart energy policy has left us powerless in the face of an oil shortage — and we are supposed to be nice?

Maybe the editors and reporters in that banquet room didn’t find Colbert funny. Watching his performance at home, I couldn’t stop laughing.

[Watch Colbert here (Videodog, Youtube 1, 2, 3); read Michael Scherer’s Salon piece; there’s a full transcript over at Kos.]

LATE ADD: Dave Johnson calls the absence of mainstream Colbert coverage an “intentional blackout.” Me, I don’t think it’s coordinated in quite that way; newsrooms independently reach the same (wrong) conclusion about what’s newsworthy — then see their choices reinforced by those of their colleagues at other outlets. Mostly I think they resented Colbert’s jabs at them — and cheered themselves up by telling themselves that he wasn’t really funny.

Filed Under: Culture, Media, Politics, Salon

The plagiarism plague

April 27, 2006 by Scott Rosenberg

In the wake of the latest pair of plagiarists caught — a young Harvard student novelist with a fat book deal and the CEO of Raytheon — we are left, once more, to shake our heads and wonder: Why do they do it? Isn’t everyone on notice today that Google has made it virtually certain that you will get caught?

My assumption has always been that writers do it because, fundamentally, writing is hard, shortcuts are tempting, and some writers lack the self-discipline and/or self-respect to resist that temptation. That’s one of a bunch of possible reasons Jack Shafer’s essay on the latest plagiarists proposes over at Slate. Another rationale he suggests is “Even If You Get Caught, You’ll Probably Get Away With It.”

We can’t make writing any easier; it is what it is. But we — everyone in the fields of journalism, publishing and media — can surely do a better job of shaming and shunning those who are caught.

Filed Under: Media

Kurt Andersen and the new bubble, redux

April 25, 2006 by Scott Rosenberg

Last night Kurt Andersen posted a comment in response to my post below about his New York magazine article on the new Net bubble. It deserves highlighting. Andersen wrote:

  Actually, my point in admitting my ignorance in early 1994 (of the Web) and early 2000 (of blogs) and early 2002 (of RSS) was not so much the ignorance, which I don’t think was at all unusual (let alone extreme) at those respective moments — but rather how quickly in the internet realm the arcane becomes commonplace. (The phrase “worldwide Web,” for instance, had appeared exactly twice in the New York Times when I first heard it in early 1994; “blog” apparently wasn’t coined until the spring of 1999, and didn’t appear in the Times until the spring of 2001; and RSS first appeared in the Times in the spring of 2003.)

Fair enough. So Andersen’s point wasn’t to emphasize that he was unusually far behind the curve, but rather to underscore how speedily the phenomena he was catching up to would go mainstream. But I think these divergent readings of the same passage only end up underscoring my argument — that such things look very different from the West Coast end of the telescope.

I don’t know how useful it is to venture deeper into the thickets of chronology. “Early 1994” is a lot different from later 1994 in matters of early Web awareness; Peter Merholz may have coined the term “blog” in spring 1999, but the concept of “weblog” was long-established by then (I wrote in May, 1999: “A phenomenon known as the weblog is one of the fastest-growing and most fertile creative areas on the Web today”); RSS was in wide use at Salon and other places by 2000 and commonplace by 2001-2.

More interesting, to me, is the usage of New York Times reference-counting as a yardstick of prevalence. My argument was about how slow and sometimes blind the New York media culture can be to picking up on trends and practices that have already become commonplace elsewhere, particularly in Silicon Valley and the Web industry. It wouldn’t surprise me that New York Times keyword counts similarly lag. I mean, RSS first rearing its head in spring 2003? I — and a lot of other people — were living inside our feed readers by then.

Certainly, this industry moves fast. But the New York perspective tends to see new tech and Web trends as popping up instantaneously, out of nowhere, and that exaggerates their true speed and robs us of the opportunity to understand their provenance.

The Web wouldn’t have seemed like quite the bolt-out-of-the-blue if you’d been paying attention to the steady acceleration in Internet growth and awareness that had preceded it in the early ’90s (a lot of people had Internet e-mail before they’d ever heard the prefix ‘http’). Blogs were less of a surprise if you’d had an ear cocked to the remarkable flourishing of personal Web-based journals from 1995-8. If you checked in on any kind of frequency to Dave Winer’s Scripting News in the late ’90s, which a lot of us did, you couldn’t help getting an education in RSS.

All of which is simply to underscore my argument: that media people ought to pay a little more advance attention to technology people. The techies’ early-adoption enthusiasms serve as a distant-early-warning system — not infallible, but valuable — for the new wrinkle that will be a media-world craze in two or three years. I can understand how New York was blindsided in the 1990s. But there’s no excuse for it today.

Filed Under: Media, Technology
