Wordyard

Hand-forged posts since 2002

When cover-ups aren’t stupid at all

October 30, 2005 by Scott Rosenberg

Friday we learned that, according to prosecutor Patrick Fitzgerald, Scooter Libby told a series of bald lies to his grand jury. And so now we are hearing the old choral reminder, “it’s the cover-up, stupid.” Cover-ups are, by general acclaim, worse than the crimes they try to hide. This piece from today’s Times Week in Review is typical — it opens with Charles Colson of Watergate infamy declaring, “I don’t know why people don’t learn this lesson.”

According to this line of thinking, the denizens of the Beltway who keep getting caught engaging in cover-ups are all stricken with some similar malady: expedience cut with arrogance and the sense of invulnerability that comes with power.

The problem with this analysis is that it fails to engage with the practical, temporal dynamics of most cover-ups, in which the idea is not to cover something up indefinitely but to cover it up through some significant date — most often, an election. The part of the Watergate cover-up that mattered most was the part that took place between the June 1972 break-in and the November 1972 election; Watergate, we too often forget, was one incident in a massive campaign of election-tampering.

Similarly, Scooter Libby’s apparent lies need to be understood not in the abstract but in their place on the electoral calendar. The Bush administration’s paramount goal through the 2004 election cycle was to prevent an open national debate on the mistakes it had made in the run-up to the Iraq war. When Joe Wilson’s whistleblowing threatened to begin such a debate, Cheney’s office sprang into action; when the smear campaign backfired, the White House damage-control effort aimed to limit any fallout till after the election. That’s why we heard all the broad denials that look so incautious today: the lid had to be kept on this pot (just as the Senate had to be prevented from releasing, or even preparing, reports about White House misuse of intelligence data).

How important — and successful — all this was can be seen in this quote from Fitzgerald’s Friday press conference:

  FITZGERALD: I would have wished nothing better that, when the subpoenas were issued in August 2004, witnesses testified then, and we would have been here in October 2004 instead of October 2005. No one would have went to jail.

Here the prosecutor was talking about the delays to his investigation that stemmed from the refusal of journalists, most notably the New York Times’ Judith Miller, to testify. But he also reminds us that, under other circumstances — in which journalists had construed their public responsibilities differently and government officials hadn’t chosen the coverup route — his investigation could well have delivered its verdict on the threshold of a critically important election.

Maybe, in the absence of a coverup, Fitzgerald would have been left with nothing and no one to indict; or maybe he’d have been able to move more directly against the officials responsible for outing Valerie Plame. We’ll never know, of course. And so this alternate-history timeline of a 2004 election in which voters had a fuller picture of the Bush administration’s desperate, foolish, incompetent sell-the-war campaign remains unreadable.

Just as an act of perjury can thoroughly derail a criminal inquiry and make it impossible for the justice system to come to a clear determination of fact, so a coverup can, if it achieves a short-term goal, sometimes create new “facts on the ground” that no revelatory inquest or subsequent prosecution can roll back.

That is what Scooter Libby’s “coverup” achieved. It was one of the most significant of a series of “kick the can down the road” tactics that helped the Bush team eke out its 2004 win and hang on to power. Whether or not Libby ends up going to jail, I’d say he probably considers that a success.

Filed Under: Politics

Tell the casualties about those technicalities

October 25, 2005 by Scott Rosenberg

It is a strange, strange thing to hear the wagon-circling Republican spin on the likely forthcoming White House indictments:

  Senator Kay Bailey Hutchison, Republican of Texas, speaking on the NBC news program “Meet the Press,” compared the leak investigation with the case of Martha Stewart and her stock sale, “where they couldn’t find a crime and they indict on something that she said about something that wasn’t a crime.”

  Ms. Hutchison said she hoped “that if there is going to be an indictment that says something happened, that it is an indictment on a crime and not some perjury technicality where they couldn’t indict on the crime and so they go to something just to show that their two years of investigation was not a waste of time and taxpayer dollars.”

God forbid a special prosecutor should deliver charges based not on the original crime under investigation but instead on “some perjury technicality”! How soon they forget. A mere seven years ago, such technicalities led the Republican Party to impeach a popular chief executive presiding over an era of peace and prosperity. As it happened, the Starr inquest dug its way through the ancient history of the Clinton family finances for many more years than the Fitzgerald investigation has lasted — and, unable to find anything there, ended up prosecuting the president for lies he delivered about a tawdry sex scandal that had zero relevance to any national issue.

The Fitzgerald investigation, on the other hand, may hang on an arcane issue of revealing the identity of a CIA covert operative — but it is rooted in a controversy that is still costing American lives and harming the national interest. The Bush administration led the U.S. into war under false pretenses. It ignored the intelligence it didn’t want to believe and it ballyhooed information that it should have known was lies. Then it waged a brutal “politics of personal destruction” against anyone who questioned its misbegotten policies and the arrogant and incompetent policy-making process that spawned them.

At almost any other time in American history, such events would have naturally and inevitably inspired an independent investigation to find and air the truth. But the Bush administration learned its lesson from the 9/11 commission, and Republican dominance over “all three branches of government” today means that its cover-ups, generally, have been successful.

But it appears that the roundabout path of the Plame inquiry’s special prosecutor will finally begin to bring to light at least some of the heart of the matter. If there is heat, and with any luck some light, around these indictments, it will not be related to issues of anonymous sourcing in the press or the status of a covert agent: it will be about the American public finally getting a clear picture of just how far off the track of truth and sanity Bush and Cheney drove the country in their single-minded determination to invade Iraq.

That’s why the two columns on today’s New York Times op-ed page that attempt to downplay the importance of this issue are so off-base (and why my Salon colleague Joan Walsh is so on target in her analysis today). Nick Kristof argues, “It was wrong for prosecutors to cook up borderline and technical indictments during the Clinton administration, and it would be just as wrong today. Absent very clear evidence of law-breaking, the White House ideologues should be ousted by voters, not by prosecutors.” It remains to be seen what sort of “very clear evidence” Fitzgerald has or doesn’t have. If he has evidence that top White House officials, all the way up to the vice president, lied in an attempt to cover up a campaign of character assassination against a critic who presented evidence of a larger cover-up of a campaign of deception that led the nation into war, I’d say that goes well beyond a “borderline and technical indictment.”

John Tierney offers an even broader preemptive dismissal of the Fitzgerald indictments. His flip headline — “And your point is?” — suggests that the whole scandal is trivial. The CIA didn’t really know what was going on in Iraq; its analyses were all over the map, so we should give the Bush administration the benefit of the doubt. So what if they committed the nation to a bloody war of choice based on a mistake? They were just as confused as everyone else! “No one deserves to be indicted on conspiracy charges for belonging to a group that believed Iraq had weapons of mass destruction. Foreign policy mistakes are not against the law.”

No, they are not — but lying about them to a grand jury is. And while some mistakes are made because of bad information, the Iraq mistake was based more on willful arrogance, along with a tendency to shoot messengers like Joe Wilson who bore good information.

This is not about interns and stained dresses; it is about a tragic war that is still being tragically fought. And from where I sit, the 2,000 dead American soldiers, and an untallied greater number of dead Iraqis, are owed some truth on a level that the president and vice president are constitutionally incapable of delivering, or perhaps even comprehending.

Filed Under: Politics

Alan Kay: “Generate enormous dissatisfaction”

October 20, 2005 by Scott Rosenberg

I am entering the final sprint of completing a first draft of my book between now and Thanksgiving or so, so pardon my general bloggy sluggishness. My plan is to resume somewhat more active blogging in December and return in full blast by January.

In the meantime, here’s something that caught my eye:

One of the computing pioneers whose work I’ve had the pleasure of digging into for my book is Alan Kay. In the course of my research I had occasion to read Kay’s epic account of The Early History of Smalltalk. Smalltalk is the object-oriented programming language Kay created in the early 1970s at Xerox PARC (while he was also inventing much of the rest of modern computing). The paper is full of interesting stuff, but this observation near the end, about how to motivate yourself to tackle difficult challenges, jumped out at me:

  A twentieth century problem is that technology has become too “easy”. When it was hard to do anything whether good or bad, enough time was taken so that the result was usually good. Now we can make things almost trivially, especially in software, but most of the designs are trivial as well. This is inverse vandalism: the making of things because you can. Couple this to even less sophisticated buyers and you have generated an exploitation marketplace similar to that set up for teenagers. A counter to this is to generate enormous dissatisfaction with one’s designs using the entire history of human art as a standard and goal. Then the trick is to decouple the dissatisfaction from self worth — otherwise it is either too depressing or one stops too soon with trivial results.

“Generate enormous dissatisfaction” with one’s work — well, gee, that’s something most ambitious people know how to do, one way or another. But such dissatisfaction quickly blossoms into neurotic self-doubt. Ergo Kay’s careful recommendation to “decouple the dissatisfaction from self-worth”: that’s genius. And, I might add, really, really helpful to anyone laboring over a big project like, say, a book.

Of course, this means that you have to figure out other bases for self-worth than the work one has generated enormous dissatisfaction with!

Filed Under: Dreaming in Code, Food for Thought, Technology

Web 2.0 jottings

October 7, 2005 by Scott Rosenberg

Today I had to get some writing done, so I stayed away from the final sessions of Web 2.0 — where apparently, among other things, Google announced a new RSS reader (which was totally slammed and unreachable when I tried to visit earlier). But here are some notes from yesterday’s sessions.

I hadn’t heard of Writely before; it’s another Ajax-style Web app transposing a traditional software function into web-based software — in this case, word-processing. I’m putting it in the “check out when I have time” bin.

By many accounts, Zimbra was the hottest product to launch at the conference’s 13-company “Launchpad,” which featured plenty of other interesting debuts (Jeff Jarvis has good notes on the others). Zimbra is an Ajax-based Outlook replacement (e-mail, calendar, contacts). Its apparent homage to an old Talking Heads song was duly noted by whoever was running the music at Web 2.0; “I Zimbra,” the cryptic lead track from “Fear of Music,” could be heard between panels.

At the open source panel, Sun’s Jonathan Schwartz tried very hard to persuade us that what was really important about open source software isn’t that the code is open or that anyone can improve it but simply that it’s given away free. Mozilla’s Mitchell Baker did an excellent job of debunking this point of view, not by directly disputing it but by explaining exactly what’s so great about Firefox: “Our goal is to make things easy to change,” she said. “It’s easy to try things out. You can try things out quickly. We can try 15 or 20 things at once and see which work.”

And, she added, that “we” there? “It isn’t us.” That is, the people trying out 15 or 20 things aren’t sitting in the offices of the Mozilla Foundation or even part of the core development team; they’re all over the Web. And they can try those things out because, er, the code is open, not because the product costs zero dollars. Sure, most Firefox users aren’t programmers and can’t do anything with the source themselves. But they can benefit from a much broader set of improvements and options made possible by the open source model.

Jeremy Allaire debuted Brightcove, which looked basically like a content management system for video — not that interesting for end-users, but more for video producers or large-site managers looking to integrate more video. Still, pretty impressive as a well-thought-out approach to bringing more commercial video content onto the Web in ways that don’t totally freak out the “content owners” yet are not entirely hostile to the medium.

Jason Fried of 37signals offered a ten-minute rant on the virtues of “less” as a competitive advantage: “It takes three people to build anything online these days: if you have more than three people, you have too many.”

AOL’s Jonathan Miller told an amusing story of how, when he took over the company in the depths of the dot-com doldrums, he handled the resentment he found at various divisions of Time Warner, where employees and execs were disgruntled about how the AOL/Time merger had gone — they felt they’d been snookered by AOL. He told them about having his car towed in Manhattan, and visiting the godforsaken place you go to get your car, and waiting in line forever, and getting angrier and angrier, and finally getting to the front of the line and seeing a sign that read: “The person here did not tow your car. They are here to help you get your car back. If you cooperate, you will get your car back faster.”

That’s what he told the unhappy Time campers: “I did not tow your car.”

Mickey Hart was on stage at the end of the day Thursday, talking about the history of the Dead and the “tapers” the band allowed to record their shows. He pointed out ways in which that community was similar to today’s file-trading hordes, and ways that it was different. But one thing he said stood out for me: The Dead played for pay and they played for free; “we always played better when we played free.”

Filed Under: Business, Events, Technology

Diller’s tale

October 6, 2005 by Scott Rosenberg

Barry Diller was the kickoff interview here at Web 2.0 yesterday afternoon, which was more than a little odd, because Barry Diller does not appear to have anything to do with Web 2.0 — if, by Web 2.0, we mean, as conference hosts John Battelle and Tim O’Reilly said, an approach that involves innovation on the Web platform, an “architecture of participation,” lightweight business models, Web services with no lock-in, and so on.

No one has been smarter than Diller about rummaging through the broken and disused parts of old-Web flameouts and using them to assemble money-generating machines in relatively dull markets. And yet he has had no success — maybe even no interest — in creating innovative services or bringing new ideas to the Web. His company is a sort of Night of the Living Dot Com Dead.

Diller does not suffer fools — or interviewers — gladly, and he reserves a special sardonic disdain for tech-industry hype. That can be refreshing. I first heard his digital-skeptic act over a decade ago, at a panel at the old Intermedia conference in 1993, where he shared the stage with Bill Gates, Apple’s John Sculley and cable mogul John Malone. While the others spouted visionary platitudes, Diller simply fumed at their disconnection from his reality. (I wrote about the event for my old paper, here.)

Today, Diller is still wearing his skeptic’s hat; at Web 2.0 he turned it on those among the new wave of Web visionaries who have dared to dream that our new publishing and searching technologies might help bring a wider conversation into being beyond control of the broadcast world’s gatekeepers. “There’s just not that much talent in the world,” Diller says, “and talent almost always outs.”

On the one hand, Diller likes the Web, because it makes it easier for people to strut their stuff, if they have any: “If you have an idea, you can get it up and out, and good ideas resonate.” On the other hand, don’t expect some sort of renaissance of creativity to happen when the Web allows us to tap the talents of a wider swath of humanity: “I think that entertainment — TV, movies, games — I think it’s going to be a relatively few people who do that, simply because there is not enough talent, and it is not hiding out somewhere…”

For Diller, in other words, the Long Tail has no snap. Putting the tools of creation and distribution into the hands of the 99 percent of humanity who have hitherto had no access to them won’t fill a bigger pool of culture; the existing talent scouts of Hollywood and its equivalents have already done perfectly well, thank you, at tapping all the talent that’s there.

I’m sorry, I worked for 15 years as a theater and movie critic, and I know that Diller is wrong. Sure, I did my time working at a theater reading the slush pile of unproduced play submissions; I spent too many hours watching the awful 95 percent of movies that do manage to get produced and released. I don’t have any illusions about repealing Sturgeon’s Law.

But the promise of the Net, still not fulfilled but hanging there hopefully before us, is that a free, open, teeming network can actually provide more opportunity for “talent” to “out” than a handful of overworked script readers, slush-pile combers and A&R men. To think otherwise — to think that the existing corporate cultural system is the most efficient mechanism imaginable for the identification of artistic talent — is pure arrogance.

Based on what he said here, I think Barry Diller believes he is someone who understands the Internet because he knows so well how to make money through it. But I don’t believe he understands the first thing about what makes it anything more than just a money machine.

Filed Under: Business, Events, Technology

Salon’s new look

October 6, 2005 by Scott Rosenberg

I should take a brief break from the hurlyburly here at Web 2.0 to point whatever tiny handful of my blog readers over to Salon proper, where my colleagues have, as of last night, unveiled the central piece of the site’s continuing redesign. It’s the first Salon redesign that I have mostly sat out of, from my on-leave perch this year. It is, naturally, a collaboration by many great people. But if those of you with longer memories detect a certain feeling of connection with previous Salon designs of the mid and late ’90s — elegance and openness — that’s the hand of Mignon Khargie, Salon’s original design director, who returned to lend her sharp talents to the project.

There are, inevitably, kinks to work out, and some features that still need to be rolled in. But it’s great to see Salon beginning to evolve again. For a few years in the early part of this decade, we devoted a lot of energy simply to survival. Now we’re able to change and grow again.

[having trouble posting this remotely…flaky network here at Web 2.0 — let’s see if this works….]

Filed Under: Salon

Pop the bubbly

October 5, 2005 by Scott Rosenberg

John Battelle and Tim O’Reilly opened the second edition of the Web 2.0 conference this afternoon with an exchange along these lines: Battelle said that last year, the mood at the conference was simply, “We made it” — we survived the Internet industry’s dark winter. This year, he said, it’s more like, “Something really important is going on — let’s not screw it up.” O’Reilly added: “We are definitely running the risk of another hype cycle.”

I’d say it’s no longer a risk, it’s a reality. It’s too late in the evening to post too much about what I saw and heard today at Web 2.0 — more tomorrow. But let’s just say that the whiff of bubble-mania that was in the air at the conference’s first edition a year ago has now blossomed into a heady eau de dot-com.

The conference mixes up idealistic developers who have worked themselves half-blind coding the next super-cool but not-quite-usable-yet Web applications with sharp-eyed financiers looking for the next big thing that they can flip fast for a killing. In this regard, Web 2.0 — both the conference and the vague but real thing it is named for — is like the bastard offspring of the O’Reilly Emerging Technology Conference and the tech-investment gatherings of yore.

I do not know what will come of this not-so-holy union, but from the feel of things at the Hotel Argent today, it seems likely that a certain number of people will get rich, a certain amount of money will be wasted, several important new companies and technologies will emerge and some indeterminate number of investors will be fleeced. So that means it’s probably too late, John and Tim — the hype-cycle wheel is already in spin, up, up, up.

Filed Under: Business, Events, Technology

Manifesto destiny

September 27, 2005 by Scott Rosenberg

My old friend, game designer extraordinaire Greg Costikyan, has been ranting about the depressing state of the games industry recently. Tonight he announced that he is getting off his rhetorical duff to try to do something about its problems. He has quit his job and is forming a new company called Manifesto Games.

  Its motto is “PC Gamers of the World Unite! You Have Nothing to Lose but Your Retail Chains!” And its purpose, of course, will be to build what I’ve been talking about: a viable path to market for independent developers, and a more effective way of marketing and distributing niche PC game styles to gamers.

Greg is also planning to write about the whole process of launching the company on his blog. Since he’s argued that one of the roots of the industry’s malaise is its business structure, he intends to write publicly about the fascinating game of financing his startup. He’s a sharp writer and he doesn’t suffer fools gladly, so that should be…fun!

Filed Under: People, Technology

Deja vu all over again

September 25, 2005 by Scott Rosenberg

Those of us who lived through successive waves of the media industry’s infatuation with the Internet from 1996 through 2000 or so may have thought we’d seen every possible folly that can arise when people mistake the Web for a broadcast medium. We had Webshows and Netshows and Netcasts and all manner of awfulness from MSN and AOL, Time-Warner and the TV networks and Disney. (I fumed in Salon about this profusion of “channels” on the youthful Web back in 1997.) When the dot-com bubble broke, it seemed we could finally bid farewell to the delusion that you can “program” for the Web just like you program TV. Through all of that nuttiness, Yahoo was one of a small handful of companies that seemed to understand the fundamentally un-TV-ish nature of the Web, and it profited steadily from that understanding.

So I nearly sputtered out a mouthful of coffee Saturday morning when I read the New York Times’ piece about Lloyd Braun, the former TV exec who is now running a big chunk of Yahoo.

  As chairman of ABC’s entertainment group, Mr. Braun had a penchant for big offbeat concepts like “Lost,” which won the Emmy for best drama. At Yahoo, why not create programs in genres that have worked on TV but not really on the Web? Sitcoms, dramas, talk shows, even a short daily humorous take on the news much like Jon Stewart’s “Daily Show” are in the works…. So Mr. Braun’s job is straightforward: invent a medium that unites the showmanship of television with the interactivity of the Internet.

If you read the entirety of Saul Hansell’s piece, it seems clear that Braun and his boss Terry Semel aren’t entirely ignorant of the nature of the medium they’re working in. They know that Net-based video comes in little pieces, gets remixed by the multitude and spreads virally. But I guess they can’t shake off the habits of their professional lifetimes, because it sure sounds like they’re saying something remarkably similar to what we’ve heard from the discredited peddlers of “Net shows” past: Move over, all you amateurs and geeks, and let some real broadcasters teach you how it’s done! They may be publishing material on the Web, but they still think in terms of big-splash Events and boffo shows.

I know that a lot of smart people who deeply understand the way the Net functions work at Yahoo. The company made a savvy move in bringing on Kevin Sites to lead their first real effort in original content — he’s a versatile journalist who’s been living in the online cross-currents for several years now. Maybe Yahoo will prove my skepticism wrong, and its programmers will be the first of the multitude to go down the road labeled “Let’s make the Net more like TV” and find that it’s not a dead end. But it seems more likely to me that we’ll be reading headlines in two or three or four years about Yahoo shutting down a lot of its experiments in this area, just as its predecessors did.

Filed Under: Business, Media, Technology

Notable events

September 23, 2005 by Scott Rosenberg

I have been hunkered down writing, sticking religiously to the schedule I’ve imposed on myself. But I’ll venture forth from my den over the next couple weeks for a few things.

Tonight, I’m planning to go hear Ray Kurzweil talk at the Herbst Theater in San Francisco, as part of the Long Now Foundation’s seminars. I’ve always found Kurzweil’s vision of “the coming singularity” as interesting to ponder as it is hard to believe. He’s got a new book on the subject, too, titled “The Singularity Is Near.”

A week from Monday, I’ll be back in San Francisco to hear B.K.S. Iyengar, the founder of modern yoga, speak. I’ve been practicing Iyengar-style yoga for more than 10 years now, though I still feel very much like a beginner; it’s kept me sane through some major crises, including becoming a parent, being a parent, nursing a company through financial straits, and trying to write a book. Iyengar is 86 now, and says this visit to the U.S. (in part a book tour) will be his last. I’ll just be hearing him lecture; my wife, Dayna Macy, leaves this weekend for a week-long conference in Colorado organized by her company, Yoga Journal, where he’s giving a workshop, too. And she’ll be blogging about it (along with a former Salon colleague, Kaitlin Quistgaard, and other Yoga Journal folks).

Two weeks from now (10/7-9) is the latest edition of the Digital Storytelling Festival, to be held for the first time in San Francisco, over at KQED headquarters. I’ll only be able to attend part of the fest this year (I’ll be participating in a presentation about the work of the late festival founder, Dana Atchley) because it’s the weekend we celebrate our boys’ birthday, too. But I’m sure the whole thing will be great.

Finally, also that week, I’ll be trying to keep up with as much as I can of the second edition of Web 2.0, Oct. 5-7. The John Battelle/O’Reilly production will be my last chance to try to keep up with this ever-fermenting industry before I go into deep-retreat mode and attempt to finish my book. (Except I’ll have to emerge some time in November, because that’s when Salon is planning special, not-yet-announced but stay-tuned-for-more, 10th-anniversary festivities!)

Filed Under: Events
