Wordyard

Hand-forged posts since 2002


How the bridge news flowed

October 27, 2009 by Scott Rosenberg


[Photo: Bay Bridge cable down, from Twitpic via Larfo]

I have a very personal relationship with the ups and downs of the Bay Bridge replacement project. This is not only because I’m a Berkeley resident who often depends on the structure. And it’s not only because I’m lucky enough to have a view of the bridge (distant but majestic) from my back window.

I used the project as a framing device in Dreaming in Code. You see, there are always people pounding the table complaining, “Why can’t we build software the way we build bridges?” It’s a fair question, but it forgets a couple of things. There’s the obvious: software is abstract, bridges are physical, and therefore they are constructed differently and behave differently. But the table-pounders are also forgetting about the long history of bridge failures. As I watched the Bay Bridge project unfold during the time I worked on Dreaming, I began by wondering what made bridge-building and software construction so different. Three years later, as the bridge project had quadrupled in cost, been redesigned several times and been put on hold for many months by a political dispute, I ended up asking whether the two undertakings were really so different after all.

Now the bridge even has its very own bug, and some down time. They could hang a big Fail Whale from its girders!

As it happened, I spent this evening playing a new game with one of my sons, so I was relatively off the grid, and found out about the bridge’s sudden closure only when I scanned Twitter a little while ago.

I first turned to the SF Gate home page, where I found a solid and informative lead story that must have been assembled and posted by the Chronicle’s reporters and editors very quickly indeed. The Chron story also leads the Google News block on the event.

The Oakland Tribune also had a reasonably thorough piece, with a focus on commute details, that the San Jose Mercury News — now part of the same chain — reprinted, along with another Trib feature that basically compiled people’s Twitter messages about the event. The Santa Rosa Press-Democrat has a solid take posted, too.

KRON had a fairly full report of its own and easily accessible video from its newscast. CBS5 had an AP story and some raw video. KGO/ABC had a brief story. Yahoo had a fuller version of the AP’s story. KTVU had a story credited to itself and Bay City News.

Over at SFist I found a bloggy take on the event, with more links but less hard info than the Chron story (which SFist linked to). Other local blogs, like Berkeleyside and Oakland Local, also did some linking and summarizing.

CNN had a brief story. As I write this, the New York Times’ new Bay Area blog doesn’t have anything up. Wikipedia’s Bay Bridge page already has a sentence about the news. And over at Spot.us you can find a pitch — out for a while now but likely to see fresh wind in its sails — for an investigative project by some veteran journalists, backed by the Public Press and McSweeney’s (whose founder, Dave Eggers, seems to have kicked in a generous grant), looking into why the bridge project has had such problems.

So there you have it. The long-term investigative pieces that might once have come from the big-paper newsroom must now be funded by other means (I kicked in my $20!). But the papers are still doing some valuable spot-news work. With a story like this, at least, the best combination of speed and depth in an early report still comes from the leading local daily newspapers.

We knew that, of course. But we also know that we simply aren’t going to be able to count on having those sources that much longer. This week brought news of a precipitous decline in the Chronicle’s circulation. We should be planning (as Dave Winer has been urging for a long time) for life without it.

And that means figuring out how to make sure that our community has a way to find out what happened, and what’s going on, the next time a cable breaks on the bridge.

Filed Under: Media

People think the press gets a lot wrong. Maybe they’re right.

September 16, 2009 by Scott Rosenberg

[crossposted from the MediaBugs blog]

Americans trust the news media less than ever: “Just 29% of Americans say that news organizations generally get the facts straight, while 63% say that news stories are often inaccurate,” according to the latest results from the Pew Research Center released this week. That represents a jump of 10 percentage points since 2007, when 53% of Americans said that news stories were often inaccurate. And an alarming 70 percent of people surveyed believe that news organizations “try to cover up their mistakes.”

There’s a problem here, for sure. Many journalists understand this and work hard, every day, to try to solve it. Others are in denial. In reaction to this report, journalism scholar Jay Rosen wrote the following series of tweets yesterday:

Top explanations from journalists for fall in public confidence: 1. All institutions less trusted; 2. Cable shout-fest; 3. Attacks take toll

Top explanations from journalists for fall in public confidence, cont. 4. Environment more partisan; 5. Public confusion: news vs. opinion.

Top explanations from journalists for fall in public confidence, cont. 6. People want an echo chamber; 7. Numbers don’t really show a fall.

Each of these explanations doubtless has some merit. But together they constitute a kind of head-in-the-sand stance. Missing from the list is the simplest, most obvious explanation of all: Maybe we’ve lost confidence in the press because of its record of making mistakes and failing to correct most of them.

In other words, perhaps so many people think the news is full of inaccuracies because, er, they’re right.

Read Craig Silverman’s excellent book Regret the Error, based on his blog of the same name, and you’ll learn the sad numbers from the best studies we have on this topic: They show that the percentage of stories that contain errors ranges from 41 to 60 percent. Scott Maier, a journalism professor at the University of Oregon who has studied this field, tells Silverman that he found errors are “far more persistent than journalists would think and very close to what the public insists, which I had doubted.” Only a “minuscule” number of these errors are ever corrected.

Some of these errors are substantive, others seemingly trivial. But each one of them leaves readers or sources who know the topic shaking their heads, wondering how much else of the publication’s work to trust.

Since reversing this dynamic is the central goal of MediaBugs, we’ll be writing about it a lot here.

Filed Under: Media, Mediabugs

Bowden on Sotomayor: Blame the bloggers, again

September 8, 2009 by Scott Rosenberg

Mark Bowden is a seriously good reporter, and his piece in the new Atlantic, “The Story Behind the Story,” is one that every student of today’s mutating media should read. Bowden traces the route by which the soundbite that came to define, though not derail, Sonia Sotomayor’s Supreme Court nomination entered the media bloodstream. I can wholeheartedly recommend the reporting in Bowden’s piece, but I must take issue with some of his interpretation.

The “wise Latina” clip, it turns out, was first unearthed by a conservative blogger named Morgen Richmond and published on his blog, called VerumSerum. And the problem with that, Bowden suggests, is that Richmond, being a partisan in search of ammunition rather than a journalist in search of truth, presented it to the world without making an effort to understand it or put it in context — to see that, in fact, Sotomayor wasn’t saying anything that outrageous at all: As Bowden puts it, “Her comment about a ‘wise Latina woman’ making a better judgment than a ‘white male who hasn’t lived that life’ referred specifically to cases involving racial and sexual discrimination.”

Bowden credits Richmond as “a bright and fair-minded fellow,” but argues that his “political bias made him tone-deaf to the context and import of Sotomayor’s remarks. Bear in mind that he was looking not simply to understand the judge, but to expose her supposed hidden agenda.”

…he makes no bones about his political convictions or the purpose of his research and blogging. He has some of the skills and instincts of a reporter but not the motivation or ethics. Any news organization that simply trusted and aired his editing of Sotomayor’s remarks, as every one of them did, was abdicating its responsibility to do its own reporting. It was airing propaganda. There is nothing wrong with reporting propaganda, per se, so long as it is labeled as such. None of the TV reports I saw on May 26 cited VerumSerum.com as the source of the material, which disappointed but did not surprise Richmond and Sexton.

The trouble with all this is that Bowden is focusing his ire on the wrong people. Richmond is not, as far as I know, claiming to be a journalist — and yet, as Bowden admits, he is actually “fair-minded” enough to feel that the Sotomayor quote was maybe not that big a deal. Surely the failure here is on the part of the TV news organizations that turned it into a marquee soundbite without looking more deeply into it. Wasn’t that their job, their process, their vetting — the safeguard that ostensibly distinguishes them from the unwashed blogging masses? Aren’t they the ones who are supposed to be after truth rather than scalps?

Blogs may have helped accelerate gotcha journalism, but hit pieces and skeletons-in-closets existed long before their advent. The partisan warfare around Clarence Thomas’s nomination far outdid the Sotomayor hearings, and Anita Hill’s charges — whatever your view of them — required no blog posts to ignite their conflagration. The Web has crowdsourced opposition research, but the conflicts that motivate it have been around for ages.

It is television that creates soundbites; the Web at least allows for far more context and nuance, though it does not always deliver them. I do not understand how Bowden could fail to see this. He writes (of Richmond and his co-bloggers):

I would describe their approach as post-journalistic. It sees democracy, by definition, as perpetual political battle. The blogger’s role is to help his side. Distortions and inaccuracies, lapses of judgment, the absence of context, all of these things matter only a little, because they are committed by both sides, and tend to come out a wash. Nobody is actually right about anything, no matter how certain they pretend to be. The truth is something that emerges from the cauldron of debate. No, not the truth: victory, because winning is way more important than being right. Power is the highest achievement. There is nothing new about this. But we never used to mistake it for journalism. Today it is rapidly replacing journalism, leading us toward a world where all information is spun, and where all “news” is unapologetically propaganda.

“The blogger’s role is to help his side.” This is sometimes true, but no more definitive than to say, “The TV newsperson’s role is to help his side.” It is a broad-brush dismissal of an entire class of writers who are actually far more diverse in their goals and techniques. It is no more accurate than the carping of the extremists (of both left and right) who tar all “MSM” journalists with the sins of a minority of hacks or ideologues. It’s disheartening to see a writer of Bowden’s stature placing himself on that level.

There are pundits and news-show hosts who earn our trust as straight shooters, and there are others for whom partisanship plainly trumps truth. There are reporters who aim to shoot straight, and others who hide their own blatant partisanship behind a scrim of ersatz objectivity. In the end, all we can do is find individuals and institutions who, based on their record and their willingness to show their process, seem to place truth ahead of “victory.” Such individuals and institutions are no rarer on the Web, and among bloggers, than among the old guard of journalism. If the public is being ill-served by echo-chamber coverage and shallow sound-bite gotcha clips, the cable news channels bear primary responsibility. Bowden’s own narrative of the Sotomayor “story behind the story” is just the latest demonstration.

BONUS LINK: Here’s Richmond’s thoughtful response to Bowden.

Filed Under: Blogging, Media, Say Everything

Something there is that doesn’t love a paywall

August 20, 2009 by Scott Rosenberg

Before I built a wall I’d ask to know
What I was walling in or walling out,
And to whom I was like to give offence.
Something there is that doesn’t love a wall,
That wants it down!

— Robert Frost, “Mending Wall”

This week the conversation about pay walls for news sites online — a/k/a the “how do we make them pay for news?” question — has reached fever pitch. In what may well be remembered as the apex of the ostrich argument, the Washington Post’s Paul Farhi maintains, in the American Journalism Review, that newspapers should either “build that paywall high” or — this is where the ostrich beak burrows far below daylight — quit the Web entirely.

“Downplaying the Web, or dropping it altogether and going back to print only, looks not just smart for the struggling newspaper industry, but potentially lifesaving,” Farhi writes.

Never mind that newfangled printing-press thing! Can’t you see we’ve got scribes to support?

A number of exasperated media observers who think the paywall is a bad idea but who have grown tired of the endless debate are echoing Farhi’s cry: If you think it’s such a great idea, they’re saying to publishers, shut up already and just start charging.

For the thorough explanation of why the strategy is doomed, just read Alan Mutter’s post today: “Publishers consistently have told me that they fear they could lose 75% or more of their traffic and banner revenue if they started to charge for content.” My experience at Salon — where we briefly went “all pay” after 9/11, when the ad market disappeared — suggests that even this number is optimistic.

I’m exasperated too, but I won’t join the “put up or shut up” crowd, because I’d hate to see the further ghettoization of old-fashioned journalistic expertise on the Web. New models for news are sprouting on the Web every day. The journalism profession has a wealth of expertise and know-how; the paychecks of a dying industry will continue to dwindle, but that expertise can still be transmitted to a new generation of journalism ventures. That won’t happen if major media outlets wall themselves off from the Web. They will cut off not only their revenue but also their chance to influence the practice of journalism as it evolves online.

The alternative to “go ahead, build your wall” is for newspaper companies to accept that monopoly profits will not return and cannot be replaced. (Yes, I know that accepting such a reality is difficult and unlikely.) Instead, begin exploring new business models by starting from the revenue side and seeing what sort of complementary journalism can be supported.

John Robinson, editor of the Greensboro News & Record, has taken this notion to heart and called for ideas and proposals. Brainstorming rather than masonry — what a concept!

Filed Under: Media

Time to retire the term “blogger”?

August 18, 2009 by Scott Rosenberg

Has the word “blogger” become meaningless?

Consider this item (from Mediabistro’s Fishbowl LA):

We asked [Jay] Rosen what he thought of the term “blogger” and how there is not a word to distinguish a journalist who blogs and a numbnut who blogs.

“Blogger will become such a broad term it will lose all meaning,” he told FBLA.

Rosen later elaborated on Twitter:

We don’t say “Emailer James Fallows,” even though he uses email. Eventually, it will be the same with the term “blogger.”

Let’s unpack this.

“Blogger” confuses us today because we’ve conflated two different meanings of “blogging.” There is the formal definition: personal website, reverse chronological order, lots of links. Then there is what I would call the ideological definition: a bundle of associations many observers made with blogs in their formative years, having to do with DIY authenticity, amateur self-expression, defiant “disintermediation” (cutting out the media middleman), and so on.
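
Just to underline how mechanical that formal definition is, here’s a toy sketch in Python. The post data is invented purely for illustration, but the whole of the format fits in a sort and a loop:

    # The "formal definition" of a blog, reduced to code: posts on a
    # personal site, rendered newest-first, each carrying links.
    # The post data below is made up purely for illustration.
    from datetime import date

    posts = [
        {"date": date(2002, 9, 1), "title": "Hello, world",
         "link": "http://example.com/first-post"},
        {"date": date(2009, 8, 18), "title": "Time to retire the term 'blogger'?",
         "link": "http://example.com/retire-blogger"},
    ]

    # Reverse chronological order is the entire trick.
    for post in sorted(posts, key=lambda p: p["date"], reverse=True):
        print(post["date"], post["title"], post["link"])

Everything else we argue about under the word “blogger” belongs to the ideological definition, and no amount of code captures that.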

Today professional journalism has embraced the blog form, since it is a versatile and effective Web-native format for posting news. But once you have dozens of bloggers at the New York Times, or entire media companies built around blogs, the ideological trappings of blogging are only going to cause confusion.

Still — wary as I am of taking issue with Rosen, whose prescience is formidable — I don’t think we will see the term “blogger” fade away any time soon. There’s a difference between a term that’s so broad it’s lost all meaning and a term that has a couple of useful meanings that may conflict with each other.

After all, we still use the word “journalist,” even though it has cracked in two (“journalist” as professional label vs. “journalist” as descriptor of an activity). This is where human language (what programmers call “natural language”) differs from computer languages: our usage of individual words changes as it records our experience with their evolving meanings.

In other words, the multiple meanings of the word “blogger” may bedevil us, but they also tell a story.

Filed Under: Blogging, Media, Say Everything

Y Combinator’s “request for startups” in journalism

August 16, 2009 by Scott Rosenberg

I’m fascinated by this: Paul Graham’s startup-seeding outfit, Y Combinator, has announced that, with each new funding cycle, it’s now going to issue a sort of open call for submissions in a particular area. The general idea is what interested TechCrunch in writing the story up. But what caught my eye was the substance of the first request: “The Future of Journalism.”

The reason newspapers and magazines are dying is that what they do is no longer related to how they make money from it. In fact, most journalists probably don’t even realize that the definition of journalism they take for granted was not something that sprang fully-formed from the head of Zeus, but is rather a direct though somewhat atrophied consequence of a very successful 20th century business model.

What would a content site look like if you started from how to make money–as print media once did–instead of taking a particular form of journalism as a given and treating how to make money from it as an afterthought?

Bingo! To me, this passage crystallizes the problem with so much of the “how do we get consumers to pay?” head-scratching that is consuming media pros today. The death spiral of the old business model for news has some more twists and turns before the beast expires, but it is irreversible. The old bundle of information services and advertising that supported print journalism is gone, Humpty-Dumpty style, and nobody’s going to glue it back together. A deeper rethinking is needed, and those of us who want to see journalism thrive ought to be working hard to come up with answers to Graham’s question.

Graham envisions small teams that encompass writing, programming and design skills; in the Y Combinator model, they get a (very) small investment upfront, some bureaucratic assistance and some networking help. That’s one way of seeding lots of experiments. But I think Graham’s stark framing of the problem is as valuable as the bits of cash he’s spraying around; ambitious journalists and their programmer/designer friends don’t need to wait for Y Combinator to take up this challenge.

I have to admit that the phrase “treating how to make money from it as an afterthought” struck a nerve, because that really was how things were at the beginning at Salon and so many other journalism-oriented startups in the early years of the Web. This approach was understandable, and maybe excusable, in 1995; today, it’s a non-starter.

Graham’s challenge is elegantly simple: Instead of starting with the journalism and then puzzling out how to support it, start with the plan for revenue, then figure out what journalism might complement it. Recognize that the realm where innovation is most needed is the business side and how it relates to the journalism. Stop thinking of the two as a pair of unrelated entities lashed together, like some ungainly antique motorcycle/sidecar combo. Begin dreaming up, and testing out, approaches that provide a more organic connection between the reporting we need and the income that supports it.

This will sound alarms and seem heretical to all of us who grew up in the old “journalism on one side of a wall, business on the other” world. And yes, media businesses conceived along Graham’s lines will need not only a business plan but a plan for earning and keeping their readers’ trust.

I’m not too worried about that. It’s the easy problem, one that smart journalists already know how to handle. The business side, that’s the wicked problem. Ideas for solving it ought to make good starting points.

I’m grateful to Graham for boiling the issue down so neatly. And no, I don’t have any specific examples or ideas yet: if I did, I’d be assembling a team! But maybe you do.

Filed Under: Business, Media

Hunches — in combat, and on the Web’s wilds

July 29, 2009 by Scott Rosenberg

A lot of people have flagged Benedict Carey’s piece in yesterday’s Times, “In Battle, Hunches Prove to Be Valuable,” and with good reason: it’s a fascinating report on research into the way the brain combines visual data and emotional responses to shape the sort of instant-gut-reaction decisions that soldiers make as they evaluate threats.

The examples the piece draws on are from U.S. soldiers’ experiences in Iraq, where every stray boulder or trash heap by the roadside could be hiding a deadly bomb.

Reading Carey’s story, I thought of parallels in the distinctly less lethal — but still occasionally perilous — informational environment of the Web. What are the little signals that tell us, “You can trust this page”? And what are the red flags that tell us, “Watch out, something’s off here”?

These are important. Of course, they can help us protect ourselves from outright scammers (phishers who build lookalike bank websites to try to steal your passwords, and so on). But they can also help us sift and sort through the news and information that flows through our browsers, focusing on the good and discarding the bad.

Some of these signals are glaringly obvious (no “About” page? come on!). Others are subtler (are the writer’s arguments logical? Are statements of fact documented by links?).
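
Some of the cruder checks can even be automated, for whatever that’s worth. Here’s a purely illustrative Python sketch — heuristics, not verdicts, since a page can pass both tests and still be junk:

    # Two crude, machine-checkable trust signals: does the site link to an
    # "About" page, and how many outbound links back up its claims?
    # Illustrative only; real trust judgments are far softer than this.
    import re
    from urllib.request import urlopen

    def trust_signals(url):
        html = urlopen(url).read().decode("utf-8", errors="replace")
        links = re.findall(r'href="([^"]+)"', html, flags=re.IGNORECASE)
        return {
            "has_about_page": any("about" in link.lower() for link in links),
            "outbound_links": sum(1 for link in links if link.startswith("http")),
        }

    print(trust_signals("http://www.wordyard.com"))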

What are some of the tools you use? I’ll be teaching a workshop this coming weekend as part of the Stanford Professional Publishing Course, and would love to hear your suggestions.

Filed Under: Blogging, Media

A.P. goes nuclear on fair use

July 24, 2009 by Scott Rosenberg

“A.P. Cracks Down on Unpaid Use of Articles on Web.” That’s the headline on a New York Times article right now. But if you read the article, you see that the Associated Press’s new campaign isn’t only about “unpaid use of articles,” it’s about any use of headlines as links. In other words, it sounds like A.P. is pulling the pin on a legal Doomsday Machine for news and information on the Web — claiming that there is no fair use right to link to articles using a brief snippet of verbiage from that article, or the original headline on the article.

To put it concretely: if that Times story were by the A.P., I would be breaking the A.P.’s new rules just by using the ten words at the beginning of this post. My new book, which is filled with hundreds of quotes and URLs that (on the book’s website) link to the sources, would be a massive violation of the rules.

The A.P. seems to want to try to squeeze money both from Google and from sites that aggregate headlines. The Times story says: “The goal, [A.P. president Tom Curley] said, was not to have less use of the news articles, but to be paid for any use.” (Under A.P. rules, could I quote that?)

This move is foolish and self-defeating. If it has to, Google can simply block A.P. stories, and I’m sure it will choose to do that rather than agree to pay A.P.’s new fees. More simply, Google’s lawyers can point to the fact that any publisher can already opt out of Google’s system any time he/she wants to.
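
The opt-out really is trivial: under the long-standing robots exclusion convention, a publisher who wants out of Google’s index needs nothing more than a two-line robots.txt file at the site root.

    User-agent: Googlebot
    Disallow: /

No lawyers, no licensing deal; the crawler simply stops coming.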

The A.P. isn’t going to build the hundreds-of-millions-of-dollars business it speaks about on the back of this effort; the most it can hope for is to sequester its version of the news in a corner, away from the rest of the Web, where fewer and fewer will read it.

The danger is that this conflict will make it into the courts and some judge will narrow the fair use principle in ways that hurt both the Web and the free flow of information in our society.

As I wrote last year:

In the meantime, the biggest priority here for those of us who care about the long-term health of the web is that we don’t wind up with a terrible legal precedent that defines fair use in some newly constricted way. The people who are calling the AP out on this aren’t crazed piratical scofflaws; they’re journalists and authors, just as I am, people who pay the rent based on the value of the content they produce. But you need some assurance that you can quote brief excerpts or you can’t write non-fiction at all.

For a primer on this issue, you can see these posts (first, a second, a third, and a wrap) from last year, when A.P. got into a scrap with well-known blogger Rogers Cadenhead by sending him a legal takedown notice.

UPDATE: Zach Seward at Nieman Lab has a post covering some of the legal aspects of this story.

Filed Under: Blogging, Business, Media, Say Everything

Where’s Twitter’s past, and what’s its future?

July 21, 2009 by Scott Rosenberg

Blogs privilege the “now.” New stuff always goes on top. But they also create a durable record of “then” — as I have learned in spending the last couple of years digging through the back catalog of blogging. One of the great contributions of blogging software is to organize the past for anyone who writes frequently online. Before blogs, with each new addition to a website we had to think, where does this go, and how will I find it later? Blog tools, as personal content management systems, ended that era.

Twitter is great at “now.” But as far as I can tell, it’s lousy at “then.” It offers no interface to the past. You can’t easily navigate your way backwards in time.

Recently, I wanted to figure out the date of my first tweet. It’s still there in the database. But there’s no simple way to locate it. (Folks on Twitter pointed me to services like mytweet16 that dig up your oldest tweets, or tweetbook.in, which puts your whole Twitter history into a PDF, so there’s a way to do it, but not much of a useful interface.)

Each tweet is timestamped and lives at a unique URL. So it should be possible to build the machinery to organize one’s tweets into a more coherent record. (Dave Winer has written about this and done some work to store his Twitter past.) But — again, as far as I’ve been able to determine — we don’t really have a clear sense, or commitment from Twitter the company, of how long these URLs are going to be around.
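
To make the idea concrete, here’s a rough sketch of what such machinery might look like in Python. The endpoint and JSON field names reflect my understanding of the circa-2009 REST API, not anything Twitter has committed to, so treat the details as assumptions:

    # A sketch of the "machinery" imagined above: page backwards through a
    # public timeline and file each tweet by month, oldest first. Endpoint
    # and field names are assumptions based on the circa-2009 Twitter API.
    import json
    from collections import defaultdict
    from urllib.request import urlopen

    API = "http://twitter.com/statuses/user_timeline.json"

    def fetch_timeline(screen_name, max_pages=10):
        """Collect up to max_pages * 200 tweets, newest first."""
        tweets = []
        for page in range(1, max_pages + 1):
            batch = json.load(urlopen(
                "%s?screen_name=%s&count=200&page=%d" % (API, screen_name, page)))
            if not batch:          # ran out of history
                break
            tweets.extend(batch)
        return tweets

    def archive_by_month(tweets):
        """Group tweets into a month-keyed, oldest-first archive."""
        archive = defaultdict(list)
        for tweet in tweets:
            # created_at looks like "Tue Jul 21 20:30:12 +0000 2009"
            _, month, _, _, _, year = tweet["created_at"].split()
            archive["%s-%s" % (year, month)].append((tweet["id"], tweet["text"]))
        for key in archive:
            archive[key].sort()    # ids rise over time, so id order is chronological
        return dict(archive)

The monthly grouping is arbitrary; the point is that once your tweets sit in a store you control, any interface to the past (a calendar, a search box, a “one year ago today” widget) becomes buildable, with or without Twitter’s help.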

The other big weakness of Twitter as a sort of universal microblogging platform is that all its interaction is happening on one company’s server, in that company’s database. That poses some fierce technical problems if the Twitterverse keeps scaling up. (See for instance this comment by Chuck Shotton at Scripting News: “IMO, Twitter is a toy to be experimented with until it breaks and is replaced by a properly implemented solution that will persist, scale, and be as open as the protocols above.”)

If Twitter can engineer its way out of the scaling dilemma, we’re still looking at a platform that is owned by one company. One of Dave Winer’s original messages as a proto-blogger in the mid-’90s was to warn us about such platform ownership and to celebrate the arrival of the Web as the platform that nobody owns. Today Winer is sounding the same alarms about Twitter, and they are worth weighing. While I find Twitter far more open to the Web than, say, Facebook — which really feels like an AOL-style walled garden — it’s still just one company, with one “namespace,” or set of unique names for people to claim (good Twitter IDs will probably run out even faster than domain names).

To date I think Twitter has done a pretty fine job of serving its platform and its users — though I have qualms, as many do, about the way its Suggested User List mixes up editorial and business roles without taking full responsibility for either. But once the company decides it’s time to “monetize” — whether that happens next month or year or decade, and whether it’s handled sensitively or crudely — we are likely to see old-fashioned conflicts between serving users and serving the quarterly revenue targets re-emerge.

Best case: Twitter hits a home run by finding an innovation that, like Google’s targeted text ads, brings in revenue without degrading the primary service. (There is a subtle argument — espoused by Rich Skrenta and others — that Google, in monetizing its pages, corrupted the link-ranking on which its whole search engine depends. But for most of us, Google managed to make a fortune without noticeably reducing its usefulness — a neat feat.) Worst case: Twitter fails to figure out a business model and its investors grow impatient, forcing the service to overload us with advertising like a tanking dotcom in 2001.

On his blog at BNET, David Weir recently recorded the following comment from an anonymous Silicon Valley insider: “Twitter is exactly what the Internet was around 1996. It represents nothing less than the New Internet. It is the game-changer.”

I share the general enthusiasm for Twitter as a model for real-time interaction. But I don’t fully buy the “New Internet” notion. By 1996, people like me (and David Weir, and Evan Williams, and Dave Winer, and countless others) had flocked to the Internet because it was wide open. In the World of Ends formulation, “No one owns it. Everyone can use it. Anyone can improve it.” Twitter, exciting as it is, falls far short of that kind of game-changing.

[This post follows on from yesterday’s How Twitter Makes Blogs Smarter.]

Filed Under: Blogging, Media, Say Everything

Nikki Finke, David Carr, invisible rewrites and the Web’s original premise

July 20, 2009 by Scott Rosenberg

David Carr’s profile of Hollywood gossip blogger Nikki Finke contained two statements that I thought shouldn’t stand without challenge.

She isn’t always right and, as her critics have pointed out, she’s not above using the new-media prerogative of going into her archives and changing the bad call to a good one…

[Patrick Goldstein] hastens to add that Ms. Finke has gone into her own archive to correct errors. Bill Wyman, who blogs at Hitsville.org, documented an instance in which she altered a previous post about a director getting a job, then took credit for a scoop when it turned out to be somebody else.

Ms. Finke said both men were wrong on the specifics and each had a personal vendetta against her, a frequent theme whenever criticism of her work came up. She does say that she considers Web articles to be living things, reflecting “the latest information I have received.”

Yes, Web articles are “living things,” and they can and often are updated and fixed. But no responsible Web journalist makes substantive changes in copy after the fact without leaving a record — a strikeout, a note in the text indicating a change or update was made, or something like that. No one who self-identifies as an online journalist claims the right to make such invisible rewrites as a “new media prerogative.”

To admirers and detractors, she is the perfect expression of the Web’s original premise, which suggested that a lone obsessive could own the conversation.

The “Web’s original premise” was that if you created a simple standard for linked hypertext documents, people and institutions would add content and build a global repository of information. Tim Berners-Lee had no particular interest, as far as I know, in empowering “lone obsessives” or helping anyone “own” the “conversation.”

Filed Under: Blogging, Media
