Wordyard

Hand-forged posts since 2002


In Defense of Links, Part Two: Money changes everything

August 31, 2010 by Scott Rosenberg

This is the second post in a three-part series. The first part was Nick Carr, hypertext and delinkification. The third part is In links we trust.

The Web is deep in many directions, yet it is also, undeniably, full of distractions. These distractions do not lie at the root of the Web’s nature. They’re out on its branches, where we find desperate businesses perched, struggling to eke out one more click of your mouse, one more view of their page.

Yesterday I distinguished the “informational linking” most of us use on today’s Web from the “artistic linking” of literary hypertext avant-gardists. The latter, it turns out, is what researchers were examining when they produced the studies that Nick Carr dragooned into service in his campaign to prove that the Web is dulling our brains.

Today I want to talk about another kind of linking: call it “corporate linking.” (Individuals and little-guy companies do it, too, but not on the same scale.) These are links placed on pages because they provide some tangible business value to the linker: they cookie a user for an affiliate program, or boost a target page’s Google rank, or aim to increase a site’s “stickiness” by getting the reader to click through to another page.

I think Nick Carr is wrong in arguing that linked text is in itself harder to read than unlinked text. But when he maintains that reading on the Web is too often an assault of blinking distractions, well, that’s hard to deny. The evidence is all around us. The question is, why? How did the Web, a tool to forge connections and deepen understanding, become, in the eyes of so many intelligent people, an attention-mangling machine?

Practices like splitting articles into multiple pages or delivering lists via pageview-mongering slideshows have been with us since the early Web. I figured they’d die out quickly, but they’ve shown great resilience — despite being crude, annoying, ineffective, hostile to users, and harmful to the long-term interests of their practitioners. There seems to be an inexhaustible supply of media executives who misunderstand how the Web works and think that they can somehow beat it into submission. Their tactics have produced an onslaught of distractions that are neither native to the Web’s technology nor inevitable byproducts of its design. The blinking, buzzing parade is, rather, a side-effect of business failure, a desperation move on the part of flailing commercial publishers.

For instance, Monday morning I was reading Howard Kurtz’s paean to the survival of Time magazine when the Washington Post decided that I might not be sufficiently engaged with its writer’s words. A black prompt box helpfully hovered in from the right page margin with a come-hither look and a “related story” link. How mean to Howie, I thought. (Over at the New York Times, at least they save these little fly-in suggestion boxes till you’ve reached the end of a story.)

If you’re on a web page that’s weighted down with cross-promotional hand-waving, revenue-squeezing ad overload and interstitial interruptions, odds are you’re on a newspaper or magazine site. For an egregiously awful example of how business linking can ruin the experience of reading on the Web, take a look at the current version of Time.com.

Filed Under: Business, Media, Net Culture

In Defense of Links, Part One: Nick Carr, hypertext and delinkification

August 30, 2010 by Scott Rosenberg

For 15 years, I’ve been doing most of my writing — aside from my two books — on the Web. When I do switch back to writing an article for print, I find myself feeling stymied. I can’t link!

Links have become an essential part of how I write, and also part of how I read. Given a choice between reading something on paper and reading it online, I much prefer reading online: I can follow up on an article’s links to explore source material, gain a deeper understanding of a complex point, or just look up some term of art with which I’m unfamiliar.

There is, I think, nothing unusual about this today. So I was flummoxed earlier this year when Nicholas Carr started a campaign against the humble link, and found at least partial support from some other estimable writers (among them Laura Miller, Marshall Kirkpatrick, Jason Fry and Ryan Chittum). Carr’s “delinkification” critique is part of a larger argument contained in his book The Shallows. I read the book this summer and plan to write about it more. But for now let’s zero in on Carr’s case against links, on pages 126-129 of his book as well as in his “delinkification” post.

The nub of Carr’s argument is that every link in a text imposes “a little cognitive load” that makes reading less efficient. Each link forces us to ask, “Should I click?” As a result, Carr wrote in the “delinkification” post, “People who read hypertext comprehend and learn less, studies show, than those who read the same material in printed form.”

This appearance of the word “hypertext” is a tipoff to one of the big problems with Carr’s argument: it mixes up two quite different visions of linking.

“Hypertext” is the term invented by Ted Nelson in 1965 to describe text that, unlike traditional linear writing, spreads out in a network of nodes and links. Nelson’s idea hearkened back to Vannevar Bush’s celebrated “As We May Think,” paralleled Douglas Engelbart’s pioneering work on networked knowledge systems, and looked forward to today’s Web.

This original conception of hypertext fathered two lines of descent. One adopted hypertext as a practical tool for organizing and cross-associating information; the other embraced it as an experimental art form, which might transform the essentially linear nature of our reading into a branching game, puzzle or poem, in which the reader collaborates with the author. The pragmatists use links to try to enhance comprehension or add context, to say “here’s where I got this” or “here’s where you can learn more”; the hypertext artists deploy them as part of a larger experiment in expanding (or blowing up) the structure of traditional narrative.

These are fundamentally different endeavors. The pragmatic linkers have thrived in the Web era; the literary linkers have so far largely failed to reach anyone outside the academy. The Web has given us a hypertext world in which links providing useful pointers outnumber links with artistic intent a million to one. If we are going to study the impact of hypertext on our brains and our culture, surely we should look at the reality of the Web, not the dream of the hypertext artists and theorists.

The other big problem with Carr’s case against links lies in that ever-suspect phrase, “studies show.” Any time you hear those words your brain-alarm should sound: What studies? By whom? What do they show? What were they actually studying? How’d they design the study? Who paid for it?

To my surprise, as far as I can tell, not one of the many other writers who weighed in on delinkification earlier this year took the time to do so. I did, and here’s what I found.

Filed Under: Culture, Media, Net Culture

Why trust Facebook with the future’s past?

August 23, 2010 by Scott Rosenberg

Comments weren’t working for a while today. Apologies to anyone whose words got eaten! Should be working again now.

An odd moment during the Facebook Places rollout last week has been bugging me ever since.

From Caroline McCarthy’s account at CNet:

Facebook not only wants to be the digital sovereignty toward which all other geolocation apps direct their figurative roads, it also wants to be the Web’s own omniscient historian.

“Too many of our human stories are still collecting dust on the shelves of our collections at home,” Facebook vice president of product Christopher Cox said as he explained the sociological rationale behind Facebook Places… “Those stories are going to be placed,” Cox said. “Those stories are going to be pinned to a physical location so that maybe one day in 20 years our children will go to Ocean Beach in San Francisco, and their little magical thing will start to vibrate and say, ‘This is where your parents first kissed.'”

From Chris O’Brien’s post:

Cox: “…Technology does not need to estrange us from each other.”

“Maybe one time you walk into a bar, you sit down at the bar, and you put your magical 10-years-into-the-future phone down. And suddenly it starts to glow. ‘This is what your friend ordered here’. And it pops up these memories…’Go check out this thing about the urinal that your friend wrote about when they were here about eight months ago.’ ”

Cox explained that all these check-ins, photos, and videos could be gathered on pages about a place to create “collective memories.”

“That’s dope.”

Yeah, that’s dope all right. Doper still would be for Facebook to begin performing this role of “omniscient historian” or “memory collector” right now. As I’ve been arguing for some time, neither Facebook nor Twitter is doing a very good job of sharing the history we’re recording on them.

Everything we put on the Web is both ephemeral and archival — ephemeral in the sense that so much of what we post is only fleetingly relevant, archival in the sense that the things we post tend to stay where we put them so we can find them years later. Most forms of social media in the pre-status-update era — blogging, Flickr, Delicious, YouTube and so on — functioned in this manner. They encouraged us to pile up our stuff in public with the promise that it would still be there when we came back. As Marc Hedlund put it: public and permanent.

Twitter, at least, places each Tweet at a “permalink”-style public URL. So if you save a particular Tweet’s address you can find it again in the future. Otherwise, you’re out of luck. (You can make local copies of your Tweetstream, but that’s more of a backup than a linkable public archive.) Presumably Twitter is keeping all this data, and they’ve said that they’re handing a complete record over to the Library of Congress. But the data isn’t public and permanent for the rest of us. I think we’re just supposed to take it on faith that we’ll get the keys back to it eventually. (Jeff Jarvis says he interviewed Evan Williams and “told him I want better ways to save my tweets, making them memory.” Hope to hear more from that. By linking to Jeff’s tweet here I have fished it out for posterity, one needle plucked from the fugitive haystack.)

Meanwhile, Facebook is even less helpful. Lord knows what happens to the old stuff there. Is there any way to find what you wrote on Facebook last year? I hope so, for the sake of the millions of people who are chronicling their lives on Mark Zuckerberg’s servers. But I’ve certainly never been able to find it.

In fact, Facebook is relentlessly now-focused. And because it uses its own proprietary software that it regularly changes, there is no way to build your own alternate set of archive links to old posts and pages the way you can on the open Web. Facebook users are pouring their hearts and souls into this system and it is tossing them into the proverbial circular file.

All of which led me to wonder what Facebook could possibly be thinking in asking us to imagine Places as a future repository for our collective history. After all, Facebook could be such a repository today, if it actually cared about history. It has given no evidence of such concern.

Maybe in the future all manner of data will, as Cox put it so charmingly, cause our “little magical things to start to vibrate.” I mean, dope! But if my kids are going to find out about the site of their parents’ first kiss, I’ll have to provide that information to someone. I don’t think it will be Facebook.

Filed Under: Blogging, Media, Net Culture

Dr. Laura, Associated Content and the Googledammerung

August 20, 2010 by Scott Rosenberg

I was on vacation for much of the last couple of weeks, so I missed a lot — including the self-immolation of Dr. Laura Schlessinger. Apparently Schlessinger was the last public figure in the U.S. who does not understand the simple rules of courtesy around racial/religious/ethnic slurs. (As an outsider you don’t get a free pass to use them — no matter how many times you hear them uttered by their targets.) She browbeat a caller with a self-righteous barrage of the “N-word” — and wrote her talk-show-host epitaph.

I shed no tears for Dr. Laura — why do we give so much air time to browbeaters, anyway? — and I don’t care much about this story. But after reading a post over at TPM about Sarah Palin’s hilariously syntax-challenged tweets defending Schlessinger, I wanted to learn just a bit more about what had happened. So of course I turned to Google.

Now, it may have been my choice of search term, or it may have been that the event is already more than a week old, but I was amazed to see, at the top of the Google News results, a story from Associated Content. AC, of course, is the “content farm” recently acquired by Yahoo; it pays writers a pittance to crank out brief items that are — as I’ve written — crafted not to beguile human readers but to charm Google’s algorithm.

AC’s appearance in the Google lead position surprised me. I’d always assumed that, inundated by content-farm-grown dross, Google would figure out how to keep the quality stuff at the top of its index. And this wasn’t Google’s general search index recommending AC, but the more rarefied Google News — which prides itself on maintaining a fairly narrow set of sources, qualified by some level of editorial scrutiny.

Gee, maybe Associated Content is getting better, I thought. Maybe it’s producing some decent stuff. Then I clicked through and began reading:

The Dr. Laura n-word backlash made her quit her radio show. It seems the Dr. Laura n-word controversy has made her pay the price, as the consequences of herbrought down her long-running program. But even if it ended her show, it may not end her career. Despite being labeled as a racist, and despite allegedly being tired of radio, the embattled doctor still seems set to fight on after she leaves. In fact, the Dr. Laura n-word scandal has made her more defiant than ever, despite quitting.

I have cut-and-pasted this quote to preserve all its multi-layered infelicities. The piece goes on in this vein, cobbled together with no care beyond an effortful — and, I guess, successful — determination to catch Google’s eye by repeating the phrase “Dr. Laura n-word” as many times as possible.

The tech press endlessly diverts itself with commentary about Google’s standing vis-a-vis Facebook, Google’s stock price, Google’s legal predicament vis-a-vis Oracle, and so forth — standard corporate who’s-up-who’s-down stuff. But this is different; this is consequential for all of us.

I was a fairly early endorser of Google back in 1998, when the company was a wee babe of a startup. Larry Page impatiently explained to me how PageRank worked, and I sang its deserved praises in my Salon column. For over a decade Google built its glittering empire on this simple reliability: It would always return the best links. You could count on it. You could even click on “I’m feeling lucky.”

I still feel lucky to be able to use Google a zillion times a day, and no, Bing is not much use as an alternative (Microsoft’s search engine kindly recommends two Associated Content stories in the first three results!). But when Google tells me that this drivel is the most relevant result, I can’t help thinking, the game’s up. The Wagner tubas are tuning up for Googledammerung: It’s the twilight of the bots.

As for Associated Content, it argues — as does its competition, like the IPO-bound Demand Media — that its articles are edited and its writers are paid, and that therefore its pages should be viewed as more professional than those of your run-of-the-mill blogger in pajamas. I think they’ve got it backwards. I’ll take Pajama Boy or Girl any day. Whatever their limitations, they are usually writing out of some passion. They say something because it matters to them — not because some formula told them that in order to top the index heap, they must jab hot search phrases into their prose until it becomes a bloody pulp.

Let me quote longtime digital-culture observer Mark Dery, from his scorcher of a farewell to the late True/Slant:

The mark of a real writer is that she cares deeply about literary joinery, about keeping the lines of her prose plumb. That’s what makes writers writers: to them, prose isn’t just some Platonic vessel for serving up content; they care about words.

The best bloggers know a thing or two about this “literary joinery.” And even bad bloggers “care about words.” But the writer of Associated Content’s Dr. Laura post is bypassing such unprofitable concerns. He chooses his words to please neither himself nor his readers. They’re strictly for Google’s algorithm. The algorithm is supposed to be able to see through this sort of manipulation, to spit out the worthless gruel so it can serve its human users something more savory. But it looks like the algorithm has lost its sense of taste.

[I should state for the record that in the course of my business work for Salon.com I had occasion to meet with folks from Associated Content. They were upright and sharp and understood things about the Web that we didn’t, then. They’ve built a successful business out of “content” seasoned to suit the Googlebot’s appetite. It’s just not what we think of when we think of “writing.” And if this piece is any indication, there isn’t an editor in sight.]

BONUS LINK: If you want to understand more fully the process by which “news” publishers watch Google for trending topics and then crank out crud to catch Google’s eye, you cannot do better than this post by Danny Sullivan of SearchEngineLand. Sullivan calls it “The Google Sewage Factory”:

The pollution within Google News is ridiculous. This is Google, where we’re supposed to have the gold standard of search quality. Instead, we get “news” sites that have been admitted — after meeting specific editorial criteria — just jumping on the Google Trends bandwagon…

Filed Under: Business, Media, Technology

Bloomberg: a scarlet-letter correction policy?

August 12, 2010 by Scott Rosenberg

One of the things we’re trying to accomplish with MediaBugs is to encourage a change in newsroom culture. Journalists are still often reluctant to admit error, or even discuss the possibility of a mistake, for fear that it undermines their authority. But today a growing number of them understand that accuracy is best served, and authority best preserved, by being more open about the correction process.

That is the attitude we’ve encountered at most of the Bay Area news institutions where we’ve demoed MediaBugs. Unfortunately, it’s not what we found at Bloomberg when we tried to obtain a response on behalf of blogger Josh Nelson, who’d filed an error report at MediaBugs about a Bloomberg story.

Nelson raised a specific and credible criticism about the headline and lead on a Bloomberg report based on a national poll. Bloomberg’s coverage, Nelson argued, didn’t accurately reflect the actual question that its pollsters had asked about the Obama administration’s ban on deepwater oil drilling in the Gulf. (The story and headline said that “Most Americans oppose President Barack Obama’s ban” on such drilling, but the poll asked about a general ban on all Gulf drilling, while Obama has placed a temporary hold on deepwater drilling.) Bloomberg, as we described recently, circled the wagons in response.

The news service, of course, has every right to “stand by its story.” But since Nelson has raised a reasonable question, Bloomberg’s public deserves a reasonable response. It would be useful for its readers — and its colleagues at publications like the San Francisco Chronicle, which reprinted the story — to hear from the editors why they disagree with Nelson. Apparently they believe their copy accurately reflects the poll they took, but they have yet to offer a substantive case explaining why.

Institutional behavior of this kind always leaves me scratching my head. A comment posted on our previous post on the Bloomberg bug over at the PBS MediaShift Idea Lab proposed an intriguing theory: A former Bloomberg journalist suggested that the company’s personnel policies came down so hard on employees who made errors that they were reluctant to admit them at all.

These standards, which are meant to make people super-careful before publishing a story, actually serve as a perverse incentive and cause people at all levels of the newsroom to resist correcting stories after they are published if there is any way to justify leaving the story as is.

This was, we thought, worth a follow-up, and so we contacted the commenter. He turned out to be Steven Bodzin, who’d worked as a reporter in San Francisco and Venezuela for Bloomberg for four years before leaving the company in March. My colleague Mark Follman spoke at length with Bodzin last week.

Bodzin said he “rarely saw complaints from the public get ignored.” He told us that Bloomberg’s culture is actually “hypersensitive” to public response but especially focused on issues raised by sources or by customers who subscribe to its terminal service (Bloomberg’s business was built on selling real-time market data to the financial industry over its own network — only later did it begin distributing news and information on public networks).

Bodzin described his own “prolific” first year as a Bloomberg correspondent, during which five of his stories were cited as exemplary in the company’s weekly internal reviews. He also had an unusually high number of corrections that year — which he attributed to the intense pace of the job — and got the message from his superiors that “you really have to bring that down.” He says that made him more careful. But he observed that the stigma that Bloomberg attached to corrections also encouraged a sort of silence in the newsroom in the face of potential problems.

Certainly there were situations where you realize something is wrong but you’re gonna say “I didn’t see that” or just forget about it.
At Bloomberg that’s considered a really serious offense … but at the same time, if you or nobody else mentions it … no harm no foul. I think it happens. One time a colleague of mine, who’d already had one correction that day, saw one and said to me: “I am Olympically burying this error.”

We asked Bodzin about the specific issue Josh Nelson raised about the drilling-ban poll.

They see this case as a question of interpretation, a judgment call — this is their own poll, a lot of reporters and editors are involved, so they [would all] get a correction. So they aren’t going to want to do it.

What we’re looking at here isn’t some revelation of blatantly irresponsible behavior but a subtler insight into the complex interplay of motivation inside a big organization. Bloomberg is hardly the only company where such a dynamic may be at work. What’s important is that the people who lead such institutions understand the need to change the dynamic — to rebalance the incentives inside their newsrooms.

Unfortunately, this incident suggests that Bloomberg’s culture today clings to the wagon-circling habit. As so much of the rest of the journalism field moves toward more open models, it remains an old-fashioned black-hole newsroom, happy to pump stories out to the world but unwilling to engage with that world when outsiders toss concerns back in. Bodzin explained, “Staffers aren’t supposed to talk to press at all — you’re supposed to send reporters to the PR department.”

And that’s exactly what we found when we tried to get comment from Bloomberg about the issues Bodzin raised. When we asked senior editors at Bloomberg to discuss their own policies and newsroom culture, they shunted us over to Ty Trippet, director of public relations for Bloomberg News, who wrote back:

Our policy is simple: If any Bloomberg News journalist is found to be hiding a mistake and is not transparent about it, their employment with Bloomberg is terminated.

So Bloomberg looks at a nuanced psychological question of newsroom behavior and responds with an “Apocalypse-Now”-style “terminate with extreme prejudice.” Doesn’t exactly give you confidence about the company’s ability to foster a culture of openness around the correction process.

Earlier this week Bloomberg announced the hire of Clark Hoyt — the Knight Ridder veteran who for the last three years served as the New York Times’ public editor. In that ombudsman-style role he served as a channel for public concerns about just the sort of issues we are raising here about Bloomberg.

Though Hoyt’s new management job at Bloomberg’s Washington bureau isn’t a public-editor role, it does put him squarely in the chain of command for stories like the oil-drilling poll. So maybe he’ll look into this, and also more generally at how Bloomberg handles public response to questions of accuracy. Right now, the company’s stance is one that hurts its reputation.

Filed Under: Media, Mediabugs

Change is good, but show your work: Here’s a WordPress revisions plugin

August 3, 2010 by Scott Rosenberg

A couple of weeks ago I posted a manifesto. I said Web publishers should let themselves change published articles and posts whenever they need to — and make each superseded version accessible to readers, the way Wikipedians and software developers do.

This one simple addition to the content-management arsenal, known as versioning, would allow us to use the Web as the flexible medium it ought to be, without worrying about confusing or deceiving readers.

Why not adopt [versioning] for every story we publish? Let readers see the older versions of stories. Let them see the diffs. Toss no text down the memory hole, and trigger no Orwell alarms.

Then I asked for help creating a WordPress plugin so I could show people what I was talking about. Now, thanks to some great work by Scott Carpenter, we have it. It’s working on this blog. (You can get it here.) Just go to the single-page form of any post here (the one that’s at its permalink URL, where you can see the comments), and if the post has been revised in any way since I published it, you can click back and see the earlier versions. You can also see the differences — diffs — highlighted, so you don’t have to hunt for them.
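The diff display at the heart of this is a simple idea, and it can be sketched in a few lines. The actual plugin is WordPress (PHP) code, so what follows is only a conceptual illustration in Python, using the standard difflib module and made-up sample text, of how superseded words and new words get marked up for the reader:

```python
import difflib

# Two hypothetical revisions of a post (illustrative text, not a real post).
published = "Links have become an essential part of how I write. Given a choice, I prefer reading online."
revised = "Links have become an essential part of how I write and read. Given a choice, I much prefer reading online."

def show_diff(old, new):
    """Return the revised text with deletions and insertions marked up,
    roughly the way a revision-display plugin highlights diffs."""
    old_words, new_words = old.split(), new.split()
    out = []
    matcher = difflib.SequenceMatcher(None, old_words, new_words)
    for op, a1, a2, b1, b2 in matcher.get_opcodes():
        if op == "equal":
            # Unchanged words pass through as-is.
            out.extend(new_words[b1:b2])
        if op in ("delete", "replace"):
            # Words present only in the earlier version.
            out.extend("<del>%s</del>" % w for w in old_words[a1:a2])
        if op in ("insert", "replace"):
            # Words added in the later version.
            out.extend("<ins>%s</ins>" % w for w in new_words[b1:b2])
    return " ".join(out)

print(show_diff(published, revised))
```

The `<del>` and `<ins>` tags are the standard HTML elements for marking removed and inserted text, which browsers typically render as strikethrough and underline — so the reader doesn’t have to hunt for what changed.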

In the less than two weeks since my post, we’ve seen several examples of problems that this “show your work” approach would solve. One of them can be found in the story of this New York Times error report over at MediaBugs.

An anonymous bug filer noticed that the Times seemed to have changed a statistic in the online version of a front-page story about where California’s African Americans stood on pot legalization. As first published, the story said blacks made up “only” or “about 6 percent” of the state population; soon after it was posted, the number changed to “less than 10 percent.” There’s a full explanation of what happened over at MediaBugs; apparently, the reporter got additional information after the story went live, and it was conflicting information, so reporter and editor together decided to alter the story to reflect the new information.

There is nothing wrong with this. In fact, it’s good — the story isn’t etched in stone, and if it can be improved, hooray. The only problem is the poor confused reader, who saw a story that read one way before and now reads another way. The problem isn’t the change; it’s the failure to note it. Showing versions would solve that.

Another Times issue arose yesterday when the paper changed a headline on a published story. The original version of a piece about Tumblr, the blogging service, was headlined “Facebook and Twitter’s new rival.” Some observers felt this headline was hype. (Tumblr is successful but in a very different league from the vastness of Facebook and Twitter.) At some point the headline was rewritten to read “Media Companies Try Getting Social With Tumblr.” Though the article now sports a correction fixing some other errors, it makes no note of the headline change.

I don’t know what official Times policy is on headline substitution. Certainly, Web publications often modify headlines, and online headlines often differ from print headlines. Still, any time there’s an issue about the substance of a headline, and the headline is changed, a responsible news organization should be forthright about noting the change. Versioning would let editors tinker with headlines all they want.

I do not mean to single out the Times, which is one of the most scrupulous newsrooms around when it comes to corrections. Practices are in a state of flux today. News organizations don’t want to append elaborate correction notices each time they make a small adjustment to a story. And if we expect them to, we rob ourselves of the chance to have them continuously improve their stories.

The versioning solution takes care of all of this. It frees writers and editors to keep making their work better, without seeming to be pulling a fast one on their readers. It’s a simple, concrete way to get beyond the old print-borne notion of news stories as immutable text. It moves us one decent-sized step toward the possibilities the Web opens up for “continuing stories,” iterative news, and open-ended journalism.

How the plugin happened: I got some initial help from Stephen Paul Weber, who responded to my request to modify the existing “post revision display” plugin so that it would list only revisions made since publication. Weber modified the plugin soon thereafter (thank you!). Unfortunately, I failed to realize that that plugin, created by D’Arcy Norman, provided access to version texts only to site administrators, not to regular site visitors.

Scott Carpenter, the developer who’d originally pointed out the existing plugin to me, stepped up to the plate, helped me work up a short set of requirements for the plugin I wanted, and set to work to create it. Here’s his full post on the subject, along with the download link for the plugin. We went back and forth a few times. He thought of some issues I hadn’t — and took care of them. I kept adding new little requirements and he knocked them off one by one. I think we both view the end-product as still “experimentally usable” rather than a polished product, but it’s working pretty well for me here.

As the author of a whole book on why making software is hard, I’m always stunned when things go really fast and well, as they did here. Thanks for making this real, Scott!

If you run WordPress and like the idea of showing your work, let us know how it goes.

Filed Under: Media, Mediabugs, Software

You are not an eyeball: Why tracking is the ad biz’s last gasp

August 1, 2010 by Scott Rosenberg

Marketers are following you around on the Internet. They don’t know your name but they know what you do, what you buy, where you buy it, what you’re interested in, and more. The sites you visit collect this information on behalf of networks that then roll you up with other like-minded people in packages, as if you were a subprime mortgage, and sell your eyeballs to advertisers.

People inside the Web industry generally know all this and take it for granted. People outside mostly don’t. That explains some of the wide variation in reaction to a big package the Wall Street Journal published Saturday that chronicles how advertisers track users online.

I found it fascinating that two of the smarter Web veterans I know — Jeff Jarvis and Doc Searls — arrived at opposite perspectives on the Journal coverage. How did that happen? Let’s climb what I’ll call the ladder of reaction to this story and see.

At the bottom rung, we have a simple everyday reader’s freakout. OMG They’re spying on us! This, it seems to me, is the level at which the Journal’s coverage was pitched. It’s full of loaded language: A headline that refers to “your secrets.” References to “surveillance” and “surreptitious” practices. Repeated use of the phrase “sophisticated software” to describe run-of-the-mill stuff that we’ve lived with for years, like the cookie files invented at the dawn of the Web by Lou Montulli (and that anyone can easily delete from their browser).

On the next rung up the ladder we have what I predict will be the response of the punditocracy, the editorial page writers and columnists. They will weigh in early this week, shake their heads in disapproval and demand that the government step in and pile more privacy regulations on the Internet advertising industry.

This will drive the Web industry insiders — up on the ladder’s third rung — even crazier than the Journal feature itself did. For them, the activities the Journal describes are simply old news. This is where we find Jeff Jarvis, who described the Journal feature as “the Reefer Madness of the digital age”: “I don’t understand how the Journal could be so breathlessly naive, unsophisticated, and anachronistic about the basics of the modern media business.” Similarly, Terry Heaton found the Journal’s coverage biased and behind the curve: “It’s like somebody at the paper had been sleeping for ten years and woke up to discover it’s the year 2010!”

Insiders will worry that an anti-tracking backlash might throttle the Web advertising industry at just the moment when big media institutions are praying that online ad revenue might help them make up for all the ad income they’re losing in their offline businesses.

Even more important, they will argue that tracking isn’t an invasion of privacy at all, since the advertisers mostly don’t know you by name or personal identity. Instead, they see you as a bundle of demographic traits and acquisitive tendencies. We owe the maintenance of this important distinction to an ad-tracking scare of a previous era, the great DoubleClick/Abacus controversy of 1999. Yes, this issue has been with us since 1999, which does make you wonder about the Journal’s breathless tone today.

The most important argument the insiders make is the very simple one that tracking, done right, actually performs a useful service: It helps reduce your exposure to ads you don’t care about and shows you more ads that you actually want to see.

This brings us up high to rung number four, where we meet Doc Searls, who is sitting on his own little platform that he’s built over the years, and inviting us to sit down with him and listen.

And he’s saying to the Web insiders: You guys are missing two points. The first is that “most real people are creeped out by this stuff,” even if it is old hat to you. The second is that you aren’t thinking big enough if you think that tracking users’ behavior is the best the Web can do.

You think the Web is all about making inefficient advertising more efficient, when it’s really about eliminating advertising as we have known it entirely, by giving us “better ways for demand and supply to meet — ways that don’t involve tracking or the guesswork called advertising.”

Searls has been elaborating this argument from the early days of the Cluetrain Manifesto to his current work at Project VRM. He’s saying: We know ourselves and our needs better than any third party’s guesswork. The Internet can enable us to speak directly to the marketplace about what we want. We can have a direct conversation with vendors of the things we are thinking about purchasing:

if I had exposed every possible action in my life this past week, including every word I wrote, every click I made, everything I ate and smelled and heard and looked at, the guesswork engine has not been built that can tell any seller the next thing I’ll actually want… Meanwhile I have money ready to spend on about eight things, right now, that I’d be glad to let the right sellers know, provided that information is confined to my relationship with those sellers, and that it doesn’t feed into anybody’s guesswork mill.

I find Searls’ vision appealing, even as I recognize the disruption it portends. The end of advertising also means the end of the business of delivering eyeballs to advertisers. It means that creative people and journalists and other “content creators” will need to abandon the old media’s compromised triangle trade (with creators ferrying consumers to advertisers) and learn how to fill public needs directly. That means we’ll need new ways to fund public-good information (foreign news, accountability journalism, investigations) once we can no longer pay for it with the overflow from advertising-monopoly profits.

That’s the future. Today, I actually think the Journal is doing a public service by writing about stuff industry insiders already know about — even if the paper went over the top in its intimations of dark marketing conspiracies. But it would be so much more of a service to look beyond the desperate thrashings of the badly wounded ad industry — and toward the better model that is struggling to be born.

Filed Under: Business, Media

Heffernan vs the SciBloggers: when community becomes commodity

July 30, 2010 by Scott Rosenberg

As you may have read, a group of high-profile and high-quality science bloggers recently left the network that had long housed them because the parent company had done a deal with Pepsi to create a nutrition blog in their midst.

Now we have a high-handed column from the New York Times’ Virginia Heffernan, which basically tells these bloggers: Grow up. Get real. This is the way the world works!

Most writers for “legacy” media like newspapers, magazines and TV see brush fires over business-editorial crossings as an occupational hazard. They don’t quit every time there’s an ad that looks so much like an article it has to be marked “this is an advertisement.”

That may be because they have editors who (when they’re good) fight to defend standards against the encroachment of the business side. These bloggers had no choice but to represent themselves.

Heffernan goes on to fume about the bloggers’ “eek-a-mouse posturing” and mines their work for quotes that make them look silly or small-minded. I’ve read a lot of these blogs over the years and don’t recognize them in her portrait.

But she misses the bigger story here, so let me lay it out for you. The ScienceBlogs saga is a version of a tale that keeps repeating itself in our online culture — the one where a group of people who (correctly or not) thought of themselves as a community discover that they are being treated as a commodity.

This has been happening from the very beginning of human congregation online. It happened when AOL got sued by its moderators; it happened when the WELL’s pioneers lost their trust in the businessman who bought the service in the mid-’90s. I’m sure it will keep happening, so let’s try to understand it a little better than Heffernan does.

The ScienceBlogs affair is not a case of a bunch of reporters in a newsroom crying foul because a church/state line was crossed. This is a group of writers who believed they were collaborating in their own little space on the Web, a meritocracy of sorts built on their own labor. Then they woke up to the rude realization that somebody else owned their real estate — and was going to sell some of the space without their having any say in the matter.

As I understand it, the Pepsi blog was not an advertorial; it was a blog manned by Pepsico-salaried nutritional scientists. It might have been a good blog, for all we know. But it represented a change in the rules. The bloggers weren’t consulted. They thought of themselves as party hosts, and discovered that management thought of them as “a source of revenue” (in the words of Bora Zivkovic, a SciBlogger who wrote the definitive post on the controversy).

Heffernan might try imagining that her Times employers had sold the office or cubicle next to hers to some sponsor’s hand-picked writer, who would henceforth fill the magazine page opposite hers: “Here’s a sponsored journalist — have fun together!”

But, really, it’s not the details of the Pepsi blog that are important. After all, ScienceBlogs’ owner, Seed, withdrew the scheme once the bloggers raised a ruckus. It was too late. The bloggers had lost the illusion that they were involved in a community; they saw the businessman behind the curtain. There was no going back.

This loss of innocence is, I think, a nearly universal experience online. It occurs when one’s initial surge of idealistic delight at the freedom and opportunities of boundless self-expression slams into the realities of the media business online.

People who have experienced this will thereafter keep their antennae out, much more finely tuned to questions of ownership and governance and autonomy. They will not use the word “community” without thinking about it. They will also never again feel quite the same unqualified delight in sharing their writing online.

Should the science bloggers have known what was coming? Should they have been less innocent? Probably. But then they might not have been as exuberantly good at what they did.

I don’t think the outcome is a tragedy. The former ScienceBloggers will continue to be science bloggers, producing great posts and forming new communities. I think they’ll just handle the business-and-independence issues a little more carefully next time around. They are learning from their experience; I wish Heffernan had done so too.

BONUS LINKS: Ex-SciBlogger David Dobbs has a thoughtful response on his Neuron Culture blog.

And Jason Goldman, still on SciBlogs, helps point Heffernan to where the “real science” can be found there.

LATE UPDATE: Heffernan has posted a response at Dobbs’ blog.

Filed Under: Blogging, Media

Wikileaks: when it’s not a scoop but it’s still news

July 30, 2010 by Scott Rosenberg

In the chorus of critical reaction to the Wikileaks Afghanistan documents we heard two strains of criticism: One suggested that the material would harm the U.S. war effort and endanger people working for it. The other suggested that, because no earth-shattering headline could be mined from the mountain of documents, the whole thing was a waste of time.

I’m not in a position to offer strong views on the first criticism — except that, as a journalist, I always lean toward disclosure unless there’s clear likelihood of immediate harm to specific individuals. But the second criticism needs some review.

News organizations have always competed on the basis of scoops. The Wikileaks documents haven’t offered them anything that they can recognize as a scoop. You can picture the conversation:

Editor: What’d you find?
Reporter: Well, there’s a ton of fascinating detail about a lot of incidents. A little more detail about the problems with Pakistani intelligence. And a whole lot of local color…
Editor: Just give me the top line. What’s the headline?
Reporter: Uh, “Afghan war going as badly as everyone thought”?
Editor: Go find a fire somewhere, wouldja?

The journalistic ecosystem runs on scoops — pieces of information, not already public, that one news organization has and others don’t. The public cares less about this competition for scoops; it simply desires news — information it needs and wants to know, and that it didn’t previously have.

Not all scoops are real news. And now, with Wikileaks’ Afghan docs, we have a big example of real news that isn’t a scoop. I call it real news because it is a body of previously unavailable-to-the-public information about a matter that ought to be of deep concern to the public (an ongoing war). The absence of a single headline-able revelation makes this news harder for the news ecosystem to digest — but doesn’t make it any less “news,” or any less valuable.

The digestion may take considerably more time than journalists have patience for. The significance of the documents may emerge in the work of magazine writers or book authors. It may emerge in the hands of historians working long after we’re all dead — in which case we may well think, “Is that news?” Of course it is — we just can’t see it yet.

Filed Under: Media

Bloomberg circles the wagons on misleading Gulf spill poll coverage

July 29, 2010 by Scott Rosenberg

News organizations’ default response to criticism is to circle the wagons.

“We stand by our story!” is a stirring thing to say, and sometimes it’s even the right thing. But in the web world of 2010, where everyone has a public platform, ignoring critics can also squander a news outlet’s credibility and alienate its audience.

The basic premise of MediaBugs — which I laid out in this video — is that news organizations can begin winning back the public trust they have lost by engaging civilly, in public, with people who criticize them about specific errors. Whoever is right in the end, and whether the newsroom decides to run a correction or not, the editors are better off explaining their thinking than slamming the door on dialogue.

For an example of precisely the wrong way of handling legitimate questions about coverage, consider the controversy over a recent Bloomberg opinion poll.

Filed Under: Media, Mediabugs
