Wordyard

Hand-forged posts since 2002


In Defense of Links, Part Two: Money changes everything

August 31, 2010 by Scott Rosenberg

This is the second post in a three-part series. The first part was Nick Carr, hypertext and delinkification. The third part is In links we trust.

The Web is deep in many directions, yet it is also, undeniably, full of distractions. These distractions do not lie at the root of the Web’s nature. They’re out on its branches, where we find desperate businesses perched, struggling to eke out one more click of your mouse, one more view of their page.

Yesterday I distinguished the “informational linking” most of us use on today’s Web from the “artistic linking” of literary hypertext avant-gardists. The latter, it turns out, is what researchers were examining when they produced the studies that Nick Carr dragooned into service in his campaign to prove that the Web is dulling our brains.

Today I want to talk about another kind of linking: call it “corporate linking.” (Individuals and little-guy companies do it, too, but not on the same scale.) These are links placed on pages because they provide some tangible business value to the linker: they cookie a user for an affiliate program, or boost a target page’s Google rank, or aim to increase a site’s “stickiness” by getting the reader to click through to another page.

I think Nick Carr is wrong in arguing that linked text is in itself harder to read than unlinked text. But when he maintains that reading on the Web is too often an assault of blinking distractions, well, that’s hard to deny. The evidence is all around us. The question is, why? How did the Web, a tool to forge connections and deepen understanding, become, in the eyes of so many intelligent people, an attention-mangling machine?

Practices like splitting articles into multiple pages or delivering lists via pageview-mongering slideshows have been with us since the early Web. I figured they’d die out quickly, but they’ve shown great resilience — despite being crude, annoying, ineffective, hostile to users, and harmful to the long-term interests of their practitioners. There seems to be an inexhaustible supply of media executives who misunderstand how the Web works and think that they can somehow beat it into submission. Their tactics have produced an onslaught of distractions that are neither native to the Web’s technology nor inevitable byproducts of its design. The blinking, buzzing parade is, rather, a side-effect of business failure, a desperation move on the part of flailing commercial publishers.

For instance, Monday morning I was reading Howard Kurtz’s paean to the survival of Time magazine when the Washington Post decided that I might not be sufficiently engaged with its writer’s words. A black prompt box helpfully hovered in from the right page margin with a come-hither look and a “related story” link. How mean to Howie, I thought. (Over at the New York Times, at least they save these little fly-in suggestion boxes till you’ve reached the end of a story.)

If you’re on a web page that’s weighted down with cross-promotional hand-waving, revenue-squeezing ad overload and interstitial interruptions, odds are you’re on a newspaper or magazine site. For an egregiously awful example of how business linking can ruin the experience of reading on the Web, take a look at the current version of Time.com.

Filed Under: Business, Media, Net Culture

In Defense of Links, Part One: Nick Carr, hypertext and delinkification

August 30, 2010 by Scott Rosenberg

For 15 years, I’ve been doing most of my writing — aside from my two books — on the Web. When I do switch back to writing an article for print, I find myself feeling stymied. I can’t link!

Links have become an essential part of how I write, and also part of how I read. Given a choice between reading something on paper and reading it online, I much prefer reading online: I can follow up on an article’s links to explore source material, gain a deeper understanding of a complex point, or just look up some term of art with which I’m unfamiliar.

There is, I think, nothing unusual about this today. So I was flummoxed earlier this year when Nicholas Carr started a campaign against the humble link, and found at least partial support from some other estimable writers (among them Laura Miller, Marshall Kirkpatrick, Jason Fry and Ryan Chittum). Carr’s “delinkification” critique is part of a larger argument contained in his book The Shallows. I read the book this summer and plan to write about it more. But for now let’s zero in on Carr’s case against links, on pages 126-129 of his book as well as in his “delinkification” post.

The nub of Carr’s argument is that every link in a text imposes “a little cognitive load” that makes reading less efficient. Each link forces us to ask, “Should I click?” As a result, Carr wrote in the “delinkification” post, “People who read hypertext comprehend and learn less, studies show, than those who read the same material in printed form.”

This appearance of the word “hypertext” is a tipoff to one of the big problems with Carr’s argument: it mixes up two quite different visions of linking.

“Hypertext” is the term invented by Ted Nelson in 1965 to describe text that, unlike traditional linear writing, spreads out in a network of nodes and links. Nelson’s idea hearkened back to Vannevar Bush’s celebrated “As We May Think,” paralleled Douglas Engelbart’s pioneering work on networked knowledge systems, and looked forward to today’s Web.

This original conception of hypertext fathered two lines of descent. One adopted hypertext as a practical tool for organizing and cross-associating information; the other embraced it as an experimental art form, which might transform the essentially linear nature of our reading into a branching game, puzzle or poem, in which the reader collaborates with the author. The pragmatists use links to try to enhance comprehension or add context, to say “here’s where I got this” or “here’s where you can learn more”; the hypertext artists deploy them as part of a larger experiment in expanding (or blowing up) the structure of traditional narrative.

These are fundamentally different endeavors. The pragmatic linkers have thrived in the Web era; the literary linkers have so far largely failed to reach anyone outside the academy. The Web has given us a hypertext world in which links providing useful pointers outnumber links with artistic intent a million to one. If we are going to study the impact of hypertext on our brains and our culture, surely we should look at the reality of the Web, not the dream of the hypertext artists and theorists.

The other big problem with Carr’s case against links lies in that ever-suspect phrase, “studies show.” Any time you hear those words your brain-alarm should sound: What studies? By whom? What do they show? What were they actually studying? How’d they design the study? Who paid for it?

To my surprise, as far as I can tell, not one of the many other writers who weighed in on delinkification earlier this year took the time to do so. I did, and here’s what I found.

Filed Under: Culture, Media, Net Culture

“We’re Hot as Hell and We’re Not Going to Take It Any More” — guest post by Bill McKibben

August 27, 2010 by Scott Rosenberg

I don’t normally do guest posts. This is an exception. My friend Bill wrote this earlier this month after Congress’s effort to pass the most minimal energy legislation collapsed. If you haven’t already read it at TomDispatch or Huffington Post or 350.org, here’s another chance. It’s that important.

Three steps to establish a politics of global warming

Try to fit these facts together:

  • According to the National Oceanic and Atmospheric Administration, the planet has just come through the warmest decade, the warmest 12 months, the warmest six months, and the warmest April, May, and June on record.
  • A “staggering” new study from Canadian researchers has shown that warmer seawater has reduced phytoplankton, the base of the marine food chain, by 40% since 1950.
  • Nine nations have so far set their all-time temperature records in 2010, including Russia (111 degrees), Niger (118), Sudan (121), Saudi Arabia and Iraq (126 apiece), and Pakistan, which also set the new all-time Asia record in May: a hair under 130 degrees. I can turn my oven to 130 degrees.
  • And then, in late July, the U.S. Senate decided to do exactly nothing about climate change. They didn’t do less than they could have — they did nothing, preserving a perfect two-decade bipartisan record of no action. Senate majority leader Harry Reid decided not even to schedule a vote on legislation that would have capped carbon emissions.

I wrote the first book for a general audience on global warming back in 1989, and I’ve spent the subsequent 21 years working on the issue. I’m a mild-mannered guy, a Methodist Sunday School teacher. Not quick to anger. So what I want to say is: this is fucked up. The time has come to get mad, and then to get busy.

For many years, the lobbying fight for climate legislation on Capitol Hill has been led by a collection of the most corporate and moderate environmental groups, outfits like the Environmental Defense Fund. We owe them a great debt, and not just for their hard work. We owe them a debt because they did everything the way you’re supposed to: they wore nice clothes, lobbied tirelessly, and compromised at every turn.


Filed Under: Politics, Science

20 years of Web-whacking: my SXSW talk

August 24, 2010 by Scott Rosenberg

I had such a great time at South by Southwest last spring talking about blogging that I threw my hat in the ring again for next year.

My idea this time: “The Internet: Threat or Menace?” — a guided tour through two decades of tirades, fusillades and rants against the Internet, the Web, and all the other stuff people do with computers.

There’s rich history here, much of it already forgotten, some of it extremely funny. I’ve read a lot of these books and essays already. I’m eager to try to figure out why so many Internet critiques have that undead-zombie quality: you know they’ve got no life left in them, yet they keep lurching forward, leaving trails of slime for the rest of us to slip on. Yes, of course, there are legitimate and valuable critiques of the Net and what it hath wrought. And a reasonable amount of fatuous utopian hot air as well. I will lay it out and we can all roll our eyes together.

If you want to give me a chance to do this, you know the drill: hie thee PanelPicker-ward and cast your ballot. And spread the word. I will be grateful. If I am picked, I will enlist all of you as collaborators here as I try to stretch my arms around this vast topic.

But I’ll understand if you’d rather just sit back and let me do all the work.

And if enough of you vote, I will attempt to distill the material to its essence.

In haiku.

Oh yes. Many other fine people are proposing interesting sessions at SXSW. Here’s a handful I’ve come across that I recommend to you:

Justin Peters of CJR running a panel on “Trust Falls: Authority, Credibility, Journalism, and the Internet”

Mother Jones’ panel on “Investigative Tweeting? Secrets of the New Interactive Reporting”

Jay Rosen’s “Bloggers vs. Journalists: It’s a Psychological Thing”

Dan Gillmor on “Why Journalism Doesn’t Need Saving: an Optimist’s List”

Steve Fox assembling a panel on “That Was Private! After Weigel does privacy exist?”

My friends at XOXCO have a couple of proposals: Ben Brown on “Behind the Scenes of Online Communities” and Katie Spence with “Tales of the Future Past: Web Pioneers Remember.”

And tons more that I’m sure I’ve missed…

Filed Under: Events, Net Culture, Personal

Why trust Facebook with the future’s past?

August 23, 2010 by Scott Rosenberg

Comments weren’t working for a while today. Apologies to anyone whose words got eaten! Should be working again now.

An odd moment during the Facebook Places rollout last week has been bugging me ever since.

From Caroline McCarthy’s account at CNet:

Facebook not only wants to be the digital sovereignty toward which all other geolocation apps direct their figurative roads, it also wants to be the Web’s own omniscient historian.

“Too many of our human stories are still collecting dust on the shelves of our collections at home,” Facebook vice president of product Christopher Cox said as he explained the sociological rationale behind Facebook Places… “Those stories are going to be placed,” Cox said. “Those stories are going to be pinned to a physical location so that maybe one day in 20 years our children will go to Ocean Beach in San Francisco, and their little magical thing will start to vibrate and say, ‘This is where your parents first kissed.'”

From Chris O’Brien’s post:

Cox: “…Technology does not need to estrange us from each other.”

“Maybe one time you walk into a bar, you sit down at the bar, and you put your magical 10-years-into-the-future phone down. And suddenly it starts to glow. ‘This is what your friend ordered here’. And it pops up these memories…’Go check out this thing about the urinal that your friend wrote about when they were here about eight months ago.’ ”

Cox explained that all these check-ins, photos, and videos could be gathered on pages about a place to create “collective memories.”

“That’s dope.”

Yeah, that’s dope all right. Doper still would be for Facebook to begin performing this role of “omniscient historian” or “memory collector” right now. As I’ve been arguing for some time, neither Facebook nor Twitter is doing a very good job of sharing the history we’re recording on them.

Everything we put on the Web is both ephemeral and archival — ephemeral in the sense that so much of what we post is only fleetingly relevant, archival in the sense that the things we post tend to stay where we put them so we can find them years later. Most forms of social media in the pre-status-update era — blogging, Flickr, Delicious, YouTube and so on — functioned in this manner. They encouraged us to pile up our stuff in public with the promise that it would still be there when we came back. As Marc Hedlund put it: public and permanent.

Twitter, at least, places each Tweet at a “permalink”-style public URL. So if you save a particular Tweet’s address you can find it again in the future. Otherwise, you’re out of luck. (You can make local copies of your Tweetstream, but that’s more of a backup than a linkable public archive.) Presumably Twitter is keeping all this data, and they’ve said that they’re handing a complete record over to the Library of Congress. But the data isn’t public and permanent for the rest of us. I think we’re just supposed to take it on faith that we’ll get the keys back to it eventually. (Jeff Jarvis says he interviewed Evan Williams and “told him I want better ways to save my tweets, making them memory.” Hope to hear more from that. By linking to Jeff’s tweet here I have fished it out for posterity, one needle plucked from the fugitive haystack.)

Meanwhile, Facebook is even less helpful. Lord knows what happens to the old stuff there. Is there any way to find what you wrote on Facebook last year? I hope so, for the sake of the millions of people who are chronicling their lives on Mark Zuckerberg’s servers. But I’ve certainly never been able to find it.

In fact, Facebook is relentlessly now-focused. And because it uses its own proprietary software that it regularly changes, there is no way to build your own alternate set of archive links to old posts and pages the way you can on the open Web. Facebook users are pouring their hearts and souls into this system and it is tossing them into the proverbial circular file.

All of which led me to wonder what Facebook could possibly be thinking in asking us to imagine Places as a future repository for our collective history. After all, Facebook could be such a repository today, if it actually cared about history. It has given no evidence of such concern.

Maybe in the future all manner of data will, as Cox put it so charmingly, cause our “little magical things to start to vibrate.” I mean, dope! But if my kids are going to find out about the site of their parents’ first kiss, I’ll have to provide that information to someone. I don’t think it will be Facebook.

Filed Under: Blogging, Media, Net Culture

Dr. Laura, Associated Content and the Googledammerung

August 20, 2010 by Scott Rosenberg

I was on vacation for much of the last couple of weeks, so I missed a lot — including the self-immolation of Dr. Laura Schlessinger. Apparently Schlessinger was the last public figure in the U.S. who did not understand the simple rules of courtesy around racial/religious/ethnic slurs. (As an outsider you don’t get a free pass to use them — no matter how many times you hear them uttered by their targets.) She browbeat a caller with a self-righteous barrage of the “N-word” — and wrote her talk-show-host epitaph.

I shed no tears for Dr. Laura — why do we give so much air time to browbeaters, anyway? — and I don’t care much about this story. But after reading a post over at TPM about Sarah Palin’s hilariously syntax-challenged tweets defending Schlessinger, I wanted to learn just a bit more about what had happened. So of course I turned to Google.

Now, it may have been my choice of search term, or it may have been that the event was already more than a week old, but I was amazed to see, at the top of the Google News results, a story from Associated Content. AC, of course, is the “content farm” recently acquired by Yahoo; it pays writers a pittance to crank out brief items that are — as I’ve written — crafted not to beguile human readers but to charm Google’s algorithm.

AC’s appearance in the Google lead position surprised me. I’d always assumed that, inundated by content-farm-grown dross, Google would figure out how to keep the quality stuff at the top of its index. And this wasn’t Google’s general search index recommending AC, but the more rarefied Google News — which prides itself on maintaining a fairly narrow set of sources, qualified by some level of editorial scrutiny.

Gee, maybe Associated Content is getting better, I thought. Maybe it’s producing some decent stuff. Then I clicked through and began reading:

The Dr. Laura n-word backlash made her quit her radio show. It seems the Dr. Laura n-word controversy has made her pay the price, as the consequences of herbrought down her long-running program. But even if it ended her show, it may not end her career. Despite being labeled as a racist, and despite allegedly being tired of radio, the embattled doctor still seems set to fight on after she leaves. In fact, the Dr. Laura n-word scandal has made her more defiant than ever, despite quitting.

I have cut-and-pasted this quote to preserve all its multi-layered infelicities. The piece goes on in this vein, cobbled together with no care beyond an effortful — and, I guess, successful — determination to catch Google’s eye by repeating the phrase “Dr. Laura n-word” as many times as possible.

The tech press endlessly diverts itself with commentary about Google’s standing vis-a-vis Facebook, Google’s stock price, Google’s legal predicament vis-a-vis Oracle, and so forth — standard corporate who’s-up-who’s-down stuff. But this is different; this is consequential for all of us.

I was a fairly early endorser of Google back in 1998, when the company was a wee babe of a startup. Larry Page impatiently explained to me how PageRank worked, and I sang its deserved praises in my Salon column. For over a decade Google built its glittering empire on this simple reliability: It would always return the best links. You could count on it. You could even click on “I’m feeling lucky.”

I still feel lucky to be able to use Google a zillion times a day, and no, Bing is not much use as an alternative (Microsoft’s search engine kindly recommends two Associated Content stories in the first three results!). But when Google tells me that this drivel is the most relevant result, I can’t help thinking, the game’s up. The Wagner tubas are tuning up for Googledammerung: It’s the twilight of the bots.

As for Associated Content, it argues — as does its competition, like the IPO-bound Demand Media — that its articles are edited and its writers are paid and therefore its pages should be viewed as more professional than your average run-of-the-mill blogger-in-pajamas. I think they’ve got it backwards. I’ll take Pajama Boy or Girl any day. Whatever their limitations, they are usually writing out of some passion. They say something because it matters to them — not because some formula told them that in order to top the index heap, they must jab hot search phrases into their prose until it becomes a bloody pulp.

Let me quote longtime digital-culture observer Mark Dery, from his scorcher of a farewell to the late True/Slant:

The mark of a real writer is that she cares deeply about literary joinery, about keeping the lines of her prose plumb. That’s what makes writers writers: to them, prose isn’t just some Platonic vessel for serving up content; they care about words.

The best bloggers know a thing or two about this “literary joinery.” And even bad bloggers “care about words.” But the writer of Associated Content’s Dr. Laura post is bypassing such unprofitable concerns. He chooses his words to please neither himself nor his readers. They’re strictly for Google’s algorithm. The algorithm is supposed to be able to see through this sort of manipulation, to spit out the worthless gruel so it can serve its human users something more savory. But it looks like the algorithm has lost its sense of taste.

[I should state for the record that in the course of my business work for Salon.com I had occasion to meet with folks from Associated Content. They were upright and sharp and understood things about the Web that we didn’t, then. They’ve built a successful business out of “content” seasoned to suit the Googlebot’s appetite. It’s just not what we think of when we think of “writing.” And if this piece is any indication, there isn’t an editor in sight.]

BONUS LINK: If you want to understand more fully the process by which “news” publishers watch Google for trending topics and then crank out crud to catch Google’s eye, you cannot do better than this post by Danny Sullivan of SearchEngineLand. Sullivan calls it “The Google Sewage Factory”:

The pollution within Google News is ridiculous. This is Google, where we’re supposed to have the gold standard of search quality. Instead, we get “news” sites that have been admitted — after meeting specific editorial criteria — just jumping on the Google Trends bandwagon…

Filed Under: Business, Media, Technology

Heather Gold’s “Unpresenting”

August 18, 2010 by Scott Rosenberg

Right before we left for an idyllic last-gasp-of-summer week on the north coast, I took a day-long Unpresenting workshop with Heather Gold, and I want to recommend it highly and enthusiastically to anyone interested in making their public appearances more engaging, lively, and memorable.

Photo by Carlo de Marchis

Gold is a standup comic, solo performer, Web person and, more recently, promoter of the idea of “tummeling” — the art (descended to us from the dim Borscht Belt past) of breaking the ice for a crowd, warming people up to one another so that a comfortable conversation can flow. “Unpresenting” is her name for a style of public speaking that’s less about imparting information (“I am the expert and am here to tell you X, Y and Z”) and more about opening conversation (“Let’s talk about this stuff — I think X and Y — what do you think?”).

You know the old saying about conferences that what happens in the room is a lot less interesting than what happens in the hall outside? Gold’s workshop provides a roadmap for transforming the room into something more like that hallway.

Some of Gold’s advice is practical, veteran-performers’ tips (like scanning your crowd, particularly at its edges, to keep people feeling included). Some of it is more of a simple challenge to understand what it is that people want to get out of a public event. If it’s just your information they’re after, why not just give them a book or a blog post? If it’s more of your in-person gestalt — a sense of who you are, what you’re like, how you move, and what you sound like, not just what you think — then a looser, more conversational mode will provide that a lot more efficiently than a podium-bound recital or (even worse) PowerPoint bullet lists.

As a former theater critic I’ve always been extra conscious of the preciousness of public time. When anyone gives me ten minutes or an hour in front of a crowd I want to make sure I use it well. And so I’ve always spent a ton of time preparing talks, often writing them out (I am, after all, a writer — that’s where I’m comfortable and confident!), so I can feel I’ve done my best to provide listeners with something of value.

Gold got me thinking about different kinds of value I might have been neglecting. I don’t think my presentations are going to change completely, but I’m definitely planning on playing around with more loosely structured and open-ended formats: less lecture, more conversation. And if you get a chance to learn about unpresenting with Heather, grab it!

Filed Under: Events, People, Personal

Bloomberg: a scarlet-letter correction policy?

August 12, 2010 by Scott Rosenberg

One of the things we’re trying to accomplish with MediaBugs is to encourage a change in newsroom culture. Journalists are still often reluctant to admit error, or even discuss the possibility of a mistake, for fear that it undermines their authority. But today a growing number of them understand that accuracy is best served, and authority best preserved, by being more open about the correction process.

That is the attitude we’ve encountered at most of the Bay Area news institutions where we’ve demoed MediaBugs. Unfortunately, it’s not what we found at Bloomberg when we tried to obtain a response on behalf of blogger Josh Nelson, who’d filed an error report at MediaBugs about a Bloomberg story.

Nelson raised a specific and credible criticism about the headline and lead on a Bloomberg report based on a national poll. Bloomberg’s coverage, Nelson argued, didn’t accurately reflect the actual question that its pollsters had asked about the Obama administration’s ban on deepwater oil drilling in the Gulf. (The story and headline said that “Most Americans oppose President Barack Obama’s ban” on such drilling, but the poll asked about a general ban on all Gulf drilling, while Obama has placed a temporary hold on deepwater drilling.) Bloomberg, as we described recently, circled the wagons in response.

The news service, of course, has every right to “stand by its story.” But since Nelson has raised a reasonable question, Bloomberg’s public deserves a reasonable response. It would be useful for its readers — and its colleagues at publications like the San Francisco Chronicle, which reprinted the story — to hear from the editors why they disagree with Nelson. Apparently they believe their copy accurately reflects the poll they took, but they have yet to offer a substantive case explaining why.

Institutional behavior of this kind always leaves me scratching my head. A comment posted on our previous post on the Bloomberg bug over at the PBS MediaShift Idea Lab proposed an intriguing theory: A former Bloomberg journalist suggested that the company’s personnel policies came down so hard on employees who made errors that they were reluctant to admit them at all.

These standards, which are meant to make people super-careful before publishing a story, actually serve as a perverse incentive and cause people at all levels of the newsroom to resist correcting stories after they are published if there is any way to justify leaving the story as is.

This was, we thought, worth a follow-up, and so we contacted the commenter. He turned out to be Steven Bodzin, who’d worked as a reporter in San Francisco and Venezuela for Bloomberg for four years before leaving the company in March. My colleague Mark Follman spoke at length with Bodzin last week.

Bodzin said he “rarely saw complaints from the public get ignored.” He told us that Bloomberg’s culture is actually “hypersensitive” to public response but especially focused on issues raised by sources or by customers who subscribe to its terminal service (Bloomberg’s business was built on selling real-time market data to the financial industry over its own network — only later did it begin distributing news and information on public networks).

Bodzin described his own “prolific” first year as a Bloomberg correspondent, during which five of his stories were cited as exemplary in the company’s weekly internal reviews. He also had an unusually high number of corrections that year — which he attributed to the intense pace of the job — and got the message from his superiors that “you really have to bring that down.” He says that made him more careful. But he observed that the stigma that Bloomberg attached to corrections also encouraged a sort of silence in the newsroom in the face of potential problems.

Certainly there were situations where you realize something is wrong but you’re gonna say “I didn’t see that” or just forget about it.
At Bloomberg that’s considered a really serious offense … but at the same time, if you or nobody else mentions it … no harm no foul. I think it happens. One time a colleague of mine, who’d already had one correction that day, saw one and said to me: “I am Olympically burying this error.”

We asked Bodzin about the specific issue Josh Nelson raised about the drilling-ban poll.

They see this case as a question of interpretation, a judgment call — this is their own poll, a lot of reporters and editors are involved, so they [would all] get a correction. So they aren’t going to want to do it.

What we’re looking at here isn’t some revelation of blatantly irresponsible behavior but a subtler insight into the complex interplay of motivation inside a big organization. Bloomberg is hardly the only company where such a dynamic may be at work. What’s important is that the people who lead such institutions understand the need to change the dynamic — to rebalance the incentives inside their newsrooms.

Unfortunately, this incident suggests that Bloomberg’s culture today clings to the wagon-circling habit. As so much of the rest of the journalism field moves toward more open models, it remains an old-fashioned black-hole newsroom, happy to pump stories out to the world but unwilling to engage with that world when outsiders toss concerns back in. Bodzin explained, “Staffers aren’t supposed to talk to press at all — you’re supposed to send reporters to the PR department.”

And that’s exactly what we found when we tried to get comment from Bloomberg about the issues Bodzin raised. When we asked senior editors at Bloomberg to discuss their own policies and newsroom culture, they shunted us over to Ty Trippet, director of public relations for Bloomberg News, who wrote back:

Our policy is simple: If any Bloomberg News journalist is found to be hiding a mistake and is not transparent about it, their employment with Bloomberg is terminated.

So Bloomberg looks at a nuanced psychological question of newsroom behavior and responds with an “Apocalypse-Now”-style “terminate with extreme prejudice.” Doesn’t exactly give you confidence about the company’s ability to foster a culture of openness around the correction process.

Earlier this week Bloomberg announced the hire of Clark Hoyt — the Knight Ridder veteran who for the last three years served as the New York Times’ public editor. In that ombudsman-style role he served as a channel for public concerns about just the sort of issues we are raising here about Bloomberg.

Though Hoyt’s new management job at Bloomberg’s Washington bureau isn’t a public-editor role, it does put him squarely in the chain of command for stories like the oil-drilling poll. So maybe he’ll look into this, and also more generally at how Bloomberg handles public response to questions of accuracy. Right now, the company’s stance is one that hurts its reputation.

Filed Under: Media, Mediabugs

Could Google’s neutrality backstab be a fake?

August 5, 2010 by Scott Rosenberg

News that Google and Verizon are negotiating a deal to “jump the Internet line,” as the New York Times put it in a great headline, shocked people who’ve been following the Net neutrality story and upset many of Google’s true believers. Google has long been one of Net neutrality’s most reliable big-company backers.

Net neutrality — the principle that information traveling across the Internet should be treated equally by the backbone carriers that keep the packets flowing — made sense for Google’s search-and-ad business: Keep the Internet a level playing field so it keeps growing and stays open to the Googlebot. It also helped keep people from snickering too loudly at the company’s “don’t be evil” mantra.

So why would Google turn around now, at a time when the FCC is weighing exactly how to shape the future of Net neutrality regulation, and signal a course-change toward, um, evil?

Here are the obvious explanations: Google wants to speed YouTube bits to your screen. Google is in bed with Verizon thanks to Android. Google figures Net neutrality is never going to remain in place anyway, so it might as well get a jump on the competition.

None of these quite persuades me. But what if — here is where I pause to tell you this is total speculation on my part — it’s a fake-out? What if Google — or some portion of Google — is still basically behind the Net neutrality principle but realizes that very few people understand the issue or grasp what’s at stake? Presumably Google and Verizon, which sells a ton of Android phones, talk all the time. Presumably they talk about Net neutrality-related stuff too.

Maybe someone inside Google who still believes in Net neutrality strategically leaked the fact that they’re negotiating this stuff — knowing the headlines and ruckus would follow. Knowing that this might be a perfect way to dramatize Net neutrality questions and mobilize support for strong Net neutrality rules from the public and for the FCC.

This scenario assumes a level of Machiavellian gameplaying skill on Google’s part that the company has not hitherto displayed. And if the whole story is a feint, it might well not be a strategic move on Google’s part but rather a sign of dissent inside Google, with one faction pushing the Verizon deal and another hoping to blow it up.

Still, worth pondering!

UPDATE: A tweet from Google’s Public Policy account: “@NYTimes is wrong. We’ve not had any convos with VZN about paying for carriage of our traffic. We remain committed to an open internet.” [hat tip to Dan Lyke in comments]

Filed Under: Business, Politics, Technology

Careful with that ad headline!

August 5, 2010 by Scott Rosenberg

Here is the front page of a flyer that recently dropped out of my newspaper. It is an ad for a certain very large PC maker whose name rhymes with that fiery place where bad people spend eternity.

We glance at ads very quickly, or we catch them from the corner of our eyes. And when this one passed through my eye and into my brainpan, what I saw was:

DOCK TO DOORSTOP IN ABOUT 48 HOURS.

Which really just doesn’t seem like enough time to enjoy your new laptop…

Filed Under: Business, Technology
