Wordyard

Hand-forged posts since 2002

Demonetization

November 24, 2012 by Scott Rosenberg

Buried near the end of John Markoff’s front-page feature in the Times today about “deep learning,” the neural-net-inspired software, is this tidbit, which I think requires no further elaboration but is worth noting, and noting again:

One of the most striking aspects of the research led by Dr. [Geoffrey] Hinton is that it has taken place largely without the patent restrictions and bitter infighting over intellectual property that characterize high-technology fields.

“We decided early on not to make money out of this, but just to sort of spread it to infect everybody,” he said. “These companies are terribly pleased with this.”

Said companies will (a) build a new industry on these openly shared ideas; (b) make fortunes; and then (c) dedicate themselves to locking those ideas up and extracting maximum profit from them.

That’s inevitable and nothing new. Let’s be glad, though, for the occasional Geoffrey Hintons and Tim Berners-Lees, who periodically rebalance the equation between open and closed systems and keep our cycle of technology evolution moving forward.

Filed Under: Business, Technology, Uncategorized

Missed stories: About that Horace Mann School article in the Times

June 9, 2012 by Scott Rosenberg

I attended the Horace Mann School in Riverdale, N.Y., from 1971 to 1977. I’ve generally thought well of the school as a great environment for a brainy, socially awkward kid like me to learn and grow. I became a writer largely based on my experience there, I learned to love journalism there, and I learned almost as much from my peers as I did from my teachers.

Horace Mann was, plainly, a place of great privilege. (My parents paid a fortune to send me there, and I remain deeply grateful for that.) I took a crazy-long trip each day from my central Queens home to the northwest corner of the Bronx to attend. I did that because the school embraced unorthodox teachers who inspired students. Also because it made ample room for the weird kids. It helped them find other weird kids to share their weird alienation and feel a little less alienated.

Now there’s this. The article is, I believe, thoughtful, fair, and sensitive. The author is a few years younger than I am, but his account accurately reflects the school I remember.

Except, of course, for the part about a decades-spanning pattern of sexual abuse of students by teachers, a pattern that it seems the school largely ignored and that I knew essentially nothing of during my student years.

I’m not a victim myself. I experienced no molestation, or anything even borderline or ambiguous, during my six years at H.M. Still, in the wake of this article, I find myself spending a lot of time and thought re-examining my own past — as I’m guessing are the great majority of my classmates and everyone else who had anything to do with the school in those years. (There have already been extraordinary conversations both in private e-mail and in the public comments on the Times story, and they’ve challenged my assumptions and stretched my thinking on the matter.)

So here’s what I’d say if I could punch a hole in time and send a message to myself on the day, almost exactly 35 years ago, that I graduated from Horace Mann:

You’re not going to believe this, but 35 years from now, H.M. is going to be on the cover of the New York Times Magazine. The humongous article will tell the world, in voluminous detail and with pained concern, about a “secret history of sexual abuse” at the school. Some of the events have already happened, while you were here; most are still to come.

It’s a story about troubled people abusing power, about changing mores and standards, and also about institutional failure and betrayal. A big story, I’d say. And — sorry to break the news — you missed it.

You were the editor of the Record! You and your friends prided yourselves on attempting to tell the full story of life at the school in print every week! You published exposés of pot-dealing and polled the student body on drug use and thought, in those post-Watergate years, that you were ripping the lid off the truth. But you missed something bigger and more consequential.

I guess you couldn’t have done otherwise. You’re all of 18 years old. You think you know everything — but you’re smart enough to realize how wrong you are, too.

So now that you know this, I want you to think about two lessons.

One lesson for your work: The story you think you’re living is almost never the story of your time that the future will write. For journalists this is, and should be, humbling. It should make you ask questions every time you think you’ve told the truth about a situation. What’s the next layer down? There’s always another one. Never believe you’ve gotten to the bottom of anything. Even if you’ve done a good job, the world keeps rethinking everything. And those decades-spanning changes in how we think and live are the ones that will make your head explode. Expect it.

Also, one lesson for your life: Eccentricity can be inspiring. What many of your Horace Mann teachers did, with their arrogance and their mystique and the cults that some of them spun around their subjects and themselves, can be amazingly effective at persuading monkey-minded adolescents to buckle down and care about science, literature, math, Latin, or music. The cult of learning can be beautiful — but it can also be a stalking-horse for something destructive and dangerous, ugly and evil. When seductive eccentricity crosses a line into control and victimization, it becomes a curse, and it can wreck lives.

Like a lot of your teenage friends, you’ve done a pretty good job of distinguishing between these kinds of eccentricity and avoiding the kind that could hurt you. Good for you. But not everyone is as confident or as fortunate. Kids can’t reasonably be expected to draw all the lines that adults, by rights, ought to be drawing for them. It’s up to institutions like schools (and churches, businesses, and governments!) to organize themselves in a way that leaves room for creativity while protecting the participants from abuse. Power always requires accountability. There are no exceptions.

That’s hard. But it’s something adults owe the children they’re raising. Try to remember that!

And then, if I can run this conceit out one more step, I think my newly minted Horace Mann graduate self would probably say something like this in response:

Thanks for the feel-good message on graduation day! There’s not much I can do with what you’ve told me, is there? Shouldn’t you have used your time-lord powers to dump sermons on the Horace Mann trustees?

Teach me this trick and maybe I can deliver you some wisdom in your retirement home. In the meantime, I’ve got a suggestion to throw back at you.

Yeah, I do think I know everything. But I also know I’m actually still a kid. I don’t yet know who I am, but you do, right?

Forget about Horace Mann. You live 3000 miles away from the place now, anyway. You should take all this introspection and turn it on that future world you’re living in.

I know that one of the things that happens to people as they get older is that they become more willing to just go along with the patterns in their lives, to accept a “that’s the way the world is” complacency. Fight that, will you?

You can’t do anything about what happened decades ago. But look around now, in your “now.” Find the stories that, one day, you’re going to wish somebody had told sooner. Tell them.

Point and match to the 18-year-old. What could I possibly say in response to that except, “I’ll try”?

Filed Under: Personal

Mr. Daisey and the Fact Factory: my take at Grist

March 17, 2012 by Scott Rosenberg

We interrupt this long blog-silence (more on which soon) to note that if you wanna know my take on the Mike Daisey/Apple/This American Life thing, I’ve just posted on it over at Grist.

My career started with writing about theater and specifically solo performance, moved into technology coverage, then took a turn into ethics and accuracy in journalism, and is now focused on sustainability and the environment. So Daisey’s story touched pretty much every one of my nerves.

Here’s an excerpt:

The temptation to round corners, to retouch images, to make a story flow better or a quote read better, faces every creator of non-fiction at every single moment of labor. And we all do it, all the time. We do it to varying degrees. We slice out “ums” from quotes. We leave out material we deem extraneous. No matter how much we verify of the facts that we think are salient, we can never verify everything.

But there are some compasses we can follow and some precedents we can observe. We don’t create composite characters (see: Janet Cooke) — or if we do, we explain exactly what we’re up to. We don’t say we’re reporting from one city when we’re sitting in another (see: Jayson Blair). We don’t simply invent stuff because it makes such great copy (see: Stephen Glass). We don’t invent a fake persona because it “makes people care” (see: Amina Araf).

The distinction between cosmetic changes and substantive fabrications is relatively easy to make. Storytellers get into trouble when they start to write themselves blank checks to “improve” on reality because the ends (in Daisey’s case, “making people care”) justify the means (in Daisey’s case, making shit up).

The whole thing is here.

Filed Under: Media, Personal

WSJ Social: When news apps want to steal your face

September 24, 2011 by Scott Rosenberg

I read about WSJ Social, the newspaper’s experiment at providing a socially driven version of itself entirely inside Facebook, and thought, hey, I should check it out. So I Googled “WSJ Social” and clicked on http://social.wsj.com. Since my browser was already logged in to Facebook, I was immediately confronted with a Facebook permissions screen. I captured it above for posterity.

Here is the problem: All I want to do is see what WSJ is up to. I might or might not actually want to use the product. But before I can proceed, here is what I’m asked to approve:

(1) “Access my basic information — Includes name, profile picture, gender, networks, user ID, list of friends, and any other information I’ve made public.” Well, this stuff is public already, right? I think I can live with this.

(2) “Send me email — WSJ.com may email me directly…” Hmm. I’m not eager to add to my load of commercial email and there’s no indication of the volume involved. But I’m not hugely protective of my email address — you know, there it is in the image above — so I guess this isn’t a dealbreaker.

(3) “Post to Facebook as me — WSJ.com may post status messages, notes, photos, and videos on my behalf.”

Excuse me? You want to do what?

Forget it, NewsCorp. Ain’t happening.

Now, I fully understand that the app may be up to nothing terribly nasty — some or most of what it wants to do may be routine back-end stuff. But it doesn’t provide me with any confidence-building information. Tell me, WSJ Social: How often are you going to post under my account? And what kinds of messages are you going to send? How will I know you’re not going to spam my friends? How do I know the WSJ’s rabid editorial-page id won’t start posting paeans to Sarah Palin under my name?

Facebook permissions screens may have become as widely ignored as Terms of Service checkboxes and SSL certificate warnings. But the notion of the Journal (or anyone else) insisting on its right to “Post to Facebook as me” before it will even let me examine its news product is simply ridiculous.
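
(For the technically curious: a permissions screen like that one is just the surface of an OAuth “scope” request. Below is a rough sketch, in Python, of the kind of dialog URL a Facebook app of that era would construct to trigger it. The specific scope names are my guess based on the prompts quoted above, not anything WSJ or Facebook has confirmed; the point is simply that every right the app wants is declared up front, in one take-it-or-leave-it bundle.)

```python
# A sketch of the era's Facebook OAuth dialog: the app declares its scopes,
# and the user gets the all-or-nothing permissions screen described above.
# The scope names below (email, publish_stream) are assumptions, not verified.
from urllib.parse import urlencode

def facebook_oauth_url(app_id, redirect_uri, scopes):
    """Build the dialog URL that produces a permissions screen."""
    params = {
        "client_id": app_id,
        "redirect_uri": redirect_uri,
        "scope": ",".join(scopes),  # everything is requested in one bundle
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

# Basic profile info came along with login; email and posting rights were
# the extra scopes a user had to grant before seeing anything at all.
print(facebook_oauth_url("HYPOTHETICAL_APP_ID", "http://social.wsj.com/",
                         ["email", "publish_stream"]))
```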

UPDATE: On Twitter, WSJ’s Alan Murray responds: “Not going to happen. Standard permissions in order to allow WSJ Social to share stories you ‘like’ with your friends.”

Filed Under: Business, Media

My next chapter: Grist

September 12, 2011 by Scott Rosenberg

After a wonderful couple of years writing Say Everything and another great couple of years building and launching MediaBugs, I’m returning to the world of editing: Starting today, I’m the executive editor of Grist.org, the pioneering green news website with the irreverent attitude.

It wasn’t entirely clear to me, after I left Salon four years ago, that I would ever take this kind of job again. It would have to be a very special organization: one that was trying to accomplish something important in the world; one that valued old-fashioned journalism and newfangled digital innovation; and one where the odd set of talents I’ve accumulated across my motley career could actually be put to work in useful ways.

Grist turned out to fit this bill in an almost supernaturally precise way. I first got to know the work Chip Giller and his team were doing there a decade ago at Salon, where we had a content-sharing agreement, and I’ve continued to be a fan of what they’ve built over the years. Now I have the privilege of taking Grist’s editorial helm at a moment that’s more critical than ever for the future of the planet — and more fluid than ever in the evolution of media.

Can you tell I’m excited?

It’s been a long day, so I think I’d better turn in — but not before pointing you to the sprightly post Chip wrote to welcome me to Grist, and the little note I wrote to introduce myself. There was also a brief press release with some kind words from my former colleague and sometime boss Joan Walsh.

And for those of you wondering about MediaBugs: It’s very much an ongoing project, though obviously I’m going to have less time to devote to it myself. I’ll be posting more here soon on its future, as well as offering a full report on its progress to date and some of the lessons we’ve learned from it.

Filed Under: Media, Personal

Steve Jobs, auteurs, and team-building

September 7, 2011 by Scott Rosenberg


If you look at my life, I’ve never gotten it right the first time. It always takes me twice.
  — Steve Jobs, in a 1992 Washington Post interview

I first wrote about Steve Jobs as a digital auteur in January 1999, in a profile for Salon that tried, in the near-term aftermath of Jobs’ return from exile to Apple, to sum up his career thus far:

The most useful way to understand what Jobs does best is to think of him as a personal-computer auteur. In the language of film criticism, an auteur is the person — usually a director — who wields the authority and imagination to place a personal stamp on the collective product that we call a movie. The computer industry used to be full of auteurs — entrepreneurs who put their names on a whole generation of mostly forgotten machines like the Morrow, the Osborne, the Kaypro. But today’s PCs are largely a colorless, look-alike bunch; it’s no coincidence that their ancestors were known as “clones” — knockoffs of IBM’s original PC. In such a market, Steve Jobs may well be the last of the personal-computer auteurs. He’s the only person left in the industry with the clout, the chutzpah and the recklessness to build a computer that has unique personality and quirks.

The Jobs-as-auteur meme has reemerged recently in the aftermath of his retirement as Apple CEO. John Gruber gave a smart talk at MacWorld a while back, introducing the auteur theory as a way of thinking about industrial design, and then Randall Stross contrasted Apple’s auteurial approach with Google’s data-driven philosophy for the New York Times.

(Here is where I must acknowledge that the version of the auteur theory presented in all these analyses, including mine, omits a lot. The theory originally emerged as a way for the artists of the French New Wave, led by François Truffaut, to square their enthusiasm for American pop-culture icons like Alfred Hitchcock with their devotion to cinema as an expressive form of art. In other words, it was how French intellectuals justified their love for stuff they were supposed to be rejecting as mass-market crap. So the parallels to the world of Apple are limited. We’re really talking about “the auteur theory as commonly understood and oversimplified.” But I digress.)

Auteurial design can lead you to take creative risks and make stunning breakthroughs. It can also lead to self-indulgent train wrecks that squander reputations and cash. Jobs has certainly had his share of both these extremes. They both follow from the same trait: the auteur’s certainty that he’s right and willingness (as Gruber notes) to act on that certainty.

Hubris or inspiration? Either way, this kind of auteur disdains market research. “It isn’t the consumers’ job to know what they want,” Jobs likes to say. Hah hah. Right. Only that, the democratic heart of our culture tells us with every beat, is precisely the consumer’s job. To embrace Jobs’ quip as a serious insight is to say that markets themselves don’t and can’t work — that democracy is impossible and capitalism one colossal fraud. (And while that’s an intriguing argument in its own right, I don’t think it’s what Jobs meant.)

I have to assume what Jobs really means here is that, while most of us know what we want when we’re operating on known territory, there are corners that we can’t always see around — particularly in a tumultuous industry like computing. Jobs has cultivated that round-the-corner periscopic vantage for his entire career. He’s really good at it. And so sometimes he knows what we want before we do.

I find nothing but delight in this. I take considerable pleasure in the Apple products I use. Still, it must be said: “I know best” is a lousy way to run a business (or a family, or a government). It broadcasts arrogance and courts disaster. It plugs into the same cult-of-the-lone-hero-artist mindset that Apple’s ad campaigns have celebrated. It reeks of Randian ressentiment and adolescent contempt for the little people.

Jobs’ approach, in Jobs’ hands, overcame this creepiness by sheer dint of taste and smarts. There isn’t anyone else in Apple’s industry or any other who is remotely likely to be able to pull it off. If what Jobs’ successors and competitors take away from all this is that “we know best” can be an acceptable business strategy, they will be in big trouble.

But there’s a different and more useful lesson to draw from the Jobs saga.

The salient fact about the arc of Jobs’ career is that his second bite at Apple was far more satisfying than his first. Jobs’ is a story that resoundingly contradicts Fitzgerald’s dictum about the absence of second acts in American life. In a notoriously youth-oriented industry, he founded a company as a kid, got kicked out, and returned in his 40s to lead it to previously unimaginable success. So the really interesting question about Jobs is not “How does he do it?” but rather, “How did he do it differently the second time around?”

By most accounts, Jobs is no less “brutal and unforgiving” a manager today than he was as a young man. His does not seem to be a story of age mellowing youth. But somehow, Jobs II has succeeded in a way Jobs I never did at building Apple into a stable institution.

I’m not privy to Apple-insider scuttlebutt and all I really have are some hunches as to why this might be. My best guess is that Jobs figured out how to share responsibility and authority effectively with an inner circle of key managers. Adam Lashinsky’s recent study of Apple’s management described a group of “top 100” employees whom Jobs invites to an annual think-a-thon retreat. Jobs famously retained “final cut” authority on every single product. But he seems to have made enough room for his key lieutenants that they feel, and behave, like a team. Somehow, on some level, they must feel that Apple’s success is not only Jobs’ but theirs, too.

Can this team extend Jobs’ winning streak with jaw-droppingly exciting new products long after Jobs himself is no longer calling the shots? And can an executive team that always seemed like a model of harmony avoid the power struggles that often follow a strong leader’s departure? For now, Jobs’ role as Apple chairman is going to delay these reckonings. But we’re going to find out, sooner or later. (And I hope Jobs’ health allows it to be way later!)

If Apple post-Jobs can perform on the same level as Apple-led-by-Jobs, then we will have to revise the Steve Jobs story yet again. Because it will no longer make sense to argue over whether his greatest achievement was the Apple II or the original Mac or Pixar or the iPod or the iPhone or the iPad. It will be clear that his most important legacy is not a product but an institution: Apple itself.

Filed Under: Business, Technology

The case of the New York Times’ terror error

July 28, 2011 by Scott Rosenberg

[This article, which is a collaboration between me and Mark Follman, originally appeared on the Atlantic’s website. Since then it has been the subject of a MediaBugs error report filed by Frank Lindh. Yes, at MediaBugs, not only do we eat our own dogfood, we find it tasty!]

It is hard to describe the interview that took place on KQED’s Forum show on May 25, 2011, as anything other than a train wreck.

Osama bin Laden was dead, and Frank Lindh — father of John Walker Lindh, the “American Taliban” — had been invited on to discuss a New York Times op-ed piece he’d just published about his son’s 20-year prison sentence. The moment host Dave Iverson completed his introduction about the politically and emotionally charged case, Lindh cut in: “Can I add a really important correction to what you just said?”

Iverson had just described John Walker Lindh’s 2002 guilty plea as “one count of providing services to a terrorist organization.” That, Frank Lindh said, was simply wrong.

Yes, his son had pled guilty to providing services to the Taliban, in whose army he had enlisted. Doing so was a crime because the Taliban government was under U.S. economic sanctions for harboring Al Qaeda. But the Taliban was not (and has never been) classified by the U.S. government as a terrorist organization itself.

This distinction might seem picayune. But it cut to the heart of the disagreement between Americans who have viewed John Walker Lindh as a traitor and a terrorist and those, like his father, who believe he was a fervent Muslim who never intended to take up arms against his own country.

That morning, the clash over this one fact set host and guest on a collision course for the remainder of the 30-minute interview. The next day, KQED ran a half-hour Forum segment apologizing for the mess and picking over its own mistakes.

KQED’s on-air fiasco didn’t happen randomly or spontaneously. The collision was set in motion nine years before by 14 erroneous words in the New York Times.

This is the story of how that error was made, why it mattered, why it hasn’t been properly corrected to this day — and what lessons it offers about how newsroom traditions of verification and correction must evolve in the digital age.

[Read more…]

Filed Under: Media, Mediabugs, Politics

Recent work: NY Times’ 9-year-old terror error; local news ethics; Wikipedia

July 21, 2011 by Scott Rosenberg

Sometimes your labor on a bunch of projects comes to fruition all at once. Here are some links to recently published stuff:

Corrections in the Web Age: The Case of the New York Times’ Terror Error — How did a 2002 error in the New York Times wreck a KQED interview in 2011 about John Walker Lindh, the “American Taliban”? And what does the incident tell us about how newsroom traditions of verification and correction must evolve in the digital age? MediaBugs’ Mark Follman and I put together this case study and it’s all here in the Atlantic’s fantastic Tech section. If you’re wondering what the point of MediaBugs is or why I’ve spent so much of the past two years working on it, this is a good summary!

Rules of the Road: Navigating the New Ethics of Local Journalism: I spent a considerable amount of time last winter and spring interviewing a whole passel of editors and proprietors of local news sites as part of this project for JLab, trying to find the tough questions and dilemmas they face as old-fashioned journalism ethics collide with the new shapes local journalism is taking online. It was a blast doing the interviews and fun assembling the results with Andy Pergam, Jan Schaffer and everyone else at JLab. It’s all on the website but it’s also available in PDF and print.

Whose point of view?: In the American Prospect, I used Wikipedia’s article on Social Security as an example to explore how Wikipedia’s principle of “neutral point of view” can break down. Here’s an excerpt:

Wikipedia says virtually nothing about the system’s role as a safety net, its baseline protections against poverty for the elderly and the disabled, its part in shoring up the battered foundations of the American middle class, or its defined-benefit stability as a bulwark against the violent oscillations of market-based retirement piggy banks.

This is a problem—not just for Social Security’s advocates but for Wikipedia itself, which has an extensive corpus of customs and practices intended to root out individual bias.

Filed Under: Media, Mediabugs, Net Culture, Personal, Politics

Circles: Facebook’s reality failure is Google+’s opportunity

June 30, 2011 by Scott Rosenberg

Way back when I joined Facebook I was under the impression that it was the social network where people play themselves. On Facebook, you were supposed to be “real.” So I figured, OK, this is where I don’t friend everyone indiscriminately; this is where I only connect with people I really know.

I stuck with that for a little while. But there were two big problems.

First, I was bombarded with friend requests from people I barely knew or didn’t know at all. Why? It soon became clear that large numbers of people weren’t approaching Facebook with the reality principle in mind. They were playing the usual online game of racking up big numbers to feel important. “Friend count” was the new “unique visitors.”

Then Facebook started to get massive. And consultants and authors started giving us advice about how to use Facebook to brand ourselves. And marketing people began advocating that we use Facebook to sell stuff and, in fact, sell ourselves.

So which was Facebook: a new space for authentic communication between real people — or a new arena for self-promotion?

I could probably have handled this existential dilemma. And I know it’s one that a lot of people simply don’t care about. It bugged me, but it was the other Facebook problem that made me not want to use the service at all.

Facebook flattens our social relationships into one undifferentiated blob. It’s almost impossible to organize friends into discrete groups like “family” and “work” and “school friends” and so forth. Facebook’s just not built that way. (This critique is hardly original to me. But it’s worth repeating.)

In theory Facebook advocates a strict “one person, one account” policy, because each account’s supposed to correlate to a “real” individual. But then sometimes Facebook recommends that we keep a personal profile for our private life and a “page” for our professional life. Which seems an awful lot like “one person, two accounts.”

In truth, Facebook started out with an oversimplified conception of social life, modeled on the artificial hothouse community of a college campus, and it has never succeeded in providing a usable or convenient method for dividing or organizing your life into its different contexts. This is a massive, ongoing failure. And it is precisely where Facebook’s competitors at Google have built the strength of their new service for networking and sharing, Google+.

Google+ opened a limited trial on Tuesday, and last night it hit some sort of critical mass in the land of tech-and-media early adopters. Invitations were flying, in an eerie and amusing echo of what happened in 2004, when Google opened its very first social network, Orkut, to the public, and the Silicon Valley elite flocked to it with glee.

Google+ represents Google’s fourth big bite at building a social network. Orkut never took off because Google stopped building it out; once you found your friends there was nothing to do there. Wave was a fascinating experiment in advanced technology that was incomprehensible to the average user, and Google abandoned it. Buzz was (and is) a Twitter-like effort that botched its launch by invading your Gmail inbox and raiding your contact list.

So far Google+ seems to be getting things right: It’s easy to figure out, it explains itself elegantly as you delve into its features, it’s fast (for now, at least, under a trial-size population) and it’s even a bit fun.

By far the most interesting and valuable feature of Google+ is the idea of “circles” that it’s built upon. You choose friends and organize them into different “circles,” or groups, based on any criteria you like — the obvious ones being “family,” “friends,” “work,” and so on.

The most important thing to know is that you use these circles to decide who you’ll share what with. So, if you don’t want your friends to be bugged by some tidbit from your workplace, you just share with your workplace circle. Google has conceived and executed this feature beautifully; it takes little time to be up and running.

The other key choice is that you see the composition of your circles but your friends don’t: It’s as if you’re organizing them on your desktop. Your contacts never see how you’re labeling them, but your labeling choices govern what they see of what you share.
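
(If it helps to picture that model in code, here’s a rough sketch, my own toy illustration rather than anything Google has published, of the asymmetry: circles are private labels on your side, and sharing is resolved against them before anything reaches a contact’s stream.)

```python
# A toy model of circles-based sharing (illustrative only, not Google's code):
# circle membership is private to the owner, and a post is delivered only to
# members of the circles it was shared with.

class Account:
    def __init__(self, name):
        self.name = name
        self.circles = {}  # label -> set of Accounts; contacts never see labels
        self.stream = []   # posts this account can see

    def add_to_circle(self, label, contact):
        self.circles.setdefault(label, set()).add(contact)

    def share(self, text, circle_labels):
        """Deliver a post only to members of the chosen circles."""
        audience = set()
        for label in circle_labels:
            audience |= self.circles.get(label, set())
        for contact in audience:
            # The recipient sees the post, never the label they were filed under.
            contact.stream.append((self.name, text))

me, boss, sister = Account("me"), Account("boss"), Account("sister")
me.add_to_circle("work", boss)
me.add_to_circle("family", sister)
me.share("Vacation photos!", ["family"])
print(boss.stream)    # [] -- the work circle never sees it
print(sister.stream)  # [('me', 'Vacation photos!')]
```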

I’m sure problems will surface with this model but so far it seems sound and useful, and it’s a cinch to get started with it. Of course, if you’re already living inside Facebook, Google has a tough sell to make. You’ve invested in one network, you’re connected there; why should you bother? But if, like me, you resisted Facebook, Google+ offers a useful alternative that’s worth exploring.

The ideal future of social networking is one that isn’t controlled by any single company. But social networks depend on scale, and right now it’s big companies that are providing that.

Lord knows Google’s record isn’t perfect. But in this realm I view it as the least of evils. Look at the competition: Facebook is being built by young engineers who don’t have lives, and I don’t trust it to understand the complexity of our lives. It’s also about to go public and faces enormous pressure to cash in on the vast network it’s built. Twitter is a great service for real-time public conversation but it’s no better at nuanced social interaction than Facebook. Apple is forging the One Ring to rule all media and technology, and it’s a beaut, but I’ll keep my personal relationships out of its hands as long as I can. Microsoft? Don’t even bother.

Of the technology giants, Google — despite its missteps — has the best record of helping build and expand the Web in useful ways. It’s full of brilliant engineers who have had a very hard time figuring out how to transfer their expertise from the realm of code to the world of human interaction. But it’s learning.

So I’ll embrace the open-source, distributed, nobody-owns-it social network when it arrives, as it inevitably will, whether we get it from the likes of Diaspora and Status.net or somebody else. In the meantime, Google+ is looking pretty good. (Except for that awful punctuation-mark-laden name.)

MORE READING:

Gina Trapani’s notes on “What Google+ Learned from Buzz and Wave”

Marshall Kirkpatrick’s First Night With Google+

Filed Under: Net Culture, Technology

NY Times: “Paper of record” no more?

June 26, 2011 by Scott Rosenberg

New York Times public editor Art Brisbane today addresses an issue that MediaBugs and I have been talking about for a year: the need for news organizations to maintain a record of the changes they make to published stories.

I’ve argued that posting such “versions” of every news story — the way Wikipedia and every open-source software project do with their own work — would help newsrooms regain public trust and free journalists to update their work more vigorously while staying accountable.

Brisbane seems to agree, but sounds doubtful that the Times is going to do this any time soon.

Right now, tracking changes is not a priority at The Times. As Ms. [Jill] Abramson told me, it’s unrealistic to preserve an “immutable, permanent record of everything we have done.”

I know the Times has tons of claims on its resources. Jill Abramson has a million demands to juggle. But let me respectfully dispute her “unrealistic” judgment.

Versions of stories are just data. For the Times, or any other website, to save them is a matter of (a) storage space and (b) interface tweaks to make the versions accessible. Today, storage is cheap and getting cheaper, and Web interfaces are more flexible than ever.

Really, there’s nothing unrealistic about preserving an “immutable, permanent record” of every post-publication change made to every story. Wikipedia — a volunteer organization run by a variety of ad hoc institutions — can do it. Any WordPress blog can do it. It seems peculiarly defeatist for our leading newsroom to shrug and say it can’t be done.
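
(To make the “just data” point concrete, here’s a rough sketch of an append-only version record, my own toy model rather than anyone’s actual CMS: every revision is kept, readers see the newest one, and the full history stays available.)

```python
# A toy append-only version history for a story (illustrative only):
# publishing a change never overwrites the previous text, so the full
# record of post-publication edits is preserved.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Story:
    slug: str
    versions: list = field(default_factory=list)  # (timestamp, text), append-only

    def publish(self, text):
        self.versions.append((datetime.now(timezone.utc), text))

    @property
    def current(self):
        return self.versions[-1][1]  # what readers see now

    def history(self):
        """Every revision, oldest first; nothing is ever overwritten."""
        return list(self.versions)

story = Story("lindh-plea")
story.publish("Lindh pled guilty to providing services to a terrorist organization.")
story.publish("Correction: Lindh pled guilty to providing services to the Taliban.")
print(len(story.history()))  # 2: both the error and the fix stay on the record
```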

By making story versions “not a priority,” the Times is essentially abdicating its longstanding status as our paper of record as it makes the transition from paper to digital. I doubt that’s what its leaders intend to do. The more they ponder this, the more I think they’ll see that a versioning system for news is not only valuable but inevitable.

Filed Under: Media, Mediabugs
