Can newspapers fix old errors?

The ombudsman’s column in Sunday’s New York Times was the sort of piece that a lot of people will scan and forget. It’s late August, Iraq is burning, the Bush administration is imploding. All Clark Hoyt can find to write about is little errors in old papers that should’ve been corrected and never were? What’s wrong with him?

But actually, I think, if you read between its lines, Hoyt’s column, “When Bad News Follows You,” represents a profound admission, from inside the belly of today’s wounded newspaper beast, of the core problem that industry faces.

Hoyt wrote about people who are coming forward to complain about errors, inaccuracies, slants or misjudgments in old Times stories. These buried arguments are reigniting because the Times’s SEO (search engine optimization) strategy has proven effective at gaining high Google rank for the paper’s old stories. So the Times’s success at boosting the value of its archival content (articles that it charges non-subscribers to access) has had the unintended consequence of unearthing every unfixed error and reopening the argument over every disputed story in the paper’s past.

The distraught subjects of these pieces want the Times to remove the articles from the Web. The paper’s attitude is to recoil in horror. That’s tampering with history! Hoyt seems to concur, at least in part.

You can’t accept someone’s word that an old article was wrong. What if that person who was charged with abusing a child really was guilty? Re-report every story challenged by someone? Impossible, said Jonathan Landman, the deputy managing editor in charge of the newsroom’s online operation: there’d be time for nothing else.

To its credit, the Times has begun trying to make some fixes on old stories, but it’s plainly daunted by the scope of the problem and it worries about messing with the historical record. Hoyt quotes one “ethicist” pundit at the Poynter Institute who advises “great caution” in tampering with old stories, and then another expert at Harvard’s JFK School suggests that the best solution might be for the Times to flush its archive of less important stories so there’d be fewer trivial errors to get right. (What? And give up all those Google hits that took so much work to win?)

I’m sorry, but none of these reactions is adequate. If the Times is truly the “paper of record” that it has always positioned itself as, and its archives deserve high Google rank by virtue of their unimpeachability, then the paper needs to divert some of the cash it will take in thanks to that rank and fund an operation to look into reader complaints about old articles.

Newspapers still have readers because those readers still trust newspapers to get stuff right. Some portion of the public — from progressives who think the Times blew it on Iraq to conservatives who think it’s edited by pinkos and terrorist sympathizers — has already given up on this. But the core of the paper’s readership still believes on some level that the Times can be trusted more than less professional, more casual online sources.

What happens to that faith when the Times — faced with a new series of challenges to the integrity of its stories — simply throws up its hands and says it’s “impossible” to review and either reaffirm or correct its challenged articles? I’ll tell you what happens: people will begin to look more approvingly at online sources with more flexible approaches. Sure, Wikipedia is full of problematic information; but when it comes to improving all that information, Wikipedia doesn’t shrug and say “impossible” — it has a system in place that, over time, tends to weed out bad errors. It’s admittedly and extensively imperfect, but it’s always possible to improve it. Assuming the Times editors accept that their archives are full of imperfections, too (and let’s remember Hoyt’s previous column reminding us that the paper “misspells names at a ferocious rate”), then it’s incumbent on them to figure out a similar way to open up the paper’s archives to some process of improvement. Either that or stop trying to peddle them as a valuable commodity.

Unless it accepts the difficult but important work of reviewing the old articles it’s seeding Google with when people challenge them, the Times is essentially declaring that the accuracy of the information it provides suffers a sort of half-life decay. I don’t think that’s a very smart way of, as they say in the biz, defending the franchise. There’s no statute of limitations on the truth.

Comments

  1. A quick fix would be to put a disclaimer at the top of the article, Wikipedia-style:

    “Some of the facts in this article have been disputed; …”

    Or perhaps let the person post a response at the end of the article.

  2. The expectation set in the early 20th century that newspapers could deliver “truth” was a false one, and we must now accept the truth that they cannot. In his 1920 book “Liberty and the News,” legendary editor/columnist Walter Lippmann established the “scientific model” of journalism we have been using ever since. But he himself retracted the idea two years later in “Public Opinion,” where he concluded that the only thing journalists can really do is signalize events, and that truth was the domain of think tanks and historians. (Steve Boriss, The Future Of News)

  3. One of the keys to fixing this problem (as you note, Scott) is money. The other is resources. The Times should indeed employ people (and pay them accordingly, not intern rates) to comb through articles and link them either to notations on corrections or to follow-up articles. A simple “Update” section at the top, with links (as many bloggers do), would direct people over to better, more reliable or more current information. But the links have to be at the top, not the bottom, as many people give up or get stuck on the salacious before they get to the bottom.

    Yet, if no one’s willing to pony up the money for the necessary staff to do this, then Google’s new comments feature may end up making Google the “paper of record,” because of its quicker correction rate (albeit one that could reflect heavy PR flacking as much as actual correcting) and, probably, its pages’ higher search rank.

  4. James Birchall

    I think a rethinking of the presentation of old material is in order. It’s not enough to simply dump out all that old data, tack a date on it and then leave it be. Information is dynamic. So is knowledge.

    I propose that online advertisements in newspaper articles remain a property of the article. That way, any old article that suddenly becomes popular again re-generates revenue.

    I propose secondly that newspapers start thinking beyond “Today’s Breaking News” and tie in “Today’s news” with yesterday’s stories. How often have we seen an issue make the news, fade into obscurity, then re-emerge with some new information? If you design the news right, you can drill down or up through stories and explore as much of the historical record as you want from the newspaper’s point of view.

    Corrections then end up being new stories on old news. You preserve the historical record and convey new information that is pertinent to understanding the old information, while driving revenue from people who are interested in old stories. Plus, old news is clearly marked as being old and is shown in context with newer understanding.

  5. Great post, Scott. I recommended it over on Poynter’s E-Media Tidbits today — where I discussed an option. How about a moderated corrections wiki? Seems like, in itself, that could be a business opp as well as a way to demonstrate a news org’s commitment to accuracy.

    Check it out and tell me what you think

    http://poynter.org/column.asp?id=31&aid=129239

    Thanks

    - Amy Gahran
