Does the Web remember too much — or too little?

Jeffrey Rosen’s piece on “The End of Forgetting” was a big disappointment, I felt. He’s taking on important themes — how the nature of personal reputation is evolving in the Internet era, the dangers of a world in which social-network postings can get people fired, and the fuzzier prospect of a Web that prevents people from reinventing themselves or starting new lives.

But I’m afraid this New York Times Magazine cover story leans on some very slender reeds. It offers few concrete examples of the problems it laments, resorts to vague generalizations and straw men, and lists some truly preposterous proposed remedies.

Rosen presents his premise — that information once posted to the Web is permanent and indelible — as a given. But it’s highly debatable. In the near future, we are, I’d argue, far more likely to find ourselves trying to cope with the opposite problem: the Web “forgets” far too easily.

Rosen begins with the tale of Stacy Snyder, a Pennsylvania teacher in training who was denied her degree because her teachers’ college didn’t like a photo of her on Facebook. Snyder’s sin? Wearing a pirate hat in a party photo labeled “drunken pirate.” This outrageous promotion of alcohol consumption got her booted from her teaching school and deep-sixed her career. She sued, but a federal district court rejected her complaint.

It took me a few minutes of reading into Rosen’s many-thousand-word piece before I realized that Snyder’s story would be the only case of an actual person being damaged by the ostensible “end of forgetting” that Rosen would explore in any detail. On this meager evidence, he postulates “a collective identity crisis,” “an urgent problem” for all of society.

But does Snyder’s mistreatment have anything to do with the longevity of the information we post about ourselves online? It was a trivial indiscretion in the present that got her in trouble. Her tale, in fact, has little to do with “the end of forgetting.” You can blame Facebook, but shouldn’t you blame the school administrators and the federal judges even more? The photo is harmless; the trouble lies with the people who have turned it into a problem.

The only two other examples Rosen cites are similarly weak. There is a “16-year-old British girl who was fired from her office job for complaining on Facebook, ‘I’m so totally bored!!!’ ” And a Canadian psychotherapist who was blocked at the US border because a guard turned up an account he had written of an LSD trip 30 years before.

These are both injustices. But humorless bosses and overreaching immigration officials will be with us no matter how we reform Facebook.

People have been losing their jobs because of the Web for almost as long as there has been a Web. We get new stories of Internet-driven job loss with each new form of online participation — email, chat rooms, forums, blogs (where Heather “Dooce” Armstrong was the poster child, but let’s not forget Ellen Simonetti, Jessica Cutler, Mark Jen, and all the others who’ve been dooced — fired for blogging indiscretions), and now Facebook and Twitter. This is a problem, but one that has less to do with the nature of “digital forgetting” (firing-level indiscretions are nearly all contemporaneous) than with youthful hubris, overconfidence in anonymity, or the misfortune of having a vindictive employer.

With these less-than-ideal examples out of the way, Rosen can get to the heart of his material: He fears that the Internet’s archival memory of each of our missteps will make it impossible for any of us to wipe the slate clean and start afresh. Once, he argues, the Web offered a new frontier, which, like the Old West, allowed us to reinvent ourselves. But now that social safety valve is closed.

Rosen writes that this vision of the Web as frontier “has proved to be another myth.” In truth, it was always a myth. The Internet was never an alternate universe. It has always been interwoven with our “real” offline lives. It offered us new opportunities for expression and connection and imaginative play, and it still does, but only the very naive or the very deluded could ever think that what they did on the Web stayed on the Web.

Rosen maintains that our Internet-driven culture is making it harder for the disgraced to rebuild their lives and careers. But is there any actual evidence that anything has changed in this realm? As in the pre-Internet era, the ability to get up from a knockdown has less to do with making the world “forget” than with power, position and cash.

If you operate in the higher reaches of society, as a successful businessperson or political leader or intellectual, you will get your second act, whatever your shame. Henry Blodget is building a successful media company on the Web; Eliot Spitzer is hosting on CNN; sock-puppet maestro Lee Siegel publishes books and articles.

The people who have to worry are those on the lower rungs of the economy, who are in greater danger of losing jobs and homes and rights at the whim of bosses and bankers and bureaucrats. That problem, alas, predates the Web.

Rosen really loses me when he starts compiling “solutions” to the problem of the “end of forgetting.” Some of the options he lists with a straight face include: DMCA-style takedown notices for “false information” online — if someone publishes something you think is false, you get to serve them with a notice and they have to take it down! Can’t wait to start seeing those fly. Or what about resurrecting Microsoft’s Clippy — perhaps the most reviled software creation of all time — to offer a “reproachful look” at users who are about to upload some risque photo?

These ludicrous ideas stand little chance of ever being adopted, let alone harassing us. Society will outgrow the “fired for Facebook” problem once an entire generation with a shared online past begins to take the reins. When everyone has party pictures on Facebook, no one will find them an impediment to employment or higher office. After all, once upon a time, a politician’s admission of past drug use meant kissing the White House goodbye.

In the meantime, of course, people ought to get smarter about what to share and what to keep private. I believe they will; one of my motivations for writing Say Everything was to try to help people learn from the experiences of early bloggers as they move their own lives online. I will also gladly join my voice to Rosen’s in support of giving users much more direct control of their data on services like Facebook. If it takes laws to ensure that, let there be laws.

But Rosen is too busy hatching plans for “expire dates” on social-network postings and other artificial-forgetting schemes to give his head the Janus-turn his subject demands. The idea that the Web has a long memory is hardly new (see J.D. Lasica’s 1998 piece, “The Web Never Forgets”). But there is a flipside to this notion: information online can be fragile and fleeting, too. There is an entropic quality to everything that is shared online. Data gets lost; servers die; databases are corrupted; formats fall into disuse; storage media deteriorate; backups fail.

The Web is now old enough for us to know just how badly links rot over time. Much of the material from the early days of the Web is already gone. Facebook and Twitter actually make it nearly impossible for you to find older material, even stuff that you’ve contributed yourself. The more dynamic the Web gets and the more stuff we move into “the cloud,” the less confident we can be that information that once was public will remain available to the public.

There are conferences on “digital preservation” these days because this is actually a serious and important problem. We need to solve it for the sake of future historians and for the sake of our descendants. We need, as Dave Winer puts it, to “future-safe” the culture we are creating together today.

In other words: I’m a lot less worried about the Web that never forgets than I am about the Web that can’t remember.
