Jeffrey Rosen’s piece on “The End of Forgetting” was a big disappointment, I felt. He’s taking on important themes — how the nature of personal reputation is evolving in the Internet era, the dangers of a world in which social-network postings can get people fired, and the fuzzier prospect of a Web that prevents people from reinventing themselves or starting new lives.
But I’m afraid this New York Times Magazine cover story hangs from some very thin reeds. It offers few concrete examples of the problems it laments, resorts to vague generalizations and straw men, and lists some truly preposterous proposed remedies.
Rosen presents his premise — that information once posted to the Web is permanent and indelible — as a given. But it’s highly debatable. In the near future, we are, I’d argue, far more likely to find ourselves trying to cope with the opposite problem: the Web “forgets” far too easily.
Rosen begins with the tale of Stacy Snyder, a Pennsylvania teacher in training who was denied her degree because her teachers’ college didn’t like a photo of her on Facebook. Snyder’s sin? Wearing a pirate hat in a party photo labeled “drunken pirate.” This outrageous promotion of alcohol consumption got her booted from her teaching school and deep-sixed her career. She sued, but a federal district court rejected her complaint.
It took me a few minutes of reading into Rosen’s many-thousand-word piece before I realized that Snyder’s story would be the only case of an actual person being damaged by the ostensible “end of forgetting” that Rosen would explore in any detail. On this meager evidence, he postulates “a collective identity crisis,” “an urgent problem” for all of society.
But does Snyder’s mistreatment have anything to do with the longevity of the information we post about ourselves online? It was a trivial indiscretion in the present that got her in trouble. Her tale, in fact, has little to do with “the end of forgetting.” You can blame Facebook, but shouldn’t you blame the school administrators and the federal judges even more? The photo is harmless; the trouble lies with the people who have turned it into a problem.
The only two other examples Rosen cites are similarly weak. There is a “16-year-old British girl who was fired from her office job for complaining on Facebook, ‘I’m so totally bored!!!’ ” And a Canadian psychotherapist who was blocked at the US border because a guard turned up his description of an LSD trip 30 years before.
These are both injustices. But humorless bosses and overreaching immigration officials will be with us no matter how we reform Facebook.
People have been losing their jobs because of the Web for almost as long as there has been a Web. We get new stories of Internet-driven job loss with each new iteration of the form of participation — email, chat rooms, forums, blogs (where Heather “Dooce” Armstrong was the poster child, but let’s not forget Ellen Simonetti, Jessica Cutler, Mark Jen, and all the others who’ve been dooced — fired for blogging indiscretions), and now Facebook and Twitter. This is a problem, but one that has less to do with the nature of “digital forgetting” (firing-level indiscretions are nearly all contemporaneous) than with youthful hubris, overconfidence in anonymity, or the misfortune of having a vindictive employer.
With these less-than-ideal examples out of the way, Rosen can get to the heart of his material: He fears that the Internet’s archival memory of each of our missteps will make it impossible for any of us to wipe the slate clean and start afresh. Once, he argues, the Web offered a new frontier, which, like the Old West, allowed us to reinvent ourselves. But now that social safety valve is closed.
Rosen writes that this vision of the Web as frontier “has proved to be another myth.” In truth, it was always a myth. The Internet was never an alternate universe. It has always been interwoven with our “real” offline lives. It offered us new opportunities for expression and connection and imaginative play, and it still does, but only the very naive or the very deluded could ever think that what they did on the Web stayed on the Web.
Rosen maintains that our Internet-driven culture is making it harder for the disgraced to rebuild their lives and careers. But is there any actual evidence that anything has changed in this realm? As in the pre-Internet era, the ability to get up from a knockdown has less to do with making the world “forget” than with power, position and cash.
If you operate in the higher reaches of society, as a successful businessperson or political leader or intellectual, you will get your second act, whatever your shame. Henry Blodget is building a successful media company on the Web; Eliot Spitzer is hosting on CNN; sock-puppet maestro Lee Siegel publishes books and articles.
The people who have to worry are those on the lower rungs of the economy, who are in greater danger of losing jobs and homes and rights at the whim of bosses and bankers and bureaucrats. That problem, alas, predates the Web.
Rosen really loses me when he starts compiling “solutions” to the problem of the “end of forgetting.” Some of the options he lists with a straight face include: DMCA-style takedown notices for “false information” online — if someone publishes something you think is false, you get to serve them with a notice and they have to take it down! Can’t wait to start seeing those fly. Or what about resurrecting Microsoft’s Clippy — perhaps the most reviled software creation of all time — to offer a “reproachful look” at users who are about to upload some risque photo?
These ludicrous ideas don’t stand much chance of harassing us. Society will outgrow the “fired for Facebook” problem once an entire generation with a shared online past begins to take the reins. When everyone has party pictures on Facebook, no one will find them an impediment to employment or higher office. After all, once upon a time, a politician’s admission of past drug use meant kissing the White House goodbye.
In the meantime, of course, people ought to get smarter about what to share and what to keep private. I believe they will; one of my motivations for writing Say Everything was to try to help people learn from the experiences of early bloggers as they move their own lives online. I will also gladly join my voice to Rosen’s in support of giving users much more direct control of their data on services like Facebook. If it takes laws to ensure that, let there be laws.
But Rosen is too busy hatching plans for “expire dates” on social-network postings and other artificial-forgetting schemes to give his head the Janus-turn his subject demands. The idea that the Web has a long memory is hardly new (here’s J.D. Lasica’s piece on how “The Web Never Forgets” from 1998). But there is a flipside to this notion: Information online can be fragile and fleeting, as well. There is an entropic quality to everything that is shared online. Data gets lost; servers die; databases are corrupted; formats fall into disuse; storage media deteriorate; backups fail.
The Web is now old enough for us to know just how badly links rot over time. Much of the material from the early days of the Web is already gone. Facebook and Twitter actually make it nearly impossible for you to find older material, even stuff that you’ve contributed yourself. The more dynamic the Web gets and the more stuff we move into “the cloud,” the less confident we can be that information that once was public will remain available to the public.
There are conferences on “digital preservation” these days because this is actually a serious and important problem. We need to solve it for the sake of future historians and for the sake of our descendants. We need, as Dave Winer puts it, to “future-safe” the culture we are creating together today.
In other words: I’m a lot less worried about the Web that never forgets than I am about the Web that can’t remember.
July 26, 2010 by Scott Rosenberg
“These are both injustices. But humorless bosses and overreaching immigration officials will be with us no matter how we reform Facebook.”
Facebook/Twitter latest in a long list of whipping, uh, persons: email, chat rooms, forums, blogs, as you point out. Also video games, pinball games, D&D, long hair, pool halls….
Excellent rebuttal! Link rot is really the de facto evidence of forgetting, but you go into important depth on the subject. I still lament the loss of some of the specialty sites that were put up on Geocities and Angelfire. There was a lot of garbage but also a lot of valuable, niche information.
I mentioned your piece over on my journal about memory and aesthetics.
I’d argue that you missed part of Rosen’s argument, or at least misconstrued it. The issue is that, despite the privacy provisions we may have on social networks or blogs or in news articles, that information is in fact available. The professor he cited wasn’t found because of Facebook, but because his article was indexed somewhere it was easy to find him.
You argue that we (society) should just wait until a social networking generation is used to their digital identities being publicly available and the stigma disappears. In the interim, we’ll have chilling effects that (more than likely) will be detrimental to society. Chances are that instead of accepting the possibility that people have distinct, if gradually evolving, identities, people in positions of power will identify new people as a collage of their digital identities, no matter how dated some of the material is.
I’ll confess I think Rosen missed the mark a little bit, but you miss the point too. People with your personal attitude will continue to run corporations and lending institutions until people in their 20s reach their 40s, and will have the power of hiring or promoting to prevent younger people from ascending the ranks, buying houses, or traveling internationally. You speculate that it won’t matter what they’ve done and what’s available online, but in 20 to 30 years, a very real possibility is that anyone who makes it to the point where they’re able to control hiring, firing, promotions, loans, or the like will be indoctrinated.
Hi Scott, I’m with you on this — the article made sweeping claims about data retention with little concrete evidence, then offered remarkably thin solutions that were long on technical fixes and short on any coherent vision of how or why people and societies remember and forget.
I wrote back to the NYT Magazine to make the further point that despite popular beliefs that info online never dies (beliefs that are relentlessly exploited in the article), in fact digital preservation is largely an unsolved problem. (When was the last time you tried to retrieve a file from a floppy disk or zip drive? One of my UCLA colleagues says that people will only begin to worry when those baby pictures on the digital camera are gone.) As a few others here have pointed out, information loss and fragmentation pose at least as large a threat to personal identity and autonomy as the dream/myth of total information capture and retention.
Instead of implicitly accepting the idea that total capture is both possible and desirable, we might do better to reflect more on what we gain in the process of forgetting, and then get serious about making choices about what information matters, to whom, when, and why.