Wordyard

Hand-forged posts since 2002

Scott Rosenberg

Archives

20 years of Web-whacking: my SXSW talk

August 24, 2010 by Scott Rosenberg Leave a Comment

I had such a great time at South by Southwest last spring talking about blogging that I threw my hat in the ring again for next year.

My idea this time: “The Internet: Threat or Menace?” — a guided tour through two decades of tirades, fusillades and rants against the Internet, the Web, and all the other stuff people do with computers.

There’s rich history here, much of it already forgotten, some of it extremely funny. I’ve read a lot of these books and essays already. I’m eager to try to figure out why so many Internet critiques have that undead-zombie quality: you know they’ve got no life left in them, yet they keep lurching forward, leaving trails of slime for the rest of us to slip on. Yes, of course, there are legitimate and valuable critiques of the Net and what it hath wrought. And a reasonable amount of fatuous utopian hot air as well. I will lay it out and we can all roll our eyes together.

If you want to give me a chance to do this, you know the drill: hie thee PanelPicker-ward and cast your ballot. And spread the word. I will be grateful. If I am picked, I will enlist all of you as collaborators here as I try to stretch my arms around this vast topic.

But I’ll understand if you’d rather just sit back and let me do all the work.

And if enough of you vote, I will attempt to distill the material to its essence.

In haiku.

Oh yes. Many other fine people are proposing interesting sessions at SXSW. Here’s a handful I’ve come across that I recommend to you:

  • Justin Peters of CJR running a panel on “Trust Falls: Authority, Credibility, Journalism, and the Internet”
  • Mother Jones’ panel on “Investigative Tweeting? Secrets of the New Interactive Reporting”
  • Jay Rosen’s “Bloggers vs. Journalists: It’s a Psychological Thing”
  • Dan Gillmor on “Why Journalism Doesn’t Need Saving: an Optimist’s List”
  • Steve Fox assembling a panel on “That Was Private! After Weigel does privacy exist?”
  • My friends at XOXCO with a couple of proposals: Ben Brown on “Behind the Scenes of Online Communities” and Katie Spence with “Tales of the Future Past: Web Pioneers Remember.”

And tons more that I’m sure I’ve missed…

Filed Under: Events, Net Culture, Personal

Why trust Facebook with the future’s past?

August 23, 2010 by Scott Rosenberg 8 Comments

Comments weren’t working for a while today. Apologies to anyone whose words got eaten! Should be working again now.

An odd moment during the Facebook Places rollout last week has been bugging me ever since.

From Caroline McCarthy’s account at CNet:

Facebook not only wants to be the digital sovereignty toward which all other geolocation apps direct their figurative roads, it also wants to be the Web’s own omniscient historian.

“Too many of our human stories are still collecting dust on the shelves of our collections at home,” Facebook vice president of product Christopher Cox said as he explained the sociological rationale behind Facebook Places… “Those stories are going to be placed,” Cox said. “Those stories are going to be pinned to a physical location so that maybe one day in 20 years our children will go to Ocean Beach in San Francisco, and their little magical thing will start to vibrate and say, ‘This is where your parents first kissed.'”

From Chris O’Brien’s post:

Cox: “…Technology does not need to estrange us from each other.”

“Maybe one time you walk into a bar, you sit down at the bar, and you put your magical 10-years-into-the-future phone down. And suddenly it starts to glow. ‘This is what your friend ordered here’. And it pops up these memories…’Go check out this thing about the urinal that your friend wrote about when they were here about eight months ago.’ ”

Cox explained that all these check-ins, photos, and videos could be gathered on pages about a place to create “collective memories.”

“That’s dope.”

Yeah, that’s dope all right. Doper still would be for Facebook to begin performing this role of “omniscient historian” or “memory collector” right now. As I’ve been arguing for some time, neither Facebook nor Twitter is doing a very good job of sharing the history we’re recording on them.

Everything we put on the Web is both ephemeral and archival — ephemeral in the sense that so much of what we post is only fleetingly relevant, archival in the sense that the things we post tend to stay where we put them so we can find them years later. Most forms of social media in the pre-status-update era — blogging, Flickr, Delicious, YouTube and so on — functioned in this manner. They encouraged us to pile up our stuff in public with the promise that it would still be there when we came back. As Marc Hedlund put it: public and permanent.

Twitter, at least, places each Tweet at a “permalink”-style public URL. So if you save a particular Tweet’s address you can find it again in the future. Otherwise, you’re out of luck. (You can make local copies of your Tweetstream, but that’s more of a backup than a linkable public archive.) Presumably Twitter is keeping all this data, and they’ve said that they’re handing a complete record over to the Library of Congress. But the data isn’t public and permanent for the rest of us. I think we’re just supposed to take it on faith that we’ll get the keys back to it eventually. (Jeff Jarvis says he interviewed Evan Williams and “told him I want better ways to save my tweets, making them memory.” Hope to hear more from that. By linking to Jeff’s tweet here I have fished it out for posterity, one needle plucked from the fugitive haystack.)

Meanwhile, Facebook is even less helpful. Lord knows what happens to the old stuff there. Is there any way to find what you wrote on Facebook last year? I hope so, for the sake of the millions of people who are chronicling their lives on Mark Zuckerberg’s servers. But I’ve certainly never been able to find it.

In fact, Facebook is relentlessly now-focused. And because it uses its own proprietary software that it regularly changes, there is no way to build your own alternate set of archive links to old posts and pages the way you can on the open Web. Facebook users are pouring their hearts and souls into this system and it is tossing them into the proverbial circular file.

All of which led me to wonder what Facebook could possibly be thinking in asking us to imagine Places as a future repository for our collective history. After all, Facebook could be such a repository today, if it actually cared about history. It has given no evidence of such concern.

Maybe in the future all manner of data will, as Cox put it so charmingly, cause our “little magical things to start to vibrate.” I mean, dope! But if my kids are going to find out about the site of their parents’ first kiss, I’ll have to provide that information to someone. I don’t think it will be Facebook.

Filed Under: Blogging, Media, Net Culture

Dissing Facebook’s like

July 27, 2010 by Scott Rosenberg 14 Comments

At the Hacks and Hackers event last night, two Facebook representatives took the stage and talked about stuff Facebook can do for news organizations and journalists. But the journalists in attendance had only one thing on their minds: Dislike.

You see, Facebook now lets you “like” things you find online. Facebook wants you to like lots of stuff! But if you don’t like something, it asks you to walk on by, without tossing any brickbats. Journalists, based on last night’s crowd, are unhappy with this limitation. They badly want Facebook to let them actively, explicitly “dislike” things, too.

This suggests that we journalists are a negative bunch who dislike a whole lot of things. We wants to tell the world about them, we do. Nassty Facebook won’t let us!

The problem with “Like” and news content, of course, is that a lot of news is heartbreaking, and if you say you “liked” it you come off callous. This was evident from one of the Facebook presentation’s own slides.

It turns out that, on Facebook as everywhere else, people really respond to “touching emotional stories.” Facebook’s Justin Osofsky and Matt Kelly provided an example of such a tale: a headline that read “US Border Patrol shot a 14-year-old at the Mexican border.” Who wants to “like” that? In such instances, Facebook suggests users be given the option of “recommending” or “sharing” the story instead.

That covers the “bad news” case. But there’s also the “articles I disagree with” case, where you’re outraged by something and you want to share that outrage. “Like,” again, won’t do. But neither will “recommend.” This is the case for which “dislike” might make sense. But based on the rote response of the Facebook people to repeated, increasingly agitated questions on the subject, I don’t think Facebook will ever offer this choice.

The conclusion a lot of people drew was that Facebook was afraid of offending advertisers. That’s quite likely. But I also think Facebook is being smart: It’s avoiding torrents of trollery, negativity, and bullying that a “dislike” button would unleash. Some journalists might be happier in a world full of dislikeness, but I think most everyone else would be bummed.

UPDATE: Patrick Beeson points out in a comment, “I find it ironic that journalists want a dislike button, but detest negative comments posted on the websites that publish their stories.”

Chris O’Brien took great notes from the event — if you want the basics on what Facebook recommends, they’re highly useful.

Filed Under: Media, Net Culture

Does the Web remember too much — or too little?

July 26, 2010 by Scott Rosenberg 11 Comments

Jeffrey Rosen’s piece on “The End of Forgetting” was a big disappointment, I felt. He’s taking on important themes — how the nature of personal reputation is evolving in the Internet era, the dangers of a world in which social-network postings can get people fired, and the fuzzier prospect of a Web that prevents people from reinventing themselves or starting new lives.

But I’m afraid this New York Times Magazine cover story hangs from some very thin reeds. It offers few concrete examples of the problems it laments, resorts to vague generalizations and straw men, and lists some truly preposterous proposed remedies.

Rosen presents his premise — that information once posted to the Web is permanent and indelible — as a given. But it’s highly debatable. In the near future, we are, I’d argue, far more likely to find ourselves trying to cope with the opposite problem: the Web “forgets” far too easily.

Filed Under: Blogging, Culture, Media, Net Culture

Roberts is to pager as Bush is to scanner

April 23, 2010 by Scott Rosenberg 3 Comments

Way back in ancient times, a decade ago, I wrote a piece for Salon that mentioned the widely circulated anecdote about President George H.W. Bush (the elder) casting a wondering gaze at a supermarket scanner. The tale had legs during the 1992 election cycle because it echoed a sense in the electorate that Bush was out of touch with the common people who were then suffering through a miserable recession.

I believe Bush was indeed out of touch. But my reference to the tale drew several outraged emails from readers who accused me of perpetuating an urban myth. Bush had been treated unfairly by this news meme (Snopes.com has the details), and I had repeated the injustice.

I learned a couple lessons from the experience. One was to redouble my efforts as a journalist to question received wisdom. The other, more important lesson was that the knowledge my readers were going to send (and sometimes hurl) my way was invaluable. (Or, in Dan Gillmor’s famous phrase: “My readers know more than I do.”)

I thought of all this recently as I encountered the latest transmutation of the Bush/scanner meme. Yesterday The Huffington Post picked up a report on a law blog that made out Chief Justice Roberts to be a technological naif who had to ask, in the middle of an argument, “what’s the difference between email and a pager?”

I read the original blog post. Then I read the comments. Then I followed a commenter’s helpful link to the full transcript of the argument. And I concluded for myself (you might feel otherwise, but I doubt it) that — however much more radical a conservative Roberts may be than I would wish — he’s not an idiot, and he had a reasonable basis to ask the question.

The self-correcting online feedback loop works a lot faster today than it did 10 years ago, and a lot more openly (we didn’t have comments on Salon back then). The “Roberts doesn’t know what a pager is” meme ought, by rights, to have been stopped in its tracks. It will be very interesting to follow its course in coming weeks and months. Past experience suggests that, despite being arrested early in its life on the web, it will now be amplified on cable and in print and have a long half-life in our collective psyche.

Filed Under: Media, Net Culture, Uncategorized

Newspaper comments: Forget anonymity! The problem is management

April 13, 2010 by Scott Rosenberg 58 Comments

This New York Times piece Monday reflects a growing chorus of resentment among newspaper website managers against the “barroom brawl” atmosphere so many of them have ended up with in the comments sections on their sites.

They blame anonymity. If only they could make people “sign their real names,” surely the atmosphere would improve!

This wish is a pipe dream. They are misdiagnosing their problem, which has little to do with anonymity and everything to do with a failure to understand how online communities work.

It is one of the great tragedies of the past decade that so many media institutions have failed to learn from the now considerable historical record of success and failure in the creation of online conversation spaces. This stuff isn’t new any more. (Hell, this conversation itself isn’t new either — see this Kevin Marks post for a previous iteration.) There are people who have been hosting and running this sort of operation for decades now. They know a thing or two about how to do it right. (To name just a few off the top of my head — there are many more: Gail Williams of the Well. Derek Powazek of Fray.com. Mary Elizabeth Williams at Salon’s Table Talk. Caterina Fake and her (ex-)Flickr gang.)

The great mistake so many newspapers and media outlets made was to turn on the comments software and then walk out of the room. They seemed to believe that the discussions would magically take care of themselves.

If you opened a public cafe or a bar in the downtown of a city, failed to staff it, and left it untended for months on end, would you be surprised if it ended up as a rat-infested hellhole?

Comment spaces need supervision — call them hosts or moderators or tummlers or New Insect Overlords or whatever you want, but don’t neglect to hire them! These moderators need to be actual people with a presence in the conversation, not faceless wielders of the “delete” button. They welcome newcomers, enforce the local rules, and break up the occasional brawl — enlisting help from the more civic-minded regulars as needed.

Show me a newspaper website without a comments host or moderation plan and I’ll show you a nasty flamepit that no unenforceable “use your real name” policy can save. Telling Web users “use your real name” isn’t bad in itself, but it won’t get you very far if your site has already degenerated into nasty mayhem. The Web has no identity system. Yes, the FBI can track you down if the provocation is dire enough, and editors who get mad enough can track you down too, but most media companies aren’t going to waste the time and money. So you’ll stand there demanding “real names,” your trolls will ignore you or make up names, and your more thoughtful potential contributors will survey your site and think, “You want me to use my real name in this cesspool? No thanks.”

No, anonymity isn’t the problem. (Wikipedia seems to have managed pretty well without requiring real names, because it has an effective system of persistent identity.) The problem is that once an online discussion space gets off to a bad start it’s very hard to change the tone. The early days of any online community are formative. The tone set by early participants provides cues for each new arrival. Your site will attract newcomers based on what they find already in place: people chatting amiably about their lives will draw others like themselves; similarly, people engaging in competitive displays of bile will entice other putdown artists to join the fun.

So turning things around isn’t easy. In fact, it’s often smarter to just shut down a comments space that’s gone bad, wait a while, and then reopen it when you’ve got a moderation plan ready and have hand-picked some early contributors to set the tone you want. If I were running a newspaper with a comments problem, that’s how I’d proceed. Don’t waste your time trying to force people to use their real names in hope that this will improve the tenor of your discussion area; build a discussion area that’s so appealing from the start that it makes people want to use their real names.

Why didn’t newspapers do this to begin with? I think part of the problem is that a lot of them had only the vaguest rationale for opening up comments in the first place. Maybe some consultant told them it was a good idea. Or it looked like the right thing to do to the young members of the Web team, and the front office said “Go ahead and play, kids, just don’t spend any money.” And the comments got turned on with no one minding the store and no clear goal in mind, either on the business side or in the newsroom.

So, media website operators, I suggest that you ask yourselves:

When you opened up comments, was it really about having a conversation with the readers? Then have that conversation! Get the editors and reporters in there mixing it up with the public. Sure, there will be problems and awkward moments; there will also be breakthroughs in understanding.

Maybe, though, no one was ever really serious about that conversation. Maybe the idea was to boost ad impressions with an abundance of verbiage supplied gratis by the readership. In that case, stop complaining about the flame wars and accept that the more abusive your commenters wax, the more your crass strategy will succeed.

Whatever you do, remember that as long as you’re thinking “What’s wrong with those people?” and “What did we do to deserve this?” you’re not taking responsibility for a problem that, I’m sorry to say, you created yourselves.

Filed Under: Business, Media, Net Culture

Say Everything video: Who was the first blogger?

June 1, 2009 by Scott Rosenberg 7 Comments

Today, for your diversion and amusement, I offer you a little home video related to Say Everything, which is now just a bit over a month away from publication: Who was the first blogger?

While I was pondering, back in 2007, whether to write a book about the story of blogging, there was a little flurry of stories claiming that blogging was now ten years old, since Jorn Barger had coined the word “weblog” in 1997. And I thought, hmmm, that’s a pretty debatable proposition. So when Mike Arrington asked, “Will Someone Who Actually Cares About Blogging Please Write the History Of It?,” I thought, yes: that’s going to be worth doing.

Filed Under: Blogging, Net Culture, Say Everything

MySpace and Geocities — separated at birth

April 23, 2009 by Scott Rosenberg 4 Comments

Once upon a time, there was a Web company that was based not in Silicon Valley but in Santa Monica. It grew at a breathtaking rate. All of its content was created by its users, and though the pages those users created tended to look jumbled and messy, there was an enthusiasm embedded in all that busy-ness, and a fannish passion for pop-cultural pursuits. The company built up such a sheer momentum of traffic that a much bigger company was persuaded to acquire it for a massive sum of money at the height of a speculative Internet frenzy.

This story sounds like that of MySpace, the once-hot social-networking site for bands and their fans that Rupert Murdoch purchased in 2005. Once “the most popular Website in America,” as the title of a recent book had it, MySpace has been left in the dust by Facebook and Twitter in terms of innovation and growth. MySpace is in the news this week because Murdoch and his henchmen have just shown the door to the site’s founding duo, Chris DeWolfe and Tom Anderson, and replaced them with a former Facebook exec. It’s a recession out there, and Murdoch, who somehow believes that MySpace can be his entree to digital power, is eager to turn it around and demonstrate that it can become the online cash cow he has always dreamed of. Good luck there; I think that, even though Murdoch got MySpace for what many considered a bargain price (of around $500 million), it will prove an albatross around his corporate neck.

In fact, though, MySpace isn’t the company I was thinking of in that first paragraph. I was telling the story of Geocities — the MySpace of 1997-1999. Geocities was the most successful of the “build your own website” companies of the mid-90s (there were others, like Angelfire). Before there were blogs, there were Geocities pages, which were sort of like blogs except without the software to manage your content. Geocities pages were easy to build and really difficult to maintain. As a result, Geocities was populated fast — and nearly as quickly became a vast wasteland of abandoned digital real estate. It must have looked good on paper to the bizdev people at Yahoo in 1999, though, because they paid an astonishing $2.87 billion (in bubble-inflated Yahoo stock) for the ramshackle enterprise.

A decade later, Yahoo’s current management — facing tough times and after many rounds of layoffs — has decided to shut Geocities down. I don’t think there are too many people who will cry for this relic of a bygone era.

What I’m thinking is, there’s every reason to believe MySpace will follow a similar trajectory, no matter how many executives huff and puff to try to reinflate its sagging appeal. If that’s the case, look for News Corp. to turn off its lights sometime in 2015 — about a decade after Murdoch’s ill-advised acquisition.

BONUS LINK: Harry McCracken surveys the top 15 Web properties of 1999 and asks, where are they now?

Filed Under: Business, Media, Net Culture

Every blog post a “request for comments”

April 7, 2009 by Scott Rosenberg 6 Comments

One of the points I make in Say Everything is that the reverse-chronological format that blogs use is embedded in the DNA of the Web, going back to early high-profile uses like Tim Berners-Lee’s first website at info.cern.ch and Marc Andreessen’s NCSA What’s New page.

Today’s NY Times op-ed page features a great piece by Stephen D. Crocker that explains the history of the Request for Comments, or RFC — the format the architects of the Internet used to promote the development of the open, extensible, cross-platform standards on which the Net as we know it today was built. RFCs were pragmatic and humble; the proponent of some new standard for getting computers to work with one another would put it out in public — at first by snail mail, before the network itself provided an easier means of circulation — and take in critical comments and suggestions for improvement.

You could see this practice as the genetic foundation for the comments that today are a feature of nearly every kind of page published on the Web. Just as blogging’s reverse-chronological sequencing has its basis in the earliest structures of web pages, Crocker lets us see that the practice of adding a comments thread to blog posts can also be traced back to the early history of the Net.

In this sense, every blog post is, in its way, a “request for comments.”

Filed Under: Blogging, Net Culture, Say Everything

Ecco in the cloud with Amazon

March 24, 2009 by Scott Rosenberg 12 Comments

Late last night — because late night is the time to tinker with software! — I decided to test-drive Dave Winer’s recent crib sheet on setting up an Amazon Web Services cloud-based server. Dave called it “EC2 for Poets” (EC2, short for Elastic Compute Cloud, is the name of Amazon’s virtual-server service), and I’ve always been a fan of “Physics for Poets”-style course offerings, so — though I do not write poetry — he lured me in.

For the uninitiated, Amazon has set up a relatively simple way for anyone to purchase and operate a “virtual server” — a software-based computer system running in their datacenter that you access across the Net. It’s like your own Windows or Linux box except there’s no box, just code running at Amazon. If you’ve ever run one of those arcade video-game emulators on your home computer, you get the idea: it’s a machine-within-a-machine, like that, only it’s running somewhere else across the ether.

Dave provided crystal clear step-by-step instructions for setting up and running one of these virtual servers. (Writing instructions for nonprogrammers is, as they say in software-land, non-trivial. So a little applause here.) The how-to worked hitch-free; the whole thing took about a half-hour, and by far the longest part was waiting for Amazon to launch the server, which took a few minutes.
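
(A side note for readers who would rather script the setup than click through screens: below is a minimal sketch of the same idea using Amazon’s boto3 Python library. This is not Dave’s walkthrough, which uses the AWS web console, and the AMI ID, instance type, and key-pair name are placeholders you would swap for your own.)

```python
# Minimal sketch: launch a Windows virtual server on EC2 with boto3,
# then print the address to point Remote Desktop Connection at.
# ImageId, InstanceType, and KeyName below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: a Windows Server AMI in your region
    InstanceType="t3.micro",          # placeholder: pick whatever size suits you
    KeyName="my-keypair",             # placeholder: key pair used to retrieve the admin password
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]

# Wait for the instance to come up, then show its public DNS name.
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
desc = ec2.describe_instances(InstanceIds=[instance_id])
print(desc["Reservations"][0]["Instances"][0]["PublicDnsName"])
```

Once the script prints an address, the standard Windows Remote Desktop Connection tool can connect to it.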

But what should one do with such a thing? Dave’s sample installation runs a version of his OPML editor, an outlining tool. That gave me an idea.

Regular readers here know of my dependence on and infatuation with an ancient application called Ecco Pro. It’s the outliner I have used to run my life and write my books for years now. It has been an orphaned program since 1997, but it still runs beautifully on any Win32 platform; it’s bulletproof and it’s fast. My one problem is that it doesn’t share or synchronize well across the Net: you need Windows networking to share it between machines, and as a one-man shop with no IT crew, I’ve never bothered with that.

But what if I were running Ecco on an Amazon-based server? Then I could access the same Ecco document from any desktop anywhere — Macs too. So I downloaded the Ecco installer (using a browser running on the Amazon-server desktop, which you access via the standard Windows Remote Desktop Connection tool), ran it, and — poof! — there it was, a 12-year-old software dinosaur rearing its ancient head into the new Web clouds:

[Screenshot: Ecco Pro running inside a Remote Desktop window on the Amazon virtual server, framed by my local Windows desktop.]

What you see here in the innermost window is Ecco itself (displaying some of the sample data it installs with). Around that is the window framing the remote desktop — everything in there represents Windows running in the cloud. The outermost frame is just my own Windows desktop.

This remains very much in Rube-Goldberg-land at this point. Accessing the remote server still requires a few more steps than you’d want to go through for frequent everyday use. (To me it felt about like setting up your own website did in 1994, when I followed similar crib sheets to accomplish that task.) And the current cost of running the Amazon server — which seems to be about 12.5 cents per hour, or $3 a day, or over $1,000 a year — makes it prohibitive to keep the thing running all the time for everyday needs.
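
(A quick back-of-the-envelope check on those figures, assuming the roughly 12.5-cents-per-hour rate I was seeing; actual EC2 pricing varies by instance type and region.)

```python
# Rough cost of keeping one small EC2 instance running around the clock,
# assuming a rate of roughly 12.5 cents per hour.
hourly = 0.125                # dollars per hour
per_day = hourly * 24         # 3.0 dollars
per_year = per_day * 365      # 1095.0 dollars
print(f"${per_day:.2f} per day, ${per_year:,.2f} per year")
```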

On the other hand, you have to figure that the cost will keep dropping, and the complexity will get ironed out. And then we can see one of many possible future uses for this sort of technology: this is where we’ll be able to run all sorts of outdated and legacy programs when we need to access data in different old formats. Yesterday’s machines will virtualize themselves into cloud-borne phantoms, helping us keep our digital memories intact.

Filed Under: Net Culture, Software, Technology
