Wordyard

Hand-forged posts since 2002

Scott Rosenberg

Archives

Sarah Lacy’s Once You’re Lucky: Money doesn’t change everything

August 5, 2008 by Scott Rosenberg 2 Comments

I’ve just finished Sarah Lacy’s book Once You’re Lucky, Twice You’re Good: The Rebirth of Silicon Valley and the Rise of Web 2.0, and I’m feeling a little…green. Lacy’s portrait of this decade’s Web industry is so relentlessly shaped by the yardstick of cash — how much money this entrepreneur made, how many millions that startup is valued at — that by the end of the book, you can’t help having absorbed a little of that world view.

As I put down the volume, I found myself thinking, gee, why didn’t I start a company in my dorm room and pocket tens of millions before I turned 30? Then I slapped myself in the face a couple of times and reminded myself that the last time I lived in a dorm room, the Web didn’t even exist — and that when I set out to become a writer the idea wasn’t, how can I make millions, but rather, is it possible to support myself doing what I love? (I was lucky enough to have the world answer “yes!”)

To be fair, Lacy’s a business reporter; she’s written a business book; business is all about money. She paints a colorful and absorbing portrait of the world of Silicon Valley’s latest wave of smart kids to strike it rich. On the other hand, I can’t accept that her account offers an accurate portrait of “the rise of Web 2.0.” Because, in a way, I feel like I was there, too, at least in the earlier phases, talking with many of the same people and companies that Lacy writes about, showing up at many of the same conferences, witnessing the same phenomena. And it just looked, and felt, different to me: at the start, it was much less about retaining control of one’s company and much more about giving control to one’s users.

First, the good stuff about Once You’re Lucky: It’s full of amusing anecdotes, some of them illuminating, and it offers some valuable insights into the motivation of many of today’s young Web entrepreneurs and the complexity of their relationships with their financiers. It gives a great tour of how the startup and venture capital games have changed over the past decade, as the cost of launching a company has dwindled, reducing the need for big upfront investments that dilute founders’ stakes, even as the prospect of everybody-gets-rich IPOs has grown rarer.

I fault the book in a few areas. In tracing the emergence of the Web 2.0 era’s emphasis on social networking and user contributions, Once You’re Lucky is neglectful of the long history of these phenomena that predates the Web 2.0 era. From Amazon book reviews to the Mining Company (later About.com) to the AOL “guides” and on and on, the so-called “Web 1.0” era was actually full of content created by “the crowd.” Its most overinflated and notoriously flaky IPO, in fact, that of TheGlobe.com, was entirely a “community play” (though in a way that betrayed the best possibilities of online community). The Web of the day just wasn’t as efficient as the later generation of companies at organizing the material contributed by users, and there weren’t nearly as many contributors, and Google hadn’t come along yet to help the rest of the Web find the contributions (and to help the companies profit from them).

My biggest beef with Lacy’s book is that its choice of which companies to focus on seems capricious. Maybe it was just based on who she got access to. Plainly, Lacy got lots of great material from one of her central figures, PayPal cofounder Max Levchin, and she paints a thorough profile of the driven entrepreneur. But his company, Slide, just isn’t all that interesting or innovative. After reading several chapters about it I still can’t tell you exactly what the company’s driving idea is. It does slideshows on MySpace! It’s big on widgets! It out-Facebooks Facebook with apps like Super Poke! But, you know, if you were stuck in the proverbial elevator with Levchin, could he actually tell you what Slide is all about?

There are other stories in the book whose inclusion makes more immediate sense. Few today would argue against Facebook’s significance, and it’s worth the time Lacy spends on it (though one might look for a little more skepticism). Ning may or may not prove important, but Marc Andreessen’s story is valuable in itself. What’s most interesting about Digg is its model for group editing (which, again, is based on “Web 1.0” roots via Slashdot), not its so-far-unfulfilled quest to sell itself.

Lacy might have delivered a more comprehensive portrait of Web 2.0 by offering more than cursory mentions of the companies that, in my book, really created the template for that phenomenon: Flickr, Delicious, the short-lived Oddpost (which got absorbed into Yahoo Mail). These small startups, growing like mushrooms out of the mulch of dead dotcom tree trunks, pioneered virtually all of the tools and technologies we now think of as “Web 2.0”: easy sharing of media creations; tagging of content to create user-generated “folksonomies”; Ajax techniques for inside-the-browser applications; and so on.

It seems that even though these services and companies were at the heart of the invention of Web 2.0, they don’t figure prominently in Lacy’s narrative because, by the financial yardstick, they were relatively small potatoes (all three were acquired relatively early by Yahoo for amounts rumored to be in the low tens of millions). Levchin is a lot richer than the founders and creators of these companies, but in my view, their work was far more significant.

As someone in the middle of writing a book on a related topic that is inevitably going to face similar criticism (how could you write about this blogger and not that one?), I know that Lacy couldn’t possibly cover every significant company. It’s just not clear what criteria she used to make her choices beyond the will-o’-the-wisp that is market valuation (especially wispy when your company is not actually traded on the market).

So this is where I say: the importance of a company does not lie in how rich it makes its founders, but rather in how widely its ideas spread. The business reporter who is too easily mesmerized by the number of zeroes in a company’s valuation is like the political reporter who is only interested in the horse race.

By themselves, numbers are dull. To me, the fluctuations of a company’s market value, like the ebb and flow of a politician’s polling numbers, are only of interest as part of a larger picture: How is that company, or politician, influencing our world?

[The book’s site is here, and here’s Lacy’s blog. Katie Hafner’s critical review is here. The SF Chronicle review by Marcus Banks is here.]

Filed Under: Books, Business, Net Culture, Technology

Please pay attention, please?

June 12, 2008 by Scott Rosenberg 2 Comments

Here are a few more links carrying on from yesterday’s post about Nick Carr’s lament that Google and the web in general have made it harder for us to pay attention to books.

Howard Rheingold links to a post on Timothy Ferriss’s blog, by Josh Waitzkin, titled “the multitasking virus.” Waitzkin paints a scene in which listless college students shop on their laptops while their professor’s giving an inspired lecture on Gandhi and nonviolent civil disobedience.

Howard, ever the intelligent pragmatist, says he’s most interested in “engaging students in learning how to train their attention.” He’s right. Most of us, today, could use some serious and rigorous training in attention-focusing skills. Meditation is probably the best. Organizational tools can help, too. Whatever works for you. Howard used to urge people to “pay attention to what you’re paying attention to,” and that was good advice; today we also need to pay attention to how we’re paying attention.

It’s undeniable that the web and all its tools add to the volume of potential interruptions in the workday. There’s nothing new about the interruptions themselves, and we faced them long before we had computers on our desks. (My reading of the Waitzkin post, for instance, was interrupted by an unsolicited telemarketing phone call which, however noble the cause — the American Cancer Society — constituted a far more severe violation of my focus than anything my computer screen can throw at me.) But the Net gives anyone with a proclivity for procrastination a nearly infinite number of options to avoid doing whatever one Must Get Done.

This topic is only going to become more urgent. Today’s Wall Street Journal included a review of a new book, Distracted: The Erosion of Attention and the Coming Dark Age, which I just ordered (it’s by a writer named Maggie Jackson, and has a foreword by my friend Bill McKibben). I’ll look forward to reading the book when I get it. (I hope it’s better than the hilariously overwrought subtitle.)

In the meantime, I should say that the Journal reviewer, David Robinson, lost me when he declared that Twitter is “an update service devoted to what-are-you-doing-at-this-moment inanity.” Sure, there are plenty of Twitter users who are inane, but — after a period in which I couldn’t quite get what all the fuss was about — I’m finding my small-but-growing group of people-who-I-follow to be a valuable source of real-time Web pointers. Like any popular Web platform, Twitter is as bad or as good as whatever sliver of it you choose to pay attention to.

Right about now is where I should say that I heard about Howard’s post itself because he posted about it on Twitter.

Filed Under: Blogging, Net Culture

Amanda Congdon’s back — but, er, not first

May 19, 2008 by Scott Rosenberg 1 Comment

I have a special place in my heart for video-blogging star Amanda Congdon, since through some total coincidence she ended up briefly plugging my book before it even came out. Thanks, Amanda! So I read with interest in today’s Times about her return to the web after apparently unsuccessful attempts to transition into more traditional broadcast gigs.

Then I read this:

“She was really one of the first, if not the very first, Internet blog stars,” said Dan Goodman, the president of digital media for Media Rights Capital. “She has been entertaining people in the digital space since there were people to entertain there.”

Where to begin? Congdon’s Rocketboom began, I’m pretty sure, around 2004. I do believe there were a few “Internet blog stars” already at that time.

As for the second claim: I think that “digital space” had its share of entertainment even back in the Usenet days. And certainly, even if your definition of “digital space” begins with HTTP, the first ten years of the Web pre-Rocketboom had its share of laffs, too.

I can’t say I’m surprised that some digital entertainment lawyer might be ignorant of this stuff. But, you know, the Times really shouldn’t be printing such silliness.

Filed Under: Blogging, Net Culture

Rare sighting of Google error message

May 6, 2008 by Scott Rosenberg 1 Comment

We have become dependent on Google as a part of our Web infrastructure (too dependent, some say), in part because Google’s reliability record is so superb. All of which makes the receipt of any sort of error message from any dimension of the Googleverse worthy of note.

Today I tried to access my Google Calendar. Instead I saw this:

[Image: Google Calendar error message]

A minute later, my calendar returned. But for an instant, I got to thinking about life without Google.

Filed Under: Net Culture, Personal, Software

Clay Shirky and the cognitive surplus

May 1, 2008 by Scott Rosenberg 4 Comments

“You know, much of England was drunk on gin for 20 or 30 years during the 18th century.”

I studied English history, but my brother studied it more deeply than I did. So when he told me that, a long time ago, I filed it away in the back of my brain as an odd fact worth exploring at some point in the future. The file has been undisturbed ever since, until I watched Clay Shirky’s talk at the Web 2.0 Expo.

Shirky tugs on that bit of information as part of a much larger argument that’s well worth a view (it’s about a 15-minute video — he’s also posted a transcript). In brief, he suggests that the English were so stunned and disoriented by the displacement of their lives from the country to the city that they anesthetized themselves with alcohol until enough time had passed for society to begin to figure out what to do with these new vast human agglomerations — how to organize cities and industrial life such that they were not only more tolerable but actually employed the surpluses they created in socially valuable ways.

This is almost certainly an oversimplification, but a provocative and fun one. It sets up a latter-day parallel in the postwar U.S., where a new level of affluence created a society in which people actually had free time. What could one possibly do with that? Enter television — the gin of the 20th century! We let it sop up all our free time for several decades until new opportunities arose to make better use of our spare brain-cycles — Shirky calls this “the cognitive surplus.” And what we’re finally doing with it, or at least a little bit of it, is making new stuff on the Web.

This argument is in some ways just an extension of Shirky’s book Here Comes Everybody (I’m in the middle of it, so, you know, maybe it’s all in there, though he says it’s not). But it also frames the larger sense I’ve had, from the moment I first saw the Web in 1994, that its importance lies in its potential for displacing TV.

It was the first medium I’d encountered in my life that looked like it had a chance of somehow challenging or eroding TV’s primacy in our world, and eliminating some of the distortions TV has rendered in our culture and politics. I’d spent the first part of my career chronicling a venerable medium — live theater — that has never properly recovered from the ascent of TV, so you know who I was rooting for.

Recalling a conversation with a TV producer skeptical that the participatory Web was anything more significant than LOLcats and World of Warcraft addicts, Shirky argues, “However lousy it is to sit in your basement and pretend to be an elf, it’s worse to sit in your basement and try to figure out if Ginger or Mary Ann is cuter….It’s better to do something than to do nothing.”

And so, because somebody chose to write a Web page rather than watch another sitcom, today you can read all you want about Britain’s Gin Craze on Wikipedia.

Filed Under: Blogging, Media, Net Culture

In the Web archives

April 4, 2008 by Scott Rosenberg Leave a Comment

I’ve spent most of this week deep in the archival attic, researching the new book in old documents, digging through the dull roots of today’s Web, planted back in the 90s. It’s been strange and enlightening; I’ve found much interesting material.

One thing that becomes clear is that what we now think of as “the Bubble” was surprisingly brief. The Web actually experienced something of a downturn beginning in late ’97 and early ’98, and extending through the Long Term Capital meltdown later in ’98. It was only toward the end of ’98 that the bubble really began to inflate in a serious way. The High Bubble lasted till April 2000, when the market suddenly realized, like Wile E. Coyote poised in midair above a canyon, that it was standing on air.

So the era of high dotcom madness was really barely a moment: 18 months or so.

The other thing I’ve learned is how much more extensive the Internet Archive is than I’d realized. I’ve been using the archive heavily for days. I’ve picked up some pointers that, perhaps, others already know; I’ll share them anyway in case they prove helpful.

First of all, ignore all the error messages the Archive itself sends you, like “bad gateway” or “failed connection.” These indicate momentary failures; they don’t mean your page isn’t there. Try, try again; reload; eventually, you may get what you’re looking for. (On the other hand, an error message preserved inside an archived snapshot itself is real: it means the Archive’s bot hit that error when it crawled, and never recorded the page you’re seeking.)
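That retry advice can be sketched in code. Here’s a minimal Python example; the function name, the retry counts, and the backoff policy are my own invention, not anything the Archive documents:

```python
import time
import urllib.error
import urllib.request

def fetch_with_retries(url, attempts=5, delay=2.0, opener=urllib.request.urlopen):
    """Retry a Wayback Machine URL, since "bad gateway"-style errors from
    the Archive itself are often transient. (An error message stored inside
    an archived snapshot, by contrast, won't go away however often you reload.)"""
    last_error = None
    for attempt in range(attempts):
        try:
            return opener(url).read()
        except urllib.error.HTTPError as err:
            if err.code not in (502, 503, 504):  # only retry gateway-style failures
                raise
            last_error = err
        except urllib.error.URLError as err:
            last_error = err
        time.sleep(delay * (attempt + 1))  # back off a little more on each retry
    raise last_error
```

The `opener` parameter is there so the transient-failure logic can be exercised without touching the network.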

Also: If the archive tells you that the earliest edition of a particular page it has is from, say, 1997, this doesn’t mean that the site’s content from previous years is gone forever. It’s true that you’ll probably never be able to recall, for instance, the Hotwired home page from 1995, since it was constantly mutating, day by day with new content and year by year with new designs. But the material published on a site at permalinked or semi-permalinked addresses can still often be dug up from Archive.org by poking your way carefully from the present into the past through the site’s own “back issues” or archives or “previously” links.

For instance, Web Review, the early GNN-backed web zine, vanished long before the Web Archive started up, along with most of GNN itself — a crib-death for one of the Web’s earliest original content ventures. Still, I was able to unearth my friend Andrew Leonard’s first piece (from Sept. 1995) for Web Review, all about “clickstream” measurement: Here it is.

We don’t have all of the early Web, but we have more of it than you might think!

Filed Under: Media, Net Culture, Say Everything

RIP, Gary Gygax, and the nature of roleplaying

March 10, 2008 by Scott Rosenberg 2 Comments

The death of Gary Gygax, co-inventor of Dungeons and Dragons, has occasioned an outpouring of writing on the place of D&D in our culture. Salon’s Andrew Leonard was fast out of the gate identifying the “genetic influence” of D&D on the world of the Internet.

Next came Seth Schiesel in the Times, with observations on how the game brought isolated devotees together socially. In a fine piece in the Journal, Brian Carney pointed out that the original, pre-computerized D&D was simply “structured, collaborative storytelling” — exactly what attracted me to the game in my youth. I cared very little for the encyclopedic rules and charts (which often made little sense in the earliest editions of the game) and frequently ignored them in my own gamemastering, which I viewed as closer to the role of a stage director. My job was to make sure my players had a great time and went home with great stories, which I would recap in a mimeographed magazine.

Then, on Sunday, Wired’s Adam Rogers, on the Times op-ed page, presented an exhaustive and only slightly-overstated recap of the “D&D built the Web” argument.

So Gygax’s passing away occasioned a sort of distributed coming-out party for journalistic geeks. That seems fitting. For me it also served as a reminder of a question that always hovered in the back of my mind during the years I spent roaming others’ D&D worlds and crafting my own.

In D&D and its role-playing descendants, you play a character whose traits are quantified and typically assigned random starting values. This made perfect sense to me as applied to either physical or supernatural abilities — since you weren’t going to pull out a Sword +2 and charge the guy across the table from you, and fireballs were simply not going to fly across your basement room, you needed some sort of proxy system for evaluating individual abilities in these realms and resolving conflicts.

But other traits, like intelligence and charisma, present themselves naturally in the course of game play. The charismatic player was the one who could rally the gang to his side, and no roll of the dice was going to make the group schlub into a natural leader. So what did the randomly assigned values for these characteristics mean? How could a player who was himself a dim bulb play a character with 18 intelligence points? What was a smart player supposed to do with a character with a low brainpower score?

What ought to happen in D&D when the real-world qualities of a player were at odds with the game’s numerical dictates? Which ought to rule — free will or predestination? From that fateful day in 1975 or so that I first rolled the polyhedral dice, I never could resolve this Miltonic quandary. I don’t know whether today’s World of Warcraft clans face the same questions. But certainly part of the lasting fun that Gygax bequeathed us was the opportunity to grapple with them.

Filed Under: Culture, Net Culture

Links for February 27th

February 27, 2008 by Scott Rosenberg 1 Comment

  • Ethan Zuckerman — Searching for common ground with Andrew Keen: Zuckerman wants to ask Andrew Keen, the Cult of the Amateur provocateur, a pointed question:

    I planned to ask Keen when he’d become worth listening to. He argues that we should listen to experts, not to amateurs… but this is his first book. Did he become an expert in a single moment of enlightenment? Or when the check from the publisher cleared? If it wasn’t a quantum process, was there a moment as a very good amateur where he was suddenly worth listening to? And if so, doesn’t that mean that there could be, theoretically, out there on the citizen-generated internet, someone else worth his time to listen to?

  • JOHO: is the Web different?: David Weinberger divides us all into Web utopians, dystopians and realists. An argument of great clarity.
  • Play This Thing! — Game criticism, why we need it and why reviews aren’t it: Greg Costikyan bemoans the absence of serious critical writing on the art of game-making.

    Rings a bell for me; way back when I was working as a theater and movie critic and trying to figure out what to do next with my life, I toyed with the idea of trying to write criticism about videogames and computer games. After producing one extended opus on the Mario oeuvre I realized I was already (in my early 30s) way too old for the work.

Filed Under: Blogging, Links, Net Culture

Judging books by the page

February 26, 2008 by Scott Rosenberg 2 Comments

I confess I’m confused.

A good while back I read about the “page 69 test” — apparently descended to us from Marshall McLuhan. The idea is that you can open any book to page 69 and use that to determine whether you will like the book.

Well, OK. Page 69 of Dreaming in Code contains a description of Moore’s Law and concludes, “…there is no Moore’s Law for software. Chips may double in capacity every year or two; our brains don’t.” Whew. I think that’ll do the trick for at least some people.

Only next I read about a variation of this, called the “page 99 test,” and attributed to Ford Madox Ford: “Open the book to page ninety-nine and read, and the quality of the whole will be revealed to you.” So, let’s see: on page 99 of my book you can read a story about how hard it is for developers to keep up with the tools available to them. In a visit to OSAF, a programmer named Anthony Baxter described his search for ways to speed up the processing of audio data in a Python application. Baxter was the release manager for the most recent version of Python, yet even he had forgotten that the programming language comes with a utility that exactly suited his needs. “The batteries were indeed included, as Python devotees liked to say. But with so many batteries, who could keep track of them all?”

OK. Fine. I’m willing to let my work be judged on this, too!

But now here comes the page 123 test! This one seems less about helping you decide whether to read a book and more about “bibliomancy,” or the art of making oracular use of arbitrarily selected passages of books. The page 123 test dictates that one “grab the nearest book, open to page 123, go down to the 5th sentence and type up the 3 following sentences.”

For Dreaming, this turns out to be a passage about the Chandler Project’s search for a development manager:

As the hunt dragged on, Lou Montulli and Aleks Totic suggested a name from their Netscape days. Michael Toy had been one of a band of employees at Silicon Graphics who left with its founder, Jim Clark, when Clark decided to start a new venture that would turn into Netscape. He had led the company’s programming team through several hyperspeed cycles of the browser wars in an era that redefined the norms for software development, establishing new benchmarks for fast releases of new versions.
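The page 123 ritual is mechanical enough to automate. A playful Python sketch; the function, its deliberately crude sentence splitter, and the list-of-pages representation are all my own assumptions, not anything from the meme’s originators:

```python
import re

def page_123_test(pages, page=123, start_after=5, count=3):
    """Bibliomancy by the book: open to page 123, go down to the 5th
    sentence, and return the 3 sentences that follow. `pages` is a list
    of page texts (page 1 is pages[0]); sentences are split, crudely,
    on whitespace that follows a period, exclamation point, or question mark."""
    text = pages[page - 1]
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return sentences[start_after : start_after + count]
```

Feed it a real book scanned page by page and it will dutifully hand over your oracle.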

I think I know what the multiplication of these memes is getting at. At this rate, authors are going to have to expect to be judged by every page they write. The nerve of that!

Books aren’t typically fractal — you can’t pull out lots of individual parts and allow each to stand for the whole. But each passage ought to count. In the end, every page and every sentence of a book ought to be able to present a good face for the larger entity it belongs to — like a diplomat abroad.

Filed Under: Dreaming in Code, Net Culture

Be Kind Rewind and the infinite garage

February 25, 2008 by Scott Rosenberg Leave a Comment

I haven’t yet seen Michel Gondry’s Be Kind Rewind, but A.O. Scott’s New York Times review made me want to:

…It treats movies as found objects, as material to be messed around with, explored and reimagined. It connects the do-it-yourself aesthetic of YouTube and other digital diversions with the older, predigital impulse to put on a show in the backyard or play your favorite band’s hits with your buddies in the garage.

And the deep charm of Mr. Gondry’s film is that it allows the audience to experience it with the same kind of casual fondness. It is propelled by neither the psychology of its characters nor the machinery of its plot, but rather by a leisurely desire to pass the time, to see what happens next, to find out what would happen if you tried to re-enact “Ghostbusters” in your neighbor’s kitchen.

I would argue that “the do-it-yourself aesthetic of YouTube and other digital diversions” and the “older, predigital impulse to put on a show” are in fact one and the same. It is this motivation that drives a great deal of the creativity on today’s Web. What’s different today is that the “backyard” theater can pack in, potentially, millions. The garage is infinite.

David Edelstein didn’t like it quite as much, but he concludes:

…it radiates the kind of optimism you don’t see in films about how new media is turning us all into passive voyeurs in our own hermetically sealed bubbles. This bubble is warm and inclusive.

Filed Under: Net Culture
