Wordyard

Hand-forged posts since 2002

Google’s Windows-only world

August 24, 2005 by Scott Rosenberg

Jason Kottke’s intriguing review of the current status of the Web-as-platform question (are Web apps now good enough to threaten the primacy of a certain desktop operating system monopoly? will they ever be?) is only the latest in a long line of musings on this theme that stretch all the way back to Netscape’s heyday. The dream of rendering individual users’ choice of desktop operating system irrelevant by getting them to move all their significant work into the browser was what fueled all those death-march development cycles during the browser wars.

Microsoft cut off Netscape’s air supply — with plenty of help from its victim’s own asphyxiating mistakes — before the browser company could complete building all the parts of this new computing world. Java was supposed to be an alternate road to the same destination; it turned out to be good for some other things, but not for that.

So we lost a few years there.

More recently, the Web-app universe has come roaring back, as Gmail, Google Maps, Flickr and other Ajax-based Web interfaces have provided users with something speedier and more interesting than the old, slow, click-and-wait world of Web computing. It is possible, today, to begin moving more and more of one’s work and data into browser-accessible stores and programs. This is all great, and it’s unfolding with a kind of inevitability.
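
For the technically curious, the trick behind that speed is worth a quick sketch. What follows is a minimal, hypothetical example of the Ajax pattern (not Google’s actual code; the “/inbox/unread” endpoint is invented) showing how a page can fetch fresh data in the background and update a single element in place, instead of reloading everything:

  // A minimal sketch of the Ajax pattern. Hypothetical endpoint, not
  // Google's code: ask the server for a sliver of data asynchronously,
  // then update part of the page in place. No click-and-wait reload.
  function fetchUnreadCount(onDone: (count: number) => void): void {
    const xhr = new XMLHttpRequest();        // the engine behind "Ajax"
    xhr.open("GET", "/inbox/unread", true);  // true = asynchronous
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4 && xhr.status === 200) {
        onDone(parseInt(xhr.responseText, 10));
      }
    };
    xhr.send();
  }

  // Update a single element instead of re-rendering the whole page.
  fetchUnreadCount((count) => {
    const badge = document.getElementById("unread-badge");
    if (badge) badge.textContent = String(count);
  });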

For a while there, during the downturn years, it seemed like the Web-based future might arrive without any one company driving it. The new structure of our technology would simply be built by a swarm of lilliputian enterprises that would gradually overwhelm the Gulliver of Redmond.

Suddenly, though, it looks like we’re back in the land of corporate showdown. In a wave of media reports, Google is being cast as the new Netscape — reluctantly, to be sure, since Netscape showed how dangerous it is to say to a company with an effectively bottomless war chest, “Bring it on!” Rather prematurely, I think, a lot of people quoted by Gary Rivlin in this morning’s Times suggest that Google is already the new Microsoft — that the company with the “don’t be evil” motto has morphed into a new evil empire.

Wherever you place Google on this spectrum, there’s no other way to read Google’s latest moves than as part of a broad effort to bring users onto Google’s platform so that, one day, they can be moved off Microsoft’s. That day is doubtless far off. But not unimaginable.

Google’s decision to raise $4 billion more on Wall Street, timed, almost certainly by design, to coincide with its release of two new software products (a new desktop application and a new “Google Talk” IM and voice communicator), reinforces the message first sent by Gmail: that, when Google defines its mission as “organizing the world’s information and making it universally accessible and useful,” “the world’s information” very much includes your own personal information.

Which leads us to the paradox here. There is one little weakness in the theory that Google is setting out to challenge Microsoft. For some reason, each time Google has released software that is not browser-based — whether it’s Google Desktop, or Picasa, or the new Google Talk — it has offered only a Windows version of the product. No Mac versions, no Linux versions.

Maybe Google feels that the Mac already offers a rich software environment for geeks (with good desktop search already built into the latest OS X) and Linux isn’t a big enough desktop market. Maybe they just target Windows because, to paraphrase the old bank-robber line, “that’s where the users are.” Or maybe they’re targeting Windows users precisely because they want to woo Microsoft addicts on their own turf.

No doubt, it would take a lot of extra work to release editions of Google software for non-Windows platforms. Cross-platform development is enormously difficult: that’s a fact of software life. (Browser-based software is so attractive because you don’t have to worry about writing different versions for different operating systems; the browser makers have already done that heavy lifting for you.) I always understood this intellectually, but now, after several years of following the work over at OSAF for my book, I feel it in my bones.

But Google has assembled a vast reserve of computer-science horsepower. It is, if Rivlin’s story is to be believed, sucking Silicon Valley’s software brains dry. Surely, with all that coding prowess, Google could set aside some cycles to offer non-Windows users equal access to the cool toys it is providing. If the Googleplexniks are serious about that phrase “the world’s information,” they need to look beyond the realm of Windows. The world doesn’t stop where the “Start” menu ends.

Filed Under: Business, Dreaming in Code, Software, Technology

“Time Management for Anarchists”

May 26, 2005 by Scott Rosenberg

Time Management for Anarchists: This little Flash slideshow does a good job of summarizing the principles of the faddish-yet-sensible David Allen “Getting Things Done” philosophy using imagery drawn not from the warrens of corporate America but instead from Emma Goldman and Mikhail Bakunin.

Filed Under: Business

A cure for Mad Boss Syndrome

May 2, 2005 by Scott Rosenberg

With the controversy over John Bolton’s nomination to become Bush’s ambassador to the U.N., it’s been possible for the administration’s supporters to paint Bolton’s opponents as whiners. The Democrats, it seems, don’t like Bolton because he’s, you know, tough. Raises his voice. Pushes around his inferiors. Well, ask the Republicans, what’s wrong with that? Shouldn’t we want a tough guy at the U.N.? We’re the strongest nation on the planet! Why should we care if one of our diplomats is a hardass? We’re supposed to reject an appointment because the guy yells?

Of course, if you’ve been following the story, you may understand that the issue here isn’t one of bad manners — it’s about bad management and bad judgment. Bolton isn’t just a tough guy; he’s a tough guy who apparently used his ire to bludgeon intelligence reports into the shape he sought. It’s one thing to push around your subordinates; it’s quite another to push around the information on which the lives of Americans and American troops depend. The reason Bolton’s nomination strikes so many observers, including me, as so profoundly wrong is that it’s precisely Bolton’s management style — one shared by, and endorsed by, the Vice President’s office — that led to the debacle of American intelligence about Iraq’s weapons of mass destruction in the lead-up to the 2003 invasion.

In the “whatever happened to those WMD?” game, the Bush team has been pretty successful at shrugging off blame or deflecting it onto the intelligence community: Darn that CIA! How could they have misled us so badly? But Bolton’s confirmation hearings stand as a blunt reminder of what really happened: Bush’s men hammered the intelligence “community,” raged at their troops, threw fits and tantrums and delivered threats and ultimatums until the information flowing up from the field matched the fantasy their ideology dictated. When that fantasy collided with the reality on the ground in Iraq — look, ma, no WMD after all! — these men turned around and said, well, we acted on the best information we had at the time. First they pushed around their subordinates; then they blamed their subordinates. Classy! And, sadly, genuinely dangerous in the realm of national security, which is why the intelligence field has a strong tradition of trying to keep its reports insulated from the political tide — one more tradition, like the Senate filibuster, that the pseudo-conservatives of the Bush cadres are casually tossing overboard.

The Bolton saga strikes a chord with the American public because we’ve all worked with, and most of us have worked for, a Bolton or two in our time, and we know how it goes: Mad Boss shouts at the top of his (it’s usually, though not always, a male phenomenon) lungs until the things people say to him match the things he wants to hear.

I first encountered the world of Mad Bosses in various jobs I held as a college kid and later as a fledgling journalist. I assumed that this was the way of the world — that somehow the role of Being In Charge carried with it a dose of generic rage, and that all bosses would inevitably, at some point, explode and abuse their employees. The macho culture of old-school American newsrooms certainly spawned its share of Mad Bosses, and I’d have my run-ins with them. For me, one of the grand things about leaving the comfortable nest of the newsroom and helping found a company was doing my small part to shape a different, more civilized workplace culture, in which people treated each other — superior and subordinate alike — as colleagues, not kicking posts.

I came to realize that Mad Bossism was not an inevitability; it is, in fact, an anachronism. It flows less from power than from frustration at powerlessness. The boss explodes because the world won’t bend to his will — and it’s supposed to! What good is being boss if it won’t?

This has given me a tad more empathy for the bulging-veined, red-faced bosses of my past, though I’m firm in my determination never to work anywhere near the type again. The truth is, it’s no longer as easy as it used to be to get away with this kind of behavior: Joe or Jane Subordinate is going to be blogging every last twitch of Mad Boss’s tantrums. Just look at what’s happened to the director of Los Alamos National Laboratory, G. Peter Nanos. If the postings about him on a largely anonymous Los Alamos insiders’ blog are true, he’s a classic Mad Boss. Yet the scientists and engineers who work for him, having reached their limit, aren’t giving up; they’ve used the Web to shame him. Mad Boss may have met his match in Mad Blogger.

I can’t say I’m sad to see the field so leveled. The Web is criticized, and often rightly so, for the incivility of so much of its dialogue. But here’s one instance in which it can actually help counter the sort of offline incivility that for too long has been simply a given of the workplace.

Filed Under: Business, Personal, Politics

Interesting reading

April 4, 2005 by Scott Rosenberg

## Peter Drucker looks at the big picture of the world economy today — really four economies, he says: information, money, multinationals and mercantile exchange.

  For thirty years after World War II, the U.S. economy dominated practically without serious competition. For another twenty years it was clearly the world’s foremost economy and especially the undisputed leader in technology and innovation. Though the United States today still dominates the world economy of information, it is only one major player in the three other world economies of money, multinationals and trade. And it is facing rivals that, either singly or in combination, could conceivably make America Number Two.

## Cynthia Ozick reviews Joseph Lelyveld’s memoir. I haven’t read the book, but the former N.Y. Times editor apparently did a vast amount of legwork researching his own childhood. This is Ozick’s discussion of the limitations of Lelyveld’s approach:

  …There is no all-pervading Proustian madeleine in Lelyveld’s workaday prose. Yet salted through this short work is the smarting of an unpretentious lamentation: “If this were a novel,” “If I were using these events in a novel,” and so on. Flickeringly, the writer appears to see what is missing; and what is missing is the intuitive, the metaphoric, the uncertain, the introspective with its untethered vagaries: in brief, the not-nailed-down. Consequently Lelyveld’s memory loop becomes a memory hole, through which everything that is not factually retrievable escapes. Memory, at bottom, is an act of imaginative re-creation, not of archival legwork. “Yes, I was finding, it was possible to do a reporting job on your childhood,” Lelyveld insists. Yes? Perhaps no. The memoirist has this in common with the novelist: he is like the watchful spider alert to every quiver on its lines. Sensation, not research.

Well put. I think one of the reasons I chose, as a young writer, a career as a critic rather than as a reporter was that I could not see devoting my life to writing that was all “nailed-down.” Reporting is a necessary and valuable skill, and I have deep respect for those who do it well; it’s hard, hard work, too. But it will typically miss that dimension of “the intuitive, the metaphoric, the uncertain, the introspective.” In American journalism as it is conventionally defined by those who carve out the job descriptions, a critic’s portfolio is broader, and it’s possible, under the right alignment of stars, to feel as well as to record — or rather, to record what one has felt along with what one has witnessed.

## Apparently there’s a movement afoot in the world of writing about games to be less “nailed-down.” It’s called the “New Games Journalism” — “a narrative, experiential approach that acknowledges the effect of the game on the player.” I’ll need to read up. This was sort of what I had in mind 15 years ago when I began to move my attention from the world of theater to the digital realm, and thought, hey, why not try writing more ambitious reviews of videogames? I’d just turned 30, though, and was already feeling that the gaming world was one I would be less and less able to keep up with as the decades advanced. (So right!) So I wrote one opus — an “experiential” discourse on the world of Super Mario — and moved on to broader terrain.

Filed Under: Business, Food for Thought, Personal, Technology

Dosed

March 1, 2005 by Scott Rosenberg

Here’s a little tale of life in the 21st century.

As I suffered through a bout of the usual seasonal cold last week, I found that my supply of my remedy of choice — a generic over-the-counter combo antihistamine and pseudoephedrine (Sudafed) — was running low. As I ran errands, I searched for this variety on the shelves of local drug stores, but to no avail. Finally, this morning, at a Walgreen’s in downtown San Francisco, I found the precise medication, so I thought, gee, better stock up.

But when I plopped three boxes of “Walfinate D” on the counter, the checkout lady said, “There’s a limit of two on those.” She couldn’t tell me exactly why, but since all she wanted to do was ring box number three up separately, I didn’t pursue it.

Back at my desk, I decided to look for answers. I couldn’t remember how to spell “pseudoephedrine” so I just Googled “sudafed controls” and found this page, which pretty much answered my question: Pseudoephedrine is apparently a key raw material for the proprietors of meth labs, so the government wants to limit bulk sales.

First I was irritated that my need for cold relief was being made more inconvenient by the chemistry demands of speed freaks. Then I was delighted at how simple a matter it was, in these Google-powered times, to discover exactly why my cold medicine was considered a suspect substance.

My inconvenience was hardly severe. But if they try to ban my Sudafed, as the commentator on the above page proposes, they’ll have to pry it from my germy, sneezed-into hands!

Filed Under: Business, Personal, Science

The browser war in the rear-view mirror

December 20, 2004 by Scott Rosenberg

Randall Stross’s piece on Firefox in the Sunday Times business section, with its comical quotes from a Microsoft spokesman who suggests that unhappy users buy themselves new computers, brought a little wisp of browser-war nostalgia to mind.

It’s undeniable that, today, if you want to protect your computing life and you run Windows, you’re insane to continue running basic Microsoft applications like Internet Explorer and Outlook. (Firefox and Thunderbird are great alternatives in the open source world. I’m still wedded to Opera and Eudora out of years-long habits. Opera does a great job of saving multiple open windows with multiple open tabs from session to session, even when you suffer a system freeze.) These programs function together in a variety of ways that Microsoft presented as good ideas at the time they were written. Hey, integration means everything works seamlessly, and everyone knows how highly the business world prizes the word “seamless.”

Today it is precisely the same integration — the way, for instance, that ActiveX controls and other code pass freely across the borders of these applications, allowing them to work together in potentially useful but hugely insecure ways — that makes IE and Outlook such free-fire zones for viruses and other mischief. (It’s certainly true that the Microsoft universe is targeted by virus authors because it’s where the most users are; but it’s also true that Microsoft’s products are sitting ducks in a way that its competitors in the Apple and open source worlds simply are not.) If you’re willing to turn on Microsoft’s auto-update to keep up with the operating system patches, and to abandon Outlook and IE for your day-to-day work, you can rest relatively easy. But you never know when some other application is calling on that “embedded browser functionality,” or when you’re using that Outlook code without even realizing it.

Stross is strangely mum on the antitrust background of these matters. It’s the ultimate, though not entirely unforeseen, irony of the Microsoft saga that the very integration-with-the-operating-system that enabled Microsoft to “cut off the air supply” of its Netscape competition is now looking more and more like the franchise’s Achilles heel. Microsoft fought a tedious, embarrassing and costly legal war with the government to defend its right to embed Web browser functionality in the heart of the operating system. “Our operating system is whatever we say it is! How dare government bureaucrats meddle with our technology!” was the company’s war cry.

Now it turns out that if Gates and company had paid a little more heed to the government they might have done their users, and their business, a favor. Microsoft’s tight browser/operating system integration helped spell Netscape’s corporate doom; today it is one of the biggest gaping holes in Windows security, and a legion of hostile viruses swarms through it.

Stross writes, “Stuck with code from a bygone era when the need for protection against bad guys was little considered, Microsoft cannot do much. It does not offer a new stand-alone version of Internet Explorer. Instead, the loyal customer must download and install the newest version of Service Pack 2. That, in turn, requires Windows XP. Those who have an earlier version of Windows are out of luck if they wish to stick with Internet Explorer.”

But it’s not quite that simple. Microsoft’s reluctance to invest in browser development has stemmed only partly from the kind of inertia that comes from having won a war in a previous generation (“The browser? We own that space, we don’t have to keep improving it”). Even more deeply, Microsoft has been reluctant to make the browser better — more reliable, more secure, more flexible as an interface for more kinds of applications — because its leaders understood very well what that would mean: The better the browser is, the less dependent people are on the operating system’s features — as today’s users of well-designed Web applications like Gmail, Flickr and Basecamp demonstrate every day. This is not where Microsoft wants to see the computing world go, so why, once it gained a stranglehold on the browser market, would it help the process along?

In other words, what happened once Microsoft left the courtroom was precisely what the government’s antitrust lawyers said would happen: Microsoft’s goal in integrating the browser was not to serve the public and the users, but to shut down further innovation and development. Netscape argued that Microsoft wanted to control browsers because it wanted to make sure they did not emerge as a platform for applications that would undermine Windows’ importance. Netscape, the record now shows, was right.

We lost three or four years of Internet time (from the collapse of the bubble to this year’s Renaissance of Web applications) thanks to Microsoft’s stonewalling and the Bush administration’s unwillingness to represent the public interest in this matter. The next time a worm comes crawling through your Windows, curse the Justice Department’s settlement — and go download Firefox.

Filed Under: Business, Technology

The great Social Security swindle

November 29, 2004 by Scott Rosenberg

“You’re thinking of this place all wrong. As if I had the money back in a safe. The money’s not here. Your money’s in Joe’s house . . . (to one of the men) . . . right next to yours. And in the Kennedy house, and Mrs. Macklin’s house, and a hundred others. Why, you’re lending them the money to build, and then, they’re going to pay it back to you as best they can.”

Christmas season is “It’s a Wonderful Life” season, and anyone who has seen that movie — which ought to be pretty much everyone by now — will remember Jimmy Stewart’s plain-spoken explanation of banking, delivered to angry customers who have begun a run on the bank where he works.

Today it’s the Bush administration that’s started a run on the institution of Social Security. And so far no one in Washington has had the gumption or the forthrightness to get up, like Jimmy Stewart’s George Bailey, and tell the American people what’s really going on.

The Democrats have long been accused of overstating the case in defense of the Social Security system and “scaring seniors” by warning them that the evil Republicans are going to cut their benefits. Seniors may not, in fact, be in too much trouble — but people in their mid-40s like me, and anyone younger, have every reason to fear.

What am I so worked up about? This piece in yesterday’s New York Times, headlined “Bush’s Social Security Plan Is Said to Require Vast Borrowing.” Richard W. Stevenson’s article is a highly problematic example of pseudo-objective “on the one hand, on the other hand” journalism — but even through the haze of official mendacity, the message is clear.

For months — years, if you go back to the 2000 election cycle — serious economists have been saying that there is no way to pay for President Bush’s scheme to privatize part of the Social Security system without running up huge deficits. At this point in Bush history, of course, the huge deficits have arrived even without “reforming” Social Security. So the Bush line now appears to be: Hey, “vast borrowing” hasn’t hurt us yet; what’s a few huge deficits more?

As the economist Herbert Stein famously said, “If something cannot go on forever, it will stop.”

Let’s recap some of the history here: The Social Security time-bomb — a side-effect of the Baby Boom demographic bulge passing through the employment lifecycle — was evident a generation ago, certainly by the waning years of the Reagan administration. Bipartisan efforts — including the first President Bush’s acceptance of a tax increase, despite his famous “Read my lips” promise — set the nation’s finances on course again. By the late ’90s we began racking up significant budget surpluses.

These surpluses were supposed to be set aside to keep Social Security solvent for us and our children. That was the famous “lock box” that Al Gore was unfairly derided for talking about. This money wasn’t “ours,” as George Bush fatuously and insidiously told the nation in 2000, justifying his call for tax cuts. It was cash that had been raised to solve a long-term problem.

Bush and his team broke open the lock-box and handed the cash out, mostly to the wealthiest tier of Americans, and began running up deficits like there was no tomorrow. Now they want us to buy into a fraudulent scheme to divert chunks of the nation’s obligations to future retirees into 401(k)-like private investment accounts. But since the money today’s workers now pay in Social Security taxes actually pays today’s retirees, any cash diverted to such investment accounts will have to be made up somehow.

Bush’s answer? Charge it!

In theory, the economists who like this privatization scheme see it as a way to boost the nation’s total savings, which is a good thing for the economy and should increase long-term growth, ultimately helping put the federal budget back on track. But, er, if the feds are borrowing the money for the citizens to save, then there’s no real increase in total savings, and no long-term benefit — as Stevenson’s article lays out. All we get are bigger and bigger deficits as far as the eye can see, with the looming possibility that, sooner or later, our lenders will grow tired of the game, and we’ll face a catastrophic drop in the dollar, a skyrocketing inflation rate, and the prospect, at worst, of a Weimar-like fiscal collapse.
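
The arithmetic behind that objection is simple enough to sketch (these are my own toy numbers, not figures from the article): whatever households “save” in the new accounts, the Treasury borrows right back.

  // Toy arithmetic with made-up numbers: if workers divert payroll taxes
  // into private accounts and the government borrows the same amount to
  // keep paying current retirees, total national savings doesn't budge.
  const diverted = 100_000_000_000;           // $100B into private accounts
  const privateSavingsChange = diverted;      // households save "more"
  const governmentSavingsChange = -diverted;  // Treasury borrows it back
  console.log(privateSavingsChange + governmentSavingsChange); // prints 0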

Meanwhile, what are we taking this huge risk for? For the sake of letting individual investors take a modest portion of their retirement money and put it into mutual funds? Of course, we’ve recently had a national refresher course in how the mutual fund industry works; even without crooked kickbacks and such, the service fees eat up a significant chunk of the ostensible advantage you get from investing long-term in stocks over more conservative choices. And those financial advisers who love to tout the long-term advantage of stock investments are rarely willing to come clean on the risk to retirees: Growing older is not a choice, and if you’re unlucky enough to need to retire during a market downswing, you will not find much consolation in knowing that your portfolio would have averaged out a winner if you’d only had another decade or two.
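
To put rough numbers on that fee drag (assumed rates, mine and not the industry’s): compare a nest egg compounding at a hypothetical 7 percent annual return with the same money earning 5.5 percent after a 1.5 percent expense ratio, over a 30-year working life.

  // Rough illustration of fee drag, using assumed rates: a 1.5% annual
  // expense ratio consumes roughly a third of the final balance over
  // thirty years of compounding.
  function grow(principal: number, rate: number, years: number): number {
    return principal * Math.pow(1 + rate, years);
  }
  console.log(grow(10_000, 0.07, 30).toFixed(0));   // ~76123 before fees
  console.log(grow(10_000, 0.055, 30).toFixed(0));  // ~49840 after fees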

In the long term, stocks may be better; but as a famous economist once said, in the long term, we’re all dead, too. The long term is always iffy. That’s why the best retirement safety nets are built out of safer materials than stock-market investments — and why Social Security should be kept out of the hands of the brokers.

Consider this other piece from yesterday’s Times, in which Mary Williams Walsh explains a little-known paradox of the pension world: It seems that, despite the woes so many pension funds now face, a handful of them have managed to prosper by choosing conservative, safe long-term investments. Meanwhile, the pension funds that are in trouble are those that chose riskier stock-market portfolios. Imagine that! This, of course, is precisely the course that Bush wants to put Social Security on. In a better world, Walsh’s piece would have been put on the Times front page right next to Stevenson’s, as a cautionary counterpoint to the president’s folly.

Everyone in Washington knows we need to fix Social Security. But the Bush approach, while it could win support in the short term in a Republican-dominated Congress, is a long-term disaster. The worst scenario here is one that no one in the administration would ever admit to, but if you listen in on the loony right fringes (who are closer than ever now to the levers of power) you’ll hear it: The idea is that if we undermine Social Security enough today, when the fiscal train-wreck hits tomorrow the government won’t have any choice but to scrap the retirement system entirely — fulfilling, finally, the dreams of its original die-hard Republican opponents, who saw FDR’s pledge to America’s working families as an evil efflorescence of socialism.

The Bush economists are ready to begin the dismantling. Wall Street is teeming with brokers slavering to get the commissions on this vast new influx of accounts. And, just when we can no longer count on Social Security to cushion our retirements, the borrowing the Bush plan demands will spark inflation or undermine the dollar or both, devaluing whatever savings we may have been counting on to augment those Social Security checks.

Maybe seniors — and the rest of us — should be scared.

Filed Under: Business, Politics

Doctorow at WIPO Geneva

November 18, 2004 by Scott Rosenberg

Cory Doctorow’s reports for the Electronic Frontier Foundation from the UN’s World Intellectual Property Organization (WIPO) meeting in Geneva are fascinating for what they illuminate at this bizarre crossroads of global bureaucracy and globalized corporatocracy. But most peculiar of all is his tale of how “all of the handouts set out by the ‘public interest’ groups (e.g., us, civil society coalition, IP Justice, Union for the Public Domain) were repeatedly stolen and pitched into the trashcans in the bathrooms.”

Here’s an excerpt of the full saga:

  Let me try to convey to you the depth of the weirdness that arose when all the public-interest groups’ papers were stolen and trashed at WIPO. No one gets into the WIPO building without being accredited and checked over, so this was almost certainly someone who was working on the treaty — in other words, a political opponent (none of the documents promoting the Broadcast Treaty were touched).

  As the Indian delegation put it, WIPO is an organization based on information. For someone who believes in an information-protection instrument like the Broadcast Treaty to sabotage the negotiation by hiding information from the delegates is bizarre. The people who run the table were shocked silly — this has apparently never happened before at WIPO.

Filed Under: Business, Politics, Technology

Nobodies business

October 11, 2004 by Scott Rosenberg

I want to pick up a few threads I’ve been collecting and meaning to post about but haven’t had time for till now.

Let’s start with Matthew Klam’s New York Times Magazine cover story on bloggers from a couple of weeks ago. As a group portrait of a handful of high-profile political bloggers it was, I thought, a good read, and reasonably accurate, based on my own impressions of some of the people covered. But this passage jumped out at me and screamed for comment:

“In a recent national survey, the Pew Internet and American Life Project found that more than two million Americans have their own blog. Most of them, nobody reads. The blogs that succeed … are written in a strong, distinctive, original voice.”

This passage crystallized the fundamental and profound divide between most professional journalists and most bloggers. “Most of them, nobody reads.” Now, even the world’s most neglected, forlorn and unpopular blog has at least one reader — the author. So Klam’s first message to these bloggers is, “You are a nobody.” But in fact most of the millions of not-terribly-well-known blogs on the planet do have a handful of readers: friends, relatives, colleagues, the person who staggered in the door from a Google search and stuck around.

“Everyone’s famous for 15 people.” Not a new concept (here’s a reference from 1998), but still a valuable one. And one that continues to elude most journalists, who can’t lay aside their industry’s yardstick of success long enough to understand what’s happening on the Web today.

For Klam, as for so many of us media pros, “the blogs that succeed” is synonymous with “the blogs that reach a wide audience.” But publishing a blog is a nearly cost-free effort compared with all previous personal-publishing opportunities, and that frees us all to choose different criteria for success: Maybe self-expression is enough. Or opening a conversation with a couple of new friends. Or recording a significant event in one’s life for others to find.

Many of these blogs do not meet the definition of “journalism,” but who is Klam, and who are we, to say that they are not “successes”? Who are we to discount the human significance of untold numbers of personal stories and thoughts and ideas communicated to handfuls of readers — to dismiss this vast dialogue as the chatter of “nobodies”?

(David Weinberger has a similar reaction here.)

Of course there are blogs and bloggers who judge their enterprises according to the traffic yardstick. Steven Levy’s recent Newsweek column even suggested that some bloggers are beginning to become what is known indelicately in the Web industry as “traffic whores”: “The low road is a well-trodden path to big readership.” As some bloggers try to turn their pastime into a business or a livelihood, this is inevitable.

Unlike Levy, though, I’m less worried about the occasional “ankle-biting” blogger who grows hoarse-voiced in hope of page-views — and more impressed by the unflagging explosion of memorable new blogging voices and contributions to the burgeoning pool of human knowledge online.

This is the dark matter of the Web universe, the stuff J.D. Lasica is writing about in his book. Collectively, it outweighs all the “bright” matter of the more commercial Web sites with their vast traffic. This much was known as early as the mid-’90s, when we began to see that, though the top 20 Web sites may have dominated the traffic claimed by the top 100 Web sites, the top 100 Web sites still commanded only a fraction of the Web’s total traffic. This was a new world.

What’s happening today is that, thanks to Google and RSS and other technologies still aborning, that world is beginning to get organized, and as it becomes better organized it can’t help becoming more economically significant.

Here’s where I’d bring in Wired editor Chris Anderson’s now justly celebrated “Long Tail” piece. Anderson takes a look at consumer behavior patterns on Amazon, Netflix, Rhapsody, and other “big catalog” services online. These services revive back catalogs and “mid lists”; they return a nearly infinite number of oldies to circulation. Individually, these works have minuscule demand; collectively, they’re huge:

“Not only is every one of Rhapsody’s top 100,000 tracks streamed at least once each month, the same is true for its top 200,000, top 300,000, and top 400,000. As fast as Rhapsody adds tracks to its library, those songs find an audience, even if it’s just a few people a month, somewhere in the country. This is the Long Tail.”

People don’t get this yet, Anderson writes: “We assume…that only hits deserve to exist” — just as we assume that if you don’t have a big circulation, “nobody” reads you.
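
The arithmetic behind the Long Tail is easy to play with. Here’s a toy model, assuming (purely for illustration) that demand falls off Zipf-style, in proportion to 1/rank, across a Rhapsody-scale catalog of 400,000 tracks:

  // Toy Zipf model (assumed 1/rank demand curve, for illustration only):
  // the hits at the head of the curve command a healthy share, but the
  // long tail of obscurities collectively outweighs them.
  function shareOfRanks(from: number, to: number, catalogSize: number): number {
    let slice = 0;
    for (let r = from; r <= to; r++) slice += 1 / r;     // demand ~ 1/rank
    let total = 0;
    for (let r = 1; r <= catalogSize; r++) total += 1 / r;
    return slice / total;
  }
  const size = 400_000;
  console.log(shareOfRanks(1, 100, size).toFixed(2));     // ~0.38: the hits
  console.log(shareOfRanks(101, size, size).toFixed(2));  // ~0.62: the tail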

Anderson’s piece focuses chiefly on the entertainment industry, but the principle is a broader one. If you want to keep climbing the ladder from blogs to the entertainment industry all the way up to the global economy, the next piece to read is James Surowiecki’s little essay on “the bottom of the pyramid,” which talks about the vast economic opportunity in creating products for the planet’s teeming billions of poor customers. (“Though developing nations don’t have much money on a per-capita basis, together they control enormous sums.”)

There’s an old saying in the land of the Broadway theater, where once I tarried, that you can’t make a living there, but you can make a killing. Perhaps the Internet’s fate is to transmute the worlds of publishing and entertainment and even global trade from the hit-or-miss nightmare of a Broadway-like lottery into something more hopeful — a world where it’s a lot harder to make a killing but a lot easier to make a living. Is there anyone, outside of a few boardrooms, who’d find that a loss?

Filed Under: Blogging, Business, Technology

Bubble 2.0?

October 6, 2004 by Scott Rosenberg

John Battelle is moderating a panel of financial guys. (Yes, they’re all guys.) It’s titled “So is this a bubble yet?” Starting with the Google IPO.

William Janeway of Warburg Pincus: Google was making lots of money, they had to go public to create liquidity for stakeholders, and because they’d reached the point where legally they had enough owners that they had to start reporting anyway. That’s different from the situation in which VCs are basically arbitraging companies, trying to sell them off to the greater fool. As happened during the bubble. I doubt that anyone in this room will be active professionally when the next true bubble comes along.

Safa Rashtchy of Piper Jaffray: Today, only 25% of the use of the internet is for consumer content, 75% is as a utility — for communication, essentially.

Janeway: We funded enormous productive waste. Trial and error. How many startups were funded in order that, at the start of Web 2.0, we could have Amazon, eBay, Google, Yahoo? There will be a lot of smart people productizing smart ideas that will be acquired.

Janeway: One of the things we like about this environment is the number of scarred veterans who survived the bubble and are actually building businesses today. They’re expecting it’ll be a 5-7 year time horizon. If we do that, build a real business, generate positive cash flow, we’ll be rewarded. That’s what the Valley requires.

Lanny Baker of Smith Barney: The overall market cap of internet sector is smaller than at the peak of the bubble. Companies are generating more cash. The jokesters running the scam companies have probably been weeded out. It’s a safer pool to swim in.

Filed Under: Business, Events, Technology
