Wordyard

Hand-forged posts since 2002

Twitter’s link-sharing limits

October 27, 2008 by Scott Rosenberg

One of the main things that I do on Twitter these days, and that the people I follow do, is share links. Sharing links is one of the primal activities on the Web. It was one of the first things people did once they started building Web pages; it was one of the two driving forces behind the rise of blogging (the other was unedited self-expression).

Twitter was built for people to share “status messages” — the answer to the “What are you doing?” question — but most of the people I follow don’t use it for that very much. They use it to comment on news events and to share links they like. Because of this disjunction between original design and “street use,” I find that Twitter gets only one thing about sharing links right — and pretty much everything else wrong.

What it gets right is immediacy. Twitter is fantastic when there’s a breaking story and you want to see what links people are handing around. It’s a much speedier way to tune in to what’s happening (Senator Stevens — guilty!) than RSS feeds or reloading a news site’s front page.

But Twitter privileges “now”-ness over everything else. You can’t tag your links. You can annotate them only if you can say what you wish in under 140 characters (actually, under 140 minus the length of the URL). You can’t even see what the actual URL is, most of the time, since people use URL-shorteners to save space. There is really no other way to say this: For a service that is so widely used to share links, Twitter really sucks at it.
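The arithmetic behind that annotation squeeze is simple enough to sketch. Here is a minimal illustration (the 140-character limit is Twitter's; the helper function and sample URLs are mine, for demonstration only):

```python
TWEET_LIMIT = 140

def annotation_budget(url: str, limit: int = TWEET_LIMIT) -> int:
    """Characters left for commentary once the URL (plus one separating
    space) is accounted for."""
    return limit - len(url) - 1

# A full permalink eats most of the budget; a shortened URL restores it --
# which is exactly why shorteners took over, and why the real URL got hidden.
long_url = "http://www.example.com/2008/10/27/a-very-long-permalink-slug-for-a-post"
short_url = "http://tinyurl.com/abc123"

print(annotation_budget(long_url))
print(annotation_budget(short_url))
```

The numbers make the trade-off concrete: the shortener buys back dozens of characters of commentary, at the cost of obscuring where the link actually goes.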

Delicious has long offered the best combination of features for simple link saving and sharing (it’s got space for annotations and a spiffy new interface). You can use Delicious to “follow” (subscribe to) specific tags, but not, as far as I can tell, to follow specific users. (If I’m behind on Delicious’s feature set, enlighten me!) You can use Delicious-generated RSS feeds for that, but we’re getting pretty far afield — nothing remotely approaching Twitter’s simplicity.

So here’s an opportunity for Twitter, or for someone else, if the Twitter team is too busy: Offer a service very similar to Twitter but optimized for link-sharing. (FriendFeed is cool but it’s trying to do so many other things at the same time that I don’t think it suits what I’m talking about.) Make it easier to share links real-time; expose the actual URL; give us some rudimentary tools for organizing the links; and watch something cool grow.

Of course, Twitter has the critical mass of usage right now, and that’s not going away. But surely there’s room for improvement.

Filed Under: Blogging, Technology

In conversation with Leo and friends

October 13, 2008 by Scott Rosenberg

I had a good time yesterday afternoon chatting with Leo Laporte, Harry McCracken (formerly of PC World and now a free-agent blogger at Technologizer), and Tom Merritt of CNET’s Buzz Out Loud on Laporte’s “This Week in Technology” podcast. We talked about Apple’s forthcoming notebook announcements, Sarah Palin’s email accounts, whether Google should be feared, whether the NSA’s eavesdropping should be feared, whether Google’s Android phone should be cheered, whether charging for SMS text messages at both ends will kill off the technology, and a lot more.

I am a mere amateur when it comes to Apple geekery and cellphone connoisseurship, and what I know about SMS text messaging could be communicated via SMS text messaging, so the invitation to hobnob with the experts was a gracious one — thanks, Leo! And we did talk a little about my book and the story of blogging.

You can listen to the whole thing here.

Filed Under: Personal, Technology

Review of Randall Stross’s Google book

October 9, 2008 by Scott Rosenberg

I will poke my head up ever so briefly from my labors to note that I have a book review up at Salon today of Randall Stross’s new “Planet Google.”

Here are a couple of passages:

“Planet Google” further reinforces the picture we now have of Google as the Mr. Spock of Internet companies: intellectually supreme, agile and engaged with the world, but prone to respond to the unpredictable behavior of its customers by cocking an eyebrow and exclaiming, “Highly irrational!”

Is there a Bones McCoy anywhere in the company who can provide a humanist counterweight to all that calculation? Maybe — but you’re not likely to learn who it is from Stross’ research. “Planet Google” is solid and informative, and Stross, refreshingly, avoids the frothier sort of Google hype sometimes heard from the tech-punditry choir. But the book is hardly the insider’s-eye view of Google that it has been painted to be.

Also:

If Google is going to falter over the coming decade, it is likely to be the result of avidly pursuing its “organize the world’s information” goal even as the evidence mounts that its Spock-like principles and engineering-first culture may not get the company to its destination. Stross’ account provides several case studies — including accounts of the oddly neglected Orkut social networking site and the ill-fated Google Answers service — in which innovative Google ventures foundered because of the company’s clumsiness at managing human interaction.

Filed Under: Business, Personal, Salon, Technology

Sarah Lacy’s Once You’re Lucky: Money doesn’t change everything

August 5, 2008 by Scott Rosenberg

I’ve just finished Sarah Lacy’s book Once You’re Lucky, Twice You’re Good: The Rebirth of Silicon Valley and the Rise of Web 2.0, and I’m feeling a little…green. Lacy’s portrait of this decade’s Web industry is so relentlessly shaped by the yardstick of cash — how much money this entrepreneur made, how many millions that startup is valued at — that by the end of the book, you can’t help having absorbed a little of that world view.

As I put down the volume, I found myself thinking, gee, why didn’t I start a company in my dorm room and pocket tens of millions before I turned 30? Then I slapped myself in the face a couple of times and reminded myself that the last time I lived in a dorm room, the Web didn’t even exist — and that when I set out to become a writer the idea wasn’t, how can I make millions, but rather, is it possible to support myself doing what I love? (I was lucky enough to have the world answer “yes!”)

To be fair, Lacy’s a business reporter; she’s written a business book; business is all about money. She paints a colorful and absorbing portrait of the world of Silicon Valley’s latest wave of smart kids to strike it rich. On the other hand, I can’t accept that her account offers an accurate portrait of “the rise of Web 2.0.” Because, in a way, I feel like I was there, too, at least in the earlier phases, talking with many of the same people and companies that Lacy writes about, showing up at many of the same conferences, witnessing the same phenomena. And it just looked, and felt, different to me: at the start, it was much less about retaining control of one’s company and much more about giving control to one’s users.

First, the good stuff about Once You’re Lucky: It’s full of amusing anecdotes, some of them illuminating, and it offers some valuable insights into the motivation of many of today’s young Web entrepreneurs and the complexity of their relationships with their financiers. It gives a great tour of how the startup and venture capital games have changed over the past decade, as the cost of launching a company has dwindled, reducing the need for big upfront investments that dilute founders’ stakes, even as the prospect of everybody-gets-rich IPOs has grown rarer.

I fault the book in a few areas. In tracing the emergence of the Web 2.0 era’s emphasis on social networking and user contributions, Once You’re Lucky is neglectful of the long history of these phenomena that predates the Web 2.0 era. From Amazon book reviews to the Mining Company (later About.com) to the AOL “guides” and on and on, the so-called “Web 1.0” era was actually full of content created by “the crowd.” Its most overinflated and notoriously flaky IPO, in fact, that of TheGlobe.com, was entirely a “community play” (though in a way that betrayed the best possibilities of online community). The Web of the day just wasn’t as efficient as the later generation of companies at organizing the material contributed by users, and there weren’t nearly as many contributors, and Google hadn’t come along yet to help the rest of the Web find the contributions (and to help the companies profit from them).

My biggest beef with Lacy’s book is that its choice of which companies to focus on seems capricious. Maybe it was just based on who she got access to. Plainly, Lacy got lots of great material from one of her central figures, Paypal cofounder Max Levchin, and she paints a thorough profile of the driven entrepreneur. But his company, Slide, just isn’t all that interesting or innovative. After reading several chapters about it I still can’t tell you exactly what the company’s driving idea is. It does slideshows on MySpace! It’s big on widgets! It out-Facebooks Facebook with apps like Super Poke! But, you know, if you were stuck in the proverbial elevator with Levchin, could he actually tell you what Slide is all about?

There are other stories in the book whose inclusion makes more immediate sense. Few today would argue against Facebook’s significance, and it’s worth the time Lacy spends on it (though one might look for a little more skepticism). Ning may or may not prove important, but Marc Andreessen’s story is valuable in itself. What’s most interesting about Digg is its model for group editing (which, again, is based on “Web 1.0” roots via Slashdot), not its so-far-unfulfilled quest to sell itself.

Lacy might have delivered a more comprehensive portrait of Web 2.0 by offering more than cursory mentions of the companies that, in my book, really created the template for that phenomenon: Flickr, Delicious, the short-lived Oddpost (which got absorbed into Yahoo Mail). These small startups, growing like mushrooms out of the mulch of dead dotcom tree trunks, pioneered virtually all of the tools and technologies we now think of as “Web 2.0”: easy sharing of media creations; tagging of content to create user-generated “folksonomies”; Ajax techniques for inside-the-browser applications; and so on.

It seems that even though these services and companies were at the heart of the invention of Web 2.0, they don’t figure prominently in Lacy’s narrative because, by the financial yardstick, they were relatively small potatoes (all three were acquired relatively early by Yahoo for amounts rumored to be in the low tens of millions). Levchin is a lot richer than the founders and creators of these companies, but in my view, their work was far more significant.

As someone in the middle of writing a book on a related topic that is inevitably going to face similar criticism (how could you write about this blogger and not that one?), I know that Lacy couldn’t possibly cover every significant company. It’s just not clear what criteria she used to make her choices beyond the will-o’-the-wisp that is market valuation (especially wispy when your company is not actually traded on the market).

So this is where I say: the importance of a company does not lie in how rich it makes its founders, but rather in how widely its ideas spread. The business reporter who is too easily mesmerized by the number of zeroes in a company’s valuation is like the political reporter who is only interested in the horse race.

By themselves, numbers are dull. To me, the fluctuations of a company’s market value, like the ebb and flow of a politician’s polling numbers, are only of interest as part of a larger picture: How is that company, or politician, influencing our world?

[The book’s site is here, and here’s Lacy’s blog. Katie Hafner’s critical review is here. The SF Chronicle review by Marcus Banks is here.]

Filed Under: Books, Business, Net Culture, Technology

Nick Carr’s new knock on the Web: does it change how we read?

June 11, 2008 by Scott Rosenberg

The funny thing about Nick Carr’s Atlantic cover piece, “Is Google Making Us Stupid?,” is that the piece itself has the truncated quality that it blames the Internet for imposing on our culture. When my copy of the magazine (yes, I actually subscribe on paper) arrived I saw the headline and looked forward to a really thorough, in-depth look at this question. Carr’s entirely capable of that; I disagree with much of his perspective in “The Big Switch,” but it’s one of the more cogent and sustained critiques of the Web 2.0 future, and anything but lightweight. So I figured the Atlantic had paid Carr to do what the Atlantic, and only a tiny handful of outlets, can still do: spend many thousands of words digging into the heart of an important issue.

Ah, well. You can still find such pieces in the Atlantic (like this one about rising crime rates in mid-sized American cities), but Carr’s isn’t one of them. At 4000 words, it’s barely longer than the kind of thing Salon does every day. It’s a provocative read scattered with tasty quotes and anecdotes; it asks a useful question but does little to answer it. Carr starts off describing a sense of alienation from old-fashioned reading that he shares with several other people he quotes:

I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

Like Carr, I’ve found myself reading fewer books over the past decade. I can’t tell whether it’s because I’m spending more time on the Web (certainly possible). In my case, if my attention span has shortened at all, I think it’s far more likely that, for instance, raising children has cut into both my available time and my reserve of repose (both actual physical sleep time and emotional reserve of patience). But when I do get the chance to sit back with a good book — like two I’ve recently finished, Faking It (with related blog by authors Yuval Taylor and Hugh Barker) and Clay Shirky’s Here Comes Everybody (also with author blog) — I don’t feel any less absorbed than when I was a teenager plowing my way through a shelf of Tolstoy and Dostoyevsky.

I don’t want to discourage you from reading Carr’s article and pondering the issues it raises. Does Google represent the digital apotheosis of Taylorism (the industrial-age science of labor measurement)? Does the Web crowd out the opportunity for leisurely contemplation or “slow, concentrated thought”? Those of us who use the Web constantly are probably experiencing changes in how we read and think; what are those changes?

These aren’t stupid questions. But they deserve deeper contemplation than Carr has provided. His piece is less like a thoroughly researched magazine piece than, say, the prospectus for a writing project. Perhaps the Atlantic has simply published Carr’s next book proposal. If so, I’d look forward to reading the resulting book — in a relaxed, contemplative way, of course.

Further discussion from Matthew Ingram, Matt Asay and Blaise Alleyne.

UPDATE: Jon Udell finds Carr’s critique “spot on.”

Filed Under: Culture, Technology

Page-views — in 2008?

June 4, 2008 by Scott Rosenberg

Apologies for the light posting, which will continue for a bit. Combination of head-down-in-book-work and family commitments. Got a long post from the D conference brewing, but haven’t been able to pull it together yet.

In the meantime, interesting piece in today’s Journal about the failure (so far) of much-touted Washington Post “hyperlocal” experiment, LoudounExtra.com. The guy in charge, Rob Curley, admits he spent too much time talking up the project with news executives and not enough actually getting to know the people the site was supposed to be serving. (Points, at least, for honesty.) A classic community-building mistake that I’m sure he won’t make again.

But what caught my eye was this bit tucked in a background graf about Curley:

Perhaps his biggest success was the Lawrence (Kan.) Journal-World’s KUSports.com, a site dedicated to University of Kansas sports that grew during Mr. Curley’s three-year reign from 500,000 monthly page views to a one-time peak of about 13 million monthly page views.

Page views, though superior to the old “hit” metric, were never an ideal measure of real value in online publishing (I wrote about this in Salon in 1999). In the era of Ajax-style web applications, where the browser might stay on one page while you work on email or something else for a half hour, page-views are meaningless. Once upon a time, sites broke up long articles into pages to squeeze out a few more ad impressions; today, pages are less and less the unit of web content, which now comes at us in widgets and RSS and a hundred other generated-and-remixed formats.

It was so quaint to see a big page-view number touted as the sign of a site’s success in 2008 — like a dotcom bubble flashback…

Filed Under: Media, Technology

Gates and Ballmer at D: Lament for lost youth

May 27, 2008 by Scott Rosenberg

I’m keeping my head down in my book writing, mostly, this year, but I allowed myself one trip to one industry event, so here I am at Walt Mossberg’s and Kara Swisher’s D conference again. New owner (who’ll be here tomorrow); same friendly proprietors.

Things kicked off tonight with a double interview with Bill Gates and Steve Ballmer. After last year’s psychodramatically rich confrontation between Gates and the other Steve in his life, this event was decidedly more tepid. Gates has had one foot out the door of his company for a long time, of course, but as he prepares to depart fully from active duty next month, he might have figured on taking something of a victory lap here.

No such luck. Mossberg, inconveniently, kept bringing up the Vista fiasco. Gates wryly commented, “We have a culture that’s very much about, ‘We need to do it better,’ and Vista’s given us a lot of opportunity for that.”

Ballmer predicted a release of “Windows 7” — the successor to Vista — by late 2009. (Danger, Will Robinson! Remember the Longhorn slippages! Haven’t they learned?) There was a suggestion that we might get a look at the new Windows 7 interface here; but what was actually on display were some neat tricks involving multitouch interfaces for applications — a la the iPhone’s pinch-and-tap approach to using more than one point of contact on a touch screen to manipulate stuff. (The demo included an onscreen piano keyboard, but nobody actually tried to play a chord, which I’d have thought would be the obvious way to show off multitouch.) All this was neat enough, but not much to go on — and unless Windows 7 fixes a lot of Vista’s problems there will be a dwindling base of users to experience its neat touches.

Ballmer declared, unconvincingly, that he’s not stewing over the collapse of his attempt to acquire Yahoo: “I’m not frustrated at all. They’re great guys, they built a great company. We couldn’t agree on a price.” As he spoke, a blown-up Wall-Street-Journal woodcut portrait of Jerry Yang stared down at him from the wall. (Yang will be here tomorrow.)

Both Gates and Ballmer remained almost pathologically unable to utter the syllables “Google.” Ballmer attempted to explain how he sees Microsoft responding to the Google challenge: “You need scale, and business innovation, and technological innovation. You need breakthrough innovation and incremental innovation. You need it in search and in advertising. You need to bring it all together. And you need it at all levels of the stack.”

Whenever I hear a CEO say, “We need to do it all!” I translate: “We really don’t know what the hell to do here.”

Gates and Ballmer seemed most comfortable, and genuine, in reminiscing about their youth, as Harvard friends and then as partners in building Microsoft from the ground up. Are their best days behind them? They would never admit it, but no matter how brave a face they put on, or how rosily they paint Microsoft’s prospects, I think that on some level even they sense it.

AllThingsD’s John Paczkowski did the live-blogging thing here. No doubt there will be video up soon too.

Filed Under: Business, Events, Technology

Yahoo/Microsoft collapse, the morning after

May 4, 2008 by Scott Rosenberg

I thought the Microsoft/Yahoo merger would be a disaster for both companies, but the news of Microsoft’s withdrawal of its offer should not be greeted with cheers in any quarters yet. Most reports of the story have noted the possibility that this is just a feint on Microsoft’s part. I could be wrong, obviously, but I can’t believe Ballmer and company are abandoning the field. It’s just not their DNA. They fight to win. They hate to lose. They throw chairs at walls when they’re frustrated. The antitrust ordeal made them more cautious in public, but I can’t believe they’ve become pussycats in private. If they’re really giving up, it means that Microsoft has become an utterly different company from what it used to be, and I just don’t see that.

Sure, after the last few months they may feel (accurately) that an acquisition would result in mass exodus from Yahoo because of corporate-culture incompatibility — i.e., it appears that everyone at Yahoo hates Microsoft’s guts, and everyone at Microsoft despises their counterparts at Yahoo (I’m talking about the corporate leadership — engineers tend to be more catholic in their perspective, at least sometimes). But what Microsoft wants from Yahoo is market share, not talent, so I don’t think this really matters to them.

Instead, they’re saying to Yahoo, “OK, you don’t like our price? Let’s see how you like what the market does to your share price, what the shareholder suits do to your legal budget, and what the withdrawal of our offer does to the other negotiations you’ve got going.” It’s a smart hardball move, at least in the short term. We’ll see how it plays out over the next couple of weeks.

Other interesting first takes:

Paul Kedrosky — “Yahoo (and maybe Microsoft too) reminds me of that crack suicide squad in Monty Python’s Life of Brian.”

Kara Swisher — “Kind of like Oscar Madison and Felix Unger, but not funny in any way at all.”

Mike Arrington — “Google was the big winner in a Microsoft/Yahoo acquisition attempt, no matter what the outcome. But among the possible outcomes, a broken Yahoo and a frustrated Microsoft almost certainly result in increased market share for Google.”

Filed Under: Business, Technology

Why the Web-only life is not worth examining

April 9, 2008 by Scott Rosenberg

Today’s Journal features a Portals column by Vauhini Vara that represents yet another attempt to gauge how far Web apps have come by attempting to “live on the Web,” forsaking all desktop-based software. (Others — like James Fallows in 2006 in Technology Review, whose effort I wrote about back then — have done this before.)

The trouble with this approach is that it’s a total straw man. No one would ever do this except to provide column fodder. The shifts in our software habits are incremental; we don’t “change state” 100 percent, we just drift in one direction. And the drift today is overwhelmingly towards the Web.

Of course Vara finds the trouble spots exactly where you’d expect: If you’re tied in to a corporate email system, giving up Outlook for a Web interface is still painful. Spreadsheets and PDFs are harder to work with. Web-based writing tools are pretty good but so far they haven’t provided a good replacement for Word’s clumsy but essential “track changes” feature.

OK. In the meantime, those of us who aren’t locked in to Outlook long ago went with Gmail or some other Web-based email system. We keep and share our calendars on the Web, and increasingly we use Web-based tools to coordinate small work groups. No one is holding a gun to our heads, so we happily mix Web apps and desktop apps. Why not?

If you’re starting a small business today, are you going to invest in Outlook or are you just going to piggyback on some Web service? When the business begins to grow, are you going to pay the big Outlook tax or stick with what’s working? As developers devise new useful tools for communication and coordination, are they introduced on the desktop or on the Web — or in both places?

These are the trends that matter. “All or nothing” is beside the point.

Filed Under: Media, Software, Technology

Those who cannot remember the past are condemned to write Facebook apps

April 8, 2008 by Scott Rosenberg

My friend and former colleague Chad Dickerson has a great post about Facebook developers reliving the perennial platform-developer’s nightmare: if you build something really wonderful, sooner or later the platform owner incorporates what you invented into the core software.

This line should be savored:

As the old Santayana quote goes, “those who do not learn from history are doomed to repeat it,” but in Silicon Valley, those who rely on their command of history too much often find themselves getting crushed by a 23-year-old who skipped history class in favor of a CS degree.

The platform developer’s dilemma goes back a long way: among other things, to the early days of Dave Winer’s web writing (he’d experienced the phenomenon when he saw his own Macintosh scripting environment eclipsed by Apple’s less versatile in-house effort). But it goes even farther back than that — back before Windows. In the 80s, DOS dominated the world, but you couldn’t really run DOS without a zillion helper utilities. Over time and successive DOS releases many of these helper utilities were incorporated into the OS. Much of the time this was a Good Thing for users, and many of the utilities were freeware anyway, but if you’d tried to build a for-profit business around some essential extension to DOS, you were on shaky ground — and Microsoft was the beast causing the tremors.

Chad locates the difference in today’s software world in the speed of development:

Velocity changes everything. As the developers dance faster in this new environment, so too does the platform elephant. The faster the elephant dances, the more likely “the little people” underneath (as Ariana calls platform developers in the News.com story) could get unwittingly trampled in the process.

Very true. But in the end, I think, developers understandably flock toward any platform on which large numbers of users have pitched their tents — true for DOS decades ago, Facebook today, and who knows what tomorrow.

PS I’m reasonably sure the canonical version of the Santayana quotation is:

Those who cannot remember the past are condemned to repeat it.

But the Web is full of variations. And those who cannot remember their quotes are condemned to wander the Web’s copycat quote pages!

Filed Under: Software, Technology
