Wordyard

Hand-forged posts since 2002


Tie me Webaroo down, sport

April 11, 2006 by Scott Rosenberg

Is it a bubble yet? There’s no way to be sure, but one telltale sign of irrational exuberance the last time around was the proliferation of companies based on ideas that simply made no sense.

The portents are beginning to loom once more. Look at the actual service that a new startup called Webaroo, featured in a little piece in the Times yesterday, provides.

Webaroo’s home screen screams: “Now you can search the Web when you’re NOT CONNECTED!”

Great. Just when we figure out that the value of the Web lies in the connections and conversations it facilitates; just when this “Live Web” gets booster-rockets in the form of AJAX-based Web applications; just when municipal WiFi and other newfangled forms of broad-based, cheap wireless connectivity are rolling out, so that we can be connected almost as much of the time as we want… Webaroo comes and gives us the Web on a hard drive — the disconnected Web — the dead Web!

Now, I’m sure there are situations and circumstances where the ability to store vast quantities of search-query results and cache gajillions of Web pages might come in handy. I’m not saying Webaroo is utterly useless. Just mostly. If you read closely on their site, it sounds like they started out focused on the vision of “The whole Web canned on your laptop!” — and that’s what the Times piece emphasized — but now they’re trying to reposition as a mobile-device content provider. I can’t see your PocketPC or Treo having enough memory to get you very far with this, though.

When I started covering technology in the early ’90s, CD-ROMs were all the rage. Almost immediately upon the arrival of the Web, it became clear that the new medium was more valuable — even though, at the start, CD-ROMs offered faster access to data and more elaborate interfaces. That’s because closed-ended, rich interactivity with a small static pile of data was infinitely less interesting than open-ended interactivity, however crude, with millions of other people.

So Webaroo will take the teeming ocean of today’s Web and bottle it for offline consumption. When a step backwards is branded as a leap forwards, and when people can be persuaded to invest in such retrograde ventures, you know that dumb money has started to pile in behind the smart.

Filed Under: Business, Technology

Random notes

April 10, 2006 by Scott Rosenberg

## Mitch Kapor resumes his blogging at a new Web address with updates about Chandler and Foxmarks, a new project he has launched — it’s a Firefox extension for seamlessly synchronizing bookmarks across multiple instances of the Web browser.

## Chad Dickerson, a Duke alumnus, writes about the sense of privilege at that university in light of the Lacrosse team rape scandal there.

## The full text of the great Bruce Sterling talk at ETech is up here. Bonus: audio from Sterling’s South by Southwest talk.

Filed Under: People, Technology

User-generated discontent

April 7, 2006 by Scott Rosenberg

Derek Powazek explains why the term “user-generated content” feels so icky. It’s marketing-speak applied to an activity (creating original writing, photos, artwork, and other material and contributing it to the Web) that people care deeply about.

I agree that the phrase is icky, yet I have caught myself using it on occasion — because I have not found a good replacement shorthand term that says “stuff people contribute to a Web site or service that is created by visitors to said Web site or service rather than its proprietors.”

Derek proposes:

  Let’s use the real words. Those people posting to Amazon pages? They’re writing reviews. Those folks on Flickr? They’re making photographs. And if we must have an umbrella term to describe the whole shebang, I have a suggestion. Try this on for size: Authentic Media.

Well, I’m sorry, but “authentic media” is a problem, too. For one thing, it’s oxymoronic: “media” refers to the middle-man, yet this stuff is ostensibly authentic because it cuts out the middleman — as Derek suggests when he says, “Authentic media is what happens when the mediators get out of the way.” Furthermore, if “user-generated content” carries a whiff of contempt for unwashed amateur contributors, then “authentic media” is vaguely discourteous to those of us in the other, older-fashioned media who still aspire to some level of authenticity ourselves, and believe that it might be attainable, even if we don’t always achieve it. The label a priori rules out that possibility.

There are several different axes or spectrums at work here — inauthentic/authentic; professional (paid)/amateur (unpaid); one-to-one / one-to-many / many-to-many; and no doubt others I’m missing.

I’m happy to strike “user-generated content” from my vocabulary. But I still think we need a term for distinguishing those reviews and photographs and other works that are contributed by people “out there” from those created by people “in here.” Maybe some day the old-fashioned media model will wither and disappear, everyone “in here” will end up “out there,” and then the distinction will become meaningless. In the meantime, it remains of some use in our conversations, whether we are believers in or skeptics toward the Phenomenon That We Should No Longer Call “User-Generated Content.”

Filed Under: Media, Technology

Sound it out

April 7, 2006 by Scott Rosenberg

Sometimes you just have to hear a name.

I was clicking around My Yahoo trying to understand one aspect of how it handles RSS feeds, when I saw a featured feed from a site called Divester. And I thought — Wow! Like Gawker or Defamer or Treehugger, but devoted to socially conscious investing, trying to get universities and public institutions to sell their stock in companies that Do Evil!

So I clicked through to see more. Whoops — it’s not Di-vest-er, it’s Dive-ster. Diving site. “Divesting” must still be waiting for its blog.

Filed Under: Media, Technology

Windows on Mac? No thanks

April 5, 2006 by Scott Rosenberg

So Apple is going to make it easy for owners of the new Intel-based Macs to dual-boot to Windows, and there’s a lot of buzz, but…I’m sorry, it doesn’t really make a difference to me. There are two reasons I’m still using Windows (I switched eight years ago after losing one too many work-in-progress files to the then-utterly-unreliable MacOS): many years’ worth of data that I don’t feel like transferring (some is cross-platform, but some isn’t); and one Windows application — EccoPro (a long-orphaned but still remarkable outliner program) — that I use every hour of every day, for which there is no Mac equivalent. (Also, I hate using touchpads, and Apple doesn’t make a laptop with a Thinkpad-style Trackpoint device.)

Dual booting doesn’t help. Ecco is my life- and work-organizer. There’s no way I’m going to boot into Windows each time I want to jot down a to-do. Even if I could alt-switch from one OS to another, I’m not sure that would help. Maybe gaming devotees will appreciate the opportunity to reboot their Macs in Windows, but I’m not sure anyone else will care.

In the end, anyway, what’s happening in software today — as John Markoff’s overview of Web 2.0 software development modularization in today’s Times indicates — is that everything is moving to Web-based applications. I’ll move to a Mac when there’s a Web app that can do for me everything that Ecco does for me now. Then my operating system won’t matter — I’ll use a Mac for its superior hardware integration, and because it’s got more developers doing more interesting new things, and I won’t look back to Windows, and won’t ever want to boot it up on a Mac or anywhere else.

Filed Under: Software, Technology

Wall Street Journal joins free-speech cause

April 3, 2006 by Scott Rosenberg

I was amazed recently to find a Wall Street Journal editorial agreeing with me — in this case, suggesting that it might be time for the government to give up its ill-fated defense of the Child Online Protection Act, which the ACLU has been fighting for nearly eight years now (Salon is one of a group of publishers that are plaintiffs represented by the ACLU).

I was surprised, really, because in the past the Journal has, let’s just say, been less than sympathetic to the cause. This editorial from 2004, for instance, viewed the online free speech argument as an object of contempt (“Larry Flynt…pretending he’s Thomas Paine”). What upset the Journal there was the prospect that the Supreme Court might end up more protective of adults’ right to free expression online, even on sexual topics, than of the rights of wealthy people to contribute unlimited sums to political campaigns.

That should have tipped me off to what might have swung the Journal over to the ACLU’s side in the COPA matter. It turns out that the Journal’s indifference to the Right to Free Speech is outweighed by its horror at the prospect of government interference with the Right to Do Business.

Specifically, when the government’s effort to save COPA spilled over into what the Journal rightly called a “fishing expedition” into Google’s log files, sparking a headline frenzy, the paper’s editorialists had enough: “If commandeering such data from private companies against their will is what it takes to defend the law,” the Journal wrote, “maybe defending it isn’t worth the effort.”

Indeed. Welcome to the team, WSJers! Next, can we interest you in some ACLU membership cards?

Filed Under: Business, Media, Technology

Odds and ends

March 27, 2006 by Scott Rosenberg

Cleaning out a reading backlog. Herewith some links, some going back months:

## Fascinating piece from the New York Times last week on the man who wrote the song that became “The Lion Sleeps Tonight”: It started out, in Solomon Linda’s 1939 recording, as “Mbube,” which is pronounced “EEM-boo-bay.” That, in Pete Seeger’s hands, became “Wimoweh.” Then songwriter George Weiss added the “Lion” lyrics. Linda got 10 shillings for the rights in 1952. He died poor in 1962. His family did recently get some money from Disney, which used the song in “The Lion King.” There are over 150 recordings of the song. One is by Brian Eno (I still own a 7-inch single of the 1975 recording, somewhere).

## Writer’s block or creative logjam? Now you don’t have to hunt for a collector’s item edition of Eno’s Oblique Strategies, a deck of cards offering cryptically helpful aphorisms as rut escape strategies. It’s all online. And it’s probably been there forever, but I only found it recently.

## This interview with Ray Ozzie from ACM Queue from a few months ago is a great read. It’s especially insightful about the disparity today between individuals and small businesses and large enterprises — like Microsoft, where Ozzie is now a CTO. Little guys are free to adapt to the newest and most flexible technologies; big enterprises find themselves hogtied not only by the money they’ve already spent on older technologies, but by fear and turf-wars and regulations that make it almost impossible for them to embrace openness and change. Choice quote:

  RSS is an extremely important standard. It’s the HTML of the next generation of the Web, or some people might refer to it as the Unix pipe of the Internet. It’s a way of channeling data from one application to another in very interesting and robust fashion. Again, I think it’s important as a technique far beyond just collaborative software.

(For the non-Unix geeks out there, a “Unix pipe” is a fast, simple way in that operating system to connect the output of one program to the input of another.)
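You can see that plumbing at any shell prompt. A minimal sketch, using made-up sample data — here `printf` plays the producer and `wc -l` the consumer:

```shell
# The `|` operator connects the stdout of the command on its left
# to the stdin of the command on its right, with no temporary file involved.
printf 'alpha\nbeta\ngamma\n' | wc -l
```

The two programs run concurrently: `wc` starts counting lines as soon as `printf` emits them, which is why pipes handle arbitrarily large streams — and why Ozzie’s analogy to RSS, which streams items from one application to another, is apt.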

## Sun CTO Greg Papadopoulos provides a crystal clear explanation of what Moore’s Law is and isn’t (it’s not about chips doubling in speed or halving in cost, it’s about doubling the number of transistors you can fit on a chip).

## Find yourself checking for new e-mail every five minutes? You might be a victim of continuous partial attention, but Rands in Repose has a slightly different take on the idea — he calls it Repetitive Information Injury. And a Discover column from Steven Johnson offers some novel ideas for new approaches to computer interfaces that are designed to help us focus more and multitask less when that’s what we want.

## Meanwhile, Paul Graham suggests that procrastination isn’t really a problem if you’re forsaking some dull work that you have to do in order to explore something you love. This advice is easier to act upon after you have sold your startup company, as Graham once did — those in need of a steady income may have greater trouble following his recommendations.

Filed Under: Culture, Media, Music, Technology

Rule Britannica?

March 25, 2006 by Scott Rosenberg

Yesterday’s Journal featured a front-page piece about Encyclopedia Britannica’s counteroffensive against Wikipedia, which apparently will kick in full force next week with big newspaper ads defending the old institution’s honor.

I’m not hugely interested in the Britannica argument about the methodology of a study published in Nature magazine that suggested the cooperatively produced, volunteer online Wikipedia had only a slightly higher error rate than the professional, costly encyclopedia. Defining “error” is a hopeless exercise in this field, and invites infinite angels-on-pinhead arguments.

The point isn’t that anyone would claim Wikipedia’s superiority today: Wikipedia leader Jimmy Wales admits in the piece that he was glad Nature focused on science articles, because Wikipedia is a lot weaker in the humanities and social sciences.

The point is that Wikipedia is just over five years old and, by opening itself to contributions and emendations from anyone anywhere, it has already arrived at a position where comparisons with Britannica don’t produce a laugh-off-the-stage reaction. The story here is about process, not snapshots in time. Wikipedia is on an improvement curve that, if it holds up, Britannica will never be able to match.

The big challenge for Wikipedia now is what the management gurus call “process improvement.” The Wikipedians need to keep figuring out ways to inoculate their work against trolls and defacers. We all need to grapple with the ethics and procedures of correcting information that we’re personally involved in (for instance, I once fixed a small factual error on the spotty Wikipedia page for Salon, then my journalism superego kicked in, and I thought, wait a minute, I shouldn’t be doing this, should I?). New crises and problems will keep arising for Wikipedia, like the Seigenthaler brouhaha last year.

No one argues that Wikipedia is perfect, and I don’t doubt that, for the moment, in the majority of areas, Britannica is more reliable. On the other hand, Wikipedia is free. And it keeps getting better. And it’s only a handful of years old. If I worked for Britannica, I think I’d be worried. But I wouldn’t waste my money on newspaper ads; instead, I’d be investing in research to figure out how a centuries-old institution should adapt to a new information-rich age.

BONUS LINK: My Salon colleague Farhad Manjoo has started a blog recording the odd bits of information he has gleaned from the Wikipedia trove.

Filed Under: Media, Technology

Windows Vista: no escape from software time

March 24, 2006 by Scott Rosenberg

Last September the Wall Street Journal ran a fascinating lead article about Microsoft’s Vista development effort. Robert Guth chronicled how the Vista project had initially ballooned as Bill Gates and others piled on their dream features, like the advanced, metadata-rich WinFS file system. When Vista hit trouble, Windows czar Jim Allchin brought in two software development experts, Brian Valentine and Amitabh Srivastava, to whip the project into shape by introducing rigorous new testing methodologies.

Still, by mid-2004 the whole project was in danger of collapsing. Microsoft decided to postpone Vista till “the second half of 2006” and cut back lots of promised features (including WinFS).

As Guth’s article had it, the result, finally, was a development process Microsoft could begin to be proud of:

  On July 27 [2005], Microsoft shipped the beta of Longhorn — now named Windows Vista — to 500,000 customers for testing. Experience had told the Windows team to expect tens of thousands of reported problems from customers. Instead, there were a couple thousand problem reports, says Mr. Rana, the team member.

When I read the article at the time, I took it as a kind of victory-lap valedictory for Allchin, who’d announced he was retiring once Vista was done. Unless you’re certain of prevailing, though, victory laps are dangerous (just think of the phrase “Mission Accomplished”). With this week’s news of another slip in the Vista schedule — the software won’t be out until January 2007, after the crucial holiday buying season — we’re left wondering: what happened to that vaunted new process?

Certainly, this widely linked story that claims Microsoft is now going to rewrite 60 percent of the operating system between now and release seems hard to credit (something tells me rewriting that much code would take a lot more than 8 months). But between this embarrassing delay and the recently announced “reorg” of Windows leadership, it’s clear that this turn of the Windows cycle is going to be no smoother or more predictable than any of its predecessors.

My book, Dreaming in Code, is all about what I call “software time” — the peculiar spell that software projects so often cast on the people involved, turning schedules into Mobius strips and stretching time like taffy. I imagine that, as Valentine and Srivastava described the beauty of their testing systems to Guth last year, they honestly believed that they’d meet their deadlines. They thought they’d cheated software time. That confidence doesn’t look too smart today.

UPDATE: Steve Gillmor wonders whether maybe there really is 60 percent of the Vista code that needs a rewrite — and much more. Adam Barr, on the other hand, offers some reasons why that notion might be far off-base.

Tags: Dreaming in Code, Microsoft, Windows, Vista

Filed Under: Dreaming in Code, Software, Technology

How news moves today

March 21, 2006 by Scott Rosenberg

Today we learned that Windows Vista has slipped, again, and that the new Microsoft operating system won’t be out till January 2007 — despite long-held promises of a 2006 release.

Every new edition of Windows has been late, so, you know, this is predictable news — on the order of “President Bush Declares He Will Stay The Course In Iraq.”

What I found interesting was the 2006-model way in which I discovered this news today. I first found out on Digg, the new-model tech-news aggregator that is rapidly replacing Slashdot on many geek bookmark lists. When I checked out Digg a little before 5 p.m. Pacific Time, the top story, or close to it, was a link to a trade publication’s short piece on the news. It took a couple more hours for the story to show up on Slashdot, which has its own editors picking stories, unlike Digg, which puts all its users to work.

And now, a couple more hours later, around 9 p.m. in California, we can read the canonical big-media piece in the New York Times. It’s fine, and it provides a broader perspective than the trades, as it should.

But once you’ve got the outline of the event clear, it’s far less interesting to hear the excuses of the Microsoft brass, as recited on conference call to the pros, than to read the breast-beating disgust of the anonymous Microsoft employee who blogs under the sobriquet MiniMicrosoft: “Vista 2007. Fire the leadership now!” (I don’t even read MiniMicrosoft regularly, but Dave Winer pointed to him, so I found him.)

This is just one little sequence relating to one little news event, but it’s illuminating. As tech news goes today, so ultimately will go the rest of the news. It’s not the death of newspapers or pro journalism, but it’s further evidence that the pros face an extremely tough challenge: they’re rarely going to be first, so they’d damn well better be good. But it’s hard to hire enough good people to be good at everything; a newsroom has only so many seats, and the Web’s supply of amateur experts, anonymous insiders and random kibitzers with an occasional insight is limitless. The pros had better prepare to be outgunned.

This competition will force journalists to stop being lazy and to find and reconnect with what is unique about their work, now that so much of what they used to do is being done for free, and often well, by amateurs. The best response, it seems to me, is what we have tried to do over the years at Salon: put more energy and resources and smart people into real investigative journalism, to find stories that just aren’t being covered elsewhere, and that are less likely to be produced by lone bloggers.

The next phase of the game beyond that, which we’re only beginning to figure out — but then so is everyone else — involves connecting that tradition of professional investigative journalism with the new dynamic of distributed information that the Net creates.

Filed Under: Blogging, Media, Technology
