Wordyard

Hand-forged posts since 2002


Chevron’s big pile

May 9, 2007 by Scott Rosenberg

The current favored information-overload coping mechanism is exemplified by Gmail: Don’t bother sorting or deleting. Storage space is cheap. Life is too short to take out the info-trash. Just let everything accumulate in one big pile and use tags and search tools to get what you need.

The “one big pile” method has the overwhelming appeal of liberating us from the role of digital janitor. (The principle lies at the heart of David Weinberger’s new book Everything is Miscellaneous — more on that soon, since I’m interviewing Weinberger for Salon.) But our opportunities to employ it remain limited. Gmail lets us treat our email as one big pile. Delicious lets us treat our bookmarks that way; Flickr, our photos.
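
The mechanics behind all three are simple enough to sketch in a few lines. Here's a toy illustration in Python (the items and tags are invented, purely to show the idea): no folders, no filing, just one pile of stuff that tags and a search function can slice any way you like.

```python
# A minimal sketch of "one big pile" retrieval: nothing gets filed into folders;
# every item just carries tags, and a search pulls out what you need.
# The data and field names here are hypothetical, for illustration only.
pile = [
    {"title": "Flight confirmation", "tags": {"travel", "receipts"}},
    {"title": "Draft chapter 3",     "tags": {"book", "writing"}},
    {"title": "Kauai photos link",   "tags": {"travel", "photos"}},
]

def find(pile, *wanted):
    """Return every item carrying all of the requested tags."""
    return [item for item in pile if set(wanted) <= item["tags"]]

print(find(pile, "travel"))            # both travel-related items
print(find(pile, "travel", "photos"))  # just the photos link
```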

But the biggest piles of all can be found on our hard disks. And they remain nearly impossible to treat in “big pile” mode. Google Desktop gives us an inkling, but its uses are limited. WinFS was supposed to transform the Windows file system into a Web 2.0-compliant, metadata-rich delight, but it’s vaporware. iTunes relieves us of managing our music files, but that’s just one corner of the personal-data universe.

And if it’s this bad for each of us as individuals, it’s way worse for big companies. Yesterday the Wall Street Journal featured a story by Pui-Wing Tam, titled “Cutting Files Down to Size,” about Chevron’s data-overload problem. The company’s store of office data is growing 60 percent a year; it’s got 1,250 terabytes today.
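
Those two figures imply a remarkable trajectory. A quick back-of-the-envelope projection (assuming, purely for illustration, that the 60 percent rate simply holds):

```python
# Rough projection of Chevron's office-data store using the Journal's figures:
# 1,250 terabytes today, growing 60 percent a year. Holding the growth rate
# constant is an assumption for illustration, not a claim from the article.
store_tb = 1250.0
for year in range(1, 6):
    store_tb *= 1.6
    print(f"Year {year}: ~{store_tb:,.0f} TB")
# By year 5 the pile is roughly 13,000 TB -- more than ten times today's store.
```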

The article paints an alarmingly rich picture of the company’s problem, but is not nearly so convincing about the solution. Chevron is trying to cut back on document overload by deploying Microsoft’s SharePoint, so that instead of multiplying email attachments, all the people who use a particular document can work off a single copy. That’s just fine, but it can’t begin to be enough. Stuff that’s tagged as lower priority, Chevron will begin deleting after 90 days. Its new plan “will require a team of 250 staffers and nearly two years.”

The Chevron exec in the article concludes by noting that “Half the battle will be changing people’s behavior.” Good luck. Asking people to do the clerical work of organizing their computer files is a losing battle. Better to try to deploy tools that help them do their work more easily — and maybe get the files organized as a side benefit along the way.
[tags]chevron, information overload, information management, data management, wall street journal[/tags]

Filed Under: Business, Software, Technology

Liberals yawn as Journal burns?

May 7, 2007 by Scott Rosenberg

Greg Sargent wonders why the liberal blogosphere isn’t squawking about Rupert Murdoch’s bid for the Journal. Obviously most of the liberal blogosphere hears “Journal” and thinks of the Whitewater-crazed loonies who spent most of a decade spinning Vincent Foster conspiracy theories. For anyone inclined to view the world through a partisan lens, the Journal’s editorial page has long overshadowed its higher-quality news coverage. Murdoch buying that gang? It’s “Shelob acquiring Barad-Dur, Inc.” Let them eat one another and spit out the bones!

But there’s also a sense in which the defense of the Journal’s “quality” newsroom is a rearguard action on behalf of a dying tradition. And many people who identify themselves as bloggers, whether on the left or the right, and whether they value the Journal’s great features (as I do) or not, may feel about that tradition the way they feel about the 19th-century novel or the Hollywood comedies of the ’30s. These things are grand, but, like it or not, their time has passed. A newsroom like the Journal’s will not and cannot exist a generation from now unless someone starts figuring out how to pay for it.

John Heilemann’s take in New York magazine is a little contrarian and well worth reading:

Did anybody at Dow Jones ever contemplate purchasing MySpace? Did Arthur Sulzberger or Don Graham? I don’t know, but I’d wager they didn’t even know what MySpace was. The obvious retort is, Why should they have? What does social networking have to do with journalism? And, no doubt, a precise answer is hard to conjure. But if you don’t believe that the intermingling of these spheres will be central to how future generations consume their news, you’ve apparently been sleeping—and clearly don’t have kids.

Not that Murdoch or his people have the future figured out. But they’re groping toward it with purpose and energy—which is more than you can say for Dow Jones. God knows Murdoch’s politics aren’t my brand of vodka. But you have to admire the way he’s been an unrelenting force for change and modernization in the media racket, the way he’s shaped and adapted to epic transformations of platforms and technologies. The problem with America’s newspaper-family dynasties is that, to a greater or lesser degree, they still believe they’re in the same business they were in 30 years ago. Murdoch doesn’t—and he knows, too, that newspapers can’t be any kind of public trust if the public sees them as yesterday’s news.

For those who think that the Bancroft family’s pride-of-ownership will save the Journal from Rupert’s clutches, I say, think again. Murdoch has offered a huge premium on the Dow Jones stock price. The Bancrofts’ control is apparently only a little over 50 percent. All Murdoch needs is one or two heirs or heiresses to say, “Wait a minute, this is good money, what are we thinking?” and the prize is his. I don’t think there are too many other people out there with the resources to pay such an inflated price or the desire to sink that much cash into what market analysts politely call a “sunset industry.”

Every year the Journal’s publisher seizes the op-ed page for a letter to readers, and every year this missive touts the publication’s “faith in the wisdom of markets.” What’s happening here is simple: the market is having its way with the Journal. The result may not be ideal for those of us who love 5000-word features, but it is surely a kind of ironic justice.
[tags]Wall Street Journal, Rupert Murdoch, journalism, dow jones[/tags]

Filed Under: Business, Media

Microhoo… Yacrosoft?

May 4, 2007 by Scott Rosenberg

This time the noises about a Microsoft acquisition of Yahoo sound more serious. We’re also in one of the financial markets’ combination-mad moments — these merger frenzies often arrive at a market peak.

Remember January 2000? We woke up one morning shortly after the millennium to discover that AOL was buying Time Warner. I wrote one of the few dissenting columns about this deal, arguing that both companies were acting out of fear, not vision. I got dragged onto CNN that afternoon — I think they had a hard time finding someone to trash the deal — and the hosts treated my skepticism with disdain. Who was this punk from an upstart Web site to be questioning the actions of titans like Gerald Levin and Steve Case?

We know how that one played out. Acquisitions at this scale virtually never lead to useful combinations, strategic synergies, or anything else of lasting value. They are financial engineering. What’s happening with this one is pretty simple: Microsoft and Yahoo have both found themselves at dead ends, but they both have formidable assets, and their leaderships are acting out of desperation. Microsoft can’t build a successful search engine, Yahoo can’t gain traction against Google, and each may think the other can solve its problems. In the event of a deal we will probably hear, as we did with Time Warner/AOL, that it’s a merger, not an acquisition, but don’t be fooled: Microsoft has the extra billions here.

Prediction: If Microsoft acquires Yahoo, the companies’ stock will initially prosper and the media will cheer on a new round of the War on Google. But seven years from now Yahoo will be as much of a shell as AOL is today. The talent will flee, the user base will stagnate, and Yahoo’s ability to innovate will wither under the weight of Microsoft bureaucracy and the pressure to serve Microsoft’s software interests.
[tags]microsoft, yahoo, mergers[/tags]

Filed Under: Business, Media, Technology

Murdoch’s Journal: Markets rule, indeed

May 1, 2007 by Scott Rosenberg

News that Rupert Murdoch has made a credible bid to acquire Dow Jones, which publishes the Wall Street Journal, has evoked two reactions: In newsrooms, among pro journalists and among devotees of investigative journalism, there is much rending of garments. The Journal is one of the world’s best news organizations. Its front-page features are often models of in-depth reporting. Dow Jones has maintained a reasonably good record of separating its news operation from its editorial page’s lunacy. Would a Murdoch-owned Journal let its editorial barbarians cross the great wall?

Outside of the journalistic fraternity, the prospect of a union between Murdoch and the Journal’s cartoon-conservative editorial page instead has many left-leaning readers either shrugging with indifference or indulging in a bit of schadenfreude. As one wag put it over on Andrew Leonard’s How the World Works blog, “Oy, it’s as if Shelob desired to acquire Barad-dur Industries, Inc.”

I count myself in both these groups. I would hate to see the Journal’s reasonably independent and often irreplaceable news coverage deteriorate; it is a central part of my daily information diet. But if the Journal’s grand newsroom tradition falls victim to a corporate acquisition, I can’t help feeling, also, that the fate is fitting. The Journal — its news pages as well as its editorial pages — is the daily bible of global capitalism, encompassing all of that term’s positives and negatives. It is a chronicle of the power of markets to reshape institutions. How could it expect to be exempt itself?

As for the rest of us, whether we embrace markets wholeheartedly or think they benefit from some fair rules and healthy counterweights, the prospect of a Murdoch-owned Journal — like the ongoing struggle for the New York Times’ corporate soul — is another reminder that, in the business world, good journalism has no uniquely protected status. It will flourish or perish as we find creative ways to support it. The old models are eroding. That’s not going to stop. The question is, how quickly can we find new ways to make sure that, whatever happens to the Wall Street Journal itself, someone somewhere is still able to provide Wall Street Journal-style coverage?
[tags]journalism, wall street journal, rupert murdoch[/tags]

Filed Under: Business, Media

Journalists’ “see no evil” stats

April 24, 2007 by Scott Rosenberg

Dave Winer writes:

A J-school prof at Cal told me that most reporters have absolutely no idea which of their stories people read or don’t read. They’re flying blind. I bet TV news people are too.

But wait, it’s even worse than it appears. Not only do most reporters have no idea which stories are read, many if not most don’t want to know.

The traditional view in journalism is that such knowledge is corrupting. If you know what’s popular and what isn’t, you will be driven by such knowledge to degrade your product. So the proverbial “Chinese wall” that’s supposed to segregate editorial decision-making from business influence has generally kept readership data out of the newsroom.

At a crude level, journalists fear that, the more granular the information about readership and popularity, the faster the suits will crank up celebrity gossip and defund serious coverage. The fallacy here: sorry, but the suits already know everything they need to know about the relative popularity of different kinds of content — it’s just the editorial people who are (often) in the dark.

And then there is a more sophisticated level: the idea that writers and editors themselves, free of any crude strong-arming from the business side but simply motivated by their own human need for attention, will find their judgment subtly but inexorably shaped by detailed usage stats.

The second concern is, I think, at least partly real, but I don’t lose sleep over it. From day one at Salon, when we were a half-dozen people in sublet space who could barely access our servers, we circulated traffic data to our editors; it simply blew our minds that we could. Over the years we took some heat for the practice, but I still think it makes sense. Ignorance is never a very good state for a journalist. Why choose blindness? Knowing where readers click doesn’t have to dictate your decisions — unless your decisions are poorly reasoned to begin with. In the soup out of which good coverage bubbles, traffic data should be one ingredient of many.

The real defense against what used to be called “page-view pandering” is strong, smart editors and writers with their own moral compasses. If you have them, then they deserve access to as much information as exists. If you don’t have them, then you’ve got bigger problems, and restricting access to your traffic stats won’t save you.
[tags]journalism, ethics[/tags]

Filed Under: Business, Media, Salon

Schmidt on scaling Google

April 17, 2007 by Scott Rosenberg

The first time I heard Eric Schmidt speak was in June 1995. I’d flown to Honolulu to cover the annual INET conference for the newspaper I then worked for. The Internet Society’s conclave was a sort of victory lap for the wizards and graybeards who’d designed the open network decades before and were finally witnessing its come-from-behind triumph over the proprietary online services. It was plain, at that point in time, that the Internet was going to be the foundation of future digital communications.

But it wasn’t necessarily clear how big it was going to get. In fact, at that event Schmidt predicted that the Internet would grow to 187 million hosts within 5 years. If I understand this chart at Netcraft properly, we actually reached that number only recently. (Netcraft tracks web hosts, so maybe I’m comparing apples and oranges).
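
For what it's worth, here is the arithmetic behind that prediction. The mid-1995 baseline of roughly 6.6 million Internet hosts is a figure I'm assuming for illustration; it is not a number from Schmidt's talk or from Netcraft.

```python
# Implied annual growth rate of Schmidt's 1995 prediction of 187 million hosts
# within five years. The ~6.6 million starting point is an assumed baseline,
# not a figure from his talk.
hosts_1995 = 6.6e6
predicted = 187e6
years = 5
implied_growth = (predicted / hosts_1995) ** (1 / years) - 1
print(f"Implied growth rate: {implied_growth:.0%} per year")  # roughly 95%
```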

I thought of this today at the Web 2.0 Expo, where Eric Schmidt, now Google’s CEO, talked on stage with John Battelle. (Dan Farber has a good summary.) He discussed Google’s new lightweight Web-based presentation app (the PowerPoint entry in Google’s app suite), the recent deal to acquire DoubleClick, Microsoft’s hilarious antitrust gripe about that deal, and Google’s commitment to letting its users pack up their data and take it elsewhere (a commitment that remains theoretical — not a simple thing to deliver, but if anyone has the brainpower and the resources to make it happen, Google does).

But what struck me was a more philosophical point near the end. Battelle asked Schmidt what he thinks about when he first wakes up in the morning (I suppose this is a variant of the old “what keeps you up at night”). After joshing about doing his e-mail, Schmidt launched into a discourse on what he worries about these days: “scaling.”

It surprised me to hear this, since Google has been so successful at keeping up with the demands on its infrastructure — successful at building it smartly, and at funding it, too. Schmidt was also, of course, talking about “scaling” the company itself.

“When the Internet really took off in the mid 90s, a lot of people talked about the scale, of how big it would be,” Schmidt said. It was obvious at the time there’d be a handful of defining Net companies, and each would need a “scaling strategy.”

Mostly, though, he was remarking on “how early we are in the scaling of the Internet” itself: “We’re just at the beginning of getting all the information that has been kept in small networks and groups onto these platforms.”

Tim O’Reilly made a similar point at the conference kick-off: In the era of Web-based computing, he said, we’re still at the VisiCalc stage.

Google famously defines its mission as “to organize the world’s information and make it universally accessible and useful.” But the work of getting the universe of individual and small-group knowledge onto the Net is something Google can only aid. Ultimately, this work belongs to the millions of bloggers and photographers and YouTubers and users of services yet to be imagined who provide the grist for Google’s algorithmic mills.

I find it bracing and helpful to recall all this at a show like the Web 2.0 Expo — which, while rewarding in many ways, gives off a lot of mid-to-late dotcom-bubble fumes. Froth will come and go. The vast project of building, and scaling, a global information network to absorb everything we can throw into it — that remains essential. And for all the impressive dimensions of Google, and the oodles of Wikipedia pages, and the zillions of blogs, we’ve only just begun to post.

[tags]google, eric schmidt, internet growth, web 2.0, web 2.0 expo[/tags]

Filed Under: Blogging, Business, Technology

COPA plaintiffs win, yet again

March 22, 2007 by Scott Rosenberg

Alberto Gonzales has bigger problems these days, but his Justice Department just lost the latest round in a longstanding Internet censorship conflict.

The Child Online Protection Act went on trial again in recent months, and today, again, a federal court has struck down the law — which would require commercial online publishers like Salon to verify that their readers are over 18, or else face criminal prosecution for publishing material that might be “harmful to minors.” Publishers are supposed to be able to protect themselves from prosecution by requiring site visitors to register with their credit cards, thus ostensibly demonstrating their adult status.

The law is supposedly aimed only at commercial pornographers, but it is absurdly vague. Somehow, publishers are supposed to trust the Justice Department to make the right call and understand who is a “bad” publisher and who isn’t. Placing such trust was problematic when the law was passed, under the Clinton administration; in the era of Bush justice, doing so would be utterly foolish.

Here’s the decision, which concludes that:

COPA facially violates the First and Fifth Amendment rights of the plaintiffs because: (1) COPA is not narrowly tailored to the compelling interest of Congress; (2) defendant has failed to meet his burden of showing that COPA is the least restrictive and most effective alternative in achieving the compelling interest; and (3) COPA is impermissibly vague and overbroad.

I am proud that Salon has been a plaintiff in this suit since 1998, when the ACLU first launched it. (Here’s my account of the oral arguments before the Supreme Court in an earlier phase of the COPA fight.) I have no idea whether, defeated at every turn, the Justice Department will drag this proceeding into another decade by appealing it. In the meantime, we can take another deep breath and be glad for the victory.

Here’s the AP story. And here’s a post by Salon editor Joan Walsh, who testified in this most recent round of the case. And here’s the ACLU’s page. And here’s CNET’s story.
[tags]copa, aclu, child online protection act, salon, internet censorship[/tags]

Filed Under: Business, Media, Politics, Salon

Viacom vs. YouTube: Misreading history

March 14, 2007 by Scott Rosenberg

I’m reading the otherwise perfectly reasonable New York Times piece on the Viacom/YouTube lawsuit and I encounter this bizarre misrepresentation of recent history:

“In the early 1990s music companies let Web companies build business models on the back of their copyright,” said Michael Nathanson, an analyst at Sanford C. Bernstein & Company. “I think the video industry is being more aggressive for the right reasons, to protect the future value of those assets.”

It’s hard to imagine how one could find more ways to be wrong on this topic.

First, there were no “Web companies” in the early 1990s; the first Web companies emerged in 1994-5 — and aside from some unusual efforts, like Michael Goldberg’s Addicted to Noise zine, there was not a lot of music happening on the Web. The MP3 revolution didn’t begin to roll until late 1997 or early 1998 (here is Andrew Leonard’s early report on the MP3 scene, which I edited).

More important, Mr. Nathanson has the history here precisely inverted. What happened in the Napster era was that music companies refused to allow Web companies to build business models on the back of their copyright. They decided that MP3s were all about piracy and they sued Napster out of existence. They refused to do deals with companies that wanted to distribute their music online, and in fact they failed to offer their music online in any way palatable to consumers until Steve Jobs whacked them on the side of the head — and even then they saddled his whole iTunes enterprise with a cumbersome “digital rights management” scheme that even he is now disowning.

The Viacom suit against YouTube does not represent a break with the way the music industry dealt with its rocky transition to the digital age; it is an instance of history repeating itself. The RIAA strategy of “sue your customers” may have succeeded in driving file-sharing underground, but it didn’t do anything to protect the profits of the music industry, which have been in a tailspin ever since. If the Viacom suit is an indication that the owners of TV shows and movies are going to pursue a similar strategy of I’d-rather-sue-than-deal, they may find themselves in a similar downward spiral.

Google has a pretty good case based on the safe-harbor provisions of the 1998 Digital Millennium Copyright Act. If Viacom fails to win against its corporate opponent, will it start suing all the Jon Stewart fans (and, possibly, the show’s own staff) who are uploading clips to YouTube?

If the TV and film industries look carefully at the music industry’s story, they will see that their danger lies not in being too soft on copyright infringers but rather in missing the tidal wave of a platform shift.
[tags]youtube, google, viacom, napster, drm[/tags]

Filed Under: Business, Culture, Technology

Software glitch leads to Dow conundrum

February 27, 2007 by Scott Rosenberg

I was sitting in a long news meeting this morning, laptop in front of me, checking every now and then to see how bad a drubbing the stock market was taking. One minute around noon, West Coast time, I saw that the Dow was down around 250; a few minutes later, somehow, it was down 500. I thought, “Whoa, was there another terrorist attack? Did Alan Greenspan say something? What happened?”

It turns out that what happened was some as yet undefined software problem. As this AP story describes it, the New York Stock Exchange’s systems were falling steadily farther behind all day — in other words, the actual drop in the market was already worse than it was being reported when we thought the Dow was down 250. When the market’s managers realized what was going on, they flipped a backup into place, and suddenly, the backlog cleared — leading to that huge plunge at 3 pm Eastern time.

What’s interesting to me, looking at that chart, is that once the drop became known to the market — once the backup system was in place and accurately reporting the deeper plunge — the market actually bounced back to where it thought it had been, even though that wasn’t really where it was. I’m not enough of a stock geek to fully understand this, but it’s fascinating, on some level of paradoxical reasoning.
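
Here's a toy sketch of the dynamic as I understand it (this is not the exchange's actual system, just an illustration): a reporting process that keeps falling farther behind a stream of price updates shows a stale number, and the instant a caught-up backup takes over, the displayed value snaps to where the market already was, which looks like a sudden plunge.

```python
# Toy model of a lagging ticker: the true index declines steadily all afternoon,
# but the primary reporting system processes updates at only half speed, so its
# backlog grows every minute. When the backup cuts over at minute 15, the
# displayed value snaps to reality at once -- which looks like a sudden plunge
# even though the decline had already happened.
true_index = [12600 - 15 * t for t in range(20)]   # made-up, steadily falling values

displayed = []
for minute in range(20):
    if minute < 15:
        displayed.append(true_index[minute // 2])  # lagging: ever further behind
    else:
        displayed.append(true_index[minute])       # backup: fully caught up

print("true     :", true_index[13:17])
print("displayed:", displayed[13:17])  # note the one-step "plunge" at the cutover
```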

Whoever said markets were perfect information systems?

UPDATE: Based on Wednesday-morning coverage, it sounds like the problem was specifically with Dow Jones’ systems, not the stock exchange’s systems generally.
[tags]stock market, dow, software, bugs[/tags]

Filed Under: Business, Software, Technology

Lessons from MySpace: Success is a bug

January 17, 2007 by Scott Rosenberg

Apropos of my previous post on YouTube and MySpace, today I read this fascinating case-study from Baseline magazine about the saga of MySpace’s understandably overtaxed systems.

MySpace’s exploding popularity has basically forced its infrastructure through a continuous cycle of upgrades, refactorings and revampings. Its managers have never had the luxury of sitting back and calmly planning upgrades; they’ve had to perform their engine surgeries on a careening vehicle.

This is what Web 2.0 is like from the back end, and it ain’t pretty. Outside of the real masters of this stuff — the Googles and Yahoos that know how to deploy, manage and maintain vast online services — it’s a big mess. This is another little-understood dynamic of the Web 2.0 startup world: There are financial reasons a successful small service might want to be acquired, but there are even more pressing operational reasons. And the more success a service finds, the more likely it’s going to risk systems flameout.

It’s not at all clear from the Baseline piece that MySpace has yet achieved a level of stability that a more mature company might desire. MySpace, of course, was acquired not by a technology company but by a media outfit, so — unlike other popular companies that were acquired by Yahoo or Google — it is still somewhat on its own.

The Baseline piece offers two other fascinating tidbits. In the first, a normal phenomenon for a successful site — massive surges of traffic — was interpreted as a bug by the Microsoft server platform MySpace uses:

Last summer, MySpace’s Windows 2003 servers shut down unexpectedly on multiple occasions. The culprit turned out to be a built-in feature of the operating system designed to prevent distributed denial of service attacks—a hacker tactic in which a Web site is subjected to so many connection requests from so many client computers that it crashes. MySpace is subject to those attacks just like many other top Web sites, but it defends against them at the network level rather than relying on this feature of Windows—which in this case was being triggered by hordes of legitimate connections from MySpace users.

“We were scratching our heads for about a month trying to figure out why our Windows 2003 servers kept shutting themselves off,” Benedetto says. Finally, with help from Microsoft, his team figured out how to tell the server to “ignore distributed denial of service; this is friendly fire.”
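
The Baseline piece doesn't spell out which Windows setting was firing, but the general failure mode is easy to sketch: any defense that trips on a raw connection count will, by design, also trip on a legitimate flash crowd. A hypothetical toy version (the threshold and traffic numbers are invented, and this is not the actual Windows logic):

```python
# Toy version of threshold-based denial-of-service protection misfiring on
# legitimate traffic. The threshold and traffic figures are invented; this is
# not how Windows implements it, just the shape of the problem described above.
THRESHOLD = 10_000  # connection attempts allowed per interval

def check_interval(connection_attempts: int) -> str:
    # A bare counter can't tell a SYN flood from a horde of real users.
    if connection_attempts > THRESHOLD:
        return "defensive shutdown triggered"
    return "serving traffic"

print(check_interval(800))     # quiet period: serving traffic
print(check_interval(50_000))  # MySpace-scale surge of real users: shutdown
```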

Second, it seems that MySpace didn’t actually originally intend to allow the level of customization that has made it so popular; its engineers just never got around to filtering out the user-customized formatting.

That feature was really “kind of a mistake,” says Duc Chau, one of the social networking site’s original developers. In other words, he neglected to write a routine that would strip Web coding tags from user postings, a standard feature on most Web sites that allow user contributions.

The Web site’s managers belatedly debated whether to continue allowing users to post code “because it was making the page load slow, making some pages look ugly, and exposing security holes,” recalls Jason Feffer, former MySpace vice president of operations. “Ultimately we said, users come first, and this is what they want. We decided to allow the users to do what they wanted to do, and we would deal with the headaches.”
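
For the curious, the routine Chau skipped is only a few lines in most languages. A minimal sketch in Python (a crude regex, for illustration only; a real site should reach for a proper sanitizer library):

```python
import re

# A bare-bones version of the tag-stripping routine MySpace never wrote:
# remove HTML/CSS markup from user-submitted text before it gets stored or
# rendered. A regex this crude is for illustration only.
TAG_RE = re.compile(r"<[^>]+>")

def strip_tags(user_post: str) -> str:
    return TAG_RE.sub("", user_post)

raw = 'My page rules! <style>body { background: url("flames.gif"); }</style>'
print(strip_tags(raw))
# -> My page rules! body { background: url("flames.gif"); }
```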

Here we have the state of Web development today: Your site’s massive success gets treated as a bug by your server; and the feature your users love best is something your programmers forgot to block.
[tags]baseline, myspace, web 2.0, software development[/tags]

Filed Under: Business, Dreaming in Code, Software
