Wordyard

Hand-forged posts since 2002


Viacom vs. YouTube: Misreading history

March 14, 2007 by Scott Rosenberg

I’m reading the otherwise perfectly reasonable New York Times piece on the Viacom/YouTube lawsuit and I encounter this bizarre misrepresentation of recent history:

“In the early 1990s music companies let Web companies build business models on the back of their copyright,” said Michael Nathanson, an analyst at Sanford C. Bernstein & Company. “I think the video industry is being more aggressive for the right reasons, to protect the future value of those assets.”

It’s hard to imagine how one could find more ways to be wrong on this topic.

First, there were no “Web companies” in the early 1990s; the first Web companies emerged in 1994-5 — and aside from some unusual efforts, like Michael Goldberg’s Addicted to Noise zine, there was not a lot of music happening on the Web. The MP3 revolution didn’t begin to roll until late 1997 or early 1998 (here is Andrew Leonard’s early report on the MP3 scene, which I edited).

More important, Mr. Nathanson has the history here precisely inverted. What happened in the Napster era was that music companies refused to allow Web companies to build business models on the back of their copyright. They decided that MP3s were all about piracy and they sued Napster out of existence. They refused to do deals with companies that wanted to distribute their music online, and in fact they failed to offer their music online in any way palatable to consumers until Steve Jobs whacked them on the side of the head — and even then they saddled his whole iTunes enterprise with a cumbersome “digital rights management” scheme that even he is now disowning.

The Viacom suit against YouTube does not represent a break with the way the music industry dealt with its rocky transition to the digital age; it is an instance of history repeating itself. The RIAA strategy of “sue your customers” may have succeeded in driving file-sharing underground, but it didn’t do anything to protect the profits of the music industry, which have been in a tailspin ever since. If the Viacom suit is an indication that the owners of TV shows and movies are going to pursue a similar strategy of I’d-rather-sue-than-deal, they may find themselves in a similar downward spiral.

Google has a pretty good case based on the safe harbor provision of the 1998 Digital Millennium Copyright Act. If Viacom fails to win against its corporate opponent, will it start suing all the Jon Stewart fans (and, possibly, the show’s own staff) who are uploading clips to YouTube?

If the TV and film industries look carefully at the music industry’s story, they will see that their danger lies not in being too soft on copyright infringers but rather in missing the tidal wave of a platform shift.

Filed Under: Business, Culture, Technology

Denise Caruso’s “Intervention”: What we don’t know can hurt us

March 11, 2007 by Scott Rosenberg

Biotech is not a field I’ve immersed myself in, and I have been — like, perhaps, many of you — content to place a simple boundary on my worries about its impact, on the assumption that smart and dedicated people were already deeply engaged in assessing and managing the risks we are taking in that area.

Then I read Denise Caruso’s eye-opening new book, Intervention, and realized that such complacency is a very bad bet. Intervention is a passionately argued, carefully documented critique of our society’s narrow approach to defining, and dismissing, the potential risks of biotech products.

I worked with Caruso many years ago at the San Francisco Examiner, and since then have followed her career as a technology pundit and more recently a nonprofit think-tank founder with admiration, mostly from afar. When I heard that she’d self-published her book after a publishing-house deal fell through, I set up an interview with her. It’s now live on Salon. Here’s a brief excerpt:

You spent years writing about the technology industry. How did this book come about?

It was sheerly out of reaction to meeting [molecular biologist] Roger Brent. He laughs when I say this, and I say it with all the love in my heart, but he’s one of the most macho scientists I’ve ever met in my life. His lineage — in academics, that means who your Ph.D. advisor was — is a guy named Mark Ptashne, whose Ph.D. advisor was James Watson. When I met Roger, his attitude was: What’s a nice girl like you doing being afraid of eating genetically modified food? Don’t you know that you could eat 10 kilos of Bt potatoes [Bacillus thuringiensis is used to modify crops transgenically for insect resistance], and nothing would happen to you?

I didn’t know that much about biology. But when he said that, I said, “I don’t think you actually know that to be true. I don’t know how you could know that to be true.” And we went back and forth on it, and he finally conceded — which I was really surprised about. He said, “So how do we protect the public, but not stop science from progressing at the same time?”

Filed Under: Books, Science, Technology

Leaking entities

March 8, 2007 by Scott Rosenberg

Today’s software is built up in layers, like sedimentary rock that has been accumulating over many generations. In Dreaming in Code I wrote about how sometimes the lower layers poke up through the surface, like angled strata of rock, disturbing the placid interface surface. (This was merely a metaphoric restatement of Joel Spolsky’s Law of Leaky Abstractions.)

Examples of this are all over the place. This morning, for instance, I went to have a look at the new beta of My Yahoo (here’s the TechCrunch report). And here’s what I saw at the top of the page:

Notice the center button, labeled with a literal “&nbsp;”. This is, of course, the HTML code or “entity” for the non-breaking space, and it is rearing its ugly little head onto the shiny new AJAX-y fresh My Yahoo screen.

Presumably some designer or developer entered that data long ago, maybe long before anyone ever thought it would end up labeling a button in this environment. Or maybe it was coded consciously that way with the expectation that the HTML data, including the non-breaking space code, would be transformed by the My Yahoo application in such a way that each layer would understand that it was looking at an HTML entity and handle it properly. However it happened, the bug exposes a layer of the software you were never supposed to see.
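One common mechanism for this kind of leak (sketched here in Python, and purely a guess at what happened inside My Yahoo) is double escaping: one layer escapes a string for HTML output without knowing that the string already contains markup, so the browser displays the entity code literally instead of rendering it.

```python
import html

# A button label stored as HTML, non-breaking space entity included.
label = "My&nbsp;Yahoo"

# A lower layer treats the string as plain text and escapes it for
# output, unaware that it is already HTML. The ampersand gets encoded.
rendered = html.escape(label)

print(rendered)  # My&amp;nbsp;Yahoo -- the browser now shows "&nbsp;" literally
```

The fix is for each layer to agree on whether the strings it passes along are raw text or already-encoded HTML; the bug appears precisely where that agreement breaks down.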

It’s a tiny bug, to be sure, on the first day of a public beta. It will probably be gone soon. But such “entity” codes have made their way often enough over the years onto Salon’s home page — so I find it a little reassuring that these things happen even to the experienced and well-staffed team at Yahoo.

Filed Under: Dreaming in Code, Software, Technology

Software glitch leads to Dow conundrum

February 27, 2007 by Scott Rosenberg

I was sitting in a long news meeting this morning, laptop in front of me, checking every now and then to see how bad a drubbing the stock market was taking. One minute around noon, West Coast time, I saw that the Dow was down around 250; a few minutes later, somehow, it was down 500. I thought, “Whoa, was there another terrorist attack? Did Alan Greenspan say something? What happened?”

It turns out that what happened was some as yet undefined software problem. As this AP story describes it, the New York Stock Exchange’s systems were falling steadily farther behind all day — in other words, the actual drop in the market was already worse than it was being reported when we thought the Dow was down 250. When the market’s managers realized what was going on, they flipped a backup into place, and suddenly, the backlog cleared — leading to that huge plunge at 3 pm Eastern time.

What’s interesting to me, looking at that chart, is that once the drop became known to the market — once the backup system was in place and accurately reporting the deeper plummet — the market actually bounced back to where it thought it had been, even though that wasn’t really where it was. I’m not enough of a stock geek to fully understand this, but it’s fascinating, on some level of paradoxical reasoning.

Whoever said markets were perfect information systems?

UPDATE: Based on Wednesday AM coverage it sounds like the problem was specifically with Dow Jones’ systems, not the general stock exchange systems.

Filed Under: Business, Software, Technology

Stealth fighter trips over dateline

February 26, 2007 by Scott Rosenberg

Back when my job as Salon managing editor involved overseeing our daily production, I noticed that, every spring and fall, almost without fail, our publishing system would experience a glitch of some kind on the weekend that the clocks got moved forward or back — nothing serious, mind you, but enough to throw a wrench in the works of our site updates. It wasn’t a single bug, but some sequence of related bugs, so we’d fix one and then six months later something else would happen. Eventually we got in the habit of just making sure that one of the developers kept a close eye on things when that weekend rolled around. It was prudent.

I thought of that as I read these accounts that are filtering out about the F-22 Raptors that, the speculation is, lost their bearings when they crossed the International Date Line. (Further speculation is that this was somehow connected to a software patch/upgrade related to the coming change in the date of Daylight Saving Time onset, but that’s harder to source.) The planes, en route to Japan, limped back to Hawaii instead.

The F-22 costs $125 million or so and its operating system is written in 1.5 million lines of Ada code. It appears that, for all its “stealth” prowess and advanced weaponry, its soft underbelly may lie in the realm of the abstract.

It seems that this is one of the unexpected consequences of living in a world operated by software: new danger zones lie where human abstractions — borders, measurements, languages — change or conflict or fail to behave as expected. Clocks and calendars and maps are no longer just assists for human understanding; they are symbols at the heart of systems upon whose performance lives depend. I suppose this started with the first railway schedule, but with the dateline-addled F-22 it has entered a whole new realm of disconcert.
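The discontinuity itself is easy to demonstrate. In this Python sketch (which has nothing to do with the F-22’s actual Ada software), a single instant in time falls on two different calendar days depending on which side of the date line you observe it from — exactly the sort of assumption-breaker that trips up code expecting the local date to change smoothly:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One instant in time, viewed from either side of the International Date Line.
instant = datetime(2007, 2, 11, 0, 30, tzinfo=timezone.utc)

hawaii = instant.astimezone(ZoneInfo("Pacific/Honolulu"))  # UTC-10
japan = instant.astimezone(ZoneInfo("Asia/Tokyo"))         # UTC+9

print(hawaii.date())  # 2007-02-10
print(japan.date())   # 2007-02-11 -- the same moment, one calendar day apart
```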

Filed Under: Software, Technology

Teraflop software?

February 21, 2007 by Scott Rosenberg

Of the many “laws” I encountered in the course of writing Dreaming in Code, I think Wirth’s law (by the software pioneer Niklaus Wirth) is my favorite: Software gets slower faster than hardware gets faster.

Here is a contemporary instance. All right, it’s not exactly parallel; but it’s an example of the very common situation we encounter as hardware improves exponentially while software improves only linearly.

This is from John Markoff’s recent piece about Intel’s demo of a prototype of a new chip-making technique that packs 80 processor cores on a single chip (the “Teraflop Chip”):

The shift toward systems with hundreds or even thousands of computing cores is both an opportunity and a potential crisis, computer scientists said, because no one has proved how to program such chips for many applications.

“If we can figure out how to program thousands of cores on a chip, the future looks rosy,” said David A. Patterson, a University of California, Berkeley computer scientist who is a co-author of one of the standard textbooks on microprocessor design. “If we can’t figure it out, then things look dark.”

Mr. Patterson is one of a group of Berkeley computer scientists who recently issued a challenge to the chip industry, demanding that companies like Intel begin designing processors with thousands of cores per chip.

In a white paper published last December, the scientists said that without a software breakthrough to take advantage of hundreds of cores, the industry, which is now pursuing a more incremental approach of increasing the number of cores on a computer chip, is likely to hit a wall of diminishing returns — where adding more cores does not offer a significant increase in performance.
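The diminishing-returns wall the Berkeley researchers describe is essentially Amdahl’s law (my gloss, not necessarily the white paper’s framing): whatever fraction of a program must run serially caps the speedup, no matter how many cores you throw at the rest. A quick Python illustration:

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Theoretical speedup on `cores` cores when only `parallel_fraction`
    of the work can run in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With just 5% of the work stuck in serial code, more cores quickly stop paying off:
for n in (2, 8, 80, 1000):
    print(n, round(amdahl_speedup(n, 0.95), 1))
```

With 95 percent of the work parallelizable, 1,000 cores deliver less than a 20x speedup; the serial 5 percent sets a hard ceiling of 1/0.05 = 20. Hence the demand for a software breakthrough, not just more silicon.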

I wrote about this “multicore competency” issue a couple of years ago. Looks like it’s not going away.

UPDATE: Corrected to fix a (happily) mistaken suggestion that Wirth had passed away.

Filed Under: Dreaming in Code, Software, Technology

Vista’s successor, Longhorn deja vu?

February 10, 2007 by Scott Rosenberg

Word, in the form of this Infoworld report, is beginning to trickle out from Redmond about the next Microsoft operating-system cycle. In two-and-a-half years or so, Microsoft expects to unveil Vista’s successor (code-named, apparently, “Vienna” — or maybe something else).

The report provides eerie echoes of the early days of Longhorn, as Vista was originally known. At the start of the Longhorn process in 2002 the promise was a similar two-and-a-half-year delivery (see for instance this now amusing report from the WinHEC hardware conference in 2002, headlined “Longhorn slips to Late 2004”). In summer of 2003 Bill Gates and other Microsoft spokespeople began telling us about all the cool stuff Longhorn would provide. At the same time, Microsoft buckled down for what would turn out to be a year-long, all-hands-on-deck effort to tighten the security holes in Windows XP; this process resulted in XP’s “Service Pack 2” release in 2004. In 2004 Microsoft realized that the original Longhorn vision was hopelessly out of reach, and it now admits that it “rebooted” the entire development process at this point.

So that’s when Microsoft exec Ben Fathi starts the clock in looking at how long it really takes Microsoft to prepare a new edition of Windows:

Vista shipped about two-and-a-half years after XP SP 2, and Vista’s follow-up is expected to take about the same amount of time, according to Fathi. “You can think roughly two, two-and-a-half years is a reasonable time frame that our partners can depend on and can work with,” he said. “That’s a good timeframe for refresh.”

All well and good. Only when Fathi starts talking about what new stuff Vienna will have to offer users, it all sounds remarkably like what Microsoft had to say in Longhorn’s early days:

So what will be the coolest new feature in Vienna? According to Fathi, that’s still being worked out. “We’re going to look at a fundamental piece of enabling technology. Maybe it’s hypervisors, I don’t know what it is,” he said. “Maybe it’s a new user interface paradigm for consumers.”

Over on Engadget they’re interpreting this talk to mean stuff like “full virtualization and a radical new user interface” and “a break in compatibility with older applications.”

But to me it all sounds like: “We’re going to do big, big things, but we don’t know exactly what they are yet.” And that is precisely what Longhorn’s leaders were saying as they marched their troops down the roads that would swallow the project’s first 2-3 years.

Lesson learned?

Filed Under: Software, Technology

Good reads

January 12, 2007 by Scott Rosenberg

I’ve got a little link backlog. Let’s do something about it!

  • Earlier this week Jay Rosen wrote a remarkable essay about the recent kerfuffle in the right-wing blogosphere over charges that AP reporters in Iraq had made up a source. The excitable warbloggers, understandably dejected at losing the battle both on the ground and with the American public, grew excited at the thought of MSM blood. But it turned out the entire charge was bogus — the source was real.

    Rosen parses the motives and suggests that the warblog crowd would have done their cause a favor by being more critical of the Bush administration’s reality-evasion from the start:

    For Bush supporters who soldier on, the choices resemble what the go-getters from Enron faced: confront the bad accounting that’s gone on for years or adopt even more desperate measures to conceal losses and keep your hand alive. But if the AP had fabricated a source and relied on that source 60 times, maybe the tables could be turned again and the reckoning put off….

    If you really wanted Bush to succeed in Iraq, and you noticed that he could never be wrong or accept that bad news bearers could be right, this was a warning sign that the warbloggers themselves, as friends of the president’s project, should have taken the lead in discussing. Why didn’t they?

    The children of Agnew have been fully on his side, soldiers in his struggle, happy warriors with Bush because they believe in their red state bones the press is biased against them. Like him they also disbelieve the bad news on principle, and then find someone more loyal to look into it.

  • Michelle Goldberg’s recent Salon interview with Chris Hedges on fundamentalism in America and his new book, American Fascists, is also a great read: One passionate reporter who’s immersed in a fascinating subject interviewing another, equally obsessed.
  • Finally — this one’s a month old, but I’m just catching up — Clive Thompson’s New York Times magazine piece on open source spying. Can wikis and blogs really help the intelligence establishment do a better job assessing terrorist threats? It seems outlandish, but it grows on you the more you think about it (and read Thompson’s explanations).

    This passage rang my Dreaming in Code bell:

    The blog seemed like an awfully modest thing to me. But Meyerrose insists that the future of spying will be revolutionized as much by these small-bore projects as by billion-dollar high-tech systems. Indeed, he says that overly ambitious projects often result in expensive disasters, the way the F.B.I.’s $170 million attempt to overhaul its case-handling software died in 2005 after the software became so complex that the F.B.I. despaired of ever fixing the bugs and shelved it. In contrast, the blog software took only a day or two to get running. “We need to think big, start small and scale fast,” Meyerrose said.

    One of the big problems the agencies have, even with their closed networks, is persuading intelligence officers to share information. On the one hand, their desire to protect sources is understandable; on the other, the information doesn’t do the U.S. any good unless it gets circulated to people who can assess its significance.

    Is this the sort of information that is safe to share widely in an online network? Many in the intelligence agencies suspect not. Indeed, they often refuse to input sensitive intel into their own private, secure databases; they do not trust even their own colleagues, inside their own agencies, to keep their secrets safe. When the F.B.I. unveiled an automated case-support system in 1995, agents were supposed to begin entering all information from their continuing cases into it, so that other F.B.I. agents could benefit from the collected pool of tips. But many agents didn’t. They worried that a hard-won source might be accidentally exposed by an F.B.I. agent halfway across the country. Worse, what would happen if a hacker or criminal found access to the system?


Filed Under: Media, Politics, Technology

iPhone: the interface’s the thing

January 10, 2007 by Scott Rosenberg

The Wall Street Journal asks whether people will buy Apple’s slick new iPhone for $500 to $600. Of course they will — if it’s as good and as easy to use as it looked in Steve Jobs’s presentation. (Here’s some coverage: David Pogue’s test-drive; John Markoff’s story; Lev Grossman in Time; Farhad Manjoo in Salon.)

The original iPod came in at a similar price point and pundits asked similar questions. The value of Apple’s innovation pretty much obliterated the price sensitivity of the market, and by the time the early-adopters all had their iPods and Apple started going after a wider market, it was able to bring the price down some (and add more value by continuing to improve the product).

No, the question about the iPhone isn’t, “Will people pay for it?” It’s simply, “Can it really be as easy as Jobs made it look?”

Mobile-device interfaces are such a total disaster today that many of us simply never learn to use more than a fraction of their features — and even when we learn them, we tend to forget them immediately. Phones have become so disposable anyway, why waste your time learning all their dumb menus? Blackberries and Treos are considerably better, but they’re still full of compromises, and they typically do a lot less than the iPhone — which in effect is a tiny Macintosh optimized for phone and music functions.

If the iPhone interface is as intuitive as Jobs promised, then people will line up to get it regardless of its hefty price. It will have succeeded, to paraphrase Alan Kay’s famous utterance about the original Mac, in being the first cellphone interface good enough to be worth criticizing.

Filed Under: Business, Technology

An interview and a profile

January 8, 2007 by Scott Rosenberg

A few weeks ago I had the pleasure of being interviewed by Ed Cone about Dreaming in Code. I first met Cone years ago when he was organizing the panel I spoke on at the first BloggerCon. I’ve always enjoyed his work; like me, he’s someone who is equally interested in politics and technology, and blogs about both of them.

The Cone interview is now up at CIO Insight. It was fun to talk about the issues in the book for a relatively expert readership, where I could skip over some of the basics and jump right to the harder questions. Cone did a great job of drawing me out and then trimming the verbal excess from my responses.

CIO Insight: Are we just being impatient with a branch of knowledge that is still fairly new? Or is there something inherent to software development that makes it so weird and vexing?

Rosenberg: You get one perspective that says, hey, we now have a computer on every desk that does things that were unimaginable 20 years ago, and they’re all connected in this network that gives us instant answers and instant connections. These are miraculous things. And then you find other people who say, you know what? We’re still writing code basically by picking out characters one at a time, we still have programs that are laid low when a single bug creeps in, we still have projects that take ten times longer than they should, we need to rethink everything from the ground up.

I don’t have an answer between them. My personal temperament is more towards the optimistic. In the end, what you’ve got is this industry that’s been conditioned by Moore’s Law, and by its own fantastic financial success, to assume that the curve is always an upward curve, that everything gets better at an exponential pace. That’s the experience of the technology industry. You have that smacking up against the reality of human experience, of creativity, of people working in teams. We have these basic human factors, psychology, the limits of the conceptual capacity of the human brain—things that do not move at an exponential pace. They simply don’t. They tend to move linearly, if they are improving at all. People in the technology industry are loath to accept that.

This theme is also at the heart of another piece that occupied me for a considerable part of the fall — a profile of Charles Simonyi that is on the cover of the new issue of Technology Review. I covered Simonyi and his Intentional Software project just a little bit in Dreaming in Code, and I’m grateful to Jason Pontin at TR for giving me the chance to look at him, and it, more fully.

The first part of the profile, “Anything You Can Do, I Can Do Meta,” is up at the TR Web site now; the second part is slated to go up tomorrow. Since the piece was written as one integral whole, you might want to wait till you can read it all at once — I’ll post the link. It was fun to be writing for print again, and Technology Review is looking very spiffy these days, so this is one that you just might be better off reading on paper.

Filed Under: Dreaming in Code, Media, Personal, Software, Technology
