Wordyard

Hand-forged posts since 2002


Internet garbage dump? What Weizenbaum really said

March 26, 2008 by Scott Rosenberg

Joseph Weizenbaum — creator of the Eliza chatterbot and author of “Computer Power and Human Reason” — passed away recently. Running through all the obits was a quote that seemed to summarize this computing pioneer’s critical perspective on technology:

The Internet is like one of those garbage dumps outside of Bombay. There are people, most unfortunately, crawling all over it, and maybe they find a bit of aluminum, or perhaps something they can sell. But mainly it’s garbage.

This line appeared in outlets from the Wall Street Journal to Valleywag. Caught my eye, too.

The original quotation was in a New York Times article from 1999. But it’s not the whole story.

Weizenbaum wrote a letter to the Times after the article appeared:

I did say that, but I went on to say, “There are gold mines and pearls in there that a person trained to design good questions can find.”

Amazing what a little context can do!

Interestingly, although Weizenbaum’s critique of computing centers on the limitations of algorithmic problem-solving, it was Google’s pattern-matching prowess that unearthed this connection for me. I’d never have found it on my own.

Weizenbaum, whose family fled Nazi Germany in the ’30s, was right to urge us not to discount the value of the human compass in navigating our lives, and not to abdicate our judgment to machines. But I think he might have been a little too ready to dismiss the ability of machines to help us find informational gold and contextual pearls.

Filed Under: Media, People, Technology

Disk — raw or cooked

March 25, 2008 by Scott Rosenberg

I returned from my travels, sat down at the desk yesterday morning, fired up my email program, and — ffftt!!! — encountered one of those awful Windows system messages (something to the effect that the system had been unable to write back to the disk and it was very sorry but some data had been lost) that can mean only one thing: hardware failure. I took a breath, ran the disk utility, and got another message informing me that there was insufficient disk space to fix the bad clusters, which made no sense, since there was plenty of empty space on the drive — unless, like, massive gigabytes’-worth of clusters had gone south.

The bad news was, this was the big 500 GB drive I use to store basically everything. (Another drive is for the system and applications.) The good news is, I’ve become religious about backups; I have copies of all my important data, including online backups, though they’re scattered about.

I can recall many hours spent hostage to CHKDSK back in the late ’80s, fixing my own or friends’ wayward drives, watching patiently as the little grid filled out with marks for good and bad clusters. This was when disks held 10 or 20 megabytes. Then in the mid-’90s I wrestled with bad Mac disks on the cruddy machines Salon was trying to make do with. But since then, either the quality’s gotten a lot better or I’ve had a good run of luck. This is the first disk disaster I’ve had in a decade.

I took another breath and began poking at the drive with various tools and utilities. Sometimes the system could see it, sometimes it couldn’t. Then, at the lowest ebb of my fight, my system told me that there was zero KB of data in zero files on the drive, and that the drive, in fact, could not be examined at all because “the type of the file system is RAW.”

Raw? As in, not cooked? By now I was doing all my research on the old laptop. Google told me that this error message indicates that a drive is close to unformatted — in, as it were, the state of nature. Only I knew there was tons of data on the disk, and it hadn’t been erased, and it didn’t seem like the disk had massively failed (a la “head crash”).

A couple of hours digging through the tech forums led me to a fine utility called GetDataBack. Several hours later, this program had painstakingly reassembled the file-system tables of my drive and enabled me to ferry my precious information back to safety in (roughly) one piece rather than having to reassemble it from patchwork backups.

This morning, I purchased and installed a new drive. Spent the rest of the day restoring files. Now I’m back. Phew. Only two days lost.

My motherboard fried in the middle of my labors on Dreaming in Code. So maybe I get one technical disaster per book project, and the one for my new book is now behind me. (I notice that the previous disaster also occurred after a week away. I wonder whether it’s the temperature change of sitting unpowered after weeks of constant use that’s stressing the gear.)

If you’ve read this far, you don’t need me to remind you of the moral of this tale: you must have a backup plan.
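What a backup plan actually requires is small: something that regularly copies anything new or changed to a second disk, and never deletes from the copy. As a minimal, hypothetical sketch of that idea — the paths and function names here are my own invention, and a real setup would add versioning and offsite copies on top of it:

```python
import filecmp
import shutil
from pathlib import Path

def mirror(source: Path, backup: Path) -> list[str]:
    """Copy new or changed files from source into backup.

    Returns the relative paths that were copied. This is a bare-bones
    one-way mirror: it never deletes from the backup, so a file that
    vanishes from the source (say, to a dying disk) survives in the copy.
    """
    copied = []
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source)
        dst = backup / rel
        if dst.exists() and filecmp.cmp(src, dst, shallow=False):
            continue  # unchanged since the last run
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves timestamps
        copied.append(str(rel))
    return copied
```

Run it on a schedule against a second drive and an online store, and the worst a RAW-filesystem morning can cost you is whatever changed since the last pass.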

Filed Under: Personal, Technology

Code mining

February 26, 2008 by Scott Rosenberg

I wrote Dreaming in Code because I believed that, as Bjarne Stroustrup says, “our civilization is built on software.” I noticed that creating software remains stubbornly difficult in certain ways, and, despite its centrality to our civilization, our understanding of that difficulty remains deficient. But I also wanted to create a journalistic record of the day-to-day experience of the software developer at the start of the 21st century — to tell a story about the act of programming itself.

I’m grateful that a good number of the book’s readers who’ve posted their thoughts feel that I achieved that goal. Others don’t think I did, and some days I agree with the criticism. Writing about the act of programming itself is as difficult as writing about any act of writing: the subject is an essentially interior process between the mind and the page (or screen), and it’s highly resistant to illumination.

Consider the difference when the topic of an essay is a rough physical act — like, say, digging coal out of the ground. I read a lot of George Orwell early in my career but I’d forgotten this passage, which Brad DeLong hoisted into the light of blog last year:

Our civilization, pace Chesterton, is founded on coal, more completely than one realizes until one stops to think about it. The machines that keep us alive, and the machines that make machines, are all directly or indirectly dependent upon coal. In the metabolism of the Western world the coal-miner is second in importance only to the man who ploughs the soil. He is a sort of caryatid upon whose shoulders nearly everything that is not grimy is supported. For this reason the actual process by which coal is extracted is well worth watching, if you get the chance and are willing to take the trouble.

When you go down a coal-mine it is important to try and get to the coal face when the ‘fillers’ are at work. This is not easy, because when the mine is working visitors are a nuisance and are not encouraged, but if you go at any other time, it is possible to come away with a totally wrong impression. On a Sunday, for instance, a mine seems almost peaceful. The time to go there is when the machines are roaring and the air is black with coal dust, and when you can actually see what the miners have to do. At those times the place is like hell, or at any rate like my own mental picture of hell. Most of the things one imagines in hell are there — heat, noise, confusion, darkness, foul air, and, above all, unbearably cramped space. Everything except the fire, for there is no fire down there except the feeble beams of Davy lamps and electric torches which scarcely penetrate the clouds of coal dust.

It should go without saying that a wide gulf separates the strenuous and perilous experience of the “grimy” miners Orwell depicted from the abstract, cerebral work of programming. The parallels are less obvious — but they jump out at me, too.

Both activities are essential to industry and highly profitable to those at the top of the economic pyramid they support. Both require the exploitation of long hours put in by young workers. In each, the treasures society values are struggled for in dim places and retrieved into the daylight after obscure labors to which their beneficiaries are oblivious.

Writing about a software project couldn’t have been more physically different from descending into a mine. But there were times, during my three years of research, when I felt like I was in an underground labyrinth, hunting for nuggets of insight in the dark.

It was, in any case, striking to find the “Our civilization is founded on…” construction that kicks off Dreaming in Code in Orwell’s penetrating lead. I’ve been trying to trace the Chesterton passage Orwell refers to as his antecedent, but so far no luck. Anyone have a clue?

Filed Under: Dreaming in Code, Software, Technology

Word processing, then and now

February 25, 2008 by Scott Rosenberg

Clive Thompson writes about how software can shape our creative work:

Our tools, of course, affect our literary output. And all this made me wonder how other writing tools affect what’s written. I use Movable Type to write my blog, and I’m constantly annoyed by how small the text-entry boxes are. Whenever I write an entry, the text quickly flows down several box-lengths, which can make it hard to keep track of my argument. The problem, of course, is that the tool was designed with the idea that people would be writing extremely short, pithy entries … whereas my entries tend to drag on and on and on. It reminds me of the writing on one of those old, proprietary-hardware word-processors from the 80s, which were outfitted with screens that only let you see seven lines at a time.

WordPress lets you set the posting box to any size you want. But for longer posts, I compose in a text editor. It’s just handier. I have no doubt, though, that browser-based editing will eventually evolve to the point where I don’t need to do that.

Thompson also references Virginia Heffernan’s recent Times piece on word processors, which recommends the Zen-like blank-screen approach of the Mac-based WriteRoom. (Of course, the dominant DOS-based word processor, WordPerfect, offered what was very close to a blank screen; in a pre-Windows world, you didn’t have a browser or e-mail always competing for screen real estate.)

For those of us who learned BASIC on a Zenith Z19 and started word processing on a Kaypro (anyone?), the retro green-and-black now takes the breath away. It’s not just the vintage features available in WriteRoom; it’s also that the whole experience is a throwback to a time before user-friendly interfaces came to protect us from technology’s dark places. In those days, the mystery of the human mind and the mystery of computation seemed both to illuminate and to deepen each other.

All of which brings back involuntary, wincing memories of one of my earliest word-processing experiences, at the Boston Phoenix. In the early ’80s the Phoenix had some ancient minicomputer sitting in a back room, feeding the newsroom’s small and much-fought-over handful of dumb terminals. When I say dumb, I mean really dumb. In a limitation that is inconceivable today, these terminals had so little memory that they could only handle a few hundred words at a time. Most Phoenix reviews were way longer than that, yet many of us composed directly on this system (who could afford a PC of their own on what that alternative paper paid its writers?).

To compose a lengthy piece you had to write a chunk (a “buffer”), then save it — sending it on a leisurely journey back to main memory — to make room on the terminal for the next installment of your opus. Unfortunately, these terminals also had a habit of crashing. Too often you’d press that “send” key only to see the screen freeze, and you’d know then that you’d just lost everything stretching back to the last time you’d saved. Worse, sometimes pressing “save” would itself trigger the dreaded freeze — a tragic Catch-22 indeed.

As a result — in a tableau that somehow seemed to epitomize all the pain of human composition in a technological age — you might occasionally spy some desperate writer hunched over notebook and pen in front of a frozen screen, painstakingly copying the slim remnant of his verbiage that was still visible, rescuing some fragment of inspiration before the inevitable reboot wiped the words clean.

Filed Under: Media, Personal, Technology

Microsoft plus Yahoo? The sum is less than the parts

February 1, 2008 by Scott Rosenberg

I don’t know whether Microsoft will win its unsolicited takeover offer for Yahoo (AP story here, on Yahoo) — the legal and financial road for such hostile bids is always unpredictable. But I do know this: it is a path to failure for both companies.

The business press is going to go into paroxysms over this move: it combines the frisson of a big tech-industry acquisition story with the raw testosterone of a hostile-takeover battle saga. In newsrooms everywhere this morning, you can practically hear the salivation. Don’t get distracted. These big takeovers — AOL/Time Warner was the biggest — are always about failure in the present and fear of the future. And they nearly always end badly.

To understand what the takeover would mean for Yahoo, just look at the fate of the previous company to end up in this circumstance. When Netscape, then a dominant portal site and purveyor of a declining but still widely used Web browser, got bought by AOL a decade ago, we heard all the usual pieties about the strength of the brand and the value of its franchise. But AOL’s acquisition of Netscape meant its doom: the remaining talent headed for the exits, and its assets were quickly cannibalized. AOL itself entered another disastrous merger a couple of years later, and today it is a shadow of its former importance — while Netscape isn’t even a phantom.

Similarly, if Microsoft wins Yahoo, you will see most of Yahoo’s smart people depart, and its customers gradually parceled out in an attempt to bolster Microsoft’s ever-faltering efforts to build an online business. Much of the talk in the business press surrounding this deal will be about Yahoo’s ad business, and it’s true that Microsoft will find it useful, but it’s hard to see what new power a combined Microsoft and Yahoo business will have to challenge Google that the two companies didn’t have as separate entities.

For Microsoft, this move is a final admission of the utter failure of the company’s effort to build an online business for itself over the past decade — in services, advertising or content. Winning Yahoo would surely bolster Microsoft in this area in the short term. But in the long term, these efforts at lashing together two failures in hopes of sparking a success have never prospered. For Google, the target of Redmond’s chess move, there is really no danger here. Google today needs to worry about the drag on its stock from the broader market troubles, and the drain on its brainpower by the lure of new startups. Microhoo is hardly a threat.

UPDATE: Kevin Kelleher has an amusing take over at GigaOm:

[Ballmer] finally called Yahoo on the Oz-like illusion it’s been fostering for a couple of years: “You had a year. You lost. All your base are belong to us.”…

Yahoo has been admirably laissez-faire with Flickr and del.icio.us. Will they be preserved or folded into services we’ve all eschewed? How will Yahoo mail accounts be reconciled with Hotmail accounts? Will those of us who use Yahoo Finance and all its features adapt to MSN Finance? What is MSN Finance?

A 62 percent premium, hmmm — we Yahoo users have a new choice: Learn to love life under Ballmer, or migrate to Google.

Not hard to guess where that choice will fall…

Filed Under: Business, Technology

Some Gibson, then a break

January 24, 2008 by Scott Rosenberg

We’re leaving tomorrow on a brief mid-winter getaway, so I may be absent from these precincts for a handful of days. Before I go, two passages worth savoring from Andrew Leonard’s recent interview with William Gibson in Rolling Stone:

How does it break down for you? Are you optimistic or pessimistic about the future?

I find myself less pessimistic than I sometimes imagine I should be. When I started to write science fiction, the intelligent and informed position on humanity’s future was that it wasn’t going to have one at all. We’ve forgotten that a whole lot of smart people used to wake up every day thinking that that day could well be the day the world ended. So when I started writing what people saw as this grisly dystopian, punky science fiction, I actually felt that I was being wildly optimistic: “Hey, look — you do have a future. It’s kind of harsh, but here it is.” I wasn’t going the post-apocalyptic route, which, as a regular civilian walking around the world, was pretty much what I expected to happen myself.

Also:

The very first time I picked up a Sony Walkman, I knew it was a killer thing, that the world was changing right then and there. A year later, no one could imagine what it was like when you couldn’t move around surrounded by a cloud of stereophonic music of your own choosing. That was huge! That was as big as the Internet!

Filed Under: Culture, Food for Thought, Personal, Technology

My review of Carr’s “Big Switch”

January 23, 2008 by Scott Rosenberg

I return to the pages of Salon tonight with a full review of Nick Carr’s new book, “The Big Switch”:

“The Big Switch” falls neatly into two halves. The first, which I can enthusiastically recommend, draws an elegant and illuminating parallel between the late-19th-century electrification of America and today’s computing world. In the less persuasive latter section, Carr surveys the Internet’s transformations of our world, and questions whether we should welcome them. His questions are good ones; indeed, any treatment of this subject that failed to explore them couldn’t be taken seriously. But in his eagerness to discredit “techno-utopian dreamers” and expound a theory of the Internet as a technology of control, Carr fast-forwards to dour conclusions that his slender argument can’t possibly support.

I had a variety of quarrels with Carr’s book (here’s the official site), but it’s most certainly an important contribution to today’s debate about the Web’s cultural sway. I remain more of an optimist than the author, but he presents the darker view with more heft, more care and more credibility than many others attempting to make this case (like Andrew Keen and Lee Siegel).

One of the points I didn’t cover in my Salon piece was the great comparison Carr makes between the “millwork” of Victorian-era factories and the complex custom software products today’s developers build for contemporary information factories. Millwork meant elaborate, Rube-Goldberg-like devices, unique to each location, designed to transfer the power from some source like a water-wheel to the factory’s machinery. Once electricity came along things got simpler, but each factory still ran its own plant — until the electrical grid rendered that whole approach obsolete.

In one of the best parts of his book, Carr argues, pretty definitively, that today’s custom software work is destined to disappear, as the old millwork did, once the Web-based software-as-service grid really takes off. I think Carr may discount a little too readily the difficulty of building effective and reliable Web-based services; even after you outsource your infrastructure and “mash up” your tools and so on, this stuff doesn’t happen by itself — somebody’s got to write the code to put it all together, and somebody’s got to fix it when it stops working. But Carr is plainly right that much of what we’ve taken for granted as the stuff of corporate information management is about to go up in smoke.

In an amusing coincidence, I was listening today to the first of Mitch Kapor’s lectures about “disruptive innovation” — the one in which he talks about the early days of the PC and his role as one of its most spectacularly successful software entrepreneurs. Kapor tells a hilarious tale (if you get the audio file, it starts around the 59:00 mark) of being summoned in 1983 to visit the office of Ken Olsen, founder of the Digital Equipment Corporation. Kapor’s company, Lotus, and the new IBM PC its products run on, are beginning to worry the minicomputer industry establishment. So Olsen sends a helicopter out to whisk the young upstart to Digital’s HQ. Olsen proceeds to deliver a mad rant to Kapor. What is he so miffed about? The flimsy construction of the IBM PC’s case!

It’s a curious instance of fiddling in the face of an inferno. But the detail that stuck with me was Kapor’s mention that Digital’s headquarters, in Maynard, Massachusetts, occupied a grand old mill building.

Filed Under: Business, Culture, Technology

When Nintendo cartridge meets spin cycle

January 14, 2008 by Scott Rosenberg

I am accustomed to, and accommodated to, the fragility of our electronic gadgets. At best, they are built to have a fighting chance of surviving a few knocks. I have used Thinkpads until their plastic cases began to disintegrate, and I have an unusually durable cellphone — an antediluvian model with a black-and-white screen. But in general, our PDAs, iPods, cameras and all other manner of digital gewgaw are prone to failure given the slightest abuse. And we accept this as the nature of contemporary stuff: cheap to make, quick to fail, cheap to replace — and your replacement will be faster, cooler, more capacious.

So when my son Jack reported, with a downcast face, that he had failed to remove three Nintendo DS cartridges from the pocket of a pair of pants that had just passed through the washing machine, I figured, oops — there goes $100 worth of ROM chips. I knew Nintendo did a great job of protecting its hardware from the depredations of its puerile customer base; how many times had I seen Game Boys survive impacts that would have totaled any laptop? Yet I had no hope for the laundered cartridges.

“Maybe they still work!” my son proposed, with the look of a gambler willing to bet on a long shot, knowing full well he faced brutal odds. I just pursed my lips and thought, “Dream on.”

I fished the pants out of the washer and located the cartridges — turned out to be two, not three. They seemed remarkably dry, yet I had no hope of their survival. This micro-finery of silicon and contacts, marinated in Tide and then roughed up by wash, rinse and spin? No way, Mario and Luigi.

I handed the cartridges to Jack and left the room, torn between urges to console my son and to chastise him.

A moment later, I heard: “YESSS! It works!” Sonic Rush had survived. So, we learned a moment later, had Pokemon.

Somehow, Nintendo had managed to manufacture a game cartridge that could take a licking from an eight-year-old boy — and his family’s household appliances — and keep on clicking.

To such engineering prowess, one can only bow.

Filed Under: Personal, Technology

Audio compression: sound and lack of vision

December 31, 2007 by Scott Rosenberg

I wrote earlier this year about the controversy over the level of compression in contemporary recordings — how it flattens out sound, fatigues the ears and makes music all sound the same. In Rolling Stone Rob Levine has now produced the definitive piece on the subject. It’s worth a read.

The most depressing part is the discussion of the remastering of old recordings to fit this new norm (apparently the new Led Zeppelin collection is a case of that).

My gold standard for rock recordings is the set of records (my older brother’s) that I first heard through my father’s KLH, lying on the living-room floor in the late ’60s: the White Album and “Abbey Road,” “Tommy,” the Kinks’ “Arthur.” Normally I’d be delighted to hear of new remasterings of such albums — but now I’ll think twice before buying them. Make the Arctic Monkeys sound monotonous if that’s what they want — but don’t ransack music history!

At the end of Levine’s piece, this passage struck an ironic note:

Bendeth and other producers worry that young listeners have grown so used to dynamically compressed music and the thin sound of MP3s that the battle has already been lost. “CDs sound better, but no one’s buying them,” he says. “The age of the audiophile is over.”

What’s funny is that the people who consider themselves real audiophiles — who read The Absolute Sound and invest in tube amplifiers — sneer at CDs as limited and thin (they rely on sampling, unlike analog recordings). Of course, these are typically classical listeners; for popular music, even CD-quality is now endangered.
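The “flattening” at issue can be sketched in a toy example. This is my own illustration of the general idea behind a peak compressor — not any real mastering chain, and the threshold and ratio values are arbitrary assumptions — showing how squashing the loud moments shrinks the peak-to-average contrast that gives a recording its dynamics:

```python
import math

def compress(samples, threshold=0.5, ratio=4.0):
    """Toy peak compressor: above the threshold, excess level is
    divided by `ratio`. Real limiters work on short windows with
    attack/release curves; this per-sample version just shows the
    flattening effect."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            mag = threshold + (mag - threshold) / ratio
        out.append(math.copysign(mag, s))
    return out

def crest_factor(samples):
    """Peak divided by RMS -- a rough measure of dynamic range."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return peak / rms

# A quiet passage punctuated by a few loud peaks...
wave = [0.1 * math.sin(i / 5) for i in range(200)] + [0.9, -0.95, 1.0]
# ...has noticeably less peak-to-average contrast after compression.
squashed = compress(wave)
```

Run through `crest_factor`, the compressed version scores lower than the original: the loud moments are closer to the average level, which is exactly the “everything sounds the same” complaint.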

Filed Under: Culture, Technology

Fool for a CTO

December 18, 2007 by Scott Rosenberg

This past summer I paid a happy visit to the Motley Fool — in downtown Alexandria, just across the river from D.C. proper — to meet the technical team there and give a talk on Dreaming in Code.

I was pretty impressed with the people I met and the lively atmosphere at the company — unpretentious but serious about the important stuff. Anyway, the Fool is now looking for a new CTO. I know from experience that that’s a tough position to fill, but maybe one of you reading this is interested — or knows someone who’d be. More info here.

Filed Under: Business, People, Technology
