Wordyard

Hand-forged posts since 2002


Howard Rheingold — call for questions

May 1, 2007 by Scott Rosenberg

I’ve been doing some advising for Jay Rosen’s NewAssignment.Net “citizen journalism” lab and its Assignment Zero project — an experiment in harnessing the work of a distributed group of volunteers to explore the complex questions surrounding harnessing the work of a distributed group of volunteers.

Recursive? You bet. But interesting enough that I want to participate — which I’m doing by taking on one modest assignment for the project.

Next week I’ll be interviewing Howard Rheingold as my contribution to Assignment Zero. I interviewed Howard way back in January of 1994, about his then-recent book The Virtual Community. In those days people were using the phrase “information superhighway” without (too much) irony. The Virtual Community described a looming decision point in the development of the online world. From my piece:

In particular, what’s up to us is whether the network turns out to be an open public space, like a town square or a civic forum, or a commercial enclosure, like a mall. To analogize, and doubtless oversimplify, the question is whether the network emerges as something like a souped-up telephone that we can all communicate with (known as the “many-to-many” model) or something like a jazzed-up cable TV (“one-to-many”) that provides us with more choices but not more power.

And Rheingold emphasizes that it’s up to us right now — during a brief window of opportunity, as the government bargains with the telephone companies, cable TV networks and other corporations to lay down new rules for the new roads.

We know how that turned out — then: the Internet trounced its “walled garden” rivals and became the global standard for electronic communication. Is that conflict a closed issue, or will we keep facing it in new forms? I’ll be following up with Howard about this and more.

NewAssignment.Net aims to channel “many-to-many” energies in its own way, so if you have topics you think we should explore, questions you want me to pose to Howard, or information you think is relevant to our talk, please post over at Assignment Zero (or right here, if you like!).

Filed Under: Blogging, Media, Technology

Getting Moore’s Law right

April 30, 2007 by Scott Rosenberg

Everyone knows that Moore’s Law says chips will double in speed and/or halve in price every 18 months — right? But as with so many things that everyone knows, this is at best a wild oversimplification, and really just wrong. As Gordon Moore originally expressed his famous idea about the exponential progress of the semiconductor industry, the notion was that our ability to cram transistors into the same on-chip real estate would double at a regular interval (over time he tinkered with the length of that interval: one year in his original 1965 paper, two years in his 1975 revision).

I was delighted recently to come across this piece in ExtremeTech — in which two analysts from Gartner discuss whether Moore’s Law has been, on balance, a blessing or a curse for the computer industry — because, among other things, the article gives us all one more reminder about the actual meaning of the concept.

This isn’t splitting hairs. The distinction is important because, if all Moore really said was that you were going to be able to make denser and denser chips, that left the industry with an existential choice of whether to use that capability to drive prices down or performance up. Moore’s principle did not map a straight upward path for his industry; instead, it laid out a crossroads, or rather a succession of crossroads, one after another.
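
To make the crossroads concrete, here’s a toy back-of-the-envelope sketch (my own, in Python, with invented starting figures — not anything from Moore or the article) of the two ways the industry can spend each doubling of density:

    # Moore's actual claim: transistor density doubles each generation.
    # The industry then chooses how to spend that doubling: on cheaper
    # chips or on more capable ones. (Starting figures are invented.)
    transistors = 1000000   # transistors per chip at generation zero
    cost = 100.0            # dollars per chip at generation zero

    for generation in range(1, 6):
        density_factor = 2 ** generation

        # Choice A: hold capability flat and let the price collapse.
        price_down = cost / density_factor

        # Choice B: hold the price flat and pack in twice as much stuff.
        capability_up = transistors * density_factor

        print(f"gen {generation}: the old chip for ${price_down:.2f}, "
              f"or a ${cost:.0f} chip with {capability_up:,} transistors")

Run it out a few generations and you can see why the “cheap” path bottoms out, as the Gartner analysts note below: a chip that costs a dime isn’t a business.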

The ExtremeTech piece is worth reading for its discussion of these choices of how to expend the Moore’s Law “bounty” (to borrow Charles Simonyi’s fine word for it).

Claunch, one of the Gartner analysts, added: “If we were to make a PC run at the same speed as the original 8086 PCs, they’d probably cost about 10 cents to make. But nobody could afford to be in the business of selling PCs at 10 cents each. So instead, we have to use a different strategy. That different strategy is this: Pump twice as much stuff into the box, and if you do that, you can at least hold your price flat.”

But mostly I’m just grateful for the article’s refusal to oversimplify. Over the years I have had any number of arguments with editors so eager to apply Occam’s Razor and explain Moore’s Law “so anyone can understand it” that they were willing to be inaccurate. (I worked hard to get it right in Dreaming in Code, which was largely about why software has had such a hard time keeping up with hardware — and thus why, even though we have chips unfathomably faster than those we used two decades ago, we rarely get our work done much faster.)

Sometimes, in my more cynical moments, I think that if there were a Moore’s Law for journalism, it would go like this: Every couple of years, our collective ability to maintain subtle distinctions and fine gradations of meaning collapses by half.

Filed Under: Technology

Everyone needs help with the new system

April 25, 2007 by Scott Rosenberg

Recently you may have found yourself watching the amusing video known variously as “Medieval Help Desk” and “Introducing the Book,” in which a befuddled monk seeks tech-support help with the newfangled text-delivery platform called the book. (“I ‘turn the page’?”)

First we laugh at the missteps and worries of the monastic protagonist, who fears he’ll “lose text” if he turns the page; then we realize the joke cuts both ways, and that the monk’s trials are no different from our own struggles with unfamiliar new interfaces. Sooner or later, we’re all newbies in relation to something, and our confusion will be laughed at by those in the future (perhaps ourselves) for whom the novelty we once scratched our heads over has become second nature.

I thought about that video as I read Jon Udell’s recent post titled Online Incunabula. I’d always thought “incunabulum” meant anything in embryonic form, but Udell explained that the word has a more specific meaning: it applies to books printed before 1501, in the earliest days of printing, when the conventions of book publishing hadn’t yet coalesced into a set of common practices. Udell’s post refers to a podcast interview with Geoffrey Bilder, an executive with CrossRef, which runs a citation-linking system for scholarly publishing online. Udell excerpts this passage from Bilder:

People were clearly uncomfortable moving from manuscripts to printed books. They’d print these books, and then they’d decorate them by hand. They’d add red capitals to the beginnings of paragraphs, and illuminate the margins, because they didn’t entirely trust this printed thing. It somehow felt of less quality, less formal, less official, less authoritative. And here we are, trying to make our online stuff more like printed stuff. This is the incunabula of the digital age that we’re creating at the moment. And it’s going to change.

So much of the apparatus that we take for granted when we look at a book — the table of contents, page numbers, running heads, footnotes — that wasn’t common currency. It got developed. Page numbers didn’t make much sense if there was only one edition of something. This kind of stuff got developed and adopted over a fairly long period of time.

If you treat Vannevar Bush as Gutenberg, we haven’t even gotten to Martin Luther yet, we haven’t even gotten to 1525. In fact, whereas people stopped trying to decorate manuscripts by 1501, we’re still trying to replicate print online. So in some ways they were way ahead of us in building new mechanisms for communicating, and new apparatus for the stuff they were dealing with.

I love this quote’s reminder of how early in the game we still are online. One of the things I’ve always valued about blogs is that their features — reverse chronology, permalinked posts, time-stamps, comments and so forth — represent the first bundle of conventions truly native to the online medium. The format evolved to meet the unique needs of a publishing environment in which anything can be changed at any time and yet everything ought to have a permanent address. (This is a point that both Rebecca Blood and I have been making for a long time now.)

It helps to think that what we’ve been doing here on the Web for several years is slowly, by trial and error, inventing the online equivalents of “the apparatus that we take for granted when we look at a book.” And we’ve only just begun.

Filed Under: Blogging, Technology

Perfect iPod moments

April 24, 2007 by Scott Rosenberg

Steven Levy’s book about the iPod, The Perfect Thing, describes a transcendent moment the author experiences: In a funk one day in post-9/11 New York, with his iPod in shuffle mode, Levy hears the glorious opening chimes of the Byrds’ version of “My Back Pages,” and he has a Perfect Moment.

I don’t know about you, but I’ve always loved that song, and would rather not wait for shuffle mode to surface it from my thousands of other songs. I continue to hand-pick my music, relying on shuffle only occasionally for novelty or distraction.

Still, iPod-fueled transcendence remains available even to us control freaks. This morning, for instance, I relieved a BART commute’s tedium by listening to the splendid live recording a fan made of a memorable Mountain Goats show I attended last month. (It’s posted here at the Internet Archive.) The set opens pensively with the ruminations of “Wild Sage,” makes its way to the equally melancholy “Get Lonely,” and then bursts into “Quito” — a defiant anthem of aspiring redemption and half-glimpsed rebirth. The song reached its visionary climax at the precise instant my train emerged from the tunnel into the morning Bay Area sun. Perfection! A film-editing wizard couldn’t have synced sound and vision any better. I beamed; it made my morning.

It’s been a quarter century since the Walkman introduced us to the notion of provisioning our daily wanderings with a soundtrack of our choice. The iPod kicks this dynamic into a higher gear. (Levy ponders this and much else in his book; I covered his talk in Berkeley here.)

I’d argue that those of us who are not as shuffle-happy as Levy can feel a bit of extra pride: By virtue of our active personal DJ-ing, we become, instead of passive observers of serendipitous moments, more like coauthors of our own pleasurable juxtapositions. But either way, we’re having fun, and that’s what really matters.

Filed Under: Culture, Music, Personal, Technology

Apple and Brooks’s Law

April 22, 2007 by Scott Rosenberg

Apple recently announced that it had to delay the release of the next version of Mac OS X, Leopard, by a few months — too many developers had to be tossed into the effort of getting the new iPhone out the door in time for its June release. Commenting on the delay, Paul Kedrosky wrote, “Guess what? People apparently just rediscovered that writing software is hard.”

In researching Dreaming in Code, I spent years compiling examples of people making that rediscovery. I’m less obsessive about it these days, but stories like this one still cause a little alarm to ring in my brain. They tend to come in clumps: recently there was the BlackBerry blackout, caused by a buggy software upgrade, and the Mars Global Surveyor, given up for lost in January, which, the LA Times recently reported, was doomed by a cascading failure that started with a single bad command.

Kedrosky suggests that Apple may have a Brooks’s Law-style problem on its hands, if the company has tried to speed up a late iPhone software schedule by redeploying legions of OS X developers onto the project. If that’s the case, we’d likely see even further slippage from the iPhone project, which would then cause further delays for Leopard.

This is the sort of thing that always seemed to happen at Apple in the early and mid-’90s, and has rarely happened in Steve Jobs Era II. I write “rarely,” not “never,” because I recall this saga of “a Mythical Man Month disaster” on the Aperture team. If the tale is accurate, Apple threw 130 developers at a till-then-20-person team, with predictably painful results. We’ll maintain a Brooks’s Law Watch on Apple as the news continues to unfold.
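
For anyone who hasn’t read The Mythical Man-Month: the arithmetic behind Brooks’s Law is that adding people to a team grows the headcount linearly but grows the possible lines of communication quadratically, as n(n-1)/2. A quick sketch (mine, in Python, using the Aperture numbers above):

    # Brooks's observation: n people have n*(n-1)/2 possible pairwise
    # communication channels, so coordination cost grows much faster
    # than headcount does.
    def channels(n):
        return n * (n - 1) // 2

    before, added = 20, 130  # the Aperture team, per the account above

    print(channels(before))          # 190 channels among 20 people
    print(channels(before + added))  # 11175 channels among 150 people

Twenty people can almost keep each other in the loop; a hundred and fifty mostly can’t, which is why the new arrivals eat up the veterans’ time instead of adding their own.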

UPDATE: Welcome, Daring Fireball and Reddit readers! And to respond to one consistent criticism: Sure, iPhone isn’t late yet, but Apple is explicitly saying it needed to add more developers to the project to meet its original deadline. If that all works out dandy, then the Brooks’s Law alarm will turn out to have been unwarranted. Most likely, given Apple’s discipline, the company will ship iPhone, with its software, when it says it will. What we won’t and can’t know is whether, and if so how much, the shipping product has been scaled back. And sure, of course this is all conjecture. Conjecture is what we have, given Apple’s locked-down secrecy.

Filed Under: Dreaming in Code, Software, Technology

Michael Wesch’s “Machine” video

April 20, 2007 by Scott Rosenberg

Before the opening talk at the Web 2.0 Expo earlier this week, the conference organizers played Michael Wesch’s video-ode to the participatory Web, “The Machine is Us/ing Us”. Given the insider-y nature of the crowd, I have to assume that most of the attendees had already seen it — it had rocketed to blogospheric celebrity in early February. But I didn’t realize the guy who made the video, a professor of cultural anthropology from Kansas State University, was at the conference.

On Tuesday afternoon I stumbled upon his talk in the hallway (on a tip from my neighbor Tim Bishop); it was part of the free, informal “Web2Open” parallel conference. Across the hall, a hubbub made it hard to hear Wesch — the Justin.tv people had set up camp there and needed to be asked to pipe down.

Wesch turns out to be a rare combination of ingenuous Web enthusiast and smart cultural critic. In my experience, the cultural critics are usually pickled in cynicism and the Web enthusiasts are often blinded to their technology’s drawbacks. Maybe the discipline of cultural anthropology has helped Wesch maintain some balance; or maybe his sheer distance from Silicon Valley-mania — whether in the flatlands of Kansas or the mountains of Papua New Guinea — has helped him find a fresh perspective.

The came-out-of-nowhere saga of Wesch’s video neatly mirrors its message about the bottom-up nature of the Web. Wesch originally made the video, he explained, because he was writing a paper about Web 2.0 for anthropologists, trying to explain how new Web tools can transform the academic conversation. He created it “on the fly” using low-end tools. Its grammar, with its write-then-delete-and-rewrite rhythms, emerged as he made goofs and fixed them: “The mistakes were real, at first. Then I thought they were cool, and started to plan them.” The music was a track by a musician from the Ivory Coast that he found via Creative Commons. (Once the video became a hit, Wesch says, he got a moving e-mail from the musician, who said that he’d been about to give up his dreams of a life in music, but was now finding new opportunities thanks to the attention the video was sending his way.)

The video’s viral success took Wesch by surprise. He’d forwarded it to some colleagues in the IT department to make sure that he hadn’t erred in his definition of XML. They sent it around. It took a week to go ballistic.

At one point someone in the small audience asked Wesch a question about his field research in Papua New Guinea. He paused for a second, then said, “There’s about a two-hour lecture there, I’m not sure I can compress that into a five-minute answer — should I try?” I couldn’t help myself; I blurted, “Hey, you did the entire history of the Web in four minutes — go ahead!”

Filed Under: Blogging, Events, Technology

Schmidt on scaling Google

April 17, 2007 by Scott Rosenberg

The first time I heard Eric Schmidt speak was in June 1995. I’d flown to Honolulu to cover the annual INET conference for the newspaper where I then worked. The Internet Society’s conclave was a sort of victory lap for the wizards and graybeards who’d designed the open network decades before and were finally witnessing its come-from-behind triumph over the proprietary online services. It was plain by then that the Internet was going to be the foundation of future digital communications.

But it wasn’t necessarily clear how big it was going to get. In fact, at that event Schmidt predicted that the Internet would grow to 187 million hosts within five years. If I understand this chart at Netcraft properly, we actually reached that number only recently. (Netcraft tracks web hosts, so maybe I’m comparing apples and oranges.)
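
For what it’s worth, a quick back-of-the-envelope check (mine, in Python; the mid-1995 baseline of roughly 6.6 million hosts is my assumption from the old Internet Domain Survey, not a figure from Schmidt’s talk) shows just how aggressive that prediction was:

    # Growing from ~6.6 million hosts in mid-1995 to 187 million in
    # five years implies the host count doubling roughly every year.
    import math

    start, target, years = 6.6e6, 187e6, 5
    doublings = math.log2(target / start)
    print(f"implied doubling time: {years * 12 / doublings:.1f} months")

That works out to a doubling roughly every twelve and a half months, sustained for five years straight.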

I thought of this today at the Web 2.0 Expo, where Eric Schmidt, now Google’s CEO, talked on stage with John Battelle. (Dan Farber has a good summary.) He discussed Google’s new lightweight Web-based presentation app (the PowerPoint entry in Google’s app suite), the recent deal to acquire DoubleClick and Microsoft’s hilarious antitrust gripe about it, and Google’s commitment to letting its users pack up their data and take it elsewhere (a commitment that remains theoretical — not a simple thing to deliver, but if anyone has the brainpower to make it happen, Google does).

But what struck me was a more philosophical point near the end. Battelle asked Schmidt what he thinks about when he first wakes up in the morning (I suppose this is a variant of the old “what keeps you up at night”). After joshing about doing his e-mail, Schmidt launched into a discourse on what he worries about these days: “scaling.”

It surprised me to hear this, since Google has been so successful at keeping up with the demands on its infrastructure — successful at building it smartly, and at funding it, too. Schmidt was also, of course, talking about “scaling” the company itself.

“When the Internet really took off in the mid 90s, a lot of people talked about the scale, of how big it would be,” Schmidt said. It was obvious at the time there’d be a handful of defining Net companies, and each would need a “scaling strategy.”

Mostly, though, he was remarking on “how early we are in the scaling of the Internet” itself: “We’re just at the beginning of getting all the information that has been kept in small networks and groups onto these platforms.”

Tim O’Reilly made a similar point at the conference kick-off: In the era of Web-based computing, he said, we’re still at the VisiCalc stage.

Google famously defines its mission as “to organize the world’s information and make it universally accessible and useful.” But the work of getting the universe of individual and small-group knowledge onto the Net is something Google can only aid. Ultimately, this work belongs to the millions of bloggers and photographers and YouTubers and users of services yet to be imagined who provide the grist for Google’s algorithmic mills.

I find it bracing and helpful to recall all this at a show like the Web 2.0 Expo — which, while rewarding in many ways, gives off a lot of mid-to-late dotcom-bubble fumes. Froth will come and go. The vast project of building, and scaling, a global information network to absorb everything we can throw into it — that remains essential. And for all the impressive dimensions of Google, and the oodles of Wikipedia pages, and the zillions of blogs, we’ve only just begun to post.


Filed Under: Blogging, Business, Technology

Toward a Mac migration

April 15, 2007 by Scott Rosenberg

There are still three barriers standing between me and moving onto a Mac. Two are rapidly disappearing. (I was a Mac guy for years and shifted to a PC in the mid-’90s during Apple’s slump years, when the unreliability of the Mac OS and Mac hardware had me losing more data than I could stand.)

One is the availability of a true lightweight Apple laptop. Rumor has it that’s coming; it’s time for a Mac laptop that is slim, elegant and about three pounds, like the IBM/Lenovo X-series laptops I’ve been using forever. I’m sure Apple knows this, and I can’t imagine we’ll have to wait much longer for such a device.

Second is the availability of a Quicken for the Mac that’s as good as Quicken for the PC. It seems plain that Intuit is never going to make this happen.

Third is that, for the moment at least, I’m still running my life and work with Ecco Pro, an old Windows app. There are modern Mac apps that do parts of what Ecco does, some of them better; but I’ve found none that does everything Ecco does as well as Ecco does it, and it pains me to think of abandoning it.

In the age of Intel-based Macs it’s now quite easy to run Windows alongside OS X. But Apple’s Boot Camp requires a reboot each time you want to get to your Windows app, and that’s a royal pain; Parallels doesn’t. Both approaches, though, require you to spend $300 on another copy of Windows, and that’s an extraordinary amount to pay.

Last night I downloaded and tried out Crossover Mac, an application (based on the WINE project) that lets you run individual Windows apps from inside OS X (on an Intel-based Mac) without installing a second OS. The good news is that Crossover Mac ran Quicken 2005, one of a bevy of apps that Crossover officially supports, with no apparent hitches. (I haven’t really pounded on it, and maybe heavy usage will uncover problems, but I’m impressed so far.)

So here’s what I’m now wrestling with: how do I get Ecco Pro running under Crossover? The app is not officially supported (no surprise there!) and my “let’s give it a try anyway” install failed. Ecco is a solid Win32 application, but it dates back to the mid-’90s, so there may simply be too many archaic calls or idiosyncrasies. I’d probably give up hope — but there are screenshots on the Crossover site of Ecco running successfully under Crossover/Linux, so I think there ought to be some hope here. I’m posting this largely as a beacon: Ecco Pro users! Crossover users! Can anything be done here?

I’m also pondering the Parallels route, using a Windows license from an older, disused copy of XP or Windows 2000; either of those runs Ecco perfectly. If I experiment with Parallels using this approach, I’ll report on it.

Filed Under: Personal, Technology

John Backus, RIP, and up next in Code Reads

March 20, 2007 by Scott Rosenberg

I was all set to dive into “No Silver Bullet” for the next Code Reads, but given last night’s news of the death of John Backus, father of FORTRAN, I thought I’d make a quick revision to the plan.

The next Code Reads will focus on Backus’s 1977 Turing lecture, “Can Programming Be Liberated from the von Neumann Style?” It’s full of equations and math notation that, superficially at least, look daunting to this reader — but I will give it a try, and perhaps the collective expertise of all of you will help bolster me in those areas where I falter!
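
For the curious, the heart of Backus’s complaint is the “word-at-a-time” style of conventional programs, and his running example in the lecture is the inner product. Here’s a rough Python transliteration (mine — Python is not Backus’s FP notation, of course) of the contrast he draws:

    from functools import reduce
    from operator import add, mul

    # The "von Neumann style" Backus criticizes: compute the inner
    # product word-at-a-time, by repeated assignment to mutable state.
    def inner_product_vn(a, b):
        c = 0
        for i in range(len(a)):
            c = c + a[i] * b[i]
        return c

    # His FP alternative composes whole-value functions, roughly
    # (Insert +) o (ApplyToAll x) o Transpose.
    def inner_product_fp(a, b):
        return reduce(add, map(lambda pair: mul(*pair), zip(a, b)))

    assert inner_product_vn([1, 2, 3], [4, 5, 6]) == 32
    assert inner_product_fp([1, 2, 3], [4, 5, 6]) == 32

No loop, no index variable, no mutable cell: that, in miniature, is the liberation the title asks about.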

Filed Under: Code Reads, Software, Technology

Robots are hard, too

March 18, 2007 by Scott Rosenberg

Friday’s Wall Street Journal included a book review of Almost Human: Making Robots Think, a new book by Lee Gutkind that’s a portrait of the work at Carnegie Mellon’s Robotics Institute.

That work, it seems, has its frustrations, and — as the reviewer, George Anders, tells it — the difficulties sound eerily like those recounted in Dreaming in Code’s description of the things that make software hard:

Mr. Gutkind’s second big insight involves Carnegie-Mellon’s approach to project management. It’s awful. Goals aren’t defined. Interim deadlines aren’t met. Crucial subsystems turn out to be incompatible. People rely on all-nighters to get everything finished. Such bad habits invite catastrophic blunders by exhausted people whose last-minute “fixes” snarl everything else.

In the most maddening breakdown of all, the scientists devising research projects seldom communicate well with the engineers trying to build them. Even the word “target” becomes a sore spot. To scientists, it means their working hypothesis. To engineers, it means the robot’s physical destination. Unaware of this gap, supposed colleagues get mired in confusing conversations.

Gutkind’s book is now on my “must read” list. One final irony, to me, coming out of Dreaming in Code: Carnegie Mellon is not only home to Gutkind’s roboticists; it also harbors the Software Engineering Institute, which is ground zero for the CMM, CMMI, TSP and other acronymic attempts to add a framework of engineering rigor around the maddeningly difficult enterprise of producing new software. I might be jumping the gun (not having read Gutkind’s book yet), but it sounds like those roboticists and the SEI people should have lunch sometime.

Filed Under: Dreaming in Code, Science, Technology
