Wordyard

Hand-forged posts since 2002

Communities of interest

March 14, 2006 by Scott Rosenberg

Other stuff at ETech that was interesting:

Brian Dear of EVDB and Eventful, a site for posting and finding event information, introduced Eventful Demand, which allows people to band together and ask for “speakers” — musical performers, authors, anyone who might have a fan base or interested crowd — to come make an appearance, put on a show, give a talk in their area. Dear hopes that creating a common space for this sort of demand-side networking will reduce the reliance on middlemen and allow artists and other “speakers” to connect directly with their audiences. For instance, a band that had a good number of people asking for an appearance in their town could then take that info to a club as evidence of ticket-sales draw — or, more ambitiously, the “demanders” could organize the event themselves. You’ve heard of the “wisdom of crowds”; this is more like the “wishlist of crowds.” At the moment, the hottest “Demand” on Eventful is for “The impeachment of George W. Bush – Washington metro area.” Other than that, an awful lot of people seem to want Wil Wheaton to come to their towns.

Derek Powazek provided an update of the principles he expounded five years ago in his book Design for Community. “Web 1.0”-style communities were, he said, “company towns.” As examples, he included Salon’s Table Talk, which I think is reasonable; his own Fray.com similarly qualifies. In the “Web 2.0” world, he says, we’re more like individual homesteaders, and that gives us potentially much more power and control. He’s right, but I think he may, just a little bit, underplay the downside: once you own the house, you’re stuck dealing with the insurance and taxes, the leaks, the graffiti and the natural disasters. Still, given the choice, most people — at least most Americans — seem to prefer the homeowner model. Derek’s slides are here.

Other interesting talks at Etech about community, much-blogged elsewhere, included Clay Shirky’s chronicle of “patterns” in online moderation, “Shut Up! No, You Shut Up!” (summary here). Shirky has set up a wiki to record these patterns, modeled on Ward Cunningham’s original Pattern Language wiki for software developers.

Meanwhile, Danah Boyd offered a sociological perspective on recent models of successful communities — Craigslist, Flickr and Myspace. My decade at Salon certainly made this passage ring true:

  Passionate designers are hard to come by. The people in charge of Craigslist, Flickr and MySpace breathe their sites. They don’t go home at night and forget about the site. They are online at 4AM because something went wrong. They are talking to users at midnight just because. You cannot force this kind of passion – it’s not just a job, it’s a belief system.

  Unfortunately, it is not clear that even the most passionate people can keep doing it forever. This type of true embeddedness is utterly exhausting. It plays a heavy toll on the lives of the designers. Even in smaller communities, creators grow tired.

Filed Under: Events, Technology

Attention traders

March 14, 2006 by Scott Rosenberg

I’ve got some random loose-end posts from my time at last week’s ETech conference that I really should put up before they get any older. Here’s one…

Seth Goldstein of Root.net introduced his company’s Vaults product, which aims to give individual consumers a place to bank their “attention data.” Today you can open a “vault” for free and stash your Amazon purchase history and your general clickstream data (derived from a browser plug-in); tomorrow, presumably, much more. Goldstein talked about “PPAs” (“promises to pay attention”) and “attention bonds” and drew a comparison with the way the mortgage industry’s adoption of mortgage bonds helped make housing more affordable.

Well, everyone needs a place to live; what problem is Root aiming to solve? The idea seems to be: Companies are already collecting and claiming large amounts of information about our financial lives and online behavior. That’s data that we ought, by rights, to control — and if it’s going to be exploited commercially, we should get our slice.

Fair enough. But the Root Vaults idea applies a Wall Street mentality to the “attention economy” concept, and when Goldstein unveiled the Vault home screen before the ETech crowd, it resembled nothing so much as a sort of Bloomberg screen for the mind. There’s something potentially dismal about this — are we going to convert every last remnant and scrap of our earthly existence into the margin-eking terms of financial markets?

On the one hand, I can imagine Root Vaults as offering a nifty way for us all to do what Howard Rheingold long ago advocated — pay attention to where we’re paying attention. On the other hand, I’m wary of letting the bond-trading worldview colonize my choices of entertainment and edification. I’m not looking to become the CEO of my own mind, fiddling with spreadsheet optimizations of my own personal satisfaction.

I mentioned this reaction to Goldstein, and he readily admitted that clickstream data has its limits: “You gotta start somewhere. Is it an accurate representation of a person? No. You don’t want to reduce people to data on a Bloomberg dashboard. But this is a natural resource that people are already producing.”

Filed Under: Events, Media, Technology

Ozzie at the clipboard, Stone at attention

March 7, 2006 by Scott Rosenberg

Tuesday here at Etech began with Ray Ozzie, once of Groove and now of Microsoft, demoing the prototype for an absurdly simple yet marvelously useful little innovation: the ability to cut and paste events, using the Windows clipboard, such that they move from application to application (and Web app to Web app) with their structure and metadata intact. It’s a little thing, in one sense — but just the sort of little thing that stands in the way of the Web-based information realm being fully useful. That Microsoft is helping lead this change rather than fighting it to the last byte is remarkable. That Ozzie did his demo using Firefox was simply gracious. (He writes in detail about the project on his blog.)
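Ozzie’s blog post covers the mechanics; for flavor, here is roughly the kind of machine-readable structure such a clipboard scheme can carry. Live Clipboard leaned on microformats, and an event marked up in the hCalendar microformat looks something like this (the fragment below is my own illustration, not code from Ozzie’s demo):

```html
<!-- An event marked up as an hCalendar microformat. The class names
     (vevent, summary, dtstart, location) carry the structure, so any
     copy/paste operation that preserves this HTML preserves the
     event's metadata along with its visible text. -->
<div class="vevent">
  <span class="summary">O'Reilly Emerging Technology Conference</span>
  <abbr class="dtstart" title="2006-03-06">March 6, 2006</abbr>
  <span class="location">Manchester Grand Hyatt, San Diego, CA</span>
</div>
```

Because the structure rides along inside ordinary markup, a calendar application on the receiving end can parse the `dtstart` and `location` fields instead of getting a flat blob of text — which is exactly the “metadata intact” quality Ozzie was demoing.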

Jeff Han showed his research into “multi-touch interaction” — giant touch-screens that respond to complex commands delivered via more than one point of touch. The interface hardly seemed as intuitive as Han promised (two fingers zooms in — or is it out?), and some of the demo resembled the manipulation of a virtual lava lamp. But when Han turned his interface into a giant light-table and showed how perfectly it was suited for the organization of large numbers of photos — and videos! — the value of the innovation became immediately apparent.

The ostensible theme of the conference this year is “The Attention Economy,” but most speakers barely addressed it. One notable exception was Linda Stone, the former Microsoft and Apple exec who coined the phrase “continuous partial attention” back in 1998 and unpacked the term for us a bit here. (There are good notes from Nat Torkington on a similar talk she gave at Supernova last year.) She distinguished multitasking — where you’re switching between discrete goal-oriented processes — from the more diffuse and corrosive continuous partial attention, in which we are constantly “scanning for opportunities, optimizing for the best opportunity,” paying half a mind to what’s in front of us and keeping our peripheral vision peeled in hope of spotting something better. Stone says we’re driven by CPA out of a “desire to be a live node on the network,” to stay connected and to feel validated that we fit into a social web.

Stone placed CPA in a social-history timeline that falls into 20-year spans: a period from 1965 to 1985 in which we placed highest value on self-expression, creativity and personal productivity; then a period from 1985 to 2005 in which the network became paramount and we valued communication the most. I found this explanation so generalized as to be almost useless — “We played Battleship in the ’70s, we played Diplomacy in the ’90s,” she declared, but wait a minute, I played Diplomacy in the ’70s, and so did all my friends!

Nonetheless, Stone is onto something important here. Her description of our “overwhelmed, overstimulated and underfulfilled” technological existence wasn’t exactly what the technology-besotted ETech crowd wanted to hear, but they needed to hear it. Still, as I looked around at a sea of heads buried in laptops, sucking down the wi-fi, fingers darting to catch the latest email or Technorati result, I wondered how many had given Stone the attention she deserved.

Filed Under: Events, Technology

Sterling language

March 6, 2006 by Scott Rosenberg

I loved the two Bloggercons I participated in, and I share the enthusiasm expressed by Dave Winer and the BarCamp people and the MashupCamp people for the whole “unconference” idea — the notion that great gatherings can happen when you put great people together in rooms without programming lots of speeches and panels and product demos.

Still, I’m not ready to give up on the occasional old-fashioned lecture, under the right circumstances, and there are some people in whose presence I will gladly say, “I am an audience member — you talk, I’ll shut up.” Bruce Sterling is one of them. He spoke tonight here at Etech.

I haven’t heard Sterling in several years, and I’d forgotten his peculiar cadence — a kind of incantatory precision that you first mistake for superciliousness and then realize, no, wait, those pauses and touches of drawl aren’t affectation, he’s just savoring those words, he loves them, he doesn’t want to say goodbye to them quite yet.

Sterling’s ostensible subject was “The Internet of Things,” and he talked a bit about the stuff he’s been talking about for some time now: spimes, physical objects trackable in space and time, material things that are — like items on today’s Web — linkable, rankable, sortable and searchable. It’s a fascinating topic, even the second or third time around; but the heart of tonight’s talk was a series of observations on language and technology.

“Computer,” Sterling argued, was simply an awful name for these machines that arrived in the middle of last century. “Computer” led us straight to “artificial intelligence,” down the dead-end street that had us thinking the machines could become smart — that they were “thinking machines.” We should have picked a word more like what the French chose, “Ordinateur,” suggesting that the devices, uh, ordinate things. They are card shuffling tools. They do what we see the Google-ized Web doing so well today — link, rank, sort and search. “I think we could have done better words,” Sterling said — and if we had, we might have gotten Google 20 years sooner.

He went on to parse some Web 2.0-speak, first decoding Tim O’Reilly’s definition of the phrase, then dissecting scholar Alan Liu’s critique of the phenomenon, at every turn reminding this crowd of “alpha geeks” that the labels they pick for their innovations really do make a difference.

“You don’t want to freeze your language too early,” Sterling advised — that stops creativity in its tracks. Hype, he suggested, is underrated: “Hype is a system-call on your attention.” Buying into it blindly is a disaster, of course, but “if you soberly track its development, hype is revealing…. In politics, the opposite of hype is the truth, but in technology, the opposite of hype is argot, jargon” — language that has no traction in the real world. And “if no one is dismissing you as hype, you are not being loud enough.”

Sterling cited a recent interview with Adam Greenfield, the author of a new book called Everyware that’s also about a version of “the Internet of Things.” In the interview, Greenfield said he coined the term “Everyware” to describe his take on the concept others have labeled “ubiquitous computing” because “I wanted people relatively new to these ideas to be able to have a rough container for them, so they could be discussed without anyone getting bogged down in internecine definitional struggles.”

But wait, Sterling cried — “getting bogged down in internecine definitional struggles” is exactly where we should be when we’re inventing new things. This is “the wetlands of language,” where we “use words to figure out what things mean.” The struggles count; they help us understand and shape what we’re doing. Choosing a label for a technology, he argued, “really matters — it’s like christening a baby.”

There was much more. If the good folks at ITConversations post the audio, or if Sterling posts a text, I’ll link so you can experience the whole thing — including the full shtick about Alan Turing’s head in a box, which I’m afraid I failed to take good notes on, since I was too busy laughing.

It would take a good video, though, to capture the funniest moment of the evening: Sterling was displaying examples of “receding tech” (“things that do not blog or link”) — a rusty engine block half-buried in desert sand, a mountain of discarded tires — when the projection screen flashed a warning window: YOU ARE NOW RUNNING ON RESERVE POWER. Then the laptop went to sleep. He was wrapping up, anyway.

Filed Under: Events, People, Technology

Etech 2006

March 6, 2006 by Scott Rosenberg

I’m here in San Diego at the O’Reilly Emerging Technology Conference, an event that I had the pleasure of attending in 2003 and 2004 but which I skipped last year while I was trying to get some traction on my book.

In my absence the conference moved from the relatively cozy confines of the Westin Horton Plaza to the vastness of the Manchester Grand Hyatt — two tall towers on the edge of the harbor. This feels like a place not for things that are emerging but rather for things that have conquered.

Filed Under: Events

Red alert — content generator overload

March 2, 2006 by Scott Rosenberg

Lee Gomes of the Wall Street Journal had a funny piece yesterday about the “content mills” that are, uh, repurposing — read, pirating (or, in the case of Wikipedia and the like, reusing what’s free) — other people’s writing, in order to create pages that can be festooned with Google text ads and turned into cash.

There are different shades of gray on this spectrum. Some companies are building honest businesses by paying all comers small sums for articles that they know, in advance, will generate a certain level of Google-word money. Mesothelioma, anyone? (This rare form of asbestos-caused cancer has long been one of the best-paying Google words, because lawyers who represent asbestos victims are willing to pay big for leads.) Other shyster-entrepreneurs are simply paying writers to massage other people’s words just enough to pretend that the work is original, then reposting it. Gomes hooked up with someone from the latter group, and his account of conscientiously trying to deliver actual original copy to a patron who couldn’t care less makes a diverting farce.

Gomes concludes that the real villain here is Google itself: He blames the search engine for inspiring a flood of bogus content.

  In fact, search engines are more like a TV camera crew let loose in the middle of a crowd of rowdy fans after a game. Seeing the camera, everyone acts boorishly and jostles to get in front. The act of observing something changes it. Which is what search engines are causing to happen to much of the world’s “information.” Legitimate information, like articles from the WHO, risks being crowded out by junky, spammy imitations.

But Google the search engine is not the culprit; Google the ad machine is. The shysters wouldn’t be cranking out the HTML if it weren’t for AdSense, the Google text ads that publishers can plaster over their pseudocontent. Though AdWords — the keyword-based text ads that appear on Google’s own search results — are subject to a limited amount of gaming and manipulation (that Google is always trying to defeat or limit), the level of crap surrounding AdSense is far greater.

So blame Google — it deserves some. But keep the focus clear: A terrific search engine alone doesn’t make people publish acres of garbage. But put a few dollars in play and some “content providers” will do the most embarrassing things.

Filed Under: Blogging, Media, Technology

Yahoo: No shows

March 2, 2006 by Scott Rosenberg

Yahoo, having boldly proclaimed its intention to produce TV-style “Web shows” a la MSN circa 1996, thinks again. (Maybe they won’t be buying that movie studio after all.) That didn’t take long; six weeks ago Yahoo content guy Lloyd Braun was touting his shows to the Wall Street Journal.

Smart move. I guess the Webheads in the Valley gave the show-biz people in Santa Monica a crash course in how the Internet, you know, works.

Filed Under: Media, Technology

Theirs not to reason why…

March 1, 2006 by Scott Rosenberg

85 percent of U.S. troops in Iraq polled by Zogby “said the U.S. mission is mainly ‘to retaliate for Saddam’s role in the 9-11 attacks.'”

This is the saddest thing I have read in a long time. (Thanks to Tim at War Room for pointing it out.)

Filed Under: Politics

Corruption’s two faces

March 1, 2006 by Scott Rosenberg

When the Enron and Worldcom scandals unfolded in the early years of this decade, it became clear that we were looking at two different species of corruption: let’s call them old-school and New Wave. Old-school corruption is blunt and obvious; you’d know it for what it is if you bumped into it in a dark alley, which is probably where you’d find it. Large sums of cash are moved unceremoniously from place to place; ledgers are altered; bribes land in open palms.

Worldcom, clearly, was old-school — out-and-out, prima facie fraud. Enron was something equally insidious but entirely different in form: a kind of corruption that consisted largely of deliberate and elaborate bending of arcane rules, game-playing in largely incomprehensible gray areas of accounting rules and laws, and fabrication of sham institutions to give these activities bureaucratic shelter — all orchestrated with a ruthless goal in view, but all pursued under rationales that at least appeared plausible to the casual spectator.

As today’s political corruption scandals roll out in depressing sequence, it’s clear that they, too, divide along these old-school/New Wave lines. Here, the outline of the Duke Cunningham Congressional bribery scandal — which would make wonderful opera bouffe if it weren’t our money and security on the line — is pure old-school. Check out, for instance, this report from TPM’s Daily Muck with the latest from Cunningham’s prosecutors: The congressman had deals with a couple of defense contractors who were kicking back part of their 800-percent profit as bribes to him, and when the Pentagon was slow in paying them, he’d “browbeat” Defense officials to move their butts, and demand that they be fired if they failed to comply.

But Tom DeLay, he’s plainly a New Wave sort of crook — the Andy Fastow of the Republican Party. He played in the nether reaches of election law and congressional process the way Enron’s execs and accountants danced beyond the margins of finance law and corporate governance rules. As dramas of naked political power-flexing, patronage-wielding and election-influencing, they are riveting; we could admire DeLay’s sheer creative chutzpah if we weren’t still suffering from its consequences.

With old-school crooks, exposure is a straightforward matter of accumulating enough evidence to obtain a confession. New Wavers are tougher to nail, because they’ll always be able to argue that their aggressive interpretation of the letter of various rules and laws wasn’t technically illegal. So what if their actions involved phantom companies, slush-fund transfers, or unprecedented mid-decade redistricting? Do the laws and rules explicitly say this stuff is illegal? Can you prove it? All of it? What if they were just being creative and entrepreneurial? If you prosecute them, aren’t you just telling businesspeople and politicians to stop dreaming of new ways to do things?

When you hear that argument, pinch yourself if you start to succumb, and remember: it’s just an apologia for the same old corruption in a clever new guise.

Filed Under: Business, Politics

A likely story

February 27, 2006 by Scott Rosenberg

As a young journalist fresh out of school, I decided to pursue criticism rather than reporting. I knew I’d chafe under the stricture of American journalism that forbade the expression of opinion. (Also, I knew I was temperamentally unsuited for shoving a notebook or microphone under the noses of people who were recently bereaved, newly indicted or otherwise thrust into the spotlight.) If I became a reporter, I could wait 20 years and maybe, just maybe, if I worked hard and was good and got lucky, I’d earn a column, and, finally, be able to write what I thought, instead of having to seek out “experts” to say what I thought I already knew. Or I could become a critic — and start writing, immediately, not just about what I was observing but also about what I was thinking about it, which seemed more honest.

This is how it looked to me a quarter century ago, anyway. Since then I’ve ended up doing somewhat more reporting in addition to columns and criticism, and my respect for the insanely difficult work of reporting well has risen.

But also, the strictures have evolved. In the disintegration of journalistic norms taking place all around us today, it is increasingly common to find out-and-out opinions sitting right in the middle of ostensible “news” articles. I’m not talking about the kind of complaint you hear all the time from partisans of every stripe who dismiss facts that are inconvenient to their side by relabeling them as opinions. I’m talking about real opinions — statements that do not stand on their own as reported fact but hang in the void, unsupported by anything other than assertion.

These musings were occasioned by a piece in Sunday’s New York Times Business section by Louis Uchitelle. “Two Tiers, Slipping Into One” describes the decline of the bargaining power of Rust Belt unions. A union leader tells Uchitelle what he says to young workers who are getting a worse deal than their elders: “I assure them that five years down the road, when the present contract expires, we in the union are going to improve their lot in life.”

Uchitelle begins his next paragraph: “That does not seem likely.”

I am not questioning the accuracy of Uchitelle’s forecast, nor do I begrudge him his sarcasm. I am instead marveling at the forthright expression of an opinion. The union guy says one thing, and the reporter says, “Nope, sorry, not gonna happen.”

Uchitelle is a veteran labor and economics reporter. I have no reason to think he’s wrong. I’m just wondering — can we extend this practice a bit? If we’re saying it’s OK for reporters to point out that something a union leader says “does not seem likely,” maybe it would be OK for them to point out the same thing in other places and at other times, for other speakers?

Consider:

“We’re working with Congress to hold the line on spending,” Mr. Bush said Monday. “And we do have a plan to cut the deficit in half.”

That does not seem likely.

The insurgency in Iraq is “in the last throes,” Vice President Dick Cheney says.

That does not seem likely.

Secretary of Defense Donald Rumsfeld: “There is no wiggle room in the president’s mind or my mind about torture.”

That does not seem likely.

Is this practice of writing truth to power going to spread across the pages of America’s dailies? That does not seem likely. But we can dream.

Filed Under: Media, Politics
