Wordyard

Hand-forged posts since 2002

Scott Rosenberg

Archives

The Web Parenthesis: Is the “open Web” closing?

October 12, 2010 by Scott Rosenberg 24 Comments

Heard of the “Gutenberg parenthesis”? This is the intriguing proposition that the era of mass consumption of text ushered in by the printing press four centuries ago was a mere interlude between the previous era of predominantly oral culture and a new digital-oral era on whose threshold we may now sit.

That’s a fascinating debate in itself. For the moment I just want to borrow the “parenthesis” concept — the idea that an innovative development we are accustomed to viewing as a step up some progressive ladder may instead be simply a temporary break in some dominant norm.

What if the “open Web” were just this sort of parenthesis? What if the advent of a (near) universal publishing platform open to (nearly) all were not itself a transformative break with the past, but instead a brief transitional interlude between more closed informational regimes?

That’s the question I weighed last weekend at Open Web Foo Camp. I’d never been to one of O’Reilly’s Foo Camp events — informal “unconferences” at the publisher’s Sebastopol offices — but I had the pleasure of hanging out with an extraordinary gang of smart people there. Here’s what I came away with.

For starters, of course everyone has a different take on the meaning of “openness.” Tantek Celik’s post lays out some of the principles embraced by ardent technologists in this field:

  • open formats for freely publishing what you write, photograph, video and otherwise create, author, or code (e.g. HTML, CSS, Javascript, JPEG, PNG, Ogg, WebM etc.).
  • domain name registrars and web hosting services that, like phone companies, don’t judge your content.
  • cheap internet access that doesn’t discriminate based on domains

But for many users, these principles are distant, complex, and hard to fathom. They might think of the iPhone as a substantially “open” device because hey, you can extend its functionality by buying new apps — that’s a lot more open than your Plain Old Cellphone, right? In the ’80s Microsoft’s DOS-Windows platform was labeled “open” because, unlike Apple’s products, anyone could manufacture hardware for it.

“Open,” then, isn’t a category; it’s a spectrum. The spectrum runs from effectively locked-down platforms and services (think: broadcast TV) to those that are substantially unencumbered by technical or legal constraint. There is probably no such thing as a totally open system. But it’s fairly easy to figure out whether one system is more or less open than another.

The trend-line of today’s successful digital platforms is moving noticeably towards the closed end of this spectrum. We see this at work at many different levels of the layered stack of services that give us the networks we enjoy today — for instance:

  • the App Store — iPhone apps, unlike Web sites and services, must pass through Apple’s approval process before being available to users.
  • Facebook / Twitter — These phenomenally successful social networks, though permeable in several important ways, exist as centralized operations run by private companies, which set the rules for what developers and users can do on them.
  • Comcast — the cable company that provides much of the U.S.’s Internet service just merged with NBC and faces all sorts of temptations to manipulate its delivery of the open Web to favor its own content and services.
  • Google — the big company most vocal about “open Web” principles has arguably compromised its commitment to net neutrality, and Open Web Foo attendees raised questions about new wrinkles in Google Search that may subtly favor large services like Yelp or Google-owned YouTube over independent sites.

The picture is hardly all-or-nothing, and openness regularly has its innings — for instance, with developments like Facebook’s new download-your-data feature. But once you load everything on the scales, it’s hard not to conclude that today we’re seeing the strongest challenge to the open Web ideal since the Web itself began taking off in 1994-5.

Then the Web seemed to represent a fundamental break from the media and technology regimes that preceded it — a mutant offspring of the academy and fringe culture that had inexplicably gone mass market and eclipsed the closed online services of its day. Now we must ask, was this openness an anomaly — a parenthesis?

My heart tells me “no,” but my brain says the answer will be yes — unless we get busy. Openness is resilient and powerful in itself, but it can’t survive without friends, without people who understand it explaining it to the public and lobbying for it inside companies and in front of regulators and governments.

For me, one of the heartening aspects of the Foo weekend was seeing a whole generation of young developers and entrepreneurs who grew up with a relatively open Web as a fact of life begin to grapple with this question themselves. And one of the questions hanging over the event, which Anil Dash framed, was how these people can hang on to their ideals once they move inside the biggest companies, as many of them have.

What’s at stake here is not just a lofty abstraction. It’s whether the next generation of innovators on the Web — in technology, in services, or in news and publishing, where my passion lies — will be free to raise their next mutant offspring. As Steven Johnson reminds us in his new book, when you close anything — your company, your service, your mind — you pay an “innovation tax.” You make it harder for ideas to bump together productively and become fertile.

Each of the institutions taking a hop toward the closed end of the openness spectrum today has inherited advantages from the relatively open online environment of the past 15 years. Let’s hope their successors over the next 15 can have the same head start.

Filed Under: Business, Events, Media, Net Culture, Technology

In the context of web context: How to check out any Web page

September 14, 2010 by Scott Rosenberg 19 Comments

One of the great fears about the Web as it becomes our primary source of news is the notion that it rips stories from their moorings and delivers them to us context-free. We’re adrift! In a flood of soundbites! Borne upon a river of bits! Or something like that.

I’ve never understood this argument. As I tried to suggest in my Defense of Links posts, the convention of the link, properly used, provides more valuable context than most printed texts have ever been able to offer.

But links aren’t the only bearers of digital context. Every piece of information you receive online emits a welter of useful signals that can help you appraise it.

The techniques described here first filled my quiver in the ’90s, when I worked as Salon’s technology editor. We’d receive story tips and ideas, some of them pretty far out, and we’d scratch our heads and think, “Can this be for real?” I began applying an informal set of tests and checks to try to prevent us from being manipulated, pranked, or turned into a conduit for bad information. This was our way of trying to take the “discipline of verification” at the heart of the journalism we’d always practiced and apply it to the new medium. We knew we’d never be perfect. But there were scammers, hoaxsters and nuts out there, and we were damn sure not going to be pushovers for them.

Though some of the details have changed in the intervening years, the basic principles for evaluating an unknown source remain relevant, I think.

  • What’s the top-level domain? Is the page in question on a spammy top-level domain like “.info”? That’s not always a bad sign, but it raises your alert level a bit.
  • Look the domain name up with whois. Is the registration info available or hidden? Again, lots of domain owners hide their info for privacy reasons. But sometimes the absence of a public contact at the domain level is a sign that people would rather you not look into what they’re doing. (A rough scripted version of this check and the archive lookup below appears just after this list.)
  • How old or new is the registration? If the site just suddenly appeared out of nowhere, that can be another indication of mischief afoot.
  • Look up the site in the Internet Archive. Did it used to be something else? How has it changed over the years? Did it once reveal information that it now hides?
  • Look at the source code. Is there anything unusual or suspicious that you can see when you “view source”? (If you’re not up to this technically, ask a friend who is.)
  • Check out the ads. Do they seem to be the main purpose of the site? Do they relate to the content or not?
  • Does the site tell you who runs it — in an about page, or a footer, or anywhere else? Is someone taking responsibility for what’s being published? If so, obviously you can begin this whole investigation again with that person or company’s name, if you need to dig deeper.
  • Is there a feedback option? Email address, contact form, public comments — any kind of feedback loop suggests there’s someone responsible at home.
  • What shape are the comments in? If they’re full of spam it may mean that nobody’s home. If people are posting critical comments and no one ever replies, that could also mean that the site owner has gone AWOL. (He might also be shy or uninterested in tangling with people.)
  • Is the content original and unique? Grab a chunk of text (a sentence or so), put it in quotes, and plug it into Google to see whether there are multiple versions of the text you’re reading. If so, which appears to be the original? Keep in mind that the original author might or might not be responsible for these multiple versions.
  • Does the article make reference to many specific sources or just a few? And are the references linked? More is usually a good sign, unless they appear to be assembled by script rather than by a human hand.
  • Links in are as important a clue as links out. If your hunt for links in turns up a ton of references from dubious sites, your article may be part of a Google-gaming effort. If you see lots of inbound links from sites that seem reputable to you, that’s a better sign.
  • Google the URL. Google the domain. Google the company name. Poke around if you have any doubts or questions. Then, of course, remember that every single question we’ve been applying here can be asked about every page Google points you to, as well.
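
To make a couple of these checks concrete, here is a minimal script-style sketch of the whois and Internet Archive lookups. It assumes the third-party python-whois and requests packages and the Internet Archive’s public Wayback “availability” endpoint; the domain shown is just a placeholder for whatever site you’re checking.

```python
# A minimal sketch of two of the checks above (domain age and Internet
# Archive history), assuming the third-party "python-whois" and "requests"
# packages and the Archive's public Wayback "availability" endpoint.
from datetime import datetime, timezone

import requests          # pip install requests
import whois             # pip install python-whois


def domain_age_days(domain):
    """Rough registration age; a brand-new domain raises the alert level."""
    record = whois.whois(domain)
    created = record.creation_date
    if isinstance(created, list):        # some registrars return several dates
        created = created[0]
    if created is None:                  # hidden or unparsable info is a signal too
        return None
    if created.tzinfo is None:
        created = created.replace(tzinfo=timezone.utc)
    return (datetime.now(timezone.utc) - created).days


def earliest_wayback_snapshot(url):
    """Ask the Wayback Machine for its snapshot closest to 1996, which tells
    you roughly how far back its record of the site goes."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": "1996"},
        timeout=10,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["timestamp"] if closest else None


if __name__ == "__main__":
    site = "example.com"                 # placeholder for the site in question
    print("Domain registered", domain_age_days(site), "days ago")
    print("Oldest-ish Wayback snapshot:", earliest_wayback_snapshot(site))
```

Neither check is a verdict on its own; a young domain or an empty archive history is just one more signal to weigh alongside the rest of the list.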

Once you’ve done some or all of this work, it may be time to actually try to contact the author or site owner with your questions. If there’s no way to do so, that’s another bad sign. If there is, but they don’t answer, it might be a problem — or they might just be really swamped!

Software developers use the term “code smell” to describe the signals they catch from a chunk of program code that something might be off. What I’m trying to describe here is a rough equivalent for online journalism: Call it “Web smell.”

Typically, no single one of these tests is conclusive in itself. But together they constitute a kind of sniff test for the quality of any given piece of Web-borne information.

There are probably many more tests that I’m not remembering — or that I never knew in the first place. If you know of some, do post them in the comments.

BONUS LINK: Craig Kanalley’s “How to verify a tweet” assembles a similar set of tests for tweets.

FOLLOWUP: Craig Silverman’s “How To Lose Your Gut” (at Columbia Journalism Review) has some more tips.

Filed Under: Media, Technology

Carr’s “The Shallows”: An Internet victim in search of lost depth

September 8, 2010 by Scott Rosenberg 7 Comments

One day, immersing myself in my reading was as simple as breathing. The next, it wasn’t. Once I had happily let books consume my days, with my head propped up against my pillow in bed or my body sprawled on the floor with the volume open in front of me. Now I felt restless after just a few pages, and my mind and body both refused to stay in one place. Instead of just reading, I would pause and ask, “Why am I reading this and not that? How will I ever read everything I want to or need to?”

I was 18. It would be years before I’d hear of the Internet.

Nicholas Carr had, it seems, a similar experience, quite a bit more recently. He describes it at the start of his book The Shallows: What the Internet is Doing to Our Brains:

I used to find it easy to immerse myself in a book or a lengthy article. My mind would get caught up in the twists of the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration starts to drift after a page or two. I get fidgety, lose the thread, begin looking for something else to do. I feel like I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

When I experienced this loss of focus, I simply blamed my new condition on my newly acquired adulthood. Carr, apparently, was lucky enough to retain his deep-reading endurance undisturbed from childhood well into his grownup years. By the time it began to slip away from him, we were all deep into the Web era. Carr decided that, whatever was going on, the Web was to blame. It wasn’t something that simply happened; it was something that the Internet was “doing to” his brain.

The Shallows has been received as a timely investigation of the danger that information overload, multitasking and the Web all pose to our culture and our individual psyches. There are serious and legitimate issues in this realm that we ignore at our peril. (Linda Stone is one important thinker in this area whose work I recommend.)

So I cannot fault Carr for asking what the Internet is doing to us. But that is only half of the picture. He fails to balance that question with its vital complement: What are we doing to, and with, the Internet? This imbalance leads him both to wildly overstate the power of the Internet to alter us, and to confuse traits that are inherent to the medium with those that are incidental.

Carr writes as a technological determinist. In asking what the Internet is “doing to” us he casts us as victims, not actors, and once that casting is in place, there’s only one way the drama can unfold. The necessary corrective to this perspective can be found in the opening chapter of Claude Fischer’s great history of the telephone, America Calling. Fischer admonishes us not to talk about technology’s “impacts” and “effects,” because such language “implies that human actions are impelled by external forces when they are really the outcomes of actors making purposeful choices under constraints.” (Emphasis mine.)

Filed Under: Books, Culture, Media, Technology

Dr. Laura, Associated Content and the Googledammerung

August 20, 2010 by Scott Rosenberg 6 Comments

I was on vacation for much of the last couple of weeks, so I missed a lot — including the self-immolation of Dr. Laura Schlessinger. Apparently Schlessinger was the last public figure in the U.S. who does not understand the simple rules of courtesy around racial/religious/ethnic slurs. (As an outsider you don’t get a free pass to use them — no matter how many times you hear them uttered by their targets.) She browbeat a caller with a self-righteous barrage of the “N-word” — and wrote her talk-show-host epitaph.

I shed no tears for Dr. Laura — why do we give so much air time to browbeaters, anyway? — and I don’t care much about this story. But after reading a post over at TPM about Sarah Palin’s hilariously syntax-challenged tweets defending Schlessinger, I wanted to learn just a bit more about what had happened. So of course I turned to Google.

Now, it may have been my choice of search term, or it may have been that the event is already more than a week old, but I was amazed to see, at the top of the Google News results, a story from Associated Content. AC, of course, is the “content farm” recently acquired by Yahoo; it pays writers a pittance to crank out brief items that are — as I’ve written — crafted not to beguile human readers but to charm Google’s algorithm.

AC’s appearance in the Google lead position surprised me. I’d always assumed that, inundated by content-farm-grown dross, Google would figure out how to keep the quality stuff at the top of its index. And this wasn’t Google’s general search index recommending AC, but the more rarefied Google News — which prides itself on maintaining a fairly narrow set of sources, qualified by some level of editorial scrutiny.

Gee, maybe Associated Content is getting better, I thought. Maybe it’s producing some decent stuff. Then I clicked through and began reading:

The Dr. Laura n-word backlash made her quit her radio show. It seems the Dr. Laura n-word controversy has made her pay the price, as the consequences of herbrought down her long-running program. But even if it ended her show, it may not end her career. Despite being labeled as a racist, and despite allegedly being tired of radio, the embattled doctor still seems set to fight on after she leaves. In fact, the Dr. Laura n-word scandal has made her more defiant than ever, despite quitting.

I have cut-and-pasted this quote to preserve all its multi-layered infelicities. The piece goes on in this vein, cobbled together with no care beyond an effortful — and, I guess, successful — determination to catch Google’s eye by repeating the phrase “Dr. Laura n-word” as many times as possible.

The tech press endlessly diverts itself with commentary about Google’s standing vis-a-vis Facebook, Google’s stock price, Google’s legal predicament vis-a-vis Oracle, and so forth — standard corporate who’s-up-who’s-down stuff. But this is different; this is consequential for all of us.

I was a fairly early endorser of Google back in 1998, when the company was a wee babe of a startup. Larry Page impatiently explained to me how PageRank worked, and I sang its deserved praises in my Salon column. For over a decade Google built its glittering empire on this simple reliability: It would always return the best links. You could count on it. You could even click on “I’m feeling lucky.”
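
For anyone who has never seen the idea spelled out, here is a toy sketch of the textbook PageRank formulation (a damping factor plus repeated redistribution of rank along links) over an invented four-page graph. This is the published classroom version of the algorithm, not a claim about how Google’s production ranking works today.

```python
# A toy, textbook-style PageRank sketch (damping factor 0.85) over an
# invented four-page link graph; purely illustrative, not Google's
# actual production ranking system.

links = {        # hypothetical graph: page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

pages = list(links)
n = len(pages)
damping = 0.85
rank = {p: 1.0 / n for p in pages}   # start every page with equal rank

for _ in range(50):                  # power iteration until ranks settle
    new_rank = {p: (1 - damping) / n for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# "c", with the most inbound links, ends up on top, followed by "a".
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```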

I still feel lucky to be able to use Google a zillion times a day, and no, Bing is not much use as an alternative (Microsoft’s search engine kindly recommends two Associated Content stories in the first three results!). But when Google tells me that this drivel is the most relevant result, I can’t help thinking, the game’s up. The Wagner tubas are tuning up for Googledammerung: It’s the twilight of the bots.

As for Associated Content, it argues — as does its competition, like the IPO-bound Demand Media — that its articles are edited and its writers are paid and therefore its pages should be viewed as more professional than your average run-of-the-mill blogger-in-pajamas. I think they’ve got it backwards. I’ll take Pajama Boy or Girl any day. Whatever their limitations, they are usually writing out of some passion. They say something because it matters to them — not because some formula told them that in order to top the index heap, they must jab hot search phrases into their prose until it becomes a bloody pulp.

Let me quote longtime digital-culture observer Mark Dery, from his scorcher of a farewell to the late True/Slant:

The mark of a real writer is that she cares deeply about literary joinery, about keeping the lines of her prose plumb. That’s what makes writers writers: to them, prose isn’t just some Platonic vessel for serving up content; they care about words.

The best bloggers know a thing or two about this “literary joinery.” And even bad bloggers “care about words.” But the writer of Associated Content’s Dr. Laura post is bypassing such unprofitable concerns. He chooses his words to please neither himself nor his readers. They’re strictly for Google’s algorithm. The algorithm is supposed to be able to see through this sort of manipulation, to spit out the worthless gruel so it can serve its human users something more savory. But it looks like the algorithm has lost its sense of taste.

[I should state for the record that in the course of my business work for Salon.com I had occasion to meet with folks from Associated Content. They were upright and sharp and understood things about the Web that we didn’t, then. They’ve built a successful business out of “content” seasoned to suit the Googlebot’s appetite. It’s just not what we think of when we think of “writing.” And if this piece is any indication, there isn’t an editor in sight.]

BONUS LINK: If you want to understand more fully the process by which “news” publishers watch Google for trending topics and then crank out crud to catch Google’s eye, you cannot do better than this post by Danny Sullivan of SearchEngineLand. Sullivan calls it “The Google Sewage Factory”:

The pollution within Google News is ridiculous. This is Google, where we’re supposed to have the gold standard of search quality. Instead, we get “news” sites that have been admitted — after meeting specific editorial criteria — just jumping on the Google Trends bandwagon…

Filed Under: Business, Media, Technology

Could Google’s neutrality backstab be a fake?

August 5, 2010 by Scott Rosenberg 8 Comments

News that Google and Verizon are negotiating a deal to “jump the Internet line,” as the New York Times put it in a great headline, shocked people who’ve been following the Net neutrality story and upset many of Google’s true believers. Google has long been one of Net neutrality’s most reliable big-company backers.

Net neutrality — the principle that information traveling across the Internet should be treated equally by the backbone carriers that keep the packets flowing — made sense for Google’s search-and-ad business: Keep the Internet a level playing field so it keeps growing and stays open to the Googlebot. It also helped keep people from snickering too loudly at the company’s “don’t be evil” mantra.

So why would Google turn around now, at a time when the FCC is weighing exactly how to shape the future of Net neutrality regulation, and signal a course-change toward, um, evil?

Here are the obvious explanations: Google wants to speed YouTube bits to your screen. Google is in bed with Verizon thanks to Android. Google figures neutrality is never going to remain in place, so it might as well get a jump on the competition.

None of these quite persuades me. But what if — here is where I pause to tell you this is total speculation on my part — it’s a fake-out? What if Google — or some portion of Google — is still basically behind the Net neutrality principle but realizes that very few people understand the issue or realize what’s at stake? Presumably Google and Verizon, which sells a ton of Android phones, talk all the time. Presumably they talk about Net neutrality-related stuff too.

Maybe someone inside Google who still believes in Net neutrality strategically leaked the fact that they’re negotiating this stuff — knowing the headlines and ruckus would follow. Knowing that this might be a perfect way to dramatize Net neutrality questions and mobilize support for strong Net neutrality rules from the public and for the FCC.

This scenario assumes a level of Machiavellian gameplaying skill on Google’s part that the company has not hitherto displayed. And if the whole story is a feint, it might well not be a strategic move on Google’s part but rather a sign of dissent inside Google, with one faction pushing the Verizon deal and another hoping to blow it up.

Still, worth pondering!

UPDATE: A tweet from Google’s Public Policy: “@NYTimes is wrong. We’ve not had any convos with VZN about paying for carriage of our traffic. We remain committed to an open internet.” [hat tip to Dan Lyke in comments]

Filed Under: Business, Politics, Technology

Careful with that ad headline!

August 5, 2010 by Scott Rosenberg Leave a Comment

Here is the front page of a flyer that recently dropped out of my newspaper. It is an ad for a certain very large PC maker whose name rhymes with that fiery place where bad people spend eternity.

We glance at ads very quickly, or we catch them from the corner of our eyes. And when this one passed through my eye and into my brainpan, what I saw was:

DOCK TO DOORSTOP IN ABOUT 48 HOURS.

Which really just doesn’t seem like enough time to enjoy your new laptop…

Filed Under: Business, Technology

“Blogging is like auto-save for our entire culture”

July 29, 2010 by Scott Rosenberg Leave a Comment

A couple months ago I gave a talk at WordCamp San Francisco, attempting to put WordPress in historical perspective. Those who know the subject know that WordPress’s adoption of the relatively strict GPL free-software licensing is central to its story. (This is the background to the recent dustup between WordPress founder Matt Mullenweg and the creator of the popular Thesis theme over the licensing of that theme.) Ironically, my talk was directly opposite one being given by free-software godfather Richard Stallman, the “Father of the GPL.” It was great so many people still chose to listen to me!

This is a variation on the talks I’ve been giving about Say Everything, with some additional material on WordPress, and some thoughts about the value of blogging to our collective history: “Blogging is like auto-save for our entire culture.”

[This video lives over here at WordPress.tv. Thanks to everyone at WordCamp for having me!]

Filed Under: Blogging, Events, Say Everything, Technology

“Failsafe” is an oxymoron: BP’s Gulf spill and the St. Francis Dam

May 20, 2010 by Scott Rosenberg 2 Comments

I listened to this interview yesterday with BP director Robert Dudley on the News Hour:

ROBERT DUDLEY: …The blowout preventers are something that are used on oil and gas wells all over the world, every well. They just are designed not to fail with multiple failsafe systems. That has failed. So, we have a crisis.

…JEFFREY BROWN: Excuse me, but the — the technology — the unexpected happened. And so the question that you keep hearing over and over again is, why wasn’t there a plan for a worst-case scenario, which appears to have happened?

ROBERT DUDLEY: Blowout preventers are designed not to fail. They have connections with the rig that can close them. When there’s a disconnection with the rig, they close, and they’re also designed to be able to manually go down with robots and intervene and close them. Those three steps, for whatever reason, failed in this case. It’s unprecedented. We need to understand why and how that happened.

The failsafe failed. It always does. “Designed not to fail” can never mean “certain not to fail.” There is no such thing as “failsafe” — just different degrees of risk management, different choices about how much money to spend to reduce the likelihood of disaster, which can never entirely be eliminated.

Two different social attitudes conspire to lead us to disasters like the Gulf spill. On the one hand, there is the understandable but naive demand on the part of the public and its proxies in the media for certainty: How can we be sure that this never happens again? Sorry, we can’t. If we want to drill for oil we should assume that there will be spills. If we don’t like spills, we should figure out other ways to supply our energy.

On the other side, there is what I’d call the arrogance of the engineering mindset: the willingness to push limits — to drill deeper, to dam higher — with a certain reckless confidence that our imperfect minds and hands can handle whatever failures they cause.

Put these two together and you have, rather than any sort of “failsafe,” a dynamic of guaranteed failure. The public demands the impossibility of “failsafe” systems; the engineers claim to provide them; and everything is great until the inevitable failure. Each new failure inspires the engineers to redouble their efforts to achieve the elusive failsafe solution, which lulls the public into thinking that there will never be another disaster, until there is.

I wrote about these issues as they relate to software in Dreaming in Code. But at some point the need to understand this cycle demands a more artistic response.

May I suggest you give a listen to Frank Black’s “St. Francis Dam Disaster,” a great modern folksong about a colossal engineering failure of a different era.

Filed Under: Music, Science, Technology

For the media biz, iPad 2010 = CDROM 1994

March 26, 2010 by Scott Rosenberg 44 Comments

I’m having flashbacks these days, and they’re not from drugs, they’re from the rising chorus of media-industry froth about how Apple’s forthcoming iPad is going to save the business of selling content.

Let me be clear: I love what I’ve seen of the iPad and I’ll probably end up with one. It’s a likely game-changer for the device market, a rethinking of the lightweight mobile platform that makes sense in many ways. I think it will be a big hit. In the realm of hardware design, interface design and hardware-software integration, Apple remains unmatched today. (The company’s single-point-of-failure approach to content and application distribution is another story — a problem that will only grow more acute the more successful the iPad becomes.)

But these flashbacks I’m getting as I read about the media business’s iPad excitement — man, they’re intense. Stories like this and this, about the magazine industry’s excitement over the iPad, or videos like these Wired iPad demos, take me back to the early ’90s — when media companies saw their future on a shiny aluminum disc.

If you weren’t following the tech news back then, let me offer you a quick recap. CD-ROMs were going to serve as the media industry’s digital lifeboat. A whole “multimedia industry” emerged around them, complete with high-end niche publishers and mass-market plays. In this world, “interactivity” meant the ability to click on hyperlinks and hybridize your information intake with text, images, sound and video. Yow!

There were, it’s true, a few problems. People weren’t actually that keen on buying CD-ROMs in any quantity. Partly this was because they didn’t work that well. But mostly it was because neither users nor producers ever had a solid handle on what the form was for. They plowed everything from encyclopedias to games to magazines onto the little discs, in a desperate effort to figure it out. They consoled themselves by reminding the world that every new medium goes through an infancy during which nobody really knows what they’re doing and everyone just reproduces the shape and style of existing media forms on the new platform.

You can hear exactly the same excuses in these iPad observations by Time editor Richard Stengel. Stengel says we’re still in the point-the-movie-camera-at-the-proscenium stage. We’re waiting for the new form’s Orson Welles. But we’re charging forward anyway! This future is too bright to be missed.

But it turned out the digital future didn’t need CD-ROM’s Orson Welles. It needed something else, something no disc could offer: an easy way for everyone to contribute their own voices. The moment the Web browser showed up on people’s desktops, something weird happened: people just stopped talking about CD-ROMs. An entire next-big-thing industry vanished with little trace. Today we recall the CD-ROM publishing era as, at best, a fascinating dead end, a sandbox in which some talented people began to wrestle with digital change before moving on to the Internet.

It’s easy to see this today, but at the time it was very hard to accept. (My first personal Web project, in January 1995, was an online magazine to, er, review CD-ROMs.)

The Web triumphed over CD-ROM for a slew of reasons, not least its openness. But the central lesson of this most central media transition of our era, one whose implications we’re still digesting, is this: People like to interact with one another more than they like to engage with static information. Every step in the Web’s evolution demonstrates that connecting people with other people trumps giving them flashy, showy interfaces to flat data.

It’s no mystery why so many publishing companies are revved up about the iPad: they’re hoping the new gizmo will turn back the clock on their business model, allowing them to make consumers pay while delivering their eyeballs directly to advertisers via costly, eye-catching displays. Here’s consultant Ken Doctor, speaking on Marketplace yesterday:

DOCTOR: Essentially, it’s a do-over. With a new platform and a new way of thinking about it. Can you charge advertisers in a different way and can you say to readers, we’re going to need you to pay for it?

Many of the industry executives who are hyping iPad publishing are in the camp that views the decision publishers made in the early days of the Web not to charge for their publications as an original sin. The iPad, they imagine, will restore prelapsarian profit margins.

Good luck with that! The reason it’s tough to charge for content today is that there’s just too much of it. People are having a blast talking with each other online. And as long as the iPad has a good Web browser, it’s hard to imagine how gated content and costly content apps will beat that.

You ask, “What about the example of iPhone apps? Don’t they prove people will pay for convenience on a mobile device?” Maybe. To me they prove that the iPhone’s screen is still too small to really enjoy a standard browser experience. So users pay to avoid the navigation tax that browser use on the iPhone incurs. This is the chief value of the iPad: it brings the ease and power of the iPhone OS’s touch interface to a full-size Web-browser window.

I can’t wait to play around with this. But I don’t see myself rushing to pay for repurposed paper magazines and newspapers sprinkled with a few audio-visual doodads. That didn’t fly with CD-ROMs and it won’t fly on the iPad.

Apple’s new device may well prove an interesting market for a new generation of full-length creative works — books, movies, music, mashups of all of the above — works that people are likely to want to consume more than once. But for anything with a short half-life — news and information and commentary — the iPad is unlikely to serve as a savior. For anyone who thinks otherwise, can I interest you in a carton of unopened CD-ROM magazines?

Filed Under: Blogging, Business, Media, Say Everything, Technology

SEO mills: That’s not fast food, it’s bot fodder

December 14, 2009 by Scott Rosenberg 11 Comments

Yesterday TechCrunch’s Mike Arrington denounced the rise of SEO-mill-driven content — the sort of business Associated Content and Demand Media are in, and AOL is going into — as “the rise of fast food content.”

This gave me a good laugh, since, of course, most journalists have long (and mostly wrongly) viewed Arrington’s own output, and that of all blog-driven enterprises, as “fast food journalism.” Arrington, rightly, I think, sees himself more as a “mom-and-pop” operation producing “hand-crafted content,” and he’s bemoaning “the rise of cheap, disposable content on a mass scale, force fed to us by the portals and search engines.”

Trouble is, Arrington’s metaphor is off. The articles produced by the SEO-driven content mills aren’t like fast food at all. Fast food works because it tastes good, even if it’s bad for us: it satisfies our junk cravings for sugar and salt and fat. We eat it, and we want more. The online-content equivalent to junk food might be a gossip blog, or photos of Oscar Night dresses, or whatever other material you read compulsively, knowing that you’re not really expanding your mind.

The stuff that Demand Media and Associated Content produce isn’t “junk-food content” because it’s not designed for human appetites at all: it’s targeted at the Googlebot. It’s content created about certain topics that are known to produce a Google-ad payoff; the articles are then doctored up to maximize exposure in the search engine. Individually, the articles don’t make much money, but all they have to do is make a little more per page than they cost. Multiply that by some number with many zeros on the end and you’ve got a business.
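
To make that arithmetic concrete with purely invented numbers (illustrative only, not reported figures):

```python
# Invented, purely illustrative numbers; the point is only the shape of
# the arithmetic: tiny per-page margins multiplied by enormous volume.
cost_per_article = 3.00        # hypothetical fee paid to the writer
lifetime_ad_revenue = 3.50     # hypothetical ad income per article, ever
articles_per_month = 200_000   # hypothetical output of the content mill

monthly_margin = articles_per_month * (lifetime_ad_revenue - cost_per_article)
print(f"${monthly_margin:,.0f} a month on fifty-cent margins")
```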

These businesses aren’t preying on our addictive behaviors; they’re exploiting differentials and weaknesses in Google’s advertising-and-search ecosystem. As Farhad Manjoo pointed out recently in Slate, the actual articles produced by these enterprises tend to be of appallingly poor quality. McDonald’s food may not be good for you, but it’s consistent and, plainly, appealing to multitudes. But few sane readers would willingly choose to consume an SEO mill’s take on a topic over something that was written for human consumption.

That’s why I think Arrington’s off-base. The SEO arbitrageurs may make money manipulating the search-engine bots, but they can’t “force feed” their output to real people. Doc Searls’ idealism on this point is more persuasive than Arrington’s lament.

Filed Under: Business, Technology
