Wordyard

Hand-forged posts since 2002

Fifteen years of epochal pronouncements

May 7, 2009 by Scott Rosenberg

In 1994 Louis Rossetto cranked up HotWired and believed he was ushering in the professionalization of the Web. It was time to rout all the anarchists and the hackers and the amateurs who thought the Internet was all about self-expression (and them). “The era of public-access Internet has come to an end,” he declared. He didn’t mean that the public would no longer be able to access the Internet, of course; he was drawing an analogy to public-access TV. Just as that once-promising avenue for citizen media had been eclipsed by the pros of the cable world, so, he reasoned, the Web would similarly leave the amateurism of its youth behind.

Nick Carr believes this is still going to happen, but many of us today understand that the opportunity the Web affords all of us to add to it lies at the heart of the medium’s identity. It’s not some minor feature of the medium’s youth that will be sloughed off as maturity arrives. It’s not some incidental efflorescence of excess creativity that will vanish once the laws of supply and demand kick in. It is what makes the Web tick. You can try to ignore that, and use the Web as a mere replacement for paper and trucks, but why bother? You will lose your readers and your future.

I thought of all this as I read reports today of Rupert Murdoch’s pronouncement that “The current days of the internet will soon be over.” Phrased that way, the prediction makes it sound like the end times are near. But the only apocalypse in sight, I’m afraid, is that of the old-line news industry, if it insists on pursuing dead-end subscription models for general-interest Web products.

There is money to be made on the Web for the providers of information, but it will never be made by locking away generic news and opinion articles and charging subscription fees to access them. Cutting your content off from the rest of the Web in this fashion robs it of its Webbiness. It’s like a movie producer in the 1930s saying, “Hey, let’s make talkies!”, but then turning off the sound in the theaters.

Murdoch, and any other publisher who shuts the gates, may well boost his bottom line in the short term. But in the medium term and beyond he is simply guaranteeing the slow decline and ultimate irrelevance of his publication. This is painful for journalists and media execs to hear, but they need to hear it — just as, back in 1994, Rossetto needed to hear that no, actually, “public access” was exactly what the Web was all about.

Filed Under: Blogging, Business, Media

Yesterday, AOL/TimeWarner; today, Twitter and…

May 6, 2009 by Scott Rosenberg

There’s a ridiculous amount of chatter in the tech blogosphere about who’s going to buy Twitter. And if the right offer comes along with enough zeros behind it, I don’t doubt that Twitter will sooner or later sell itself. But I doubt its founders are going to do it any time soon. Industry veterans understand that the day you sell your company is the day that innovation ends and “value extraction” begins.

Evan Williams knows this because he lived it. When Google acquired Blogger, it secured the service’s future and ensured its growth into the household name it became. (One of the many tales told in Say Everything…) But you didn’t exactly see Blogger pushing the boundaries or adding exciting new wrinkles. The innovation was done.

Google, being Google, didn’t rush to extract value. But that’s what we’re seeing now with MySpace and News Corporation. Having invested in the social network because of its market share and buzz but with little idea how to make money with it, Rupert Murdoch is now impatient to ramp up the revenue. The competition over at Facebook — still independent and still run by founders — is more focused right now on adding features and figuring out what their service is all about than on raking in the dollars. If they sell now, they know they’re likely giving up further explorations of what Facebook is (explorations that today are underwritten, to be sure, by investors who hope someday to cash out).

Meanwhile, the granddaddy of this sort of deal — the great AOL/Time Warner merger of 2000 — is receiving its final interment this week with the announcement that Time Warner intends to fling the old albatross off its neck in a spinoff. When it was first announced, that combination was hailed as “the deal of the millennium,” but none of the people involved really had a clue about the future — not the AOL executives who shrewdly sold off their business at the peak of its market value, and certainly not the Time Warner execs who very quickly realized the two companies had absolutely no business combining forces.

AOL was never a hugely innovative company, but it was good at getting people online quickly and easily in the early days of the Web. Maybe it had a future doing the same thing in the broadband era. But from the moment AOL sold itself to Time, it ceased being a force of any consequence on the Net and began a long, slow downward slide from which it has never recovered, and from which I doubt it ever can — even with ex-Googler Tim Armstrong at the helm.

Reading about the spinoff this week reminded me of one of my most amusing experiences during the dotcom bubble. In January 2000 I was a new dad with three-month-old twins at home; elated but sleepless, I was running on caffeine and adrenaline. When I woke up to news of the AOL deal I rubbed my eyes and banged out a very quick column raising some questions about it.

Later that day I got a call from some producers at CNN asking if I would go on the air to talk about the deal. I thought, yeah, sure, as long as I can keep my eyes open… What I realized once the anchorperson started asking me questions was that I’d been cast as the deal’s Dr. Doom. In retrospect I think I was perhaps the only pundit they could get in front of their cameras who wasn’t convinced that the deal was going to reshape the Web world.

I saved video from the show. Here it is:

“What’s the problem?” indeed! I can’t claim any astute prescience; I couldn’t foresee just how quickly the boom would go bust and the deal would turn sour, and I worried more about big companies trying to strangle the Web than, in retrospect, I needed to. But I knew a fear-driven deal when I saw one and was in no mood to cheer what looked like the blind mating dance of clueless media barons.

It’s good to remember that today as the chorus on the sidelines starts chanting for new matches. They rarely work — and even when they do, they usually mean that the fun is over.

Filed Under: Business, Media, Personal, Say Everything

MySpace and Geocities — separated at birth

April 23, 2009 by Scott Rosenberg

Once upon a time, there was a Web company that was based not in Silicon Valley but in Santa Monica. It grew at a breathtaking rate. All of its content was created by its users, and though the pages those users created tended to look jumbled and messy, there was an enthusiasm embedded in all that busy-ness, and a fannish passion for pop-cultural pursuits. The company built up such a sheer momentum of traffic that a much bigger company was persuaded to acquire it for a massive sum of money at the height of a speculative Internet frenzy.

This story sounds like that of MySpace, the once-hot social-networking site for bands and their fans that Rupert Murdoch purchased in 2005. Once “the most popular Website in America,” as the title of a recent book had it, MySpace has been left in the dust by Facebook and Twitter in terms of innovation and growth. MySpace is in the news this week because Murdoch and his henchmen have just shown the door to the site’s founding duo, Chris DeWolfe and Tom Anderson, and replaced them with a former Facebook exec. It’s a recession out there, and Murdoch, who somehow believes that MySpace can be his entree to digital power, is eager to turn it around and demonstrate that it can become the online cash cow he has always dreamed of. Good luck there; I think that, even though Murdoch got MySpace for what many considered a bargain price (of around $500 million), it will prove an albatross around his corporate neck.

In fact, though, MySpace isn’t the company I was thinking of in that first paragraph. I was telling the story of Geocities — the MySpace of 1997-1999. Geocities was the most successful of the “build your own website” companies of the mid-90s (there were others, like Angelfire). Before there were blogs, there were Geocities pages, which were sort of like blogs except without the software to manage your content. Geocities pages were easy to build and really difficult to maintain. As a result, Geocities was populated fast — and nearly as quickly became a vast wasteland of abandoned digital real estate. It must have looked good on paper to the bizdev people at Yahoo in 1999, though, because they paid an astonishing $2.87 billion (in bubble-inflated Yahoo stock) for the ramshackle enterprise.

A decade later, Yahoo’s current management — facing tough times and after many rounds of layoffs — has decided to shut Geocities down. I don’t think there are too many people who will cry for this relic of a bygone era.

What I’m thinking is, there’s every reason to think MySpace will follow a similar trajectory, no matter how many executives huff and puff to try to reinflate its sagging appeal. If that’s the case, look for News Corp. to turn off its lights sometime in 2015 — about a decade after Murdoch’s ill-advised acquisition.

BONUS LINK: Harry McCracken surveys the top 15 Web properties of 1999 and asks, where are they now?

Filed Under: Business, Media, Net Culture

Mark Penn’s fuzzy pro-blogging stats

April 21, 2009 by Scott Rosenberg

I did a lot of digging into the numbers on blogging for my book, so I’m on alert when I read a piece like Mark Penn’s look at pro blogging in the Wall Street Journal, which is getting lots of attention this morning. A little skepticism is definitely in order.

Here’s the nub of hard numbers in Penn’s piece:

The best studies we can find say we are a nation of over 20 million bloggers, with 1.7 million profiting from the work, and 452,000 of those using blogging as their primary source of income. That’s almost 2 million Americans getting paid by the word, the post, or the click — whether on their site or someone else’s.

Where do these numbers come from?

“20 million bloggers” links to a 2008 report from Emarketer that costs $695 if you actually want to know how they got their numbers (I confess I haven’t made the investment).

“1.7 million profiting” links to a promotional page for BlogWorld Expo that cites no source at all for its data.

“452,000 of those using blogging as their primary source of income” is drawn from a Mediabistro rewrite of numbers from Technorati’s State of the Blogosphere reports. Technorati’s is the longest-running, most consistent, and most valuable series of blogging studies we have, but like any study’s numbers, they can easily be misrepresented: here, Penn relies on them for the datum that bloggers who reach 100,000 uniques a month can earn $75K a year. But if you read the source, you find this:

The average income was $75,000 for those who had 100,000 or more unique visitors per month (some of whom had more than one million visitors each month). The median annual income for this group is significantly lower — $22,000.

In other words, the $75K average is skewed by a handful of outlier successes; the typical blogger who draws 100,000 uniques a month earns more like $22,000. Here, the median is far more relevant than the average. Penn, of all people, knows this.
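
To see how lopsided that skew can be, picture ten bloggers in that traffic bracket (the figures here are invented for illustration): nine earning $20,000 apiece and one earning $570,000. The group’s average income is $75,000; its median is $20,000.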

Later on, Penn’s piece cites other sources, including a Pew study and this iLibrarian post which references a 2008 study by an outfit called BIGResearch. The BIGResearch study particularly flummoxed me as I was researching my book, and in email correspondence with a company representative I got to the root of the oddness of their numbers: Their study defined “blogger” as, basically, anyone who writes or reads a blog. That’s one way to muddy the waters!

The methodology of Penn’s piece seems to be: gather as many numbers as you can and don’t worry about the fact that they are from many different sources at different times using different methodologies and even differing definitions of what it means to “be a blogger” — just toss them all together and start drawing conclusions. Those conclusions, in turn, seem to be based on a misapprehension that bloggers are by definition opinion writers. Many are, to be sure; but many others — particularly in the “pro blog” world Penn focuses on — concentrate on becoming expert sources in a particular area, or informational services, or link reviews.

My suggestion to Penn (whom — full disclosure — I briefly worked for, decades ago, during my college years, when he was starting his company): You should commission a real study of blogging, using real sampling techniques, and share the results with the world. No one has done this yet that I’m aware of. You know how to do it! And we’d get a lot better information than this crazy-quilt pastiche of mix-‘n’-match stats.

UPDATE: Penn has posted an addition to his column that goes into more detail about the numbers. “I was surprised at how few studies there are on this,” he writes, “and I believe there definitely should be more. So perhaps in the future I will do some original research, but for this piece we took the best we could find and referenced every number so people would know where they came from.”

Filed Under: Blogging, Business, Media

Should Google pay a tax to media corporations?

April 20, 2009 by Scott Rosenberg

Returning from a mostly-offline spring break vacation, I find that the future-of-news debate has been going round in circles. In the most interesting turn of the wheel, Nick Carr weighed in with an elaboration of his argument that Google is a vampiric middleman, sucking the lifeblood from the media industry. His take on this trope is more sophisticated than the usual “Google took our ads, make them pay!” line from the newsroom diehards, and worth a look.

Carr quotes the point I made recently — that participation in Google’s search engine is voluntary, and any news outlet that wishes to opt out can do so easily — but suggests that this is an oversimplification:

When a middleman controls a market, the supplier has no real choice but to work with the middleman — even if the middleman makes it impossible for the supplier to make money. Given the choice, most people will choose to die of a slow wasting disease rather than to have their head blown off with a bazooka. But that doesn’t mean that dying of a slow wasting disease is pleasant.

The problem with Carr’s middleman theory is that it, too, is an oversimplification. It presupposes that the problem news organizations have with Google is that it “gets between them and their readers.” This assumes that the readers were already visiting the media companies’ websites and that Google is interposing itself. But anyone who’s ever looked at a media website’s traffic report knows that Google traffic most often represents something precious for media businesses — new blood, first-time visitors, what the direct-marketing business calls “qualified leads.” In most other lines of business, companies would pay for that kind of traffic; Google delivers it to the media sites for free.

In fact, media companies are not end-parties to transactions that Google is interfering with: they are middlemen, too, and in more than one kind of transaction. They sit in the middle between readers and the information readers seek; they also sit in the middle between advertisers and the customers those advertisers seek to reach.

Carr, standing in the shoes of the aggrieved media executive, sees Google as stepping in between the media outfit and its readers, grabbing a cut of the revenue. But put on the reader’s shoes and things look different: Google isn’t introducing an additional middleman layer but simply substituting its own, newfangled method of connecting readers with information and advertisers with readers. And if that version happens to suit the medium of the Web and win the allegiance of readers, what right do media executives have to our sympathy, or to a “fair share” of Google’s revenue? They had decades to become Google themselves, had they chosen to.

Carr paints Google as a conventional middleman — an extractor of existing value. But Google, with its efficient, targeted text-link advertising, has actually added value to pages that previously could not be valued at all. Sure, media companies wish they’d done that themselves — I wish I’d done it, too. Now that Google has done so, they have a right to their chagrin; but they don’t have a right to a cut.

So yes, Google is a middleman of sorts, but not in the way your car dealer is a middleman. It doesn’t buy cheap goods from a supplier to mark up for a consumer. Its role in the economic system of the Web has been fundamentally additive: it has (at least in terms of its primary product, the search engine) contributed new value rather than skimming existing value.

This is when the Google Tax crowd cries, “But Google News is stealing our headlines!” Let’s put aside the fair-use argument for a minute and also defer the “incoming links have their own value” point. Even if the “Google News is theft” people were right, they are fighting over (relative) crumbs. News people who focus their ire on Google often choose to eye the company’s vast profits, mostly earned from its enormous search traffic, and then — in a rhetorical dodge that is either ignorant or disingenuous — pretend that most of those profits are earned from Google News.

In truth, Google News is an interesting but relatively small experiment to assemble a news page via algorithm rather than editor. As Google CEO Eric Schmidt seemed to admit to Maureen Dowd last week, that experiment has to date been a failure:

When I ask if human editorial judgment still matters, he tries to reassure me: “We learned in working with newspapers that this balance between the newspaper writers and their editors is more subtle than we thought. It’s not reproducible by computers very easily.”

The relevant point about Google News is that it represents a tiny sliver of Google’s business — it’s a pimple, at best a big pimple, on the balance sheet. I’m sorry to break this news to all the editors and publishers who are clamoring for a share of Google News’s revenue, but they should know that money is not going to save their businesses. And if what they’re really demanding is that Google give them a share of its total search-based revenue for the right to use headlines and snippets of news articles in Google News, then they’re batty. And they have no leverage, because Google can rightly say, “You can walk any time you wish.”

Many newspaper people seem to be under the impression that if Google, and Craigslist, and (fill in your favorite Web shibboleth here) had never been invented, then everything would be OK, and they would be free to transplant their old business model into the new medium. This is delusional. If Larry Page and Sergey Brin hadn’t invented a search engine that really works, and wedded it to a targeted advertising system, somebody else would have. If Craig Newmark hadn’t built a community of free classified advertising, somebody else would have. These functions are made possible by the nature of the Web, and they were both visible and inevitable by 1997 or so. It is the Web itself that unbundles the media industry’s products and undermines its old business model, not the actions of the handful of innovators who saw the Web’s potential and built on it.

Carr takes a long cynical view, arguing that everything will calm down and the media business will recover once the news industry’s present overproduction crisis ebbs and scarcity returns to the information marketplace. This is, for instance, what my old boss Steve Brill is trying to do with his latest venture. I wish him luck, but I think it will be a total failure. It is only the latest genie-stuffing exercise in a world where the bottle itself is busted.

Scarcity will never return to the information marketplace, at least not in its old familiar broadcast-era form. It is too cheap to distribute news today. Producing certain kinds of news remains a costly undertaking, and we’re still figuring out new models to support it, in a rocky transition that is rightly causing a lot of nail-biting. But those new models are unlikely to resemble the ones that worked in an era when distribution could be controlled by the producers themselves — when the media executive could control both production and distribution and dictate terms to both readers and advertisers. And whatever new models emerge, they are unlikely to provide last century’s monopoly profits.

The Web of Google, Craigslist and you and me is certainly a less hospitable place for the New York Times and CBS and Rupert Murdoch. But in the long run it will be a more interesting, more diverse and healthier environment for the rest of us. In some ways it already is.

See also Mathew Ingram’s response to Carr, in which he offers a parallel argument that Google’s middleman “power” doesn’t reduce the power of content producers but instead amplifies it.

Filed Under: Business, Media

The OPEC plan for newspapers

April 9, 2009 by Scott Rosenberg

It’s turned into the silly season here in future-of-journalism land, what with the AP’s muddled new campaign to try to stop websites from linking to its content and the latest wave of cockamamie plans to save newspapers by (take your pick) putting them on the government dole, seizing some of Google’s profits to pay their bills, or organizing a sort of journalistic OPEC to begin jacking up the price of news online.

There are a few important facts that always seem to get lost in the broadsides that present these save-our-papers plans. One of these regards Google, which is widely seen among old-school journalists as the evil force that ate the newspaper industry’s profits by stealing its headlines without paying for them. The truth is that any newspaper website — indeed, any website at all — can stop Google from indexing and linking to it by adding a couple of lines to its “robots.txt” file that tell the Googlebot to go away. If you don’t understand what that means, it doesn’t matter; all you need to know is that participation in Google is voluntary.
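
For the curious, the standard exclusion looks something like the following — two lines in the site’s robots.txt file that tell Google’s crawler to skip the entire site:

User-agent: Googlebot
Disallow: /

Delete those lines and the Googlebot comes right back on its next crawl.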

Participation is also pretty much universal, because of the benefits. When users are seeking what you have, it’s good to be found. Newspaper sites, like most sites, don’t generally go the “robots.txt” exclusion route because they want Google to send people their way. But no one, Google or otherwise, is forcing any news organization to allow Google to link in.

The Google traffic is generally welcomed because it’s usually newcomers — site visitors who aren’t already part of the regular audience but who might become regulars if they like what they see. Over at the Wall Street Journal — the one major newspaper that has built a significant business out of charging for its articles — this influx of Google-directed eyeballs is apparently so valuable that the newspaper will actually slice a hole in its pay wall for Google-referred visitors to walk right in.

None of these realities seems to weigh in the scales for the new wave of “stop giving away the news” visionaries. Today’s entrant, newspaper consultant John Morton, writing in the American Journalism Review, is no different from his predecessors. Morton wants to see all American newspaper websites decide to shut their gates to non-paying visitors on July 4. Just organize this cartel and watch the profits return.

In reality, such a move would be suicidal: it would decimate these sites’ traffic while only marginally increasing their revenue. It would also hasten the evolutionary development of alternative, Web-only news organizations and business models that will be entirely disconnected from the old world of paper.

What all such plans fail to understand is that no website can succeed unless it is participating in the core activities of the Web — linking and sharing. These activities are not diverting bells and whistles; they are the heart of the medium. When you cut yourself off from the rest of the Web you’re not just giving up some minor side-benefit; you’re abandoning the fundamental distribution model of the medium — like publishing a newspaper but leaving it on the truck.

This is dead-end thinking. If you don’t believe that, ask the Wall Street Journal’s editors why they let you in for free when you click on a Google link.

Filed Under: Media

When MP3 was young

April 2, 2009 by Scott Rosenberg

In early 2000 I got a call from a producer at Fresh Air, asking if I’d like to contribute some technology commentary. Fresh Air is, to my mind, one of the very best shows on radio, so yes, I was excited. For my tryout, I wrote a brief piece about this newfangled thing called MP3 that was just beginning to gain popularity. We’d been covering the MP3 scene at Salon since 1998, but it was still a novelty to much of the American public. I went down to KQED and recorded it. As far as I knew, everyone liked it. But it never aired. I had four-month-old twins at home and a newsroom to manage at work. I forgot all about it.

In a recent file-system cleanup I came across the text of the piece, reread it, and thought it stood up pretty well. The picture it presents — of a future for music in which its enjoyment is divorced from the physical delivery system — has now largely come to pass. But at the time I was writing, the iPod was 18 months or so in the future; the iTunes store even farther out; the “summer of Napster” still lay ahead; and the record labels’ war on their own customers was still in the reconnaissance phase.

Here it is — a little time capsule from a bygone era, looking forward at the world we live in today:

The phonograph I had as a kid played records at four different speeds. 33 was for LPs, 45 was for singles. There were two other speeds, 16 and 78, but I had no idea what they were for — they made singers on regular LPs sound like they’d sunk to the ocean floor or swallowed helium. Later I learned that the 78 speed was for heavy old disks, mostly from the ’20s, ’30s and ’40s; I’m still not clear what 16 was all about.

These old-fashioned playing speeds represented what, in today’s era of rapid obsolescence, we’d call “legacy platforms” — outmoded technologies that are no longer in wide use. The phonograph itself became a “legacy platform” in the 1980s with the advent of the compact disk. Now it’s the CD’s turn, as the distribution of music begins to move onto the Internet.
[Read more…]

Filed Under: Media, Music, Personal, Technology

What’s in a middle initial?

March 20, 2009 by Scott Rosenberg

One of the first things I learned as a rookie reporter was to ask everyone I interviewed how to spell their names and what their middle initials were. Who cared about the middle initial? Mostly, nobody. But obtaining it, the reasoning went, was a sign — to both the interviewee and, later on, your readers — that you cared about the details and could be trusted to get them right.

I still care about details and aim to spell names right. Mostly, I don’t bother with middle initials any more. Still, I take note when I see a Web writer who does. So I perked up while I was reading a breezy but lengthy piece titled “Die, Newspaper, Die” by Mark Morford, a columnist at SFGate, the Web site of the foundering SF Chronicle. In his piece, Morford attempts to sum up the latest round in the Web’s discussion of post-newspaper journalism. He comes down on all sides at once, but with a definite leaning towards the value of the old pros, the sort of reporters who still bother to ask for middle initials:

In the howling absence of all the essential, unglamorous work newspapers now do — the fact-checking, interviewing, researching, all by experienced pros who know how to sift the human maelstrom better than anyone, and all hitched to 100+ years of hard-fought newsbrand credibility — what’s the new yardstick for integrity?

Alas, if including middle initials, and getting them right, is one of those yardsticks, Morford comes up short. For some reason, in referring to Steven Johnson — the widely known writer, founder of the pioneering Feed magazine and more recently Outside.in, and author of a currently much-discussed post on the new journalism ecosystem — Morford calls him “Steven P. Johnson.”

Now, getting a middle initial wrong could happen to anyone. But in Steven’s case, the man’s URL — at stevenberlinjohnson.com — includes his middle name. Morford even links to it.

A tiny thing, no doubt. But in a column whose title is “Notes and Errata,” it really made me wonder how much of that “100+ years of hard-fought newsbrand credibility” is left to salvage.

Filed Under: Blogging, Media

This morning I am in New York

March 18, 2009 by Scott Rosenberg

[Image: New York Post headline]

Filed Under: Business, Media

Berkeley J-School’s Chronicle panel: The horse-and-buggy set’s lament

March 17, 2009 by Scott Rosenberg

[Warning — long post ahead! This happens when one has a transcontinental flight during which to blog.]

A panel at the UC Berkeley School of Journalism that I attended yesterday evening was titled “The SF Chronicle in Transition.” “Transition,” here, is plainly a euphemism; the title ought to have been “The Chronicle In Extremis,” and the mood was that of a wake.

There is plenty of cause for communal handwringing in the face of the wrenching cutbacks and shutdowns that are plaguing newspapers across the U.S. Most recently they have threatened the survival of our major Bay Area daily, which has reportedly been losing its owner, the Hearst Corporation, $50 million a year and which looks likely to cut its staff by half if owners and unions reach an agreement. If not, Hearst has threatened to shut the paper down, leaving this city without a major daily newspaper. (It’s hard to believe that Hearst would simply write off its huge investments in the Chron, however; the threat sounds more like a negotiating tactic than a serious option.)

The panel offered a by now familiar litany, a mixture of wrongheaded cliches and legitimate fears. There was, for instance, the old canard that giving up newspapers for the Web means we won’t ever stumble on things we didn’t know we were interested in. (In fact, hugely popular sites like Boing Boing or Kottke.org have professionalized the generation of serendipity, and our Twitter friends feed us as varied a diet of links as we choose to feast on.) There was the routine complaint about rudeness and “uninformed shouting” in comments forums. (A brief shouting match between one member of the crowd at the Berkeley event and the editor and publisher of the Berkeley Daily Planet — from what I could hear, about whether a writer had been censored — was as rude and off-topic as anything I’ve seen in a newspaper comments section.)

Beyond the usual Web-bashing lay some realistic worries about how we’ll get our local news and who will perform the public-interest watchdog role if newspapers vanish. “We’re in for a real dangerous period where there’s no one watching the store,” Lowell Bergman, the veteran investigative reporter, predicted.

[Read more…]

Filed Under: Blogging, Media
