Wordyard

Hand-forged posts since 2002

Scott Rosenberg


Archives

Lies our bubbles taught us

April 30, 2014 by Scott Rosenberg Leave a Comment


Of course it’s a bubble! Let’s not waste any time on that one.

The fact of a bubble doesn’t mean that all the stuff people in tech are building today is worthless, nor will it all vanish when (not “if”) the bubble deflates.

It does mean that a certain set of reality distortions infects our conversations and our coverage of the industry. If you are old enough to have lived through one or more previous turns of this wheel, you will recognize them. If they’re not familiar to you, herewith, a primer: three lies you will most commonly hear during a tech bubble.

(1) Valuation is reality

When one company buys another and pays cash, it makes sense to say that the purchased company is “worth” what the buyer paid. But that’s not how most tech-industry transactions work. Mostly, we’re dealing in all-stock trades, maybe with a little cash sweetener.

Stock prices are volatile, marginal and retrospective: they represent what the most recent buyer and seller agreed to pay. They offer no guarantee that the next buyer and seller will pay the same thing. The myth of valuation is the idea that you can take this momentary-snapshot stock price and apply it to the entire company — as if the whole market will freeze and let you do so.

Even more important: Stock is speculative — literally. When I buy your company with stock in my company, I’m not handing you money I earned; I’m giving you a piece of the future of my company (the assumption that someday there will be a flow of profits). There’s nothing wrong with that, but it’s not remotely the same as giving you cash you can walk away with.

When one company buys another with stock, the entire transaction is performed with hopes and dreams. This aspect of the market is like capitalism’s version of quantum uncertainty: No one actually knows what the acquired company is “worth” until the sell-or-hold choices that people will make play out over time. Some people might get rich; some might get screwed.

Too often, our headlines and stories and discussions of deals closed and fortunes made ignore all this. Maintaining the blur is in the interests of the deal-makers and the fortune-winners. Which is why it persists, and spreads every time the markets go bananas. Then young journalists who have never seen a tech bubble before sally forth, wide-eyed, to gape at the astronomical valuations of itty-bitty startups, without an asterisk in sight.

This distorted understanding of valuation takes slightly different forms depending on the status of the companies involved.

With a publicly traded company, it’s easy to determine a valuation: Just multiply price per share times the total number of outstanding shares, right? But the more shares anyone tries to sell at any one time, the less likely it is that he will be able to get the same price. The more shares you shovel into the market, most of the time, the further the price will drop. (Similarly, if you decide you want to buy a company by buying a majority of its stock, you’ll probably drive the price up.) So stock valuations are elusive moving targets.

For private companies, though, it’s even worse. Basically, the founders/owners and potential investors sit down and agree on any price they like. They look for rationales anywhere and everywhere, from comparable companies and transactions to history to media coverage and rumor to “the back of this envelope looks too empty, let’s make up some numbers to fill it.” If other investors are already in the game, they’re typically involved too; they’re happy if their original investment is now worth more, unhappy if their stake (the percentage ownership of the company their investment originally bought them) is in any way diluted.

There are all sorts of creative ways of cutting this pie. The main thing to know is, it’s entirely up to owners and investors how to approach placing a value on the company. All that private investment transactions (like Series A rounds) tell you is what these people think the company is worth — or what they want you to think it’s worth.

Also: it’s ridiculous to take a small transaction — like, “I just decided to buy 2 percent of this company for $20 million because I think they’re gonna be huge” — and extrapolate to the full valuation of the company: “He just bought 2 percent of us for $20 million, so we’re now a $1 billion company, yay!”

If you can keep persuading lots of people to keep paying $20 million for those 2 percent stakes, and you sell the whole company, then you’re a $1 billion company. Until then, you’re just a company that was able to persuade a person that it might be worth $1 billion someday. During a bubble, many such people are born every minute, but their money tends to disappear quickly, and they vanish from the landscape at the first sign of bust.
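To see how thin that headline number is, here is a minimal sketch of the arithmetic, using the hypothetical 2-percent-for-$20-million figures from the example above. The “valuation” is just one data point extrapolated across the whole company, not a price anyone has agreed to pay for all of it.

```python
# Minimal sketch of implied-valuation arithmetic, using the hypothetical
# numbers from the example above (2 percent of a company for $20 million).
# It derives the headline figure; it does not mean anyone will pay it.

stake_fraction = 0.02          # the slice of the company just sold
price_paid = 20_000_000        # what the investor paid for that slice

# The "post-money valuation" extrapolates that single data point to 100%:
implied_valuation = price_paid / stake_fraction

print(f"Implied valuation of whole company: ${implied_valuation:,.0f}")  # $1,000,000,000
print(f"Money that actually changed hands:  ${price_paid:,.0f}")         # $20,000,000
```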

(Here’s a classic “X is worth Y” fib from the last bubble of the mid-2000s. Note that Digg was ultimately acquired, years later, for a small fraction of the price BusinessWeek floated in 2006.)

(2) “We will always put our users first”

Many startup founders are passionate about their dedication to their users, in what is really Silicon Valley’s modern twist on the age-old adage that “the customer always comes first.” One of the side-effects of a bubble is that small companies can defer the messy business of extracting cash profits from customers and devote their energy to pampering users. Hooray! That makes life good for everyone for a while.

The trouble arises when founders forget that there is a sharp and usually irreconcilable conflict between “putting the user first” and delivering profits to investors. The best companies find creative ways to delay this reckoning, but no one escapes it. It’s most pronounced in advertising-based industries, where the user isn’t the real paying customer. But even tech outfits that have paying users face painful dilemmas: Software vendors still put cranky licensing schemes on their products, or create awkward tie-ins to try to push new products and services, or block interoperability with competitors even when it would make users’ lives easier. Even hardware makers will pad their bottom lines by putting nasty expiration codes on ink cartridges or charging ridiculous prices for tiny adapters.

But the ultimate bubble delusion is the founder’s pipe-dream that she can sell her company yet still retain control and keep “putting the user first” forever. In the public-company version of this phenomenon, the founder tells the world that nothing will change after the IPO, the company’s mission is just as lofty as ever, its dedication to the user just as fanatical. This certainty lasts as long as the stock price holds up. Legally and practically, however, the company now exists to “deliver value to shareholders,” not to deliver value to users. Any time those goals conflict — and they will — the people in the room arguing the shareholders’ cause will always hold the trump card.

In the private-company version, the founder of some just-acquired startup proudly tells the world that, even though he has just sold his company off at a gazillion-dollar valuation, nothing will change, users will benefit, move right along. Listen, for instance, to WhatsApp founder Jan Koum, an advocate of privacy and online-advertising skeptic, as he “sets the record straight” after Facebook acquired his company:

If partnering with Facebook meant that we had to change our values, we wouldn’t have done it. Instead, we are forming a partnership that would allow us to continue operating independently and autonomously. Our fundamental values and beliefs will not change. Our principles will not change. Everything that has made WhatsApp the leader in personal messaging will still be in place. Speculation to the contrary isn’t just baseless and unfounded, it’s irresponsible. It has the effect of scaring people into thinking we’re suddenly collecting all kinds of new data. That’s just not true, and it’s important to us that you know that.

Koum sounds like a fine guy, and I imagine he totally believes these words. But he’s deluding himself and his users by making promises in perpetuity that he can no longer keep. It’s not his company any more. He can say “partnership” as often as he likes; he has still sold his company. Facebook today may honor Koum’s privacy pledges, but who can say what Facebook tomorrow will decide?

This isn’t conspiracy thinking; it’s capitalism 101. Yet it’s remarkable how deeply a bubble-intoxicated industry can fool itself into believing it has transcended such inconvenient facts.

(3) This time, it’s different

If you’re 25 today then you were a college freshman when the global economy collapsed in 2007-8. If you went into tech, you’ve never lived through a full-on bust. Maybe it’s understandable for you to look at today’s welter of IPOs and acquisitions and stock millionaires and think that it’s just the natural order of things.

If you’re much older than that, though, no excuses! You know, as you should, that bubbles aren’t forever. Markets that go up go down, too. (Eventually, they go back up again.) Volatile as the tech economy is, it is also predictably cyclical.

Most recently, our bubbles have coincided with periods — like the late ’90s and the present — when the Federal Reserve kept interest rates low, flooding the markets with cash looking for a return. (Today, also, growing inequality has fattened the “play money” pockets of the very investors who are most likely to take risky bets.)

These bubbles end when there’s a sudden outbreak of sanity and sobriety, or when geopolitical trouble casts a pall on market exuberance. (Both of these happened in quick succession in 2000-2001 to end the original dotcom bubble.) Or a bubble can pop when the gears of the financial system itself jam up, as happened in 2007-8 to squelch the incipient Web 2.0 bubble. My guess is that today’s bubble will pop the moment interest rates begin to head north again — a reckoning that keeps failing to materialize, but must someday arrive.

It might be this year or next, it might be in three years or five, but sooner or later, this bubble will end too. Never mind what you read about the real differences between this bubble and that one in the ’90s. Big tech companies have revenue today! There are billions of users! Investors are smarter! All true. But none of these factors will stop today’s bubble from someday popping. Just watch.

Filed Under: Business, Technology

Demonetization

November 24, 2012 by Scott Rosenberg Leave a Comment

Buried near the end of John Markoff’s front-page feature in the Times today about “deep learning,” the neural-net-inspired approach to software, is this tidbit, which I think requires no further elaboration but is worth noting, and noting again:

One of the most striking aspects of the research led by Dr. [Geoffrey] Hinton is that it has taken place largely without the patent restrictions and bitter infighting over intellectual property that characterize high-technology fields.

“We decided early on not to make money out of this, but just to sort of spread it to infect everybody,” he said. “These companies are terribly pleased with this.”

Said companies will (a) build a new industry on these openly shared ideas; (b) make fortunes; and then (c) dedicate themselves to locking those ideas up and extracting maximum profit from them.

That’s inevitable and nothing new. Let’s be glad, though, for the occasional Geoffrey Hintons and Tim Berners-Lees, who periodically rebalance the equation between open and closed systems and keep our cycle of technology evolution moving forward.

Filed Under: Business, Technology, Uncategorized

WSJ Social: When news apps want to steal your face

September 24, 2011 by Scott Rosenberg 19 Comments

I read about WSJ Social, the newspaper’s experiment at providing a socially driven version of itself entirely inside Facebook, and thought, hey, I should check it out. So I Googled “WSJ Social” and clicked on http://social.wsj.com. Since my browser was already logged in to Facebook, I was immediately confronted with a Facebook permissions screen. I captured it above for posterity.

Here is the problem: All I want to do is see what WSJ is up to. I might or might not actually want to use the product. But before I can proceed, here is what I’m asked to approve:

(1) “Access my basic information — Includes name, profile picture, gender, networks, user ID, list of friends, and any other information I’ve made public.” Well, this stuff is public already, right? I think I can live with this.

(2) “Send me email — WSJ.com may email me directly…” Hmm. I’m not eager to add to my load of commercial email and there’s no indication of the volume involved. But I’m not hugely protective of my email address — you know, there it is in the image above — so I guess this isn’t a dealbreaker.

(3) “Post to Facebook as me — WSJ.com may post status messages, notes, photos, and videos on my behalf.”

Excuse me? You want to do what?

Forget it, NewsCorp. Ain’t happening.

Now, I fully understand that the app may be up to nothing terribly nasty — some or most of what it wants to do may be routine back-end stuff. But it doesn’t provide me with any confidence-building information. Tell me, WSJ Social: How often are you going to post under my account? And what kinds of messages are you going to send? How will I know you’re not going to spam my friends? How do I know the WSJ’s rabid editorial-page id won’t start posting paeans to Sarah Palin under my name?

Facebook permissions screens may have become as widely ignored as Terms of Service checkboxes and SSL certificate warnings. But the notion of the Journal (or anyone else) insisting on its right to “Post to Facebook as me” before it will even let me examine its news product is simply ridiculous.

UPDATE: On Twitter, WSJ’s Alan Murray responds: “Not going to happen. Standard permissions in order to allow WSJ Social to share stories you ‘like’ with your friends.”

Filed Under: Business, Media

Steve Jobs, auteurs, and team-building

September 7, 2011 by Scott Rosenberg 6 Comments


If you look at my life, I’ve never gotten it right the first time. It always takes me twice.
  — Steve Jobs, in a 1992 Washington Post interview

I first wrote about Steve Jobs as a digital auteur in January 1999, in a profile for Salon that tried, in the near-term aftermath of Jobs’ return from exile to Apple, to sum up his career thus far:

The most useful way to understand what Jobs does best is to think of him as a personal-computer auteur. In the language of film criticism, an auteur is the person — usually a director — who wields the authority and imagination to place a personal stamp on the collective product that we call a movie. The computer industry used to be full of auteurs — entrepreneurs who put their names on a whole generation of mostly forgotten machines like the Morrow, the Osborne, the Kaypro. But today’s PCs are largely a colorless, look-alike bunch; it’s no coincidence that their ancestors were known as “clones” — knockoffs of IBM’s original PC. In such a market, Steve Jobs may well be the last of the personal-computer auteurs. He’s the only person left in the industry with the clout, the chutzpah and the recklessness to build a computer that has unique personality and quirks.

The Jobs-as-auteur meme has reemerged recently in the aftermath of his retirement as Apple CEO. John Gruber gave a smart talk at MacWorld a while back, introducing the auteur theory as a way of thinking about industrial design, and then Randall Stross contrasted Apple’s auteurial approach with Google’s data-driven philosophy for the New York Times.

(Here is where I must acknowledge that the version of the auteur theory presented in all these analyses, including mine, omits a lot. The theory originally emerged as a way for the artists of the French New Wave, led by Francois Truffaut, to square their enthusiasm for American pop-culture icons like Alfred Hitchcock with their devotion to cinema as an expressive form of art. In other words, it was how French intellectuals justified their love for stuff they were supposed to be rejecting as mass-market crap. So the parallels to the world of Apple are limited. We’re really talking about “the auteur theory as commonly understood and oversimplified.” But I digress.)

Auteurial design can lead you to take creative risks and make stunning breakthroughs. It can also lead to self-indulgent train wrecks that squander reputations and cash. Jobs has certainly had his share of both these extremes. They both follow from the same trait: the auteur’s certainty that he’s right and willingness (as Gruber notes) to act on that certainty.

Hubris or inspiration? Either way, this kind of auteur disdains market research. “It isn’t the consumers’ job to know what they want,” Jobs likes to say. Hah hah. Right. Only that, the democratic heart of our culture tells us with every beat, is precisely the consumer’s job. To embrace Jobs’ quip as a serious insight is to say that markets themselves don’t and can’t work — that democracy is impossible and capitalism one colossal fraud. (And while that’s an intriguing argument in its own right, I don’t think it’s what Jobs meant.)

I have to assume what Jobs really means here is that, while most of us know what we want when we’re operating on known territory, there are corners that we can’t always see around — particularly in a tumultuous industry like computing. Jobs has cultivated that round-the-corner periscopic vantage for his entire career. He’s really good at it. And so sometimes he knows what we want before we do.

I find nothing but delight in this. I take considerable pleasure in the Apple products I use. Still, it must be said: “I know best” is a lousy way to run a business (or a family, or a government). It broadcasts arrogance and courts disaster. It plugs into the same cult-of-the-lone-hero-artist mindset that Apple’s ad campaigns have celebrated. It reeks of Randian ressentiment and adolescent contempt for the little people.

Jobs’ approach, in Jobs’ hands, overcame this creepiness by sheer dint of taste and smarts. There isn’t anyone else in Apple’s industry or any other who is remotely likely to be able to pull it off. If what Jobs’ successors and competitors take away from all this is that “we know best” can be an acceptable business strategy, they will be in big trouble.

But there’s a different and more useful lesson to draw from the Jobs saga.

The salient fact about the arc of Jobs’ career is that his second bite at Apple was far more satisfying than his first. Jobs’ is a story that resoundingly contradicts Fitzgerald’s dictum about the absence of second acts in American life. In a notoriously youth-oriented industry, he founded a company as a kid, got kicked out, and returned in his 40s to lead it to previously unimaginable success. So the really interesting question about Jobs is not “How does he do it?” but rather, “How did he do it differently the second time around?”

By most accounts, Jobs is no less “brutal and unforgiving” a manager today than he was as a young man. His does not seem to be a story of age mellowing youth. But somehow, Jobs II has succeeded in a way Jobs I never did at building Apple into a stable institution.

I’m not privy to Apple-insider scuttlebutt and all I really have are some hunches as to why this might be. My best guess is that Jobs figured out how to share responsibility and authority effectively with an inner circle of key managers. Adam Lashinsky’s recent study of Apple’s management described a group of “top 100” employees whom Jobs invites to an annual think-a-thon retreat. Jobs famously retained “final cut” authority on every single product. But he seems to have made enough room for his key lieutenants that they feel, and behave, like a team. Somehow, on some level, they must feel that Apple’s success is not only Jobs’ but theirs, too.

Can this team extend Jobs’ winning streak with jaw-droppingly exciting new products long after Jobs himself is no longer calling the shots? And can an executive team that always seemed like a model of harmony avoid the power struggles that often follow a strong leader’s departure? For now, Jobs’ role as Apple chairman is going to delay these reckonings. But we’re going to find out, sooner or later. (And I hope Jobs’ health allows it to be way later!)

If Apple post-Jobs can perform on the same level as Apple-led-by-Jobs, then we will have to revise the Steve Jobs story yet again. Because it will no longer make sense to argue over whether his greatest achievement was the Apple II or the original Mac or Pixar or the iPod or the iPhone or the iPad. It will be clear that his most important legacy is not a product but an institution: Apple itself.

Filed Under: Business, Technology

Why the Daily, Murdoch’s “tablet newspaper,” will be DOA

November 21, 2010 by Scott Rosenberg 72 Comments

When I first heard the phrase “iPad newspaper” — shorthand for Rupert Murdoch’s not-so-secret-any-more new project — I puzzled over its oxymoronic implications. Forget about the, you know, iPad/paper contradiction and think about the business. Murdoch is reportedly spending $30 million on this thing. Could that possibly pay off with a product that’s tethered to a single, new platform? Puzzled, I tweeted, “Will they stop me from reading it on my desktop?”

Apparently, the answer is yes. The Guardian writes that this new publication will feature “a tabloid sensibility with a broadsheet intelligence” (funny, that’s pretty much how David Talbot described Salon when we started it!) and tells us:

According to reports, there will be no “print edition” or “web edition”; the central innovation, developed with assistance from Apple engineers, will be to dispatch the publication automatically to an iPad or any of the growing number of similar devices. With no printing or distribution costs, the US-focused Daily will cost 99 cents (62p) a week.

Now, these “reports” (and the Guardian) may be unreliable here; we won’t know for sure till Murdoch unveils his product. But taking these rumors at face value, it sounds like Murdoch intends to deliver his latest news baby into a tablet-only world. A Monday column by David Carr confirms the report and adds some detail: The publication is to be called The Daily, and it will, apparently, be just that: “It will be produced into the evening, and then a button will be pushed and it will be ‘printed’ for the next morning. There will be updates — the number of which is still under discussion — but not at the velocity or with the urgency of a news Web site.” Wonderful! Slower news — and at a higher price.

First, let’s give Murdoch credit for what’s intelligent about this plan. It’s smart to ditch the original “legacy” of paper and the more recent legacy of website publishing — to build something fresh for a new platform rather than do the old shovelware dance. And it’s smart to jump in relatively early, to snag users when the tabula is still rasa.

For Murdoch, I have to imagine there is also something personal about this project. I’m sure he is furious that he has so far failed to extend his record of success and dominance, unbroken in other media, into the digital world. The iPad must look to him like his latest, best, and perhaps last chance to do so, after the humiliating embarrassment of his MySpace investment and the apparent trainwreck of the Times UK paywall.

But how likely is it that any significant number of people will pay $50 a year (or a bit less, assuming a subscription discount) for what is likely to be an above-average but hardly essential or irreplaceable periodical? It’s not as if iPad users lack existing sources of online news, innovative delivery mechanisms for information, or stuff to read. iPad users love their browsers; the device is great for reading the free Web.

Murdoch will need more than half a million people to pay that fee to cover a $30M budget (less if he can sell ads), so maybe the thing will work. I’ll bet against it, though, assuming it’s as the Guardian and Carr describe it. I’ll base my bet on the same logic that I’ve long articulated about why paywalls are a bad idea (the problem is not with the “pay” but with the “wall”).
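For what it’s worth, here is a quick back-of-the-envelope check on that figure, assuming the reported 99-cents-a-week price and $30 million annual budget (both of which could change before launch):

```python
# Rough break-even sketch, assuming the reported numbers: 99 cents a week
# per subscriber and a $30 million annual budget, with no ad revenue.

weekly_price = 0.99
annual_budget = 30_000_000

revenue_per_subscriber = weekly_price * 52            # about $51.48 per year
breakeven_subscribers = annual_budget / revenue_per_subscriber

print(f"Annual revenue per subscriber: ${revenue_per_subscriber:.2f}")
print(f"Subscribers needed to break even: {breakeven_subscribers:,.0f}")
# Roughly 583,000 paying subscribers: "more than half a million," as above.
```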

Why do people love getting their news online? It’s timely, it’s convenient, it’s fast — all that matters. Murdoch’s tablet could match that (though it sounds like it may drop the ball on “timely” and “fast”). But even more important than that, online news is connected: it’s news that you can respond to, link to, share with friends. It is part of a back-and-forth that you are also a part of.

Murdoch’s tablet thingie will be something else — a throwback to the isolation of pre-Web publications. Like a paywalled website, this tablet “paper” will discourage us from talking about its contents because we can’t link to it. In other words, like a paywalled site, it expects us to pay for something that is actually less useful and valuable than the free competition.

It’s possible, of course, that the creators of Murdoch’s tablet publication will try to turn it into a true interactive project — where interactive doesn’t mean “buttons you can click on” but rather “people you can interact with.” If they’re smart, they’ll try to build a community within their walls. But that’s a very difficult goal to achieve even if you embrace it wholeheartedly. At big media companies like News Corp., this idea is more often an afterthought than a priority.

Much more likely, the Murdoch project will make the same mistake so many big-media-backed digital ventures have made before. It will assume that its content is so unique, its personality so compelling, its information so rich that readers will regard it as essential. Yet even if it is a really good digital periodical — and it might be! — it is hard to imagine what News Corp. can do to make it that essential, in a world awash in news and information.

(Carr reports that “Initially, there will be a mirror site on the Web to market some of its wares outside the high-walled kingdom of apps.” I’ll bet that over time this mirror site will either grow to be the “real” Daily, as editors realize the free numbers dwarf the pay numbers, or they will pull up the drawbridge completely to try to force a few more customers to pay. It’s Slate 1998 all over again! Will we get Daily umbrellas?)

Now, I know a lot of my friends in journalism are rooting for Murdoch here because they see the pay-for-your-apps iPad model as a deus ex machina that will intervene to save the threatened business model of the old-school newsroom. (Carr’s column weighs the pros and cons here well.)

If you’ve read this far you know I think that’s unlikely. I also think it’s undesirable. On this, I stand with Tim Berners-Lee — who did the primary work in creating the Web two decades ago.

I followed the coverage of Murdoch’s venture around the same time I read Berners-Lee’s great essay on the 20th anniversary of the open Web, so I’ll let him have the last word:

The tendency for magazines, for example, to produce smartphone “apps” rather than Web apps is disturbing, because that material is off the Web. You can’t bookmark it or e-mail a link to a page within it. You can’t tweet it. It is better to build a Web app that will also run on smartphone browsers, and the techniques for doing so are getting better all the time.

Some people may think that closed worlds are just fine. The worlds are easy to use and may seem to give those people what they want. But as we saw in the 1990s with the America Online dial-up information system that gave you a restricted subset of the Web, these closed, “walled gardens,” no matter how pleasing, can never compete in diversity, richness and innovation with the mad, throbbing Web market outside their gates. If a walled garden has too tight a hold on a market, however, it can delay that outside growth.

UPDATE: Sam Diaz at ZDNet shares my skepticism. I originally avoided writing here about the reports suggesting that Steve Jobs was personally involved in the NewsCorp Daily project and had loaned Murdoch an engineering team; they appeared to be super-thinly sourced. Diaz agrees. We’ll all know soon enough.

In the meantime, let me take gentle issue with the concern both Diaz and Carr raise about the size of the Daily staff. Diaz asks: “Can a team of 100 reporters covering everything from Hollywood to Washington really dig in deep enough to produce the type of content worthy of that paid subscription?” Short answer: If 100 can’t, then 500 couldn’t, either. Carr: “How do you put out an original national newspaper every day with a staff of only 100?” Short answer: You don’t try to cover everything, but you cover what you do cover so originally and engagingly that people can’t resist.

Come on, people: 100 journalists is a huge newsroom as long as you’re not trying to be a “paper” — er, “tablet” — of record. If anything, it’s too big. The key, of course, lies in who those 100 people are, and how you deploy them. The problems with the Daily don’t lie in how much Murdoch is spending or how many bodies he’s hiring, but rather with some of the central premises of the project.

Filed Under: Business, Media

“Your map’s wrong”: Zuckerberg lights out for the territories

November 17, 2010 by Scott Rosenberg 4 Comments

It’s hard to think of a more meaningful recent exchange in the tech-industry world than the moment onstage at Web 2.0 last night when Facebook’s Mark Zuckerberg turned to conference organizers John Battelle and Tim O’Reilly and told them, “Your map’s wrong.” (I was sorry not to be there in person! I went to the first several Web 2.0 conferences but have recently tried to reduce conference attendance in an effort to Get Things Done instead.)

Zuckerberg was referring to a big map on the wall behind him that charted the conference’s theme of “points of control.” Battelle and O’Reilly had aimed to provide a graphic display of all the different entities that shape and limit our experience online today. It’s a useful exercise in many ways. But Zuckerberg argued that it was wrong-headed in describing an essentially closed system.

Here’s the full exchange, which you can watch below:

ZUCKERBERG: “I like this map that you have up here, but my first instinct was, your map’s wrong.”

BATTELLE: “Of course it’s wrong, it’s version one.”

ZUCKERBERG: “I think that the biggest part of the map has got to be the uncharted territory. Right? One of the best things about the technology industry is that it’s not zero sum. This thing makes it seem like it’s zero sum. Right? In order to take territory you have to be taking territory from someone else. But I think one of the best things is, we’re building real value in the world, not just taking value from other companies.”

Now, of course it’s in Zuckerberg’s interest to make this argument. And it would be disingenuous to maintain that Facebook isn’t engaged in some real direct competition with the other big Net-industry players today. As Tim Wu’s new book reminds us, the cycle of communications-technology innovation runs in a regular pattern in which innovators become monopolists and monopolists exact their tolls. Facebook, like its predecessors, is likely to proceed accordingly.

Nonetheless, I think Zuckerberg’s larger point is profoundly right. He found a way to remind us of something that was true when I started creating websites 15 years ago and that’s still true today: It’s still early in this game, and the game itself continues to grow. The portion of the online realm that we’ve already invented is still a mere fraction of the total job of creation that we’ll collectively perform. There is more world to come than world already made.

I find that I regularly need to remind myself of this every time I’m thinking of starting something new. When I started the Salon Blogs program in 2002 I worried that we were late arrivals to that game. Blogs had been around forever — I’d been reading them for five years! We shouldn’t forget that at the time of Google’s founding in 1998, search was considered old hat, a “solved problem.” Similarly, Facebook itself could have seemed a johnny-come-lately five years ago, coming as it did on the heels of Friendster, Orkut and MySpace.

The Net is still young and what we do with it and on it remains an early work in progress. The “uncharted territory” still beckons those who enjoy exploring. And it may be that one secret of Zuckerberg’s and Facebook’s success is that they aren’t losing sight of this truth as they plunge into the technology industry’s crazy scrum.

Here’s TechCrunch on Zuckerberg’s interview. And here’s the full video, linked to start at the 52:30 mark where the map discussion occurred:

Filed Under: Blogging, Business, Technology

Is Daily Beast really losing $10 million a year?

November 15, 2010 by Scott Rosenberg 3 Comments

I watched the mini-circus of media coverage that accompanied Friday’s announcement of the Newsweek/Daily Beast merger, and joined in the name mashup fun (I favor the Daily Week). Like a lot of people, I also scratched my head: Lashing two money-losing operations together doesn’t seem all that smart.

But one question kept nagging me. Everyone was pegging Daily Beast’s annual losses at $10 million. I spent a good number of years helping manage the budget of a Web news operation that resembles the Daily Beast’s in many ways (though, even at the height of the dotcom bubble, we never had a Tina Brown-sized salary to pay). And $10 million is an awful lot of money to lose on a digital-only outfit with essentially no distribution costs and a parent company (IAC) to handle the back end.

I mean, it’s possible to imagine burning through that kind of money on a company with 70 employees. It helps if (a) you have no revenue and (b) you pay people lavishly, throw tons of parties, give everyone an outsize travel budget, and spend a fortune on marketing — all of which Brown is entirely capable of.

Still, a loss of $10 million? I wanted to know where that number came from.

I spent some time over the weekend looking around, and as far as I can tell, this figure has only one source, in a Wall Street Journal story that ran last month:

The Daily Beast is expected to lose about $10 million this year, said a person in the know; executives say it’s on a pace to be profitable in two years.

So all we have behind this much-bruited-about number is a single anonymous “person in the know,” who might be the Daily Beast’s accountant or might be Tina Brown’s hairdresser. (Meanwhile, those “profitable in two years” predictions are not worth the breath required to utter them; every money-losing media company has a plan to be profitable in two years.)

Yet the Beast’s $10 million loss has now graduated from this thinly sourced ballpark figure to become received wisdom in the business press. Saturday it turned up in a New York Times piece:

It is an epiphany Mr. Diller most likely came to after seeing no other alternative for eventually turning a profit from The Daily Beast, which is losing on the order of $10 million a year.

I’d be curious to know where the Times piece got its hedged “on the order of” figure. Is it just repeating the Journal’s number? If it was independently sourced, why no mention of that? If the story used some back-of-the-napkin reckoning to arrive at the figure, shouldn’t we see it?

There’s no question that reporting on the finances of an outfit like the Daily Beast isn’t easy; execs will never voluntarily share unflattering numbers. (If you look at the SEC filings of Daily Beast parent IAC, you don’t get much help; Tina Brown’s fiefdom is not broken out on a separate line.) Despite reporting directly from within the belly of the Beast, media-reporting star Howie Kurtz, who recently left the Washington Post to join Brown’s outfit, offered no help in his own lengthy piece on the merger.

Still, we could all stand to stop casually throwing around numbers like this unless they are better documented. I’m sure the Daily Beast is losing a nice chunk of change, but it’s not at all clear to me that anyone outside the place has a clue whether that number is $5 million, $10 million or more.

Now that Brown’s losses are going to be mixed up with Newsweek’s, of course, figuring out who’s responsible for any ultimate profit or loss will become much more difficult, even from inside the organization. Brown will get to keep busting budgets for a spell and will no longer bear sole responsibility for the bottom line.

This sort of responsibility dilution was one of the reasons AOL’s savvy management sold out to Time Warner a decade ago — a merger that looked dubious to me from the get-go. I get the same whiff of fear and train-wreck from the Daily Week — only, this time around, the stakes are a lot lower.

Filed Under: Business, Media

What if the future of media is no “dominant players” at all?

October 28, 2010 by Scott Rosenberg 2 Comments

The New Yorker’s John Cassidy recently concluded a skeptical review of the finances of Gawker Media (which I caught up with late) — a piece somewhat ludicrously headlined “Is Nick Denton Really the New Rupert Murdoch?” — by asking the following question:

Can Gawker Media (and other blogging outfits such as the Huffington Post) translate their rapid audience growth into big streams of revenue and profits, thereby becoming dominant players in the news-media business? Or will the established players, which now have sizeable online arms as well as other sources of income (and costs), ultimately come out on top? Therein lies the future.

“The future” has been lying “therein” over and over for the last 15 years, yet it never seems to turn out that way. This kind of thinking drives me nuts — it’s always a zero-sum battle for dominance. (Can the scrappy little new guys grow so powerful that they’ll replace the big old guys? Or will the lumbering big old guys survive and “ultimately come out on top”?) And it always misses the point.

There are many other imaginable scenarios. Here’s the one I think is most likely.

Denton’s Gawker, Huffington Post, and similar-scale ventures won’t “become dominant players.” But those that husband their resources and play their cards smartly will survive, continuing to grow and to figure out the contours of the new media we are all building. They’ll be active, important players, without “dominating” the way the winners of previous eras’ media wars did.

Meanwhile, “the established players” will fall into two groups. Many will collapse under the weight of their legacy costs and dwindling revenues, as so many are already starting to. Others will survive by figuring out, in time, how to cut costs while expanding their online reach.

The survivors in the second group will find that they can be profitable and do good work, but they will hardly have “come out on top.” In fact, as companies, they will come out looking much more like Gawker Media and Huffington Post than today’s Time Inc. or New York Times Company.

(The other factor here is that new “dominant players” may enter from other quarters — just look at the investments AOL and Yahoo are making in content. But I think they’ll find dominance elusive, too.)

In other words, this is a future with no small group of “dominant players,” but maybe a much broader spectrum of modestly successful players. This is because, in a world awash in content, the media business is never going to be as profitable as it was in a world of scarce content. It will be sustainable, but it won’t support the sort of monopoly profits that made it so attractive for seekers after dominance.

It is also a world where there are no more Rupert Murdochs, which would come as a relief.

This outcome is almost entirely inconceivable to New York media insiders and to the reporters whose job, like Cassidy’s, is to cover their world.

The rest of us should cross our fingers and hope that…therein lies the future.

Filed Under: Business, Media

The Web Parenthesis: Is the “open Web” closing?

October 12, 2010 by Scott Rosenberg 24 Comments

Heard of the “Gutenberg parenthesis”? This is the intriguing proposition that the era of mass consumption of text ushered in by the printing press five centuries ago was a mere interlude between the previous era of predominantly oral culture and a new digital-oral era on whose threshold we may now sit.

That’s a fascinating debate in itself. For the moment I just want to borrow the “parenthesis” concept — the idea that an innovative development we are accustomed to viewing as a step up some progressive ladder may instead be simply a temporary break in some dominant norm.

What if the “open Web” were just this sort of parenthesis? What if the advent of a (near) universal publishing platform open to (nearly) all were not itself a transformative break with the past, but instead a brief transitional interlude between more closed informational regimes?

That’s the question I weighed last weekend at Open Web Foo Camp. I’d never been to one of O’Reilly’s Foo Camp events — informal “unconferences” at the publisher’s Sebastopol offices — but I had the pleasure of hanging out with an extraordinary gang of smart people there. Here’s what I came away with.

For starters, of course everyone has a different take on the meaning of “openness.” Tantek Celik’s post lays out some of the principles embraced by ardent technologists in this field:

  • open formats for freely publishing what you write, photograph, video and otherwise create, author, or code (e.g. HTML, CSS, Javascript, JPEG, PNG, Ogg, WebM etc.).
  • domain name registrars and web hosting services that, like phone companies, don’t judge your content.
  • cheap internet access that doesn’t discriminate based on domains

But for many users, these principles are distant, complex, and hard to fathom. They might think of the iPhone as a substantially “open” device because hey, you can extend its functionality by buying new apps — that’s a lot more open than your Plain Old Cellphone, right? In the ’80s Microsoft’s DOS-Windows platform was labeled “open” because, unlike Apple’s products, anyone could manufacture hardware for it.

“Open,” then, isn’t a category; it’s a spectrum. The spectrum runs from effectively locked-down platforms and services (think: broadcast TV) to those that are substantially unencumbered by technical or legal constraint. There is probably no such thing as a totally open system. But it’s fairly easy to figure out whether one system is more or less open than another.

The trend-line of today’s successful digital platforms is moving noticeably towards the closed end of this spectrum. We see this at work at many different levels of the layered stack of services that give us the networks we enjoy today — for instance:

  • the App Store — iPhone apps, unlike Web sites and services, must pass through Apple’s approval process before being available to users.
  • Facebook / Twitter — These phenomenally successful social networks, though permeable in several important ways, exist as centralized operations run by private companies, which set the rules for what developers and users can do on them.
  • Comcast — the cable company that provides much of the U.S.’s Internet service is merging with NBC and faces all sorts of temptations to manipulate its delivery of the open Web to favor its own content and services.
  • Google — the big company most vocal about “open Web” principles has arguably compromised its commitment to net neutrality, and Open Web Foo attendees raised questions about new wrinkles in Google Search that may subtly favor large services like Yelp or Google-owned YouTube over independent sites.

The picture is hardly all-or-nothing, and openness regularly has its innings — for instance, with developments like Facebook’s new download-your-data feature. But once you load everything on the scales, it’s hard not to conclude that today we’re seeing the strongest challenge to the open Web ideal since the Web itself began taking off in 1994-5.

Then the Web seemed to represent a fundamental break from the media and technology regimes that preceded it — a mutant offspring of the academy and fringe culture that had inexplicably gone mass market and eclipsed the closed online services of its day. Now we must ask, was this openness an anomaly — a parenthesis?

My heart tells me “no,” but my brain says the answer will be yes — unless we get busy. Openness is resilient and powerful in itself, but it can’t survive without friends, without people who understand it explaining it to the public and lobbying for it inside companies and in front of regulators and governments.

For me, one of the heartening aspects of the Foo weekend was seeing a whole generation of young developers and entrepreneurs who grew up with a relatively open Web as a fact of life begin to grapple with this question themselves. And one of the questions hanging over the event, which Anil Dash framed, was how these people can hang on to their ideals once they move inside the biggest companies, as many of them have.

What’s at stake here is not just a lofty abstraction. It’s whether the next generation of innovators on the Web — in technology, in services, or in news and publishing, where my passion lies — will be free to raise their next mutant offspring. As Steven Johnson reminds us in his new book, when you close anything — your company, your service, your mind — you pay an “innovation tax.” You make it harder for ideas to bump together productively and become fertile.

Each of the institutions taking a hop toward the closed end of the openness spectrum today has inherited advantages from the relatively open online environment of the past 15 years. Let’s hope their successors over the next 15 can have the same head start.

Filed Under: Business, Events, Media, Net Culture, Technology

Hey Zuck! Hollywood just hacked your profile

October 4, 2010 by Scott Rosenberg 7 Comments


You know those Facebook phishing hacks — the ones where someone gets control of your account and sends phony messages to your friends? “I’m stuck in London! Send money quick!”

I kept thinking of that phenomenon as I watched The Social Network this weekend. Because what filmmakers Aaron Sorkin and David Fincher have done to their protagonist, Facebook founder Mark Zuckerberg, is the moral equivalent of this sort of identity theft.

They have appropriated Zuckerberg’s life story and, under the banner of fidelity to “storytelling” rather than simple documentary accuracy, twisted it into something mirroring their own obsessions rather than the truth. They transform Mark Zuckerberg’s biography from the messy tale of a dorm-room startup’s phenomenal success into a dark vision of a lonely geek’s descent into treachery.

The Social Network takes the labyrinthine and unique origins of Facebook at Harvard and turns them into a routine finger-wagger about how the road to the top is paved with bodies. Sorkin apparently isn’t interested in what makes his programmer-entrepreneur antihero tick, so he drops in cliches about class resentment and nerd estrangement.

In order to make it big, Sorkin’s Zuckerberg has to betray his business-partner friend (Eduardo Saverin). Why is he hungry for success? Sorkin has him wounded by two primal rejections — one by a girlfriend, the other by Harvard’s fraternity-ish old-money “final clubs.” The programming whiz-kid doesn’t know how to navigate the real-world “social network” — get it? — so he plots his revenge.

Many thoughtful pieces have already discussed the movie, and I don’t want to rehash them. I agree more with my friend David Edelstein’s take on the film’s cold triviality than with the enthusiastic raves from other quarters. Go read Lawrence Lessig and Jeff Jarvis for definitive critiques of the film’s failure to take even the most cursory measure of the real-world phenomenon it’s ostensibly about. Here’s Lessig: “This is like a film about the atomic bomb which never even introduces the idea that an explosion produced through atomic fission is importantly different from an explosion produced by dynamite.” Over in Slate, Zuckerberg’s classmate Nathan Heller outlines how far off the mark Sorkin wanders in his portrait of the Harvard social milieu. (Obsessive, brainy Jewish kids had stopped caring about whether they were excluded from the almost comically uncool final clubs at Harvard long before my own time there, and that was quite a long time ago by now.)

It’s Hollywood that remains clubby and status-conscious, far more dependent on a closed social network to get its work done than any Web company today. The movie diagrams the familiar and routine dynamic of a startup business, where founders’ stakes get diluted as money pours in to grow the company, as some sort of moral crime. (That may explain why — as David Carr lays it out — startup-friendly youngsters watch the film and don’t see the problem with Zuckerberg’s behavior, while their elders tut-tut.) Again, this is a Hollywood native’s critique of Silicon Valley; movie finance works in a more static way.

It’s strange to say this, since I am not a fan of Facebook itself — I prefer a more open Web ecology — but The Social Network made me feel sorry for the real Zuckerberg, notwithstanding the billionaire thing. He’s still a young guy with most of his life ahead of him, yet a version of his own life story that has plainly been shaped by the recollections of people who sued him is now being imprinted on the public imagination.

At least Orson Welles had the courtesy to rename William Randolph Hearst as “Charles Foster Kane.” This isn’t a legal issue (John Schwartz details why in today’s Times). But, for a movie that sets itself up as a graduate course in business ethics, it is most certainly a giant lapse of fairness.

In New York, Mark Harris described the film as “a well-aimed spitball thrown at new media by old media,” but I think it’s more than that — it’s a big lunging swat of the old-media dinosaur tail. The Web, of which Facebook is the latest popular manifestation, has begun to move us from a world in which you must rely on reporters and screenwriters and broadcasters to tell your story to one where you get to present your story yourself. (And everybody else gets to tell their own stories, and yours too, but on a reasonably equal footing.) The Social Network says to Zuckerberg, and by proxy, the rest of us who are exploring the new-media landscape: “Foolish little Net people, you only think you’re in control. We will define you forever — and you will have no say!”

In other words, The Social Network embodies the workings of the waning old order it is so thoroughly invested in. It can’t be bothered with aiming to tell the truth about Zuckerberg — yet it uses his real name and goes out of its way to affect documentary trappings, down to the concluding “where are they now?” text crawl.

The movie’s demolition job on the reputation of a living human being is far more ruthless than any prank Zuckerberg ever plotted from his dorm room. For what purpose? When a moviemaker says he owes his allegiance to “storytelling,” usually what he means is, he’s trying to sell the most tickets. I guess that to get where they wanted to go, Sorkin and Fincher just had to step on a few necks themselves.

Filed Under: Business, Culture, Media
