Once upon a time, "retweeting" was something you did by copying the full text of someone else's tweet and adding "RT @username:" to it. If there was space, you could add your own comment in front.
Then Twitter incorporated retweeting into its software — you just pressed a button to share someone else's message with everyone who follows you. Retweets became a lot easier, and overnight the old-school technique became known as a "manual" retweet. For a while, old-school users (like me) stuck with it, out of habit, as a badge of ur-Twitter cool, or because we actually relished the opportunity to add our two cents. Meanwhile, more modern social-media mavens began complaining that it was a bad practice — it messed up their analytics. Some indignantly told us that manual retweets were actually evil self-promotion, because they went out under your own name rather than that of the author of the original tweet.
Today, those of us who still use it occasionally are like drivers who still like a stick shift and a clutch. Don't we know the technology has moved on?
Now Twitter is tinkering again! The latest change is that the "favorite" — a tool most users rely on either to bookmark links they want to return to or to send a little head-nod of acknowledgment out to the tweet's creator — is being put to use in a new way. Twitter is experimenting with showing you tweets from users you do not follow if those tweets are favorited by lots of other users (presumably, lots of other users you follow).
Like most of Twitter's steady evolution since its debut in 2006, this change pushes the service away from early-adopter enthusiasts and toward a bigger crowd. Twitter already has hundreds of millions of accounts and probably tens of millions of actual real people using it. But that's not enough to support Wall Street's expectations; post-IPO Twitter has billions in its eyes.
Quartz's Dan Frommer thinks that's all OK: "Twitter needs to keep growing," he writes, and "if additions like these…could make Twitter useful to billions of potential users, it will be worth rewriting Twitter’s basic rules."
Over at The Next Web, Jon Russell is less enthusiastic, calling the changes "confusing and seemingly unnecessary." "When will Twitter no longer be Twitter?" asks Robinson Meyer in The Atlantic.
Russell argues that, even though you could always look up someone's list of Twitter faves, "favoriting is inherently a private action" — you were saving something for yourself or winking at some specific other person, not broadcasting.
Guess what? Twitter doesn't care. It's doing the same thing that Web platform-builders have been doing since the early days of Web 2.0, when Flickr and Delicious set "public" as the default for bookmarks and photo posts. They did so because there was so much interesting stuff you could surface that way.
Facebook moved that whole dynamic in a different direction by figuring out that if you trained people to share the details of their lives, and then kept changing the rules in ways that made those details more and more public, you could mine a network of billions of people for billions of dollars.
Back in 2007, Jason Kottke observed that what many Web companies do is "take something that everyone does with their friends and make it public and permanent." A decade ago, this technique seemed to be a method of expanding creative horizons, broadening the possibilities of online sharing, and enabling exciting new data mashups. Then we began to see its darker side, as financial incentives drove services to get aggressive about flipping the switch from "private" to "public."
Twitter may find that its "favorites" experiment works well. Maybe it "increases engagement" or improves the experience for casual users. But it is also reminding us, as Facebook's mood experiment did, who is in charge and what their motivations are. They control the vertical; they control the horizontal.