This post was also published in VentureBeat.
Unlike other category-defining Internet companies, Twitter has struggled to meet both user growth metrics and Wall Street’s expectations. There are a lot of possible explanations for Twitter’s user growth problems, but they really boil down to one simple thing: As the content shared into streams grows exponentially, the streams have to get smarter in order to remain relevant to users.
Twitter presents cards in a straight reverse-chronological stream that shows all content. The more people you follow and the more you use Twitter, the worse the Twitter experience becomes.
Facebook took a very different tack. Back in 2008, Mark Zuckerberg established Zuckerberg’s Law of Information Sharing, which predicted that the rate at which people share information like status updates and photos would double every year. In 2009, Facebook acquired FriendFeed for about $50 million, integrating a team that was using content shared from external sites to learn what users liked and didn’t like. In 2011 and 2012, Facebook poached data science teams from across Silicon Valley to build an increasingly intelligent rules engine called EdgeRank that figured out which posts to show to which users in what order.
Users complained every time there was a change, but Facebook’s relentless focus paid off. Now in 2015, Facebook’s stream automatically notices how long it’s been since you last looked, what types of content you’re interested in, what you like, what you click, and figures out who your close friends are so it can showcase their content. The Facebook ranking algorithm is constantly tweaked and optimized by an increasingly large machine intelligence team. On Facebook, the more people you follow, the better the experience gets, because each connection adds signals to the stream algorithm. The Facebook stream has become so good that brand content is increasingly filtered out, so Facebook has just added a See First option that lets people opt in to brand content.
Google also foresaw that the exponential deluge of information would overwhelm users and in 2011 began to work on Google Now to predict what people would be interested in as a stream of cards; it subsequently shut down other personalized Google attempts like iGoogle. Google Now launched in 2012 and after three years of iterations offers an extremely advanced interface that infers things you need to know, ranging from where you parked your car to fresh information about items you have searched. Google has been very proactive about placing Google Now before you search and answer cards above search results.
In contrast to Facebook and Google, Twitter has stuck with a straight temporal stream that shows all content, no matter how irrelevant. Attempts to overlay features such as the Discover tab and a “while you were away” view did not change how the main Twitter stream works: a torrent of information that quickly slides both interesting and silly posts into obscurity. Attempts to introduce threaded conversations simply duplicated conversations within the same stream.
Twitter’s Dick Costolo recently lamented being too focused on short-term thinking to appease Wall Street. Exhibit #1 was the relentless effort to have users follow more people on Twitter. The early product team at Twitter discovered patterns indicating that users who followed at least 30 people were likely to remain engaged, so they redesigned the product to drive this behavior. Somehow this blossomed into a constant effort to get every user to follow more people. Yes, it was an easy engagement number to show Wall Street. But the more people you follow on Twitter, the worse the experience.
With mobile usage surpassing desktop usage, the constraints of a mobile screen make stream optimization critical. The most relevant and actionable cards need to be on the top.
Although many commenters seem to think that switching to a Flipboard- or Nuzzel-style view would help, Facebook, Google, and others have proven that a stream interface with cards really performs. The real problem Twitter needs to solve is ranking whose posts are important to whom and how well the content is received. Twitter has been acquiring some machine learning teams, but is it too little too late? Perhaps not.
The first step is to remove the bad actors, as Twitter is a veritable bot farm. The second step is sorting the Twitter stream by relevance, with an option to switch between relevant and temporal views, just as Facebook did years ago. After a bit of weaning, few users ever switch back to the real-time feed. Once this switch is made, there is plenty of runway to iterate with users and test what works and what doesn’t. The third step is to aggregate similar posts together so that there is context and the stream doesn’t overflow with near-duplicate content.
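To make the second and third steps concrete, here is a toy sketch of what a relevance-sorted, aggregated stream could look like. Everything in it is an assumption for illustration: the signals (`author_affinity`, `engagement`), the weights, the freshness decay, and the topic-based grouping are invented stand-ins, not Twitter’s or Facebook’s actual ranking model.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str              # hypothetical coarse label used to group similar posts
    author_affinity: float  # 0..1: how close the viewer is to the author (assumed signal)
    engagement: float       # 0..1: how well the post is being received (assumed signal)
    age_hours: float

def relevance(post: Post) -> float:
    """Toy relevance score: blend affinity and engagement, decayed by age.

    The 0.6/0.4 weights and the decay curve are illustrative only."""
    freshness = 1.0 / (1.0 + post.age_hours)
    return (0.6 * post.author_affinity + 0.4 * post.engagement) * freshness

def build_stream(posts: list[Post], relevant: bool = True) -> list[list[Post]]:
    """Sort by relevance (or fall back to temporal order), then fold
    posts on the same topic into one grouped card so the stream does
    not overflow with near-duplicates."""
    key = relevance if relevant else (lambda p: -p.age_hours)
    ordered = sorted(posts, key=key, reverse=True)
    cards: dict[str, list[Post]] = {}
    stream: list[list[Post]] = []
    for p in ordered:
        if p.topic in cards:
            cards[p.topic].append(p)   # aggregate into the existing card
        else:
            cards[p.topic] = [p]
            stream.append(cards[p.topic])
    return stream
```

A fresh, well-received post from a weak tie can outrank an older post from a close friend here, which is exactly the kind of trade-off a real ranking team would iterate on with user testing.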
Twitter has become the newswire of our generation, carrying everything from breaking news about revolutions to interesting content and celebrity crosstalk. Twitter just needs to be sorted into a modern stream, and the user growth and Wall Street accolades will follow.