About Me

Peter Yared is the CTO/CIO of CBS Interactive, a top ten Internet destination, and was previously the founder and CEO of four enterprise infrastructure companies that were acquired by Sun, VMware, Webtrends and TigerLogic. Peter's software has powered brands from Fidelity to Home Depot to Lady Gaga. At Sun, Peter was the CTO of the Application Server Division and the CTO of the Liberty federated identity consortium. Peter holds several patents on core Internet infrastructure, including federated single sign-on and dynamic data requests. Peter began programming games and utilities at age 10, and started his career developing systems for government agencies. Peter regularly writes about technology trends and has written for CNET, the Wall Street Journal, BusinessWeek, AdWeek, VentureBeat and TechCrunch.

Many thanks to Bob Pulgino, Dave Prue, Steve Zocchi and Jean-Louis Gassée for mentoring me over the years.

Saturday, January 29, 2011

The iPadification of the Web


This post was also published in VentureBeat.


Design on the Web needs a reboot — and the iPad may provide the push publishers need to toggle the switch. But will smarter-looking online offerings save old media?

Creators of Web content have poured considerable effort into reinventing their websites as top-down, gorgeously designed experiences for Apple’s tablet and other mobile devices, in the hope that what they give away on the Web might turn into something their audience will pay for as an app.

That rethink is starting to reach into the desktop, ranging from the Huffington Post’s Glide App for Google’s Chrome Web Store to the Mac App Store version of Mashable, a popular tech blog.

You might ask why Mashable needs a Mac App Store version: Don’t users have Web browsers on their Macs? True. But the iPad has a browser, and a screen large enough to view websites comfortably. That hasn’t stopped iPad app developers.

The iPad has driven a new take on the content site — a streamlined, sexy version. One typically might see navigation on the left, related content to the right, and articles that float open in the middle — a simple, uncluttered format. Most importantly, as with the iPad’s App Store, the Mac App Store and Chrome Web Store offer a micropayment mechanism that lets people pay small amounts of money with a single click in order to subscribe to content.



Apps, furthermore, have a dedicated presence, putting themselves at the top of a user’s menu of online activities. A Web browser contains infinite possibilities — but users only install a finite number of apps, giving app publishers an advantageous position in mindshare.

There is no question that these new, “iPadified” sites look far better than their Web analogs. If anything, they look more like the mobile versions of websites. And since mobile sites are by definition focused and simplified, they are quite often better experiences. As venture capitalist Fred Wilson noted on his blog, sometimes companies should just use the mobile version of their site as their actual website.

Think about how Web design happens in the real world. Does anyone really care about your mission statement? In a groupthink-friendly marketing meeting, it gets tacked onto the homepage. And then a social-media expert recommends a Facebook plugin and sharing links for a dozen or so popular sites. Then a recommended-content widget to drive more pageviews. Sales wants more ad inventory. (Startups like BrightTag have sprung up purely to manage this mess.)

Against that tide of flashy flotsam comes the iPad. For the smaller screens of mobile devices, hard decisions have to be made, and the crap gets cut. Which raises the question: Why was it ever there in the first place?

Woe betide the publishers who don’t iPadify their content. Entrepreneurs — full disclosure, myself included — are more than ready to do the job for them. The first generation of news aggregators, like Digg and Google News, used algorithms to present headlines in bare-bones format. Now, a raft of “social newspapers” like Flipboard, Paper.li, and PostPost, a side project of mine, are plucking headlines shared by users’ friends on Facebook and Twitter and displaying them in elegant, iPad-friendly formats.

Facebook and Twitter themselves, for that matter, provide elegant, stripped-down interfaces for reading the news, or what users’ friends consider news, at any rate. As a result, an increasing amount of traffic to news sites now comes from social links.

Against this tide, iPadification appears like a sane response. The embrace of mobile design and interface metaphors provides publishers the hope that they can retrain users to consume content as an experience, starting with the homepage, rather than as a series of links that they click from an endless variety of sources.

But can it, really? Ultimately, iPadification and socialization aren’t conflicting: Well-designed newsreaders driven by social links seem poised to offer the best of both worlds. This scenario has already been played out in the music industry. An article in a newspaper is no different than a track on an album, and users have clearly decided they like to play DJ with music and editor with content. No matter how nice the labels made CDs, whether box sets or exclusive special content, users just want to download a track, not an album. From the user perspective, why should an article be any different? Why take an editor’s mix when you or your friends can put together your own?

The transition from paper to digital is likely to cull the weak, and only a few large players such as the New York Times, the Economist and the Wall Street Journal, along with some of the top fashion-magazine brands, are going to make the cut. Taking ’90s-style CD-ROM “multimedia” and putting it into an app won’t save the vast middle tier of media. For the rest, either they’ll adapt to the economics of the open Web, as upstart new-media brands like Mashable and VentureBeat have, or they’ll fade away, apps and all.

Even the New York Times, perhaps the pinnacle of editorially curated content, is embracing iPadification and socialization after initially fighting it. News.me, a collaboration between the Times and New York City-based incubator Betaworks, the backer of link-sharing service Bit.ly, was supposed to launch late last year. It’s not out yet, but when we finally get a chance to see what they’ve been up to, we’ll get a glimpse of media’s iPadified, social future.

Friday, January 21, 2011

With Schmidt Out as CEO, Google Can Stop Copying Microsoft


This post was also published in VentureBeat.


There’s much to praise in departing Google CEO Eric Schmidt’s tenure. But if the stagnation of recent years can be pinned on one fault, it’s this: Schmidt’s Microsoft obsession.

Sure, Microsoft gets lots of flak for attempting to knock off Google’s Web search with Bing. But the truth is that under Schmidt’s stewardship, Google has been obsessed with replicating Microsoft products. Windows, Internet Explorer, Office, Exchange, and .Net? Chrome OS, Chrome Browser, Google Apps, and Google App Engine.

Ironically, when Schmidt first joined Google, he told John Battelle that he was looking forward to not competing with Microsoft, after being battered by Microsoft during his stints as CTO of Sun and CEO of Novell.

Schmidt leaves Google a healthy company. It has completely dominated search as a money-making business, even though the search product is admittedly in the midst of a much-needed retooling. But now that Schmidt is moving on — Ken Auletta in the New Yorker suggests that the post of executive chairman is just a temporary gig before he leaves for good — can Google drop the so-last-century obsession with faded tech ideas like operating systems and email?



Schmidt’s Microsoft obsession has some history, which I witnessed firsthand. Schmidt spent the formative years of his career at Sun Microsystems, a company whose very soul was combating Microsoft. In 1998, a year after Schmidt left, I arrived at Sun when it acquired NetDynamics, one of the first application server companies, where I was CTO.

Back then, Java was still new, e-commerce was still starting to scale, and there was no standard Java way to add data and logic to an HTML page. We went and met with the Java folks about it, and they told us they were going to create “Java Server Pages,” or JSPs, a pretty hackneyed notation. “Why are you doing that instead of using XML?” I asked, referring to the Extensible Markup Language, a more fluid and up-to-date way of doing things that’s now a standard way of exchanging data. “Because that’s how Microsoft does it with Active Server Pages,” was the reply. I left the meeting thinking that these guys were insane. I soon learned that almost every division of Sun was trying to compete with Microsoft on Microsoft’s terms, from office productivity software to software servers to consumer operating systems.

After Sun, Schmidt was the CEO of Novell, where he launched or acquired failed products such as SUSE Desktop Linux, an attempt to compete with Windows, and the Mono project, a replication of Microsoft .Net on Linux.

Schmidt claimed he’d learned from those experiences at Sun and Novell. But is it really that surprising that, once Google started gushing cash from its advertising business, Schmidt used it to take on Microsoft yet again?

In the end, there is really no point in copying Microsoft on its own turf; as Schmidt has now found out three times, it is a lost cause. Google’s enterprise business — Google Apps and the like — has scored some wins, but is far from achieving Microsoft’s scale.

No wonder: Imagine you are a typical business running Microsoft Outlook, Office, Exchange, SharePoint, .Net, and such. Google comes in and says, “We can rip some of this stuff out, it will be very painful to migrate your data, and you have to retrain all of your users.” And Microsoft comes in and says, “We will migrate all of your servers to our cloud over the weekend, you guys can come back on Monday and everything will look exactly the same, except now it’s hosted and you can fire your IT department.” Which way do you think most businesses are going to go? It’s a nice business. It’s just not the future. Why bother competing for it?

Product companies have to be run by product people — Steve Jobs at Apple, Larry Ellison at Oracle, Bill Gates during Microsoft’s heyday, Marc Benioff at Salesforce, and Mark Zuckerberg at Facebook. These guys know how to drive their companies into new products such as iPads, social advertising, and application platforms. To the extent that Schmidt, who shared power with cofounders Larry Page and Sergey Brin, was a product person, he was about last century’s products, not this one’s.

Here’s hoping Page, who came of age when Microsoft was already fading, is looking 10 years into the future, rather than 10 years into the past. Imagine if Google had put the same amount of energy into its Orkut social network (which launched weeks before Facebook in January 2004) that it put into Google Apps. Now that Google is back in the hands of its original product visionary, we will finally see what Google is really made of.

Wednesday, January 12, 2011

Google Already Knows Its Search Sucks (And They’re Fixing It)


This post was also published in VentureBeat.


It’s a popular notion these days that Google has lost its “mojo” due to failed products like Google Wave, Google Buzz, and Google TV. But Google’s core business — Web search — has come under fire recently for being the ultimate in failed tech products.

I can only ask: What took you so long? I first blogged about Google’s increasingly terrible search results in October 2007. If you search for any topic that is monetizable, such as “iPod Connectivity” or “Futon Filling,” you will see pages and pages of search results selling products, and very few that actually answer the query. In contrast, if you search for something that isn’t monetizable, say “bridge construction,” it is like going 10 years back into a search time machine.

Search has been increasingly gamed by link and content farms year by year, and users have been frogs slowly getting boiled in water without realizing it. (Bing has similarly bad results, a testament to Microsoft’s quest to copy everything Google does.)

But here’s what these late-blooming critics miss: Yes, Google’s search results do indeed suck. But Google’s fixing it.

Yes, the much-acclaimed PageRank algorithm, which ranks search results by the number of inbound links, has failed, since it is easy for marketers to drown out the organic links showing that something is interesting with a pile of astroturfed ones. Case in point: the Google.com page that describes PageRank is #4 in Google’s search results for the term “PageRank,” below two vendors selling search engine marketing.
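
For the unfamiliar, here is a minimal sketch of the idea behind PageRank (simplified power iteration over a link graph) and of how a link farm games it. This is an illustration only, not Google's actual implementation:

```python
# Minimal PageRank sketch: simplified power iteration over a link graph.
# Illustration only, not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # Each page splits its rank evenly among its outbound links.
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A small farm of astroturfed pages pointing at "spam" pushes it above
# a hub that earned its inbound links organically:
graph = {
    "hub": ["a", "b"], "a": ["hub"], "b": ["hub"],
    "farm1": ["spam"], "farm2": ["spam"], "farm3": ["spam"], "spam": ["farm1"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```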

Facebook, which can rank content based on the number of Likes from actual people rather than the number of inbound links from various websites, can now provide more relevant hits, and in real time: since it does not have to crawl the Web, a Like is registered immediately. No wonder Facebook scares Google.

But the real basis of Google’s success was not PageRank, although it makes for a good foundation myth. The now-forgotten AltaVista, buried within Yahoo and due to be shut down, actually returned great results by employing the exact opposite of PageRank: it favored pages that were hubs, with links out to related content.

Google’s secret was that it could scale infinitely on low-cost hardware and was able to keep up with the Internet’s exponential growth, while competitors such as AltaVista were running on big, expensive machines with processors like the DEC Alpha. When the size of the Web doubled, Google could cheaply keep up on commodity PC hardware, and AltaVista was left behind. Cheap and expandable computing, not ranking Web pages, is what Google does best. Combine that with an ever-expanding data set, based on people’s clicks, and you have a virtuous circle that keeps on spinning.

The folks at Google have not been asleep at the wheel. They are well aware that their search results were being increasingly gamed by search marketers, and that this was not a battle they were going to win. The answer has been to dump the famous blue links on which Google built its business.

Over the past couple of years, Google has progressively added vertical search results above its regular results. When you search for the weather, businesses, stock quotes, popular videos, music, addresses, airplane flight status, and more, what you are looking for is presented immediately. The vast majority of users are no longer paging through pages of Google results: they are instantly getting an answer to their question.
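
As a hypothetical sketch of the dispatch idea, assuming nothing about Google's internals: classify the query first, answer directly when a known vertical matches, and fall back to the classic blue links otherwise. The patterns and vertical names below are invented for illustration:

```python
import re

# Hypothetical vertical-search dispatch: match the query against known
# verticals and answer directly, falling back to regular web results.

VERTICALS = [
    (re.compile(r"\bweather\b(?:\s+in\s+(?P<city>.+))?", re.I), "weather"),
    (re.compile(r"\bflight\s+(?P<flight>[A-Za-z]{2}\s?\d+)\b"), "flights"),
    (re.compile(r"\bstock\s+(?P<ticker>[A-Z]{1,5})\b"), "stocks"),
]

def search(query):
    for pattern, vertical in VERTICALS:
        match = pattern.search(query)
        if match:
            # A vertical matched: answer immediately from structured data.
            return {"vertical": vertical, "params": match.groupdict()}
    # No vertical matched: fall back to the classic ten blue links.
    return {"vertical": "web", "params": {"q": query}}

print(search("weather in San Francisco"))  # answered by the weather vertical
print(search("flight UA 242"))             # answered by the flights vertical
print(search("futon filling"))             # falls through to web results
```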



Google is in the unique position of learning, from billions and billions of queries, what is relevant and what can be verticalized into immediate results. Google’s search value proposition has now transitioned to immediately answering your question, with the option of sifting through additional results. And that’s through a combination of computing power and accumulated data that competitors just can’t match.

For those of us who have watched this transition closely over the past few years, it has been an amazing feat that should be commended. So while I am the first to make fun of Google’s various product failures, Google search is no longer one of them.

Friday, January 07, 2011

Amazon’s Narcissistic App Store


This post was also published in VentureBeat.


Why did Amazon launch an Android app store this week? Chalk it up to a case of Apple envy.

Thanks to the iPod, iPhone, and iTunes, Apple is now making an estimated $5 billion per year in digital content distribution. Amazon.com, caught by surprise, has been struggling to keep up in the fast-growing digital content market. It missed the boat with music. (Discount Kid Rock MP3s, anyone?) Amazon’s online-video service is caught in a netherworld between iTunes and Netflix. Even at its new lower price point, the Kindle will soon be eclipsed by the iPad and Android tablets; already, Apple forced Amazon to lower its exorbitant book distribution fees.

Despite these multiple failures, Amazon still wants to make a go of it. Everybody else has an app store, from Apple to Google to Nokia and even HP. Why shouldn’t the world’s biggest online store sell apps, too?

Give Jeff Bezos this much credit: He sees that everyone is vertically integrating from content to distribution to device, and Amazon.com needs a piece of this action before another fast-growing digital content revenue stream flows away.

The problem is that Bezos doesn’t own a mobile platform, so he has to horn in on someone else’s. Google’s Android, as an open-source system, allows third-party app stores, so it’s a natural target.

But embracing Android, while certainly easier than trying to machete into Apple’s walled garden, has its good and bad sides. There is an old saw about open source: the great thing is that anyone can do anything with it; the bad thing is that anyone can do anything with it. Generally, open-source projects have a natural center and one main branch, and it takes quite a lot of mismanagement to cause people to fork a project.

Some wireless carriers, such as Verizon, are starting to deploy their own stores. That’s understandable, as they actually sell phones and services, and already sell content to their customers — though, like past app-store attempts by carriers, these stores are unlikely to be successful.

Google’s Android Market definitely has its problems. Like Facebook, it allows any developer to submit an app, and then users can flag dangerous or fraudulent apps. However, a Facebook app hanging your browser is very different from an Android app crashing your phone. Clearly, there needs to be better application filtering and sandboxing.

But does this justify an entirely new Android app store from Amazon, which will manually approve apps the way Apple does with the iPhone? What’s cool about Android is that there is no Big Brother approving apps. Amazon will also offer app recommendations — a vital need in the app world, and one that Amazon’s vast store of customer data can help fill. But this is a category that others such as GetJar and Chomp fill without having to deploy an app store.



The one clearly important innovation Amazon is offering is dynamic pricing, which will discount apps automatically to increase sales. But Google, which is a master at dynamically pricing ads, could easily copy this feature.
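
Amazon has not published how its pricing works, so here is only a back-of-the-envelope sketch of how such a scheme might behave; the thresholds and rates are invented:

```python
# Hypothetical dynamic app pricing: nudge the price down when conversion
# is weak and up when demand is strong, within floor/ceiling bounds.
# All numbers here are invented for illustration.

def adjust_price(price, views, sales, floor=0.99, ceiling=9.99):
    conversion = sales / views if views else 0.0
    if conversion < 0.01:      # weak demand: discount to spur sales
        price *= 0.90
    elif conversion > 0.05:    # strong demand: recover margin
        price *= 1.05
    return round(min(max(price, floor), ceiling), 2)

print(adjust_price(4.99, views=10_000, sales=50))   # low conversion -> 4.49
print(adjust_price(4.99, views=10_000, sales=800))  # high conversion -> 5.24
```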

The problem here is that Google has finally gotten all of the carriers and handset manufacturers in line and is shipping relatively consistent and stable Android phones. Android is finally a viable contender to the iPhone, despite its early fragmentation.

Yet now, instead of software and hardware fragmentation, the Android platform has to contend with fragmented app stores. While the Amazon app store might be good for Amazon, it is definitely not good for the Android ecosystem.

The good news is that it is likely to fade into the ether, since it will be difficult to convince carriers and handset manufacturers to promote yet another app store — or explain to end users, who already suffer brand confusion about who makes Android phones, why they would need another app provider. But if Amazon’s innovations in pricing and recommendations spur Google to improve its Android Market, then Amazon’s store will have served a purpose. Just not the one Jeff Bezos may have had in mind.

Monday, January 03, 2011

Will 2011 be the Year Online Video Unravels the Internet?


This post was also published in VentureBeat.


The explosive growth of online video over the past couple of years has begun to unravel the way both businesses and consumers have used and paid for Internet access over the past decade. Although Internet growth has always been exponential, the coming deluge of video has already forced a series of legal, regulatory, and business disputes that have set the stage for significant changes in the next decade of video on the Internet.

A lot of this has been talked about in the lofty intellectual framework of net neutrality, which advocates present as the principle that providers of Internet access should not discriminate between types or sources of traffic on their network. But what it really comes down to is a new set of business arrangements for who will pay to get the bits from point A to point B. A host of developments in the past year have set the stage for major battles over bandwidth in 2011.

In April, the US Court of Appeals in Washington, DC overturned a 2008 Federal Communications Commission ruling forbidding Internet service providers from throttling BitTorrent, a popular video-sharing protocol. With that ban overturned, ISPs now had free rein to meter particular bandwidth-hungry protocols.

Comcast had already effectively worked around the FCC decision by capping consumer consumption at 250 gigabytes per month, a large allowance that most consumers never came close to reaching. However, with the advent of HD streaming from Hulu, Netflix, iTunes, Amazon.com, and others, watching 100 hours of HD video a month, or just over 3 hours a day, can exceed that cap, prompting many “over the top” users to transition to more expensive “business class” plans that do not cap bandwidth.
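
The arithmetic is easy to check, assuming an HD stream of roughly 5 megabits per second, a typical rate for HD video of the era:

```python
# Back-of-the-envelope math on the 250 GB cap, assuming an HD stream of
# roughly 5 megabits per second (a typical 2010-era HD bitrate).

HD_MBPS = 5                                # assumed bitrate, megabits/sec
gb_per_hour = HD_MBPS / 8 * 3600 / 1000    # -> gigabytes per hour (~2.25)
hours_to_cap = 250 / gb_per_hour           # ~111 hours of HD video

print(f"{gb_per_hour:.2f} GB per hour of HD video")
print(f"{hours_to_cap:.0f} hours of HD video hits the 250 GB cap")
print(f"that is {hours_to_cap / 30:.1f} hours per day over a month")
```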

Right before the Christmas holiday, the FCC issued a description of an upcoming set of rules regarding net neutrality for service providers. Wired broadband providers such as Comcast and Time Warner could no longer “unreasonably discriminate” against competing content providers like Hulu, but they could theoretically still discriminate against bandwidth-hogging protocols like BitTorrent.

In a King Solomon-like decision, the FCC also decided that wireless providers like AT&T and Verizon Wireless could definitely discriminate against particular protocols, and apparently even against particular providers such as Hulu, although this aspect will not be completely clear until the full rules are published early next year. Most wireless providers had already capped their “unlimited” data plans earlier this year in response to the increasing consumption of video on their networks.

This FCC decision to allow discrimination on wireless networks sent the net neutrality community into a holiday-season tizzy. Despite the uproar, it can be argued that since bandwidth is indeed limited on wireless networks, allowing providers to discriminate was a reasonable decision.

At some level of bandwidth, even net neutrality advocates cave. I had a funny exchange with a prominent net neutrality advocate earlier this year after he had posted on Facebook that he was using GoGo, the inflight Internet-access provider, on a plane. “Aren’t you glad GoGo caps your neighbors from sucking up all the bandwidth with Hulu and YouTube?” I asked. “This is different,” he responded.

The FCC thinks that wireless networks, too, are different, with inherent limitations in bandwidth. So it ruled that they can discriminate in order to maintain predictable service, although it may revisit this once fourth-generation, or 4G, wireless technologies like WiMax and LTE gain wider penetration.



While most of the fuss was about the legal landscape, the biggest video salvo came on the business front in late November, when Level 3, an Internet backbone provider that had recently entered the content delivery network business through its acquisition of Savvis’s operation, publicly complained that Comcast was attempting to extort large fees in order for Level 3 to deliver Netflix’s video streams to Comcast customers.

Level 3 claimed that Comcast was doing this in a discriminatory manner in order to promote its own competing Fancast property and its soon-to-be-acquired stake in Hulu via the Comcast acquisition of NBC Universal. Comcast in turn argued that it was simply charging for a standard “peering” arrangement in order to deliver a vastly increased amount of data from Level 3, and that Level 3 had underbid to win Netflix’s business and was trying to have Comcast, in essence, subsidize its contract.

For many readers of VentureBeat who invest in or operate Internet businesses, the practice of paying to serve content on the Internet is well known. The more successful a company is at distributing its content, the more it has to pay in hosting bills. All the main hosters used by startups and large companies alike, ranging from MediaTemple to Amazon.com to Rackspace, tier their prices based on the number of bytes sent out.

Why do content companies have to pay by the byte sent if the Internet is “free”? Because, in fact, the hosters in turn must pay to plug into various backbones, and they pay based on the amount of traffic they send upstream, offset by the amount of traffic they accept back. The word “Internet” is short for “internetworking.” What’s presented as a single network is in fact a collection of interconnected networks, each of which pays to send data to the others. This system of “the more you send, the more you pay” has been in place since the start of the commercial Internet. It’s called “peering”: If two companies are sending each other the same amount of data at an interconnection point, they are by definition “peers.” If one is sending a lot more, it has to pay more.
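
A toy illustration of that settlement logic, with an invented price per terabyte of imbalance:

```python
# Toy illustration of settlement-based peering: each side pays for the
# traffic it sends in excess of what it receives. Numbers are made up.

PRICE_PER_TB = 500  # assumed settlement rate, dollars per TB of imbalance

def settlement(sent_tb, received_tb, price=PRICE_PER_TB):
    """Return (payer, amount owed) for one interconnection point."""
    imbalance = sent_tb - received_tb
    if imbalance == 0:
        return ("settlement-free peers", 0.0)
    payer = "network A" if imbalance > 0 else "network B"
    return (payer, abs(imbalance) * price)

# Balanced traffic: true peers, nobody pays.
print(settlement(sent_tb=100, received_tb=100))
# A network pushing far more video than it accepts back pays for the excess.
print(settlement(sent_tb=900, received_tb=50))
```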

The commercial Internet has long had a “fast lane,” called a content-delivery network, where companies like Akamai and Limelight Networks have servers and data centers located in the same facilities as ISPs, or close by. They pay the ISPs for a high level of peering connections and copy content to those colocated facilities. So when you are reading a New York Times article or watching a YouTube video, chances are it is being served from an Akamai server very close to you, and the content provider is not paying to send the content from its own servers through its backbone links. This is cheaper for a media distributor, since the CDN only has to pay for the uplink into your ISP or to colocate servers at your ISP, and it is faster as well, since the content is not hopping across a bunch of networks.

The most interesting aspect of the Level 3 and Comcast fracas is that it is an attempt to use regulatory and fair-trade claims to change how the commercial Internet has worked to date. It is unlikely that the FCC or Federal Trade Commission will force ISPs to deliver an unlimited amount of data for free. What if Netflix were plugging directly into Comcast’s network? Wouldn’t it be expected to pay a connectivity fee, as it currently pays its backbone and CDN providers? Peering arrangements are the fabric of how the commercial Internet has operated, and it is unlikely that regulators will attempt to change this.

Netflix already accounts for an estimated 20 percent of primetime Internet traffic, a figure that is constantly growing. The company can easily switch to Akamai or Limelight, which already have the infrastructure in place with ISPs like Comcast to deliver video streams. However, it is likely that regulators will force ISPs like Comcast, which also provide their own competing video streams and services, to offer CDNs reasonable and nondiscriminatory pricing for peering connections.

The FCC mandate requiring wireline ISPs not to discriminate will be pushed to the limits by multichannel video providers like Comcast and AT&T U-Verse, since it is a fundamental aspect of their operation to discriminate. A multichannel wire coming into your home might provide digital TV, digital telephone, and Internet data. But it’s not three different connections: It’s just one, with Comcast, AT&T, Time Warner, Verizon and the rest operating their networks to deliver a certain amount of bandwidth to each service.

As it stands right now, these providers secure video content licenses from firms like CBS and HBO and deliver the content with dedicated bandwidth to proprietary cable boxes. It really is no different from Netflix licensing the same content, paying for connectivity with a CDN, and streaming it to a Roku box. But although both signals are coming across the same cable, for now everyone is committed to keeping them separate from a regulatory perspective.

As more and more video viewing shifts from cable television to Internet video, multichannel providers like Comcast and AT&T are very likely to want to shift dedicated bandwidth from the digital television portion of their networks to the Internet portion, and allow viewing of their licensed video content on Web browsers and Roku boxes in addition to proprietary cable boxes. Such a shift will likely cause a huge battle with net neutrality advocates and streaming competitors such as Netflix. It is not clear how the FCC and FTC will rule on this issue; much will depend on the upcoming political climate.

One option would be to force providers to lease dedicated bandwidth to video-streaming competitors at reasonable and nondiscriminatory pricing, much as phone companies, in the early days of broadband, were forced to lease their lines to competitive DSL providers. This would enable legacy providers such as Comcast and new entrants like Netflix to play on a relatively even field.

But as the heated discussion around net neutrality shows, there’s no guaranteed bandwidth for reason in this discussion.