Google I/O 2010, Keynote 1 Impressions

The first Keynote of Google I/O 2010 had a few highlights:

  1. HTML5 my Lord - Kumbaya

  2. Native Client Underplayed

  3. The announcement of the Chrome Web Store

  4. Adobe focusing on HTML5 tooling

  5. The Open Sourcing and permissive use licensing of the VP8 codec

  6. The partnership of VMware/SpringSource and Google to allow developers to create portable Cloud Apps

HTML5 my Lord - Kumbaya

Rightly so, Google is throwing its considerable weight behind HTML5 as a way to expose greater client-side functionality to web developers whilst still remaining in a secure environment.

Development team after development team were rolled out onto stage to wax lyrical about how great HTML5 is and how it allows them to create richer web experiences.

Very true, the demos all highlighted functionality that was not possible in HTML4, yet something was off. HTML5 allows the server to send the client JavaScript code to run locally, and also allows for that code and its resources to be cached locally so that an application can be run offline (the server can even create a client-side database cache). All this is great and a definite step forward.
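The offline behaviour shown in the demos is driven by HTML5's cache manifest; a minimal sketch (file names here are purely illustrative) looks like this:

```
CACHE MANIFEST
# v1 -- change this comment to force clients to refresh their cache

CACHE:
index.html
app.js
style.css

NETWORK:
# resources that always require a connection
/api/
```

The page opts in with <html manifest="app.manifest">, after which the listed resources are served from the local cache even when the user is offline.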

BUT, HTML5 still sits on top of the DOM and JavaScript, and that makes it slow. I assume that the PC used in the demos was capable, yet you would have to go back ten years to re-create in a desktop application the same sort of sluggishness that was demonstrated in these HTML5 applications.

For the majority of apps, it simply will not make a difference. If you are dealing with text and forms with the occasional diagram on your page, HTML5 will solve all your needs, as-is.

But, I think that it is a fundamental mistake to think that web-apps can compete with desktop apps with the current iteration of HTML5.

Desktop apps need real bindings to the environment on which they sit. They require access to the USB ports, the microphone, the webcam, the display. Desktop apps need to run at native speeds and not through a layer of JavaScript that must be constantly re-evaluated at runtime for optimisation, because the language was designed for maximum flexibility, not speed.

Desktop apps need to be able to control screen refresh and to synchronize on vertical refresh. They need to be able to run full-screen if required. They need to be power-efficient and to perform their tasks in the smallest amount of time.

Some HTML5 apps will not be subject to performance considerations, nor to considerations of file-system or hardware access. For these apps, HTML5 represents a huge leap forward, but if Google's intent is to move all apps onto the cloud as a combination of server and client-side scripting, then HTML5 needs to be updated to allow for native execution speed and better bindings to host-device capabilities.

Native Client Underplayed

As if to acknowledge that JavaScript will not meet the performance expectations of web users accustomed to desktop applications running at native speed, the keynote mentioned the 'Native Client' project only in passing, downplaying it for now as it undermined the main theme of HTML5 being the Alpha and the Omega.

The Native Client project allows web authors (via a browser plugin) to write C++ code that runs natively within some web applications. The source code is compiled and then verified to be safe (ensuring the code cannot overwrite or jump to code outside its designated memory area) before being executed by the browser in a sandboxed environment. The code is said to run at 97% of native speed (the small overhead likely due to the restricted addressing modes permitted within the sandbox).

Whilst this is currently not part of the HTML5 specification, I hope that it is added sometime in the next year, as native code execution is the missing link in HTML5. Of course, native code execution means nothing unless it is portable, so I would not support this development unless there were a virtual machine option to ensure portability whilst allowing an order of magnitude greater performance than JavaScript. I understand that Google is investigating using LLVM for this purpose.

The announcement of the Chrome Web Store

The Chrome Web Store was rolled out and several apps were highlighted. This was the biggest mis-step in the presentation, for none of the apps seemed to equal a desktop app. The best web 'app' demoed was 'Plants vs. Zombies', which made me sit up and take notice for a few seconds: it seemed to be running at a good speed and with working in-game sound (something I have not seen done with any degree of success in HTML5 so far). Then it was revealed that this web 'app' was actually running in Flash. So much for the theme of the keynote (HTML5 is great). We all know that Flash is supposed to be evil and bad, but it still provides functionality that HTML5 can only dream of, for now.

Someone came out and showed an offline demo of a web-based Photoshop-alike application. The performance looked fine for the simple use-cases he detailed, but I couldn't help but wonder how many users would pay $5 for what is essentially a web-page experience.

Adobe focusing on HTML5 tooling

Adobe's CTO, Kevin Lynch, came on stage to 'focus on HTML5'. He showed off some really nice tools for developing HTML5 web pages, with tooling based around the new CSS elements. It worked nicely, but it was a simple use-case. It's obvious that Flash remains a necessary evil until HTML5 pulls its act together, and it's nice to see Adobe transitioning to the open standard whilst still supporting its closed one.

The Open Sourcing and permissive use licensing of the VP8 codec

As predicted for some time, Google took this opportunity to announce the open-sourcing of the VP8 codec. This is a very efficient codec, in the same league as H264, without the patent issues (maybe).

Firefox has long been in a predicament with regards to how to implement the open HTML5 standard, which allows websites to embed videos encoded with proprietary codecs within <video /> tags. The browser must be responsible for rendering the video, but a truly open browser cannot afford (and could never justify) a license for a proprietary codec. Firefox publicly flailed and supported Ogg Theora as the best open-source alternative, although it was obvious that H264 was a much better codec. Google itself had a conundrum: how could it move its videos to HTML5 when different browsers support different codecs? Would it have to re-encode all the YouTube videos for every combination of browser-supported codec?

It looks like in the end there will be two codecs for YouTube: H264 and VP8. H264 has to stick around because current-gen mobile devices have hardware acceleration for H264, whereas VP8 has to be decoded in software (for now), leading to potentially sluggish performance and/or poor battery life. This should change in the next few years as hardware support for VP8 is added to most SoC solutions.
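Serving both codecs from a single page is handled by the <video> element's source fallback; a minimal sketch (file names are illustrative):

```html
<video controls width="640" height="360">
  <!-- the browser plays the first source it can decode -->
  <source src="clip.webm" type='video/webm; codecs="vp8, vorbis"'>
  <source src="clip.mp4" type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
  Your browser does not support HTML5 video.
</video>
```

The nice property is that only two encodes per video are needed, not one per browser: each browser simply skips the sources it cannot decode.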

Personally speaking, I hope my Atom 330 ION-based net-top has enough CPU/GPU power to software-decode VP8 720p streams (1080p is probably out of the question).

It also remains to be seen how the MPEG LA group responds to Google attacking their cash-cow. It's obvious that there are enough devices out in the wild and enough support for H264 (Blu-ray, mobile devices, etc.) that H264's continued survival is ensured. They may respond with litigation, by opening their own codec, or perhaps by simply mining the existing licensees whilst developing a follow-up 3D-optimised codec. Who knows?

The partnership of VMware/SpringSource and Google to allow developers to create portable Cloud Apps

This part was interesting to me. Google App Engine is cool, GWT is cool, but building on Google App Engine always felt like vendor lock-in to me. I am glad to see that there will be competition in the cloud for hosting. Google could have tried to tie app developers into its hosting, but it seems that an app developed using App Engine will be deployable to a number of different targets.

The Roo demo was nice, but I'd seen it before on YouTube, and I really hate remembering command-line commands, so I was a little disappointed that the presentation never detailed the GUI tools.

I like the idea of GWT 2.1 supporting multiple profiles and it being able to detect devices and adjust the look and feel accordingly. I'll be using 2.1 myself in the near future I'm sure.


Android takes the lead

The smartphone OS war continues to rage on, with Android making serious inroads into Apple's share of the pie. According to statistics supplied by AdMob, in December 2009 iPhone OS enjoyed an 11% lead over Android OS; just three months later, Android enjoys a 7% lead over iPhone OS.

These statistics only factor in web traffic to mobile ads served by the AdMob network, currently the largest provider of mobile adverts and monetization. They are unlikely to represent relative sales figures, as Android phone users may spend less time in apps than iPhone users, but they do demonstrate the trend toward Android and away from iPhone OS.

We can expect the current trend to continue as the HTC Desire and HTC Legend hit the market soon.

iPhone demand may be temporarily dampened by the new iPhone model expected to be announced in June, as customers wait to see what the new phone offers or hold out for a price drop on the current models. This yearly release cycle is sub-optimal: smartphone customers in the market for a new phone often desire the latest and greatest, and there is a shiny new phone every month for the tech-lovers, each shinier and fresher than the Apple phone from six months ago.

Android OS looks set to gain from Apple's lethargy by the continued monthly onslaught of new Android handsets with new form factors.

Unless Apple chooses to license iPhone OS to other handset manufacturers, or simply announces quarterly iPhone hardware releases with new form-factors, there seems to be little it can do to halt its diminishing market share.

That said, the market itself is growing, and given the current deltas and the upcoming Windows Mobile 7 release, iPhone OS is likely to settle at around 10-15% of the market, approximately the same place that Macs hold in the PC market. Android devices benefit from scaling up and down to cover all segments of the market (value to premium customers) whilst Apple is only targeting premium customers at the moment.

The report also details that both iPhone OS and Android have yet to seriously penetrate developing nations, where Nokia and Symbian are overwhelmingly the most popular choices. Developing nations do not usually offer subsidized phones, so handset cost is the key factor in purchasing decisions. Additionally, the network infrastructure has not been developed to handle the data requirements of modern smartphone OSs, meaning that smartphone use in these countries is usually limited to accessing non-connected apps on the move, plus 2G SMS and voice calls. When at home or at a wireless access point, apps can be downloaded and the web can be browsed.

Given this usage model, Android looks to be in a favorable position to conquer these markets as value models arrive with 1st-gen hardware at around the $70-100 level in about 18 months' time. Given another 18 months, a $50 handset is perfectly feasible without subsidy.


Digital download markets (PC/Mac/Linux)


Over the past few years, there has been a steady rise in digital sales of applications and games at the expense of retail stores. Steam (the PC game download service) and iTunes (the digital music provider) both launched in 2003, and since then there has been an explosion in both the number of stores and the number of media items purchased on them.

I should state up front that although I am in favour of digital distribution under the right conditions, currently none of the PC (or console) digital download services operate under conditions that favour the consumer.

For the remainder of the article, I will use the word media. By this I mean specifically audio (such as a music track), video (such as a TV episode or a movie at a given quality) or a game (such as a collection of files that constitute a game on a given platform). I will also concentrate on the non-console market for now, but may write a separate article on the state of the console markets.

The Trust We Give

There are three areas of trust that are implicit when buying media from an online digital download market:

1 - Trust that the provider can provide a good, functional service as promised (can provide the goods on demand).

2 - Trust that we now actually own the media (trust that the provider understands that it holds your media by proxy and that you are the true owner of the media).

3 - Trust (or faith) that the provider can continue to provide the service until your death (stay in business).

If you buy a piece of content from a digital market, by all rights you should own the equivalent of a physical copy. A physical copy is the ultimate proof of ownership: you have it in your hand, and beyond a short physical-defect guarantee period, your relationship with the retailer ends. A digital purchase is like buying a book and then immediately loaning it back to the shop you bought it from. You already paid for the book, the book is yours, but it is a conceptual ownership when you do not have the item at hand and, in fact, never actually see the book. Trust is everything.

Publishers automatically get money from a market when you buy; at the point of purchase, their transaction is complete. However, the transaction with the consumer is ongoing. We need assurances that our content will not be downgraded and that, if the company disappears, we will still have access to the goods we purchased. Currently no service offers any such assurances (as they cannot).

Too big to fail

Realistically, it *probably* won't happen with the bigger players such as Steam and iTunes, as the market seems to be behind them right now, but we can never completely know that. They are too big to fail, right?

As the number of purchases you make on a service increases, so does your risk. Buy 5 games a year at $60 each for 10 years and that's $3000 of investment you have made in one service. Admittedly, the headline price of games quickly erodes and the cost of replacement may be 10 times less than your original purchase price, but the price of other media such as music and video can sometimes hold its full value, especially if you are the type of consumer that waits for deals.

If someone broke into your home and took $3000 whilst simultaneously taking $3000 from millions of other consumers, that would be the crime of the century, yet we are creating a situation where exactly this could happen.

The fear of buying from a smaller marketplace disproportionately hurts the smaller marketplaces that may actually have better deals or services. If I could buy a game from Steam for $12 versus Direct2Drive for $10, I would probably go for Steam, as it seems more stable and larger. This is not good consumer behaviour: both items would be identical, I just do not trust the second market. With no consumer protections, free-market consumers end up rewarding the largest market, not the most competitive.

Direct2Drive has some good deals now, but are we guaranteed they will be there in the future? Ubisoft, EA and dozens of other companies offer digital downloads (often laced with DRM that assumes the existence of their servers, which in turn depend on the existence of their company). If these companies cease to exist, then the content disappears. PC owners therefore flock to Steam; in doing so, all the eggs end up in one basket and Steam becomes too big to fail.

Another area of worry in these digital markets is the complete lack of cross-compatibility between the markets themselves. The markets are not simply a means of distribution: they tie installation and synchronization functions into their own clients.

That is, if I want a game from Steam, I need Steam installed. If I want music or video from Apple, I need iTunes installed. I can't compare oranges with oranges across these markets, as there is no common interface for accessing all the markets concurrently, allowing small monopolies to emerge around the 'safe' players. What is needed is an open-market standard and open clients that implement the standard across platforms. Of course, I don't mean to say that I would expect to be able to play an iPhone game or Android game on my PC, but I should be able to have a client on my PC where I can browse all content across all platforms and even make a purchase without resorting to proprietary clients.
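To make the idea concrete, such a standard would let one client query every market through the same interface. A minimal sketch in Python (all store names, fields and prices here are invented for illustration, not part of any real service):

```python
from abc import ABC, abstractmethod


class Market(ABC):
    """Hypothetical common storefront interface any market could implement."""

    @abstractmethod
    def search(self, query):
        """Return matching items as {title, platform, price_usd} dicts."""


class StoreA(Market):
    """Stand-in for a large, 'safe' market."""
    def search(self, query):
        return [{"title": "Space Game", "platform": "PC", "price_usd": 12.0}]


class StoreB(Market):
    """Stand-in for a smaller market with a better deal on the same item."""
    def search(self, query):
        return [{"title": "Space Game", "platform": "PC", "price_usd": 10.0}]


def best_offer(markets, query):
    """One client, many markets: compare identical items on price alone."""
    offers = [item for m in markets for item in m.search(query)]
    return min(offers, key=lambda item: item["price_usd"])
```

With such an interface, the $10 offer wins regardless of which store happens to be larger, which is exactly the consumer behaviour the current siloed clients prevent.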

Digital Media Rights Repository

When you buy a piece of digital media from a store, you are buying a license to use that media, either on an indefinite basis or as a rental for a fixed amount of time. If you bought an indefinite license, then that license should last until the death of the licensee. Copyright remains with the copyright holder, but the licensee should have rights.

This situation cannot continue indefinitely, something's gotta give, and legislation often starts with a headline failure.

So, how can we ensure that the life of your download media extends beyond the life of the provider that you bought it from?

Well, each piece of content needs a unique content id, each user needs a unique id, and each license type needs a unique id; the licenses that users hold to content should be recorded in a central digital copyright repository. Each digital distribution service registers purchases with the central repository at the point of sale and pays the repository a nominal 'insurance' fee per purchase (1 or 2 cents, aggregated and paid weekly).

From the moment that you buy the content from your provider, it is insured against provider failure. That is, if Steam goes bankrupt, you can file a claim and activate another provider for the same content free of charge. The replacement market does not need to pay the publisher for the new user that is downloading their already-purchased media, and the administrative costs are covered by the nominal per-purchase fee, so no-one is out of pocket and customers are free to seek the best deals with protection.
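The proposed scheme can be sketched in a few lines of Python. This is a toy model of the idea, not a real system: the class name, fee amount and id format are all illustrative assumptions.

```python
import uuid


class RightsRepository:
    """Toy sketch of the proposed central license registry."""

    INSURANCE_FEE_CENTS = 2  # nominal per-purchase fee paid by the market

    def __init__(self):
        self._licenses = {}   # license_id -> (user_id, content_id, license_type)
        self._fee_pool_cents = 0  # aggregated insurance pool

    def register_purchase(self, user_id, content_id, license_type="indefinite"):
        """Called by a market at the point of sale; returns the license id."""
        license_id = str(uuid.uuid4())
        self._licenses[license_id] = (user_id, content_id, license_type)
        self._fee_pool_cents += self.INSURANCE_FEE_CENTS
        return license_id

    def claim(self, user_id, content_id):
        """On provider failure, confirm ownership so a replacement provider
        can serve the same content free of charge."""
        return any(u == user_id and c == content_id
                   for (u, c, _t) in self._licenses.values())
```

The key design point is that the license record outlives any single market: the repository only needs ids, never the media files themselves, so its running costs stay small enough to be covered by the per-purchase fee.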