What I don't understand is all the people praising HTML5 because it uses less CPU than Flash, yet Chrome ends up using 15-20% CPU just to render a simple animation that Flash handles with far less.
I believe this example is a SWF running in Canvas (as opposed to vanilla HTML5).
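For context, "running in Canvas" means a hand-rolled JavaScript loop repainting a 2D canvas every frame. A minimal sketch (the names and the bouncing-square animation are illustrative, not taken from the demo being discussed):

```javascript
// Pure animation state: x-position of a square bouncing back and forth.
// Keeping this separate from the drawing code makes the per-frame work
// easy to reason about (and to test).
function positionAt(tMs, widthPx) {
  const period = 2000;                   // one full left-right-left cycle, in ms
  const phase = (tMs % period) / period; // 0..1
  const x = phase < 0.5 ? phase * 2 : (1 - phase) * 2; // triangle wave 0..1..0
  return Math.round(x * widthPx);
}

// Browser-only driver. Unlike a Flash movie's fixed frame rate,
// requestAnimationFrame lets the browser pick the repaint rate and
// throttle it when the tab is hidden -- one place HTML5 can save CPU.
function animate(canvas) {
  const ctx = canvas.getContext('2d');
  function frame(tMs) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.fillRect(positionAt(tMs, canvas.width - 20), 40, 20, 20);
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```

A SWF-to-Canvas shim does essentially this for every display object in the movie, each frame, in JavaScript, which is part of why such demos can burn more CPU than the native plugin.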
Regardless, with HTML5 you get all the benefits of an OSS rendering engine, where performance can be improved by individual implementors (e.g., Apple improving WebKit on A4/ARM) or by groups of developers with shared interests (e.g., Google, Apple, and other WebKit contributors on x86). Whereas with Adobe's Flash plugin, you're pretty much reduced to hoping and praying that Adobe will fix the problems relevant to your interests.
In other words, the performance may suck now in some demos, but the situation is better than relying on Adobe... for some companies anyway.
Some people probably just assume that switching to HTML5 content always uses fewer CPU resources. There are plenty of informed reasons to prefer HTML content over Flash, though. One is that putting performance in the hands of multiple companies introduces competition. Just as with JavaScript speeds, you can expect browser makers to start racing in other areas of HTML5 performance.
I wouldn't be surprised to see HTML5 outperform equivalent Flash content in a year or so.
The amount of work required to do so is epic, and considering the rate of Flash's evolution it's like getting on a treadmill whose speed is controlled by Adobe.
Funny metaphor, and very accurate. However, HTML5 is that same treadmill, except people still have to build the treadmill itself. It's epic work times two.
That standard covers Flash as it existed several years ago, AFAIK. It's not useful for creating a viable competitor to modern Flash, except maybe as a jumping-off point for the actual hard work.