This is called GPU virtualization, and there have been experiments with using it for WebGL ("Sugar" [1]). We looked into it and consider it promising, but not urgent. It addresses some of the security concerns at the cost of performance (and implementation complexity), but the portability issues are unchanged.
[1] http://newport.eecs.uci.edu/~amowli/hpcfactory/publication/a...