GPU-accelerated path rendering is one of the most polarizing technologies out there. Things like Bezier curves are defined mathematically, so you can represent the way a tiger looks with a single, albeit large, mathematical description! But because they rely on math that computers aren't fast at performing, they are extremely difficult to rasterize in real time.
NVIDIA's technology to do this has been around for years, but it hasn't 'taken off'; I've yet to see anyone use it (maybe Adobe Illustrator uses it?). Khronos (the group behind OpenGL) even has a standard for accelerated vector graphics, called OpenVG, which has also been around for years, yet only the Raspberry Pi and a few other embedded ARM devices seem to support it.
My conclusion? Path rendering is insanely useful, but it's very complex for application developers ("this is slow and hard") and artists ("I hate these weird grabby handles!") alike. That complexity comes at a cost, and time after time we've seen complexity lose out to simplicity, even when the simpler option (bitmap images) is technically inferior.
I would like to see quadratic Bezier curves (i.e. a Bezier curve with just one control point) become popular, both for their artistic simplicity and for how naturally they map to graphics hardware (a triangle with a simple GLSL fragment shader can render a quadratic Bezier curve).
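To make that concrete, here's a minimal sketch of such a fragment shader in the spirit of Loop/Blinn; the attribute name, GLSL version, and solid fill color are placeholders I picked, not anything from a real implementation:

    // Each curve triangle carries "curve space" coordinates of (0,0), (0.5,0), (1,1)
    // at its three vertices; the interpolated implicit function u^2 - v is negative
    // on the inside of the quadratic Bezier, so we keep only those fragments.
    #version 330 core
    in vec2 uv;            // interpolated curve-space coordinates (assumed name)
    out vec4 fragColor;

    void main() {
        float f = uv.x * uv.x - uv.y;          // implicit form of the quadratic
        if (f > 0.0) discard;                  // outside the curve: drop fragment
        fragColor = vec4(0.0, 0.0, 0.0, 1.0);  // inside: solid fill (placeholder)
    }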
Perhaps quadratic Bezier curves aren't as mathematically pure (they can't represent circles perfectly) -- but they are artistically beautiful and, in my mind, just as useful as pixels in real-time applications. http://imgur.com/a/i2RtE
It's funny to see this pop up now. I'm writing a demo (one shader over two triangles) that does 2D distance field rendering, and I'm using quadratic Bezier curves for a large part of it. Rather than an optimized closed form, I just iterate over the curve and do a point-to-segment distance calculation at each step; it's not the most efficient way to do it, but it's plenty fast on modern GPUs.
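Roughly, each pixel ends up doing something like this (the names and the step count are placeholders; my actual code differs in the details):

    // Unsigned distance from point p to the segment a-b.
    float distToSegment(vec2 p, vec2 a, vec2 b) {
        vec2 ab = b - a;
        float t = clamp(dot(p - a, ab) / dot(ab, ab), 0.0, 1.0);
        return length(p - (a + t * ab));
    }

    // Distance to a quadratic Bezier (p0, p1, p2): step along the curve and keep
    // the nearest of the short segments between consecutive samples.
    float distToQuadratic(vec2 p, vec2 p0, vec2 p1, vec2 p2) {
        const int STEPS = 32;                 // arbitrary; raise for more accuracy
        float d = 1e20;
        vec2 prev = p0;
        for (int i = 1; i <= STEPS; ++i) {
            float t = float(i) / float(STEPS);
            vec2 cur = mix(mix(p0, p1, t), mix(p1, p2, t), t);  // de Casteljau
            d = min(d, distToSegment(p, prev, cur));
            prev = cur;
        }
        return d;
    }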
I'm excited to see where accelerated path rendering goes in the future. In conjunction with distance field rendering, it's absolutely fantastic.
Cubics vs. quadratics is not as big an issue as you think. Cubics flatten down to quadratics just fine (side note: Flash was all quadratics). The hard part is that vector graphics just doesn't map well to GPUs; way too much CPU work is involved. Another really big problem with vectors is that they are relatively unbounded: with raster you scale pretty linearly in the number of pixels, whereas vectors jump from very easy to very hard all the time. And in graphics you need to care about the worst case.
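By "flatten" I mean roughly the usual trick: subdivide the cubic with de Casteljau, then approximate each piece with a quadratic that shares its endpoints and matches it at the midpoint. A sketch in the same GLSL-style vec2 math as the snippets above (names made up; in practice this runs on the CPU while building geometry):

    // Left half of a de Casteljau split of the cubic (c0..c3) at t = 0.5;
    // the right half is symmetric, and both halves are themselves cubics.
    void splitCubicLeft(vec2 c0, vec2 c1, vec2 c2, vec2 c3,
                        out vec2 l0, out vec2 l1, out vec2 l2, out vec2 l3) {
        vec2 m01  = mix(c0, c1, 0.5);
        vec2 m12  = mix(c1, c2, 0.5);
        vec2 m23  = mix(c2, c3, 0.5);
        vec2 m012 = mix(m01, m12, 0.5);
        vec2 m123 = mix(m12, m23, 0.5);
        l0 = c0; l1 = m01; l2 = m012; l3 = mix(m012, m123, 0.5);
    }

    // Control point of the quadratic that shares endpoints with the cubic and
    // agrees with it exactly at t = 0.5; accurate once the cubic has been
    // subdivided into pieces that are nearly flat.
    vec2 quadControlForCubic(vec2 c0, vec2 c1, vec2 c2, vec2 c3) {
        return (3.0 * (c1 + c2) - c0 - c3) * 0.25;
    }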
Cubics cannot translate perfectly into quadratics, no. You can approximate a cubic with a set of quadratics that looks right, but they're not identical at arbitrary resolution.
Quadratic Bezier curves, unlike cubic ones, do actually map really well onto GPUs (given how naturally they sit on triangles), quite unlike what you've said here.
Sorry for not being clear. Of course you are right, they are not the same. What I tried to say was that, in the context of rendering, approximating cubics with quadratics will not be the bottleneck.
100k of WHAT? :) It is easy for 100k solid circles lined up in a grid.
Impossible for 100k mitered lines overlapping the full screen at 1% opacity.
In addition, I would guess WebGL gets around 10% of native performance for this, because it is missing a lot of useful API for that kind of work and has significant JavaScript overhead. Rendering vectors needs a lot of CPU work, and not having floats (only doubles) hurts in use cases like bulk geometry.
If you can get 100k+ Bezier curves to render at 30-60fps from a VBO, e.g. with Loop/Blinn, let's chat :) We're fine on 1mm+ simple primitives; this is more interesting :)