We're... misattributing some things here, I think.
Flash was a bundle of good things and bad things, as we all more or less agree. Good for creativity, bad for usability and accessibility and security, etc.
You could, obviously, do so many things in Flash that were impossible with web standards.
But, why was there such a gap between what you could do in Flash and what you could do with web standards? Why was Flash even necessary?
This was due in large part to how Microsoft acquired a stranglehold on the web right as it was beginning to really take off -- IE4, IE5, and IE6 were the de facto standards, and Microsoft used them to quash innovation on the web as they (successfully) tried to hold on to their desktop software business for another decade or so.
Perhaps we haven't lived through the worst possible timeline, but there was certainly a better outcome possible - where Microsoft didn't stifle innovation on the web for close to a decade.
(And now, of course, we're approaching another dark age as Google heads toward IE6-like market share....)
> Microsoft used them to quash innovation on the web
Microsoft didn't "quash" any innovation; the ActiveX framework allowed more innovation than anything else on the browser market.
Flash shipped as an ActiveX plugin, and so did AJAX, which kicked off the whole Web 2.0 era. IE had so many incredible features that it makes modern browsers look like dull corporate utilities. We had VRML and vector graphics. We could play back video with IMG tags. Heck, we could even embed the mighty DirectX on the web. Chrome basically re-invented everything we already had with DHTML in the '90s.
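To make that concrete: before there was a standard, "AJAX" in IE5/IE6 meant instantiating an ActiveX object. A minimal sketch of era-style feature detection (the createXHR helper name and the URL are just for illustration):

    // Rough sketch only: how era-appropriate code obtained an XHR object.
    // In IE5/IE6 it existed solely as an ActiveX object; the standard global came later.
    declare const ActiveXObject: new (progId: string) => any; // IE-only ActiveX constructor

    function createXHR(): XMLHttpRequest {
      if (typeof XMLHttpRequest !== "undefined") {
        return new XMLHttpRequest();                 // standardized path (IE7+ and other browsers)
      }
      return new ActiveXObject("Microsoft.XMLHTTP"); // the original IE5 "AJAX" object
    }

    const xhr = createXHR();
    xhr.open("GET", "/example.json", true);          // illustrative URL
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4 && xhr.status === 200) {
        console.log(xhr.responseText);
      }
    };
    xhr.send();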
Microsoft didn't take down ad blockers by crippling the API. In fact, IE had the opposite problem: all that extensibility got abused; think of all those toolbars!
Flash had a VM that truly was "write once, run on anything" and a great (software) pixel and vector renderer. Cross-browser compatibility with HTML standards was completely bonkers, and HTML had none of the features Flash games and animations needed. Try moving or rotating a picture at 30 fps with old HTML.
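For contrast: in Flash this was a one-liner (bump _rotation in an onEnterFrame handler), and in today's DOM it's a few lines of CSS transforms plus requestAnimationFrame; neither existed in the IE-era DOM. A minimal sketch of the modern version, assuming the page has an <img>:

    // Minimal sketch: spin an image smoothly, something the old DOM simply couldn't do.
    // Assumes an <img> element exists on the page.
    const img = document.querySelector("img") as HTMLImageElement;
    let angle = 0;

    function spin(): void {
      angle = (angle + 3) % 360;                  // ~3 degrees per frame
      img.style.transform = `rotate(${angle}deg)`;
      requestAnimationFrame(spin);                // frame-synced, typically 60 fps
    }

    requestAnimationFrame(spin);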
I remember Flash being slow and buggy on Mac and Linux at the time, and updates lagged far behind the Windows version (on Linux in particular). And then I saw an article from a Macromedia/Adobe dev saying that it was heavily optimized x86 assembly code targeted towards Windows and very difficult to port to other platforms... given the IE6 monoculture at the time, it made sense, but I felt like a redheaded stepchild for daring to use a minority OS.
I don't think it was 64-bit clean, either, so now no modern device runs Flash "natively."