> The optimal solution would be using a template engine to generate static documents.
This helps the creator, but not the consumer, right? That is, if I visit 100 of your static documents created with a template engine, then I'll still be downloading some identical content 100 times.
XSLT solved this problem. But it had poor tool support (Dreamweaver etc.) and ran into a lot of anti-XML sentiment, which I assume was blowback from capital-E Enterprise stacks going insane with XML for everything.
XSLT did everything HTML includes could do and more. The user agent could cache stylesheets or, if it wanted, override a linked stylesheet (as with CSS) and transform the raw data any way it liked.
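For illustration, a minimal sketch of that pattern (file names and content are made up): each article ships only its own data plus a link to one shared stylesheet, which the browser fetches once and can reuse from cache on every other page.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="site.xsl"?>
<!-- article.xml: each page ships only its unique data plus the link
     above to the shared, cacheable stylesheet. -->
<article>
  <title>Article 42</title>
  <body>Unique article content.</body>
</article>
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- site.xsl: one shared stylesheet wraps every article in the common
     header and footer; the user agent caches it across pages. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/article">
    <html>
      <head><title><xsl:value-of select="title"/></title></head>
      <body>
        <header>Shared site header</header>
        <main>
          <h1><xsl:value-of select="title"/></h1>
          <p><xsl:value-of select="body"/></p>
        </main>
        <footer>Shared site footer</footer>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```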
> I'll still be downloading some identical content 100 times.
That doesn't seem like a significant problem at all, on the consumer side.
What is this identical content across 100 different pages? Page header, footer, sidebar? The text content of those should be small relative to the unique page content, so who cares?
Usually most of the weight is images, scripts and CSS, and those don't need to be duplicated.
If the common text content is large for some reason, put the small dynamic part in an iframe, or swap it out with JavaScript.
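A minimal sketch of the JavaScript variant, assuming the shared header lives at a hypothetical /partials/header.html served with suitable Cache-Control headers:

```html
<!-- Each page ships only a placeholder; the shared fragment is fetched
     once and comes out of the browser's HTTP cache on later pages. -->
<div id="site-header"></div>
<script>
  fetch("/partials/header.html")
    .then((res) => res.text())
    .then((html) => {
      document.getElementById("site-header").innerHTML = html;
    });
</script>
```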
If anyone has a genuine example of a site where redundant HTML content across multiple pages caused significant bloat, I'd be interested to hear about it.
I care! It is unnecessary complexity, and frankly ugly. If you can avoid repetition, then you should, even if the reason is not obvious.
To give you a concrete example, consider caching (or, equivalently, compiling) web pages. Maybe you have 100 articles that share a common header and footer. If you make a change to the header, then all 100 articles have to be invalidated and rebuilt. Why? Because somebody did not remove the duplication when they had the chance :-)
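One way to remove that duplication at the serving layer is a server-side include, so the header lives in a single fragment and editing it never touches the 100 cached articles. A minimal sketch using standard SSI syntax (the /partials/ paths are made up, and SSI has to be enabled on the server, e.g. nginx's ssi module or Apache's mod_include):

```html
<!-- article-42.html: only the article body is unique; the shared header
     and footer are pulled in at serve time, so changing one fragment
     does not invalidate or rebuild the individual articles. -->
<!--#include virtual="/partials/header.html" -->
<main>
  <h1>Article 42</h1>
  <p>Unique article content.</p>
</main>
<!--#include virtual="/partials/footer.html" -->
```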