Models are trained, for the most part, on content scraped from the net. Content about those old specs is nearly nonexistent and has no SEO value. Ergo, models will mostly have only a cursory knowledge of a spec your browser will never be able to parse anyway, because it isn't the spec that won.
You could also just learn it yourself with the knowledge of 1996.
SELFHTML makes it pretty easy to limit the scope of the authoring language to a given HTML version and target browser. Your LLM should have no problem with German.
There were specs competing for adoption, but only tables (the old way) and CSS were actually adopted by browsers. So no point trying to use some other positioning technique.
In 1996 we had only CSS1. Ask it to use tables to do this, perhaps.
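For what it's worth, here's a minimal sketch of what that looks like: a two-column page laid out entirely with a table and HTML 3.2-era attributes. The file names, colors, and column widths are just placeholders, not anything prescribed by the spec.

    <html>
    <head><title>1996-style layout</title></head>
    <body bgcolor="#ffffff">
      <!-- Table-based layout: the standard positioning technique before CSS -->
      <table width="100%" border="0" cellpadding="4" cellspacing="0">
        <tr>
          <td width="150" valign="top" bgcolor="#cccccc">
            <!-- navigation column -->
            <a href="index.html">Home</a><br>
            <a href="links.html">Links</a>
          </td>
          <td valign="top">
            <!-- main content column -->
            <h1 align="center">Welcome</h1>
            <p>Content goes here.</p>
          </td>
        </tr>
      </table>
    </body>
    </html>

Asking the model for exactly this kind of markup (table cells for columns, presentational attributes instead of stylesheets) tends to get you something a 1996 browser can actually render.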