The initial motivation, slow LaTeX compile times, is very interesting to me.
I use LaTeX as a layout tool for books I print for hobby bookbinding, and my current project, a 3-megabyte, 500k-word beast of a novel, only takes around 10 seconds to compile.
I can't imagine what the author's friend had going on in his paper for his compile times to take such a hit. Was he required to use specific, inefficiently written LaTeX packages dictated by the journals he was submitting to? Or specific LaTeX features that end up hitting significantly slower codepaths?
Makes me wonder if it's not LaTeX itself that is the speed issue, but rather the very maturity of the ecosystem it has as an advantage over Typst. It's entirely possible that once Typst has the same breadth of features and functionality available through its ecosystem, it will become just as easy to fall into compile-time tarpits.
My experience is the same as OP's: Typst is significantly faster than LaTeX. My book has all of: table of contents, parts and chapters, figures, code samples, tables, images, and bibliography. These all require multiple passes to lay out. E.g. you cannot insert page numbers until you have laid out the table of contents, since it comes before the other content; yet you cannot construct the table of contents before you have processed the rest of the document. A typical novel won't have most of these, so I think it will be substantially easier to lay out.
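To make the multi-pass point concrete, here is a minimal sketch (the chapter titles are just placeholders): on the first run \tableofcontents finds no .toc file and typesets an empty table, while each \chapter appends its title and page number to the .toc; only a second run can typeset the finished table.

    \documentclass{book}
    \begin{document}
    \tableofcontents  % pass 1: no .toc yet, so empty; pass 2: reads it
    \chapter{One}     % writes "One ... <page>" to the .toc for the next run
    Some text.
    \chapter{Two}
    More text.
    \end{document}

And since the table's own length shifts everything after it, a change there can require yet another pass before the page numbers settle.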
PhD theses often have many figures, and these are often not optimised. My thesis is around 30MB, and that's after I optimised a couple of the most egregious figures. I'm planning to improve this in the corrections phase, but file size is really not something most people will be concerned about in the writing phase, so compile times are likely to suffer.
For further reference, a single-pass compilation of my thesis currently takes 25 seconds, and multiple passes are of course needed if the layout or bibliography changes. I ended up setting up TeXstudio to compile only once for the preview, and to run the full N compilations for the final build. That, plus liberal use of \includeonly, made compile times not much of an issue.
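For anyone who hasn't used it, \includeonly is standard LaTeX rather than anything exotic. A minimal sketch of the setup (the chapter file names are made up):

    \documentclass{book}
    \includeonly{chapters/ch3}  % typeset only chapter 3 on this run
    \begin{document}
    \include{chapters/ch1}  % skipped, but its .aux is still read,
    \include{chapters/ch2}  % so page numbers and \ref targets stay stable
    \include{chapters/ch3}  % the only chapter actually recompiled
    \end{document}

Because each \include'd file keeps its own .aux, the skipped chapters contribute their cached page numbers and cross-references, and the build time scales with the one chapter you're actually editing.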
A few parts are dramatically slower. Images, for instance, really slow down the compilation for me, especially if LaTeX needs to read a high-resolution image into memory. Raw text like that of a novel is very fast.
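One mitigation worth knowing about: the standard classes take a draft option, and graphicx honors it by drawing a labelled box instead of the image. As I understand it the file is still opened for its dimensions, but it isn't decoded or embedded, which is where most of the time goes. A minimal sketch (the image path is made up):

    \documentclass[draft]{article}  % draft also marks overfull lines
    \usepackage{graphicx}
    \begin{document}
    % rendered as an empty frame labelled with the filename,
    % skipping the expensive embedding of the high-resolution file
    \includegraphics[width=\textwidth]{figures/huge-photo.png}
    \end{document}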
Is it not strange that these systems don't do incremental compiles? Things are literally paged. Why do newer systems such as Typst do full compiles when a single page is edited? I know that all subsequent pages might be affected, but surely one could work around this by offering a sloppy-but-fast compile option for subsequent pages, one that sacrifices correct layout for something decent?