
I think the purpose of this project isn't to change the LLVM build system, but rather to integrate LLVM better into Bazel. Bazel offers deterministic builds (which enables things like content-based caching and early stopping), but for that to work well you often need to build the toolchain itself with Bazel as well, since the object files depend on the compiler, not just the source files. Currently the system is somewhat broken: if you upgrade your compiler, Bazel doesn't notice the change, and your build cache can silently break as a result.

That said, I'm not sure why they don't just hash the compiler and library binaries instead of building them from scratch... but that's a different question.
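To make the hashing idea concrete, here is a minimal sketch (hypothetical function names, not Bazel's actual implementation) of an action cache key that covers the compiler binary itself, so a toolchain upgrade invalidates cached outputs:

```python
import hashlib


def file_digest(path: str) -> str:
    """Content hash of a file on disk (e.g. a compiler binary)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large binaries don't load into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def action_cache_key(source: str, compiler: str, flags: list[str]) -> str:
    """Cache key covering the source file, the flags, AND the toolchain binary.

    If any input changes -- including the compiler itself -- the key changes,
    so stale cached object files are never reused.
    """
    h = hashlib.sha256()
    h.update(file_digest(source).encode())
    h.update(file_digest(compiler).encode())
    for flag in flags:
        h.update(flag.encode())
    return h.hexdigest()
```

Swapping in a different compiler binary (even with the same path and version string) then produces a different key, which is exactly the invalidation behavior the parent comment is describing.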



Right, the Google monorepo is more monolithic than most people imagine. The compiler was part of the repo as well. I forget exactly how the bootstrap works, but it definitely builds a compiler if needed (most of the time there's a cached version somewhere already, so there's no risk of small changes resulting in gigantic builds).


The bootstrap isn't too much of an issue; a binary release of the compiler is committed inside the repo itself.


How many dozens of gigabytes is it, excluding artifacts?


No idea what the figure is excluding binaries/artifacts, but in total it was ~80 TB [0]. A visual comparison is available here: https://www.visualcapitalist.com/wp-content/uploads/2017/02/...

[0] - https://cacm.acm.org/magazines/2016/7/204032-why-google-stor....


Wild! Thanks for the info.



