Such a piece of software would have to handle all the intricacies of GitHub Actions, and it would also have to be kept up to date with the latest changes...
We are moving back to a Makefile-based approach that is called by the GitHub workflows. We can handle different levels of parallelism: make's own parallelism locally, or sharding indexed by worker number when running in Actions. That way we can test things locally, and we can still have 32 workers on GitHub to run the full suite fast enough.
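Concretely, that can look like a single Makefile with one target per test shard, so the same entry point covers `make -j` locally and one-shard-per-worker runs in CI. A minimal sketch, assuming hypothetical `NUM_SHARDS`/`test-shard-*` names and a `run_tests.sh` runner that filters tests by shard index:

```make
# Sketch only: shard the suite by index so one Makefile serves both
# local runs (make -j) and one-shard-per-worker runs in CI.
NUM_SHARDS ?= 4
SHARDS := $(shell seq 0 $(shell expr $(NUM_SHARDS) - 1))

.PHONY: test-all $(addprefix test-shard-,$(SHARDS))

# Local: `make -j4 test-all` fans the shards out over local cores.
test-all: $(addprefix test-shard-,$(SHARDS))

# CI: worker k runs only its shard, e.g. `make test-shard-7 NUM_SHARDS=32`.
# run_tests.sh is a hypothetical runner that picks the tests for a shard.
$(addprefix test-shard-,$(SHARDS)): test-shard-%:
	./run_tests.sh --shard $* --num-shards $(NUM_SHARDS)
```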
I also like that we are less bound to GitHub now, because it has been notoriously unreliable for us this past year, and we could move to something else more easily.
Take a look at https://dagger.io/. Declarative pipelines written in Node, Python, or Go, with parallelism and caching built in: steps are cached and reused if they haven't changed.
You could implement something similar by splitting the make targets in the GitHub Action before they get passed to make, so that each worker is assigned its own target, and then have a make target that runs all of the targets for local multithreaded builds via `make -j${NUM_CONCURRENT}`.
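For the CI side of that split, a rough sketch of a worker's step, assuming hypothetical `WORKER_INDEX`/`NUM_WORKERS` values supplied by the workflow matrix and a `list-test-targets` target that prints one target name per line:

```sh
# Hypothetical CI step: take a round-robin share of the make targets.
TARGETS=$(make -s list-test-targets)   # one target per line
MY_TARGETS=$(echo "$TARGETS" | awk -v n="$NUM_WORKERS" -v i="$WORKER_INDEX" 'NR % n == i')
make $MY_TARGETS

# Locally the same list just runs across all cores:
#   make -j"${NUM_CONCURRENT}" all-tests
```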