make as a task runner is not too bad, but there are better alternatives today like just (as others have commented).
make as a build system is ok until you hit the warts.
- make/Makefiles aren't standardized, which is why automake exists. So now you're not writing Makefiles but templates, and generating the actual Makefile from them. This doesn't matter if you own the whole toolchain, but most people don't, so generating is what some folks do to guarantee their Makefiles are portable.
- make cannot do any kind of dependency resolution; it assumes that whatever you need is right there. That leads to configure scripts, which, like Makefiles, are not standard, so you use autoconf/autoreconf to generate the configure script that runs before you can even run a target with make.
- make (and adjacent tools like automake/autoconf/autoreconf) use mtime to determine if inputs are out of date. You can get into situations where building anything is impossible because inputs are out of date and running autoconf/autoreconf/automake/configure leaves them permanently out of date. (fwiw, many build systems can get away with using mtime if they can do proper dependency tracking)
All in all, the fundamental design flaw with make is that it's built with the unix philosophy in mind: do one thing well, and that one thing is "rebuild targets if their inputs are out of date." However, this is an extremely limited primitive, and modern build systems have to do a lot of work on top of it to make it useful as more than a basic task runner.
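The mtime point above is easy to see without make at all. Here's a minimal sketch in plain shell, using the POSIX `test -nt` ("newer than") operator, which is essentially the comparison make performs per target (file names `in`/`out` are made up for the demo):

```shell
# The check make performs for a target: is any input newer than the output?
# `[ A -nt B ]` is true when A's mtime is newer than B's.
touch out          # pretend "out" was built earlier
sleep 1            # ensure a visible mtime gap (1s resolution is the safe floor)
touch in           # now modify the input
if [ in -nt out ]; then
  echo "rebuild out"     # this branch runs: the input is newer
else
  echo "out is up to date"
fi
```

This is also why the autotools failure mode above is so painful: once a generated file's mtime ends up "behind" its template, every run looks permanently out of date.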
I'm referring to package management. Modern build systems all have some way of doing package management directly or interfacing with package managers instead of just shelling out to them, which you would have to do with make.
It normally works in conjunction with GCC's "-MMD -MP" arguments, which produce .d files that then get included back into the Makefile with something like "-include $(OBJS:%.o=%.d)".
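For the curious, that pattern looks roughly like this in a Makefile (a sketch assuming GNU make and gcc; the file names are made up):

```make
CFLAGS += -MMD -MP        # emit foo.d alongside foo.o, with phony targets for headers
OBJS   := main.o util.o

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Pull the generated dependency files back in; the leading "-" means
# missing .d files (e.g. on a clean build) are silently ignored.
-include $(OBJS:%.o=%.d)
```

-MP adds an empty phony rule per header, so deleting a header doesn't leave make complaining about a missing prerequisite.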
It doesn’t directly interpret any source file though, if that’s what you mean.
Do you really expect an answer from a self-described "ignorant"? Or is this a rhetorical question and you are hiding an answer inside it? If so, I don't get it. Wouldn't it be better to explain it in plain words?
> make cannot do any kind of dependency resolution, it assumes that whatever you need is right there. That leads to configure scripts, which like makefiles, are not standard
The ancient convention there is "make configure", which sets up whatever "make [build]" needs.
the only thing I really miss in make is the ability to use something other than mtime for up-to-date checks. So I resort to touchfiles, which are gross but still work better than a lot of other things (I'm looking at you, docker build caching).
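For anyone who hasn't seen the touchfile trick: a stamp file stands in for work that has no single output file make can check. Roughly (target and script names here are hypothetical):

```make
# .deps-stamp is a touchfile: it exists only to carry an mtime for make.
# The real work (installing deps) produces no single file make could track.
.deps-stamp: requirements.txt
	./install-deps requirements.txt
	touch .deps-stamp        # record "done as of now"

build: .deps-stamp
	./do-build
```

The wart is exactly the one mentioned above: the stamp's mtime is a proxy, so anything that perturbs mtimes (checkouts, copies, container layers) can make it lie in either direction.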