I work as an Equipment Reliability Engineer at a nuclear power station. The only programming we are allowed to use is Microsoft Excel Macros, nothing else. There’s a reason why VBA is still alive.
No, I pay fastmail [1] to do it all for me. I am reading in the comments that a lot of people have success with self-hosting, and maybe I will look into it someday when I am not busy every hour of every day studying chemical engineering, but for now I am happy to pay £50/yr to have someone else deal with hosting, clean IP addresses, DNS/SPF/DKIM, or whatever else is necessary to make sure my emails land in people's inboxes.
make decides what to recompile by comparing timestamps: if a source file's last-modified time is newer than its object file's, that unit gets rebuilt, so only the necessary files are recompiled. So if you have 10 source files with 5000 lines of C in them and you change only one of them, it will not recompile everything; it will only recompile the source file that has changed.
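A minimal sketch of the kind of Makefile being described (file names are placeholders, and recipe lines must be indented with a tab):

```
# Each .o depends on its .c file; make rebuilds a .o only if the
# corresponding .c has a newer modification time than the .o.
CC = gcc
OBJS = a.o b.o c.o

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) -c $< -o $@
```

Touch only b.c and only b.o is recompiled; the final link still runs.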
Which makes me agree with the parent above; I don't see how exactly ccache is supposed to be used. Maybe for a distributed source directory with many developers working on it?
On the ccache site there is a section called "Why bother?"; the very first line:
>If you ever run `make clean; make`, you can probably benefit from ccache. It is common for developers to do a clean build of a project for a whole host of reasons, and this throws away all the information from your previous compilations. By using ccache, recompilation goes much faster.
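For context, one common way to put ccache in front of the compiler; this is just a sketch, and projects also do it via symlinks or their build system's compiler-launcher setting:

```
# Prefix the compiler with ccache via CC/CXX.
export CC="ccache gcc"
export CXX="ccache g++"

make clean && make   # first build compiles everything and fills the cache
make clean && make   # second build gets most objects straight from the cache

ccache -s            # print hit/miss statistics
```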
Putting aside my tongue-in-cheek comment, honestly this argument does not convince me very much.
What is the purpose of "make clean" other than to invalidate the whole build so that everything is cleanly recompiled? In such a situation I would want to completely invalidate ccache's cache as well.
I'm sure there are legitimate reasons for using ccache, but it is not very obvious to me what they are:
"Only knows how to cache the compilation of a single file. Other types of compilations (multi-file compilation, linking, etc) will silently fall back to running the real compiler. "
Well yes, the traditional use of makefiles has been exactly to cache the compilation of single compilation units and to recompile only the units that have changed - it seems ccache does not offer any finer granularity here.
Distributed development might be a good argument for this, but then what does it offer to facilitate that? It seems to suggest using NFS - which I could do with a Makefile as well. So is the advantage that it uses hashes instead of timestamps? Timestamps work quite well for me, but maybe that is a valid point.
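To make the timestamps-vs-hashes point concrete, as I understand ccache's default behaviour (the file name is a placeholder):

```
touch src/foo.c   # mtime changes, contents do not
make              # make re-runs the compile command for foo.c,
                  # but ccache hashes the preprocessed source and
                  # hands back the previously cached object file
```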
Another argument could be that it stores the compiled units somewhere else and therefore doesn't clutter the file system. But is that really a good argument? Build directories exist, so even if you'd like to keep compiling several variants in parallel, you could do so with a few different build directories.
And yes, there are quite a lot of newer alternatives to Makefiles, so ccache would have to compete with those build systems too.
I basically never `make clean`, but ccache is a boon for `git bisect`. In theory bisect takes log time; in practice, without ccache, it’s slower because, handwaving, build time goes by something like the log of the number of commits you jump across.
If it’s my branch, I often have a build of every commit in my cache. Even if not, each jump back and forth makes a bunch of new cache entries, many of which will be reused in subsequent bisect steps.
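For what it's worth, a rough sketch of that workflow, with ccache already fronting the compiler (the tag and test command are placeholders):

```
git bisect start
git bisect bad HEAD
git bisect good v1.2    # placeholder for a known-good commit
# at each step git checks out a commit in between; the rebuild mostly
# hits the cache for files unchanged since a commit built earlier
make
./run-tests             # placeholder for whatever reproduces the bug
git bisect good         # or: git bisect bad
git bisect reset
```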
It was my experience that building a native Android project with older NDKs benefitted hugely from the introduction of ccache, especially if you had multiple branches of the same code (release/nightly) that shared a significant amount of code.
That was in pre-Android Studio times; IDK what the situation is now.
This is exactly my experience with my differential equation lessons in my maths classes at Year 1 and Year 2 undergraduate chemical engineering. The way we were told to just follow the instructions and not have any critical thinking at all about what we were doing made me so unmotivated that I sort of gave up learning differential equations. I was lucky this was during lockdown, so I was assessed by online tests and was able to get through it, but my god was the teaching so so unengaging.
When AMD or NVIDIA sell GPUs to scalpers, and those scalpers resell the GPUs and keep the profit, there's outrage about it. Nobody ever says to "build more GPUs" to fix the problem.
When houses and new developments get snatched up by investors and landlords as buy-to-lets (or even by BlackRock and pension funds), there is little outrage and there are few calls for restricting who can buy them. The same argument is rehashed over and over and over again: just build more houses!
I feel that there is a disconnect between these two schools of thought. I personally believe that, while more houses should indeed be built, we should also have a serious discussion about whether houses should be sold en masse to very wealthy investors.
Building more GPUs requires billions of dollars of investment and many years to build a new fab. Building more houses requires only several hundred thousand dollars to form a new construction company and train some workers. So it's not a good analogy. It is physically possible to just build more housing in high-demand markets.
Separately, GPU scalping is (was) only happening because of cryptocurrency booms which everyone knew were temporary. Not many people are naive enough to scream for AMD or NVIDIA to spend billions of dollars on new fabs because of a temporary trend. The housing supply issues, on the other hand, have been brewing for decades and are not temporary.
> Building more houses requires only several hundred thousand dollars to form a new construction company and train some workers.
Hahahah look at this guy!
Bet you've never tried to get a permit for home construction? Shit can take YEARS in most states/countries. Even worse if you dare to suggest building apartment complexes. N I M B Y
One, chipmakers are indeed building new factories to increase chip production, so your entire premise is incorrect. Two, the people criticizing the scalpers are wrong: NVIDIA and AMD were simply generous enough not to raise the price of their products even though they could have; demand was high enough to justify it. Scalping GPUs is fair game. Three, the number of houses being bought by investment firms is negligible; it's not having a significant effect on prices. It's the land-use restrictions that make building housing difficult or impossible that are the problem, particularly those that restrict multifamily housing.