Oooh, I wonder if I could get that here - things like npm install (or really any package manager, certainly Homebrew) or git commands take obscenely long.
Commands that download random 3rd party code and dependencies from the internet are precisely the kind of thing that should be included in a virus scan, surely?
Especially since any npm package can execute any code at install time in npm's default configuration.
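For anyone who wants to opt out of that default, npm does have an ignore-scripts option; a quick sketch (check the docs for your npm version):

```shell
# Skip preinstall/install/postinstall lifecycle scripts for one install
npm install --ignore-scripts

# Or disable them by default in your user config (~/.npmrc)
npm config set ignore-scripts true
```

Note that some packages genuinely need their install scripts (native addons, etc.), so this can break things.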
On Linux this is extra juicy, since you cannot globally install any package without root unless you've explicitly changed the directory's permissions and/or location.
By default, npm installs into the local project directory, as it should. Only the OS package manager should touch system directories. Before using the "--global" flag, think about what you're actually trying to do, and what the better way to do that would be. One conventional workaround is to install commonly used tools to "~/bin". Root is not required for that.
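For what it's worth, a minimal per-user setup looks something like this (the directory name here is just an example; "~/bin" or anything similar works the same way):

```shell
# Point npm's prefix at a directory you own; no root needed
mkdir -p ~/.npm-global
npm config set prefix ~/.npm-global

# Put its bin directory on PATH (add this to ~/.bashrc or equivalent)
export PATH="$HOME/.npm-global/bin:$PATH"

# "Global" installs now land under ~/.npm-global instead of /usr/local
npm install --global typescript
```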
Of course, it's a good idea to keep the username used for development separate from the one used for browsing the web. It wouldn't be surprising for rogue npm packages to search for e.g. credit card details. I'm sure the browsers try to obfuscate that somehow, but how much can they really defend against code that is allowed to read the disk?
It wasn't incorrect, you misunderstood. npm just uses silly defaults on Linux while using saner defaults on Windows and macOS. I am a Linux user, and having to configure npm to use another path for global packages that I use in literally all my projects is just bad UX.
Sure, npm has bad UX. (For an actual example, see the stubborn refusal to follow XDG spec.) This is not an example of that. Defaulting prefix to "/usr/local" is what every well-behaved Linux package does.
It seems odd to install so many packages at the "global" (but not owned by root!) level, that using the "--prefix=~" flag would be a hardship. I just checked; I have three. You can't fault npm for this.
I'd say excluding source directories is less about engineers believing those are safe directories and more about engineers wanting to exclude their entire machine, with source directories being the compromise management went for.
If sysadmins have installed McAfee on your workstation, then presumably they want to use it. Installing it and then excluding code downloaded from the internet defeats the whole point. (The effectiveness/safety/whatever of antivirus is a completely separate issue.)
If you have antivirus software installed, you presumably want it to scan stuff that is downloaded from the internet.
On the other hand, if you don't believe/trust in the efficacy of antivirus software, then there's no point taking half measures and excluding some things from its scans - why use it at all?
> If you have antivirus software installed, you presumably want it to scan stuff that is downloaded from the internet.
I don't think you understood the original proposition. This is about corporate-controlled machines: the engineering teams didn't install this AV themselves; it's a company-mandated install. So no, there's no presumption that they want the AV to scan anything.
> if you don't believe/trust in the efficacy of antivirus software, then there's no point taking half measures and excluding some things from its scans
I 100% agree. Half measures / excluding some things IS pointless. But as I said in my comment above, that pointless half measure may just have been the only compromise management would agree to.
The problem is usually that AVs hook file operations to scan files. Unfortunately, software development performs a LOT of file IO through package managers and compilers, and in the compiler case many of the files look to the scanner like files containing code (e.g. obj files, libs, or executables), even if they only exist temporarily during the build.
Because of this, an AV product could work fine for every other department of the company but have an extreme negative performance impact on software devs. To give you an idea, it could mean the difference between a 5 minute and a 1 hour build. These issues are inherent to a generic AV product, so the fix is often simply to add those folders to an exclusion list.
Does it provide security for those folders? Nope. But the alternative could make it impossible to get work done.
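For a concrete sense of what that exclusion list looks like, here's a sketch using Microsoft Defender's PowerShell cmdlets (Defender is purely an example product; the paths and process names are hypothetical placeholders):

```shell
# PowerShell, elevated prompt. Microsoft Defender used only as an example;
# the path and process names below are hypothetical.
Add-MpPreference -ExclusionPath "C:\Users\dev\source"
Add-MpPreference -ExclusionProcess "cl.exe"
Add-MpPreference -ExclusionProcess "link.exe"
```

Path exclusions skip scanning files under the directory; process exclusions skip files touched by the named binaries - which is exactly why they also create the blind spot discussed above.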