You’re basically locking yourself into a single development ecosystem, and a highly limited deployment ecosystem.
It’s not clear what the benefit of either is. I get that the MacBook gets great performance per watt, but the majority of work is gonna be done in desktop settings, so simply using equally or more powerful x86 chips is only gonna cost you a few dollars per developer per year in electricity.
And all that despite the fact that your development is on Docker, which doesn’t even have a working solution for the workflow you’re considering at the moment.
It’s currently under consideration, and by the time we’re ready to make a call on it, Docker will be too. In fact, they almost are.
But consider that we may be optimising for different things. Most new developers I hire can be thrown a MacBook and they’ll know what to do; Linux, on the other hand, doesn’t come with that guarantee, especially in the junior and front-end segments of the market where I work. It’s a (really) broad-strokes opinion, but I’m of the belief that macOS, and by extension MacBooks, offer us fewer overheads in terms of setup, maintenance, onboarding, and tooling suitability for the median developer. So that leaves us using macOS.
This is the factor we’re optimising for more than deployment portability - we weigh vendor lock-in less heavily than the developer experience for the median of our developers. For many of us on this forum, Linux on a bleeding-edge distro may be the best fit, but given our preferences we deploy MacBooks for portability. Whether or not it helps things overall, this is in Manila, where a net monthly salary is often less than the cost of a laptop, so we deploy one device that can be carried between home and work as required, for those who don’t have a personal machine.
With that, I see this as Apple locking us into that ecosystem rather than a choice we’re making on our side, so I’d rather lean into it and explore it further than do nothing. If it comes out positive, we’ll be ready to make the switch before Apple forces it on us; if not, we’ll deploy something ThinkPad-esque and keep our production instances on x86.
"With that, I see this as Apple locking us into that ecosystem rather than a choice we’re making on our side, so I’d rather lean into this and explore it further than doing nothing. If it comes out positive then we’ll be ready to make the switch before Apple forces us into it, and if not we’ll deploy something thinkpad-esque and keep our production instances x86."
As a long-time Apple user (personally and for my staff), please don't tie your business decisions to a company that treats professional users badly every time it can. Your median developer benefits from Linux knowledge in general, and you can deploy a stable distribution without fear of compatibility problems after a minor software update.
Apple's marketing and lure are great; I fell for their game for 20 years. But I cannot be comfortable with the ideas, business practices, and management practices that this generation of Apple deploys.
They have destroyed entire indie businesses through arbitrary changes and/or enforcement of App Store policies, not to mention that they're leading the war on general-purpose computing as we know it by locking everything down.
I want to be able to tell my children I didn't participate in that.
If I compiled the list of all the anti-professional moves Apple has made in recent years I would get depressed, and I don't like being depressed :) Here, watch this funny rant from a proven Apple professional user; maybe it will give you some insight. https://www.youtube.com/watch?v=MKJjLwMUPJI
On the other hand, the most valuable company in the world uses slave labor and charges the consumer the highest possible price; I cannot support this dynamic anymore. https://www.youtube.com/watch?v=zeEERdbfH0c
M1 performance is about much more than just battery life; it’s screaming fast in raw execution power as well. In single-core it’s even competitive with Ryzen, for goodness’ sake. That’s just mental.
I don’t see this as a significant lock in risk. It’s not like Apple are the only company selling ARM laptops and desktops, and it seems clear Google, Microsoft and Amazon among others are serious about ARM.
Sociological studies strongly suggest that masks do work.
For example, there was a recent study that compared COVID hospitalization rates of districts in Tennessee that had mask mandates vs those which didn’t and the areas with mask mandates had significantly lower disease spread.
It is, but with 70% of studies being non-reproducible we should be careful about what we conclude. Humans are not robots. Lots and lots of factors are in play.
I have worn a mask since March and have no problem doing so. I believe they help curb spread somewhat, but scientific rigor, not clickbaity politicized headlines in the NYT, is what is needed here.
Because it's not science. Spain has been wearing masks for months now, 99% of people comply, and it only gets worse and worse. Both of our statements are anecdotes.
Wasn’t the original Skype that Microsoft paid billions for also based on BitTorrent technology?
Not BitTorrent exactly, but close enough that if the blockchain spinoff were bought for that much, the blockchain folks would certainly consider it (rightfully, IMO) a victory in the blockchain column.
Right, until random builds start breaking in Visual Studio because some CSPROJ somewhere is pointing Visual Studio at the wrong DLL. There is no deterministic way to tell where VS will look for DLLs under many circumstances (maybe it’s looking at the GAC, maybe at one of the various NuGet paths, maybe at some of the default build paths), so it picks up a random, unpredictable version.
And since you can’t even look at the CSPROJ within VS without gymnastics, debugging isn’t even easy to get started with.
After the first week of constant anxiety, because it constantly felt like I had forgotten something, it was incredible: getting used to not having Google Maps at all times, and working up the courage to ask strangers for help with things like using their store’s phone to call a friend I was waiting for, or asking for directions. It was absolutely incredible.
I felt like a literal weight had been lifted off my shoulders, and I felt more independent and free than ever before. I explored a ton more, and had a great time overall. I was also less stressed out and did much better at work due to the significantly reduced distractions.
Of course, I was completely single at the time, which made all of this possible. I’m not sure it would be doable either when in a relationship or married.
Also, a lot of people are commenting about retreats and such, which I have also done and is great. However, living your normal life without being hyper-connected with your face in a screen at all times (basically going back to the ’90s) is a very different and refreshing experience.
After a smartphone died a while back, I switched to a flip phone. I used a variety of cobbled-together solutions to get SMS messages with calendar reminders, todo items, etc. The only things I missed were Google Maps and email, and I was fine without both: email should not be "urgent", and Google Maps was barely a consideration in a city I grew up in. But then we decided to sell our house, and instant back-and-forth using email, web links, etc. became important, and going to new locations was happening a lot as we viewed homes. So I got a new smartphone. I really enjoyed the time without one.

One trick I adopted after that, and still use from time to time, is putting my phone in ultra power saving mode. All the functions are still there, but it is far less tempting to pick up. Example: with no Reddit app, you are stuck using the browser, so you have time to ask yourself whether you really need to go on Reddit right now.
The security hand-wringing about NodeJS is really frustrating. If anything, NodeJS is more secure than something like the JVM or C++. If I include a third-party package on the JVM, I have absolutely no guarantee that it will behave well, much like in Node. In fact, in Node I can actually read the source code and see what the package I'm running is doing. In nearly every other environment, you may simply have access to a binary, with maybe some interface info.
So why do people not throw the same kind of fit about nearly every other programming environment as they do about Node/NPM? And frankly, why don't those other environments have the ridiculous security breaches we have seen in Node/NPM land?
The real problem with Node/NPM, I suspect, is the lack of a standard library. Simply having a standard library would have greatly reduced dependency and package hell. Further, a standard library would mean people would be more willing to write a little more code rather than pull in a new dependency.
This is 100% true, especially for trivial libraries that are someone's class project. And JavaScript developers are so used to dependency hell that one of my developers imported a third-party package just for date formatting.
This seems similar to the Bootstrap fallacy, where you use the same frameworks every time you build something in a language so that developers can work on all of your company's projects after learning just one. Using the same familiar libraries is great for reducing the time it takes to train someone to work on and maintain a large portion of your company's tech.
I am confused: are you using MomentJS for fancy output like "3 days ago", or for simple output like 5/31/2020? I can see how it is useful in the former case, but it seems like overkill in the latter.
The parent said something about problems being solved if there were a standard library, and that perhaps, with a standard library, people would be willing to write more code instead of just adding another dependency.
I believe these points:

- dependencies are carefully considered by users
- dependencies try to be dependency-free themselves, to assist with the previous point
- dependencies solve important domain problems; they are not trivial one-line functions
- dependencies are typically developed and tested by a known team or company which you trust, not just someone random

would be solved by the parent comment's proposed standard library.
What you’ve said has nothing to do with languages. Claiming X devs are better than Y devs is just your bias. Other languages can be more domain-specific, have more friction around package management, and have fewer developers.
Yeah, I feel like Deno will reduce dependency usage, and people will hurrah and say "look, using URLs as deps actually worked to make things easier!", when in reality the reason dependency hell freezes over is that Deno actually has a standard library.
The problem with adding to JS’s stdlib is that it’s interpreted by the browser. If the next ECMAScript version provides new stdlib features, it’ll take a while for browsers to adopt them, and during that time you’ll need polyfills.
Compare that to a new version of C++ where you just update your compiler, and the executable runs (almost) anywhere. No polyfills to run it.
I think the problem is less that it is interpreted and more that there are many different interpreters for different platforms, all of which are competing.
Python is also interpreted, but there isn't much problem changing the stdlib, because CPython runs pretty much everywhere you need it to. Sure, other interpreters like PyPy also need to implement changes to the language, but it's not a show-stopper like it is with browsers.
> If I include a 3rd party package in the JVM, I have absolutely no guarantee that it will work well, much like in Node.
In the JVM you can use the SecurityManager [1] to limit file access and access to similarly sensitive areas. If you want, you can fully guarantee that nothing is accessed unexpectedly.

Of course, that relies on the JVM not having a zero-day bug.
I remember ~10 years ago people were already complaining about how crazy it was to install random packages via pip instead of via the distro's package-manager repo. The packages from the distro's repo might be out of date, but at least someone had already vetted them, they said. If someone with a time machine went back and told them what we do with npm and Docker today, they'd probably quit programming on the spot.
I'm not too sure, but in Java you depend on specific versions and the packages are signed. Most companies have an internal repo they work off of, in case the public repo has downtime or a package gets removed from it. Also, deployments don't resolve dependencies; a single uberjar bundles it all up.
> Also deployments don't make use of dependencies, a single uberjar bundles it all up.
That's not actually completely right... there are problems with deploying a single jar. With Java 9 modules, you're actually throwing away module encapsulation if you deploy an uberjar. The current state of the art is to deploy the whole app plus the JVM as a jlink image, which requires no uberjar.
Sweden’s strategy failed on its own terms. There is absolutely no arguing that. Their first goal was to protect the vulnerable, and they failed miserably at that. Fundamentally because Tegnell resisted the fact that asymptomatic carriers existed.
One could argue, as some Iraq war proponents do, that the idea was sound for Sweden and the failure was in execution. But one cannot argue that it wasn’t a significant failure.
We're barely three months into this and you're calling it a failure? Could it have been done better? Yes. Is it a failure? No.
"no arguing", "can not argue" Yes, you very much can, and we'll see in a few years what the "correct" choice was.
The actual spread of the virus is a fraction of what Tegnell predicted. They were predicting close to 60% by the end of May. It was in the low teens with about a week to go in May in Stockholm.
Their primary goal was to protect the vulnerable ie the elderly. But Tegnell did not believe in the existence of asymptomatic carriers (well after other countries had locked down because of them) so care workers who were infected but not showing symptoms were in constant contact with the elderly since his guidance required them to avoid going in to work only if they showed symptoms.
Both of these were the primary stated goals. Neither were achieved.
The comment was referring to failure in execution, not necessarily failure in choosing the approach. It's hard to argue that Sweden shouldn't have been more careful with its elderly and vulnerable, even if going for herd immunity.
It's fairly impossible to say "Sweden should have been more careful with its elderly" - the measures Sweden put in place were designed with exactly that in mind.
That's why elderly care facilities were the first, if not only ones, to face quarantine.
However, when you have years of poor management of elderly care, and elderly care companies that refuse to heed the recommendations, there's not much "Sweden" can do in the moment without severe effort.
And the politicians should have put in that severe, very expensive effort. But that would have required them to be fast, which is not a very Swedish thing. Everything is extremely decentralised and in many cases privatized.
Either the state (highest level) should have forced an intervention, or all actors should have acted responsibly. But there is too much inertia. The elderly care is so poorly managed, it was not only a disaster waiting to happen, it actually was a slow burning dumpster fire even before Covid19.
How would they do that? I think it's safe to assume right away that the companies aren't going to act responsibly, because, well, they haven't for years. So that leaves the state, and what should they do?
Not like they can push a button and invent more caretakers, fix internal routines, etc, across an entire country.
I'd love it if they could, but in general there is going to be inertia when trying to effect change in a system that has been degrading for years, with actors actively working against those changes (in this case, companies putting profit over welfare).
It's so very, very easy to say "the state should have done something, fast", but it gets very difficult when you're trying to specify which parts of the state should have done what, to which actors and on which level.
Right, and it had nothing to do with Tegnell’s resistance to the idea of asymptomatic carriers and hence his guidance that care workers were free to go into work as long as they didn’t show symptoms.
Edit: Also, your comment basically justifies the lockdowns in every other country. Tegnell’s whole argument was that the enlightened Swedes didn’t need a mandatory lockdown because they would simply do the right thing without the government telling them. And there is some truth to that in a society that is wealthy, well educated, and highly trusting of its government. Despite that, its citizenry failed to achieve what Tegnell said they would.
How in the world would you expect other countries’ citizens to voluntarily do what the government was recommending when Sweden couldn’t? His criticism of lockdowns in other countries was completely unwarranted based on his own reasoning for why Sweden didn’t need a lockdown (which also, as you point out, turned out to be wrong).
Your argument has been repeated a number of times through this thread and it's bogus. Sweden's policy is already a failure because its economy has been affected to much the same level as other similar countries while experiencing a much higher death rate. And other countries are now opening up again to each other, while Sweden has become a pariah.
The federal government stopped giving the police military weapons. It also had consent decrees with several departments that helped improve things.
These and other measures are all things the current administration has reversed, which is almost certainly playing a huge role in why things have gotten so bad.
Fast and Furious was an ATF operation that allowed gun dealers to sell weapons to straw purchasers in the U.S., knowing and hoping they'd go to Mexican cartels, which would let the government legally go after the cartels.
The Obama admin was responsible in "the buck stops here" sense, but this wasn't about admin policy.
It's literally in the first sentence of the article: between 2006 and 2011. The initiative ran from 2006-2011. Operation Wide Receiver was the first operation that ran 2006-2008. Operation Fast & Furious was a subset that ran 2009-2011.
This information is readily seen by anyone who actually clicks the link.
Regardless, the facts are not political no matter how much you may want to make them. The initiative began under the Bush admin and continued under Obama. Arguably, neither had much to do with it as it was mostly the work of the ATF and its leadership.