Not sure early versions of Rust are the best example of refcounting overhead. There are a bunch of tricks you can use to decrease it, and it usually doesn't make sense to invest much time in that sort of thing while there's so much flux in the language.
Yeah I was thinking the same thing. "10 years ago the Rust compiler couldn't produce a binary without significant size coming from reference counts after spending minimal effort to try and optimize it" doesn't seem like an especially damning indictment of the overall strategy. Rust is a language which is sensitive to binary size, so they probably just saw a lower-effort, higher-reward way to get that size back and made the decision to abandon reference counts instead of sinking time into optimizing them.
It was probably right for that language at that time, but I don't see it as being a generalizable decision.
Swift and ObjC have plenty of optimizations for reference counting that go beyond "Elide unless there's an ownership change".
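To make the "elide unless there's an ownership change" baseline concrete, here's a small Rust sketch (the function names are made up for illustration). The first form pays a refcount increment and decrement even though no ownership actually changes hands; the second is what that elision rule effectively compiles it down to:

```rust
use std::rc::Rc;

// Takes a shared pointer and clones it: this bumps the reference
// count on entry and decrements it again when `local` drops — pure
// overhead, since the callee never keeps the value.
fn len_by_clone(s: &Rc<String>) -> usize {
    let local = Rc::clone(s);
    local.len()
}

// Same result via a plain borrow: no refcount traffic at all.
// An "elide unless ownership changes" optimization turns the
// pattern above into this.
fn len_by_borrow(s: &Rc<String>) -> usize {
    s.len()
}

fn main() {
    let s = Rc::new(String::from("branch"));
    assert_eq!(len_by_clone(&s), len_by_borrow(&s));
    println!("{}", len_by_borrow(&s));
}
```

The Swift/ObjC optimizations mentioned above go further than this, e.g. pairing and cancelling retain/release operations across call boundaries rather than only within one function.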
In case anyone is interested, V8 before Ignition/TurboFan had different tiers [1]: full-codegen (dumb and fast) and Crankshaft (optimizing). It's interesting to see how these things change over time.
You mean the rockets that are intercepted? Which civilians have been hit, compared to the tens of thousands in Gaza in a few weeks? Those interceptor rockets cost Israel (and the US) much more than it costs Hamas to manufacture.
That's economic warfare. It's like a DDoS attack. Do something cheap that costs your enemy a lot.
The goal is to make occupation too expensive to sustain.
Israel and Hezbollah have been in a low-grade state of war for over a decade. If Hezbollah hadn't diverted its energy to the Syrian Civil War, they might by now have been in a state of total war. Hezbollah has units dedicated to infiltrating and attacking targets in Israeli Galilee. My guess is, and I could be wrong, that it probably doesn't make sense to attribute a motive of "dragging the US in" to Israeli strikes on Hezbollah. In many ways, Israeli strikes on Hezbollah are far more ordinary than strikes in Gaza are.
The Houthi leadership have repeatedly said that they will stop the attacks if the Israeli assault and ethnic cleansing of Palestine end.
The real naivete is presuming that the "lalala can't hear you, I will do whatever I want" playbook of US/Israeli foreign policy is sustainable in any way.
Lots of people say lots of things. When it comes to geopolitics, actions and capability are what matter. Until someone is willing to underwrite shipping insurance on the Houthis’ word, their leadership’s promises are worthless.
When a human driver must emergency brake for a downed branch, it's OK. When an AI does it, it's unexpected and needs to be hyperanalyzed. I swear, the trolley-car problem is absurd; it's poisoned all debate. 99% of crashes are people failing to stop in time because they don't drive defensively.
There's a good reason for this. It's because the human can be interrogated about what was going through their mind, whereas many ML models cannot. That means we can't ascertain whether the ML accident is part of a latent issue that may rear its ugly head again (or in a slightly different manner) or just a one-off. That is the original point: a theory of mind is important to risk management. It means we will struggle to mitigate the risk if we don't "hyperanalyze" it.
You're missing the context. The AI didn't actually do anything unexpected, unless you expected it to try and drive through a downed branch. The AI behaved exactly as it should. The unexpected part was when the car behind the AI didn't see the branch and, therefore, didn't expect the AI car in front to stop. Unexpected doesn't mean wrong.
Cars can do unexpected things for good reasons, as the AI did in this case.
I'm taking in a larger context. I think just reading the three cited examples is the wrong approach. For one, Waymo isn't sharing "all" their data; they've already been called out for bad practices, only sharing data from incidents where their own team decided a bad decision was made. That's not necessarily objective, and it can also create perverse incentives to obfuscate. So we don't have a great set of data to work with, because the data-sharing requirements have not been well defined or standardized. Secondly, if you look at reports of other accidents, you can see where AV developers have heinously poor practices around safety-critical software. Delaying actions as a mitigation for nuisance braking is a really, really bad idea when the delayed action is potentially safety-critical. I'm not saying Waymo is bad in this regard, but we know other AV developers are, and when you combine that with the lack of confidence in the data and the earlier questionable decisions around transparency, it should raise some questions.
Density is one of the main ways to get cost savings. But there are others too, and there's also a lot of hype around them. Chiplets for example. Or CXL for memory.
> but no customer wants to give Nvidia monopoly money forever either.
From a consumer perspective, I agree. From a datacenter, edge and industrial application perspective though, I think those crowds are content funding an effective monopoly. Hell, even after CUDA gets dethroned for AI, it wouldn't surprise me if the demand continued for supporting older CUDA codebases. AI is just one facet of HPC application.
We'll see where things go in the long-run, but unless someone resurrects OpenCL it feels unlikely that we'll be digging CUDA's grave anytime soon. In the world where GPGPU libraries are splintered and proprietary, the largest stack is king.
These stats can be misleading for other reasons as well.
The US and EU have a lot of existing infrastructure, construction of which required enormous amounts of CO2 emissions. Countries that are still building out their infrastructure should be expected to emit more CO2 per capita to catch up. (Modern equipment is more efficient, but still.)
> Countries that are still building out their infrastructure should be expected to emit more CO2 per capita to catch up. (Modern equipment is more efficient, but still.)
Well, the UAE is not exactly a "country that is still building", so your comment is rather misleading. Still, we can compare the "share of global cumulative CO₂ emissions" for the UAE and the USA: 0.3% and 24% respectively. Prorated to population, we get (0.3 / 10) / (24 / 330) ≈ 0.41, so the UAE's cumulative per-capita emissions are already about 40% of the USA's, even though its development only started 50 years ago.
And again, if you consider that out of the 10 million people living in the UAE, about 9 million are migrant workers, the vast majority living in dorms, the equation is clear: the UAE's current lifestyle is totally unsustainable.
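The per-capita ratio above can be checked with a quick back-of-the-envelope calculation. The figures (cumulative-emission shares in percent, populations in millions) are the ones quoted in the comment, not authoritative data:

```rust
fn main() {
    // Shares of global cumulative CO2 emissions (%), as quoted above.
    let (uae_share, usa_share) = (0.3_f64, 24.0_f64);
    // Populations in millions, as quoted above.
    let (uae_pop, usa_pop) = (10.0_f64, 330.0_f64);

    // Ratio of cumulative emissions per capita: UAE relative to USA.
    let ratio = (uae_share / uae_pop) / (usa_share / usa_pop);
    println!("{:.2}", ratio); // ~0.41, i.e. about 40%
}
```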
And they ship/externalize much of their costs to other places. Just a few examples with China: the US and EU send their plastic there for "recycling", and probably the majority of goods we use are manufactured in China, so China takes the "carbon hit".
China doesn't take the world's plastic anymore; that claim is about four years out of date. Since then, some of it was shipped to south-east Asian countries, but they started cutting back soon after.
That said, plastic is a problem in itself, not a "carbon" concern of the kind being discussed at COP28, so it's somewhat off-topic.
Regarding the "externalize manufacture"/"carbon hit", again, this is a little outdated. It was very true from 2005-2010, but since then China has developed rapidly and is now importing almost as much carbon as it is exporting. See graphs below for "consumption-based CO2 emissions vs. territorial emissions". Except for the EU, it is a matter of ~10% difference.
Swift is probably a better baseline.