
Anything but exposing the abusers who Epstein and Maxwell trafficked to [1] and investigating (let alone prosecuting) child abuse [2][3].

Britain has many real problems. This isn't one of them.

[1]: https://www.pbs.org/newshour/world/the-epstein-files-rattle-...

[2]: https://www.aljazeera.com/news/2020/2/26/british-politicians...

[3]: https://www.theguardian.com/commentisfree/2022/apr/28/outrag...


The UK has been far better at this than the US thus far.

While we were having "No Kings" rallies over our elected Epstein co-conspirator, they arrested and demoted a royal family member over it. And a separate lord.


There was a second person arrested? I thought only Andy got got.


The US used to have a massive Strategic Helium Reserve [1]. Starting in the 1990s, Congress passed a law to sell down the reserve. This flooded the market with cheap Helium (yay, party balloons?) because the mandated pricing just didn't make any sense.

10-20 years ago there was a lot of talk about how this was foolish because it was depleting and squandering a nonrenewable resource. But the thinking has shifted on that, because helium is an inevitable byproduct of natural gas production.

Now natural gas itself is limited, but you can still get Helium from the alpha decay of radioactive elements. Some isotopes are particularly strong alpha emitters (eg Polonium-210, Radium-223). They're basically producing Helium constantly.

Helium is a known issue in various industries. The article notes (correctly) that MRI Helium use is decreasing because of the rise of so-called "Helium free" or "Helium light" MRI technology.

But there are short term supply issues. As noted, Qatar produces ~30% of the world's Helium currently. And that can be (and has been) disrupted by recent events.

Lithography is a particularly important consumer of Helium for superconducting magnets. That demand is rising with probably no end in sight. Lithography itself is on the cutting edge of technology and engineering, so it seems harder to replace. I mean, EUV lithography is basically magic.

[1]: https://en.wikipedia.org/wiki/National_Helium_Reserve


Shutting down the National Helium Reserve seemed like a good idea at the time. It was originally established when airships were considered essential for national security, largely for maritime patrol. But blimps and dirigibles fell out of favor for most military missions and there wasn't much demand for other uses, so it was politically hard to justify wasting tax dollars to maintain a reserve.

Ironically exactly now - while we are at or close to peak natural gas extraction - would be the best time to fill up strategic helium reserves worldwide. If every natural gas well was required to capture and store helium for future use we could extend that runway by multiple generations.

But instead of our grandparents' and great-grandparents' general idea of investing in the future of their societies, we've decided to stop doing that and run up all the debt possible to pass down to future generations.

It is quite depressing to think about.


The article briefly touches on insufficient recycling. Though it's not clear for which applications helium recycling is technically/economically feasible and for which it isn't.

Terrestrial helium isn't produced by nuclear fusion. It's produced by nuclear decay. As you may know, you get alpha, beta and gamma radiation from decay. Gamma rays are just energetic photos. You typically need thick lead and/or concrete to shield you from them. Beta radiation is high energy electrons. A thin sheet of steel will shield you from those.

And lastly we have alpha radiation, which is just a Helium nucleus. A sheet of paper will generally block alpha radiation.

Some materials are really strong alpha emitters. A good example is Polonium-210, where almost all of its decay energy is in the form of alpha radiation. This is also why Po-210 is so lethal when ingested, a property that has been exploited deliberately [1].

But this means that if you produce a lump of Polonium-210, it's basically radiating Helium. Almost all of the Earth's Helium comes from uranium and thorium decay.

[1]: https://en.wikipedia.org/wiki/Poisoning_of_Alexander_Litvine...
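For a sense of scale, here's a back-of-envelope Python sketch (my own numbers, using standard physical constants, not anything from the article) of the helium output of natural uranium decay:

```python
import math

# Back-of-envelope: helium production from 1 g of U-238.
# Each full U-238 -> Pb-206 decay chain emits 8 alpha particles,
# so in secular equilibrium ~8 helium nuclei per chain decay.
N_A = 6.022e23                     # Avogadro's number, atoms/mol
MOLAR_MASS = 238.0                 # g/mol
SECONDS_PER_YEAR = 3.156e7
HALF_LIFE_S = 4.468e9 * SECONDS_PER_YEAR   # ~4.47 billion years, in seconds

atoms = N_A / MOLAR_MASS                   # atoms in 1 g of U-238
decay_const = math.log(2) / HALF_LIFE_S    # decays per atom per second
activity_bq = decay_const * atoms          # ~1.24e4 decays/s

he_atoms_per_yr = 8 * activity_bq * SECONDS_PER_YEAR
he_grams_per_yr = he_atoms_per_yr * 4.0026 / N_A   # helium molar mass 4.0026

print(f"{activity_bq:.3g} Bq, {he_grams_per_yr:.1g} g He/yr")
```

So a gram of uranium yields on the order of 10^-11 grams of helium a year, which is why the usable terrestrial supply is really the helium that's accumulated in gas fields over geological time.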


> Gamma rays are just energetic photos

They are indeed. The average planet busting Gamma Ray Burst is just a Vogon trying to "get the whole family in".


I would think that lighting a Vogon family picture would be about as advisable as recording a Vogon speech. That is to say not at all.

Put another way: the idea didn't raise $17M, the team did. That's usually the case but you can fully expect a pivot in their future.

Once open source spreads into an area, it tends to kill (commodify) commercial software in that space.

For example, with databases, MySQL and Postgres "won". Yes, there are commercial databases like SQL Server and Oracle but they largely exist through regulatory capture and inertia. It's highly unlikely anyone will ever make a commercial general purpose database again. There are always niche cases.

Same with operating systems. Yes we have MacOS and Windows but what are the odds we get another commercial mass OS? I'd say almost zero.

It's the same for source control. Git "won". There are a handful of others (eg Mercurial). But gone are the days of, say, Visual Source Safe.

But when people talk about "what comes after Git" they really mean (IMHO) "what comes after Github", which is a completely different conversation. Because Github absolutely can be superseded by something better. Will it though? I don't know. It has an incredible amount of inertia.

As for AI and anything related to source control, I'd have a hard time betting against Anthropic. But remember the exit could be an HN post of "We're joining Anthropic!". Side note: I really hate this "we're joining X" framing. No, you took the bag. That's fine. But let's be honest.

For people with a proven track record, AI is a gold rush of acquisition more than creating a sustainable business, let alone an IPO. I think that's what this bet is.


GitHub has already been bettered - GitLab is much better, in my opinion.

Shifting liabilities from corporations to the public coffers is what companies do. You'll often hear this described as "privatizing profits and socializing losses". Let me introduce you to the Price-Anderson Act of 1957 [1]. It's been repeatedly extended, most recently with the ADVANCE Act [2]. It limits liability for the nuclear power industry in a whole range of ways:

- It removes jurisdiction from state courts to federal court. In recent weeks, the party of "states' rights" has been doing something similar to stop states from regulating prediction markets, as an aside [3];

- All actions are consolidated into a single claim;

- That claim has an inflation-adjusted absolute limit, which is somewhere around $500 million (I'm not sure of the exact 2026 figure);

- Any damages beyond that are partially shared by the industry through an industry-wide self-funded insurance program;

- The industry as a whole has a total liability limit, also inflation-adjusted. I believe this is around $10 billion.

For context, the clean up from Fukushima is likely to take a century and the cost may well exceed $1 trillion for a single incident [4]. So if this happened in the US, the government would be on the hook for almost all of it.

So I have two points here:

1. If you oppose any effort to shift liability from AI companies to the government (as I do) with legislation such as this, how do you feel about the nuclear industry doing the exact same thing? and

2. Minor point but I noticed in searching for the latest details, Gemini made factual errors, stating that "the Act is set to expire in 2025" when it was extended in 2024 until 2045. Always check AI's work, people.

[1]: https://en.wikipedia.org/wiki/Price%E2%80%93Anderson_Nuclear...

[2]: https://en.wikipedia.org/wiki/ADVANCE_Act

[3]: https://www.pbs.org/newshour/politics/federal-government-sue...

[4]: https://cleantechnica.com/2019/04/16/fukushimas-final-costs-...


This is what government should be doing. Figure out how to do something safely, make that a regulation, then shield companies from liability as long as they follow that regulation. In practice you won't extract trillions of dollars from most companies anyways, because they'll go bankrupt long before they manage to pay all that back.

I'm a big believer that humanity's future is in space in a Dyson Swarm. There are simply too many advantages. It's estimated that humanity currently uses ~10^13 Watts of power. About 10^17 Watts of solar energy hits the Earth, but the Earth's cross-section intercepts less than a billionth of the Sun's total energy output. A Dyson Swarm would give us access to ~10^25 Watts of power. With our current population that would give every person on Earth living space about equivalent to Africa and access to orders of magnitude more energy than our entire civilization currently uses.

I bring this up to present an alternate view of the future that a lot of thought has gone into: the Matrioshka Brain. This is basically a Dyson Swarm but the entire thing operates as one giant computer. Some of the heat from inner layers is captured by outer layers for greater efficiency. That's the Matrioshka part.

How much computing power would this be?

It's hard to say but estimates range from 10^40 to 10^50 FLOPS (eg [1]). At 10^45 FLOPS that would give each person on Earth access to roughly 100 trillion zettaflops.

[1]: https://www.reddit.com/r/IsaacArthur/comments/1nzbhxj/matrio...
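A quick sanity check of those per-person figures, using the order-of-magnitude estimates quoted above (so treat every number as illustrative):

```python
# Per-person shares of the quoted Dyson Swarm / Matrioshka Brain estimates.
POPULATION = 8e9       # roughly the current world population

SWARM_POWER_W = 1e25   # the Dyson Swarm power estimate above
BRAIN_FLOPS = 1e45     # mid-range Matrioshka Brain estimate

power_per_person_w = SWARM_POWER_W / POPULATION          # ~1.3e15 W each
zettaflops_per_person = BRAIN_FLOPS / POPULATION / 1e21  # ~1.3e14 ZFLOPS

print(f"{power_per_person_w:.1e} W per person")
print(f"{zettaflops_per_person:.1e} zettaFLOPS per person")
```

That's where the "roughly 100 trillion zettaflops each" figure comes from: 10^45 / 8x10^9 people / 10^21 FLOPS-per-zettaFLOP is about 10^14.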


It makes me wonder about what it would take to actually create one.

You’d need self-replicating machines to build it, naturally. You’d need some ability for them to mine from asteroids and process the materials right there on the spot. And they’d need to be able to build both the processor “swarmlets” (probably some stamped-out solar/engine/CPU package) and more builders, so that the growth can be exponential. Oh, and the ability to turn solar energy into thrust somehow using only fuel you can get from the mined asteroids. Maybe a prerequisite is finding a solar system that has a huge and extremely uranium-rich asteroid belt.

You would need a CPU design that can be built using the kind of fidelity that a self-replicating machine in space under constant solar radiation can achieve. But if you can get the scale high enough, maybe you can just brute force it and make machines on the computational scale of a Pentium 3, but there’s 10^40 of them so who cares. Maybe there’s a novel way of designing a durable computing machine out of hydrocarbons we have yet to discover.

The machines would have to self-replicate, and you'd need to store the instructions somewhere hardened. And that can be built out of materials commonly found in asteroids. Maybe hydrocarbons. Hell, may as well use RNA. These things need to be as good as humans at building stuff, so really this is just creating artificial "life" that has DNA and is made of cells that build the proteins needed to create the machine. Maybe they reproduce by spreading little DNA seeds that can attach to an asteroid with the right chemistry, and we just spew them into the cosmos at a candidate star and hope the process gets kickstarted. Hell, we could make each one spew its own DNA at the next stars over as soon as it's done. We'd have a whole galaxy computing for us; all we'd need is the right DNA instructions, the right capsule for them, and a way to launch them.

Maybe another civilization has already done this…


I very much disagree; there are just too many engineering hurdles for us to surmount for this to be a reasonable solution. When you actually break down the physics, the scale works against you.

You can't have "one giant computer" when the speed of light means a 16-minute ping time from one side of the swarm to the other. Also, cooling: space is a vacuum, so you can't just use convection. The inner layers would melt before they could radiate the heat away.

Even maintenance and power distribution, you're talking about trillions of nodes that need active course correction to avoid a chain-reaction of collisions.

There are so many reasons this is not feasible; it's more of a whimsical thought experiment. I've barely even touched on most of the issues.
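For what it's worth, the 16-minute figure checks out if you assume a swarm at roughly 1 AU orbital radius (so ~2 AU across):

```python
# One-way light crossing time across a swarm ~2 AU in diameter.
AU_M = 1.496e11   # metres per astronomical unit
C_M_S = 2.998e8   # speed of light, m/s

crossing_s = 2 * AU_M / C_M_S
print(f"{crossing_s / 60:.1f} minutes")   # ~16.6 min one-way
```

And that's just edge-to-edge latency; coordinating trillions of nodes across it is another problem entirely.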


Some people just choose to ignore reality..

Dyson Swarm sounds like the name of an aggressive cleaning machine.

It's 2071. James Dyson, now 124 and in better health than ever, thanks to the AI-fuelled nanorobotic revolution, has just lost control of the last of the company's Dyson Swarms. What started as a fleet of cleaning-nanites, a dirt and dust-eating squadron-for-hire, has gone rogue; all of Earth's organic matter is now on the menu. People still haven't forgiven him about Brexit.

I do find it kind of funny that the Dyson company sold a vacuum cleaner called the "Dyson ball".

If we absorb all of the Sun's energy using a Dyson swarm, the Earth is going to get very cold and dark

I suspect you've misread that document. It is a good document though. It's saying a large parts plant uses ~188,000 MWh, I think per year.

A modern AI data center uses 20-100MW+ of electricity. Those two things aren't the same. 20MW of continuous electricity use (which AI data centers do) translates to about 175,000 MWh of electricity per year. So at minimum a data center roughly matches that plant, and it might use 5+ times more.
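The MW-to-MWh/year conversion is simple enough to sketch; a continuous load in megawatts times the hours in a year gives megawatt-hours per year:

```python
# Convert a continuous electrical draw (MW) into annual energy (MWh/year).
HOURS_PER_YEAR = 8760   # 24 * 365

def annual_mwh(continuous_mw: float) -> float:
    """Annual energy (MWh) for a constant draw of `continuous_mw` megawatts."""
    return continuous_mw * HOURS_PER_YEAR

print(annual_mwh(20))    # 175200 MWh/yr, roughly the plant's ~188,000 MWh
print(annual_mwh(100))   # 876000 MWh/yr at the top of the range
```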

This document is only about energy usage, so we have to guess what "large" means in terms of employment, but 3000 to 7000 seems to be the range. Compared to 20-30 for a data center.

But AI data centers are worse because they actually produce what I call negative jobs. Currently, their only value proposition is laying off people and otherwise suppressing labor costs. All while residents pay more for their electricity with money they no longer have because they got laid off.


> A modern AI data center uses 20-100MW+ of electricity.

I understand the high end builds to have exceeded 100 kW per rack at this point, with the largest sites exceeding 1 GW (ie 10x your upper bound). So the smallest datacenters use as much as the largest auto plants, and the largest datacenters use 100x that.


It's hard to get numbers here, so I went looking for electricity usage figures for an automobile plant. This obviously depends on the size, but the estimates I could find for a theoretical plant that produces 1000 vehicles a day are:

- 300-400GWh/year of electricity usage. It's significantly more for EVs, as an aside;

- Such a plant employs 2000 to 5000+ people.

Data centers also vary in size but I've seen estimates of 20-100MW being a typical range. 20MW run continuously is 175GWh/year.

So it seems like one large AI data center employs probably fewer than 50 people and uses as much electricity as a plant producing upwards of half a million cars per year. Those cars have a lot of utility, obviously, and employ a lot of people.
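Putting those estimates together, here's the energy-per-job arithmetic (all figures are the rough midpoints above, so this is illustrative, not authoritative):

```python
# Annual electricity per employee: auto plant vs. AI data center,
# using the hypothetical estimates quoted above.
HOURS_PER_YEAR = 8760

plant_gwh, plant_jobs = 350, 3500   # ~300-400 GWh/yr, 2000-5000+ employees
dc_mw, dc_jobs = 20, 50             # a modest AI data center, <50 staff

dc_gwh = dc_mw * HOURS_PER_YEAR / 1000   # 175.2 GWh/yr

print(f"{plant_gwh / plant_jobs:.2f} GWh per auto-plant job")   # 0.10
print(f"{dc_gwh / dc_jobs:.2f} GWh per data-center job")        # 3.50
```

So even a small data center uses something like 35x the electricity per job of a car plant, before you get to the 100MW+ builds.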

Let's be fair: AI data centers currently produce almost nothing of value and contribute almost nothing to the local or state economy. They're being built speculatively on the basis of a potential future value add that has yet to materialize.

My view is that the "value" AI data centers will add is for employers, by allowing them to fire people and suppress wages. That's the true use case. So, in other words, AI data centers represent negative jobs.

Five years from now we'll see studies and media reports on how many jobs you can eliminate per MW of electricity. The added bonus is that all the residents will be paying more for their electricity for that "privilege".


Back when Netflix was $8/month I just had it forever without thinking about it. It was at first a great way to catch up on TV shows. Netflix was after all originally a place for studios to monetize old content, particularly TV. Even at $10/month, it was fine. But it kept going up.

I think I finally cancelled it at $14-15 but I go back for 1-2 months a year to catch up on stuff I want. I basically cycle streaming services.

I've searched for data on how often people do this. I'm 99% sure it's a small minority, but I bet it's growing. There is an inertia with subscriptions of every type. People are too lazy to cancel things they don't use. It's the entire basis for the gym model.

So somebody is doing the math in the background, working out how much they can raise prices and lose people to subscription cycling vs lazy non-cancellation, and it still favors raising prices. I suspect at some point that'll change and, when it does, it'll be too late to do anything about it.

My suspicion is that this kind of analysis will be a textbook example of a company making short-term optimizations all the way into extinction.
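To make that trade-off concrete, here's a toy model with entirely made-up numbers; the only point is that subscriber inertia can make a price hike profitable even if it quadruples cycling:

```python
# Toy model: annual revenue when some fraction of subscribers "cycle",
# i.e. only resubscribe for ~2 months a year. All numbers hypothetical.
def annual_revenue(price: float, subscribers: float, cycle_rate: float) -> float:
    loyal = subscribers * (1 - cycle_rate) * price * 12   # pay all year
    cyclers = subscribers * cycle_rate * price * 2        # ~2 months/yr
    return loyal + cyclers

base = annual_revenue(10, 1_000_000, 0.05)    # modest price, few cyclers
hiked = annual_revenue(15, 1_000_000, 0.20)   # higher price, 4x the cycling
print(hiked > base)   # True: the hike still wins under these assumptions
```

Under these made-up inputs the hike nets $150M vs $115M, which is presumably the kind of spreadsheet driving the price increases, right up until the cycling fraction crosses the break-even point.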

The only research I've found compares the move from cable to streaming and counts how many streaming services people have. I haven't found anything about streaming churn. If anyone knows of any, please let me know.


> I've searched for data on how often people do this

I've just started doing this late last year. I'm down to one active service at a time. I heard of it from someone else, so that's at least +2 to your tally


I'm reminded of Asimov's Three Laws of Robotics [1]. It's a nice idea but it immediately runs up against Gödel's incompleteness theorems [2]. Formal proofs have limits in software, but what robots (or, now, LLMs) are doing is so general that I think there's no way to guarantee limits on what the LLM can do. In short, it's a security nightmare (like you say).

[1]: https://en.wikipedia.org/wiki/Three_Laws_of_Robotics

[2]: https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_...

