Hacker News | rlpb's comments

> Doesn't this mean that solar/wind are insanely lucrative?

This is how markets are supposed to work. It provides an economic incentive for production to increase, which is what we want.

Consider what happens if you develop a farming method to produce potatoes for a fraction of the usual cost, but you can only meet 10% of total demand at your local market. What price are you going to sell your potatoes for when you show up to the market? You (like any free market seller) want to maximise your return, so you'll be able to sell for a fraction under the previous market rate, undercutting everyone else. Your farming method would be extremely lucrative.


Sure, but those same free markets will happily see those expensive producers go out of business. In the electricity scenario, that would mean blackouts.

If you triple the price, a new gas plant doesn't appear out of thin air. And the result won't really be lower consumption either, because most people are on fixed-rate contracts (I'm not in the UK so I don't know specifically, but this is very common elsewhere).


> Sure, but those same free markets will happily see those expensive producers go out of business.

No, because remember you are only able to meet 10% of market demand. The expensive producers will still get 90% of the business, and the market price for their product will remain basically the same. This is what we observe in the electricity markets today: the price to us is the cost of the most expensive product. The cheaper producers who cannot meet the full market demand still get to sell at the cost of the most expensive product.
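A toy sketch of that clearing rule, with made-up numbers (cheap wind covering 10% of demand, expensive gas covering the rest), just to make "the price to us is the cost of the most expensive product" concrete:

```python
# Illustrative "pay-as-clear" marginal pricing: every accepted producer is
# paid the price of the most expensive producer needed to meet demand.
# Numbers below are invented for illustration only.

def clearing_price(bids, demand):
    """bids: list of (capacity, cost-per-unit). Returns the price all sellers get."""
    supplied = 0
    for capacity, cost in sorted(bids, key=lambda b: b[1]):  # cheapest first
        supplied += capacity
        if supplied >= demand:
            return cost  # the marginal (most expensive needed) producer sets the price
    raise ValueError("demand exceeds total supply")

bids = [(10, 5), (90, 50)]        # (capacity, cost): wind at 5, gas at 50
print(clearing_price(bids, 100))  # gas is needed, so wind also sells at 50, not 5
print(clearing_price(bids, 10))   # wind alone covers demand, so the price drops to 5
```

The cheap producer's margin (50 minus 5 per unit here) is exactly the "insanely lucrative" part that attracts new entrants.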


> The cheaper producers who cannot meet the full market demand still get to sell at the cost of the most expensive product.

Which would mean it's super lucrative, and your same laws of economics will tell you they'll be building like crazy.

My whole point was that as soon as you get to a day where no gas is needed, you've lost the ability to react quickly, because no supplier will just leave a gas plant around for that.


Yes, but here’s the thing: you don’t have a monopoly over your potato farming method. Lots of new farms are built, and the more that are, the further the average price of a potato drops. Your expected return starts to drop. Your profit margins, and everyone else’s, get squeezed.

Investors begin to refuse to build new potato farms, because the return on their investment gets worse whenever anyone else decides to build a new farm.

But the people need potatoes and more potato farms! The government issues an incentive scheme to guarantee a minimum price for each potato sold. Potential farm owners bid against each other for the lowest price, which means they can build a farm and expect to break even.


> Investors begin to refuse to build new potato farms because a return on their investment gets worse whenever anyone decides to build a new farm.

If they all refuse, then they're leaving money on the table. One investor could invest in 10% of production only, and that would be very lucrative. It would be exactly my low-cost potato scenario.

In practice, they don't all refuse, or all invest. The market finds a balance. In time, producers switch to the new method, because anybody who doesn't leaves an opportunity for someone else to take their business and make more money.

This takes time, though. If we want things to go quicker, then we need to guarantee return on investment for longer, which is exactly what the government does by guaranteeing prices to renewable energy producers.


> This might be obvious, but all of those things have a single common denominator: Microsoft, over you, getting to decide what your computer is doing.

Sure, but Microsoft have to strike a balance, too. If they push too hard in this direction, they'll lose their users to Macs on one side (probably the majority) and Linux on the other (a minority in number, but perhaps significant in expertise and clout). Once an exodus begins, it's much harder to stop. So where we are in that balance, and the state of user mindshare migration, is still interesting to discuss.


You cannot git push something that is not committed. The solution is to commit often (and do it over ssh if you forget on a remote system). It doesn't need to be a presentable commit; that can be cleaned up later. I use `git commit -amwip` all the time.

Sure, you might neglect to add a file to your commit, or commit at all, but that's a problem whether you're pushing to a central public git forge or not.
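A sketch of that workflow in a throwaway repo (file name and messages are hypothetical; `git rebase -i` is the usual interactive way to tidy up, here `git reset --soft` squashes the wip commits non-interactively):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you

echo one > file
git add file
git commit -qm 'start work'

echo two >> file
git commit -qam wip        # quick checkpoint; message doesn't matter yet

echo three >> file
git commit -qam wip        # another checkpoint, safe to push anywhere private

# Later: fold the wip commits into one presentable commit before sharing.
git reset --soft HEAD~2
git commit -q --amend -m 'feature: do the thing'
git log --oneline
```

Nothing is lost in the squash: the final tree still contains all three edits, only the history is rewritten.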


TCP has an "urgent data" feature that might have been used for this kind of thing; telnet uses it for Ctrl-C, etc. It lets a byte bypass any pending send buffer and be received by the server ahead of any unread data.

Fun fact: Oracle implements cancellation this way.

The downside is that sometimes connections are proxied in ways that lose these unusual packets. Looking at you, Docker...
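Roughly what that looks like with plain stdlib sockets (a loopback-only sketch, not Oracle's actual implementation; the `!` byte standing in for a cancel request is made up):

```python
import select
import socket

# One byte sent with MSG_OOB ("urgent data") can be read by the receiver
# ahead of the normal in-band stream that is still queued.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(srv.getsockname())
conn, _ = srv.accept()

cli.sendall(b"lots of unread in-band data")
cli.send(b"!", socket.MSG_OOB)   # the "urgent" byte, e.g. a cancel request

# Urgent data shows up as an "exceptional condition" on the socket...
select.select([], [], [conn], 5)
urgent = conn.recv(1, socket.MSG_OOB)  # ...and is read out of band,
normal = conn.recv(64)                 # before the in-band stream.
print(urgent, normal)
```

With `SO_OOBINLINE` off (the default), the urgent byte never appears in the in-band stream; note also that TCP only keeps the most recent urgent byte, which is part of why the feature is considered a mess.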


Just googling it now and TCP urgent data seems to be a mess.

Reading the original RFC 793, it's clear that the intention was never for this to be OOB data, but to inform the receiver that it should consume as much data as possible, minimally processing or buffering it locally, until it has read up to the urgent data.

However, the way it was historically implemented as OOB data seems to be significantly more useful - you could send flow control messaging to be processed immediately, even if you knew the receiving side had a lot of data to consume before it'd see an inline message.

It seems nowadays the advice is just to not use urgent data at all.


Unfortunately there can be many buffers between you and the server which "urgent data" doesn't skip, by design. (There were also lots of implementation problems.)

It absolutely could have happened when the ecosystem norm is `curl https://third.party/installer|sudo sh`. That was the normal method for third parties to ship software before snaps came along.

We have Flatpaks to solve this problem too now, but AFAICT, while Flatpaks do support sandboxing, the UX is such that most non-power-users aren't enforcing sandboxing on the Flatpaks they install, so in practice the feature isn't present where it's most needed.


Here's an article that seems relevant to the topic:

"District Court Finds That Using Copyrighted Works to Train Large Language Models Is Fair Use"

https://www.finnegan.com/en/insights/ip-updates/district-cou...


Whether they are derivative works in the context of copyright law (which the GPL relies upon) has not yet been decided by the courts, AFAICT. So your assertion may be your personal opinion, but we don't know yet whether the law agrees. From some quick searches, it seems the answer isn't a slam dunk one way or the other and is still working its way through the courts.

What if a person puts in the work, but the work was worthless or can be trivially reproduced without effort?

See also: https://en.wikipedia.org/wiki/Sweat_of_the_brow


You mean like when I take a photo?


A photo is easy to take but hard to reproduce.


As is randomly splattering paint on a canvas, even with no artistic vision or skill.


There is well established case law on the contract that forms when you buy something from a store (say with cash). There is a contract, on implied terms. I think what we’re talking about here is entering into a contract (or not) on explicit terms dictated by one party, where the other party has not explicitly considered them and is barely given the opportunity to do so, if at all. I don’t think anybody is denying the ability of contracts to come into existence on implied terms.


Pinning dependencies by hash is completely undermined by automation that then updates the pins without meaningful review, as is common.


I think that this is the issue then, not pulling dependencies from the internet directly.

> meaningful review

Now that I think about it, maybe for the first time in history it's actually feasible to review all the code in the repos using LLMs. Before LLMs were a thing, for any big project that would be way too much work to realistically do.

Also, someone could provide code review of publicly available dependencies as a service, to avoid wasting tokens reviewing the same code again and again on each dev's machine.

I wonder if anyone is already working on such a service...


Most (all?) of the solutions offered are not providing a "code review service" but rather a "curated registry" one: download from us and we guarantee some things.

It's definitely more widely known/used for container images than individual software packages.

