
The fact that you're familiar with the Coke controversy proves the point. Before, they were rare and memorable. Now it happens every other week. Bots are intentionally driving division.

Why would they need to delete storage? They could just stop accepting new data past the cap.


Storage billing is partly time-based.

EBS is billed by the second (with a one-minute minimum, I think).

Once a customer hits their billing cap, AWS has to either give away that storage, let the bill continue to increase, or destroy user data.
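
A rough sketch of that tension, with a hypothetical per-GB-month rate (not actual AWS pricing):

  # Illustrative only: the rate below is hypothetical, not AWS pricing.
  RATE_PER_GB_MONTH = 0.08
  SECONDS_PER_MONTH = 30 * 24 * 3600

  def cost_per_second(gb_stored: float) -> float:
      # Storage accrues cost every second it exists, even with zero traffic.
      return gb_stored * RATE_PER_GB_MONTH / SECONDS_PER_MONTH

  # Once the cap is hit, the provider has three options, none of them free:
  #   1. stop billing  -> eats cost_per_second(gb) indefinitely
  #   2. keep billing  -> the "cap" wasn't a cap
  #   3. delete data   -> trades a billing horror story for a data-loss one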


I think most of the "horror stories" aren't about cases like this, so we can at least agree that most of them could be easily avoided before we tackle the more nuanced problems. One solution would be clearly communicating how a limit works and what the daily cost of retaining the maxed-out storage would be; for a free account, the settings could be tuned so these "costs" stay within the free quota.
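
For instance, with a hypothetical ~$0.08/GB-month rate, the daily cost of retaining capped storage is easy to state up front:

  # Hypothetical rate; real pricing varies by service and region.
  rate_per_gb_month = 0.08
  gb_at_cap = 100
  daily_cost = gb_at_cap * rate_per_gb_month / 30
  print(f"${daily_cost:.2f}/day to retain {gb_at_cap} GB at the cap")  # $0.27/day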


Everything on AWS can deny a request, no matter what the API happens to be.
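
So a cap could surface as an ordinary API error. A minimal boto3 sketch; the "BillingCapExceeded" error code is made up for illustration:

  import boto3
  from botocore.exceptions import ClientError

  s3 = boto3.client("s3")
  try:
      s3.put_object(Bucket="my-bucket", Key="new-object", Body=b"data")
  except ClientError as err:
      # A capped account could be refused here like any other denied request.
      # The error code below is hypothetical, not a real AWS code.
      if err.response["Error"]["Code"] == "BillingCapExceeded":
          ...  # back off and alert the account owner
      else:
          raise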


Everyone who makes this argument assumes that every website on the internet is a for-profit business, when in reality the vast majority of websites aren't trying to make any profit at all; they aren't businesses. In those cases, yes, absolutely, they want them brought down.


Or, instead of an outage, simply have a bandwidth cap or request-rate cap, the same as in the good old days when we had a wire coming out of the back of the server with a fixed maximum bandwidth and predictable pricing.
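
A cap like that is essentially a token bucket; a minimal sketch:

  import time

  class TokenBucket:
      # A fixed-rate cap, like a NIC with a known maximum bandwidth.
      def __init__(self, rate_per_sec: float, burst: float):
          self.rate = rate_per_sec
          self.capacity = burst
          self.tokens = burst
          self.last = time.monotonic()

      def allow(self, cost: float = 1.0) -> bool:
          # Refill proportionally to elapsed time, up to the burst capacity.
          now = time.monotonic()
          self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
          self.last = now
          if self.tokens >= cost:
              self.tokens -= cost
              return True
          return False  # over the cap: drop or queue, don't bill more

  limiter = TokenBucket(rate_per_sec=100, burst=200)  # e.g. cap at ~100 req/s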


There are plenty of options on the market with fixed bandwidth and predictable pricing. But for various reasons, these businesses prefer the highly scalable cloud services. They signed up for this.


The difference is that this actually happens, a lot, unlike your straw man. It happens often enough that there's a website dedicated to it.


It's not "exactly" that.


What's the difference? I'm not trying to be a hater, just trying to see how it differs from the already-included (or soon-to-be-included) features of these platforms, which all push into the "agentic, working for you while you're away" space.


I use lazygit for that. But any diff tool you like will work.


They don't have arbitrary access to your file system. They ask permission before doing almost everything. Even reading files: they can't do that outside the current working directory without permission.
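
Roughly the kind of check involved; a sketch, not any particular agent's actual implementation:

  from pathlib import Path

  WORKDIR = Path.cwd().resolve()

  def read_allowed(path: str) -> bool:
      # Resolve symlinks and ".." so escapes are caught before the prefix check.
      target = Path(path).resolve()
      return target.is_relative_to(WORKDIR)

  # read_allowed("src/main.py")  -> True
  # read_allowed("/etc/passwd")  -> False: the agent must prompt first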


Comments like this just show how bad the average dev is at security. Ever heard of the principle of least privilege? It's crazy that anyone who has written even one piece of software would think "nah, it's fine, because the software is meant to ask before acting".


I'm pretty comfortable with the agent scaffolding just restricting directory access, but I can see places where it might not be enough...

If you were being really paranoid, then I guess they could write a script in the local directory that then runs and accesses other parts of the filesystem.

I've not seen any evidence an agent would just do that randomly (though I suppose they are nondeterministic). In principle maybe a malicious or unlucky prompt found somewhere in the permitted directory could trigger it?
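
That escape is trivial if the agent is allowed to execute anything, since per-call path checks don't bind child processes; a sketch:

  import subprocess

  # The agent's own file API may refuse paths outside the workdir, but a
  # command it's permitted to run inherits no such restriction:
  result = subprocess.run(["cat", "/etc/passwd"], capture_output=True, text=True)
  print(result.stdout)  # contents escape the directory "sandbox"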


They might not be capable of ingenuity, but they can spot patterns humans miss. And that accelerates AI research, where it might help invent the next AI that helps invent the next AI that can finally think outside the box.


Yes, "general" means you can present it with a new problem that there's no data on, and it can become an expert on that problem.

