
> The Claude Platform on AWS is a first of its kind offering for Anthropic, giving you all native Claude API features from day one. Anthropic operates the service and data is processed outside the AWS boundary.

So it's not... On AWS... ?

This statement sounds.... Backwards?

I get that they have another option that is in AWS, but this continues the cryptic naming problem AWS is already overloaded with


I think the idea is that you can launder your team or product AI spend through your AWS account. This matters in Enterprise. It looks like the difference with Bedrock is that you access more "Claude platform" stuff than just the model.

More charitably, this lets an org heavy on AWS use their existing IAM / SSO / FinOps processes to manage Claude stuff. This is genuinely helpful when, otherwise, you would have to go through several teams and build out whole new rails to adopt.


> I think the idea is that you can launder your team or product AI spend through your AWS account.

This is exactly it. For any reasonably sized org, setting up new contracts with new vendors involves a lot of procurement, lawyers, negotiations, etc.

If a team can just click a button in AWS, there’s no issue.

This is a product / solution that solves an organizational problem, not a technical one.

I wouldn’t even call it a hack so much as an extremely common strategy.


Sadly it’s going to be more nuanced.

The Bedrock models, at least, have additional click-through EULAs for Anthropic models. You’re going to need to review and agree to those as well.

Claude is going to be marketplace spend, and that’s usually capped at 25% of your PPA.


> click-through EULAs

Every year "don't agree to things on behalf of the company"

Every day "click here to agree that ..."


I've always wondered how this plays out in practice. I might certify that I have signing authority but I most certainly do not. What happens in the US (in Delaware?) when there's a dispute?

We had a customer try to back out of a contract by claiming the person signing didn't have authority. It didn't work because the person's manager (who has authority) was included in all of the communication.

Legally it didn't matter whether the signer had authority because the way the signer's company behaved during the signing process implied that the signer had authority.

E.g. if the CTO at a company tells a vendor to "send the contract over to my product manager", then the CTO created the impression with the counterparty that the product manager has authority, and the company will be bound to the contract based on that fact, regardless of whether the product manager actually has authority or not.

I'm sure it's more nuanced than this, but my understanding is actual authority is less relevant than implied authority. E.g. if you have your board of directors take away the CEO's authority to sign a contract, it doesn't automatically invalidate everything the CEO signs, since a counterparty can reasonably assume that the CEO has authority just based on their job title.


Generally any W-2 has authority to enter into contracts, strictly from the vendor’s POV. As a vendor you don’t need to get your customer’s publicly listed officer or director to sign off on contracts. The W-2 can also be fired for entering their employer into the contract, but that's not (directly) the vendor's problem.

Once a vendor has entered into a contract, that could change - e.g. "any change orders must be approved by $EMPLOYEE_SET".

It's absolutely wild that every W-2 employee can expose their employer to essentially unlimited liability, but AFAIK, that's the truth.


Well, you see, I had my cat click "submit", so we don't have to pay the bill!

"Practice is policy"

No, "This is exactly not it." They are buying your data on a cheap.

As someone who is dealing with the procurement of both in a medium-sized IT org, FinOps and infosec are exactly it.

Do you have experience selling to Fortune 100-sized organizations?

“I don’t have the budget for this but we have AWS credits” is something teams beg for all the time.

When people beg to give you money, you accept it. Why? It’s not some conspiracy theory. You accept the money because it’s money.


100% correct... Have an EDP or PPA? Reduced spend because of reasons? Well, now you can make it up in Claude tokens.

This is my day job. I couldn't get access to the Claude Platform even with a business goal justification because of the management overhead while having Anthropic model access with Bedrock.

Through AWS, assuming the underlying data governance is reasonable, this will be a much easier pill to swallow.


Yes, it sounds like a hack to get access to untracked spend in corporate accounts.

In my org, I have to file a form for reimbursement if I buy a pencil for $0.25, but in AWS, spend varies by +/- $5k per month and nobody even questions it. This will definitely make it trivially easy for me to build on Anthropic's services without even telling anybody, versus the hoops I would have to jump through to get it paid for any other way.


Another selling point has been a guarantee of 1:1 API feature and design parity between Anthropic and this Claude platform. That helps if you have workloads you want to balance between providers.

Nah, that's not what's happening here. This service is offered under AWS Marketplace. The only real argument is probably a shared billing console, and that's where it ends. It won't matter for small companies, small fish, but for the big pond this means new contracts to check, lawyers, and so on. So not really a "revolution" happening. News for startups, yes, but not so much for the big corps or gov.

It is basically like invoicing through AWS Marketplace.

> I think the idea is that you can launder your team or product AI spend through your AWS account.

Can confirm that this is the one and only reason that we use Claude through AWS


Also you can spend your commitment contracts :p

Isn’t that capped at 25%?

Something like that, but if your boss went too high in your commitment contract, it's nice to have different options to put there...

Yeah, it's a "marketplace private offer".

As other people have pointed out, it makes contract signing much easier.

The other side effect is that it bumps up your spend, possibly to the point where you are eligible for "private pricing", i.e. a global discount.

So it's a win-win for most people.


> Claude on Amazon Bedrock keeps AWS as the data processor and operates within the AWS boundary. This is a good fit for companies that have strict regional data residency requirements or need their data processed exclusively within AWS's infrastructure.

Seems like there are two different options.


Yeah, I think this could backfire. At the moment they have such a clear message with Bedrock about data governance. You now have to ask a question and probably get approval, where previously there was no question and hence no barriers.

At this point I can only assume that AWS wants to have this naming issue. It's an issue they have everywhere. SageMaker is the worst offender. Only a solution architect can guide you through such confusion…

As a long-time Amazonian, I can tell you it's simply because UX designers basically don't exist at Amazon (in case that wasn't obvious), and the ones that do exist are extremely bad at their job.

There's a top-level feature in AWS for investors to give out credits of like $120k of AWS spend during funding rounds. There are minimum spend commitments for cheaper prices (RIs). Funneling costs and invoicing through AWS has real benefits. AWS spend monitoring is literally a sub-industry with billion-dollar players.

The credits you get from AWS in their startup program are typically not spendable on Marketplace. At least what we got through YC we could not spend there. Not sure how Claude is integrating; maybe it's different here.

Yeah, as someone with strict export compliance concerns which force us to use Bedrock because it's exclusively US-based inference in our AWS account, this does nothing for me. Frankly, nothing Anthropic has shipped over the past 6 months besides the models themselves has been useful to our company, despite us running into the same problems they're trying to solve with all of those features (managed, remote agents). There's not really a good solution, as AgentCore runtime sucks and is expensive. You basically have to build this yourself because nobody is solving for self-hosted managed infra for agents, and we don't really have the time to build this sort of system on top of building our actual product. It's very frustrating for them to put this out as a win, when it doesn't help the people who are using AWS Bedrock to begin with.

Synthetiq offers self-hosted (local or in your cloud)

The problem is that data centers use SO MUCH water... sure, we humans let water evaporate, but this is a new source of water "waste" to the tune of nearly 2 billion gallons/year, just in Loudoun County, Virginia & connected water users [0].

When that water source is underground wells, it can take years (on the fast end) or decades (on the moderate end) for levels to come back. Look at California's water issue -- so many wells extracting water for farming have changed the land topography.

Also, when water 'comes back', it might come back in the ocean and not on land... reducing the available fresh water without desalination.

Data centers need the water to cool... but maybe there's room to find incentives for them to do so while making sure our water bills don't go up the way our electric bills are because of the extra load they are putting on utilities.

[0]: https://www.theregister.com/2024/08/19/virginia_datacenter_w...


The owner of the private space generally has authority to deny this already, there's no need for an additional law.

In the US at least, any private homeowner/renter can deny entry to their property, barring legal warrants and exceptional circumstances. A business can have a policy, and is generally legally protected as long as the policy is 1) equally applied, and 2) does not violate the ADA... A court would have to weigh in on whether glasses are allowed or not under the ADA... but I suspect there's already a case where a movie theater banned such glasses, and they would probably(?) win, since such individuals could be expected to have non-recording glasses.


A bunch of people pay to remove ads, and a bunch of people are happy to give businesses their attention (view ads) in exchange for services... i.e. Gmail, YouTube... but don't feel they use them enough / are annoyed enough to warrant $15-25/month.

Some brands are okay with impressions... you can build trust in your product by advertising it for weeks/months, and when the user does make a purchase, that brand is on their mind.


But the data collected is property of the government, and Flock is not allowed to use that data for additional business gain (according to their statements)...

So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case that's very similar to cloud storage vendors.

If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it is their responsibility to handle my request to delete it... AWS' stance would rightly be "talk to the website operator for CCPA requests".


The AWS analogy breaks down because AWS doesn't encourage customers to pool their S3 buckets into a nationwide searchable index.

Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?


Start standing in front of the cameras looking sketchy long enough till police are sent out to ya, then ask the cop who called.


Someone once dropped some fireworks not too far from me at 3am a few years back. They were loud and, yeah, cops were called. A few minutes later, about five cars drove past me at about 30mph over the limit. Not sure how they didn't see me or try to see me. But I know they didn't catch the BRIGHT orange and lifted car.

Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.

Instead of pulling over the easily identifiable car, they pulled over some random guy. They were behind him the whole time; five cop cars pulled behind him thinking that he had fired a gun a few minutes back.

He was let go without a citation, but the official reason, despite being paired with the dispatch for the firecracker, was a broken headlamp.


I may or may not know a business owner who got criminals off their business' street by saying he thinks he saw a gun any time criminals showed up to do things, everything from prostitution to selling drugs. Cops showed up immediately. They stopped coming by altogether, probably the safest street in quite a rough part of town.

It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.


> It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.

Is it crazy? Shouldn't the response be proportional?


Contrast that with someone being shot dead: if the killer drives away, the police might show up half an hour later.


The police should do a lot of things they fail to do.


Police prioritizing responses to violent crimes where lives may be in danger seems reasonable to me.


I don’t care. I don’t care who owns the data. If I can’t easily get private information like my movements removed from a database like this, the legislation does not sufficiently protect me.

It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.


The legal term is "distinction without a difference". Flock and others can't create a weaselly scenario to pretend it's something else; otherwise people could bypass all kinds of laws/rules just by giving some weaselly description to everything.

This also falls under the 2026 rule "Everyone Is 12 Now". Flock is literally acting like a 12-year-old to get out of following the rules. My 12-year-old tried to use this dumb parsing of things to avoid rules/consequences.


The problem with this is where do you draw the line? If I film you with my iPhone (e.g. you walk past in the background of my video), Apple should delete my video from my phone and iCloud account based only on your instructions?

Apple holds the data in iCloud, and Apple (or a phone network) may be leasing me the phone. That sounds pretty similar to the Flock situation.

I guess the difference is that Flock might be sharing the data from a customer's camera with other customers. Then they are definitely controlling it.

I think the bigger problem with Flock is the fact that their cyber security is so laughably bad that non-customers can easily access the data.


Not pronouncing on which path is the most dystopian; just for the fun of the exercise, what if we push in that direction:

Given the rule, I would expect (IANAL) that Apple should not deal with data stored on phones they sold.

People are responsible for what they store on their device. When I take a photo in the street, if someone comes to me asking to erase a photo because they or their kids were in the background, I'll tell them I don't publish any photos online, which is generally the concern people have in mind, and it stops there. But if they insist, I will remove it from my phone, because I'm too lazy to actually edit the photo and remove them from the picture, even if that is certainly doable with a simple prompt by now.

Now if Apple automatically stores photos on some remote server they own, they are the ones who should be responsible for making sure they won't store something illegally. Microsoft, Google, and Apple use PhotoDNA to detect known CSAM, if I'm not mistaken, though legally they only have to remove it once they get a notice about it. In the same way, they could proactively blur the faces of people not detected as the people whitelisted for the uploading account. And, by that logic, they should certainly remove the information regarding a person if they get a notice, just as they wouldn't keep CSAM data once notified, would they?

Anyway, the underlying issue is not who stores what, but what societies lose by letting mass surveillance infrastructure be deployed, no matter how the ownership/responsibility dilution game is played on top of it.


Are you using your phone photographs to track my movements? I don't care about the photographs part, I care about the "collecting data that can track my movements" part.

I don't mean my movements on the internet either. I understand that those things are easy to track. I mean in real life.

As far as responsibility for the data goes, you're right, it's not clear. Therefore, anyone who uses the data -- Flock or their customer -- should be required to delete it on my request.

That seems like a pretty clear delineation, no?


If Apple collects all the data and tracks movement, then yes, they should be liable.


A reasonably nuanced defense could likely claim that being able to do what you want would have much worse side effects on privacy.

For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?

For them to actually be able to do that would require they have nigh omniscience on everything stored by/for everyone in every one of their storage units. Even inside closed boxes.

Now, it's not the same thing of course - but hopefully you understand what I'm referring to?


Except that the analogy is that they already have, or can easily create, that list. If they couldn’t, their value proposition would be lame. “We know you’re looking for a specific license plate, here’s a million hours of footage from all over the city, have at looking through it all.”


Only for paying customers, which you aren't of course. If those customers paid public storage to inventory their stuff, then that inventory is their property. Surely it would be inappropriate to use their inventory data to find your naked photos. A violation of privacy even. (/s, kinda)

I was enumerating the likely defense, not that it's valid.


"Existing capability" removes the argument against onerous requirements, in a legal setting.


The law cares about lots of things we don't care about.


The law is there to serve society. If it is not effectively serving society, it should be changed.


This is also true according to their contracts (we were one of the first munis in the country to ostentatiously cancel our Flock contract, and the lead-up to that was a bunch of progressive legal experts poring over that contract looking for holes).


>a bunch of progressive legal experts poring over that contract looking for holes

All attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, and they can still interpret what the law means for you.

If you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both fellow conservatives and progressive judges.


As a practical matter, this may be good advice. But it also places a demand on someone with a legitimate concern that they go find an ideological "beard" to make themselves more palatable and sympathetic.

It's not hard to see how this enables an institution to gate itself from criticism.


Except that Flock very clearly benefits financially from having direct access to this data: owning (and in their own documentation, they very clearly do own it) a network of 80,000 surveillance devices across the country, and owning every single transit point for the data they collect, is what gets them to a $7.5 billion valuation from investors.

The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.

(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very familiar with all the dirty ball they play.)

Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?


That makes them a data broker in my reading, and at least in California, data broker legislation should apply. The CA Data Broker registry gives me "access denied", but that could be because I am outside the US.


I looked it up at https://cppa.ca.gov/data_broker_registry/ and didn't find Flock / Flock Safety in that list of the currently registered 566 data brokers.


Because Flock isn't a data broker. Flock's customers own their data, not Flock, and they use Flock's platform voluntarily to share data with other customers.


Flock charges to access the data which is voluntarily shared by other customers. I am struggling to note a difference in this practice from any other data brokerage service in existence.

Does Flock do some kind of P2P dance to avoid the data transiting their systems?


Legally how does it work if I upload a file to Google Docs and then share it with my contacts? Is Google then a data brokerage for my files?


They are not, because they are not operating a business that acquires and resells your data. You own your document, and Google isn't selling it to third parties. Flock doesn't own municipal data, and Flock is also not "selling it to third parties"; it's facilitating a sharing system that law enforcement agencies avidly desire.

Presumably the California data brokerage statutes were written specifically to prevent the kind of nerd-lawyering happening on this thread.


So… Flock uses their own platform and top-to-bottom tech stack to do everything, technically? Your local PD doesn’t use random cameras (like Reolink), doesn’t run a custom software stack (like Frigate in a container on some random VM hosted with AWS), doesn’t store the data wherever (like Backblaze)? The customers just have to install the Flock cameras and “order” the subsequent data from Flock? But you say they’re not at all responsible or accountable for any of it because, despite doing everything at every step, they’re “just a broker”?


I was referring to the claim that "Flock's cameras collect more data than is provided to police agencies" — that suggests that there is data not "owned" by the customers, which implies it's Flock's data, thus it might make them liable under Data Broker legislation.


If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.

I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.

Either way ALPRs and AI-facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.


> If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.

Ex-employee of Flock here: that's ABSOLUTELY what's happening.

And what's more, Flock lets them do so even when they know the agencies are legally not permitted to. They turn a blind eye and say it's not their problem to enforce ("oh, doing so in state X is illegal? Well, even if your agency is in state X, we didn't disable that feature"), then happily provide training to enable those agencies to do so (and it's a nudge-nudge-wink-wink part of the sales process).



Sharing data between customers is a large part of the point of the product.


Equivocation. My stock broker doesn't own my stocks either, they merely hold my assets in a brokerage account.


I encourage you to present that analogy to an actual court and see how far it gets you. It's very easy to find the statutory definition of a "data broker" under California law.

This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?


Are you aware that not every lawyer with skin in the game shares your opinion of what a broker is?

https://www.courthousenews.com/california-drivers-accuse-flo...


Technically, most stocks are registered in the name of a securities holding company, with you named as beneficial owner. That makes it frictionless for you to buy and sell. You enjoy all the rights of ownership, unless the broker lends your shares out to someone else.

You _can_ get shares registered in your name.


And you would (rightfully) be angered if your stock broker sold your shares and pocketed the proceeds, because you own them.


> But the data collected is property of the government

I thought this was the get-out clause from the constitutional problems with Flock? That because Flock is a non-government organisation it isn't restricted by the constitution (i.e. the constitution only restricts what the government can do).

They can't have it both ways - if Flock are collecting the data then they are subject to the privacy laws. If it's the government collecting the data via Flock as just a service, then they are subject to constitutional restrictions.


This is worth validating independently, but to be clear:

Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?

If so, that makes the panopticon slightly less powerful.


That’s a pretty compelling argument, but what if I went round to AWS’ house, peeked into their kitchen, and saw a crate of photos on their table with me in them?

I’d absolutely say:

“Hey, that’s me! Give me those right now!”

I’d also be pretty angry if they told me:

“Sorry we’re storing those for Corp Inc. Go ask them.”

To refute my own point though, this only sounds annoying because the data processor is being irritating by manually referring me to the data controller. In practice, it would be trivial for them to automatically forward communications between me and the controller.

That’s what feels amiss with the top-level article.


If I lease out a property to a tenant (apartment, retail, industrial use, whatever) and that tenant is committing an illegal activity on the property, would the landlord be liable for knowing it? Or not?

"Sorry FBI, the tenant renting my warehouse out to manufacturing cocaine is not my responsibility. I won't do anything about it. You deal with them."

Nope, that's a failure of a duty to act, and aiding and abetting a criminal activity if you have constructive knowledge.


I assume they are building "metadata" profiles of people based on the data they say they can't use directly. That seems like an easy workaround that satisfies the lip service they've given to the issue.


I would argue that the request was invalid in the first place.

If I see a flash from a speed camera operated by a business on behalf of a police department, your argument states I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police can file with the court and request that data as evidence.

The data belongs to the government, and you can't get around that by going to the business that holds the data and asking them to delete it.


> If I see a flash from a speed camera operated by a business on behalf of a police department, your argument states I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police can file with the court and request that data as evidence.

Sounds reasonable to me. If the police want to put up a camera, then the police should put up a camera.

Offloading their legal responsibilities to a third party company is shitty.


"Hey private prison please delete all data you have about me. And by the way, I'm locked up here by accident. Please release me."


Honestly private prisons are a farce anyways, so yeah this seems valid to me. The government doesn't get to get out of its obligations to citizens by outsourcing to third parties, and third parties don't get to wield government-level authority without government-level accountability.


So police departments should have to develop and host all their administrative software also? I think we can all see why that would be a terrible idea. Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.


> So police departments should have to develop and host all their administrative software also?

Yes. We're in a high-technology and information age. Police should be well-versed in and capable of understanding the technologies and information that people use.

> I think we can all see why that would be a terrible idea.

I don't.

> Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.

Why shouldn't police (or some law enforcement agency) be capable of operating and maintaining law enforcement technologies?


Develop, no. Host, yes. They should buy, own, and operate any technology like this on-prem. The only involvement that 3rd-party tech should have is sales, tech support, and maybe blind, encrypted backups accessible only by the municipality.


In other countries, police contract companies to develop software, then run and manage the software themselves. Putting up a continental dragnet to sell to government agencies is something I've only heard of from the US.

Nobody is saying cops should be writing software, but Flock shouldn't have access to the data and analysis tools it has right now. If American police can afford to be armed similarly to a small army, surely they can pay to run a couple of servers in a basement somewhere.

I'm surprised the USA is letting this happen given the culture of individual freedom that seems to have traditionally driven American laws.


> Nobody is saying cops should be writing software

I disagree. Businesses have their own internal software development teams.

Why shouldn't cops?


But we're not talking about speed cameras or a private entity with an exclusive contract with the police to provide traffic enforcement.

We're talking about Flock. A company offering surveillance as a service. Per their website:

>Trusted by over 12,000 public safety customers including cities, towns, counties, and business partners.

If Flock's argument holds, then most of the CCPA can be circumvented this same way. All it takes is a few entities and clever contract language.


Except the data does NOT belong to the government; that's the whole point of Flock operating the way it does. It's not governmental data collection, it's data collection by a private company that is then made available to the government upon request. And yeah: it is literally allowed to delete data, because again, it's not a government agency. It's just private data, collected by a private company, with the exact same status as you recording a public intersection with a camera from your window.


I think you're going to have a hard time with this...

Flock seems to leave the data in the ownership of the government. They are just providing the service of being custodians for storing and accessing that data.

You would probably get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever hosts Flock's data: "sorry, we're just holding the data on behalf of Flock."

In either my example case or your stated case, you would have a very hard time convincing the hosting business to destroy their customer's data without a court order or a court case showing that their policy is invalid and they must comply.

Not a lawyer, just noting the parallel.

I do appreciate that Flock's response says that they cannot use the data they've collected for other purposes.. which further reinforces my cloud storage analogy -- the cloud vendor can't look at your data you upload to storage to e.g. build profiles on you/your business.


> the cloud vendor can't look at your data you upload to storage to e.g. build profiles on you/your business.

Would our main check on this be whistleblowers?


Models are only as good as our understanding. From the abstract:

> Here we account for the influence of three main natural variability factors: El Niño, volcanism, and solar variation.

All of these events are decades-long (or longer) cycles for which we don't have a substantial number of data points... Sure, solar cycles seem to be 11 years, but we don't have a lot of scientifically usable (for forecasting) data points on that -- maybe 8 cycles? Less? And the cycles are not consistent. It's not like year 4 of one cycle is like year 4 of another cycle; we just determined there's a period of about 11 years that looks significant.

Same with El Niño -- it's not like it's 'true' or 'false'; there are degrees of it... and when it starts, and whether other conditions are right to make additional hurricanes that year, and how much cloud cover that generates, etc. etc. -- a lot of which we don't have data on prior to 1960, when we launched our first weather satellite...

As for volcanoes... there are lots of them, and we are not great at predicting the high-impact events... we certainly don't have sufficient data to accurately predict what happens if we had a huge eruption in a strong El Niño year during the height of a solar cycle.


I had a similar push years ago, but I took this approach one step further, for a similar reason Jeff mentions -- lower maintenance over time.

I was frustrated that (because my posts are less frequent) changes in Hugo and my local machine could lead to changes in what is generated.

So I attached a webhook from my website's GitHub repo to trigger an AWS Lambda which, on merge to main, automatically pulled in the repo + version-locked Hugo + themes. It then did the static site build in-Lambda and uploaded the result to the S3 bucket that backs my website.

This created a setup where I can now publish to my website from any machine with the ability to edit my git repo. I found it a wonderful mix of the WordPress-like ability to edit my site anywhere, along with assurance that there's nothing that can technically fail* (well, a failure would likely, ultimately, block the deploy, but I made copies of my dependencies where I could, so it's very unlikely).
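
For anyone curious, here's a rough sketch of the Lambda side of this (a sketch only: the repo URL, bucket name, and paths below are hypothetical, and it assumes git plus the version-locked hugo binary are bundled via Lambda layers under /opt, with the GitHub webhook delivered through API Gateway):

    # Rough sketch -- names and paths are hypothetical.
    import json
    import os
    import subprocess

    import boto3

    REPO_URL = "https://github.com/example/my-site.git"  # hypothetical
    BUCKET = "example-website-bucket"                     # hypothetical

    def handler(event, context):
        # API Gateway delivers the GitHub webhook payload as a JSON body.
        payload = json.loads(event.get("body") or "{}")
        # Only rebuild on pushes/merges to main.
        if payload.get("ref") != "refs/heads/main":
            return {"statusCode": 200, "body": "not main, skipping"}

        workdir = "/tmp/site"
        subprocess.run(["rm", "-rf", workdir], check=True)
        subprocess.run(["git", "clone", "--depth", "1", REPO_URL, workdir],
                       check=True)
        # Build with the pinned Hugo binary so the output never drifts
        # with whatever happens to be installed locally.
        subprocess.run(["/opt/hugo", "--source", workdir], check=True)

        # Upload the generated site to the bucket that backs the website.
        s3 = boto3.client("s3")
        public_dir = os.path.join(workdir, "public")
        for root, _, files in os.walk(public_dir):
            for name in files:
                path = os.path.join(root, name)
                key = os.path.relpath(path, public_dir)
                s3.upload_file(path, BUCKET, key)
        return {"statusCode": 200, "body": "deployed"}

Pinning the Hugo version inside the function is the whole point: the build output stops depending on whichever machine the post was written on.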

But really, the main thing I love is not maintaining really anything here... I go months without any concern about whether the website functions... unlike every WordPress or similar site I help my friends run.


Exactly; and I'm currently tinkering with different deployment options. One thing I may do to speed up the deploy is run the Hugo compilation on the server itself, so the only push that needs to happen for a new post is a few KB via git. A post-receive hook would then run Hugo and deploy into my public www dir.
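
A minimal sketch of that hook (the paths and branch name here are hypothetical; it assumes a bare repo on the server and hugo on the PATH):

    #!/usr/bin/env python3
    # Sketch of a git post-receive hook: check out the pushed source,
    # run Hugo, and publish straight into the public www dir.
    import subprocess

    GIT_DIR = "/srv/site.git"      # the bare repo receiving the push
    WORK_TREE = "/srv/site-src"    # scratch checkout of the source
    WWW_DIR = "/var/www/html"      # hypothetical public web root

    subprocess.run(["git", f"--work-tree={WORK_TREE}", f"--git-dir={GIT_DIR}",
                    "checkout", "-f", "main"], check=True)
    subprocess.run(["hugo", "--source", WORK_TREE, "--destination", WWW_DIR],
                   check=True)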


Molten salt solar power doesn't care. It remains hot.

Advancements in solar are also improving performance under cloud cover.

Also, you know, batteries. When someone makes it cost-effective to install a device to sell your car battery's power to the grid, we'll also have a better time managing the grid during spikes... It would be nice if that also provided home battery backup in blackouts... 70 kWh would get me through most of the ones I've experienced.


Molten salt solar power plants are completely obsolete. See, for example, Ivanpah being shut down early because the power it's generating is too expensive compared to solar PV: https://www.renewableenergyworld.com/solar/once-an-engineeri...


Molten salt absolutely does care; keeping it molten controls how much power can be withdrawn. It's a form of thermal battery (and an inefficient one).

Whether the sun is shining or not (and whether further withdrawal will freeze the salt) absolutely controls power output.

