For human-maintained config, TOML is only "better" when the structure is so flat that it's almost indistinguishable from an INI file.
Anything more complex and it becomes one of the worst choices due to the confusing/unintuitive structure (especially nesting), on top of having less/worse library support.
YAML's structure is straightforward and readable by default, even for fairly complex files, and the major caveats are things like anchors or yes/no being booleans rather than the whitespace structure. I'd also argue some of the hate for YAML stems from things like helm that use the worst possible form of templating (raw string replacement).
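A concrete (made-up) example of the same structure in both formats may make the difference clearer:

```toml
# TOML: each level of nesting becomes a header path you reassemble in your head
[servers.web.limits]
max_connections = 100

[[servers.web.endpoints]]
path = "/api"
timeout = 30

[[servers.web.endpoints]]
path = "/health"
timeout = 5
```

```yaml
# YAML: the same hierarchy, visible at a glance
servers:
  web:
    limits:
      max_connections: 100
    endpoints:
      - path: /api
        timeout: 30
      - path: /health
        timeout: 5
```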
I'm with you on all that. I think YAML's fine, and I like it way more than TOML for non-trivial files.
I think Python's pyproject.toml is a great use of TOML. The format is simple with very little nesting. It's often hand-edited, and the simple syntax lends itself nicely to that. Cargo.toml's in that same category for me. However, that's about as complex of a file as I'd want to use TOML for. Darned if I'd want to configure Ansible with it.
Agreed. I do a lot of Ansible, and it took me a while up front, but I've become pretty accustomed to YAML, though I still struggle with completely grokking some of the syntax. But I recently took a more serious look at TOML and felt like it'd be a bear for Ansible.
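To sketch why (module and task names below are just illustrative): an ordered task list is natural in YAML, while TOML has no top-level arrays, so every task becomes an array-of-tables entry whose order and grouping are much harder to scan:

```yaml
# Ansible-style task in YAML (illustrative)
- name: Install nginx
  apt:
    name: nginx
    state: present
  notify: restart nginx
```

```toml
# The same task in TOML: ordering lives in repeated [[tasks]] headers,
# and the module arguments end up in a separate sub-table
[[tasks]]
name = "Install nginx"
notify = "restart nginx"

[tasks.apt]
name = "nginx"
state = "present"
```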
A few months ago I made a "mini ansible / cookie cutter" ( https://github.com/linsomniac/uplaybook ), and it uses YAML syntax. I made a few modifications to Ansible syntax, largely around conditionals and loops. For YAML, I guess I like the syntax, but I've been feeling like there's got to be a better way.
I kind of want a shell syntax, but with the ansible command semantics (declarative, --check / --diff, notify) and the templating and encryption of arguments / files.
> For human-maintained config, TOML is only "better" when the structure is so flat that it's almost indistinguishable from an INI file.
Agree. I've recently inherited a Python project, and I'm already getting tired of [mentally.parsing.ridiculously.long.character.section.headers] in pyproject.toml.
Seriously, structure is good. I shouldn't have to build the damn tree structure in my head when all we really needed was a strict mode for YAML.
> I'd also argue some of the hate for YAML stems from things like helm that use the worst possible form of templating (raw string replacement).
I was literally speechless when I saw helm templates doing stuff like "{{ toYaml .Values.api.resources | indent 12 }}", where the author has to hardcode the indentation level for each generated bit of text like a fucking caveman.
The tiny examples might look kinda okay, but when someone has stacked 10 different patch operations in a single file, it gets a lot harder to keep track of what's going on.
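To spell out the hardcoded-indentation problem with a hypothetical helm-style fragment (the values path is made up): the `12` has to match the surrounding YAML indentation by hand, so moving the block one level deeper means recounting at every call site.

```yaml
# Hypothetical helm-style template fragment
spec:
  template:
    spec:
      containers:
        - name: api
          resources:
{{ toYaml .Values.api.resources | indent 12 }}
```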
“Nesting is bad” is such a simplistic take. Nesting is absolutely essential and inescapable. What that statement really does is place a limit on what the thing it applies to can be used for. It would be better to spend a few more words expressing what you really mean.
Your comment is a simplistic take on "Nesting is bad" given the context.
It's not hard to infer that they're referring to nesting as a footgun: make it harder and you lose some power but you keep your feet.
Config files are a poor place for complex and deeply nested relationships. If it's not ergonomic to reach for nesting, people tend to be forced to rethink their approach.
The problem is "config" means different things to different people. Some people see config as "the collection of runtime parameters," basically a bank of switches: pyproject.toml is config. Others see any form of declarative structured data ingested by a runtime as config: docker-compose.yml is config.
And of course to minimize impedance mismatch, the structure should be similar to the domain.
So yes I want a "config file" to handle at least a dozen levels of nesting without getting obnoxious.
Then I guess to frame it in your language: they want formats that encourage config files, not "config files".
And I don't disagree. The problems of nesting objects "at least 12 levels deep" aren't going to be solved by the right format. The tooling itself needs to expose ways to capture logical dependencies other than arbitrary deep K-V pairs.
What if your problem is best expressed as "arbitrary deep K-V pairs"? It's going to be more common than not, nesting really is that fundamental.
There is no escape, you can't win. If you want the nesting, and assuming you can't remove it from the problem itself (as you often can't, or at least shouldn't), there's only one thing you can do: move inner things out, and put pointers in their place. This is what we do when we create constants, variables, and functions in our code: move some of this stuff up the scope, so it can be used (and re-used) through a shorthand. It loses you the ability to see the nesting all at once, but is necessary (among other reasons) when the nesting is too large to fit in your head.
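YAML's anchor/alias mechanism is one existing form of exactly this kind of indirection. A minimal sketch (key names made up; note the `<<` merge key is a YAML 1.1 convention that not every parser supports):

```yaml
# &defaults defines the shared block once; *defaults points back to it
defaults: &defaults
  retries: 3
  timeout: 30

services:
  api:
    <<: *defaults
    timeout: 60   # override one inherited key
  worker:
    <<: *defaults
```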
Of course once you do that, once you introduce indirection into your config format, people will cry bloody murder. It's complex and invites (gasp) abstraction and reuse, which are (they believe) too difficult for normies.
The solution is, of course, to ignore the whining. Nesting is a special case of indirection. Both are part of the problem domain, both are part of reality. Normies can handle this just fine, if you don't scare them first. You need nesting and you need means of indirection; might as well make them readable, too. Conditionals and loops, those we can argue about, because together they give a language Turing-complete powers, and give security people seizures. And we have to be nice to our security people.
This is whining that people won't endorse a lazy, poorly scaling approach to an engineering problem... and justifying that approach by conjuring hypothetical whiners against a common, better scaling solution.
If you need 12 levels of nesting, add indirection, or live with the fact no one is designing formats to enable your oddball mess of a use case.
12 levels of nested braces in a single function is already a crappy idea; it's an even crappier idea in a config file because of the generally inferior tooling, and now there's a downstream component that needs to change to support a cleanup (meaning it almost never gets fixed and the format just gets worse over time).
That aspect at least is easy to solve by drawing a stronger line between public content and stuff seen by people you actually know. This would be more of a social change than technical of course.
The harder part is figuring out how to handle public content, because there are still innocuous or positive uses for it as well.
> Anyone who understands what Bitcoin enables and still doesn’t agree that Bitcoin is valuable and good for the world has a wildly different set of values that I can’t understand.
You have that right at least.
I cannot understand anyone who promotes the use of systems that catastrophically increase the risks of human error, particularly for the most vulnerable, and which pointedly ignore that other solutions may work better.
Particularly when the people promoting those systems have an obvious financial incentive to misrepresent them.
> I cannot understand anyone who promotes the use of systems that catastrophically increase the risks of human error, particularly for the most vulnerable, and which pointedly ignore that other solutions may work better.
I can't speak for others, but I think it's great to promote whatever solutions may be solving problems for a particular use-case. Ideas like social key recovery give users an option to trade some level of full self-control for a safety net. I expect user-friendly implementations of ideas like this to be successful in the near future.
I think it's great that users can choose to participate in Bitcoin however they want. Some people will write their own private clients that nobody else uses. Some people will give all of their coins to a billion dollar company to hold for them. In-between the two extremes, people make their own decisions on what risks they can tolerate and participate in a way that's right for them.
Because doubling the amount of dollars in the past year or so is not at all worrying for "transitory" inflation. Don't be naive, it's coming. The politicians have kicked the can down the road for decades, and had they not bailed out the bigs in 2008 and during these past two years, we would be bankrupt. It's a house of cards. They are going to push us into a war to escape the inevitable.
Governments go to war to stop deflation, not inflation.
As you lower the interest rate below liquidity preference people will simply hold onto their money because they are speculating on higher interest rates.
No no no! You see velocity of money has decreased and productive output has increased. People don't like to spend the new money. Therefore inflation is low, do not be fooled by the Austrian tricksters suggesting otherwise.
If hyperinflation is a knowable event, you'll already be in it. People would instantly spend their money if they knew hyperinflation was coming, making it a self-fulfilling prophecy. It's really meaningless to make such a statement about any currency that isn't already in hyperinflation.
> Also, like regular graphics cards, newer versions are highly prized so one might say that graphics cards as an industry is already all about e-waste. Unless you’d like to buy my Radeon 9800 or my GTX 1070? They’re still perfectly good for gaming… except everybody wants the new RTX ray tracing etc.
This is highly inaccurate, particularly on the used market. People want cards that can handle whatever games they play within their budget. A lot of older cards run tons of games quite well still, especially if you accept lower framerates / resolutions.
RTX in particular isn't actually that coveted by gamers from what I've seen. It's seen as a nice-to-have at best.
I don't mind 16:9 so much as the incredibly poor choices for resolution and panel quality I'm faced with on most Windows laptops lately. It feels like TV salespeople took over and completely ignored what actually makes sense for a laptop.
1080p and 4K are both terrible choices for 13-14" screens, yet that's all you see anymore, and even on higher end models trying to get accurate info on panel quality and color accuracy is a nightmare.
And this is exactly why I'm so skeptical of cryptocurrencies in general. There doesn't appear to be any viable way to make them work as currencies that doesn't either have horrendous externalities, simply replicate what existing currencies already do (often poorly and with many downsides), or often both.
I don't think it's a coincidence that even a decade plus later, the primary use cases for crypto still seem to be grey/black market deals, speculative investments, and pyramid schemes.
> way ... that doesn't either have horrendous externalities
The CO2 emission externality need have nothing to do with Bitcoin or any other proof-of-work chain. Tax carbon at whatever level makes sense and Bitcoin will adjust. (As I understand it, even currently Bitcoin mining mainly uses renewable energy, because it's cheaper; and it's trending cheaper still.)
The externality is at the power plant, not the use. Banning a use is like basing your server's security on client-side Javascript.
> Tax carbon at whatever level makes sense and Bitcoin will adjust.
> The externality is at the power plant, not the use. Banning a use is like basing your server's security on client-side Javascript.
How would that work? Applying the same carbon tax on farming as on bitcoin? You always need to differentiate on use. Otherwise we could also just have a single income tax and be done with it. However taxing food as much as a Ferrari doesn't really make sense.
The whole purpose of taxing carbon is to reduce carbon emission to the efficient level and shift energy consumers away from uses that are not worth the cost in carbon emission.
Say you're a bitcoin miner powered by a coal plant. A carbon tax is imposed. The price of your power goes up. Your competitors, powered by solar, are unaffected. Maybe you keep going at the higher price; more likely, if the tax was set at anything like the genuine externality, you shut down. Possibly you keep going for a while, winding down your ops at this location but moving any new ones to find affordable power. Sucks to be you if you didn't anticipate the tax (which seems implausible, they won't announce it effective next Monday), but Bitcoin itself will hardly notice.
Say you're a farmer also in coal-plant-land. Aren't farmers powered more by internal-combustion engines than grid power? That should be carbon-taxed too in this world, and that's good: you want farming, where it's climatically most expensive, to shift to less-CO2-costly methods and crops. Farming spends energy on a much wider set of tasks, some of them more essential to the output than others, and some outputs more inelastically demanded than others. For some of them you adjust, for some you continue and pay the higher price. The ones you adjust were not worth the carbon cost; the ones you don't were. You have to charge your customers some amount more, depending on how essential the coal turns out to be in your case. Maybe, like the bitcoin miner, you stop farming, or shift to some sort of less-intensive organic farming; maybe you don't. Either way, it's more likely the right decision for the planet! We stopped pretending that dumping carbon is side-effect free.
You don't "differentiate on use" by politicians and bureaucrats deciding what's naughty or nice. They don't even know! It's an incredibly complicated problem! They further have no real incentive to do it even vaguely right, rather the opposite: any competent politician can look to the public like they're public-spirited while favoring concentrated interests. Was the FDA just stupid for banning the J&J vaccine the other day? No, they're fundamentally misaligned with the public interest.
Not to even mention the even more useless buzzword application of blockchains to business to pump up stock prices. I'd go as far as to say that cryptocurrency is the most "useful" application of blockchain to date. And even then, it appears only truly useful for dark web transactions and pyramid schemes. Why else would we use a wildly fluctuating currency that takes 20 minutes to send a payment?
While that -might- be true, the question is whether the dollar, the RMB, the CAD, and every other currency have anything like the -direct- pollution cost of Bitcoin; my understanding is that they do not.
I think the dollar undoubtedly has several orders of magnitude lower direct pollution costs, but also several orders of magnitude higher indirect costs.
It's a pretty tangly web, so hard to know what to lump in as a comparison but in the superlative case consider: the federal reserve, many bank/FI departments tasked with securing and transferring money safely, auditing (public ledger has many benefits for transparency and reporting), money transfer industry, international relations, lobbyism, US military dominance, etc.
Bitcoin has zero employees and probably only thousands of people working on Bitcoin-interfaced systems. The network uses a large amount of electricity, but that's about it; there are few other costs to account for. All of those industries above collectively employ millions of people. Should we account for only organizational energy consumption, or do we also account for salaries, and thus the private energy consumption of all the individuals necessary to support dollar hegemony?
I think it would be really interesting to find a number for "for each dollar in existence, how much is spent per year preserving the dollar's position as the global reserve currency?" How does this number compare to inflation? If it is greater than inflation, does that mean that dollar hegemony is unstable and its fall is inevitable?
Of the estimates I've read, it seems like BTC uses about 60% green energy. Which is about double the 'green-ness' of the broader energy economy, but it's still a significant amount of 'direct pollution' from carbon sources.
Couldn't disagree more. I've always had false positive issues with Apple's trackpads. The newer ones are slightly better in terms of accuracy, but they're so absurdly large that it still happens more often than it used to - and this is across 4+ macbooks and multiple generations, so it's not just a faulty model, and I'm not the only person I know with this issue.
* No ability to use single click + hover to highlight
* Relative sizing feels way off - everything in Finder always seems to be simultaneously way too spaced out while also being way too small.
* Never seems to remember view preferences properly, and often defaults to confusing arrangements.
* Doesn't like to stay connected to network drives, despite any number of tricks I've tried.
* The usual cut/paste/delete operations being needlessly complicated to perform
I do prefer macOS overall, partly because I'm tired of having to constantly tweak and fix Linux whenever I try to use it as a desktop system, and because of things like iTerm2 and BetterTouchTool.
But I really hate trying to do any kind of real file management with Finder, and most third-party apps I've tried just seem to replicate everything I dislike about Finder.