It's a single number in the sense that one sitting of an IQ test produces one number, but that doesn't mean you'll get that exact number every time you take the test. Even ignoring more complex questions about IQ tests, your score will vary with simple things like how tired you are when you take it, so in practice there's real variance between sittings.
Well, based on the paragraphs in the README, it's not actually being updated anymore: it only reflects SteamOS as of August, and the author has stopped running their process to update it.
What prevents a farmer from simply switching back to the non-GMO seeds if the GMO option goes up in price? Or even ignoring that, switching to a different cheaper GMO seed from a different company?
I think that's the piece I and others are missing: isn't it ultimately a question of which seeds will make the farmer the most money? If a particular GMO seed suddenly becomes so expensive that either non-GMO or other GMO seeds are more cost-effective, why can't they just start using those instead?
Not really - if the market price for a crop has adjusted to the greater volume that GMO seeds can produce, switching back to non-GMO seeds becomes uneconomic.
Let's say GMO crops give you a grain yield of 1 ton/acre and non-GMO crops give you 0.5 ton/acre, with the market price set at, say, $100/ton. Switching back cuts the farmer's earnings in half in the best case, all other inputs remaining the same.
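In per-acre terms, with those numbers:

```
GMO:     1.0 ton/acre x $100/ton = $100/acre
non-GMO: 0.5 ton/acre x $100/ton =  $50/acre
```

Same land, same labor, same fertilizer - half the revenue.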
Now if the GMO seeds are controlled by a foreign entity, your entire agricultural output becomes dependent on that foreign entity not behaving badly. Whichever nation controls the entity that owns the GMO seed now has leverage over you.
So no, it isn't as simple as "switch back to using non-GMO seeds". This has to be carefully considered before adopting GMO seeds.
"Bugfixes" doesn't mean the code actually got better, it just means someone attempted to fix a bug. I've seen plenty of people make code worse and more buggy by trying to fix a bug, and also plenty of old "maintained" code that still has tons of bugs because it started from the wrong foundation and everyone kept bolting on fixes around the bad part.
One of the frustrating truths about software is that it can be terrible and riddled with bugs, but if you just keep patching enough of them and use it the same way every time, it eventually becomes reliable software ... as long as the user never does anything new and no one pokes the source with a stick.
I much prefer the alternative, where the code is written in a manner such that you can almost prove it's bug-free by comprehensively unit testing the parts.
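A minimal sketch of what I mean, in C (the `clamp` function here is just an illustration, not from any particular codebase): keep the parts small and pure, pin down every edge case, and where the domain is small enough, test it exhaustively.

```c
#include <assert.h>

/* A small, pure part: no state, no I/O, easy to reason about. */
static int clamp(int v, int lo, int hi) {
    if (v < lo) return lo;
    if (v > hi) return hi;
    return v;
}

int main(void) {
    /* Pin down each edge case individually... */
    assert(clamp(5, 0, 10) == 5);    /* in range */
    assert(clamp(-1, 0, 10) == 0);   /* below the range */
    assert(clamp(11, 0, 10) == 10);  /* above the range */
    assert(clamp(0, 0, 10) == 0);    /* exactly on the boundaries */
    assert(clamp(10, 0, 10) == 10);

    /* ...then check the invariants exhaustively over a small domain. */
    for (int v = -20; v <= 20; v++) {
        int r = clamp(v, -5, 5);
        assert(r >= -5 && r <= 5);   /* result is always in range */
        if (v >= -5 && v <= 5)
            assert(r == v);          /* in-range inputs pass through */
    }
    return 0;
}
```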
It's a question of whether the function has a prototype. If you declare `void foo()` (an old-style declaration with unspecified parameters, pre-C23), and then call `foo((float)f)`, default argument promotions apply and `foo()` is actually passed a `double` as the first argument rather than a `float`. If you instead change the declaration to `void foo(float)`, it gets passed as a `float`.
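A small sketch of both cases (this is pre-C23 C; the function names are placeholders):

```c
/* Compile with -std=c17 or earlier; C23 removed old-style declarations. */
#include <stdio.h>

/* Old-style declaration: no parameter info, so the compiler applies
   default argument promotions at the call site (float -> double,
   char/short -> int). */
void no_proto();

/* Prototype: the argument is converted and passed as an actual float. */
void with_proto(float x);

int main(void) {
    float f = 1.5f;
    no_proto(f);    /* f is promoted to double before the call */
    with_proto(f);  /* f is passed as a float */
    return 0;
}

/* K&R-style definition matching the old-style declaration; its float
   parameter is itself adjusted to double, so caller and callee agree. */
void no_proto(x) float x; {
    printf("no_proto got %f\n", x);
}

void with_proto(float x) {
    printf("with_proto got %f\n", (double)x);
}
```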
> I'd go read the original PR and the discussion that took place.
Until your company switches code repos multiple times and all the PR history is gone or hard/impossible to track down.
I will say, I don't usually make people clean up their commits, and I usually recommend squashing PRs for any teams that aren't comfortable with `git`. But when people do take the time to make a sensible commit history (when a PR warrants more than one commit), it makes looking back through their code history to understand what was going on 1000% easier. It also forces people to actually look over all of their changes, which I find a lot of people don't bother to do, and their code quality suffers a lot as a result.
Bisecting on squashed commits is not usually helpful. You can still narrow down to the offending commit that introduced the error but… it’s somewhere in +1200/-325 lines of change. Good luck.
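For contrast, when the history is granular, bisect can hand you a small diff automatically. A rough sketch (`v1.4` and `./test.sh` are placeholders for your own known-good ref and repro script):

```sh
git bisect start
git bisect bad HEAD          # the current commit is broken
git bisect good v1.4         # the last commit known to work
git bisect run ./test.sh     # let git binary-search the history for you
```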
This sounds unattainable to me. For code bases in the 2-million-plus-line range, something as simple as renaming a poorly named base class can hit 5000 lines. The original name wasn't even a mistake, but you'd still like to change it to make the code more readable given the evolution of the codebase. You would not split that up into multiple commits, because that would be a mess and it would not compile unless done in one commit.
Such PRs shouldn't be the norm but the exception. What happens far more often is that such refactorings land alongside other changes in the same PR. In the high-functioning teams I've worked on, this is usually done as a separate PR/change, because people are aware of the complexity the operation adds when mixed with scope-related changes, and because the refactoring shouldn't be in the scope of the original change.
FWIW, licensing is definitely part of why some 'obvious' stuff is still missing; Nintendo doesn't own the rights to games that they didn't develop themselves (generally speaking).
Ex. We'll probably never see the first six FF games on Switch Online; Square Enix is just unlikely to agree to that for a variety of reasons.
Which is rather surprising to me. I don't know what the contracts between Nintendo and developers say but I would have expected "rights to publish or distribute in perpetuity" would have been in there as part of the deal for making official carts.
That's probably true for later generations, but in the NES (and maybe SNES) era? Undoubtedly, they didn't have the foresight to write that into the contracts.
In the early days of television, many broadcasters were prohibited by contract from retaining any copies of a performance, because no value was seen in reusing them and there was no other reason to grant those rights. Also see shows like WKRP in Cincinnati, where music was only licensed for the narrow purpose of the original broadcast (and perhaps direct repeats in syndication); the licenses didn't cover release on home video, so the music had to be replaced.
Nintendo already tried this kind of power grab, more or less, and it failed.
The NES didn't have software lockout in Japan. Most third-party Famicom[0] games were manufactured and sold by their publishers, with little or no control from Nintendo. Nintendo's way to wrest back control over their platform was the Famicom Disk System, a disk-drive add-on that was intended to work around the console's 32k ROM limit (and the associated costs of ROM) with cheaper disks that could hold up to 64k of loadable data per side.
The key was that the FDS had two lockout features that Famicom cartridges didn't:
- FDS disks had an imprint of the Nintendo logo at the bottom that meshed with plates in the disk drive. If your disk did not say Nintendo on it, it would not mount in the drive and the game would not play.
- FDS disks were rewritable, and Nintendo planned to sell games at special vending machines that would write you a fresh disk. If you weren't selling your game through Nintendo, your game wouldn't be on these vending machines.
So if you wanted your game on FDS, you needed to sign a distribution agreement with Nintendo. I'm told the terms were rather draconian. Most developers just... put larger ROMs and better enhancement chips on their third-party Famicom cartridges. This continued until someone figured out how to copy FDS games using just the RAM adapter carts and Nintendo gave up on the FDS concept entirely.
To be clear, Nintendo did have software lockout in the US, and you did have to license your game to Nintendo and have them sell copies of it using their hardware. This caused a lot of problems for NES releases of Famicom games that had custom chips in them (e.g. Contra). But even then, these were not perpetual licenses.
A perpetuity requirement would have killed any and all licensed games stone dead. If you're making a movie tie-in cash-grab game on the NES, you don't want to have to license that movie in perpetuity just because Nintendo demands a perpetual sublicense for a game with an expected shelf life of about a year. Hell, not even Nintendo licensed Mike Tyson for Punch-Out perpetually.
> I would have expected "rights to publish or distribute in perpetuity"
I think that would have been unlikely to occur to Nintendo's lawyers, since in the NES era, publishers required the developer's co-operation to provide masters targeting any additional platforms. This was before intergenerational emulation and internet distribution became widespread, and Nintendo would have had a sunset date for NES title sales.
IMO the article ranges from a little unfair to highly misleading by talking only about box-office numbers; those are a much more complicated discussion than the article treats them as (especially since it includes a number of pandemic years). It also ignores the reason _why_ CG is so popular and why there are so many flops: you can make CG films a lot faster than hand-drawn animation.
Tons of hugely successful CG movies from the past few years aren't mentioned in the article (Inside Out 2, Frozen 2, the Mario movie, Moana 2, Spider-Man, etc.) - all of them had a higher box-office gross than every hand-drawn animated movie ever made except The Lion King in '94. I'm not saying they're _better_, just that box-office numbers aren't a great argument for hand-drawn animation; I don't think Disney and the other studios are very concerned about the flops when the successes rake in so much money.
"Operating system" has no real technical definition, it's a term that doesn't cleanly map to all the stuff we call "operating systems" today. Even the "technical" definition you gave is murky, that definition does not care _where_ the software is running. It easily encompass software running outside of the Linux kernel, much of it is expected to be there for the system to function properly and support various kinds of programs.
This thing is distributed as an installable OS image and has pretty specialized software that makes it manage your programs and data in a pretty specific way; IMO that's good enough to call it an operating system.
It costs more because it's less dense than DRAM - the same transistor count that produces 2GB of DRAM can only fit a fraction of that in SRAM because it's 6 transistors per SRAM cell vs. 1 + capacitor for a DRAM cell.
Power usage is also generally worse since the SRAM cells use continuous power to hold their state, while DRAM cells only use power during read/write/refresh (relying on the caps to hold their charge for a short time while not actively powered).
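A back-of-envelope sketch of that density gap, assuming only the per-cell counts above and ignoring capacitor area, sense amps, and other peripheral logic (which shift the real numbers):

```c
#include <stdio.h>

int main(void) {
    /* 2 GB of DRAM at 1 transistor per bit cell. */
    double dram_bits   = 2.0 * 8 * 1024 * 1024 * 1024;
    double transistors = dram_bits * 1.0;  /* 1T per DRAM cell */

    /* Spend the same transistor budget on 6T SRAM cells. */
    double sram_bits = transistors / 6.0;

    printf("Same transistor budget: %.2f GB of DRAM vs %.2f GB of SRAM\n",
           dram_bits / (8.0 * 1024 * 1024 * 1024),
           sram_bits / (8.0 * 1024 * 1024 * 1024));
    /* -> 2.00 GB of DRAM vs 0.33 GB of SRAM: roughly a 6x density gap */
    return 0;
}
```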