This work seems so much more challenging, interesting, and fun compared to what I do as a senior software engineer. Why does hardware engineering pay so much less??
The number of hardware designs companies can produce is restricted by economies of scale and the less mutable nature of hardware versus software, which in turn restricts the number of hardware engineers needed and thus the market demand for their skills. Also, much of the hardware production pipeline has been moved overseas to places with lower standards of living.
For the same reason game development or software development at SpaceX pays so little: if people want to do the work, you don't have to pay them to show up, only enough to survive.
Most employers in the U.S. with over about 1,000 employees are “self-insured”, meaning they hire the insurance company for the network and administration, but the company itself actually pays out the claims directly. At over 1,000 employees, I expect MailChimp was already self-insuring, and Intuit definitely does.
No mention of permafrost? Yes, the _air_ will be more habitable, but if the _ground_ is a boggy mess, you’re not going to be doing much with this newly habitable land.
There were also no occurrences of the words "sun", "light", or "insolation".
The further away from the equator you go, the more your sunlight becomes a fickle seasonal creature hitting you at odd angles in varying intensity. Temperature isn't the only variable affecting the growing season and agricultural productivity.
Detect it: poke a stick. It stops, at a pretty consistent depth, and not because of rock.
Remove it: overheat the planet. It will turn to mud. Anything built on it will subside or sink. Massive amounts of trapped organic matter will turn to methane.
Eventually, I presume the area would settle into a new climatic zone, with neither permafrost nor mud. Emphasis on "eventually".
That is not what this is about. That's an old demo, using only the documented features of the LCD. As far as I can tell ~everything on there is done using the standard charset and the 8 customizable characters.
There is no demo for what this article talks about; it ends on a bit of a cliffhanger, and it seems the author hasn't quite gotten to the implementation stage yet.
I kind of see that screen shaking its head at the end, going "WTF just happened?" like it's waking from some crazy dream it just had: "That's not what I'm supposed to be used for at all!!" Sort of like it was just given the choice of a red/blue pill.
I get the sense that hyperscalers don’t give Intel a ton of margin in negotiations, so they already have access to great server chips at competitive prices. And they’re bringing in AMD now to compete. Where M1 really shines is client-side applications like video handling, ML inference, and crazy branchy nearly single-threaded code. Servers can afford to run much lower clock speeds (see GCP n1 family, for example) because web serving is an embarrassingly parallel problem. Sure, some ARM might help lower costs of internal things like S3, but Graviton2 isn’t that incredible compared to the M1-vs-Coffee Lake comparisons going on in the laptop world.
I’m rather happy about buying the recently updated 13” MBP and basically skipping ARM Macs. My last one lasted 8 years, so I expect this one to last roughly as long. By then they’ll _definitely_ have ARM figured out, and in the meantime I can continue enjoying x86 compatibility and VMWare. For the fairly small segment of the market that we devs represent, I suspect I’m not the only one here implementing a _reverse_ Osborne Effect. Get your x86 Mac while you still can!
This is essentially what RISC-V does with its "Compressed" instruction set, except without the dictionary switching. They pulled a bunch of statistics over real-world machine code, ran it through compression, then reverse-engineered that compression to make something a bit more sensible to a compiler writer. I think this will work out vastly better than the haphazard patching of e.g. Thumb on ARM.
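To make the idea concrete: each 16-bit RVC instruction expands one-for-one into a regular 32-bit instruction, so the decoder just widens it on the fly. Here's a minimal Python sketch of that expansion for C.ADDI, following the field layout in the RISC-V spec (the helper name and the bit-twiddling style are mine, not anything from the spec or a real decoder):

```python
def expand_c_addi(rvc: int) -> int:
    """Expand a 16-bit C.ADDI into the equivalent 32-bit ADDI.

    C.ADDI layout (quadrant 01, funct3=000):
      [15:13]=000  [12]=imm[5]  [11:7]=rd/rs1  [6:2]=imm[4:0]  [1:0]=01
    """
    assert (rvc & 0x3) == 0x1 and ((rvc >> 13) & 0x7) == 0b000, "not a C.ADDI"
    rd = (rvc >> 7) & 0x1F
    imm = (((rvc >> 12) & 0x1) << 5) | ((rvc >> 2) & 0x1F)
    if imm & 0x20:                      # sign-extend the 6-bit immediate
        imm -= 0x40
    # ADDI (I-type): imm[11:0] | rs1 | funct3=000 | rd | opcode=0010011
    return ((imm & 0xFFF) << 20) | (rd << 15) | (0b000 << 12) | (rd << 7) | 0b0010011
```

Feeding it 0x1141 (the `c.addi sp, -16` you see in almost every function prologue) yields 0xFF010113, i.e. `addi sp, sp, -16`. Every compressed instruction has a fixed expansion like that, which is what keeps the decoder simple compared to a dictionary-switching scheme.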
Oof. This really hurts. So often when folks complain about Google shutdowns, I defend G with “but that was a consumer product! GCP doesn’t kill off things. It’s different.” But here we have Google Cloud (in my defense, not GCP) killing off a powerful, proprietary Enterprise product with no viable migration path. At least folks get a year to migrate off, but that’s a lot of sunk cost.
I love my ARM-powered Chromebook. True, it can’t really do native development, but ARM clients aren’t just coming. They’re here. And wow, does it have amazing battery life and an unbrickable OS.
I tried going down the route of dealing with OFX in GNUCash, and it's just a huge never-ending hassle. Tiller dumps the raw data from Yodlee (the backend that does the gnarly interfacing and cleaning for most PFM sites, besides Mint) into Google Sheets. From there you can use Apps Script or the Sheets API to push the data wherever you want. Sure, it's $5/mo, but well worth it for not ever dealing with bank websites or APIs or OFX formats again.
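If it helps, the Sheets API half of that is only a few lines of Python with google-api-python-client. The spreadsheet ID, the service-account credentials file, and the "Transactions" tab/range below are assumptions about your own sheet, not anything Tiller-specific:

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SPREADSHEET_ID = "your-spreadsheet-id"  # taken from the sheet's URL

creds = Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/spreadsheets.readonly"],
)
sheets = build("sheets", "v4", credentials=creds)

# Pull every populated row from the Transactions tab, skipping the header row.
rows = sheets.spreadsheets().values().get(
    spreadsheetId=SPREADSHEET_ID,
    range="Transactions!A2:Z",
).execute().get("values", [])

for row in rows:
    print(row)  # push these wherever you want: a database, GNUCash import, etc.
```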
Looks cool, but wow, you're really handing all your banking data to Google? Does Tiller support raw CSV downloads?
I'm using Plaid's API with a read-only user on my bank account to download the data locally and enter it into SQLite. Then when I want to run reporting on it, I generate a Ledger file. Having it in SQLite lets me run a web interface over it and change the category for each item (Amazon purchases, for instance, which are seemingly impossible to categorize automatically).
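For anyone curious what that looks like in practice, here's a rough sketch of the SQLite-and-Ledger half in Python. The Plaid pull itself is left out (store() just takes whatever tuples you fetched), and the table, column, and account names are my own placeholders, not the actual schema:

```python
import sqlite3

def init_db(path="transactions.db"):
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS txns (
        id       TEXT PRIMARY KEY,
        date     TEXT,
        payee    TEXT,
        amount   REAL,
        category TEXT DEFAULT 'Expenses:Uncategorized')""")
    return con

def store(con, txns):
    # txns: iterable of (id, date, payee, amount) tuples from the Plaid pull.
    # INSERT OR IGNORE makes re-running the download idempotent.
    con.executemany(
        "INSERT OR IGNORE INTO txns (id, date, payee, amount) VALUES (?, ?, ?, ?)",
        txns)
    con.commit()

def export_ledger(con, path="transactions.ledger"):
    # One Ledger entry per row: a date/payee line, then two postings that balance.
    # The category column is the bit the web interface edits (hello, Amazon).
    with open(path, "w") as f:
        for date, payee, amount, category in con.execute(
                "SELECT date, payee, amount, category FROM txns ORDER BY date"):
            f.write(f"{date} {payee}\n")
            f.write(f"    {category}    ${amount:.2f}\n")
            f.write("    Assets:Checking\n\n")
```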