Total nonsense; there are no studies that link colic to formula feeding. Some newborns have digestive problems with formula; some newborns have digestive problems with breast milk. Formula digestive problems can sometimes be helped by switching to a different formulation, while breast milk digestive problems can sometimes be helped by changing the diet of the breastfeeding parent.
And there are sources of colic that have nothing to do with digestion - a quick perusal of WebMD lists noise/light sensitivity, nervous system misfires, and baby migraines (poor babies)!
Colic certainly might be an indicator that something is not quite right, but that doesn't mean there's always something you can do about it. As I write this with my exhausted newborn son sleeping in my arms, sometimes you just have to wait for them to grow out of it.
EDIT: I looked into this a little more since I'm pinned by a sleepy baby and as with everything with babies it's complicated. Here's a study that shows that formula-fed babies cry more at 2 weeks than breast-fed babies, but less at 6 weeks: https://pubmed.ncbi.nlm.nih.gov/10193923/
This is probably also muddied by the fact that "colic" is an imprecise term - all babies cry, but at a certain point, if they cry enough, you call it "colic".
It would be safe and effective if we had a mechanism for distributing nuclear waste evenly throughout the entire volume of the Earth's oceans. But we don't have that, and it's impossible to build!
Dumping nuclear waste into the ocean will, mathematically, result in a globally acceptable concentration of nuclear waste in ocean water - but it will be locally problematic for wherever we put it, and that's assuming it stays put.
As a doctor, why bother spending your time learning about exotic foreign cancer treatments that aren't available in your country, and your patients couldn't afford if they were?
It would be nice if doctors did anyway, but I can certainly understand why they wouldn't.
I shared this with my spouse, who worked for an ag startup in Boston (still exists, different leadership, who knows what they're up to now). They had this to say on the matter:
One big competitive advantage of [company] was that we were able to hire smart, young ag scientists, especially LGBTQ+ ones, who wanted to live in a big, queer-friendly city. We had a lot of great people who were really excited that they didn't have to live in Davis or Ithaca, as well as people whose partners' careers tied them to a big city in some way (doctors, lawyers, finance).
(They would also add UMich, University of Minnesota, and the Research Triangle in NC to your list.)
Paying a fee per job done, rather than a wage per hour of labor, is literally the entire point of the gig economy. Uber/DoorDash/etc. claim this is a perk, since you can set your own hours.
- True=7 -
Having lots of very low-level code and hardware experience, I developed a bit of a tendency to "minimize what can go wrong at low levels". C treats 0==FALSE and !0==TRUE, and most people use 0/1 ... but that's only 1 bit of "difference". I sometimes use 7=TRUE, as that's 3 bits with no more chars to type (and of course foolish as such a 1 bit hardware error would "trash" pretty much any system - but I do tend to be a creature of habit :)
I have never heard of this convention before! Was "random bitflips messing with your conditionals" a common problem back in the day?
Even if it was, are you really going to think of every conditional as a tri-state boolean?
Or will you assume that one of the branches is safe to execute during failure?
Or will you count the number of bits in each conditional and assume more ones than zeroes is a true, with no error correction? Will you log that a bit has flipped or otherwise alert the operator? Will you consider that as evidence that an int has changed value too?
Will you store ints with three bits per bit, too?
Error detecting codes have their place, but it takes more than just saying that true is 7.
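For contrast, here's roughly what actually using that redundancy would take - a minimal sketch, assuming TRUE is stored as 7 (three copies of the bit) and FALSE as 0, decoded by majority vote:

    #include <stdio.h>

    #define TRUE  7   /* 0b111: three copies of the bit */
    #define FALSE 0   /* 0b000 */

    /* Majority-vote decode: the flag reads as true if at least two
       of its three bits are set. Nothing is corrected in place; we
       only log that a flip was observed. */
    static int decode(unsigned v) {
        int ones = (v & 1) + ((v >> 1) & 1) + ((v >> 2) & 1);
        if (v != TRUE && v != FALSE)
            fprintf(stderr, "warning: corrupted flag 0x%x\n", v);
        return ones >= 2;
    }

    int main(void) {
        printf("%d\n", decode(7));  /* 1 */
        printf("%d\n", decode(5));  /* 1: one bit flipped, majority still true */
        printf("%d\n", decode(1));  /* 0: majority false */
        return 0;
    }

And even that only covers the flags; it does nothing for the ints, pointers, and code bytes sitting right next to them.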
No, and the author basically admitted it was silly. My counter would be that it makes the intent less clear. I loved reading through his style doc though and I love that he just threw all this stuff out there. Something in his collection is bound to scratch somebody’s itch.
Tangential to 'how can a bit be wrong': when trying to see if a serial data line is working, I write 0xA5 (1010 0101) to send an alternating bitstream with a twist, so I can test the greatest number of things I think can go wrong at once.
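In C terms, a minimal sketch of that kind of loopback check (the uart_* helpers are hypothetical stand-ins for a real driver - here they just simulate a line with one stuck bit so the test has something to find):

    #include <stdio.h>

    /* Hypothetical UART hooks: replace with your actual driver.
       This fake loopback forces bit 1 high to simulate a fault. */
    static unsigned char line_state;
    static void uart_write_byte(unsigned char b) { line_state = b | 0x02; }
    static unsigned char uart_read_byte(void)    { return line_state; }

    /* Send the 0xA5 (1010 0101) pattern and report which bits came
       back wrong: the XOR has 1s exactly where the echo differs. */
    static int test_serial_line(void) {
        const unsigned char pattern = 0xA5;
        uart_write_byte(pattern);
        unsigned char diff = uart_read_byte() ^ pattern;
        if (diff)
            printf("bad bits: 0x%02x\n", diff);  /* prints 0x02 here */
        return diff == 0;
    }

    int main(void) {
        return test_serial_line() ? 0 : 1;
    }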
Yes, AA and 55 are common test patterns for a variety of hardware.
Haven't seen A5 in the wild, but I suppose it could be useful as an initial "let's set up a connection" exchange where endianness is unknown, assuming the next thing exchanged is an endianness negotiation.
I like to have several sequential ones. Easier to see on the oscilloscope. (I spent last night getting a microcontroller to talk to a SPI device, so I'm still licking my wounds.)
How would an HNer who's not familiar with those nether regions of computing, but wants to feel the excitement of sending a bitstream over a (possibly faulty!) serial data line, get started? Two Arduinos and a cable, maybe?
Personally I would recommend finding fun or useful projects where you have an outcome you really desire. Start simple - one sensor, like a bath overflow alarm (an Arduino is good, or maybe a Raspberry Pi).
Learning hardware just for the sake of it makes it tough to stay motivated, and perhaps you would never use the skills you learn? Hardware adds a tougher layer to debugging - but software experience gives you a fantastic start: a logical mind and rational drilling down.
If you can fix your car you have the skills to start on electronics!
A lot of skilled people grew up through the hardware generations, e.g. I began learning basic electronics because on an Apple ][ everything was simpler and we were all at the same stage. My first job was writing low-level serial driver code and regularly dealing with serial devices (e.g. on PCs). Our modern context is just not the same. The internet is hard to learn from, and it is difficult to write good articles to help - the experienced like me just carry a huge store of implicitly learned knowledge.
I suggest you concentrate on a useful or fun outcome - I believe it's good life practice (and good engineering) to stay focused on the outcome and not get too side-tracked by explicitly trying to learn. We implicitly learn just by doing!
>If you can fix your car you have the skills to start on electronics!
I'd like to think that this is a comment on the ease of fixing cars, rather than a comment about how fixing cars is basically embedded hardware/software dev....
I was sending pixel data out from an Arduino (an ESP32 really, but using the Arduino IDE) to a bunch of shift registers that seemed to be 74x595s (couldn't know for sure) to resurrect an LED display for a local art project. Reading the data coming back out of the last register let me know I was at least getting back what I was putting in, which helped me troubleshoot a few wire-length and speed/stability issues.
What really clarified things for me was buying a 3D printer (an Ender 3, in my case) and a Raspberry Pi. Setting them up and flashing a new OS onto the Pi should teach you the rudimentary workings of the hardware->software interface.
I would suggest searching for "Arduino starter kit" or "embedded starter kit" on Amazon. The kits come with lots of components, and usually some project guides.
To the degree that you were worried about such things, this wasn't a real answer. Yes, it saves you if you have a boolean variable... maybe?
    if (var == TRUE)
        ;  // it was 7
    else if (var == FALSE)
        ;  // it was zero
    else
        ;  // ??? what do I do here?
And you need to solve that "what do I do here" for every single conditional on a boolean, and have the extra lines of code to handle it and not crash.
But, you know, what if it was a variable that you used in a switch statement instead? Or just "if (i > 17)"? Bit flips can affect your logic all over the place, not just when it's a boolean variable.
And then, if a bit flip can affect a boolean, it can also affect a pointer. Or the return address on the stack (or the link to the previous stack frame).
Or it can flip a bit in the code.
So this is, at best, a very very partial solution, and it's non-trivial to implement. So this was very much not standard practice or a "convention".
I've been on a team trying to argue that exact thing. If you aren't going to handle the case where the var is neither true nor false, at least by explicitly documenting the fail-safe case, you're just cargo culting. You get a lot of that type of thing in MISRA and automotive codebases.
Any team that realizes that the compiler may choose to optimize out a shit-ton of such code gets an extra gold star.
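For example (a sketch - optimizers vary, so check your own toolchain's output): with a C99 _Bool, the compiler is entitled to assume the value is 0 or 1, so a defensive third branch can be deleted outright:

    #include <stdbool.h>

    int classify(bool b) {
        if (b == true)
            return 1;
        else if (b == false)
            return 0;
        /* As far as the optimizer is concerned this is unreachable:
           a bool holding anything but 0 or 1 is undefined behavior,
           so the "fail-safe" can silently vanish from the binary. */
        return -1;
    }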
Why do you think you need the “else: ??? What do I do here?” case?
Until you added the 2nd test and the 2nd else case, there is no scenario under which both paths of an if/else would fail to execute due to a bit flip of the test variable, because with ‘if (boolean_condition) {} else {}’ there is only 1 conditional test. A bit flip could have caused the wrong branch to execute, but it could not have skipped both branches. A bit flip could change the jump instruction to something else, but in that case your imagined else case still wouldn’t help.
> this is, at best, a very very partial solution
FWIW, the author said this, and fairly succinctly, saying this TRUE=7 thing is “of course foolish as such a 1 bit hardware error would "trash" pretty much any system”. He was only having a bit of fun that cost nothing, and nowhere suggested this is a solution to cosmic rays or other data errors.
The BBC micro and Archimedes used -1 as true in BASIC.
It meant that you didn't need the distinction between "logical operators" (like && in C) and "bitwise operators" (like & in C). You could just use the bitwise operators: e.g. the bitwise NOT operator converts 0 (all bits clear) to -1 (all bits set), so there was no need for a separate logical NOT.
I always felt that was more elegant than C (though of course it required a two's complement machine, which the BBC/Archimedes were - C didn't require that).
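To put it in C terms, with 0/-1 truth values the bitwise operators really do double as the logical ones - a quick sketch (which, as noted, leans on two's complement):

    #include <stdio.h>

    /* BBC BASIC style truth values: all bits clear / all bits set. */
    #define B_FALSE   0
    #define B_TRUE  (-1)

    int main(void) {
        int t = B_TRUE, f = B_FALSE;
        printf("%d\n", ~t);     /* 0:  bitwise NOT acts as logical NOT */
        printf("%d\n", ~f);     /* -1: likewise in the other direction */
        printf("%d\n", t & f);  /* 0:  & acts as logical AND */
        printf("%d\n", t | f);  /* -1: | acts as logical OR */
        /* Contrast C's 0/1 convention: ~1 is -2, which is still
           "truthy", so ~ can't stand in for logical NOT there. */
        printf("%d\n", ~1);
        return 0;
    }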
This is only sound if you have a boolean type that guarantees that the bits are either all zero or all one. Once a mix of bit values is possible, you have to define whether it means true or false, and then you can't use the bitwise operators anymore.
No. Random bitflips (aka hardware that doesn't work) are a relatively new thing.
Bit flips due to buggy software were a thing, though. This is why most database engines checksum the payload data even in memory. I've also seen network packets corrupted because a bridge (the former name for a switch) trashed data in flight, then reconstructed its CRC for the corrupt data on the onward leg.
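A minimal sketch of that kind of in-memory payload check (a toy Adler-32-style checksum for illustration - real engines typically use CRC32 or stronger):

    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Checksum is computed when the page is written, stored alongside
       it, and re-verified before the page is trusted again. */
    static uint32_t checksum(const uint8_t *p, size_t n) {
        uint32_t a = 1, b = 0;
        for (size_t i = 0; i < n; i++) {
            a = (a + p[i]) % 65521;
            b = (b + a) % 65521;
        }
        return (b << 16) | a;
    }

    int main(void) {
        uint8_t page[64] = "hello, payload";
        uint32_t stored = checksum(page, sizeof page);
        page[3] ^= 0x20;  /* simulate in-memory corruption */
        if (checksum(page, sizeof page) != stored)
            puts("corruption detected before serving the page");
        return 0;
    }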
I beg to differ. In the early 90s there were some Amiga memory expansions that would constantly flip bits. I'm pretty sure it contributed to the sentiment that the system wasn't the most stable, though one or two of my friends with PCs saw similar issues on their machines. Maybe Microsoft Word wasn't to blame for all the crashes?
Of course, trying to work around it in software is utterly futile.
Bitflips aren't a new thing. I've been rarely but painfully bitten by them since at least 1986 - and that excludes serial and modem communications, where they were a way of life.
SEE/SEU (single-event effects/single-event upsets) are not a relatively new thing. However, the frequency of events is inversely proportional to the feature size, which has been decreasing over time.
> Random bitflips (aka hardware that doesn't work) are a relatively new thing.
I thought they (from cosmic rays, etc.) were always a thing, but so rare that you needed a very large system (in scope or time or both) to have a substantial chance of encountering one (outside of noisy comm channels, which use error correction protocols for exactly that reason.)
Some event (unknown) triggered multiple spikes in the "tell me three times" triple-redundant ADIRU units of Qantas Flight 72, causing a WTF-level unscheduled, sudden and dramatic pitch down.
Cosmic rays were suspected but unconfirmed (kind of hard to confirm after the fact).
"All the aircaft in the world" for sixty years is kind of a large system given that currently there are on the order of one million people in the air at any moment.
There are lots of things that can go wrong beyond cosmic rays, like timing on the bus or crosstalk from nearby wires. Digital is an abstraction over an analog and chaotic reality.
I’ve been involved with systems where 0xffff… was canonical “true”, but not something as specific as 7! If you’re going to turn on more bits, why not all of them? Though I think this was because the NOT instruction was used for logical inversion, so the bit flip theory doesn’t apply.
For example, the value of Visual Basic's "True" keyword is -1. This seems silly from the perspective of a C programmer, but -1 (0xffffffff) is "more true" than 1 because every bit is true, not just the LSB. :)
Even in VB there is a grain of rationale... I never even considered before WHY it was -1. I always just thought it was VB doing VB, but now I have gained +1 respect for VB.
Luckily it doesn't happen THAT often. I forget the exact metric, but I recall various Google engineers saying that something like one out of a million test-run failures is a random bitflip?
Cosmic rays were a theory for failures in 70s-era hardware that ended up being traced to particles emitted by the ceramic packaging itself. (Modern bitflips have more to do with component sizes several orders of magnitude smaller.) (Edit: not saying that cosmic rays aren't a problem now, just that they only became a problem as chip feature sizes shrank, and they're probably not the only source.)
Also, you can definitely stop cosmic rays; that was part of how they eliminated them as the source.
ECC RAM is for the most part relegated to server-grade components, for what it's worth. So your phone, your laptop, your router? Almost certainly not using any ECC RAM, unless you've gone out of your way to do so.
> I have never heard of this convention before! Was "random bitflips messing with your conditionals" a common problem back in the day?
Due to RAM/CPU failures? I don't think so (though I have seen it, fwiw). With weird serial protocols that don't have proper checksums/digests, running over sketchy wiring? Yeah, and that might be part of "very low-level code and hardware experience".
This doesn't make any sense even if the system isn't trashed.
If 7 == true and anything other than 7 == false, then one bitflip will still change the meaning. If 7 == true and 0 == false, then you could have code that does `if (mybool == 7) { ... }` and later `if (mybool == 0) { ... }` and end up with a situation where code path invariants are broken (i.e. mybool is neither true nor false).
If you use `>= 7` to mean true and `< 7` to mean false, a 0 false value won't become true when one of the low 3 bits flips - but a 7 true value will become false if any of those bits flips, and if any of the higher bits flips, 0 becomes >= 7 and reads as true.
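You can enumerate it - a small sketch showing how single-bit flips move 0 and 7 across a ">= 7 means true" threshold for an 8-bit flag:

    #include <stdio.h>

    static const char *decode(unsigned v) { return v >= 7 ? "true" : "false"; }

    int main(void) {
        unsigned seeds[] = { 0, 7 };  /* canonical false and true */
        for (int s = 0; s < 2; s++)
            for (int bit = 0; bit < 8; bit++) {
                unsigned flipped = seeds[s] ^ (1u << bit);
                printf("%u with bit %d flipped -> %3u decodes as %s\n",
                       seeds[s], bit, flipped, decode(flipped));
            }
        return 0;
    }

Every low-bit flip turns 7 false, and every high-bit flip turns 0 true - so the scheme fails in both directions.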
You can use Transit App in many cities around the world other than New York! It's a great piece of software, I rely on it heavily when navigating unfamiliar transit systems while traveling.
About 10% of eligible voters don't have the ID they would need. The people most likely to lack proper ID for voting are young, Black or Hispanic, and poor.
Lots more details in the report!