So, if the new advice is to avoid "processed foods", what set of characteristics makes processed foods bad? Is there a specific pattern? High in triglycerides? Preservatives? Artificial colors and flavors? Of whichever things are "bad", are they bad individually, or only in combination? What kind of "badness score" would each item or combination get? What kind of individual variability is there?
As someone who is obese and working on it, my suspicion is that it's mostly sugar. If you look at "processed" foods you'll see tons of sugar, which is both very bad and not very filling. One thing that's different from home-cooked sugary treats is the heavy use of spices and additives like salt and MSG that make you feel hungrier and thirstier, which keeps you eating more of the high-sugar, non-filling processed food in a vicious cycle.
The code running on the CPU isn't the only thing the computer is doing in that one second, the x86/x86_64 opcodes we could capture aren't necessarily exactly what the CPU is doing, and the code being run by extra controllers and processors isn't likely to be accessible to anyone but the manufacturer.
In 1989, we'd've also been looking at code running on a single core with a single-task or cooperative-multitasking OS (for most home computers, anyhow), with simpler hardware that an individual could completely understand, and it would run at a speed where analyzing a second of output wouldn't be completely beyond the pale.
I've analyzed CPU logs from DOS-era programs and NES games. I certainly haven't analyzed a full second of the code's execution; I'm usually focused on understanding some particular set of operations.
I think this could only be done from an outside perspective for any given unit of time (meaning e.g. somehow monitoring the actual physical state of every atom of a modern system) and could only be performed by other machines.
I think we've reached a sort of "chicken and egg" point in computing history where we can only understand any given device with the help of other machines, though not necessarily with AI or Machine Learning.
Maybe that sounds obvious, given that VLSI tools have been around for a long time, but I think the point is that there is no such thing as full knowledge in this realm, and we are already totally dependent on our computing devices to understand our computing devices.
It's an interesting thought - how would one implement this? The closest thing that comes to mind so far is the DARPA Cyber Grand Challenge work, some of which is open source (https://github.com/mechaphish) and a good starting point, at the software level anyway. The question there was similar in some ways: describe what's happening in this code path at time t, then take it a step further and generate a patch to fix the bug(s).
>I think this could only be done from an outside perspective
>only be performed by other machines
JTAG allows in-device debugging. Of course you need another machine, by some definition, to view the results, or you'd recurse indefinitely.
> simpler hardware that an individual could completely understand
Our hardware is now so much more complex; what has that gained us? The quick answer is performance, but is that true? What about correctness? It's hard to prove either way, but my guess is we've gained a little bit on performance and lost on correctness.
I could say that 'gained a little bit on performance' is a wild understatement, because the actual gain is several orders of magnitude, but even that would undersell it. It could be taken to suggest that a second of work on a modern PC could be duplicated on a 1989 PC if you were willing to wait all day, but in reality it couldn't; very little of what I use computers for today could be done on the machines I had in 1989 at all, no matter how long I was prepared to wait.
My computer certainly does more than the one 27 years ago did: more operations performed per second, more data storage capacity, more intercommunication options, a wider selection of devices I can interface with it, and so on. Some of these are just changes in magnitude, and the changes in the software are just as important as the changes in the hardware.
There are more parts of the computer (again, both hardware and software) that are undocumented. Taken as a whole, the system is more capable, but closed hardware and software make me wonder about capabilities in my computer that serve someone who isn't me.
Personally, I like to occasionally remember that there's not a single computing device in my life I really own. It can be either a comforting piece of knowledge or a frightening one depending on your perspective. I like to take the former perspective.
I see what you mean, but also no one ever owned a loaf of bread that could be cut, toasted and buttered remotely by a totally unknown 3rd party either...
Good point. You walk into the diner implicitly trusting the line cook and the bread baker and so on. Very similar to the chain of trust implicit in the tech we use. Only I believe the chain of trust in the tech to be much longer, and it has been proven to break repeatedly in recent decades. Also, line cooks have the luxury of tossing out loaves of bread that have moulded over.
My first computer was a 14 MHz Amiga 1200 in the early 90s. In some ways the performance difference isn't too noticeable, e.g. the GUI was often more responsive than those I use today.
In other ways it's clearly different; e.g. waiting minutes for a JPEG to decode, as the scanlines slowly appeared one after another.
I think that expectations have changed though - my first computer was a C64, and loading and starting a game was a wait of several minutes. I don't remember being bothered much by that, but when I tried playing an old game even 15 years ago, I couldn't believe how much time it took.
The same for my Amiga: starting it took a really long time, but I don't think I minded. The Workbench was never slow though. Unfortunately both my A1200 and C64 have died, so I can't test my patience anymore. I remember that on the A500, flood fill in Deluxe Paint was a visible process though :-)
I used the A1200 with an MC68030 expansion while going to university up to about 1995, and it wasn't much slower to work with than the DEC Alphas we had there - except for things requiring raw CPU power. Most of the time waiting was for I/O, and my crappy small hard drive was probably faster than the NFS mounts anyway. The Alphas had 384 MB of memory if I remember correctly though, which was just crazy.
The Amiga wasn't fast enough to play 16-bit MP3 files in stereo even with the 68030 CPU though.
In 1995 I replaced the Amiga with a PC running Linux, and computing was still amazingly fast. Installing Slackware was a two-week project, however, mostly because I had to download everything at the university and carry it home on floppies, and partly because the floppies were reused and flaky, so I had to go back and re-download many disks.
The internet was crazy-slow outside the university until about 1998, when I was lucky enough to live in a block that got fiber for some reason. It was still slow at most workplaces for another decade.
Around 1998 I got a job and a laptop for work. I installed Linux and Window Maker (or was it AfterStep?) and it was totally fine to work on. It might have had a Pentium with 32 or possibly 64 MB of memory. All in all it was really fast, once it had booted, and I mainly used emacs and gcc. I remember that booting Windows on that machine was much slower.
As far as I can remember it was also possible to use a browser without having 1 GB of RAM at that time.
A full compile of our product took 6 hours though. It wasn't always necessary but it had to be done occasionally. A few years later it took 30 minutes to compile.
Today, I get irritated if I have to wait more than 30 seconds before I can test a line of code.
The reason that modern cores are in the billions of transistors is to get performance with correctness: tons of speculative work that can be rolled back if some invariant turns out not to hold.
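As a loose software analogue of that speculate-then-roll-back idea (purely my own toy sketch, not a model of any real microarchitecture; run_speculatively and its arguments are made up for illustration):

    # Toy analogue of speculative work with rollback (illustration only).
    def run_speculatively(state, speculative_step, invariant_holds):
        # Do work ahead of the check; keep it only if the invariant holds.
        checkpoint = dict(state)      # save a copy, like checkpointing registers
        speculative_step(state)       # the speculative work, done "early"
        if invariant_holds(state):
            return state              # invariant held: commit the result
        return checkpoint             # invariant failed: squash, as if nothing ran

    # Example: speculate that the loaded value is valid and compute with it anyway.
    state = {"value": 7}
    state = run_speculatively(
        state,
        lambda s: s.update(result=s["value"] * 2),  # hypothetical "work"
        lambda s: s["value"] >= 0,                  # hypothetical invariant
    )
    print(state)  # {'value': 7, 'result': 14}

The real thing happens across deep pipelines with many instructions in flight, which is part of where those transistors go.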
GPUs are definitely in the performance-at-the-expense-of-correctness bucket. It's common for errors to occur in the LSB, which isn't a big deal for colors but might be a big deal for data.
I assume that SapphireSun refers (primarily) to faults in individual manufactured devices, not a problem where the design fails to comply with the floating-point standard. EDIT: although the faults I'm referring to would be just as likely in the MSB, so maybe not.
SapphireSun, these are mitigated in large part by ECC. It was offered by both major manufacturers, and at least one of them still offers it.
I don't think it's trespassing for the players to be at Kijkduin, just problematic to have people there in those numbers. The players may be making individually-acceptable choices, and it's just the collective behavior that's a problem. It's easy to say that people should have the judgement not to do some individually-harmful thing. It's harder to argue that individuals should have the judgement not to partake in a harmful collective behavior (humans just aren't generally good at considering things like that).
Niantic's algorithms are sending people around in a pattern that is causing damage. Since they've been made aware of that, and it's difficult to blame the individual players, it's reasonable to ask the company itself to make a change. Niantic's the only one in a position to solve the problem quickly and cheaply.
I see people complain about the low power of the Pi3...then I think about how I went through college with a slower computer than that, and how the people over the previous 30 years had even slower machines to use (if they had their own personal machine at all).
What you can do with a couple MHz and a few hundred KB of memory is great, given a little ingenuity. Of course, you aren't going to do real-time face recognition on something like that, but 99% of what we need to do is just fine with 0.1% of the performance.
That's the point. They're saying that those 85% of self-made American millionaires aren't as "self-made" as they claim.
Like me: I could talk about starting with almost nothing in my bank accounts, getting a job, working through the ranks, buying a home, and working toward my first million... conveniently leaving out all the advantages that I've had to get me to this point (cultural expectation that I'd go to college instilled in me from a young age, a family well-off enough to support that financially, uninterrupted time to work on the hobbies that grew into a career, etc).
Quanticle seems to be comparing the origins of Donald Trump and Bill Gates. Donald describes his fortune as self-made. Bill Gates has been described as self-made. It's true if you only consider that their wealth is much greater than any gift they've ever received. It's false from the perspective that anyone could do it with enough hard work.
Honestly no one is self-made. People who don't have financial cushions still rely on others to back their idea and hard work.
The problem is that everyone on HN is talking about how to make millions, which is a backwards mentality, because then you'll never be satisfied. People need to strive for modest salaries and things that benefit society and the planet, not filling their own pockets.
> People need to strive for modest salaries and things that benefit society and the planet, not filling their own pockets.
It's a lot easier to do projects that benefit society and the planet if you have millions in your pocket that you can spend or invest in them (say, for paying employees, or simply for having financial runway).
There is some truth in what the parent is saying though. We have reached a point in the West where a comfortable life is basically possible for everyone. However, we still have the bullish culture from after the war that paints a guy who wants a simple life as some sort of loser.
So we go on holiday and marvel at the simple life of artisans, farmers and others, but we teach our children that such a lifestyle is a failure.
Everybody who thinks their salary is too large is free to donate a large(r) part of it to things they care about. So if you think you should earn a more modest salary, do so. The problem is that most people who talk of "more modest salaries" mean "others should have more modest salaries". Better to call this envy.
> So we go on holiday and marvel at the simple life of artisans, farmers and others.
I surely don't.
> We have reached a point in the West where a comfortable life is basically possible for everyone. However, we still have the bullish culture from after the war that paints a guy who wants a simple life as some sort of loser.
And what made it possible for basically everyone to lead a comfortable life? Surely not the people who want a simple life.
I can accept that technology has become too complicated for most people to grasp. I am among the first to accept this as a problem. But if this is your opinion, start developing technology that is easier to grasp instead of complaining about how complicated everything has become. The same holds for laws, too.
> And what made it possible for basically everyone to lead a comfortable life? Surely not the people who want a simple life.
But isn't pushing everyone to push the boundaries, even if they are not driven or motivated, counterproductive at some point?
Well, I may be biased by living in London and working in the financial industry, where personal enrichment seems to be the only accepted way of life. But all the scheming that is done to be on top is definitely keeping competent people out of the loop. Keeping your job is a more valuable skill than excelling at your job, and you don't need much corporate experience to know that there is a very thin overlap between the two. I see people who are great with children, amazing artists, ... who are just grinding at their jobs to keep living a life where they can afford the socially acceptable brand of car, clothes and activities, but dream of the day when they can spend more than a few hours a month doing what they are really good at.
More generally, what about all the support jobs? Tech is very far from replacing teachers or nurses. Those jobs are loser jobs in a world that only values a "potentially world-changing career". You are more likely to become a millionaire as random Goldman Sachs cannon fodder than you are as a nurse. What kind of criminal parent would encourage their kid to teach instead of getting a degree that lets them into finance?
These are two insightful and underrated points. I'm going to store them away somewhere, if you don't mind. Have an upvote!
> Keeping your job is a more valuable skill than excelling at your job, and you don't need much corporate experience to know that there is a very thin overlap between the two.
This is true in most companies, not just finance. The fact that these two activities are orthogonal and require entirely different skill sets is one of the more frustrating aspects of corporate life, giving rise to very talented people who end up grinding away as drone #19221 because they aren't great at "managing upward", and smooth talkers who BS their way into higher and higher roles.
> I see people who are great with children, amazing artists, ... who are just grinding at their jobs to keep living a life where they can afford the socially acceptable brand of car, clothes and activities, but dream of the day when they can spend more than a few hours a month doing what they are really good at.
Again, welcome to work everywhere. For most people, a job is the thing you do for money so you can live, and save up to finally do what you are good at or love when you're old and retired.
So true. In today's US economy, simply having a salary at all is something one must strive for. Continuity and regularity of compensation have become difficult to obtain.
We wouldn't have nice things like computers, coffee, indoor plumbing, jeans, books, teeth, etc. if people were satisfied with modest salaries and not filling their pockets.
Yes, we would. Long before capitalism was invented, the great creators of the world made great strides through inventiveness. While necessity is certainly the mother of invention, there's no limit to human curiosity.
Examples: Arabic numerals, Da Vinci, Mozart, Galileo, Maxwell, Einstein, Turing, etc. That's where a lot of those modern things come from.
Then along come the capitalists to exploit the true discoveries for personal gain. Mass-produce this and that, destroy the environment, etc. Build atomic bombs.
But the IME is distinct from "Intel TXT & UEFI Secure Boot" (the referent of "this").
> Intel didn't force [Intel TXT & UEFI Secure Boot] upon the consumers
is true. Both of those things can be disabled on most hardware. In fact, I disabled them on the laptop I just bought, because I wanted to install OSes that aren't Secure Boot signed.
It sounds like you operate how I usually do; the communication cost outweighs the benefits of collaboration, and it ends up being a tiring and frustrating experience.
I've seen pair teams working correctly, though. Each of them bounces ideas off the other one, and they progress faster than either would've alone. It's like racing two algorithms against each other, each searching a different part of the problem space, and using the first result that's returned. It lets them move on to the next problem quicker.
I don't usually do that very well, but I can't deny that if you've got the right pair of developers, it's a very powerful technique.
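As a rough sketch of that racing idea (my own toy illustration; the search_up/search_down/race names and the square-root example are made up, not anything specific to pairing):

    # Race two approaches to the same question and take the first answer back.
    from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

    def search_up(target, limit=100_000):
        # Strategy A: look for the integer square root scanning from 0 upward.
        return next(n for n in range(limit + 1) if n * n == target)

    def search_down(target, limit=100_000):
        # Strategy B: same question, scanning from the top downward.
        return next(n for n in range(limit, -1, -1) if n * n == target)

    def race(target):
        with ThreadPoolExecutor(max_workers=2) as pool:
            futures = [pool.submit(search_up, target),
                       pool.submit(search_down, target)]
            done, _ = wait(futures, return_when=FIRST_COMPLETED)
            return done.pop().result()  # whichever strategy finished first wins

    print(race(9_999_800_001))  # 99999, from whichever search reached it first

It's obviously not how two people think, but it captures the payoff: the pair moves on as soon as either line of attack pans out.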
All of my joystick ports were on sound cards. As for the IDE controller, a lot of computers in the late 80s and early 90s only had one IDE channel, and it was pretty common to buy a sound card and CD-ROM at the same time (I remember the first ones my family bought were bundled in a "multimedia upgrade kit"). If you already had two hard drives on that channel, you were short on options for attaching the CD-ROM, except that the sound card provided a solution.