I like debuggers and use them when I can, but folks who say you should only use debuggers tend not to realize:
* Not all languages have good debuggers.
* It's not always possible to connect a debugger in the environment where the code runs.
* Builds don't always include debug symbols, and this can be very high-friction to change.
* Compilers sometimes optimize out the variable I'm interested in, making it impossible to see in a debugger. (Haskell is particularly bad about this)
* As another commenter mentioned, the delay introduced by a debugger can change the behavior in a way that prevents the bug. (E.g. a connection times out)
* In interpreted languages, debuggers can make the code painfully slow to run (think multiple minutes before the first breakpoint is hit).
One technique that is easier with printf debugging is comparing two implementations. If you have (or create) a known-good implementation alongside the buggy one, you can change the code to run both and print whenever the results differ (possibly with some logic to decide whether the results are equivalent, e.g. if the resulting lists are the same up to ordering).
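For example, a minimal sketch in Python; the sort_items_* functions here are hypothetical stand-ins for whatever known-good and suspect implementations you actually have:

```python
def sort_items_slow(items):
    # stand-in for a hypothetical known-good (but slow) implementation
    return sorted(items)

def sort_items_fast(items):
    # stand-in for the hypothetical faster implementation under suspicion
    out = list(items)
    out.sort()
    return out

def normalize(result):
    # equivalence rule for this example: results are equal up to ordering
    return sorted(result)

def sort_items_checked(items):
    good = sort_items_slow(items)
    fast = sort_items_fast(items)
    if normalize(good) != normalize(fast):
        # printf-style breadcrumb: dump the input that triggered the divergence
        print(f"MISMATCH for {items!r}: good={good!r} fast={fast!r}")
    return good  # keep returning the trusted result while debugging
```

Swap sort_items_checked in wherever the suspect function is called, run your workload, and the mismatching inputs print themselves out.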
I commented elsewhere, but my uncle is on his third iMac in 30 years. He keeps them a decade at a time. My father is still using an Intel iMac. Normal people do not upgrade their computers after purchase, and displays are generally not something that fails. These machines are capable of providing a decade or more of service to normal people.
I rounded too aggressively. His first iMac was the G4 on a stalk (2002). The second was one of the aluminum pre-Retina Intel models, perhaps 2012. He just purchased his third earlier this year. So, three iMacs in 22 years, but I expect him to keep this one for at least a decade too; even if it only lasts 5 years, that gets him to three iMacs in 27 years at minimum.
That feature was only possible with specific Intel chips, and it went away because Intel stopped supporting it. Sad the feature didn't come back to life in the Apple Silicon iMac.
Their display stack on Apple Silicon is still maturing. It took way too long for them to support more than a single external display. I bet you it's due for a comeback in the next decade.
While it is a shame it was never brought back, at the time it was removed it was unavoidable, since the bandwidth required for 5K was beyond what could be carried over a single DisplayPort cable.
DisplayPort 1.3, which supports 5K at 60 Hz, became widely available with the NVIDIA 900 series just 5 months after the 5K iMac was released. AMD followed suit a year later.
They could have added support for it very soon after, and maybe even launched with DP 1.3 support if they had worked something out with AMD.
I'd love to see a regulator mandate that computers like the iMac with built-in screens must have HDMI ports that allow them to be used as monitors.
This would be great for consumers and would prevent a lot of e-waste, since people could keep using obsolete computers as monitors well past their useful lifespan as computers.
HDMI might be a bit more complex, but DisplayPort should be doable, since most devices already use embedded DisplayPort (eDP) for their built-in displays. I'm guessing the main cost would be adding a chip for switching between the external and internal source.
My last iMac lasted 10 years. I replaced it with the M3 iMac for my daughter. I will be happy if it takes her through high school graduation in 2030. If the M3 iMac is still running after that, I expect to use it for some intro-to-computers stuff for one of the younger kids.
Yes, I cannot mine the iMac for parts at EOL, but realistically I haven't really done that with any tower-based PC either.
Do you expect to replace your screen often? I don't think I upgrade or replace either one much faster than the other. Usually get a new monitor and a new PC every 4 or so years.
> Usually get a new monitor and a new PC every 4 or so years.
Maybe you're not quite the average consumer that OP has in mind? Maybe you are, I don't know. Either way it's unsustainable and ridiculous that the _average consumer_ would need to replace something after 4 years when it COULD be built to last.
My first LCD monitor is still actively used in our house, about 18 years old now. My mother has gone through several computers but kept the same screen for 15 years. Apple consumers are not "average consumers". Starting at $1,300, it's a luxury desktop.
That's a mid-range desktop at most, in a world where people pay more than that for individual components at the high end, especially when you look at pricing for equivalent-quality displays.
The correct criticism of the iMac is that it ties together two parts with different lifespans. There should be a legal requirement that all-in-one computers have an external connector so that if some other component fails or simply becomes obsolete, you can use the perfectly functional display with another system.
I agree that the iMac needs to be usable as a monitor. Both the Dell and HP all-in-ones that I looked at do this (I did not do an exhaustive search, so it may not be as common as my 'look at two' makes it sound, but it's not UN-common).
However, let's be really clear: the iMac is not a mid-range desktop, price-wise. The highest non-Apple price in the top 10 of Amazon's all-in-one category is $599, and there are three non-Apple all-in-ones over $1k in the top 50. [1]
Obviously, once we separate the pieces out, things become even more clear-cut. You can buy the beefiest "mini PC" on Amazon and pair it with a 28" or 32", flat or curved, 4K monitor for $200-400 and still have money left over.
The iMac is NOT high-end, but it is luxury, and that's an important distinction.
My point was just that while it's not low-end, it's also not luxury unless you're defining that term to mean something like “has clean lines without stickers” or “has a better display than a TV from a decade ago”.
Most of the cost of an iMac is the display, and as your example shows, you don't see significant savings unless you accept massive compromises on quality. Comparing against 1080p FHD panels is like saying you have a luxury car because your baseline is a golf cart: most of those displays have terrible color quality according to their spec sheets, even if you ignore the low resolution. By the time you get to models that are only one generation behind on CPU, you're looking at a $900 system with a display that is worse than what Apple shipped almost 20 years ago.
That wasn't their point. The point is that the average consumer doesn't really upgrade their desktop separately from their screen, even when the two are separate. You do not need to replace an iMac after 4 years; they are in fact built to last.
Most people I know who don't use a laptop exclusively don't replace their monitors that often. My work docking station is still rocking 2017 4K monitors, and my wife's home setup is similar.
Yeah, I have an older model that had the well-documented faulty/fragile screen connector for the LED backlights. A very expensive replacement screen was the recommended fix! All for the sake of a tiny six-pin connector.
One of these days I'll strip it down and see if my hands are still steady enough to solder on a new connector.
Anyway, it was enough to swear me off all-in-one devices for good. I thought desktop computer hardware would be fully modular by now.
Depending on how time travel works, that might have a huge hole.
1. You pick a hash of the headline from the January 14th, 2030 edition of the Springfield Shopper as your key and encrypt some data.
2. You use your time machine to send the encrypted data to someone in 2024 with instructions telling them how to recover the key, expecting that they will have to wait until January 14th 2030.
3. Some rich person decides they do not want to wait. They notice that January 14th is National Hot Pastrami Sandwich Day [1].
4. They buy the Springfield Shopper and institute a policy that every year on January 14th the headline will be "Happy National Hot Pastrami Sandwich Day!".
5. If we are in the kind of universe where time travel to the past can affect the time the traveler came from, their actions in #4 make it so "Happy National Hot Pastrami Sandwich Day!" is the headline you used.
6. In 2024 they try to decrypt the data using a hash of "Happy National Hot Pastrami Sandwich Day!" and it works.
That's probably crackable with enough effort though. A huge effort, but it might be practical for a team of governments if they knew the person was a time traveler.
You could add more entropy by including not just the headline but an assortment of text from other pages, e.g. the market price of whatever stock is listed on page X, the 10th sentence on page Y, etc.
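A minimal sketch of that idea in Python; the strings are placeholders for the real future facts, and AES-GCM from the `cryptography` package is just one reasonable cipher choice, not something the scheme requires:

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Placeholder stand-ins for text you can reproduce exactly once the date arrives.
future_facts = [
    "headline of the Springfield Shopper on 2030-01-14",
    "closing price of the stock listed on page X",
    "10th sentence on page Y",
]

# Derive a 256-bit key by hashing all the facts together; an attacker now has
# to control or guess every source, not just the headline.
key = hashlib.sha256("\n".join(future_facts).encode()).digest()

nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"message for the past", None)
# Send (nonce, ciphertext) back in time; the recipient reconstructs
# future_facts verbatim on the target date and derives the same key.
```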
Even if you could, your actions based on that knowledge would alter the price again, such that any method you devise to predict stock prices will (a) have to factor that into its calculations and (b) by definition only work for a single user.
It's somewhat amusing to look back on my Computer Science education, where they taught us that database transactions are used to ensure bank balances are moved atomically between accounts.
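The textbook version looks roughly like this sketch (sqlite3 and the accounts table here are stand-ins purely for illustration, not how any real bank does it):

```python
import sqlite3

def transfer(conn: sqlite3.Connection, from_acct: int, to_acct: int, amount: int) -> None:
    # `with conn` opens a transaction: it commits if the block succeeds and
    # rolls back if anything raises, so both updates apply or neither does.
    with conn:
        cur = conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ? AND balance >= ?",
            (amount, from_acct, amount),
        )
        if cur.rowcount != 1:
            raise ValueError("unknown account or insufficient funds")
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE id = ?",
            (amount, to_acct),
        )
```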
It seems that surely there are lessons a hacker could learn from NATO's success (to the degree that you agree that it has been a success), but I am not a sufficient student of history to know what those lessons might be.
One lesson: some things aren't technical problems that can be solved by software. They need to be resolved by people, sometimes collectively (in the form of governments), using imprecise means.
Hacker News is for things that the Y Combinator (and YC-adjacent) demographic might find interesting. Despite the name, it's perfectly fine to post things here that are not strictly hacking related.
Have a solid value proposition and communicate it clearly to potential customers. Do this consistently over time, and customers will find you quickly when they need you.