
With a sample size of 1, Gemini 2.5 Pro (Experimental) did a great job of this (and was considerably faster than O3)


It would be cool to see examples of what it can accomplish



Why don't any of the trains go into New Jersey? That seems like a big missed opportunity to add more areas from which people could easily commute into the city.


Trains do go into New Jersey from NYC. Just not MTA trains.


Because it's the New York City subway.

There are trains that go to New Jersey - the PATH trains, as well as NJ Transit commuter trains that leave from Penn Station.


I like debuggers and use them when I can, but folks who say you should only use debuggers tend to not realize:

* Not all languages have good debuggers.

* It's not always possible to connect a debugger in the environment where the code runs.

* Builds don't always include debug symbols, and this can be very high-friction to change.

* Compilers sometimes optimize out the variable I'm interested in, making it impossible to see in a debugger. (Haskell is particularly bad about this)

* As another commenter mentioned, the delay introduced by a debugger can change the behavior in a way that prevents the bug. (E.g. a connection times out)

* In interpreted languages, debuggers can make the code painfully slow to run (think multiple minutes before the first breakpoint is hit).

One technique that is easier with printf debugging is comparing two implementations. If you have (or can create) a known-good implementation alongside the buggy one, you can change the code to run both and print whenever their results differ (possibly with some logic to decide whether results are equivalent, e.g. when the resulting lists are the same up to ordering).
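
A minimal sketch of that technique in Python (names and the normalization step are illustrative; use whatever makes results comparable, e.g. sorting):

    def check_against_reference(inputs, buggy_impl, reference_impl, normalize=sorted):
        """Run both implementations on each input and report any disagreement."""
        for x in inputs:
            got = normalize(buggy_impl(x))
            want = normalize(reference_impl(x))
            if got != want:
                print(f"MISMATCH for {x!r}: got {got!r}, expected {want!r}")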


It's frustrating how disposable these are designed to be

edit: e.g. screen replacements cost nearly as much as the entire computer


I commented elsewhere, but my uncle is on his third iMac in 30 years. He keeps them a decade at a time. My father is still using an Intel iMac. Normal people do not upgrade their computers after purchase. Displays are generally not something that fail. These machines are capable of providing a decade or more of service to normal people.


First iMac was released in 1998.


I rounded too aggressively. His first iMac was the G4 on a stalk (2002). The second was one of the aluminum pre-Retina Intel models, perhaps 2012. He just purchased his third earlier this year. So, three iMacs in 22 years; I expect him to keep this one for at least a decade too, but even at a minimum of 5 years, that gets him to three iMacs in 27 years.


Whether you say 26 years or 30 years is really not the main point here; that's just splitting hairs.


I bet iMacs are some of the longest-average-lifetime computers out there.

But I’m sad that the 27” models are obsolete computers and still-wonderful screens, and Apple removed the use-as-screen mode.


That feature was only possible with specific Intel chips, and it went away because Intel stopped supporting it. Sad the feature didn't come back to life in the Apple Silicon iMac.


Their display stack on Apple Silicon is still maturing. It took way too long for them to support more than a single external display. I bet you it's due for a comeback in the next decade.


My father-in-law just replaced his daily iMac because _Chrome_ finally stopped providing security updates for part of his hardware architecture.


Especially egregious when you consider older iMacs could be used as external displays - https://support.apple.com/en-gb/105126


While it is a shame it was never brought back, at the time it was removed it was unavoidable: the bandwidth required for 5K was more than a single DisplayPort cable could carry.
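
Back-of-envelope, for anyone curious (rough numbers, ignoring blanking overhead): 5120 x 2880 pixels x 60 Hz x 24 bits per pixel is about 21.2 Gbit/s, while DisplayPort 1.2 tops out around 17.28 Gbit/s of usable payload (4 lanes x 5.4 Gbit/s, minus 8b/10b encoding overhead). DP 1.3's HBR3 raised that to roughly 25.92 Gbit/s, which is what made single-cable 5K at 60 Hz feasible.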


DisplayPort 1.3, which supports 5K at 60 fps, became widely available with the NVIDIA 900 series just 5 months after the 5K iMac was released. AMD followed suit a year later.

They could have added support for it very soon after, maybe even launched with DP 1.3 support if they had worked something out with AMD.


I'd love to see a regulator mandate that computers like the iMac that have built-in screens must include HDMI ports allowing them to be used as monitors.

This would be great for consumers and prevent a lot of e-waste, since people could keep using obsolete computers as monitors well past their useful lifespan as computers.


HDMI might be a bit more complex, but DisplayPort should be doable, since most devices use embedded DisplayPort (eDP) for their built-in displays anyway. I'm guessing the main cost would be adding a chip to switch between the external and internal source.


HDMI is really not a very good choice as they try to block open source implementations: https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...

It should be kept out of regulations.


This is why we need a standard that has both HDMI and DP connector options, but DP signalling.


Even laptops?


My last iMac lasted 10 years. I replaced it with the M3 iMac for my daughter. I will be happy if it takes her through high school graduation in 2030. If the M3 iMac is still running, I expect to use it for some intro-to-computer stuff for one of the younger kids.

Yes I cannot mine the iMac for parts at EOL, but realistically, I haven't really done that on any tower-based PC either.


Are you expected to replace your screen often? I don't think I upgrade and replace either one much faster than the other. Usually get a new monitor and a new PC every 4 or so years.


> Usually get a new monitor and a new PC every 4 or so years.

Maybe you're not quite the average consumer that OP has in mind? Maybe you are, I don't know. Either way it's unsustainable and ridiculous that the _average consumer_ would need to replace something after 4 years when it COULD be built to last.


My first LCD monitor is still actively used in our house, about 18 years old now. My mother has gone through several computers but kept the same screen for 15 years. Apple consumers are not "average consumers". Starting at $1300, it's a luxury desktop.


That’s a mid-range desktop at most in a world where people pay more than that for individual components at the high-end, especially when you look at pricing for equivalent quality displays.

The correct criticism of the iMac is that it links two parts with different lifespans. There should be a legal requirement that all-in-one computers have an external video input so that if some other component fails or simply becomes obsolete, you can use the perfectly functional display with another system.


I agree that the iMac needs to be usable as a monitor. Both Dell and HP all-in-ones that I looked at do this (I did not do an exhaustive search, so it may not be as common as my 'look at two' makes it sound, but it's not UN-common)

However, let's be really clear: the iMac is not a mid-range desktop, price-wise. The highest non-Apple price in the top 10 of Amazon's all-in-one category is $599, and there are three non-Apple all-in-ones over $1k in the top 50. [1]

Obviously, once we separate the pieces out, things become even more clear-cut. You can buy the beefiest "mini PC" from Amazon, pair it with a 28" or 32" flat or curved 4K monitor for $200-400, and still have money left over.

The iMac is NOT high-end, but it is luxury, and that's an important distinction.

1: https://www.amazon.com/Best-Sellers-Electronics-All-in-One-C...


My point was just that while it’s not low-end, it’s also not luxury unless you’re defining that term to mean something like “has clean lines without stickers” or “has a better display than a TV from a decade ago”.

Most of the cost of an iMac is the display, and as your example shows, you don’t see significant savings unless you accept massive compromises on quality. Pointing at 1080p FHD machines is like saying you have a luxury car because your baseline is a golf cart: most of those panels have terrible color quality according to their spec sheets, even if you ignore the low resolution. By the time you’re looking at models that are only one generation behind on CPU, you’re at a $900 system with a display that’s worse than what Apple shipped almost 20 years ago.


That wasn't their point. The point is that the average consumer doesn't really upgrade their desktop separately from their screen, if the two are separate. You do not need to replace an iMac after 4 years, they are in fact built to last.


They are built to last. I'm typing this comment on a 2015 MacBook Pro.


Most people I know who don't use a laptop exclusively don't replace their monitors that often. My work docking station is still rocking 2017 4K monitors, and my wife's home setup is similar.


I made the mistake of getting a 27" iMac in 2014. The 5k display is still great by today's standards but the internals are obsolete.


Well... no. But if it breaks or is damaged, you basically have to throw away the (otherwise) fully functional PC.


I bought an iMac in 2011 that I had for 12 years before it died. I replaced the HD with an SSD after a few years but otherwise it just kept on going.


When was the last time a display failed on you without getting physically damaged?

The last display I had break down was a CRT piece of shit I got off my school's auction a quarter century ago.


Yeah, I have an older model that had the well-documented faulty/fragile screen connector for the LED backlights. A very expensive replacement screen was the recommended fix! All for the sake of a tiny six-pin connector.

One of these days I'll take it apart and see if my hands are still steady enough to solder on a new connector.

Anyway, it was enough to swear me off all-in-one devices for good. I thought by now we'd be fully modular with desktop computer hardware.


Ignoring the butterfly effect issue, I would have said just pick the title of a given newspaper on a given date as the password.


Depending on how time travel works that might have a huge hole.

1. You pick a hash of the headline from the January 14th, 2030 edition of the Springfield Shopper as your key and encrypt some data.

2. You use your time machine to send the encrypted data to someone in 2024 with instructions telling them how to recover the key, expecting that they will have to wait until January 14th 2030.

3. Some rich person decides they do not want to wait. They notice that January 14th is National Hot Pastrami Sandwich Day [1].

4. They buy the Springfield Shopper and institute a policy that every year on January 14th the headline will be "Happy National Hot Pastrami Sandwich Day!".

5. If we are in the kind of universe where time travel to the past can affect the time the traveler came from, their actions in #4 make it so "Happy National Hot Pastrami Sandwich Day!" is the headline you used.

6. In 2024 they try to decrypt the data using a hash of "Happy National Hot Pastrami Sandwich Day!" and it works.

[1] https://www.daysoftheyear.com/days/hot-pastrami-sandwich-day...


That's probably crackable with enough effort though. A huge effort, but it might be practical for a team of governments if they knew the person was a time traveler.


You could add more entropy by including not just the headline but an assortment of text from other pages, e.g. market price of whatever stock is listed on page X, the 10th sentence on page Y, etc.


If you can ignore the butterfly effect, I'd use the last digit of the closing price of 100 different stocks.

Stock prices are hard to predict in the first place[1]. And the last digit will be the hardest digit to predict.

---

[1] You could get very rich if you knew how. Lots of people want to be rich, yet basically nobody is doing it successfully.
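
A minimal sketch of that stock-digit idea in Python (illustrative only; it assumes you already have the agreed-upon list of 100 closing prices as strings):

    import hashlib

    def key_from_closing_prices(prices):
        """Derive a 256-bit key from the last digit of each closing price."""
        # e.g. prices = ["187.44", "52.10", ...] for the 100 agreed-upon tickers
        last_digits = "".join(p.strip()[-1] for p in prices)
        return hashlib.sha256(last_digits.encode()).digest()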


Even if you could: your actions based on that knowledge would alter the price again, such that any method you devise to predict stock prices will a) have to factor that into its calculations and b) by definition will only work for a single user.


or the concatenation of the lottery numbers for day A, B, C, D, E, F, G, H

A bit disappointed by the article: the top upvoted answers there are literally a repetition of what ChatGPT says to do :|


Altering the course of human history by gifting baubles from the future. Lottery numbers will not be stable.


The same applies to tech debt: https://jeremymikkola.com/posts/2022_01_29_tech_debt_gets_wo...

(Yes, there's a typo in the URL. It bugs me, too.)

prior discussion: https://news.ycombinator.com/item?id=30128627


It's somewhat amusing to look back on my Computer Science education, where they taught us that database transactions are used to ensure bank balances are moved atomically between accounts
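
The textbook pattern looks something like this (a minimal sqlite3 sketch; the table, account names, and amounts are made up for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 500), ("bob", 0)])

    with conn:  # one transaction: commits on success, rolls back if anything raises
        conn.execute("UPDATE accounts SET balance = balance - 100 WHERE id = ?", ("alice",))
        conn.execute("UPDATE accounts SET balance = balance + 100 WHERE id = ?", ("bob",))
    # Either both updates happen or neither does -- but only within this one database.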


I’m sure this is true at some level. Just not _all_ levels.

At some point you just can’t connect to the db on the other side because it belongs to a different institution.


That's how you can identify a 24-ohm snake https://xkcd.com/1604/


It seems that surely there are lessons a hacker could learn from NATO's success (to the degree that you agree that it has been a success), but I am not a sufficient student of history to know what those lessons might be.


One lesson: some things aren't technical problems that can be solved by software. They need to be resolved by people, sometimes collectively (in the form of governments), using imprecise means.


Hacker News is for things that the ycombinator/-adjacent demographic might find interesting. Despite the name, it's perfectly fine to post things here that are not strictly hacking related.


Have a solid value proposition and communicate it clearly to potential customers. Do this consistently over time, and customers will find you quickly when they need you.


The lesson is that if you befriend a bully, you might save billions.

