Hacker News new | past | comments | ask | show | jobs | submit | shusaku's comments | login

I was sort of surprised to see MCP become a buzzword, because we've been building these kinds of systems with duct tape and chewing gum for ages. Standardization is nice, though. My advice is just ask your LLM nicely, and you should be safe :)

Yes, I think once you've got an LLM in the loop it's easy to get lazy and use it to make every decision. But it's good to step back and consider whether there is a cheaper way; sometimes even some hardcoded logic can do the job.
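To make the "cheaper way" concrete, here is a minimal sketch of a request router that checks a few hardcoded rules before ever touching a model. All of the names, rules, and labels here are invented for illustration; `llm_fallback` stands in for whatever model call you would otherwise make.

```python
import re

def route_request(text, llm_fallback):
    """Try cheap hardcoded rules first; only call the LLM when none match."""
    rules = [
        (re.compile(r"\bunsubscribe\b", re.I), "unsubscribe"),
        (re.compile(r"\brefund\b", re.I), "billing"),
        (re.compile(r"\b(password|login|2fa)\b", re.I), "account"),
    ]
    for pattern, label in rules:
        if pattern.search(text):
            return label  # deterministic, free, and auditable
    return llm_fallback(text)  # pay for the model only on the hard cases

print(route_request("Please refund my last order", lambda t: "other"))  # billing
```

The nice property is that the deterministic path handles the common cases identically every time, and the non-deterministic path only sees inputs the rules could not classify.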

Very true. Asking a non-deterministic system to make deterministic decisions is also harder for it to do.

Right tool for the step to the right extent.

Feels like soft skills for software development.


The more I look at that chart, the stranger it gets. It feels like a good case for concluding that CPI isn't calculated properly.

If you're talking about the spike in Q1 2020, there's nothing weird going on. That's from all the service workers getting laid off, which bumps up the average because they're typically lower paid, and no longer drag down the "employed" average.

Also, does it count the COVID checks?

>The usual weekly earnings data reflect only wage and salary earnings from work, not gross income from all sources. These data do not include the cash value of benefits such as employer-provided health insurance.

https://www.bls.gov/cps/definitions.htm#earnings


Why?

Pi appears when you're trying to mathematically describe an oscillating quantity, which could be a circle or a wave. So I don't think it's the red flag you think it is.

The definition of insanity is doing the same thing twice and expecting different results.

By coincidence, this is the basic way to compile LaTeX.


To be fair, Typst internally also recompiles a number of times until a fixpoint is reached; however, it is designed to limit which parts can depend on previous iterations and to reuse previous results for parts that definitely didn't change.

My makefiles ran it 4 times, I think. I still preferred it to Word.

Anything is preferable to Microsoft Word.

Last time I checked, flipping a coin twice gave different results.

Did you flip the coin in the exact same way? Probably not.

That's the point: you never repeat the same activity in exactly the same way.

A seemingly identical action, from the performer's point of view, is performed in a different environment each time it is repeated. Unless you are Laplace's demon, you can't say for sure that you're repeating the same action over and over, because the environment could change in the meantime in unimaginable ways, and that could influence the outcome.

I just really hate that quote because it is detached from reality.


If you flip it an infinite number of times, you will get the same results anyway. Call when you're finished :P

I'm finished; the result was different 50% of the time :).

That's 18 hours to go to infinity and back... nice roundtrip!

Isn't getting different results the norm in programming anyway? Developers usually don't make the effort to ensure idempotency and make their builds reproducible.

Normally, if you compile the same code twice on the same machine, you'll get the same result, even if it's not truly reproducible across machines or large gaps in time. And differences between machines or across time are usually small enough that they don't impact the observed behavior of the code, especially if you pin your dependencies.

However, with LaTeX, the output of the first run is often an input to the second run, so you get notably different results if you compile it once vs. compiling twice. When I last wrote LaTeX about ten years ago, I usually encountered this with page numbers and tables of contents, since the page numbers couldn't be determined until the layout was complete. So the first pass would get the bulk of the layout and content in place, and then the second pass would do it all again, but this time with real page numbers. You would never expect to see something like this in a modern compiler, at least not in a way that's visible to the user.

(That said, it's been ten years, and I never compiled anything as long or complex as a PhD thesis, so I could be wrong about why you have to compile twice.)
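The mechanism described above is roughly this: the first run writes label and page information to an auxiliary (.aux) file, and the second run reads it back, so cross-references can only be resolved on the second pass. A toy simulation of the idea (this is not real TeX, just an illustration; the function and data names are invented):

```python
def compile_pass(references, aux):
    """One toy 'LaTeX pass': resolve \\ref-style lookups from the aux data
    gathered on the previous pass, and record fresh data for the next one."""
    output, new_aux = [], {}
    for label, page in references:
        new_aux[label] = page                 # what \label writes to the .aux
        resolved = aux.get(label, "??")       # what \ref reads from the .aux
        output.append(f"see page {resolved}")
    return output, new_aux

doc = [("intro", 1), ("results", 7)]
first, aux = compile_pass(doc, {})    # no .aux yet: refs print as "??"
second, _ = compile_pass(doc, aux)    # second pass sees the recorded pages
print(first)    # ['see page ??', 'see page ??']
print(second)   # ['see page 1', 'see page 7']
```

The "??" placeholders are exactly what LaTeX prints for unresolved references after a single pass, which is why a one-pass build looks subtly broken.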


I wrote my PhD thesis (physics) in LaTeX, and I indeed needed to compile twice (at least) to get a correct DVI file.

That was 25 years ago, though, but apparently this part has not changed.

That said, I was at least sure that I would get an excellent result, unlike my friend who used MS Word and one day found his file "locked": he could not add a letter to it and had to retype everything.

Compared to that my concern about where a figure would land in the final document was nothing.


Almost every compiler is a multipass compiler.

But in this case the passes are manual!


Dunno - to me it feels like the LaTeX compiler should just run whatever it needs, however many times, until the output is done and usable, like basically all other compilers?

Imagine that your C or C++ compiler gave incorrect output until you had run it some number of times, and that the number of runs required wasn't obvious to the average user, so people just ran it again and again to be safe. It's absurd, yet we accept it for LaTeX.
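Running the compiler until the output stabilizes is essentially what wrapper tools like latexmk do: rerun LaTeX until the auxiliary output stops changing between runs. A hedged sketch of such a driver follows; `fake_latex` is a stand-in for a real pdflatex invocation, chosen so the example is self-contained.

```python
def compile_until_stable(compile_once, max_runs=5):
    """Rerun `compile_once` (which returns its aux-file contents) until two
    consecutive runs agree, i.e. the aux output reaches a fixpoint."""
    previous = None
    for run in range(1, max_runs + 1):
        aux = compile_once(previous)
        if aux == previous:
            return run  # stable: the last run saw up-to-date cross-references
        previous = aux
    raise RuntimeError("did not converge; check for oscillating labels")

# Toy stand-in: the aux contents depend on what the previous run recorded,
# and settle after the second run, like a typical two-pass document.
def fake_latex(prev_aux):
    return {"toc": "resolved" if prev_aux else "??"}

print(compile_until_stable(fake_latex))  # 3
```

The cap on `max_runs` matters: pathological documents can oscillate between two layouts forever, so real tools also give up after a handful of iterations.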

That's not the definition of insanity, that's the definition of practicing.

People are just fear mongering to suggest Iran would use them or give them to those who would. The real issue here is that once you have them, you basically entrench yourself as a regional power. If the regime started falling out of favor, all their neighbors would be obliged to come to their aid to protect the nukes. Also, you would be far more limited in how you fight your proxy war. These are the things the involved parties are considering, not Armageddon fantasies.

> Israel's settlements are the reason Iran feels the need for such developments though.

Even Iran's leaders would laugh in your face at such a naive statement; you should reconsider your media diet.


I don’t understand the argument for being patient. If you think the new hire will lead to increased profits, there’s an opportunity cost every day you don’t have them on board. And sure, maybe you wait for the best person and they are more productive, but they might be out the door in a few years.


Quality is more important than quantity in engineering, so a mediocre hire can be a net negative. In so many ways:

- consuming time and attention from people who help them

- time spent checking and fixing their work

- additional maintenance costs from poorly thought out solutions

- time spent reproducing and fixing bugs

- lowering morale of better engineers

- creating whatever the opposite of “a culture of excellence” is

- consuming management time in performance management

- inability to interview well, or saying "yes" to even worse hires


Good thing the modern interview process is so efficient at filtering out candidates like this /s


I have to say, including that detail was outstanding writing. It really upped the suspense and sense of horror even though you knew they’d make it out alive.


I agree with this. People use natural language all the time to describe programs: talking to a teammate when pair programming, or clients asking developers for features in plain language. The details are there in the existing code and in a running conversation. This is why I much prefer coding with ChatGPT over a Copilot-type setup.

