
If it were so easy, why didn't you post a few examples rather than insult me?


US Forest Service: 'hi chatgpt, here are three excel files showing the last three years of tree plantings we've done by plot and by species. Here's a fourth file in PDF format of our plot map. Please match the data and give me a list of areas that are underplanted relative to the rest, so we can plan better for this year'

I use it for stuff like this all the time in a non-government job. 100% doable without AI, but it takes an order of magnitude more time. No hyperbole. People here talking about security risks are smart to think things through, but they overestimate the sensitivity of most government work. I don't want the CIA using chatgpt to analyze and format lists of all our spies in China, but for the other 2.19m federal workers it's probably less of a huge deal.
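For what it's worth, the non-AI version of that analysis is only a few lines of code once the spreadsheets are parsed. A minimal sketch with made-up data (the plot IDs, species, counts, and the 50% threshold are all hypothetical, just to show the shape of the task):

```python
from statistics import median

# Hypothetical rows, standing in for three years of spreadsheet
# data already merged by plot and species.
plantings = [
    {"plot": "A1", "species": "fir", "count": 120},
    {"plot": "A1", "species": "pine", "count": 80},
    {"plot": "B2", "species": "fir", "count": 30},
    {"plot": "C3", "species": "pine", "count": 150},
    {"plot": "C3", "species": "fir", "count": 60},
]

def underplanted(rows, threshold=0.5):
    """Return plots whose total plantings fall below `threshold`
    times the median plot total."""
    totals = {}
    for r in rows:
        totals[r["plot"]] = totals.get(r["plot"], 0) + r["count"]
    cutoff = threshold * median(totals.values())
    return sorted(p for p, t in totals.items() if t < cutoff)

print(underplanted(plantings))  # B2 is well below the median plot total
```

The hard part, of course, is the parsing and matching across messy real-world files, which is exactly the drudgery the comment above is describing.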


And do you think ChatGPT is always doing this accurately? There's no end-to-end verification, so what you get could be either bullshit hallucination or correct. This isn't the right use of the tool right now. Maybe in the future, with a different architecture.


Accurately compared to what?

In my experience, using it this way is no less accurate than a human trudging through it, and I have no end-to-end verification that the human didn't make a mistake they never noticed, either. So that's as good as it needs to be.



