Hacker News | jujube3's comments

Most dialects of BASIC didn't have booleans.

Everything old is new again.


And most assembly languages don't have general-purpose booleans. And C89 doesn't have booleans. And FORTHs don't have booleans in the base interpreter, those are words defined like any other.

Most BASIC dialects map booleans to integers, with zero as false and non-zero as true; -1 is the canonical true result of boolean operators. So I think they effectively have booleans.
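The -1 convention falls out of two's complement: flipping every bit of an all-zero word gives a word of all ones, which reads back as -1 as a signed integer, so the same AND/OR operators serve both bitwise and logical use. A quick sketch of that arithmetic in Python (not BASIC itself, whose operators work the same way on these values):

```python
# In many classic BASICs, comparisons return 0 (false) or -1 (true).
# -1 is "all bits set" in two's complement, so NOT of false is true:
FALSE = 0
TRUE = ~FALSE          # ~0 == -1 in two's complement
print(TRUE)            # -1

# With that convention, bitwise AND/OR double as logical AND/OR:
print(TRUE & TRUE)     # -1 (true)
print(TRUE & FALSE)    # 0  (false)
print(TRUE | FALSE)    # -1 (true)
```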

It's important to understand that only a small percentage of academics will ever get tenure. The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry. That job will pay less than the equivalent job for someone who never took the PhD track.

Of course, the percentage of tenured winners varies a lot by fields. It's very low in the humanities, somewhat better in CS and math, etc.

Once you get tenure, if you ever do, you will indeed have a lot of freedom, but you will also have a lot of work to do. Sure you can pass grading and other jobs off to grad students and postdocs (which you were for the last decade...) but in many fields, the need to fundraise never ends. It's sort of like funding a new startup every year with a different set of grad students.

Most people don't want to sit alone in a closet and think deep thoughts (well, ok, mathematicians do...). But if you want to do something in the real world, you'll need funding, and that means writing a LOT of grant proposals.


> It's important to understand that only a small percentage of academics will ever get tenure. The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry.

There's also a good chunk of people who fail to advance past the assistant professor level, which is pre-tenure at US institutions (not sure about other countries). And it's up or out, so if you're an assistant professor and you don't get tenure within a certain number of years, you lose your job.


> The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry.

…and that’s for the fortunate disciplines, like CS, where there is actually an “industry” to go to. Let’s just say things look rather less pleasant in the humanities.


Sounds like a dril quote.


Saruman definitely seems like the kind to use AI.


Provided by Palantir?


Apple only promises to support devices for 5 years.

https://support.apple.com/en-us/102772

They do sometimes end up supporting devices for longer than this, but you can't rely on it.


It was only a matter of time!

https://www.youtube.com/watch?v=W7MrDt_NPFk


Do people really believe that knowledge from more than 30 months ago has no value? Even the people doing keyword searches on resumes are smarter than that.


I started doing NLP in 2014. First I used SVMs and feature vectors, then word embeddings, then handcrafted neural network models, then fine-tuned transformer encoders, then worked with LLMs. In that time I worked with a huge number of technologies, libraries, and algorithms. A hiring manager recently asked me what my experience with AI agents was, and I had to say that it's basically zero.

Okay, he was obviously very new to the field and had no idea, but it illustrates how the field has progressed in the past 10 years: a person who is just joining has a starting line very similar to the old-timers'. The breadth of knowledge I have is of course extremely useful, and I am able to pick up new concepts really fast, as there are many similarities. But the market in general does not really care that much.


We were arguing about agents 25 years ago. Everything goes around.


They are talking about the half-life, so the knowledge should never become valueless; its value just drops exponentially.

Also, skills and knowledge are different things, right? I’d believe that half the skills picked up in a fast-growing field are obsolete after a couple years.
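Taking the half-life framing literally, the value decays exponentially rather than ever hitting zero. A minimal sketch, assuming a hypothetical 30-month half-life (the specific number is just for illustration):

```python
def remaining_value(initial, months, half_life=30.0):
    """Exponential decay: value halves every `half_life` months."""
    return initial * 0.5 ** (months / half_life)

print(remaining_value(100, 30))   # 50.0  (one half-life)
print(remaining_value(100, 60))   # 25.0  (two half-lives)
print(remaining_value(100, 300))  # 0.09765625: small, but never zero
```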


The article compares its value to that of a Nokia flip phone. Nokia flip phones, while not as valuable as an iPhone, aren't worthless; they can still fetch something on the open market.


Yes. Memorizing how a framework works is not knowledge. Knowledge is the deep understanding thing LLMs can't do.


Mary Meeker apparently believes it, but she's made a career out of being confidently wrong about things.


The Lindy effect can be a useful heuristic: something invented 30 months ago probably has less long-term value than something that was invented 10 years ago and is still being used.
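The Lindy heuristic says expected remaining lifetime grows with observed age. A toy sketch, where the proportionality constant `k` is an illustrative assumption, not a law:

```python
def lindy_expected_remaining(age_years, k=1.0):
    """Lindy heuristic: the longer a non-perishable thing has survived,
    the longer it is expected to keep going (remaining ~ k * age)."""
    return k * age_years

print(lindy_expected_remaining(10.0))  # 10.0: a 10-year-old tool
print(lindy_expected_remaining(2.5))   # 2.5:  a 30-month-old technique
```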


Easily half of the crap I learned over 2 yrs ago is completely worthless. Mental models of code bases that I no longer work on, as one example.


What is "half of what you learn"? Frankly, I think people underrate the amount of learning that goes into even the smallest things in software development.

Think about system utilities: bash/zsh, git, vim, tmux, make, ssh, rsync, docker, LSPs, grep -- all of these are useful and have been for a decade or more. C, C++, Java, Python -- all languages which have been useful and will continue to be; languages like Go and Rust are really exceptions, not the rule, and even when new ones come onto the scene, by and large languages stay the same more than they change.

Then there are things like threading and concurrency, and how to manage mutexes. Or background knowledge of the Linux kernel: how paging, processes, system calls and the like work under the hood. Architectural knowledge is essential if you're doing anything performant, and the minimum time for things to change in that space is the 3-year hardware development cycle, more practically 5 years or more. Even with GPUs, where much has changed, if you learned CUDA 5 years ago you'd still be doing great today.


It goes even further than single tools like those.

Very little about Go is entirely novel. It’s a newer descendant of C with some additions and excisions but mostly similar syntax. The language is newer, but the way you use it isn’t that new. Java is about equivalent to C++. PHP and Ruby were inspired by Perl and took things in different directions, while Perl itself was an amalgamation of C, BasicPlus, Lisp, shell, and the common shell tools like sed, awk, and tr. From Ruby comes Crystal. From PHP comes Hack.

tmux isn’t entirely different from screen.

git works differently, but on the surface isn’t drastically different from Subversion.

ssh is basically rsh with encryption.

Zsh and Bash are basically in the Bourne / Korn family trees of shells.

Docker has a nicer CLI, but a lot of the same concepts as LXC, LXD, jails, and more. Podman inherited a lot from Docker.

make, cmake, imake, Ant, SCons, and other build systems cover the same tasks with different syntax.

GitHub, GitLab, and Jenkins all cover CI/CD differently, but how to build a reliable pipeline is a skill that transfers to new syntax and configuration file names.


That’s like saying the workout you did 2 years ago or the food you ate 2 years ago are now worthless.


I think the ability to grok thru code bases and find patterns plateaus after like 5 years.


Grok, what was Stack Exchange?


No, it's Pocket change. Which could be better spent on the CEO's salary!

https://www.reddit.com/r/browsers/comments/18b6tdp/mozilla_c...


You have achieved enlightenment.

Now you no longer need to post here.

