
"The study would not have been possible without the use of artificial intelligence to weed through dozens of proteins in a class called prohormones."

"the researchers designed a computer algorithm they named "Peptide Predictor" to identify typical prohormone convertase cleavage sites in all 20,000 human protein-coding genes."

So look, when I have to go around talking to normies, they're like, "don't you think AI is going to take over everything?" For these folks, "AI" means *one thing*: ChatGPT or similar LLMs. So when they read this, they're like, "see? AI! We need AI everywhere! ChatGPT assistants!" And then I pretty much have to lose my shit, and they think I'm crazy, because they have no clue. It's the same deep inside all the corporate hype departments so many of us have to endure, where management chucks shitty LLM garbage at us all day, ordering us to "integrate this! integrate that! we need AI" (by which they mean hallucinating chatbots).

That is NOT what they used here. They wrote their own algorithm, which presumably uses some straightforward machine-learning approach such as Bayesian filtering (edit: it's literally just a Python + R script with some training data: https://github.com/Svensson-Lab/pro-hormone-predictor/tree/m...). In fact, I bet the researchers are calling this short script "AI" because their investors/bosses/managers are demanding they "use AI". "Machine learning" is a far more accurate term than "AI", which IMO is a completely useless hype term at this point. It's making our lives as engineers worse: the whole world thinks robots are taking over, and our bosses are obsessed with making us "use AI" for everything (which is of course because they'd eventually like to have fewer employees).
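
For reference, prohormone convertases mostly cut at dibasic residue pairs (KR, RR, KK, RK), so the core of a predictor like this fits in a few lines. Here's a rough sketch of my own, NOT the Svensson Lab code; the toy sequence and the simple split-after-motif rule are just for illustration:

    # Rough sketch, not the authors' code: scan a precursor protein for
    # dibasic motifs (KR, RR, KK, RK), the classic prohormone convertase
    # cleavage signal, and split it into candidate peptides.
    import re

    DIBASIC = re.compile(r"(?=(KR|RR|KK|RK))")  # lookahead catches overlapping motifs

    def candidate_cleavage_sites(sequence: str) -> list[int]:
        """0-based positions where a dibasic motif starts."""
        return [m.start() for m in DIBASIC.finditer(sequence)]

    def predicted_peptides(sequence: str) -> list[str]:
        """Cut after each dibasic pair and return the resulting fragments."""
        cuts = [pos + 2 for pos in candidate_cleavage_sites(sequence)]
        fragments, prev = [], 0
        for cut in cuts:
            fragments.append(sequence[prev:cut])
            prev = cut
        fragments.append(sequence[prev:])
        return [f for f in fragments if f]

    if __name__ == "__main__":
        # made-up toy precursor, purely illustrative
        toy = "MALWTRLLPLLALLALWGPDPAAAKRGIVEQCCTSICSLYQLENYCN"
        print(candidate_cleavage_sites(toy))
        print(predicted_peptides(toy))

The real thing presumably adds training data and scoring on top of a motif scan like this, which is exactly my point: it's a script, not a chatbot.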


