
There's no mention of SLMs or LLMs, though.

> This work represents a compelling real-world demonstration of “tiny AI” — highly specialised, minimal-footprint neural networks

FPGAs for neural networks have been a thing since before the LLM era.


Huh? The first paragraph literally says they are using LLMs

> [ GENEVA, SWITZERLAND — March 28, 2026 ] — CERN is using extremely small, custom large language models physically burned into silicon chips to perform real-time filtering of the enormous data generated by the Large Hadron Collider (LHC).


The site might have fixed it; to me it says "artificial intelligence" instead of LLM. Still bad, but not "steaming pile of poo on your bank statement" bad.


One of the very few good things from the AI race has been everyone finally publishing more data APIs out in the open, and making their tools usable via CLIs (or extensible APIs).


I feel like the CLI craze started around 2020. That predates ChatGPT.

Charm CLI (Go)

Nushell (Rust)

Warp (shell)

were all around 2020, which is also when alt shells started getting popular, probably for the same reasons they still are.


I assume it's an economies of scale thing now.

It's not like Apple is putting any thought into either the UX or the engineering side of utilising the compute properly (except calculating those glass effects extra inefficiently).

Minimise SKUs and get some use out of the binned chips that have a few failed cores.


I think there's a reward for finesse too.

As you mentioned, scope definition and constraints play a major role but ensuring that you don't just go for the first slop result but refine it pays off. It helps to have a very clear mental model of feature constraints that doesn't fall prey to scope creep.


There's also a reward for not over thinking it and letting AI bring the solutions to you. The outcomes are better when it's a question, answer, and execution session.


In Gallifrey? In Gallifrey.


Never had Nvidia issues on Fedora or Ubuntu so far, in a multi-computer research lab as well.


I find this a tad funny since ccc is my claude code alias, since cc is taken up by the actual, working, greatly optimised and really well made Clang C compiler.
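For what it's worth, a minimal sketch of that kind of alias setup, assuming a bash-like shell and that the CLI binary is named `claude` (the name is an assumption here):

```shell
# Hypothetical ~/.bashrc entry: use "ccc" so the alias doesn't shadow the
# system C compiler, which "cc" traditionally resolves to (e.g. /usr/bin/cc).
alias ccc='claude'
```

The point of picking a distinct name is that `cc` still invokes the compiler as expected in builds and Makefiles.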


cc is the original Unix name for the C compiler binary


It doesn't "require" a launcher at all; but the people have the freedom to change theirs.

Kinda like MacOS users only have Aqua, whereas Linux has a lot of DEs, the choice of which is entirely handed to the user.


Android without a launcher would be pretty weird.

You could use the AOSP launcher, or whatever your phone maker installs instead. I don't, because phone makers seem to really want to push an unremovable search bar that I don't want. And swipe left to see a totally confusing screen I also don't want.


The poor maids and servants, the poor chauffeur, the poor chef, etc.


'Before you joined' seems to imply that she doesn't anymore.

I find it a little amusing that AI companies provide free AI subscriptions to their employees and their families. Perhaps because I'd never thought of it that way.

That's kinda nice actually; I assume employees get early access to models to test (5.3 codex, for example). Do families get it too?

