
Advancements in parameter-efficient fine-tuning (PEFT) came from internet randoms because the big cos don't care about PEFT.
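(For context, PEFT covers techniques like LoRA that train a small set of added weights instead of the full model. A minimal sketch of the idea, assuming Hugging Face's peft library; the "gpt2" base model is just a placeholder for illustration:)

  # Wrap a causal LM with LoRA adapters so only a small
  # fraction of parameters is trained (a PEFT method).
  from transformers import AutoModelForCausalLM
  from peft import LoraConfig, get_peft_model

  base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder base model

  config = LoraConfig(
      r=8,            # rank of the low-rank update matrices
      lora_alpha=16,  # scaling factor applied to the update
      lora_dropout=0.05,
      task_type="CAUSAL_LM",
  )

  model = get_peft_model(base, config)
  model.print_trainable_parameters()  # typically well under 1% of weights are trainable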



... Sort of?

HF is sort of big now. Stanford is well-funded, and they did PyReft.


HF is not very big, and Stanford doesn't have lots of compute.

Neither of these is even remotely a big lab of the kind I'm discussing.


HF has raised more than $400m. If that doesn't qualify them as "big", I don't know what does.



