
It's an open source language model with seven billion parameters (a rough measure of its size), and a longer-than-typical sequence length of 8K tokens, which lets you provide more context when querying the model. For example, you can better generate text in someone else's voice by supplying a longer sample of their writing.
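For the curious, here's a rough sketch of what the longer context buys you in practice, using the Hugging Face transformers API. The model identifier is a placeholder, not a reference to any particular checkpoint:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "some-org/some-7b-8k-model"  # hypothetical identifier
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Several thousand words of the target author's writing -- the point of an
    # 8K window is that most of this fits in the prompt instead of being cut off.
    style_sample = open("long_excerpt.txt").read()
    prompt = style_sample + "\n\nContinue in the same voice:\n"

    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=8192)
    outputs = model.generate(**inputs, max_new_tokens=512)
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))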

https://en.wikipedia.org/wiki/Foundation_models



The number of parameters seems meaningless when the training sets are dogshit and hardened old gum chipped from the shoes of Gregslist and FleaBay.

OTOH, there ought to be a construction in the form of a web app that can pinch out nonrepetitive, coherent ~100-page trashy romance novels in the style of any named author, given open source or specific text(s) or transcripts with enough original input volume: Churchill, The Unabomber, a psycho-happy kindergarten child-development IEP manual writer, The Dude, Walter (agro gun nut), Bob Ross, Grace Hopper, Ayn Rand, LBJ, The Dalai LaMa%, Hitler, Kanye (Ye), Bhad Bhabie, and the King James Bible. Ethical and generational safety features be damned; it'd be generating fucking^2 art for hilarious entertainment purposes. How does one stretch limited training input, maybe with human or automated validation of the output, so the thing doesn't get stuck repeating nonsense? (A decoding-time sketch follows below.)

% He never saw that one coming Ow^(3 + i).
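Not an answer on the training-data side, but on the decoding side the usual first move against "sticking on repetitive nonsense" is sampling plus repetition penalties. A rough sketch with the Hugging Face transformers generate() API; the model name is a placeholder and the parameter values are illustrative, not tuned:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "some-org/some-7b-8k-model"  # hypothetical identifier
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    prompt = "Chapter 1\n\nThe storm broke over the moor as "
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=1024,
        do_sample=True,            # sample instead of greedy decoding
        temperature=0.9,           # some randomness keeps the prose from looping
        top_p=0.95,                # nucleus sampling trims the degenerate tail
        repetition_penalty=1.2,    # penalize tokens already seen in the sequence
        no_repeat_ngram_size=4,    # hard-block exact 4-gram repeats
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))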


Do you have any recommendations for cryptocoins?


Do you always ask flippant and rude questions that add no value to HN?


dang so this means that just like my 30-year-old 4GB hard drive is to my 4GB-RAM phone… we ain't seen nothing yet if we're still counting these metrics?



