
Increasing the token limit is a solvable problem


Sure, we just need next-level supercomputers for these large models and the patience to wait multiple days for output.


Not necessarily - you just need hierarchical abstraction memory. I reckon my "token" limit when analysing code is around 7.
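
A rough sketch of what that could mean, purely as one interpretation of "hierarchical abstraction memory": recursively compress chunks into summaries until everything fits a small working budget (~7 items, echoing the comment above). The `summarize` callable is hypothetical, e.g. an LLM call; nothing here is from an actual system.

```python
from typing import Callable, List

def hierarchical_memory(
    items: List[str],
    summarize: Callable[[List[str]], str],  # hypothetical: many items -> one summary
    budget: int = 7,                        # working-"token" limit
) -> List[str]:
    # Repeatedly collapse groups of `budget` items into one summary each,
    # building higher levels of abstraction until the whole set fits the budget.
    while len(items) > budget:
        items = [
            summarize(items[i : i + budget])
            for i in range(0, len(items), budget)
        ]
    return items

# e.g. hierarchical_memory(source_file_chunks, summarize=my_llm_summarizer)
```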


Increasing the token limit without needing more resources to run the network is a solvable problem.


But you do see the problem with a codebase, right?


The current token limit comes from attention's O(N^2) memory requirement for N tokens; there is research trying to reduce this towards O(N), for example along the lines the (downvoted) sibling comment suggests. This is not exactly straightforward, but not impossible either. It's not a fundamental limitation of language models going forward.
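
A minimal sketch of where the O(N^2) comes from, and of the memory side of those research directions (this illustrates memory only; the compute is still quadratic): vanilla attention materialises an N x N score matrix, while streaming over query rows produces the same output with only O(N) extra memory. Function names are illustrative, not from any particular library.

```python
import numpy as np

def attention_naive(Q, K, V):
    # scores has shape (N, N) -> memory grows quadratically with N
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def attention_streaming(Q, K, V):
    # one row of scores at a time -> peak extra memory is O(N), not O(N^2)
    out = np.empty_like(Q)
    for i in range(Q.shape[0]):
        s = K @ Q[i] / np.sqrt(Q.shape[-1])   # shape (N,)
        w = np.exp(s - s.max())
        out[i] = (w / w.sum()) @ V
    return out

N, d = 1024, 64
Q, K, V = (np.random.randn(N, d) for _ in range(3))
assert np.allclose(attention_naive(Q, K, V), attention_streaming(Q, K, V), atol=1e-6)
```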


That will still only be enough to hold an extended CLI application.



