jpiech's Hacker News comments

As a matter of fact, if someone owns the system, competing on equal terms (or at all) is impossible. Re: Linux, it's tempting for a couple of reasons, but the potential user base seems to be a problem at the moment.


Unfortunately, even for fast NVMe SSDs, random access to cells (which determines the performance) can't compare to RAM. In the example above, these 0.5 billion cells can still be fully loaded into 16 GB of RAM, but adding further such worksheets will cause spooling and a noticeable slowdown.


For various internal indexing purposes, it's more convenient to use powers of 2 or their multiples. While "no limits" is doable, it would have some impact on performance and would require a somewhat more complex UI for the column-oriented editing actions (which might not necessarily be worth it).


BTW, data tables exceeding 12 million rows can be handled by a companion database program, GS-Base ( https://citadel5.com/gs-base.htm ). It's a database with spreadsheet functions. It supports up to 256 million records/rows, including the same data types as GS-Calc plus binary fields: Long Text, Images/Files, and Code (for code snippets with syntax highlighting for 16 languages). It uses the same type of calculations within one or more tables, but in this case they are record/row-based.


Thanks. It's C++, mostly with custom template classes instead of the STL.

