
In the article, the "Materialized view for data derivation" part does the heavy lifting.

I assume this means they are creating time-series (indices) on the fly, with eventual backfill of the data. For the "exploratory analytics", the techniques developed for Dremel/Drill/Impala [0] are sufficient, and for anything else, raw data-crunching speeds are really impressive nowadays. (And they claim they can ingest 1B JSON records in ~10-30 seconds [1].)

[0] https://en.wikipedia.org/wiki/Dremel_(software)

[1] https://greptime.com/blogs/2025-03-18-jsonbench-greptimedb-p...
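
To spell out my guess, here's a toy Python sketch of the general idea (not how GreptimeDB actually implements it): maintain a bucketed time-series aggregate on the write path, and fold older raw data into the same aggregate later as backfill.

    from collections import defaultdict

    BUCKET_SECONDS = 60

    # Derived "materialized view": minute bucket -> running aggregate.
    view = defaultdict(lambda: {"count": 0, "sum": 0.0})

    def bucket(ts: float) -> int:
        # Truncate a unix timestamp to its minute bucket.
        return int(ts) - int(ts) % BUCKET_SECONDS

    def ingest(record: dict) -> None:
        # Write path: every incoming record also updates the derived view.
        b = bucket(record["ts"])
        view[b]["count"] += 1
        view[b]["sum"] += record["value"]

    def backfill(raw_records) -> None:
        # Eventual backfill: replay raw data that predates the view into the same buckets.
        for r in raw_records:
            ingest(r)

    # Live ingest, then backfill of an older raw row into the same view.
    ingest({"ts": 1700000030.0, "value": 2.5})
    backfill([{"ts": 1699999990.0, "value": 1.0}])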



