If converting SQLite to CSV/parquet is a one-off task for you, you can also import SQLite files (among the other 50+ formats for tabular data) and export to multiple supported formats (CSV and parquet included) using Datagrok (https://datagrok.ai/). Just drag-and-drop the sqlite file, then use the "export" button on top. Everything happens inside the browser (so there is a limit on the max size of the dataset, probably around 1GB).
Disclaimer - I'm one of the developers of Datagrok :)
On my machine, where I did the basic run, the one from the link is much faster:
```
$ time ./duckdb_cli-linux-amd64 ./basic_batched.db -c "COPY user TO 'user.csv'"
100% (00:00:20.55 elapsed)
real 0m24.162s
user 0m22.505s
sys 0m1.988s
```
```
$ time ./duckdb_cli-linux-amd64 ./basic_batched.db -c "COPY user TO 'user.parquet'"
100% (00:00:17.11 elapsed)
real 0m20.970s
user 0m19.347s
sys 0m1.841s
```
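For anyone who'd rather run the same export from an interactive DuckDB session instead of the one-shot CLI call above, something like this should work (a minimal sketch; it assumes DuckDB's sqlite extension is available to attach the file, and it reuses the database/table names from the runs above):

```
-- load DuckDB's SQLite extension and attach the SQLite file
INSTALL sqlite;
LOAD sqlite;
ATTACH 'basic_batched.db' (TYPE sqlite) AS src;

-- export the table to CSV and to Parquet
COPY src."user" TO 'user.csv' (HEADER, DELIMITER ',');
COPY src."user" TO 'user.parquet' (FORMAT parquet);
```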
```
$ time cargo run --bin parquet --release -- basic_batched.db user -o out.parquet
Finished `release` profile [optimized] target(s) in 0.11s
Running `target/release/parquet basic_batched.db user -o out.parquet`
Database opened in 14.828µs
SQLite to Parquet Exporter
==========================
Database: basic_batched.db
Page size: 4096 bytes
Text encoding: Utf8
Output: out.parquet
Batch size: 10000
```