Hacker News
aynyc | 11 months ago | on: DuckDB is probably the most important geospatial s...
No, the EC2 instance doesn't have 500GB of RAM. Does DuckDB require that? I actually downloaded the data from S3 to local EBS and it still choked.
broner | 11 months ago
Works fine for me on TB+ datasets. Maybe you were using an in-memory database rather than a persistent one and running out of RAM?
https://duckdb.org/docs/stable/clients/cli/overview.html#in-...
aynyc | 11 months ago
Wait, do you insert the data from S3 into DuckDB? I was just doing a SELECT from the file.
broner | 11 months ago
Nope, just reading from S3. Check this out:
https://duckdb.org/2024/07/09/memory-management.html
fastasucan | 11 months ago
Maybe it's your terminal that chokes because it tries to display too much data? 500GB should be no problem.