
I mean you can easily mount the S3 bucket to the local filesystem (e.g. using s3fs-fuse) and then use standard command line tools such as ls and grep.
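A minimal sketch of what that looks like, assuming a bucket named my-bucket, a mount point at /mnt/my-bucket, a logs/ prefix, and credentials stored in ~/.passwd-s3fs (all of these are placeholders):

    # mount the bucket with s3fs-fuse, then use ordinary tools on the mount point
    s3fs my-bucket /mnt/my-bucket -o passwd_file=${HOME}/.passwd-s3fs

    # standard command line tools now see the keys as files
    ls /mnt/my-bucket
    grep -rn "needle" /mnt/my-bucket/logs/

Keep in mind every read still goes over the network to S3, so this is convenient rather than fast.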


I inherited an S3 bucket where hundreds of thousands of files had been written to the bucket root. Every filename was just a UUID. ls might work, after waiting for it to page through every file. To grep, you would need to download 5 TB.
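To make the scale concrete, here is roughly what that looks like with the AWS CLI (bucket name and key are placeholders):

    # even counting the objects means paging through the entire listing
    aws s3 ls s3://my-bucket/ --recursive --summarize

    # "grepping" any one object means downloading it first
    aws s3 cp s3://my-bucket/<uuid> - | grep "needle"

With no prefixes to narrow things down, every listing or search touches the whole bucket.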


It's probably going to be dog slow. I've dealt with HDDs where just iterating through all files and directories took hours, and network storage is going to be even slower at this scale.




