First question - do you even need a database right now? Have you, for example, considered storing the data as CSV files and simply loading them into Pandas or R on demand?
I am currently working on a project analyzing massive amounts of options data and have found this approach both easy and flexible to work with... and as the project matures I may move select parts of it into a database.
As for massive - think daily options data for 3000 stocks, spanning a number of years, with information down to the tranche level (roughly 60 million rows if stored in relational fashion). In my case the analysis can be done at the stock level, which means only one three-thousandth of the dataset needs to be loaded into memory at any time.