If it's on their website, competitors can write a simple crawler and build that catalog anyway.
And you don't have to send every single field you have in your database. Once the user selects a category you can send metadata that enables the client to scaffold the UI, then cache the rest while the user interacts with the site.
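For concreteness, here's a minimal TypeScript sketch of that flow. The endpoint paths, the `CategoryMeta` shape, and the in-memory cache are all hypothetical, just to illustrate the metadata-first idea:

```ts
// Hypothetical payload: just enough to scaffold the filter UI
// before any item data arrives.
interface CategoryMeta {
  id: string;
  name: string;
  itemCount: number;
  facets: { field: string; label: string; values: string[] }[];
}

const fullCatalogCache = new Map<string, unknown[]>();

async function onCategorySelected(categoryId: string): Promise<CategoryMeta> {
  // 1. Fetch the small metadata payload and render the filters immediately.
  const meta: CategoryMeta = await fetch(`/api/categories/${categoryId}/meta`)
    .then((r) => r.json());

  // 2. Prefetch the full item list in the background while the user
  //    is still looking at the filters; cache it for instant filtering.
  void fetch(`/api/categories/${categoryId}/items`)
    .then((r) => r.json())
    .then((items: unknown[]) => fullCatalogCache.set(categoryId, items));

  return meta;
}
```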
Barnes & Noble - according to their FAQ - has 1 million unique items in their catalog, but they also have tons of different categories. A single book cover weighs around 30 KB. I'll leave it as an exercise to figure out how much filtering metadata you can fit into those same 30 KB.
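As a rough back-of-envelope (the per-entry size is an assumption for illustration, not a measurement):

```ts
// Assume ~40 bytes per filter entry after gzip
// (a short label + id + item count).
const budgetBytes = 30 * 1024; // one book cover's worth
const bytesPerEntry = 40;      // assumed compressed size
console.log(Math.floor(budgetBytes / bytesPerEntry)); // 768 filter entries
```

Several hundred facet values is more than enough to drive a filtering UI for one category, all for the price of a single cover image.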
btw: opening their front page downloads 12.1 MB already.