Hacker News

When it comes to analytics or machine-learning products, scaling without massive implementation effort seems to be a universally hard problem. The usual big names manage it on some level, but startups seem to get stuck: they build a handful of great solutions for a handful of clients and problems, yet can't grow that number without increasing the effort proportionally.

Do you know of any examples of small scrappy places that have figured this out? Is there any good writing on what kind of organisational and engineering approaches are necessary to make this step?



This is because data is basically a service business: the value lies in the specific data, and it's hard to build a product that works for customers without a lot of custom work.



