
The benefit of a Ruby-specific JSON parser is that the parser can directly output Ruby objects. Generic C JSON parsers generally have their own data model, so instead of just parsing JSON text into Ruby objects you'd be parsing JSON text into an intermediate data structure and then walking that to generate Ruby objects. That'd necessarily use more memory, and it'd probably be slower too unless the generic parser is way faster.
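A rough sketch of that extra conversion pass, assuming yyjson's reader API and Ruby's C extension API (error handling and unsigned/bignum integers are glossed over); this is the second walk that a Ruby-specific parser avoids by building VALUEs directly while parsing:

  #include <ruby.h>
  #include "yyjson.h"

  /* Convert one node of an already-parsed yyjson tree into a Ruby object. */
  static VALUE yy_to_ruby(yyjson_val *val) {
      switch (yyjson_get_type(val)) {
      case YYJSON_TYPE_NULL: return Qnil;
      case YYJSON_TYPE_BOOL: return yyjson_get_bool(val) ? Qtrue : Qfalse;
      case YYJSON_TYPE_NUM:
          return yyjson_is_real(val) ? DBL2NUM(yyjson_get_real(val))
                                     : LL2NUM(yyjson_get_sint(val));
      case YYJSON_TYPE_STR:
          return rb_utf8_str_new(yyjson_get_str(val), yyjson_get_len(val));
      case YYJSON_TYPE_ARR: {
          VALUE ary = rb_ary_new_capa(yyjson_arr_size(val));
          yyjson_arr_iter it;
          yyjson_arr_iter_init(val, &it);
          yyjson_val *elem;
          while ((elem = yyjson_arr_iter_next(&it)))
              rb_ary_push(ary, yy_to_ruby(elem));
          return ary;
      }
      case YYJSON_TYPE_OBJ: {
          VALUE hash = rb_hash_new();
          yyjson_obj_iter it;
          yyjson_obj_iter_init(val, &it);
          yyjson_val *key;
          while ((key = yyjson_obj_iter_next(&it)))
              rb_hash_aset(hash, yy_to_ruby(key),
                           yy_to_ruby(yyjson_obj_iter_get_val(key)));
          return hash;
      }
      default: return Qnil;
      }
  }

The driver would be roughly yyjson_read(), then yy_to_ruby(yyjson_doc_get_root(doc)), then yyjson_doc_free(doc), so the whole yyjson tree stays resident until all the Ruby objects have been created.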

The same applies to generating JSON: you'd have to walk the Ruby object graph first to build a yyjson document tree, then hand that over to yyjson to serialize.
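The write direction looks similar, again just a sketch on top of yyjson's mutable-document API and the Ruby C API (symbols, bignums and arbitrary objects are hand-waved through to_s):

  #include <ruby.h>
  #include "yyjson.h"

  static yyjson_mut_val *ruby_to_yy(yyjson_mut_doc *doc, VALUE obj);

  struct hash_ctx { yyjson_mut_doc *doc; yyjson_mut_val *obj; };

  /* rb_hash_foreach callback: mirror one key/value pair into the yyjson tree. */
  static int add_pair(VALUE key, VALUE val, VALUE argp) {
      struct hash_ctx *ctx = (struct hash_ctx *)argp;
      yyjson_mut_obj_add(ctx->obj,
                         ruby_to_yy(ctx->doc, rb_obj_as_string(key)),
                         ruby_to_yy(ctx->doc, val));
      return ST_CONTINUE;
  }

  /* Mirror a Ruby object graph as yyjson_mut_vals; only once this duplicate
     tree exists can yyjson serialize it. */
  static yyjson_mut_val *ruby_to_yy(yyjson_mut_doc *doc, VALUE obj) {
      switch (rb_type(obj)) {
      case T_NIL:    return yyjson_mut_null(doc);
      case T_TRUE:   return yyjson_mut_true(doc);
      case T_FALSE:  return yyjson_mut_false(doc);
      case T_FIXNUM: return yyjson_mut_sint(doc, NUM2LL(obj));
      case T_FLOAT:  return yyjson_mut_real(doc, NUM2DBL(obj));
      case T_STRING:
          return yyjson_mut_strncpy(doc, RSTRING_PTR(obj), RSTRING_LEN(obj));
      case T_ARRAY: {
          yyjson_mut_val *arr = yyjson_mut_arr(doc);
          for (long i = 0; i < RARRAY_LEN(obj); i++)
              yyjson_mut_arr_append(arr, ruby_to_yy(doc, rb_ary_entry(obj, i)));
          return arr;
      }
      case T_HASH: {
          struct hash_ctx ctx = { doc, yyjson_mut_obj(doc) };
          rb_hash_foreach(obj, add_pair, (VALUE)&ctx);
          return ctx.obj;
      }
      default:
          return ruby_to_yy(doc, rb_obj_as_string(obj));
      }
  }

After that you'd call yyjson_mut_doc_set_root() and yyjson_mut_write(), so peak memory is roughly the Ruby graph plus a full yyjson copy of it plus the output string.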




All of that would be a big savings in code complexity and a win for reliability, compared to doing new, untested optimizations. If memory usage is a concern, I’m sure there’s a fast C SAX parser out there (or maybe one within yyjson).


I don't understand what you're getting at. If performance is a concern, integrating a different parser written in C isn't desirable, as it would probably be slower than the existing parser for the reasons I mentioned (or at least be severely slowed down by the conversion step), so you need to optimize the Ruby-specific parser. If performance isn't a concern, keeping the old, battle-tested Ruby parser unmodified would surely be better for reliability than trying to integrate yyjson.


Take a look at SAX parsers
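In case it helps anyone reading along: a SAX-style parser emits a stream of events instead of building a document tree, so a binding can construct Ruby objects directly as the text streams by and never holds an intermediate representation. The callback table below is hypothetical, just to show the shape; it isn't yyjson's or any particular library's API:

  #include <stdbool.h>
  #include <stddef.h>

  /* Hypothetical SAX-style JSON callback table (illustrative only). */
  typedef struct {
      void (*on_null)(void *ctx);
      void (*on_bool)(void *ctx, bool value);
      void (*on_number)(void *ctx, double value);
      void (*on_string)(void *ctx, const char *str, size_t len);
      void (*on_object_start)(void *ctx);
      void (*on_object_key)(void *ctx, const char *key, size_t len);
      void (*on_object_end)(void *ctx);
      void (*on_array_start)(void *ctx);
      void (*on_array_end)(void *ctx);
  } json_sax_callbacks;

A binding would point these at functions that push and pop Ruby arrays/hashes on a small stack, so memory use tracks the output objects rather than an extra parse tree.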



