
Define better.

As the other poster said, you could use XML, which is more powerful but, as a result, a lot more complex. For most tasks I'd prefer JSON because, while it is lacking, all the real-world parsers I've seen are much easier to work with and I rarely need more complexity. If someone did a JSON++ (I have no doubt many people have, but I'm not aware of them!) that added things like integers without the complexity of XML, that might be even better. In the real world, if something should be an integer it isn't hard to check that and error out - you need to handle parse errors in any data format anyway.
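
For example, that check is only a few lines in Python (the "count" field name here is just made up for illustration):

    import json

    def load_count(raw: str) -> int:
        # Parse the document, then insist that "count" really is an integer.
        data = json.loads(raw)          # raises json.JSONDecodeError on bad syntax
        count = data.get("count")
        if isinstance(count, bool) or not isinstance(count, int):
            raise ValueError(f"count must be an integer, got {count!r}")
        return count

    load_count('{"count": 3}')    # 3
    load_count('{"count": 3.5}')  # ValueError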

Protobuf is sometimes better for data serialization. It isn't human readable, but you rarely need that, and saving bytes is often useful even today. Protobuf does have the integer type you are missing, but it has other limitations that might or might not apply to you. (I don't use protobuf enough myself to know what they are.)
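
To give a rough sense of the size difference, here's a Python sketch using one of protobuf's built-in wrapper messages (so it needs no custom .proto):

    from google.protobuf.wrappers_pb2 import Int64Value

    msg = Int64Value(value=1234567890)
    data = msg.SerializeToString()
    print(len(data))                    # 6 bytes: a field tag plus a varint
    assert Int64Value.FromString(data).value == 1234567890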

The SQLite developers have more than once suggested that their database file is a great serialization format. You get a lot of power there, and for complex things a database is often easier to work with than an XML file. There are various NoSQL databases as well that can sometimes work for this.
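
A rough sketch of what that looks like with Python's built-in sqlite3 module (the settings table and file name are just for illustration):

    import sqlite3

    def save(path, settings):
        # One SQLite file acts as the application's save format.
        with sqlite3.connect(path) as db:
            db.execute("CREATE TABLE IF NOT EXISTS settings (key TEXT PRIMARY KEY, value TEXT)")
            db.executemany("INSERT OR REPLACE INTO settings VALUES (?, ?)", settings.items())

    def load(path):
        with sqlite3.connect(path) as db:
            return dict(db.execute("SELECT key, value FROM settings"))

    save("app_state.db", {"theme": "dark", "volume": "7"})
    print(load("app_state.db"))         # {'theme': 'dark', 'volume': '7'}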

I've handwritten my own serialization format in the past. The only hard part is designing in enough flexibility to accommodate whatever the future needs turn out to be. (Note that I've never had to read my serialization on a different CPU family; things like little- vs. big-endian can be a pain, I'm told.)
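
A toy version of that in Python, with a version byte for future growth and a forced byte order so endianness never bites (the magic bytes and fields are made up):

    import struct

    MAGIC, VERSION = b"MYF1", 1   # hypothetical file signature and format version

    def dump_record(name: str, score: int) -> bytes:
        encoded = name.encode("utf-8")
        # "<" forces little-endian, so the file reads the same on any CPU.
        return MAGIC + struct.pack("<BI", VERSION, len(encoded)) + encoded + struct.pack("<i", score)

    def load_record(blob: bytes):
        if blob[:4] != MAGIC:
            raise ValueError("not one of our files")
        version, name_len = struct.unpack_from("<BI", blob, 4)
        if version != VERSION:
            raise ValueError(f"unsupported version {version}")   # room to grow later
        name = blob[9:9 + name_len].decode("utf-8")
        (score,) = struct.unpack_from("<i", blob, 9 + name_len)
        return name, score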

There might be something else I didn't cover... Everything has pros and cons.



Protobuf does support JSON encoding[0], which I like: the .proto definition is quite readable, and you can encode/decode either human-readably or efficiently. It's even quite easy to have your consumer support both, since the two are easy to tell apart; if you know the payload is either one or the other, you can try one and fail over to the other, possibly at some small cost. The guide also points out some significant downsides to relying on the JSON version, but it can be useful in development and debugging, especially if you control both the sending and receiving sides and can just toggle it on temporarily when you want.

[0]https://protobuf.dev/programming-guides/json/
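
A rough sketch of that failover in Python, using protobuf's well-known Struct type as a stand-in for your own message:

    from google.protobuf import json_format
    from google.protobuf.struct_pb2 import Struct   # stand-in for your own message type

    def decode_payload(payload: bytes) -> Struct:
        # Accept either the JSON text encoding or the compact binary wire format.
        msg = Struct()
        try:
            json_format.Parse(payload.decode("utf-8"), msg)
            return msg
        except (UnicodeDecodeError, json_format.ParseError):
            msg = Struct()
            msg.ParseFromString(payload)
            return msg

    msg = Struct()
    msg.update({"name": "example", "count": 3})
    decode_payload(msg.SerializeToString())                         # binary
    decode_payload(json_format.MessageToJson(msg).encode("utf-8"))  # JSON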


> It isn't human readable

This is a tooling problem. Wireshark can decode protobuf for you when you're inspecting gRPC traffic.


Needing that tooling is a format problem.

JSON is bad at everything except being simple and easy. Turns out simple and easy is a real winner.


JSON has one glaring flaw: nested JSON encoded in strings becomes awful to read. I encounter it too often in reality, where individual layers use JSON but want to support arbitrary strings in their API. Encodings that use length prefixes don't suffer from this, which ironically includes most binary formats.
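
A quick Python demonstration of how the escaping piles up after just one layer of nesting:

    import json

    inner = json.dumps({"message": 'he said "hello"', "path": "C:\\temp"})
    outer = json.dumps({"event": "log", "payload": inner})
    print(outer)
    # {"event": "log", "payload": "{\"message\": \"he said \\\"hello\\\"\", \"path\": \"C:\\\\temp\"}"}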


Back to my main point though: normally I don't need the complexity that things like nested JSON imply. When you do, though, JSON is a bad format. (Actually, I would go so far as to say you never need something that complex - the problems you are trying to solve with nested JSON are complex enough that you should use a more powerful/complex framework, and better design of your data store would avoid the need for nested JSON in the first place.)


If you have the correct version available. All too often when debugging problems, the person in the field doesn't have the correct tools or doesn't know how to use them (and in this case you may not want to share the .proto definition with that person...). As such, the fewer tools needed to understand something, the better.



