
I don't mind the zero value for the proto enums; it makes sense. But I require that the conversion into my inner logic not carry this "unknown" value along, and that the conversion returns an error if it fails.
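
Roughly what I mean, in Go (the Status names and the stand-in for the generated enum are made up for illustration):

    package main

    import "fmt"

    // Stand-in for a generated proto enum (names are hypothetical).
    type PBStatus int32

    const (
        PBStatus_STATUS_UNSPECIFIED PBStatus = 0
        PBStatus_STATUS_ACTIVE      PBStatus = 1
        PBStatus_STATUS_DISABLED    PBStatus = 2
    )

    // Internal domain type: deliberately has no "unknown" member.
    type Status int

    const (
        StatusActive Status = iota
        StatusDisabled
    )

    // toDomain refuses to let the zero/unknown value leak into business logic.
    func toDomain(s PBStatus) (Status, error) {
        switch s {
        case PBStatus_STATUS_ACTIVE:
            return StatusActive, nil
        case PBStatus_STATUS_DISABLED:
            return StatusDisabled, nil
        default: // STATUS_UNSPECIFIED or any value this build doesn't know
            return 0, fmt.Errorf("cannot map proto status %d", s)
        }
    }

    func main() {
        if _, err := toDomain(PBStatus_STATUS_UNSPECIFIED); err != nil {
            fmt.Println("rejected:", err)
        }
    }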

I've seen engineers carry those unknown or unspecified values through into the business logic, and that always made my face flush red with anger.



Why the anger?

If you are consuming data from some other system, you have no power over what to require from users. You will have data points with unknown properties.

Say you are tracking sign-ups in some other system, they collect the users' browser in the process, and you want to see the conversion rate per browser. If the browser could not be identified, would you prefer it to say "other" instead of "unknown"?

I think I prefer the protobuf best-practices way: you have a 0 "unknown"/"unset" value, and you enumerate the rest with unique names (and numbers). The enum can be expanded in the future, so your code must be prepared for unknown enumerated values tagged with a number that is new (future) to your code. They are all unique; you just don't yet know the names of some of the enum values.

You can choose not to consume them until your code is updated with a more recent schema. Or you can reconcile later, annotating with the name if you need it.
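
For instance, in Go the generated enum is just an int32, so a value produced by a newer schema arrives as a number you have no name for yet; you can keep it distinct from "unspecified" and reconcile later (a sketch with made-up names):

    package main

    import "fmt"

    // Stand-in for a generated proto3 enum (names are hypothetical).
    type Browser int32

    const (
        Browser_BROWSER_UNSPECIFIED Browser = 0
        Browser_BROWSER_FIREFOX     Browser = 1
        Browser_BROWSER_CHROME      Browser = 2
        // A newer schema might add BROWSER_SAFARI = 3; this build has no
        // name for it, but the number still arrives intact on the wire.
    )

    var names = map[Browser]string{
        Browser_BROWSER_UNSPECIFIED: "unspecified",
        Browser_BROWSER_FIREFOX:     "firefox",
        Browser_BROWSER_CHROME:      "chrome",
    }

    // label keeps unknown-but-valid numbers distinct from "unspecified",
    // so they can be reconciled once the schema is updated.
    func label(b Browser) string {
        if n, ok := names[b]; ok {
            return n
        }
        return fmt.Sprintf("enum value %d (name not known yet)", b)
    }

    func main() {
        fmt.Println(label(Browser(3))) // prints: enum value 3 (name not known yet)
    }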

Now personally, I would not pick an enum for any set of things that is not closed at design time. But I'm starting to think that such sets hardly exist in the real world. Humans redefine everything over time.


I wrote my own Protobuf implementation (well, with some changes). Ditching the default values was one of the changes I made; I don't see any reason to have them. But I don't think Protobuf is a reasonable or even decent protocol in general. It has a lot of nonsense and bad planning. Having default values is probably not among the ten worst things about Protobuf.



