There's a relative of this in making in-memory node representations efficient for large structs with a bunch of rarely-used fields: chunk memory into units (most reasonably 8 bytes / pointer size), allocate offsets for the rarely-used fields in ascending order, and then use a bitmask to indicate which fields are present. (Note the bits correspond to units, not fields; a 16-byte field uses 2 adjacent bits in the bitmask.)
The trick is that masking & popcount (both low-cycle instructions on most modern CPUs¹) make this fast to access: a field's offset is just the popcount of the presence bits below its unit, so a lookup is a mask, a popcount, and an add, which is cheap enough for an in-memory representation.
The intended use is when presence of optional fields is known at allocation time and doesn't change afterwards, i.e. some object is built up into a dynamically allocated buffer whose size is shrunk down by omitting fields. Changing which fields are present afterwards requires reallocating the thing, which tends to make the entire pattern pointless.
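A minimal sketch of the layout in C, assuming a single 64-bit presence mask and 8-byte units (the names and the 64-unit cap are mine, not from any particular implementation):

    #include <stdint.h>
    #include <stdlib.h>

    /* A node is a presence bitmask followed by only the units that
       actually exist.  Bit i of `present` says whether unit i was
       allocated; multi-unit fields set multiple adjacent bits. */
    typedef struct {
        uint64_t present;    /* bits over units, not fields */
        uint64_t units[];    /* one entry per set bit, ascending */
    } node;

    /* Which units exist is fixed at allocation time, so the buffer
       can be sized to exactly the fields that are present. */
    static node *node_alloc(uint64_t mask) {
        size_t nunits = (size_t)__builtin_popcountll(mask);
        node *n = calloc(1, sizeof *n + nunits * sizeof(uint64_t));
        if (n) n->present = mask;
        return n;
    }

    /* Offset of unit u (u < 64) = popcount of the presence bits
       below it; returns NULL if the field was omitted. */
    static uint64_t *node_field(node *n, unsigned u) {
        uint64_t bit = 1ULL << u;
        if (!(n->present & bit))
            return NULL;
        return &n->units[__builtin_popcountll(n->present & (bit - 1))];
    }

So e.g. node_alloc((1ULL << 0) | (1ULL << 5)) allocates two units, and node_field(n, 5) lands on units[1]; a 16-byte field would occupy two adjacent units and test/set two adjacent bits.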
¹ the real annoyance here is that almost all x86_64 CPUs have POPCNT, except the absolute earliest ones, but if you build e.g. some package for a Linux distribution without any CPU optimization flags, it'll use a library fallback popcount routine rather than the instruction :(
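(If you control the build, the fix is just letting the compiler assume the instruction; with GCC/Clang and the builtin used in the sketch above, something like

    cc -O2 -mpopcnt node.c    # or -march=x86-64-v2, or -march=native

is enough, and without it __builtin_popcountll goes through the fallback.)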