
C99 specified stdint.h, which includes int16_t/uint16_t, so compliant compilers are required to support them even if they don't map to a built-in type. You won't lose the short.

That said, it wouldn't make such an ABI any less insane, so no one does this; in practice int is 32 bits everywhere except microcontrollers (and the 8086, for the tiny handful of people writing BIOS or bootloader code).
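
For concreteness, a minimal sketch of that guarantee in use; this is plain C99, with the format macros from inttypes.h, and nothing here is specific to any particular ABI:

    #include <stdint.h>    /* int16_t / uint16_t (C99) */
    #include <inttypes.h>  /* PRId16 / PRIu16 printf macros */
    #include <stdio.h>

    int main(void)
    {
        /* Whatever the ABI calls "short" or "int", int16_t is
           exactly 16 bits wherever it is defined at all. */
        int16_t  x = -12345;
        uint16_t y = 54321;
        printf("%" PRId16 " %" PRIu16 "\n", x, y);
        return 0;
    }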



Each of the intXX_t typedefs is optional. A given typedef is available iff the implementation has a type that is exactly XX bits wide (C99 §7.18.1.1).
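
A sketch of how a program can test for this at compile time: C99 defines the limit macros (INT16_MAX and friends) exactly when the corresponding typedef exists, and the least-width types are required unconditionally, so a fallback looks like:

    #include <stdint.h>

    #ifdef INT16_MAX
        /* The implementation provides an exact 16-bit type. */
        typedef int16_t narrow16;
    #else
        /* Fall back to the narrowest type of at least 16 bits,
           which C99 guarantees to exist. */
        typedef int_least16_t narrow16;
    #endif

(narrow16 is just an illustrative name, not anything standard.)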


Right, which it does: we're talking about ABI variations within a single architecture. The fact that the OS and toolchain enforce ILP64 vs. LP64 semantics on C programs has nothing to do with the i386's ability to operate on 16-bit chunks.


The "implementation" in the meaning of the C standard includes the OS and toolchain. If the C toolchain does not provide a 16 bit type, then it need not define (u)int16_t, regardless of the CPU that it is running on.


However, as mentioned recently in an HN comment somewhere, there are architectures that can only operate on 32-bit or larger chunks, so sizeof(char) == sizeof(int) (I think it may have been an older Cray). I can't find the specific comment, but here's one that mentions a platform with sizeof(char) == 16: http://news.ycombinator.com/item?id=3112704


All machines were word-oriented until the IBM 360s arrived, and many persisted well into the '80s (the PDP-10 is a particularly famous one for hackers). Many of them got C compilers at one point or another.

That's not really the issue, though. My point was that the choice of ILP64 vs. LP64 on a single architecture could not cause you to "lose" a 16-bit quantity. It can't, because those machine instructions obviously don't go away when you change your compiler's calling conventions. So a C99-compliant compiler would still be required to provide int16_t.

Which is... maybe too much minutiae even for a C minutiae thread. But it was my point, anyway.


Yes, I see. It can be annoying when someone widens the scope of an already-narrowed discussion, as my comment tried to do. Thanks for your restatement and clarification.


sizeof(char) is always 1, because sizeof counts in chars, not in octets (in C, a "byte" is by definition the size of a char, however many bits that is). What you mean is CHAR_BIT in limits.h.

This seems to be the common state of things on almost anything that is designed to be fast first and "C-compatible" second.
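
A quick sketch of the distinction, using nothing beyond limits.h:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* Always prints 1: sizeof counts in chars by definition,
           even on a word-addressed machine with 16- or 32-bit chars. */
        printf("sizeof(char) = %zu\n", sizeof(char));

        /* The actual width of a char: 8 on mainstream hardware,
           larger on some DSPs and old word-oriented machines. */
        printf("CHAR_BIT     = %d\n", CHAR_BIT);

        /* Width of int, computed portably. */
        printf("int is %zu bits\n", sizeof(int) * CHAR_BIT);
        return 0;
    }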


D'oh! Yes, I meant CHAR_BIT. Thanks for the correction.



