
> Defined using char, int, short, long or long long

> Note that C does not have a boolean type

`_Bool` and `long long` were both introduced in C99, so this is mixed-up info.

Edit: probably tailor-made for old MSVC, which didn't support _Bool until VS2013.



It's ironic that you've drawn the eye to the very thing that needs to be front and center of any C tutorial, and also the thing that makes C so tricky to work with.

When somebody says "This program is written in C", my initial thought is "Which C?". There is no one, single C.

I don't write C daily. Heck, I don't write it monthly any more. And so my grey cells are struggling with which versions introduced what, and you've spotted something I would have missed on a first read.

And this is a problem.

Can you list all the undefined behaviours, and which language features came into which version across ANSI, C99, C11, C17 and C23? The last one feels a little brighter in my mind, but I definitely can't, and if I was writing a C tutorial - like many that have been written - I'd probably be explicit about choosing a version and sticking with it, and good luck and godspeed to everything outside that version.

Of course this is one of the reasons learning C is harder than other languages, and why languages like Zig and Odin have a decent chance: ergonomically simpler than Rust, all the power and flexibility, (much) less of the head scratching.


Because Zig et al won't have future versions with new features?


Sure, but C predates semantic versioning and is rammed with undefined behaviour that a lot of people depend on.

Modern languages - even those that have high levels of C interop like Zig - can (and do) avoid those problems.


What is wrong with versioning a language like c89, c99 etc.? I think it is a lot easier to keep track of the 7 versions of C than the 14 that are available for zig and the however many there are for rust.

I do agree that some of the UB is a problem though.


The problem (as mentioned above) is that c99 is not the same everywhere.


This is mostly an issue with MSVC, which refuses to become compliant with the C99 standard. Its support for C11 and C17 also has gaps around features that were introduced in C99.


Of course. But they are starting with 40 years less baggage. And can reasonably assume a modern hardware architecture, for example.


You could also just use the newest C standard. I would personally trust C23 code written today to still work in ten years, with excellent compiler support, a lot more than I would trust the same of any code written in Zig, Odin, or Rust.


Ah, but then you have potential interop and portability issues. C11 isn't yet universally adopted, and there are some dark corners out there where even ANSI (C89/C90) is not quite embraced and original K&R is holding out.

I think the jury is out on Zig and Odin (but I like Zig a lot, in particular), but I feel Rust has hit a tipping point - like Go, Python and Java - where there's too much production code out there for it to disappear in the next ten years.

If you were to ask me about languages where that might not be the case in ten years, I'd point to where usage is not very production oriented (R, Julia), or where people have had a good try and decided they want to pull back investment (anecdotally, Ruby and Scala seem to be on that curve right now).


Nothing really disappears; the question is how strong the ecosystem is in ten years, and how good the support is for the code you write today. Rust will not go away, but I doubt that code written today will still work without hassle, or that all the 1000 crates it depends on will still exist.

That there are dark corners using C89 or K&R is not a weakness; it demonstrates how strong the C ecosystem is. If you write something in Zig or Rust now, you need to realize that in ten years it might also be considered ugly legacy code, even if you think it is shiny modern code today. The question then is whether it is as easy as `gcc -std=c89` to work with it.


A piece of anecdata: my Rust code from 8 years ago still compiles fine, though the compiler complains about not having an edition set.


News such as this doesn't build confidence, though: https://internals.rust-lang.org/t/type-inference-breakage-in...


Ah. Apparently I didn’t use that crate, so happily missed that footgun, thanks for the info! That is an unfortunate case.

Though I suspect it’s fair to say the entire settlement is built of glass houses here :-)

https://news.ycombinator.com/item?id=43798312

I suppose if one wants absolutely no surprises, they will need to lock the entire toolchain as well, regardless of the language…


Maybe, but this GCC change would affect only code that is already broken and then there is even a compiler flag to fix it.


I think you will always have this sort of thing for anything that's primarily driven by a standard (e.g. C, but also most web stuff) versus anything that's primarily driven by one specific implementation (most other languages).

Things are a lot better today than they used to be, though: compilers that don't support modern features are rarer, and compilers give much better diagnostics for things like UB.



