C comes up regularly, and usually the general consensus is that all programmers should know C. I think it would be interesting to see if programmers take this to heart and learn C.
I've learned some C, but I wouldn't be able to write a program in it without a lot of documentation. I can read it, but I rarely use it, maybe once or twice a year.
2100 points
I know C about as well as other languages. It's in my top few languages, and I use it from time to time.
1519 points
I know C thoroughly. It is either my main or best-known language. I use it regularly.
766 points
I am a programmer, but I have never learned any C or written any code in it.
I wish more languages had books equivalent to K&R. Here's how I learned C.
1. Buy K&R.
2. Read it. Maybe 2 or 3 times during the process of reading it, I would go write a program to try what I had just learned. This was around 1980, so trying a program meant walking across campus to a terminal. I could either head to the computing center and use a public terminal, or to the high energy physics building (where I had just started a work-study job programming, and it required C). The former meant competing for a free terminal--and annoying people if I spent my time at the terminal mostly reading K&R instead of typing. The latter meant sitting at a terminal next to a noisy minicomputer in a cold room.
When I got to the end of K&R, I was a C programmer in theory.
3. Start doing real work, using K&R as a reference.
Very quickly theory became aligned with reality, and I was then a C programmer in practice.
Nowadays, most programming books I find assume that I'm sitting eagerly at a computer as I read, and that I want to stop every three paragraphs and type in some code and run it, and then the book discusses what that code did with the author writing under the assumption that I have the output of the code on my screen.
I spent more than a decade (from the age of about 10) learning to program from a wide variety of books, about a variety of languages. I'd written software in Basic, Pascal, C, Amiga E, ARexx, and assembly. It wasn't until I read K&R in my mid-20s that it all actually started making sense (especially C, but everything made more sense after K&R). Up to that point, I'd been copy-paste programming, imitating, etc.
K&R is simply a brilliant programming text. It might just be the right learning model for me, and maybe others can learn as well from other books, but I'd read a dozen C programming books (all much thicker than K&R, but much less valuable) before K&R, and it never really took until K&R.
Amiga E was one of the first programming languages I tried to learn. I had started with Apple BASIC, and happened to get a free Amiga E compiler and booklet on some Amiga magazine. The booklet left a lot to be desired; specifically, it showed how to do output but not input, and as a clueless novice BASIC programmer, I naively assumed that Amiga E could not do input.
I did not write much of interest using Amiga E, and that booklet was nowhere near as useful as K&R.
I started with probably the same booklet from probably the same magazine, but I also found a more complete reference making the rounds on local bulletin boards, so I was able to figure out input, and finally ordered the full language package from the author. Wouter van Oortmerssen is still working on nerdy pursuits, and created some other languages over the years, but none were as popular/successful as Amiga E. I made a small sample playback app with it, but my ambitions were far beyond my abilities.
I think its primary reason for popularity was that it made it so easy to use the Amiga GUI elements and other special features of the Amiga OS and hardware. I always had a hard time making stuff work in C, but could build a UI in no time in E.
I've mentioned this in a previous thread a while back. My father's approach to the question of "can I get some video games?" was to throw K&R at me and let me flounder.
Machines that I had at the time were a Trash80 something or other mostly running BASIC and an AT&T machine running some form of Unix...oh yeah.
But I fought my way through K&R, multiple C compilers, assembler, etc. to actually learn something on these machines.
An interesting comment from an earlier discussion mentioned "this code (in C) is much cleaner than any C++ I have seen!". The truth is, if you know C and have good style and discipline, you can pretty much produce more readable/reusable code than your average C++ programmer.
I program primarily in C and believe I know all there is to know about it. But I have never read K&R. A few weeks back, I started reading it, got bored and gave up. But I keep running into people who swear by K&R. Do you guys think it is worth a read after the fact?
Maybe not. If you're at that level, consider reading "The Standard C Library" by Plauger. I used it as a reference for updating a non-compliant library to be mostly compliant. I gained a better understanding of why the standard C library is the way it is, and learned how to implement all of it.
I've been programming in C since 1990, and I still use K&R, although it's mostly Appendix B (the Standard Library) at this point (I find it easier to read than the man pages). The only other C book I have (and consider worthwhile) is Plauger's _The Standard C Library_, not only for his implementation but for the history behind the Standard Library.
I've never understood the love people seem to have for K&R--especially as a beginner text. In the end I put it down to learning approach differences.
A decade ago I thought the book "C Programming: A Modern Approach" by K.N. King was a good starter text--I don't know if there's a better recommendation that's more recent.
In terms of after the fact reading, if you like the terse style of K&R you might pick up something but if you got bored I suspect that means you don't like the style. I suspect the time might be better spent reading some code instead.
Exactly what I did, fifteen years later. I haven't programmed for anything but fun in ten years, but sometimes fun means C or dialects. It has a staying power in my mind that I attribute to the rigor both of the language and of K&R.
By contrast, I did Perl work for years, and went to script something in it recently. Quickly decided it would be easier to learn Python from scratch than to get my Perl chops back.
Actually, this is exactly how I started learning OpenGL. Be it a new language or a new framework, I always follow these steps.
Read it, maybe a couple of times (say, only chapters 1-3) in the beginning. Understand the basic mechanism first rather than trying out a program copied from somewhere else.
Start testing out simple things.
I'm lucky to not have fallen for all the blog posts/tutorials on OpenGL where they spit out a C file and say to compile and run it.
"I know C thoroughly, though I barely ever use it these days. I used it for so long and during such a formative period of my overall programming career that I can read/write C code with virtually no spin-up time even though I find myself touching C code very rarely. Oh, and btw, when I find myself coding C on a nix box, I am still amused at how I have muscle memory for all those old emacs commands I also haven't used regularly in about a decade though I wouldn't be able to tell you what they are without doing some keyboard mime exercises."
Yeah, well, I guess that is a bit too long to be a poll entry.
I love muscle memory. I only vaguely remember how a keyboard is laid out--all I know offhand is things like Enter, "Q", Space, "F", and "A". If I ever need to remember which keys are where, I put my hands on the table and pretend to type.
Reminds me of some C code I was told to port to Linux (if I remember correctly) some years back (probably in 2003). The code was from 1981 (it said so in the comment header).
Apart from the fact that the code was almost as old as I am (born in 1976), it turned out that the security check the code was supposed to perform didn't actually execute.
Yepp, that's right, the code had been running for 20+ years doing nothing.
A friend of mine ended up doing the actual work so I don't know how the story ended but it's a good reminder to never assume _anything_ when you look for bugs.
This reminds me of an interesting event that occurred in my C++ class. During one of the assignments I looked at a peer's code. I became convinced that the code was wrong. His code looked a little like this:
Another student saw the teacher pass him and copied his implementation. His code didn't work. It gave the stack overflow error that I was expecting.
It turned out that the reason the guy had his code work in the first place was because it was never included in the program. He had put the code in his C++ file, but forgot to include it in his header. When the program compiled, it never knew his code existed.
I was teaching one of my good friends C way back in undergrad (1999). After explaining loops to him, among other things...I asked him to print something in a loop. And this is what he wrote:
For(i=1 to 10)
{then print i;
}
Today my friend is working as a techie...a senior manager at his new company!
Ok kadbid, I am intrigued. What's the story? Why didn't you make the transition to other languages? Why didn't you make the transition into management? I'm not questioning your decisions, just curious.
Who says I only know C? I've been learning and using new languages for over 30 years; I won't bore you with a list. (Smalltalk is one of my favorites, but I'll never ship a product in it. But I know a lot about dynamic language implementation).
C just happens to be the right choice for most of my projects today (I half seriously proposed FORTH for one before my cow-orkers threatened me with a dunking).
Let's make a distinction between Management as an HR function (writing reviews, dealing with org issues, fighting for equipment and office supplies) and Senior Geek Nirvana (design work, heavy lifting in code and debugging / shipping a product). I tried Management for a while, and it stunk.
Right now I'm in a great position as a senior technical guy (I hesitate to say "Architect" because most of the people I know with that title are self-appointed blowhards) who gets to decide how to do stuff. I lean on managers ("we need some chairs, and an oscilloscope") and project management (for setting up meetings, tracking bugs, etc.), and I get to write a ton of docs and code. It's terrific fun.
I'm living proof that you can be writing in C and not be a sour, dried-up dinosaur in the back corner of a lab somewhere.
For many or most programmers, the "transition into management" is a major career change from something they love and are good at to something they don't enjoy and aren't very good at. However, we're all assumed to be following that track because we're supposed to hunger after money and prestige in the eyes of society at large. This morning I interviewed a woman for a technical position (same job as me) who got her master's degree when I was in third grade. What's wrong with that?
I actually find it funny that the question is whether we know C. A few years ago, the question would have been whether people know assembler. That's how times change.
Not sure if that's a good thing. I wouldn't say the same for a comment on another site (different things drive different people), but I assume you have some aspirations to make it and be your own boss.
Yes, logically, "Not sure it's a good thing," means "It is either a good thing or it is not a good thing." You know, X or not X, and so your ass-covering comment is logically defensible, but come on, you wrote it for a reason. Just say, "Good point," and move on.
"Not sure" in this context was meant demonstrate I'm open to both variants of the situation, good and bad. Since OP had upvotes already indicating that people find it good/funny/whatever, the point was to bring other side of this to light. There's a saying that fits your comment so well - don't tell me what to do and I won't tell you where to go.
Are we turning into an anti-dissent crowd? Look at the downvotes - the same number, and why? Because somebody had a different opinion? Because it's so easy to downvote here and not let out your identity or have to explain why you did it?
Tolerance of, nay, encouragement of dissent (not insubstantiated rants) is the hallmark of a healthy community.
Btw, if you choose to downvote this, please explain why and what I am missing. Thank you.
His manager doesn't get to program all day though :P As long as it's a company in which compensation isn't determined arbitrarily by position in the hierarchy, it's a great thing.
As someone who has been a "contractor who ignores all the political bullshit", my personal observation is that the contractors who are actually good at navigating the political bullshit (win friends and influence people and all that jazz) tend to be the ones who get top dollar and have continuous employment, whereas the contractors who are merely extraordinarily good at programming tend to have a lot of downtime in between short engagements.
Some people think that if you have social skills you can't be a 'real' hacker, but often sales skills and good presentation are a force multiplier for a contractor with good technical skills.
Sure, no doubt. You'll never hear me speak against having sales skills or any other kind of skill. I'm for having as many skills as possible. There are no "mutually exclusive" skills, so people who think gaining sales skill causes hacker skill to magically disappear are wrong. And there are those -- also wrong -- who think having programming skill automatically means you can't be good at communication.
As a first reaction I thought, well, with more than 50% of HNers fluent in C, the average age is probably pretty high. But then I realized that the average HN age is probably around 25 (from other polls I saw here), so I guess there are still a lot of fresh programmers learning C. I really hope so. I love higher-level programming languages and spent a lot of time with Tcl, Scheme, FORTH, Joy, Ruby, and other languages, but I really hope that the next generations of programmers will still be able to stay near the metal when needed.
I'm 30, and from Googling you seem to be 3 years older than me, but I don't think the use case for learning C has changed all that much over the course of my time as a programmer.
Already around '95 or so Perl was widely used, GUIs could be done in things like Tcl/Tk, basic apps were starting to move onto the web, desktop stuff was mostly in C++.
You learned C when you wanted to work on things like kernels or databases or high-performance daemons and whatnot. That's still mostly the case. Java has eaten into that a little bit, but that's not even a new phenomenon.
The things that people used C for 15 years ago are still the same things that it's used for today, and similarly, you didn't have to write stuff in C 15 years ago if you just wanted to bang out some little utility or "CGI script". I expect things to stay that way until there's some language which could reasonably replace C in its niche. Thus far, one has not emerged.
I don't know if you remember, but in 1995, Serious Software was still Not Written in "Scripting Languages". Things like SATAN and HotJava were avant-garde exceptions, and Emacs Lisp was a holdover from another era (and we often used vi instead because it was noticeably faster).
Today, I regularly use an email client written mostly in JS, message boards written entirely in Python, a chat client written in Python, a high-performance file transfer program written in Python, a music player written in C#, system monitoring tools (iotop, htop, dstat) written in Python, a version-control system written in Python, and so on. In 1995, I used an email client written in C, a newsreader written in C talking to a netnews server written in C (although early netnews servers had been partly written in sh, they were rewritten in C for performance), an IRC client written in C, file-transfer programs written in C, music players implemented in hardware, "top" written in C, version-control systems written in C (although, again, the first version of CVS was a shell script), and so on.
I don't think it's true that "basic apps were starting to move onto the web" in 1995. Web sites that were really programs with HTML user interfaces didn't really start to take off until around 1997, by my recollection. I joined eBay's AuctionWeb in 1996, but it was quite unusual (and it was promptly rewritten in, I assume, C++, as eBayISAPI.dll).
GUIs could be done in Tk, or for that matter PowerBuilder or Visual Basic, but that was thought to be "for amateurs". It became technically possible to do this kind of stuff in high-level languages then, but it took a few years for people to notice that.
Tcl was driving oil extraction plants in 1995...just an example. There were also big Perl programs in production. The acceptance is certainly higher now, but it was not impossible to have sensible code written in scripting languages back then.
It wasn't impossible in 1985, either, when GNU Emacs came out; but the now-common AlternateSoftAndHardLayers approach won Emacs a reputation for being a slow memory hog.
My point, though, is that much of C's 1995 or 1996 niche is now occupied by "scripting languages", and in 1995 or 1996, that niche was occupied by C.
Very good point. I'm 34, and the average HNer is probably ten years younger than me, but it is pretty shocking how true what you said is. No big shift in the last 10-15 years; there were already good alternatives for all the higher-level stuff. Thanks for opening my eyes about that!
Nit-picking time: there's probably a difference between the average age of an HNer and the average age of an HNer that knows C. Conditional probabilities and all that.
That being said, I'm an "average age" HNer that has worked with (and loved working with) C. I'm looking for excuses to work with it more, although it doesn't really intersect with my type of work (statistics).
Maybe I should contribute to Redis or other OSS...any suggestions?
I think that CLIPS (a tool for building expert systems) needs an update, including a shift to a more Redis-like interface and philosophy. The core of CLIPS is still pretty nice, but the project seems to have grown well beyond its natural borders.
I learned C in college classes, but never really learned it until after a "WTF PHP" moment, so I could dig into the interpreter source to figure out what was going on. Later experience with Python's interpreter and debugging pyrex reinforced my knowledge.
I mention this just to say why I think "modern" very-high-level language hackers still end up learning C; the base layer is all written in it, and eventually you've either got to fix something or write something fast.
I find it more than a little disturbing that the first time you really used C was to figure out the PHP interpreter. It's among the worst C code I've ever seen (aside from code I've seen on dailywtf). After I had to debug something in the PHP interpreter, I refused to have anything to do with PHP ever again.
Anyway I hope for your sake looking at the Python interpreter did more than reinforce your knowledge. It's no stroke of genius, but it's much better written than PHP.
Well, it seems the old stigma against multi-language systems is fading. Even 5 years ago, it was more or less "the rule" that you chose one language for a project and you had to live with whatever trade-offs that choice left you with. Now, people are more willing to accept that you can write much of your system in a language designed for programmer convenience and only write the bottlenecks in a more performance-oriented language.
This is true, but for the right use cases, higher-level languages may not actually be the less time-expensive pick: if the use case calls for C, you'll end up spending a lot of time on optimizing performance, low-level OS access, memory usage, and so on regardless of the language.
Web applications with rich client-side interfaces might be the reason for the change in thinking. In a web application, you have JavaScript to deal with anyway; and until a few years ago, the server-side code was written in some other language.
I'm 25 and consider myself very well-versed in C (though I still have to consult the C99 spec from time to time). When I was learning to program modern computers around the turn of the century, it was the only "serious" language for writing desktop apps (I had recently "graduated" from VB4). I then refined my skills over the course of my electrical & computer engineering degree and a short stint in the EE industry, where it's very much alive as THE language for microcontrollers (and last I heard, a variant called SystemC was (sadly) replacing Verilog and VHDL for use in ASIC/FPGA design).
FPGA designer here. I don't think it's accurate to claim that SystemC is replacing VHDL and Verilog for ASIC/FPGA design. I view SystemC as a niche tool to allow software engineers to dabble in hardware design. I use C for embedded software, but would never dream of designing hardware in anything else except pure HDL. It's a bit like trying to write a letter in a foreign language using Google Translate-- the results are non-optimal and often comical.
While trying to learn HDL I felt like Verilog was more natural than VHDL. I didn't bother trying to learn SystemC or anything; organizing a project on Quartus or ISE is hard enough as it is with Verilog.
I need to go back to dabbling with my little Spartan3E board though.
SystemC was about (co)simulation--for building models of hardware. Although there were some tools for converting SystemC into HDLs, I don't think they ever really took off.
Most people are not learning that anymore. I see most universities making those courses elective instead of mandatory. When I talk to 'young programmers' it always scares me that most don't even know how a computer works at a low level. Which is reflected by the weird things they try to do with Java etc.
I'm 29. I started programming C when I was 12 or 13 and I really think that C is probably the best language to learn to program with. In fact I made that recommendation to a friend who recently took his first programming class using Processing and he thought I was crazy. I think it's important to understand memory management, pointers, pointer arithmetic and such.
My advisor (I'm a senior in college now) does pretty much everything in C. I've written a compiler, large parts of a couple distributed systems, and various Linux kernel hacks. C certainly isn't my favorite language, but I like it a lot. Well-written C is pretty beautiful in its own way.
I'm 23, self-taught (though not asking for pats on the back), and fairly proficient in QBASIC (hahaha), Haskell, Python, JS, Prolog, and I am learning Scheme and Erlang, but there's something mythical about programming in C, and I try to read C (linux kernel, etc) regularly.
It seems like this could be either the second or third option, depending on how well you still know it. Perhaps the second if you still know it well and the third if you've gotten really rusty in it.
I can add the option if this doesn't make sense in your case.
The problem is that you are attempting to measure two different things with the same question: depth of knowledge and frequency of usage. It's possible that someone knows C quite well, but no longer (or never) uses it.
It's similar to offering 3 options for Marital Status: Single, Married or Divorced. It's impossible to interpret the results in any meaningful way (Divorced and single? Divorced and remarried? Is it important that I've ever been divorced? Or do you just want to know if I'm currently married or not?). It's better to separate these questions.
I was more trying to provide a 1-5 type scale with some guidelines on how to vote. The sentences in the options should be ORed together more so than ANDed.
What if I added a phrase to option two: "Or I learned C extensively at one point, remember most of it, but don't use it regularly anymore" and to option three: "Or I learned C extensively at one point, but have forgotten most of it and don't use it anymore?"
Sorry, I should have added that it seems obvious the emphasis is on depth of knowledge and the frequency of usage parts are merely there as helpful tips. I personally hate modifying a survey once it's been deployed and suggest you leave it as is.
I know C but none of those categories apply to me. Poll design is a useful skill to learn!
I am not a pollster but by combining different facts into single options you are restricting the information you can gain from your poll or polls.
I taught C to engineering students at a university, I know it thoroughly, I haven't written a line of it in 12 years - the only option I can logically choose is the final option.
As per other comment thread, my intention wasn't to combine facts, more so to provide guidelines. If one of the statements doesn't apply but the overall "feel" of the option is right, I think you can go with that, since after all, the stated purpose of the poll is to see how well HN users know C. For your purposes, I might pick the first option, as it says C is either your main or best-known language. Best-known could be a reasonable statement in your case.
Agreed. Polls like this one don't work well for orthogonal ranges, and here there are at least two: how well you know C (from knowing nothing to being a master) and how often you use it (never [anymore, if you happen to know it well] to it being your primary language).
A third range might be how much you like C, regardless of how good you are at it or how often you use it.
My second ever experience in programming (the first was typing in BASIC on a Soviet trash-80 clone) was typing in example programs from a C book. I did quite a bit of C programming in university. I've contributed C code patches to several Free Software projects. I am currently writing a C compiler.
You would think this would qualify me as someone who, if not enjoys, is at least competent at C.
You would be wrong.
I absolutely despise C, and never would have become a computer programmer if I had to do most of my programming in C.
The horrible syntax, retarded pre-processor, and absolutely brain-damaged memory model (addressing modes? segments? virtual memory? reified stack? fuck that, let's just pretend everything is a pdp-11!) make it into this zombie that can't quite decide whether it's a shitty assembler or a shitty Algol clone.
I don't consider myself a competent C programmer, and I have no desire to become one.
I don't blame Dennis Ritchie for it. He probably thought it was a dumb joke too. I blame all the retards in the 1980s that thought it was the bee's knees and made it a "standard."
You're compiling C to LISP, and you're not even telling us??? This being HN, I have to quote you for everybody's benefit:
"Vacietis aims to be a C99 to Common Lisp compiler that produces efficient programs that easily interoperate with Lisp functions, and in addition to offer the introspection, debugging and type-safety features of Common Lisp to C programmers."
C is the language I use when I just want to sit down and enjoy "resonating with the computer". Python, Java etc feel like I am "working", or "cheating" as the case may be in Python. With C you are conscious of the memory of the computer and the lovely data structures you are forming on top of it and the whole elaborate dance. I hear that Lisp programmers get a similar feeling but I am not close to seeing that yet.
I'm from the "Modern C++" school. I learned C++ without first learning C. I don't consider myself a C programmer. Of course, I can read and write C code, but I would consider it a burden to code without smart pointers, templates and exceptions. How should I answer this poll?
Precisely! It's always a red flag when new graduates have "proficient in C/C++" in their resumes. I've found that these people tend to write code in a combination dialect of C and C++ that I call C+.
There was an old adage that people who say they code in "C/C++" don't really know either one. That was about a decade ago; I can't recall where I saw it, and I don't know if it is still a widely held view.
I said I would consider it a burden to write in pure C, not that I would not be able to do it. I would not need any documentation, but I would scratch my head figuring out how to organize a large program. I would need to write a lot of comments to replace things that do not exist in the language: const-correctness, memory management conventions, type-safe containers, etc.
"How to organize a large program" is a large part of what learning to program in a language means.
BTW, const correctness has been in C since C89, and memory management is usually done by writing constructors and destructors like C++ (except that they're real functions, often named stuff like <prefix>_init and _destroy or <prefix>_new and _free) and calling them manually. It's not as elegant as RAII, but if you write short functions with clear logic, you can usually ensure you cover all code paths. Remember that you don't have exceptions to screw up control flow in C.
For type-safe containers, I just use void* containers with a comment indicating the intended type, but perhaps more experienced C programmers have a better system. Remember that Java didn't have type-safe containers until Java5, and honestly ClassCastException was a fairly rare error.
Const correctness has existed in C since C89. Again, knowing C++, and even being able to constrain yourself to some C-like subset of it, is not the same as knowing C.
Naive C++ programmers tend to use pointers as if they were references. They do not seem to understand pointer arithmetic, pointers to functions, etc. They also do not seem to have a deep understanding of memory models, and tend not to understand memory alignment. They do not use bit-wise operators much, though it's more of a cultural thing than skills issue.
I have never met a competent C++ programmer that does not have at least some exposure to C as well. But I'd dare to guess that such a character would not understand the C preprocessor at any but the most trivial level.
[edit]
I originally used the phrase "C with classes" crowd, in a rather derogatory way, to describe the people who never get to learn either of the languages well, and end up writing some sort of sanitized, structured-programming Algol derivative in (a subset of) C++ syntax.
While reading the rest of the comments, I remembered that "C, with classes" can also apply to C hackers that just use C++ to structure their C-styled code, and may take advantage of a few features of the language. I have edited the comment to clarify what I meant.
> I have made the experience that a good C++ developer also automatically knows everything about C.
Not even close in my experience. In college, we used C++ in our intro course. The addition of pass by reference, classes (with constructors and destructors), and the standard library (e.g. std::string) hides a lot of the common paradigms of C, and that's not even mentioning Boost.
In my experience, people coming from C++ tend to write really bad C. Many of them have a poor understanding of the concept of pointers.
Once you get familiar with some good C design patterns, it's pretty easy to write clear, modular C. Knowing C++ is not a very effective measure of ability to use C.
Au contraire, a modern C++ developer probably knows nothing about C. Good C is like good assembly language: dense and performance-obsessed. No objects, no exceptions, no STL, bare bones and riddled with pointer manipulation and macros.
Well, function pointers exist in C, though they are certainly a clumsy replacement for having methods (or first-class functions / closures / currying) built into the language.
I come from a similar background and chose the third option from the top. I can read it, but writing would require more education, also known as documentation.
That's why C is used in embedded systems. A trick that I like is to write some critical code in C, compile with the -S switch, then optimize the assembler code and inline it.
A lot of my time in C was spent on memory management - when I switched to Perl it all went away. (I had some libraries I'd written to take care of it by then, of course, but in Perl I just didn't have to worry about cleaning up at all!) Oh - and strings! (But that's also a memory management question.) And regexp.
CPAN grew on me later - roughly as CPAN was growing. It was garbage collection that was my deciding factor.
I know C although it's been several years since I've used it. Currently Go is my low-level language of choice (although maybe it's not low-level). It gives you the simplicity of C, but with a few of the higher level features like garbage collection, without the baggage of high level languages.
If you're a C purist (or just a purist in general) then of course you're right, but I personally think that C-with-classes is clearer, easier for a layman to understand (many people who call themselves C programmers still get nervous when function pointers come out to play), and is usually worth the trade-offs. You can write a lot of very decent code this way and have a little bit of "the best of both worlds", and avoid getting into a lot of the admittedly very dark corners and nasty surprises that C++ has to offer...
That's probably a really big one in terms of reliability.
Also, the weird C++ behavior of calling copy constructors in the most unexpected places can have a very detrimental effect on performance.
Personally, I think the nature of C++ class definition leads people down the primrose path of trying to provide all potentially useful methods, of which only a tiny fraction get used. Tons of dead code appears in most C++ projects.
Faster compilation, more flexibility, fewer dependencies in the code base (you don't have to define struct fields in headers), no problems with incompatible ABIs, an interface that can be called directly from almost any other programming language...
It seems to me that it DOES pretty trivially give you polymorphism; just change the function pointers in a struct and it behaves differently, while retaining the same type! Inheritance can be implemented, too, by copying function pointers and other struct contents around.
If you play with languages that use prototype-based OO like Lua and JavaScript, you can see pretty clearly how to implement such things. Implementing your own OO system in Lua is an experience I'd recommend for anyone who uses OO; it's easy, and you'll almost certainly learn something you can apply elsewhere.
I have a lot of respect for C, and every once in a while I open up K&R and go through a few chapters. While I find it really well written, and it holds up even by today's standards for programming books, I feel like the online resources for beginner C are pretty slim.
I think my primary use for knowing some C would be writing C extensions for Ruby or looking at some source code of classic C libraries.
I use C/C++ and assembly every day as a game programmer. It is the incumbent development language for consoles (at least Xbox and PS3; I've never done any dev on the Wii). I suppose its staying power is due to the ease of access to the metal.
It's probably worth noting that lots of games use higher-level scripting languages for speed-insensitive tasks.
My lecturer in my basic introduction to C class once said to me that he had been a professional C programmer for the past 10 years, and still considered himself a C novice.
I think how "knowing C" is defined is key in this question. Yes, I can write my fair share of C, but I still don't think I could class myself as a C programmer.
I'm on the younger end of the age scale (23), and I would have to agree that fewer people are learning C as a matter of course these days. Across the whole programmer population the percentage would be very low, but the hacker population is almost defined by its use of C. Many notable and large hacker projects are based on C: Mozilla, the Linux kernel, the GNU tools, etc. I think this is in part because the community has such a strong legacy and is driven to keep building and extending what has come before, but also because of the higher average programming ability in the hacker community compared to the general programming population.
From my perspective, using Linux is one reason I know some C, but the greater reason is that my formal education included embedded C programming with microcontrollers. Programmers going through pure software engineering courses seem to focus much more on computer science and web development concepts than on systems design and programming. That said, I'm sure my experience is not typical, and other universities probably use C in some part of their courses.
In terms of actual programming, I haven't yet found a project which required the use of C, and in the cases where I have used it I've often stumbled more over the bulky IDE and build configurations than over the language itself. My particular trouble is around external references and including code not in the standard library, and then the further dependency hell.
Searching around, I haven't found a C tutorial that covers this sort of thing well; most tutorials focus on the language itself (which I don't really have an issue with). Does anyone know any good resources on how to set up an optimal and easy-to-manage C development environment?
C is the only language that I use at work (along with some basic scripting), since my work is mainly firmware, device drivers, etc. Honestly, at times I do feel that embedded engineers are the odd ones out here at HN.
I've known some C for a long time, but only recently dived really deep into it; it was Apple and Objective-C that brought back my interest in the language. I'm writing some forensics and data-restore tools as a project to learn the language. Progress is slow, but I'm loving it.
One thing I've noticed is how focused you have to be when working with C versus, for example, Python/PHP: everything is very error-prone, mostly due to manual memory management. You also need good visualization and some basic math skills.
Polls != me, ever. I chose the second answer because it is the closest, but....
I haven't really used C in years, but I can start programming in C anytime and quickly and cleanly get operative, robust code (I'm a freak about error handling and knowing where the exceptions lie).
These days I dabble in Javascript, higher order perl, and am trying to grok Haskell, but I think in C. To my simultaneous benefit and detriment.
The vast bulk of code I was paid to write was /bin/sh - thousands of lines of clean, robust, cross-platform installation and configuration routines, with pipes to awk and sed as required. But I thought in C.
Other than deliberately obfuscated code and kernel programming, I've never met a C program I couldn't grok.
But I don't use C, haven't in years, and don't know it thoroughly. I am not a C guru, nor even close to being a wizard. But I would call C the lingua franca of the computer world, the cleanest, most elegant tool out there.
C wasn't the first language I learned, but it was the first that blew my mind. After years of hacking on various BASICs, then FORTRAN, and Pascal, I encountered C (I have a BSc in Physics; I didn't hit C until after I decided to experiment with CS for a while).
After years of forcing my thinking into the little boxes these other languages insisted on living within, BAM, here was a language that allowed me to escape any box, build better boxes, nest boxes, hide boxes, thermonuclearly destroy anything within 6 gigs of my box, etc., etc., etc.
The sad side note: Scoping was never really an issue with the other languages, they had pretty hard and fast rules. Scoping wasn't all that much different with C, but all of a sudden it became really, really important to understand it, to be able to unwind pointers mentally and really appreciate what was going on.
So for me, C scope was computer scope, all scope was C scope. One of the biggest mental hurdles for me in learning higher order perl was abandoning C scope and automatic dereferencing without global variables.
It took me a while to get my head around how the difference in scope management made class and object variables and memoization so much easier in perl - and in many other languages - than in C.
So I think in C. And sometimes that's good, but other times it is a handicap.
I agree with you Peter. I came to C after a decade using Pascal (assembly, "systems" languages before that). It was like going from programming in a straitjacket to running shorts and a T shirt. The freedom was wonderful. And programming in C is like learning to ride a bicycle. You never forget. My first love.
I selected 'none of these options apply to me.' I worked with C for many years (pidgin was written in C) and did some C at my first, second, and third jobs. However, I haven't had a need to write anything in C in about 7 years so I am more than rusty at it.
I didn't feel like "I learned some C" applied. That could be my awesome sentence parsing skills at work, though. :P
I wonder if I could get back into the swing of things easily. I should give it a shot.
I answered with the first option, even though I haven't had to use base C in a long long time. But I do know it inside and out. It's a pretty easy language. Some days I miss it.
I'm about to get out of college, and at my new job I'm using mostly C. Well-written C is surprisingly fun to work with (although you could probably say that about anything).
C was my first real programming language. Loved it. I just don't use it much anymore. Most of what I do is either web (Ruby/Rails, PHP, etc) or mobile (Objective-C).
During my college days I used lots of C for different projects, but it's been a few years since I've touched C. I think I've forgotten pointers and such, and need to review.
I have extensive knowledge of C. I have been using it since 1987. Our company's youngsters have to get up to speed quickly, as our simulations are C/C++, the datasets are massive, and the code must be very fast. Of course, they were taught Java in college and find pointers difficult. They quickly come to appreciate the speed difference.
C syntax lives on in many languages like Java and JavaScript. It was the second language that I learned, behind Java, back in 1998. I still think it's a great language because you have so much control over the important things like memory management and pointers; how priceless (albeit dangerous) is that? String manipulation was a real detail that I could live without, and exception handling was non-existent (don't go there with goto labels), but overall it was a clean language. You could optimize the hell out of things like byte boundaries for structures; you just felt like you had full control. It's the closest thing I had to assembly without the syntax being too cryptic. Those were the good old days, and I think it was an important foundation for me in terms of learning how to be anal as hell with source code.
Am using C a lot, from writing device drivers and hacking network protocols to userspace stuff. Yet I can't say I know it thoroughly: there are enough little quirks in the specs.
It is a great little language for sure, still destined to outlive some of the languages that are hip today.
I learned draft ANSI C++ before C. Actually, there wasn't that much to learn outside of the definition of NULL and where you could put variable declarations.
The only time I use it these days is when I go into random open source projects to fix bugs or occasionally add a feature.
Unfortunately, those years were mostly in the 80's and I've barely touched it since, so I really have no idea if I can still call myself a C-programmer. I used to be good at it though. At least that's how I remember it...
Yes, I first encountered it from an old issue of DDJ (Small C) while I was studying electrical engineering at UNSW back in the 80s. Then I had to understand it further while studying the UNIX source code in John Lions' OS lectures.
Nope, the first edition gives the nice approach, and if you want anything with more depth than the reference manual, you might as well go and get a copy of, e.g., the C99 standard itself.
I learned C for the first time in 1986 and still use it all the time, as my day-to-day favorite language is Objective-C (and I've also really started to develop a fondness for Ruby; that's a neat language).
My main language is Java these days, but I use lots of other languages as well (e.g. Perl, Python, R, sh, MATLAB, ...). However, C remains the language that I know most thoroughly, and I've used it since 1989.
I learned C as my second language after FORTRAN. As it turns out, some of the strangest C code I've ever had to maintain was a system that had been converted from FORTRAN to C using an automated tool.
My first full-time software job was writing C for Psion Workabout terminals. They had 2MB of RAM and their own custom plib for system access that ran more efficiently than using clib functions.
I'm not a programmer -- but I was able to learn some HyperTalk back in the day: I really wish they'd make some heavy duty programming language for non-programmers...
Of the FOSS projects I maintain, all those intended for use by others are written in C. It is in my opinion still the best language for providing reusable code.
I used to have the C operator precedence table memorized, as I taught C classes weekly, but I never use it these days. Your "know it"/"use it" options are conflated.
I'm hoping that I never have to write C/C++ or use vim/emacs extensively during my programming career. I'm perfectly content to leave that work to someone else.
Unfortunately not that much since academia, apart from a few bits and pieces. I'm 30, and I still do a lot with C++ and ObjC, though, rather than higher-level languages.
I like the K&R way better.