Yes. One of the biggest complaints that computer science departments used to get from students was that they weren't learning any of the languages employers were using.
I always found this to be a shortsighted complaint. Exposure to languages whose computational models are clear gives you an excellent background for switching to the language du jour and mastering it. Going through e.g. HtDP or CTM makes it easy to transition to Python and write excellent code, whereas traversing the opposite path is going to be tough.
Sure, it's possible to catch up, but we're talking about at least four years of potential experience. That means spending years catching up, which can seriously impact your career.
Not a problem unique to software, either. My sister-in-law is a mechanical engineer. Her first employer was upset that she didn't know anything practical and only knew theory. She had to spend years catching up.
Some technical universities in the EU solve this problem by having full-time practical courses, e.g. in January and June. For example, as a CS freshman you can build a really solid background in functional programming by going through a sequence of SML or Haskell courses in autumn and spring, and spend January and June learning trendy technologies in depth. It's the best of both worlds.
I agree, but those don't need to be the languages you use in all classes. I learned a few assembly and academic languages (MIPS and LISP) in the classes on those two topics, and it was absolutely useful to learn how to think in terms of those languages. The vast majority of our classes were in Java, though, and that's also been the vast majority of what I've done since graduation. (I think my school has since moved to Python as well.)
So I graduated with that really helpful knowledge about why modern languages work the way they do, but also with a lot of practical experience actually using those modern languages.
I don't agree with that. I think Python is a better first language. The better students will get through the program just fine, especially if they really want to learn computer science in depth. And the average ones will at least learn something useful.
Having taken a course based on HtDP as my first serious coding course in college, I found it good at teaching many concepts, but it felt very verbose and was aggressive about hiding features that weren't that difficult to grasp.
I learned programming before entering a degree in this stuff, and I think learning a badly designed language as my first language made me lose a few years. Additionally, if I weren't so driven to learn more, I probably still wouldn't know more than many others, and might never have explored the field of functional languages. If I hadn't started reading SICP and exploring Scheme and others, I wouldn't know half of what I know now about computer programming.
Sticking to only some Algol-family language leaves people with a severely limited perspective on things.
Human languages and programming languages are not comparable. You will need a lot more effort to become fluent in a second human language than in a second programming language. Even if the human language is Esperanto (designed to be really easy for speakers of European languages) and the programming language is C++ (perhaps the one with the most inconsistencies and footguns), the programming language will take far less effort to learn to a high level.
having actually learned esperanto and become a fluent speaker in just half a year, i am not so sure. esperanto grammar is trivially easy compared to any other language (the rules fit on a postcard in regular sized print), but if you'd take any other language, i'd agree.
They're not directly comparable because humans have an inbuilt ability to learn human languages.
The vast majority of people on the planet know more than one human language and know zero computer languages. It's literally the opposite of what you're claiming.
It makes a huge difference whether you have to learn thousands of new words, irregular grammar, and (after learning thousands of concepts in your first language) a few hundred new concepts, or a computer-understandable language that has maybe, if it is very inelegant, 100 keywords and 100 concepts, most of which you will probably not use often.
Compared to these numbers, the fact that something is a natural language has very little influence on the outcome. It is the sheer effort needed to learn a natural language that makes the difference.
Learning to program may be hard, but learning programming languages is relatively easy. In other words, your second programming language is so much easier to learn than your first was.
I regularly use several programming languages, and tend to pick up a new one every year. I've been spending the last six months studying my second spoken language; I promise you human languages are much harder to learn.
They’re not directly comparable because programming languages are much simpler and much easier to learn. Becoming a good programmer is hard, but that’s not due to difficulty of learning a programming language. Once you’re a good programmer, new languages are easy. A decent programmer should be able to do something useful in a new language in a week or less, and be reasonably competent in a month or two. See how long it takes anyone to learn a human language to that level.
> Once you’re a good programmer, new languages are easy.
Not always. Languages can differ radically. If the new language uses concepts you've never encountered before, you're going to need to do the work of learning those new concepts.
An example I've used before: 20 years writing C code for embedded systems won't give you any insight into Haskell's applicatives or monads.
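For readers who haven't met these concepts, here is a minimal Python sketch (not Haskell; bind, parse_int, and recip are invented for the example) of the kind of pattern Haskell's Maybe monad captures: chaining computations that may fail, without writing the error checks by hand.

    # bind: apply f to value, unless a previous step already failed (None).
    def bind(value, f):
        return None if value is None else f(value)

    def parse_int(s):
        try:
            return int(s)
        except ValueError:
            return None  # failure is an ordinary value, not an exception

    def recip(n):
        return None if n == 0 else 1 / n

    print(bind(bind("4", parse_int), recip))     # 0.25
    print(bind(bind("oops", parse_int), recip))  # None: short-circuited

Nothing in embedded C nudges you toward structuring code this way, which is the point.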
I bet that C programmer will still learn Haskell way faster than someone with no programming experience, and way faster than just about anyone will learn a human language.
You also have an inbuilt ability to learn a computer language. What even is an inbuilt ability?
Programming languages are something you read, write, and execute. You can learn many of them, and their definitions are precise and limited. It's very easy to pick up a programming language and use it in a relatively short amount of time.
Human languages are absolutely different. You can't easily pick them up, and they carry cultural context, regional variations, and a lot of ambiguity and history. Definitions of those languages tend not to be complete or prescriptive, but descriptive and evolving. The languages are written, spoken, read, and listened to. The variation in all of those is immense.
Do you acknowledge any of this, or will you double down on the most absurd of points?
Only knowing the language you were required to know to pass a bunch of tests when you were 18 is a negative signal.
If someone is interviewing and they only have Java listed, and their school is known for teaching Java in its introductory classes, they're probably not that strong a programmer.
programmers, and good ones imo, are almost always polyglots on some level, and i tend to think they have a better than average ability to pick up even natural languages.
programming languages have a small, manageable and finite set of vocabulary, idioms, and constructs that most languages share but express differently depending on their intended use. a programmer fluent in programming will be able to pick up most languages. how those pieces are cobbled together to form more complicated abstractions becomes the skill obv.
that does not mean they'll be an expert right away, but it does mean they are usually competent enough at minimum to dive in and work with it just like any other tool -- they know they'll need a screwdriver, maybe a hammer, so they look up what it looks like and how it is used.
my daily drivers are python, cmake/Makefiles, c++, and c, with a sprinkling of bash, powershell.
i've worked with microsoft stacks C#/SQL, JavaScript, and i've written a ton of Lua. i've studied concepts and swe fundamentals in languages i don't really write code in, and transcribed them into the languages i do write. i learned mostly using Lua first, then i picked up c++.
these are just the tools of my job overall. my main skill is communication and learning imo, and knowing which tools are better suited for a task at hand depending on requirements and limitations (mine or technical or both).
To be fair, if you learn computer science well enough to thoroughly understand Scheme, I don't think it'll take more than a few weeks during the summer to learn Python.
I disagree. You can learn the language itself pretty quickly. Finding your way through the expansive standard library will take longer. Getting a good handle on the package ecosystem is a lifetime learning project.
Knowing your way around the ecosystem of one programming language does not build up the intuition necessary for identifying O(n²) (or worse!) algorithms and choosing/writing O(log(n)) (or better!) ones instead.
Computer Science has little to do with science, but what it teaches you is certainly closer to science than just building a huge mental index for a bunch of work done by other people.
There's certainly value in that skill, but it has no place in a Computer Science curriculum.
This would be like taking Astrophysics students and telling them to study the details of all of the different kinds of telescopes they can buy.
That's not really what we were talking about here. The context was that compsci students want to learn marketable skills, and the claim was that if the student had learned Scheme in class, they could quickly pick up Python. And that's true, for the language itself. But knowing only the language, like never importing code you hadn't written yourself, doesn't go far along the road to marketability.
I'm not arguing that compsci should be job training, not at all. My disagreement is solely with this specific claim.
But FWIW, while I understand your analogy, an astrophysics department that didn't tell the students that there are these things called telescopes, and here's why you might use one over the other for various situations, and that they're how you're going to get the observation data you'll test your theories against, would be doing a disservice.
> Knowing your way around the ecosystem of one programming language does not build up the intuition necessary for identifying O(n²) (or worse!) algorithms and choosing/writing O(log(n)) (or better!) ones instead.
I'll disagree with this, at least in terms of Scheme versus Python.
Python is visually close enough to other languages that the skills you develop to quickly see O(n²) algorithms easily transfer to many other languages. Scheme is very different visually, and so the intuition doesn't transfer as well. Sure, it's possible your intuition is wrong, but when scanning a program intuition can help in the first pass.
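As a minimal illustration (function names invented for the example), this is the kind of quadratic shape that's easy to spot at a glance in Python, and it looks almost the same in those other languages:

    # Quadratic: `x in ys` rescans the whole list for every element of xs.
    def common_items_quadratic(xs, ys):
        return [x for x in xs if x in ys]

    # Roughly linear: build a set once, then use O(1) average-case lookups.
    def common_items_linear(xs, ys):
        ys_set = set(ys)
        return [x for x in xs if x in ys_set]

The same algorithms in Scheme are the same algorithms, but the nested-scan shape is harder to see through the parentheses if your eyes are trained on C-like syntax.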
Scheme has (trace); at least, most interpreters (and those of its cousin Common Lisp) have tracing and pretty-printing features far more powerful than anything Python could offer.
Oh, and of course it has functions like sdraw or draw-cons-tree, with which you can print the contents of a list in seconds as an ASCII-art chart.
Python has the pprint module, which takes care of this for you, and for more datatypes than are done here. (I don't see how this would handle a hashmap in a sensible way.)
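A quick sketch, using nothing beyond the standard library:

    from pprint import pprint

    # Nested lists plus a dict (hashmap) in a single structure:
    tree = ["a", ["b", ["c", "d"]], {"weight": 3, "tags": ["x", "y"]}]
    pprint(tree, width=30)  # wraps and indents nested containers

It isn't an ASCII-art cons-cell diagram, but it stays readable as structures nest, dicts included.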
But I'm not sure how this addresses what I was saying, which is that the intuitions about algorithms you get working on Python are easier to transfer to popular languages like C++, Java, Javascript, Rust, etc..
This depth you are never going to get in a college education anyway. Especially not since programming isn't (and shouldn't be) the only thing you learn in a Software Engineering / Computer Science bachelor's degree.
One of the big shifts in academia over the past couple decades is that, for any number of reasons, students today are less likely to self-study or tinker outside of classes and internships. The increased prevalence of basic bootcamp-style classes like "Let's Build a Rails App" in CS programs is because departments can no longer assume that students will explore things like that in their spare time.
What good does that do, though? Make it harder to tell the intrinsically motivated students from the “I’m just here to get a job when I graduate” crowd? It seems like it harms the former.
Is that what we need from universities? Is that helping employers? Helping strong or intermediate students?
It's what universities have become. They are expensive, grandiose trade schools operating out of very distinguished-looking Collegiate Gothic buildings.
I think someone who learns computer science well from the theoretical side can pick up a new language quickly. After all, if it's Turing complete, it's Turing complete; they're all the same at that level.
Problems creep in when the person doesn't learn CS well, chooses an approach that is deeply reliant on overly complex/opaque libraries without good documentation, or the like.
When I went to university in the 90's, Scheme was used in a couple of early courses. I had already learned C and Java on my own, but greatly enjoyed how Scheme made me think about problems differently. (I didn't learn Python until much later when I was out of school.)
As long as students are exposed to multiple languages, I think starting w/Python is fine. Every language has its issues.
> I'm not so sure someone who's good with Scheme will be bad with Python or Java.
They won't be. But students don't understand that, they want to learn marketable skills and are 18 years old. They haven't figured out that if your skills transcend language choice you will be more marketable even if it means you have to spend a few weeks learning a new language for a new job. They lack maturity, which isn't surprising given their age and experience, and so they complain.
I always found Scheme to be a clean slate where no one's high school experience benefited them in the first year.
My program did Java for the second course; it was very popular in industry, and I loathed it.
I do think Python is not so bad to standardize on: it's stood the test of time, it's one of the most popular ways to write code across many disciplines, and it has applications outside of computer science, so it helps everyone who takes the CS requirement even if they aren't a CS major.
What I'm saying is that Python can even serve as a better version of spreadsheets for folks who aren't in CS.
Indeed. The problem with being exposed to something so much more clean/elegant/powerful than the languages employers are using is that you no longer want to do it. It's like having tasted good whisky: you no longer think Jack's acceptable.
These days, employers more or less get what they wanted. We're doomed.
That's kind of a useless argument. Universities often don't teach people any web development either. In many cases a graduating student has never worked on any real project. If the university's idea is that the student can learn those things in their free time, then surely asking someone to learn a little bit of Python or another language is not too big an ask either. So which one is it? Learn that stuff at home on your own time, or the university should teach it because it's needed on the job? Then what other things are they not teaching that are used on the job?