I'm surprised at the mention of no compiler. We worked with a microcode simulator for low-level CPU examples, partially implemented a pre-written compiler as part of our Computer Science course, and likewise wrote a DNS server (these are just the parts I remember finding challenging as an undergrad).
We also learnt seemingly useless things with databases, like aligning writes with HDD sector boundaries for performance (which I, and others, rolled our eyes at at the time, since we knew we'd never need it - not because of SSDs, but because how many people ever write a database?).
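For anyone curious, here's a minimal sketch of what that sector-alignment exercise looks like in practice (my own illustration, not from the course): on Linux, O_DIRECT writes bypass the page cache, so the buffer address, offset, and length all have to match the device's logical sector size - 512 bytes is just an assumption here.

    /* Sketch: sector-aligned write with O_DIRECT (assumes a 512-byte logical sector). */
    #define _GNU_SOURCE
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>

    #define SECTOR_SIZE 512  /* assumed logical sector size */

    int main(void) {
        void *buf;
        /* posix_memalign returns a buffer whose address is sector-aligned */
        if (posix_memalign(&buf, SECTOR_SIZE, SECTOR_SIZE) != 0) {
            perror("posix_memalign");
            return 1;
        }
        memset(buf, 'x', SECTOR_SIZE);

        /* O_DIRECT rejects misaligned buffers/lengths with EINVAL */
        int fd = open("datafile", O_WRONLY | O_CREAT | O_DIRECT, 0644);
        if (fd < 0) {
            perror("open");
            return 1;
        }
        if (write(fd, buf, SECTOR_SIZE) < 0)  /* length is one whole sector */
            perror("write");

        close(fd);
        free(buf);
        return 0;
    }

The point of the exercise was that a database engine managing its own buffering wants writes to line up with the physical sectors instead of fighting the kernel's cache.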
This was a 3-year course at a "red brick" university in the UK, ~1997-2000, but by no means one of the best for technology - e.g. the head of the department, and by extension those under him, refused to teach design patterns (or enterprise patterns).
When I asked two professors why, and laid out (what I thought was) a factual grounding, I was told it was because patterns are for Java or C++ and are language-specific (which, as you probably know, is BS - only the implementations are, or the patterns that work around a language deficit). I later learned they took this as personal criticism rather than criticism of the course.
Offtopic:
I'm still salty 25 years later about being given bad grades for things like that (i.e. firsts and 2:1 grades for some coursework, and thirds, passes, and fails in others - usually the ones where I happened to argue, even though marking was meant to be anonymous).
I now have an illustrious career in IT, open source and competitive coding (having won my fair share).
Tl;dr: I learnt not to argue with the people who grade you in a polarised institution until you have the qualification. I wish I could have told myself that at 19.
We have had a compiler elective before, but it was never required, and currently we don't offer it. I don't think there is a lot of demand from students for it, for better or worse. We also don't require networking fundamentals, like the sliding window protocol, CSMA/CD, how routing works, etc., but we do have an elective that I believe covers these things.
We do require architecture, and still even do Karnaugh maps. I do believe that every CS person should have a fundamental understanding of caches, instruction fetch, decode, MESI, etc., but they probably don't need a semester's worth of architecture. If I had my druthers, I would consolidate a number of separate courses into maybe a 2-semester sequence that would basically be "What every computer scientist should know" and cover the coolest and most seminal topics from different areas of CS.