That's definitely strange. I studied at a university in the Netherlands that implemented what I think was a standard European curriculum, and, if I remember correctly, what this article mentions is a subset of the first two years of the undergraduate degree.
Some elective courses are missing, as is AI, though it is mentioned. We also used many of the books mentioned here. Back when I did it there were also more separate maths courses that we shared with the maths undergrads; I know they stopped doing that after I graduated.
Undergraduate studies in the US are, in general, very different from those in Europe.
I did my undergrad (EECS) at Imperial and MIT (for my final year), and the difference between the two was pretty enormous. My home department expected me to take five graduate classes (plus some undergraduate classes for "light relief") over the course of my year at MIT, something the other undergraduates there thought was very unusual.
In general, a US undergrad is much broader than a European one and doesn't go into as much depth, even though a US bachelor's is a year longer. That's not necessarily a bad thing; it just depends on what you consider the point of an undergraduate education to be. I'd say that most European universities try to structure degree programs that can funnel you straight into research or industry without a sudden step up (the step up from undergrad to graduate studies in the US is pretty huge), whereas the US thinks it's more important to develop yourself across a broader range of areas rather than see the degree solely as a means to an end.
They both have strengths and weaknesses (having experienced both) and I don't think either can be said to be better.
I feel like a lot of my coursework was overstuffed with learning and re-learning waterfall vs agile, and fluff subjects like 'user-centred design' and 'professional computing practice'.
The latter was basically 'how to make a résumé' and 'how to not be an arsehole in email'.
That sounds like it might have been a professional education instead of a scientific one. Nothing wrong with that, but you could get a full computer science degree from our university and not be able to code for shit, so you definitely can't hire university grads blindly. You often see developers act tough about how knowing how a CPU works is important for being a developer, and it is, but knowing that doesn't make you a good developer. It just makes a good developer better.