In a medieval guild, to be admitted as a master, an apprentice had to create a chef-d'œuvre, or masterpiece, which is where the term comes from.
In the computer engineering industry, you increasingly have to demonstrate the same: either as a part of your prior work for hire, or a side project, or a contribution to something open-source.
A diploma is still a useful signal, but not a sufficient one, except maybe for very junior positions straight out of college. These are exactly the positions most under pressure from automation.
I think software developers might be somewhat of an outlier. Industry wants good programmers, but universities teach computer science, which really should be called "computation science". Much of what we learn in university will hardly ever be used, while many practical skills are at best learned as a side effect. Dijkstra famously said that computer science is as much about computers as astronomy is about telescopes.
So degrees have been a weak signal for a long time. Several of the best developers I've worked with had no CS degree at all. As a result we have interview processes that are baffling to people from other industries. Imagine a surgeon having to perform a demonstration surgery in an interview, or an accountant having to solve accounting puzzles. AFAIK we are very unusual in this regard, and I think it's because degrees, and likewise certificates, are such weak indicators in our industry.
> while many practical skills are at best learned as a side effect.
I strongly disagree: that’s the intent, not a side effect.
It’s IMO a common misconception that early algorithm classes are designed just around learning algorithms. Instead, basic algorithms are the simplest vehicle for turning abstract requirements into working code. The overwhelming majority of what students learn is the actual tooling of programming, debugging, etc., while using the training wheels of a problem already broken into bite-sized steps.
Ramping up the complexity is then more about learning tradeoffs and refining those skills than writing an ever more efficient sorting algorithm or whatnot.
That is true, in the sense that the 100/200-level classes cover programming basics in addition to whatever algorithmic theory is being presented. But beyond that, programs really do differ pretty strongly in how much weight they give to applied projects and software engineering practices (basic stuff like source control) versus more theoretical/mathematical concepts. Compiler design is one commonly seen capstone-style class. To a certain extent, a good school will teach you how to learn and give you enough background (class projects, internships, electives with applied options) that you get a well-rounded education and can quickly ramp up in a typical software organization after graduation. But as someone who has hired many new grads over the years, it always surprises me what sort of gaps exist. It's rarely about programming basics, and almost always about "software engineering" as a discipline.
My experience is that graduates of schools focused on the more practical aspects tend to make better junior developers on day one but then stagnate. Meanwhile, graduates of the more theoretical programs pick up those same practical skills on the job, leaving them better prepared for more demanding assignments.
This then feeds into the common preference for CS degrees even if they may not actually be the best fit for the specific role.
Interesting. I did my undergraduate in Germany and my graduate in the US, so my experience might be unusual here and different from what you get in the US. My undergraduate algorithms classes in Germany and my advanced algorithms classes in the US involved zero actual coding. It was all pseudocode as you'd find in the Knuth books or in Cormen, Leiserson and Rivest.
And they were supported because they were useful labor. Even an unskilled, brand-new apprentice could pump the bellows, sweep the forge, haul wood and water, deliver messages. If it frees up the master to produce more valuable output, that’s a win-win. Then they can grow into increasingly valuable tasks as they gain awareness and skill.
IMO one of the big problems is that we’ve gone too far with the assumption that learners can’t be valuable until after they’re done learning. Partly a cultural shift around the role of children and partly the reality that knowledge work doesn’t require much unskilled labor compared to physical industries.
I was somewhat aware that in the medieval period most started out as apprentices in their mid-teens. Essentially work slaves in the house of a master. Then, after a decade or so of toiling and gaining skill, they would go on to become individual business owners.
But I wasn’t aware of the masterpiece part. Thank you for sharing that!
By their early-to-mid 20s they would be nearing master level in skill, having faced the real world for 7-8 years and learned how it works in terms of money, dealing with customers, and so on.
Compare that with today: by their early 20s, one is only just getting out of undergrad, about to start real-world job training.
Yeah, they are different domains. I don't mind options for those who want to pursue an academic approach rather than a practical one. But for most fields we just don't have that choice anymore. Want hands-on experience? Gotta be recruited out of academia first.
More reason to vie for labor protections. If employers realize they can't just rotate people out every 6-20 months, they may actually go back to fostering talent instead of treating academia like a cattle farm.