> If I were interviewing a candidate now, the first things I'd ask them to explain would be the fundamentals of how the Model Context Protocol works and how to build an agent.
Um, what? This is the sort of knowledge you definitely do not need in your back pocket. It’s literally the perfect kind of question for AI to answer. Also this is such a moving target that I suspect most hiring processes change at a slower pace.
I would just give some sort of LLM-esque answer that sounds correct but is very wrong, and hope I would get the opportunity to follow that up with: "Oh, I must have hallucinated that, can you give me a better prompt?"
Having worked with and around a lot of different people and places in IT as an instructor, frankly the funniest thing I've observed is that everyone in IT believes there is some baseline concept or acronym that "ought to be obvious and well known to EVERYONE."
And it never is. There's just about nothing that fits this criterion.
Sure, but if you've never even heard the term hash table, and you don't know that Dictionary<K,V> or your language's "associative array" or whatever uses hash tables under the hood... you're not a programmer in my mind.
It's such a foundational thing in all of modern programming that I just can't imagine someone being "skilled" and not knowing even this much. Even scripting and interpreted languages such as Python and JavaScript use hash tables.
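For anyone who hasn't seen the claim spelled out, here's a minimal sketch in Python (chosen since the thread names it) of what "hash table under the hood" means. The names are illustrative, not from the original comment:

```python
# Python's built-in dict is a hash table: a key is hashed to locate
# its slot, so average-case lookup is O(1) regardless of dict size.
ages = {"ada": 36, "grace": 85}

# The same key always hashes to the same value within one run,
# which is what lets the dict find the entry again.
assert hash("grace") == hash("grace")
print(ages["grace"])  # hash("grace") locates the entry -> 85

# A consequence of hashing: keys must be hashable (effectively
# immutable). A mutable list can't be a key.
try:
    ages[[1, 2]] = "nope"
except TypeError:
    print("lists are unhashable, so they can't be dict keys")
```

That consequence (why a list can't be a dict key, but a tuple can) is exactly the kind of thing you can't reason about if you've never heard of a hash table.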
Keep in mind that I'm not asking anyone to implement a hash table from scratch on a white board or some nonsense like that!
There ought to be a floor on the foundational knowledge expected from professional developers. Some people insist that floor should be zero. I don't understand why it shouldn't be at least this much.
You can't get a comp-sci or comp-eng degree from any reputable university (or even disreputable ones) without being taught at least this much!
What next? Mechanical engineers who can't be expected to hold a pencil or draw a picture using a CAD product?
Surgeons that have never even seen a scalpel?
Surveyors that don't know what "trigonometry" even means?
No, it is actually a critical skill. Employers will be looking for software engineers who can orchestrate their job function, and these are the two key primitives for doing that.
The way it's written suggests this is an important interview question for any software engineering position, and I'm guessing you agree, given that you call it critical.
But by the same logic, should we ask for the same knowledge of the Language Server Protocol and tools like Tree-sitter? They're integral right now in the same way these new tools are expected to become (and have already become for many).
As I see it, knowing the internals of these tools might be the thing that makes the hire, but it's not something you'd screen every candidate with at the door. It's worth asking, but not "critical." Usage of these tools? Sure. But knowing how they're implemented is just one indicator of whether a developer is curious and willing to learn about their tools, and you need many such indicators to get an accurate assessment.
Understanding how to build an agent and how the Model Context Protocol works is going to be, by my best guess, the new "what is a linked list, and how do you reverse one?" interview question. Sure, new abstractions will come along, which means you could perhaps remain blissfully unaware of how to do it because there's a higher-order construct for such things. But for now we are at the level of C, and, like C, it's essential to know what these are and how to work with them.
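Since the comment treats the linked-list question as the old baseline, here's a minimal sketch of that classic in Python. The `Node`/`reverse` names are my own for illustration; the thread doesn't specify a language or implementation:

```python
# The classic interview baseline: reverse a singly linked list
# iteratively, in O(n) time and O(1) extra space.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Return the head of the reversed list, re-pointing each node."""
    prev = None
    while head:
        # Tuple assignment: the right side is evaluated first,
        # so head.next is saved before we overwrite it.
        head.next, prev, head = prev, head, head.next
    return prev

# Build 1 -> 2 -> 3, reverse it, and walk the result.
rev = reverse(Node(1, Node(2, Node(3))))
out = []
while rev:
    out.append(rev.value)
    rev = rev.next
print(out)  # [3, 2, 1]
```

The point of the analogy stands either way: the question tests whether you understand the primitive, not whether you've memorized a snippet.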
It's a good heuristic for determining how read-in somebody is to the current AI space, which is super important right now, regardless of it being a moving target. The actual understanding of MCP matters less than the mindset such an understanding represents.
Hard disagree. It’s not super important to be AI-pilled. You just need to be a good communicator. The tooling is a moving target, but so long as you can explain what you need well and can identify confusion or hallucination, you’ll be effective with them.
Nope. Being a good communicator and being good at AI are two completely different skill sets. Plenty of overlap, to be sure, but being good at one does not imply being good at the other, any more than speaking first-language-quality English means you are good at fundraising in America.
I know plenty of good communicators who aren't using AI effectively. At the very least, if you don't know what an LLM is capable of, you'll never ask it for the things it's capable of and you'll continue to believe it's incapable when the reality is that you just lack knowledge. You don't know what you don't know.