Maybe I am just bad at interviewing people, but I have tried giving the experiential interviews Casey describes, and I find it quite hard to get signal out of them.
You run into questions of how well a candidate remembers a project, and their memory may not be perfect. You may end up drilling into a project that is trivial. The candidate may simply parrot things that someone else on the team came up with. And when candidates say things, you really have no way to tell whether what they're saying is true, particularly when internal systems are involved.
I have found system design interviews specifically much, much better at getting signal. I pick a real problem we had, start people with a simplified architecture diagram of our actual system, and ask them how they would solve it for us. I am explicitly not looking for people to over-design it. At the start of every skills interview I advise people to treat this as a real work problem at my startup, not a hypothetical exercise.
I have had a lot more luck identifying the boundaries of people's knowledge/abilities in this setting than when asking people about their projects.
And while everyone interviewing hates this fact, false positives are very expensive, and they can be particularly painful when the gap is "this person is not a terrible programmer, just more junior than we wanted," because now you have to either fire someone who would be fine in another role (if you had the headcount for it) or live with a misshapen team.
> I have found system design interviews specifically much, much better at getting signal. I pick a real problem we had, start people with a simplified architecture diagram of our actual system
To me, this heavily biases towards engineers who have already built, or at least designed, a system similar to the one you're presenting them.
I believe this will tend to be true even in the ideal case, which is when the interviewer is focused on "is the candidate asking great questions and modifying their proposed design accordingly" rather than "is the candidate coming up with the 'right' solution."
That's because, if the candidate has already built or designed a similar system, they will naturally be able to ask better and more focused questions about the design goals and constraints.
> I have tried giving the experiential interviews Casey describes, but I find it quite hard to get signal out of them.
> [...] when candidates say things, you really have no way to understand if what they're saying is true, particularly when internal systems are involved.
Okay, here's where I think I would tend to disagree in a very specific way. First, I would point to the linked interview where Casey conducts a brief mock drilldown interview with the host.
The host discussed work he did for Amazon on an event ticket ordering system. Now, I don't think you need to know much about the ticket sales business to think about the challenges here!
The most obvious challenge to me is some sort of distributed lock to ensure the system doesn't sell the same seat to multiple people. Another obvious challenge would be handling very bursty traffic patterns, i.e. an onrush of people and bots when popular event tickets go on sale. Another challenge, brought up by the host, that I wouldn't have thought of is that ticket sales need to avoid "fragmentation" as much as possible. Most people don't want single seats, so you want to sell tickets in a fashion that doesn't leave scattered, "orphaned" single seats that nobody wants to buy.
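To make the fragmentation point concrete, here is a minimal sketch of a seat allocator that prefers contiguous blocks which don't strand single seats. All names are hypothetical and this ignores everything a real ticketing system would also need (holds, expiry, and the distributed lock mentioned above):

```python
def free_runs(row):
    """Yield (start, length) for each run of free seats (None = free)."""
    start = None
    for i, seat in enumerate(row):
        if seat is None and start is None:
            start = i
        elif seat is not None and start is not None:
            yield start, i - start
            start = None
    if start is not None:
        yield start, len(row) - start

def pick_block(row, party_size):
    """Choose a start index for `party_size` contiguous seats.

    Prefer exact-fit runs, then runs whose leftover is at least 2
    seats, and only as a last resort a run that strands one seat.
    """
    exact, clean, stranding = [], [], []
    for start, length in free_runs(row):
        leftover = length - party_size
        if leftover < 0:
            continue          # run too small for this party
        elif leftover == 0:
            exact.append(start)
        elif leftover == 1:
            stranding.append(start)  # would orphan a single seat
        else:
            clean.append(start)
    for bucket in (exact, clean, stranding):
        if bucket:
            return bucket[0]
    return None  # no contiguous block available

# Usage: a 10-seat row with seats 3 and 4 already sold.
row = [None, None, None, "sold", "sold", None, None, None, None, None]
print(pick_block(row, 5))  # prints 5: the 5-seat run is an exact fit
```

A party of 3 would instead get the run at index 0 (exact fit), and a party of 2 would be steered to the larger run so the 3-seat run isn't broken into a pair plus an orphan.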
Those are interesting challenges, and if the interviewer is an experienced developer, I don't think a candidate could really bullshit through them to a significant degree.
> The candidate may simply parrot things that someone else on the team came up with.
I personally wouldn't care if the ideas originated with the candidate vs. another member of the team. I'd be looking for how well the candidate understood the problem and the solution they implemented, the other solutions considered, the tradeoffs, etc.
> You may end up drilling into a project that is trivial.
This feels trivially avoided. If I was the candidate and the interviewer picked a boring project when I had juicier ones, I would just say so. And if I was the interviewer and a candidate told me that, I'd happily pick one of the juicy ones. The point isn't to rigidly lock into a particular project.
> You run into questions of how well a candidate remembers a project, which may not be perfect.
This would be expected. Again, I'd be looking big picture here. Approaches considered, tradeoffs made, the "unknown unknowns" that popped up during implementation and how those were handled, etc.
> To me, this heavily biases towards engineers that have already built or at least designed a similar system to the one you're presenting them.
Yes. This is not an IQ test; we are trying to see how people react to problems in our domain, not to measure some generalized form of reasoning. The advantage of picking a problem as close to our real problems as possible is that I don't have to worry about how their interview performance generalizes to the actual work.
In general, my experience with system design interviews is that people make bad designs, and when you drill down on them they give bad rationales. As with coding screens, people regularly out themselves as not very good at their jobs.
> Those are interesting challenges, and if the interview is an experienced developer, I don't think a candidate could really bullshit through them to a significant degree.
It's not really about "bullshit" per se, but about whether their understanding of their context is correct or not. They can tell you fully reasonable sounding things that are just wrong. In a mock interview, you can see if they ask good questions about their context.
> I personally wouldn't care if the ideas originated with the candidate vs. another member of the team. I'd be looking for how well the candidate understood the problem and the solution they implemented, the other solutions considered, the tradeoffs, etc.
I totally disagree with this. Being able to remember the design doc for a project and parrot the things that were discussed is very different from actually having written it.
If I want to hire someone who can design things well from scratch and I get someone who makes bad decisions unless someone is supervising them, I will be very disappointed.
In general, I have given both interviews to the same candidate, and after they say a bunch of reasonable things about their existing work, when I ask them how they would design something I quickly find that they are less impressive than they seemed. Again, maybe I'm bad at giving experiential interviews, but being hard to administer is a point against them.
My experience of hiring is also that I am generally not looking to take chances; unwinding bad hires is super painful.