
I'll share what we've transitioned to recently: a testing focused interview!

We give the candidate a piece of production code with some slight tweaks that break it in (common) edge cases, plus a test harness that already supplies some basic tests the code passes. What we're looking for is whether the candidate can grok the provided code (< 30 lines for a 2hr interview) and reason through the ways it could break. Writing the tests ensures that the candidate can _actually_ write code, and shows any areas they might be less focused on or miss.
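To make the shape of the exercise concrete, here's a minimal sketch of what such a snippet-plus-harness might look like. This is entirely illustrative (the function, the seeded bug, and the tests are my invention, not the poster's actual material):

```python
# A tiny "production" function with a seeded edge-case bug, plus the
# basic tests the harness would ship with. The provided tests all pass,
# so the code looks fine until the candidate probes the edges.

def average(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)  # seeded bug: no guard for []

# Provided basic tests -- these pass:
assert average([2, 4, 6]) == 4
assert average([5]) == 5

# The candidate's job is to reason through failure modes and write the
# test that exposes one. Here, the empty list:
try:
    average([])
    edge_case_handled = True
except ZeroDivisionError:
    edge_case_handled = False

assert not edge_case_handled  # the bug the happy-path tests never hit
```

The real exercises would presumably be closer to actual production code, but the structure is the same: working tests up front, breakage hiding in the inputs nobody wrote a test for.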

Compared to our old procedure of having candidates write the (now provided) code snippet from scratch, this has been a far more successful approach. In 2 hours I learn more about an applicant than through the entirety of the previous process. An added benefit: this workflow is representative of real work - you'll need to track down code you likely didn't write and implement new functionality, or fix something that's broken. Doing this in an interview is a near-direct translation to the job (minus understanding the whole codebase).

I'd strongly recommend this unit testing based process and it can definitely be tweaked to fit your own scenario!




>representative of real work - you'll need to track down code you likely didn't write and implement new functionality, or fix something that's broken

This. The stereotypical algo interview assumes a greenfield project, but there's almost always "legacy code" you'll need to grok first.

“Indeed, the ratio of time spent reading versus writing is well over 10 to 1.” ― Robert C. Martin


And yet so many people refuse to read working code, and agitate for re-writes.

It's just bizarre to me.


When all you have is a hammer.... Think about it: coursework is all about writing new code. I can't recall any resources for learning "software archeology."

Part of the deal is the 'humiliation' of being stuck following someone else's stylistic/design/naming-scheme choices. Another aspect is the curse of knowledge, leading to useless documentation. Plus, it's easy to underestimate the amount of effort required to reinvent the wheel: all the necessary complexity looks like incidental complexity at first glance. "Why should I be forced to learn all this gobbledygook when I should just be able to re-implement from scratch in an afternoon? [... three weeks later] Oh, now I see why."


Are you telling candidates in advance what type of code to expect or surprising them?

Do you know what I do when I see a problem for the first time that I don't immediately know the answer to? I Google it. Because (a) it's faster, and (b) many minds greater than mine may have already come up with an optimal solution that I'm unlikely to produce on my own in 2 hours.

What if a developer hasn't been deep in debugging production code for a while? Those edge cases might not be top of mind.

Maybe they're doing documentation or planning or writing Jira tickets and remembering all the ways to nullcheck a JavaScript variable isn't their current life goal. Instead they might try running the code locally or write some tests for it or research to see what others have done in similar patterns.

Dreaming up magical solutions while under the stressful glare of people judging your life's work may sound reasonable to you because presumably it matches up to some internal signals you value in each other, but every workplace is different -- different codebases, different problems, edge cases prevalent in one may not be prevalent in another.

You're better off doing a traditional Q&A about their experience and a take-home mini-project (or asking them about theirs). If they don't want to spend time on that, then and only then give them some timed test you feel is applicable.


None of the code that we use for this is too difficult. To date we've only hired for generalist engineers, and we're not using some obtuse algorithm example. It's the kind of thing you'll need to do day-to-day, like filtering logging outputs, checking HTTP response codes, throwing errors, etc. We want to see how a candidate performs on an example like this rather than writing something from scratch.
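For a sense of what "filtering logging outputs" with a lurking edge case might look like, here's a hedged sketch. The function, the log format, and the bug are all assumptions for illustration, not the actual interview material:

```python
# A day-to-day style snippet: filter log lines down to errors.
# It passes on typical input, but substring matching has a hole.

def errors_only(log_lines):
    """Keep only lines logged at ERROR level."""
    return [line for line in log_lines if "ERROR" in line]

logs = ["INFO started", "ERROR disk full", "WARN slow"]
assert errors_only(logs) == ["ERROR disk full"]  # looks correct

# Edge case a probing candidate finds: a line that merely *mentions*
# "ERROR" in its message is matched too (a false positive).
tricky = ["INFO user typed ERROR in chat"]
assert errors_only(tricky) == ["INFO user typed ERROR in chat"]  # bug

# A sturdier version checks the level field, not the whole line:
def errors_only_fixed(log_lines):
    return [line for line in log_lines if line.split(" ", 1)[0] == "ERROR"]

assert errors_only_fixed(tricky) == []
```

Nothing algorithmically deep, but it rewards exactly the reading-and-reasoning skill the exercise is meant to measure.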

Of course there's Q&A, and their reasoning/explanation is just as valuable as the code they write. This is just the technical portion of the interview. However, I think we've really hit the sweet spot with this approach.


Of the 3 you mention "filtering logging outputs, checking HTTP response codes, throwing errors" only the last one seems universal to me.

Take logging. Super important - I spend a lot of time evangelizing logging to younger devs and how to do it well. But filtering? How? In what way? For what values? I don't have context but the expectation I should be able to filter for specific things you have secret preferences for strikes me as a signal that's not clear or universal.

Checking HTTP codes. In what context? There are plenty of times when it's not helpful -- for example, there are many "page not found" responses that DON'T return as 404 (even though they should). So this a priori belief that all devs should automatically have this built-in preference for checking HTTP codes as some universal signal of quality in programming is, I think, a larger assumption than you might realize.
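The "soft 404" point is worth spelling out: a status check alone can't tell a real page from an error page served with 200. A minimal sketch of the distinction (the marker string is an assumption; real detection is messier):

```python
# A bare status check vs. one that also inspects the body.
# Some servers return 200 with an error page, so status == 200
# is not proof the resource exists (a "soft 404").

def is_real_hit(status, body):
    """Treat the response as a hit only if the status is 200 AND
    the body doesn't look like an error page."""
    return status == 200 and "page not found" not in body.lower()

assert is_real_hit(200, "<h1>Welcome</h1>")
assert not is_real_hit(404, "")
assert not is_real_hit(200, "<h1>Page Not Found</h1>")  # soft 404
```

Which is the commenter's point: whether checking the code is "good practice" depends on how the servers you talk to actually behave.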

I don't think there is a perfect way to interview, but I do think if you're going to quiz developers it helps to give them some advance warning of what areas you prefer to focus on so it's not such a random, out-of-left field line of questioning.

You may not consider your questions abnormal -- but that's the problem, neither do the people who ask obscure algorithm questions! It's very hard to validate your assumptions.


It is not necessarily bad, but you need to handle people who are not familiar with the language, the environment, or the test harness, or who, even worse, come from a different field.

There's also no second chance if they fail this one assignment.




