
I've interviewed hundreds of programmers, and really, you can't take almost anyone at their word.

There are so many people who think they can code, or who simply claim they can but really can't, that some sort of basic coding test is necessary if the job requires it.

Our hiring rate for some roles is 200:1, but we pay handsomely for those.

Think of it the other way around: interviews are a two-way process. You're also interviewing the company, and the way they conduct their interviews in many ways tells you what they value and what they're like.

Take it as an opportunity to get to know each other. Bring this up in a constructive manner. It may make all the difference and provide a stronger signal to both of you: you learn how they react to feedback, and at the same time you show that you're capable of critical thinking and care enough about your craft to call bullshit when you see it.



If those tests actually filtered out incompetent programmers, I'd say they were great, but in my experience they all test very shallow knowledge that has little to do with the ability to produce software that performs under load.

If you could come up with such a test, it ought to just be used as a standard for everybody to take.

And yes, as unpopular an opinion as it is around here, I'd love to see a "bar exam"/licensing board for software developers.


I've had moderate success discussing Rust (a language we hire for) at moderately low levels.

That is, I've found it's difficult to bullshit topics with any depth. Even basic questions: why isn't an Rc safe to send to a thread? How does an Rc work? What is the difference between an Rc and an Arc? And so on.
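A minimal sketch of the distinction those questions probe (the function and the numbers here are made up for illustration): `Arc` keeps its reference count with atomic operations, so clones can be moved into other threads, while `Rc`'s count is non-atomic and the compiler rejects sending it across threads.

```rust
use std::sync::Arc;
use std::thread;

// Illustrative sketch: sum a shared, read-only vector from several threads.
// Arc's reference count is updated atomically, so clones may be moved into
// other threads. Swapping Arc for Rc here would not compile: Rc's count is
// non-atomic, so Rc<T> is !Send and thread::spawn rejects the closure.
fn parallel_sum(data: Arc<Vec<i32>>, n_threads: usize) -> i32 {
    let chunk = (data.len() + n_threads - 1) / n_threads;
    let mut handles = Vec::new();
    for i in 0..n_threads {
        let data = Arc::clone(&data); // atomic refcount increment
        handles.push(thread::spawn(move || {
            data.iter().skip(i * chunk).take(chunk).sum::<i32>()
        }));
    }
    handles.into_iter().map(|h| h.join().unwrap()).sum()
}

fn main() {
    let data = Arc::new((1..=100).collect::<Vec<i32>>());
    assert_eq!(parallel_sum(data, 4), 5050);
}
```

A candidate who can explain why the `Rc` version fails to compile has usually understood the ownership model rather than memorized trivia.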

I'm happy to accept "I don't know"; those answers are valuable too. But the whole conversation told me a lot about each candidate. We've hired 10+ juniors and seniors with this process, and I haven't felt cheated yet.

I still feel like I need to find a way to open up people who "freeze". I do that myself, so I can relate, and my interviewing style is pretty one-on-one. Still, I've had more faith in it than in live or take-home tests.


> I'd love to see a "bar exam"/licensing board for software developers.

Can't resolve IRQ conflicts? Don't know the big-O complexity of different Delaunay-triangulation algorithms? Out of date on which features Microsoft added to LINQ in different versions of C#? Don't know where the "launch debugger" button is in Android Studio? Sorry, we aren't allowed to hire you to work on Instagram's iPhone app.

Better hope the police don't find out your Android has an unlocked bootloader; I hear prosecutors are really cracking down on unlicensed software development ever since the assassination. They say unregulated software development leaked his location to the Zetas, I don't know if I believe that.


Witness the world of unlicensed renegade civil engineers roving across the American landscape, building unauthorized bridges and infrastructure.

Or perhaps Robert De Niro's character in Brazil.


And I think this is the crux of it: not all software is equal.

Building bridges is very different from putting a shed in someone's backyard.

Software is a spectrum, and if your company builds sheds in people's backyards, then you don't have to hire the way you'd hire a civil engineer.

Lots of companies copy the Google-style interview without actually being anything like Google or having any understanding of why Google hires that way.


Man, can you imagine how great our physical infrastructure would be if the Linus Torvaldses and Fabrice Bellards of the civil-engineering world could actually do this?

Now imagine it was illegal for Linus Torvalds and Fabrice Bellard to do what they've done.



It's difficult to interpret your comment. It's a link to a news story about how a partly built "illegal" stone bridge collapsed, killing 11 people, presumably because it was poorly designed. In context, you seem to be suggesting that Linux, QEMU, and ffmpeg are analogous to that bridge because they are of extremely poor quality, while there's some kind of official government software licensing body that protects people from such terrible software. Is that what you meant?

No such body exists, though many attempts have been made to establish one, with dismayingly destructive results, as I outline in https://news.ycombinator.com/item?id=30621151.

I think that if we want to enable the Fabrice Bellard of civil engineering to build bridges, we probably need to keep all humans at a safe distance from the bridge until it's fully complete, then stress-test it to validate the FEM model before opening it to traffic. Also, probably not build it out of brittle materials like stone, because the stress test wouldn't give you enough information to validate the model's failure predictions.

We should be thinking about how to make civil engineering more like software engineering, with rigorous version control, automated testing, automated builds, rapid improvement in the state of the art, and safety features that minimize risk to humans in case of failure. Not vice versa.

(Though it would be nice if more programmers were in the habit of planning ahead and thinking about failure modes.)


It's an unserious comment responding to an unserious discussion. You've provided a bevy of historical knowledge, which, while interesting, in no way resembles what software engineering licensing would be like in the present day. For one thing, corporate power is so entrenched and regulatory bodies so enervated in this space (as everywhere) that there is absolutely no possible way the licensure dystopia you've so vividly conjured could exist. And nobody on this forum calling for licensing is asking for some onerous top-down governmental bureaucratic licensing regime, so you've basically created a neat '90s cyberpunk retrofuturistic setting that has little to no relevance to the discussion at hand.

If anything, people are calling for industry standards, something akin to the craftsmen's guilds of olde, overseen by an existing professional association of engineers such as the ACM or the IEEE [0]. Alternatively, perhaps it could arise as the end result of Triplebyte or some other private solution trying to be the College Board of this industry. Speaking of which, academic exams such as the SAT, the ACT, or AP tests carry no legal weight, but most higher academic institutions (until recently) chose to use them as standards. So it's more of a widely subscribed informal standard.

In practice, such a license wouldn't hold too much water anyway; certainly FAANG would ignore it for the most part in favor of their higher standards, but perhaps it could at least be a Leetcode replacement for smaller firms [1]. That would save applicants the time of running the LC gauntlet over and over again at every single company they interview with, and help those smaller companies avoid the hiring woes of cargo-culting FAANG interview practices and accidentally filtering out every applicant along the way. None of this would forbid the practice of programming by non-accredited practitioners, as the licensing-regime bogeyman you fear would have absolutely no teeth for enforcement.

Finally, if formalizing the engineering nature of software engineers means that American engineers can be like their Canadian counterparts and wear iron rings [2] to mark their achievement and profession, then that is infinitely more cool than the status quo and more compelling than any number of fantasy dystopias, or evocations of the state of the art of '70s computing, or Operation Sundevil, or the goings-on in... Iran.

[0] https://news.ycombinator.com/item?id=30616114

[1] https://news.ycombinator.com/item?id=30493041

[2] https://en.wikipedia.org/wiki/Iron_Ring

> We should be thinking about how to make civil engineering more like software engineering, with rigorous version control, automated testing, automated builds, rapid improvement in the state of the art, and safety features that minimize risk to humans in case of failure. Not vice versa.

Have you seen the state of software engineering? I'm not sure if this field is in any state to be acting smarter-than-thou towards other engineering disciplines.


> If anything, people are calling for industry standards

Industry certifications do exist. They just haven't been very successful so far. Perhaps we're going to see more focus on them if the pace of development and disruption slows down, but as it is I don't think there's much interest in them.


Not only could it save us time on job interviews, it could save us time on tech support calls: https://xkcd.com/806/

I think you're being unrealistically conservative about how much things are going to change. You might be right that things will get better instead of worse, but overall I think you're seriously underestimating just how much and how fast the world changes over time. Don't forget that it's only been 21 years since "Know Your Customer" was an unthinkable Big-Brother sort of idea, only a year since "vaccine passports" were widely thought to be a paranoid fantasy that could only happen in Israel, and only two weeks since Russia invading Ukraine was widely dismissed as implausible.

More broadly, I think that if you want to know what's going to happen 50 years in the future, you need to look at things that have happened at least 50 years into the past. 02072 will probably be as strange to us, if we live to see it, as our life today would have seemed in 01972.

I'm not sure why you're dismissing Iran. 85 million people live there and their welfare is important.

As for "smarter-than-thou", I think it's rather that good programmers are acutely aware of the severe limits on human mental capacity and therefore look for solutions that don't require them to be infallible or even smart. Also, we have computers, so we can build our practices around them!

A couple of years ago, Hillel Wayne interviewed a number of "crossover" engineers who had worked both as software engineers and in other engineering fields, and my recommendations above are strongly influenced by what they said about what the different fields can learn from one another. If you're interested in these issues, you might be interested in reading his writeup: https://www.hillelwayne.com/post/what-we-can-learn/


Actually, if you had wanted to debunk my proposal, this classic would've been the one to do it: https://xkcd.com/927/

> I think you're being unrealistically conservative about how much things are going to change.

They will change, but the direction you are invoking runs fundamentally counter to the course of history in this industry. When has there ever been greater government control over the software industry, specifically over what constitutes a software engineer?

> You might be right that things will get better instead of worse

I said corporate control is rampant and regulators are as weak as ever. I suppose from a certain political point of view things are getting better.

> only two weeks since Russia invading Ukraine was widely dismissed as implausible

The Orange Revolution happened in 2005. I'm sure technothriller novelists and wargamers have at least been considering the current scenario since then.

> I'm not sure why you're dismissing Iran. 85 million people live there and their welfare is important.

Because this entire discussion has been mostly about Silicon Valley-adjacent engineering. Perhaps Leetcode interviews are a pain point for Iranian programmers. But bringing up government tyranny over civil rights in one arena to dismiss a licensure scheme - or, more accurately, a certification [0] - is just plain ol' motte-and-bailey.

[0] https://news.ycombinator.com/item?id=30493814

> Also, we have computers, so we can build our practices around them!

Computers are only as good as those who write the code, unless we are to start building our practices around the results of neural networks, which is a most intriguing proposal.

That Hillel Wayne article is a good read, but again, I am calling for humility, because much of the engineering in this industry is far from the lofty bridges and oil rigs the word implies: https://www.karllhughes.com/posts/plumbing

And to return to Wayne, consider part one of his series: https://www.hillelwayne.com/post/are-we-really-engineers/


Why would I want to debunk your proposal? I don't think it's bunk, I just think it would have some undesirable consequences. That's no bar to pointing out the desirable ones!

This is the first time Silicon Valley has been brought up in this discussion, but keep in mind that practices in the US are influential around the world, and the harms from strengthening inequality and substituting credentialism for meritocracy fall predominantly on people outside the US. The discussion has mentioned YC (Boston originally), Rust (Mozilla, San Francisco, previously SV), Microsoft (Washington), Android (SV), iPhone (SV), Instagram (SF), the Zetas (Mexico), Brazil (the movie), the American landscape (much of which is in Brazil, the country), Google (SV), FAANG (SV/Washington/SV/SV/SV, but all with offices worldwide), Linus Torvalds as a student (Finland), Fabrice Bellard (France), Guangdong (China), the ACM (US-wide), the IEEE (also), Triplebyte (SV but kind of worldwide), the SAT/ACT/AP tests (US-wide), Ukraine, Russia, and Canada.

I agree that the danger of war in Ukraine was foreseeable; I wrote about it in 02016 https://dercuano.github.io/notes/wwiii-genesis.html https://dercuano.github.io/notes/state-of-the-world.html and 02017 https://dercuano.github.io/notes/2017-sap-allocation.html and George Kennan wrote about it last millennium. But it's also true that it was widely dismissed two weeks ago.

I'm glad we agree on calling for humility! And, yes, Wayne's whole series is worth reading.


On the same note of this prior conversation, what timing for this to be #1 on the front page:

https://news.ycombinator.com/item?id=30623617


Heh!


Yep, that's exactly the sort of ridiculous hyperbole that gets thrown around every time somebody suggests any sort of standardization around professional software development!


There's no reason to think it's hyperbolic or ridiculous.

We've seen lots of sorts of standardization around professional software development, with uniformly dismal results, and lots of criminalization of experimentation by astoundingly incompetent authorities.

— ⁂ —

The first official standard for professional software development was COBOL, shortly followed by JOVIAL.

In the 01970s the US military standardized on Ada for, among other things, their hard-real-time systems; it took about a decade of violating the standard to convert Ada into a system that was somewhat suitable for hard-real-time systems, after which an Ada-induced problem destroyed the first Ariane 5.

The ACM withdrew from the SWEBOK process after it became apparent that knowledge of actual programming was going to be completely excluded; instead SWEBOK certifies that you endorse the obsolete, counterproductive management processes to which "agile development" was conceived as an antidote. (If you're familiar with the BOK for some other branch of engineering, you may be aware that, for example, the structural engineering BOK includes not only project engineering practices but also what makes structures stand up or fall down, and the civil engineering BOK includes things like engineering mechanics, strength of materials, and chemistry. Presumably this is because political decisionmakers have been able to observe the difference between successful and unsuccessful skyscrapers and dams and infer which engineers were responsible.)

The A+ certification for IT work is impossible to get if you only know Linux and not Microsoft Windows, and last I saw, it did still require that you knew how to debug IRQ conflicts—long after ISA buses were limited to PC104 systems.

PCI-DSS audits routinely, though not universally, mandate counterproductive security practices like installing virus scanners on Linux servers.

If we'd effectively standardized professional software development in 01960 we'd still be programming in COBOL today.

Moreover, the Board of Licensure could revoke our licenses when we criticized widespread bad software development practices, as has happened to Charles Marohn of Strong Towns: https://www.strongtowns.org/journal/2021/5/23/lawsuit

— ⁂ —

You may not have commemorated it, but on March 1 we passed the 32nd anniversary of the Secret Service raiding Steve Jackson Games for publishing a "manual for computer crime", namely, the Cyberpunk module/game for the GURPS role-playing game/system.

Outside the US the situation is often even worse; Saeed Malekpour was sentenced to death in 02010 because he wrote open-source image-uploading software and a porn site used it without his knowledge. (His sentence was suspended; he escaped in 02019.)

Do you really think it would be a good idea for the governments that chose those prosecutors to decide who gets to program and how?

— ⁂ —

The bigger issue is that computers are now our exocortices; we've offloaded significant amounts of our thinking and communication onto them, as we did previously with inventions like books and clocks. Computers vastly extend our mental abilities. Without control over our computers—without the legal and practical ability to program them and to investigate what they are programmed to do—we in effect lose our freedom of speech, association, and thought. That power is far too dangerous to allow any guild or faction to monopolize it.


I mean, you don't need a really hard test to weed out most hacks. In any case, if the test is hard, companies that know their shit in hiring care more about how you approach it than about what the end result is.

In the end, the interview tries to figure out what it's like to work with you. Do we get along? Can we communicate effectively? Will you be happy and thrive here?


You sound like a great hiring manager/interviewer. However, I have read stories of people who were rejected only because they did not finish the coding challenge in time. The engineer reported that the candidate communicated well but did not finish; the hiring manager rejected the candidate anyway.


Yeah, there are bad employers and bad interviewers. Sometimes the importance of interviewing is underplayed. I always drill into others that interview duty is the most important thing you can do at a company (even though it sometimes feels like a chore). The people you help choose influence the culture and long-term viability of the company. Take it seriously and be fair with candidates.

Having said that, figuring out a fair and effective interview process is HARD. It requires constant tweaking and monitoring. Also, you have to start from a decent set of values at the company.

For example, we do shadow interviews. When we certify interviewers, we have them come to watch and listen to an interview (we explain to the candidate beforehand what's going on), and then after the interview ends, we meet with the shadow interviewer and discuss what they picked up, and explain why I went such and such way, what I was looking for, why such and such mistake is not a big deal, and so on.

After a couple of these, we have them run an interview, but with a certified interviewer shadowing them, just observing (we have them practice their interview exercises, if any, on us beforehand).

And then we give feedback: you rushed a bit here, give them more time to be at ease with you, did you notice such and such, whatever... If everything goes well, we certify them, and they can fly solo.

We try to have a varied pool of interviewers to avoid biases. The downside is that you need to do more interviews as a candidate to get a fair assessment (i.e., no single person can derail you).

In the end, we've all been in crappy interviews and try our best not to inflict that on other people.


> ... they all test very shallow knowledge that doesn't relate to ability to produce software that performs under load.

That's a later filter. The first filter is, can they code at all. Like, FizzBuzz level of "at all".

I mean, if you need them to produce code that performs under load, you'd better interview for that too. But people tend to first worry about whether they can clear the lowest bar before they proceed further.
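For reference, the "lowest bar" the thread keeps coming back to is FizzBuzz. A minimal sketch in Rust (the function name is my own choice):

```rust
// Classic FizzBuzz: multiples of 3 print "Fizz", multiples of 5 print
// "Buzz", multiples of both print "FizzBuzz", everything else the number.
fn fizzbuzz(n: u32) -> String {
    match (n % 3, n % 5) {
        (0, 0) => "FizzBuzz".to_string(),
        (0, _) => "Fizz".to_string(),
        (_, 0) => "Buzz".to_string(),
        _ => n.to_string(),
    }
}

fn main() {
    for n in 1..=15 {
        println!("{}", fizzbuzz(n));
    }
}
```

The point of the exercise is not the output but whether the candidate can produce a loop and a conditional at all.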


It is difficult to comprehend how low-skilled the shallow end of the applicant pool is. I think it's gotten worse, not better, since the original FizzBuzz essay was written.


They actually do.

The test isn't (or shouldn't be) that you understand string manipulation (or whatever) but rather that you can work through a new problem. There are many programmers who claim many years of experience but have poor problem solving skills. They can implement a CRUD form and basic tasks, but fall flat when faced with a problem they've never encountered before.


The thing about the license is that it avoids the redundancy of having to pass the same orthogonal algorithm questions at every single company an applicant chooses to apply at. That alone is a great time-saver, and allows applicants the freedom to focus on actual work.


Not really. You'll still have interviews, they'll just be more personality and resume-driven.

Personally, I'll take writing FizzBuzz over and over again vs. "where do you see yourself in 5 years?" ("I don't know but probably not working here") and "why do you want to work for our company?" ("to earn money").


Most interview loops include Leetcode for the technical screen, then at least one domain-specific coding section (like a more tense pair programming section) during the on-site, and maybe another algorithms section on the on-site (several if you’re Google). The license would at least eliminate the algorithms section to focus on the domain-specific, which is at least more reflective of real-world work.


Also, it's a lot easier to study leetcode problems than domain-specific ones (since there are many different domains). I would expect that having more domain-specific problems would limit you to a narrower range of jobs.


It all comes down to philosophy: one could argue that it's better to have interviews reflect the actual material of the work, and that such domain-specific questions would require less prep, as the candidate's experience should already prepare them to answer such questions. Not to mention the additional career development in those skills: asking and answering Stack Overflow questions, technical blogging, side projects using the same stack, etc.


That's much worse than leetcode problems because it requires tons of work outside work. Most people, myself included, have a life and interests outside of the very narrow focus of their career, and don't want to spend 16 hours a day focused on that one aspect of life. Studying a few leetcode problems is no big deal in comparison.


Depends on the companies you're applying to; it's rarely just "a few." One of the difficulties is retention: a LeetCode a day only goes so far until you have to repeat older problems. Better to get it all done in one test.


You're assuming they will actually eliminate it though rather than just pick something else to test.


Sure, but at least that might be something more relevant to day-to-day work, and it would require less grinding than Leetcode.


> where do you see yourself in 5 years

Mostly in mirrors. Unless I turn into a vampire.


I think you are looking at this the wrong way. The test isn't to find who can write software that performs under load; it is to find out who can write software, period. Essentially the FizzBuzz test. There are a ton of people who say they can write code but then can't write a simple for loop.


Oops, I posted my rant[1] about our need for a bar exam before realizing you got there first. Fully agree, this profession really needs this.

1: https://news.ycombinator.com/item?id=30616361


> you cannot trust pretty much anyone.

This. I hire financial analysts regularly, and it's similar in that the job requires a high degree of proficiency in Microsoft Excel. Over many years of hiring, people have consistently lied about their level of proficiency. Sometimes it's overconfidence or ignorance, but often it's just a straight lie. So we do Excel modeling tests similar to this. It's truly the only way to get a good gauge of an individual's proficiency and how they approach modeling problems. I don't see a way of eliminating them, but to OP's point, I do try to make the tests generic (no industry/business knowledge required), open-ended (letting them solve the problem however they want), and generally "good". We always tell folks taking the test, "It's not meant to be trick questions; if an instruction is confusing, ask us for clarification," and afterwards we discuss "How do you think you performed?", "What did you think?", those kinds of things, to try to iterate on any poorly constructed aspects we may be blind to.


At least with Financial Analysts they actually use Excel daily (or often).

For me, I have over 20 years of experience as a lead architect, designer, and engineer of both apps and databases across a wide range of companies and industries. But it feels like that experience is out of touch with how most companies hire in the current market.

Whenever I get invited to FAANG interviews, the recruiters tell me it's best to cram for months on stuff I would never or rarely use in real life and learned in theory over 20 years ago. I did one interview after prepping for an hour or two per day for two months and passed most interviewers, but missed on one, so they told me to try again in 6 months. Non-FAANG tech companies out of Silicon Valley seem just as bad, from what recruiters have been sending me.

Although some of these FAANG companies pay so well (particularly Meta) that it can make it worth it. That said, after wasting so much time just for the option to interview again every 6 months, it makes me wish I could work in some other industry that pays as well or better.

The current interview process makes me lose interest in the profession, but not in the actual development of apps and sites.


Honestly, what you describe is a major reason I never pursued software engineering as a profession. I love it as a hobby and entrepreneurial tool.

Also, I kind of just prefer building my own things instead of working on some overcomplicated stack I didn't design.


Is there a video demonstrating the kind of modeling you're talking about? The only thing I've seen that sounds similar was a Martin Shkreli video which is of course removed from YouTube now, and I don't know whether Shkreli was a beginner or an expert, just that he navigated Excel 20 times faster than I do.


There are some videos of high-level skills if you search "excel modeling championship". Shkreli was a hedge fund/Wall Street guy, if I remember correctly, and probably has more of a banking background. That area of finance tends to be hyper-competitive on Excel skills. I've heard stories of them banning mice from the office to force people to use keyboard shortcuts and whatnot. It's a locker-room, braggadocio badge of honor in that world. But it's the type of application navigation it sounds like you're describing.

I, on the other hand, am in the realm of corporate finance. Think CFOs, Controllers, and their teams. It's a bit more relaxed here, but you still need enough skill to get the job done effectively. We don't really care how you get the job done (use a mouse, keyboard, Google, it's all game), but you need a nice output, and your approach in an interview context should convince me that you'd be capable of solving similar problems in the future. For that reason, I typically provide crappy raw data (oddly and inconsistently formatted) and ask them to build or analyze something. Given some fictional general ledger data, can they build a three-statement model? Can they summarize some payroll data in a logical way? Build a budget scenario based on the following actual financial data... also, here are 5 basic features the model should incorporate. I don't typically watch them, so I don't know how they navigate the application (I don't find it important), but we give a time limit, stop them at time, and review the work attempted. The highest-skilled folks tend to finish within time.

It's counter-intuitive, but the worst thing for an analyst is to work at a company that has great ERP/BI tools with reliable data, etc. Only about 10% of companies have their shit together in that regard, and their analysts just run reports from the XYZ GUI and never have to work with data. So they have a low level of Excel skill but think they are great analysts. I've never actually experienced that world, but I know those analysts wouldn't cut it in the real world of crappy data and ad-hoc work. I've seen them crash and burn too many times, and it's painful for the entire team.

Back to your original point: as you might expect, most folks who are great at Excel just use it a lot to build complex things. They may have learned the keyboard shortcuts along the way, or not. It just takes practice; muscle memory will kick in, but it also requires regular use to stay sharp. There's also some pretty decent tooling to help you step your game up if you wish. I personally like https://macabacus.com/


> There's some videos of high level skills if you search "excel modeling championship"

From a few months ago...

Excel World Championship Finals - https://news.ycombinator.com/item?id=29521324


This is great, thanks!


I've recently had multiple candidates with inflated resumes. Not mere embellishments, but claims of advanced degrees in computer science from people who can't solve a simple algorithmic problem (something that can be coded in <10 minutes).

I dislike most take-home problems, especially longer ones. But there are few cases where I could move forward with a candidate who refuses a technical interview; in almost all cases the refusal is egotistical and a glaring red flag.


> I've recently had multiple candidates with inflated resumes. Not mere embellishments, but claims of advanced degrees in computer science from people who can't solve a simple algorithmic problem (something that can be coded in <10 minutes).

I have a BSc from one of the better universities for CS in the UK. I can barely implement any algorithm I learned during my university years. The number of times I've been told to implement algorithm x, y, or z in my roughly 10 years of professional career is zero. Conversely, the number of times I've been asked to implement them during interviews is way higher.

Unless you're hiring someone to implement such an algorithm, you shouldn't be using them as a test, because most, if not all, of the algorithms we were taught during my CS degree already have one or more implementations I can just use. Heck, I wouldn't even want to waste time re-implementing one, and probably doing a bad job of it, when I can just use someone else's implementation, which will most likely be better than anything most people could write.


Not OP, but when I ask people to solve an algorithmic problem, I don't mean "regurgitate this algorithm from memory"; I mean "I give the candidate a plain-English description of an algorithm and ask them to turn it into code" (e.g. "read this .txt file, for each word in the file, reverse the word, and print it to stdout"). This does a pretty good job of filtering out people who can't write a `for` loop, which is a lot of people, even those with "senior" in their past job titles...
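A sketch of what a passing answer to that example prompt might look like, in Rust (reading from a string literal rather than a file to keep it self-contained; `reverse_words` is a name I've chosen):

```rust
// Reverse each whitespace-separated word character by character.
// (Char-level reversal; combining characters would need extra care.)
fn reverse_words(text: &str) -> Vec<String> {
    text.split_whitespace()
        .map(|w| w.chars().rev().collect())
        .collect()
}

fn main() {
    // In the interview setting this would come from a file, e.g.:
    // let text = std::fs::read_to_string("input.txt").unwrap();
    let text = "hello world";
    for word in reverse_words(text) {
        println!("{}", word);
    }
}
```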


This is exactly what I'm referring to. I'm not testing for rote learning. I'm testing problem solving ability.


The vast majority of places asking algorithmic questions are not asking you to reimplement known algorithms; asking candidates to regurgitate existing algorithms is widely known to be bad practice.


> The vast majority of places are asking algorithmic questions not for you to reimplement known algorithms

So your claim is every company is coming up with new algorithms that they use for their tests??


An algorithmic question is a coding problem that focuses on problem solving, and may include questions/constraints related to time & space optimization.

The test should never rely on rote learning (e.g. implement Dijkstra's algorithm from memory) or require a "magic" solution (e.g. Project Euler)


> Not mere embellishments, but claiming advanced degrees in computer science when they can't solve a simple algorithmic problem (something that can be coded in <10 mins).

I thought the second part of the sentence would be about how they don't actually have those degrees. If you say that you have the degree you have, that's not embellishing at all, and it doesn't mean that you'll be able to solve a coding problem. They're two separate things and you've kind of mixed them up.

There are two questions:

- is the candidate outright providing false information (where they've worked or studied, how much experience they have and with what tools etc)

- does the true information give you the wrong impression about the candidate i.e. you think someone with this degree or with that many years of experience should be capable of more than they are


Ok I agree with your point that "embellishment" may not be the correct term and I really meant the latter.

I 100% expect that someone with an advanced computer science degree can demonstrate they have problem solving ability. I don't really think that is an unreasonable assumption. I mean we're talking undergraduate level skills that are the bare minimum for an entry level position at my org.

If it really is the case that an advanced computer science degree does not convey computer science knowledge then it reinforces the need for screening questions.


> Ok I agree with your point that "embellishment" may not be the correct term and I really meant the latter.

There's no 'may' involved. It is not the correct term.

Here in the UK, embellishing your CV can be classed as 'fraud by false representation', which carries a maximum 10-year jail sentence.


The accusation of being “egotistical” is total projection. If you want people to do monkey tricks for you, the least you could do is allow them a bit of dignity by letting them speak for themselves.


If someone chooses to refuse a technical interview they did speak for themselves. Even more so if they, as OP did, explicitly said that it is "insulting" or otherwise beneath them.


Stats from the experts show that even great engineers fail technical interviews [1]. Anecdotally: I've bombed interviews myself! Sometimes due to anxiety (a real problem for some of us), sometimes because the problem was difficult. So false negatives are very common. Especially if you are interviewing 200 people - you are wasting SO MUCH TIME. I simply can't believe that you're actually getting that 1 in 200 dev. Why?

Because devs also pass interviews they have no business passing. You get a technical question that you already know the answer to and so you get through it easily (this has happened to me as well). I'd consider this a false positive. And today, I suspect this is just as likely as a false negative - we have many books on coding interview prep, probably hundreds of sites with coding problems to practice. Are you getting a good dev or are you getting someone that practiced 20 questions on the same subject last week so they happen to know exactly how to solve your interview problem?

And the worst types of false positives are not ineptitude, in my opinion. I can make a mediocre dev (or a dev who is bad at algorithms) into an excellent dev. But there are a lot of devs who are very smart, but very lazy or toxic. And those people will sometimes impress you the most in interviews... because they have to do it a lot because they don't last.

If anything, if the industry insists on technical coding challenges, we should stop the random bullshit and commit to a standardized exam and license. That's how other professions do it (and it works quite well with rare exceptions), and it would save software engineers a lot of time and headache, and make the hiring process smoother for everyone.

Experience should count the most. How recently have they been coding? Look for signals like: they got promoted. Listen to them talk about things they worked on and find out how much detail they can give on those things.

1. https://medium.com/free-code-camp/you-will-randomly-bomb-tec...


> Listen to them talk about things they worked on and find out how much detail they can give on those things.

This always stresses me out, not just in interviews (which are actually a bit easier since you try to prepare beforehand, although I guess the anxiety and stress can cancel this out anyway) but also in everyday chit-chat with other devs. I strongly suspect that I may have ADHD, and I'm just unable to randomly talk about things I've been doing even as recently as last week without preparation. I can recall a lot if I actually sit down and calmly trace back stuff I was doing years ago, but if I got surprised with a question like "you said in your CV that you used X at job A, what was the thing you liked the most about it?" it could take me the whole interview just to get into the mental state necessary to recall my experiences with X, let alone put them into proper words. I'll eventually find a perfect answer, but chances are high that it will happen in the evening, in the shower, accompanied by a huge regret titled "I should have said that back then" :P


Um, not to sound like an ass but how is being asked to dig into the things on your resume a surprise?


Every single thing on it? I've worked with dozens of technologies across multiple fields in my career and I'm still in my (late) twenties, so there's probably more to come. I don't think I could be well prepared for every single question about my past that could come up during an interview. I once got surprised with a question about "the story that led to" discovering a CVE that's attributed to me, for example. In an asynchronous chat that would be a very fun question to answer, but not in a real-time conversation. Playing a detective trying to reconstruct your own memories clue by clue isn't compatible with keeping the interview going.

Nevertheless, I can be prepared on what I was working on during my time at company X, but a question like "what I liked/disliked the most" could still put me off track if I haven't thought about it before. I just don't know without a longer analysis.


I've actually prepared a document with common interview questions / answers / examples of work to talk about, which I typically review before an interview and keep open in my browser while I'm doing video interviews. I think it helps, you might find it to be a useful exercise.


>Anecdotally: I've bombed interviews myself! Sometimes due to anxiety

I used to get pulled in on anxious candidates to help make a final decision. Interviews are rarely an environment where a person does their best work. Experienced interviewers know this.

Typically, I didn't make them code. Coding and anxiety don't mix well.

I usually approached the interview with roughly the following structure:

- Reduce anxiety

- Work on a problem (figure out how it is to work together)

- Reverse interview

To reduce anxiety, what has worked for me is basically making them feel in control. I first make them talk about themselves (i.e. something they know), and then I try to find a passion, something they feel strongly about, related to a project they did (hobby, uni, whatever).

Have them walk me through it (still something they know and control the narrative on, and also something they care about).

I hear actively and engage with them. I'm non judgmental and positive about whatever they tell me.

At this point most people feel at ease.

Then I present a simple, open ended design problem, no code required, but I may probe for specific knowledge here on DB, distributed systems, concurrency, probability, security, whatever I'm looking for. Usually as a discussion of some sub-topic in the design.

I make it clear at the beginning that I don't care much about the solution, and more about the process. Picture us working together on this problem; that is what I'm trying to figure out: how it is to work with you, and hopefully you'll figure out how it is to work with me.

Then I typically close with an interview reversal, I give them a chance to ask me anything and I answer truthfully so they can assess if they would really like to work here.

This has worked well for me, but I'm an extrovert, so the putting you at ease part is relatively easy for me (it's emotionally exhausting sometimes).

>Especially if you are interviewing 200 people - you are wasting SO MUCH TIME. I simply can't believe that you're actually getting that 1 in 200 dev. Why?

Most of the 200 don't pass basic screening, 30 will do a simple code screening, maybe 10 will get to an actual interview. Then one or two of those will get hired.

>But there are a lot of devs who are very smart, but very lazy or toxic.

We care a lot about this, we've passed on really smart candidates with lots of experience because we thought:

- they wouldn't be happy working with us

- they were hard to work with

BTW we measure post-interview experience, whether you got hired or not, to improve our interview process and assess interviewer performance; we might pull somebody from the interviewer pool if they perform poorly at it.

>a standardized exam and license

I really don't think it will change anything. You have a CS degree, I will still make you code a bit.

In any case, hiring requires many signals. Coding is just one, there are many others at play that can make or break a hire.

Also, the company moment matters: are we risk averse at this point? A startup usually would rather lose a good candidate than hire a bad one, larger companies can take more risk, and give you time to prove yourself.


We've talked about this so many times on HN, and I've weighed in on this so many times. But it's an important enough topic that we should have the discussion again.

You left undefined what you mean by "some sort of basic coding test". I have been screened out of jobs based on questions like "find all matching subtrees in a binary tree", or "find all sub matrices with positive determinants in an NXM matrix" (at the whiteboard, in 45 minutes).

I don't think this is as much about critical thinking as you do. What limited feedback I got (from a recruiter, never the person who did the test) was "analysed the problem and sketched the solution well, but didn't make enough coding progress."

Again, this was 45 minutes at the whiteboard in a day of getting shuttled from room to room for 5 hours of this.

At least you didn't mention "programmers who can't write fizz buzz." I have never heard of anyone getting screened out because of fizz buzz. You may have seen it. But based on my experience, the coding problems are way, way harder than fizz buzz.

Maybe you're doing things differently. If so, that's great, but I don't believe it's the norm at all.

I'm also totally ok with you giving a difficult technical test that will weed out many competent programmers, as long as you don't pretend it isn't.

I'll finish the way I always do - the alleged "shortage" of software developers in Silicon Valley is a product of rational market behavior among educated people who are free to choose a career according to market signals and their own personal interests. Re-taking your second-year undergraduate data structures and algorithms exam is about as appetizing as retaking organic chemistry for the rest of your life is for a physician, or retaking your partial differential equations exam is for an actuary. If free citizens don't want an employer's job in the numbers the employer thinks they should, even though the employer has decided it pays well, that's the market's answer. I see no need for the government to ask again on the employer's behalf.


>You left undefined what you mean by "some sort of basic coding test".

Yes, they never define it, because it's really hard work to clearly define where the bar sits in terms of a known coding problem (like fizz buzz).


A precise definition in the abstract would be difficult. However, examples are still very useful. Typically, there's an initial easier screening followed by a much harder coding test. Rather than discuss them in the abstract, why not provide clear examples of each?

For example, consider the difference between the three statements:

You'd be amazed with how many candidates can't actually program. Like at all.

vs.

You'd be amazed with how many supposedly 'senior' programmers can't pass a very simple programming test. And even those who do can't handle relatively elementary data structures and algorithms once they get to the second round.

vs.

You'd be amazed how many supposedly 'senior' programmers can't code up fizz buzz or sum odd numbers from 1 to 100 in a loop. And even the ones who do can't search a binary tree of integers to see if it contains a particular value at a whiteboard in 45 minutes.

The last one doesn't define what an easy and hard problem is in the abstract universal case, but it gives me a very good sense of who is and isn't failing these tests. Without that, the claim is relatively meaningless.
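For calibration, the "easy" problems in that last statement really are a few lines each. A Python sketch (the tree representation is my choice, and I'm reading "binary" as binary search tree):

```python
def fizz_buzz(n):
    """Return the fizz buzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

def sum_odds(lo=1, hi=100):
    """Sum the odd numbers from lo to hi inclusive, with a plain loop."""
    total = 0
    for i in range(lo, hi + 1):
        if i % 2 == 1:
            total += i
    return total

def bst_contains(node, value):
    """Search a binary search tree for value.
    A node is a (left, key, right) tuple; None is the empty tree."""
    while node is not None:
        left, key, right = node
        if value == key:
            return True
        node = left if value < key else right
    return False
```

If a claim about failure rates named concrete problems at this level, it would actually be falsifiable.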

Unfortunately, I actually think that many of these claims are deliberately left ambiguous. Many people who are claiming a shortage really don't want to be clear about why.


yeah, I just don't get why we have to make the coding exercises algorithmic.

I care more about, like, do I have to hold your hand or not? No, I don't? Okay, we are cool.


I'm not a fan of leetcode questions on a whiteboard, and I personally don't ask those. However, for an online automated assessment, provided the question is reasonably time-boxed and not too crazy, I think that's fine, on both sides of the interview.


The questions on the CodeWars site seem the best to me: there are no "gotcha" algorithmic questions (except the higher-level ones) and they allow for a good amount of problem solving.

The ones at LeetCode are asinine... oh, you don't remember dynamic programming? Tough luck. Even if you really won't use it at all.


It has to be simple, not too hard. That first test is basically a screening. Will I invest an engineer's hour to try to figure you out?

Then, if you get to a whiteboard challenge, the end result is usually not what matters; the process of problem solving is what you typically look for.

Can we communicate effectively? Will you thrive here?


I agree there are some very good fake programmers. I haven't interviewed as many programmers, more like tens of them, but I've had good luck discussing the issues I was having at work.

For example, I might tell them about a bug I am working on or even show code on my laptop. Good programmers actually wanted to solve the issue. Many solved or made useful suggestions.

Fake programmers usually had a pretty hard time, and even had trouble verbalizing questions to ask.

And sometimes I turn the tables and ask them what they are stuck on at work, and I try to help them. With good programmers we had good discussions and sometimes actually solved their issues. Bad programmers never had any issues.


> 200:1

Do you mean you hire 200 out of every 201 people? Or you hire 1 out of every 201 people?


Source 200, hire 1. Many don't even get to basic screening.



