Unfortunately, my experience with TripleByte was terrible and a waste of my time. I completed the interview with a very high evaluation across all categories as provided by the interviewer.
There were certain areas where the interviewer messed up in their evaluation, e.g. they felt I "was a little weak at hashmaps and API design", which is probably because they did not know what I was talking about when I described the details of advanced hashmap implementations. There seems to be a bias to discredit the interviewee if the interviewer lacks knowledge in an area.
Either way, despite getting great evaluations, I was matched with a total of 5 companies, most of which were highly underwhelming early-stage companies with minimal traction. Furthermore, I was matched with full-stack companies despite being evaluated as "weak in API design", which is perplexing. I was able to get higher-quality offers in my own search, and it seems like the TripleByte pipeline consists of many mediocre companies.
If I had known, I wouldn't have wasted my time with this service and would have invested that time in my own job search instead.
I hit a similar sticking point about hashmaps -- the interviewer seemed to be at a loss, and was asking odd questions with little connection or relevance to the implementation path I'd chosen -- I politely explained the confusion and we swiftly moved on. They then marked that as a weak/"fuzzy" spot in their evaluation. I made sure to give feedback about this to the person who shared the evaluation, but never received a response.
In the end, after making it to the company-matching phase, they found a whopping 1 company with <5 people, in an area I had clearly said I didn't plan on moving to. I don't know which part of the data-centric recruiting process got this so badly wrong.
Hoping that this process improves, but so far it hasn't lived up to expectations and I ended up finding multiple great matches on my own afterwards.
I ran the TripleByte course once, and it was 100% not worth my time as a professional.
The "coding test" was criminally simple, but got me in the door quickly for an interview. I spent a number of hours building out projects with a paired interviewer, as well as answering questions. This part I enjoyed, it felt like a nice back and forth while building an interesting bit of software. It was like an open discussion, and getting to tap away on my laptop was such an enjoyable time.
Then came the technical questions. The interviewer asked if I knew anything about a specific Technology X. I'd list the tech here by name, but it's so specific I'm afraid of it being linked back to me. It's not something most engineers would run into.
I responded with "I have not worked with that, though I've heard of it"; it was also never listed on my resume or part of my professional work. The interviewer went ahead and simplified it down for us to discuss, along the lines of "Ok, well it works like this, so let's chat abstractly". I went along with the discussion since I figured it would be fine to chat abstractly about a technology I'd never worked with.
The interview concluded, we parted ways and I thought things went very well.
A few days later I received an e-mail from Triplebyte. They praised my clean code and thought process, but specifically said my weakness in said Technology X (which, I'll call out again, I never worked with professionally nor listed as a skill on my resume) was too much to consider me for the next round.
TripleByte literally evaluated, and discounted me, on not knowing an uncommon bit of technology. Just what the hell.
I was shocked at the level of failure it took to reach this point. It was unfair to use that as any benchmark, and unfair to waste a day of my time doing it. It was a smack in the face to an industry vet like myself.
I tell all job seekers to stay far, far away from TripleByte for this reason. They're not really changing the game at all, but like to pretend they are the magic answer.
One footnote: I'm an engineer at one of the giants (Google, Amazon, Apple, Microsoft, etc) who was and is way more qualified than anything TripleByte was or is pushing out.
The giants fail people for the same reasons as well. I've learned over time that qualifications matter far far less than interview skills at software companies, and even a maximally qualified person only has a 60-70% chance of passing any arbitrary interview.
That first sentence, if true, already narrows the search space drastically for anyone who wants to find you, especially given the length of your comment.
Echoing a similar bad experience, although this might be their customer's (Mixpanel's) fault. I applied to Mixpanel and they sent me a Triplebyte quiz. They bugged me a few times to fill it out and then promised to give me feedback "very quickly". I never heard back (I applied several months ago), despite sending 3-4 follow-up emails.
I'm not sure if the fault lies with Triplebyte or Mixpanel, but it was an overall shitty experience. To be asked to take time out of your day to complete a quiz (which can be automatically scored by an algorithm, since it's multiple choice) and then get radio silence is terrible. Even if it was Mixpanel's fault, they should ensure their partners actually follow up with applicants. For example, I know that Hired (another platform) will actually ding an employer that doesn't respond to applicants.
Anyways, Mixpanel & Triplebyte are probably on my "never apply to again ever" list.
It may well have been Mixpanel. I applied to them and they sent me a non-Triplebyte quiz before having any human contact with me (not even a personal email). It was horribly designed; all edge-case language trivia and multiple choice design questions. Needless to say I didn't pass, and I will never go near them again.
About a week ago, I did some A/B testing: I completed their online screener twice, once answering every question as well as I could, and once answering every question with the third option (it's multiple choice). In both cases, I passed the screener. (shrugs)
What was always most interesting to us about starting a recruiting company was seeing what would happen if you treated hiring as a data problem. Partly we've raised more funding for the same reason any startup does, so we can grow faster to get more customers = more revenue = more success, etc. But we're also driven by how the larger the scale we operate at, the faster we can run experiments to answer questions about the best way to evaluate technical skills. More rigorous and data focused approaches to hiring benefit everyone.
Interviewing and evaluating engineers is an area a lot of people feel passionately about and have strong opinions on. We're continually looking for ways to improve our process, if you've any thoughts or feedback please ping me - harj at triplebyte.
Wow. Is it just me or does that sound like a downright horrifying company to deal with as a candidate?
At our company, we try to painstakingly craft our recruiting experience to make sure each candidate we interview has a good experience and ends up with a positive impression of our company regardless of whether or not we end up sending them an offer. At the end of the day, we're all human beings, each with something unique to bring to the table, even if that something might not be what we're looking for for a particular role at the moment.
Maybe past some scale we'll have to start changing our approach and start reducing candidates down to data points and "running experiments" on them like lab rats, as Google, et al., and these guys here seem to be so proud of doing, but I'd sooner quit than stay part of a company that does that.
For giggles (I work on OS kernels and high performance TCP, there's no way this kind of meatgrinder would ever place me correctly), I did the online quiz thing and got to the interview piece just to see why this was generating buzz. The quiz was easy, and when I saw the interview prep I was disgusted. It's gamified hiring practice done in an outsourced fashion... the kind where you're supposed to read a book that tells you what kind of questions a hiring party will ask you. I guess if you're a lousy engineering manager and want to offload and emulate the hiring process of something like Google, this is up your alley. If you want relatively fungible developers to work on CRUD applications it may be fine, but I don't think that's hard to hire for in any case.
Engineering managers MUST do recruiting - never let HR take this from you. HR can do clerical work, but they should have little role in search and outreach. Instead: find new talent at local universities; for mid-level talent, give recruiter-like bonuses for employee referrals and poach from competitors; and for extraordinary talent, look at the commit logs of the open source software you use and hire people from that list. Easy, cheap, and effective.
I mean sure, so would I if I had to choose between the two, but that's a false dilemma.
And for me at least, a company that doesn't treat their candidates with kindness and respect in the hiring process likely wouldn't be a long term fit for me anyways.
I haven't been interviewed by you, but the initial process is stellar! Applying and going through the code challenges was very smooth. Plus, the fact that an applicant who passes is pretty much guaranteed some form of an interview is great.
Just wanted to say thank you! I used Triplebyte to find the company I'm currently working for, and I've had a wonderful time here so far. The process took a little longer than I had anticipated, but it was well worth it :)
The crux of this problem in my opinion is that long term results aren't tracked.
It's hard enough internally to track someone's performance over the course of the year or two after they get hired, it would be even harder to do it if you are a recruiting company.
It's especially sensitive because employers are wary of sharing employee performance data with third parties because of the high risk of a lawsuit (there is clear precedent for such lawsuits).
Once that data problem is bridged, it blows the problem right open for data to be explored and figure out what exactly predicts a top performer, in any field.
>Once that data problem is bridged, it blows the problem right open for data to be explored and figure out what exactly predicts a top performer, in any field.
Yes, assuming there is some top-level "data problem" to actually bridge here...
How do we know that the concept of "top performer" isn't just a completely divergent idea that means different things to different people and different companies in different industries and different geographic areas?
I'm inclined to believe that measuring performance is fairly subjective. However, in an attempt to put some numbers behind a performance score, you might have some combination of individual weighted scores that involve things like:
(1) length of time employed at the company
(2) some measure of the performance feedback the engineer receives in their annual review - which may include percentage salary increase (possibly subjective, though)
(3) a score compiled by surveying the employees' peers (again, possibly subjective)
(4) the overall TripleByte turnover rate at said company
It highly depends on how you measure value. If you measure value in terms of what they bring to the overall organization, people from top 10 schools are very likely overvalued. If you measure what value they bring to the team, top 10 schools are probably correctly valued. The difference is that the manager of the team hiring from Harvard can speak to people in more senior roles about how great their team is and how they have the best talent.
At the end of the day, value is driven by how much political leverage a hire can give a manager, not by how much value they add to the org.
Yeah I believe Triplebyte will only accept candidates from countries that have an easy way to get a work visa in the US - Canada, Mexico, Australia and Singapore (I think). (I'm Canadian and am working at my current company through Triplebyte)
That's weird, I am also Canadian and my profile says that they can only proceed with applications from people who hold a US visa and/or work permit. I thought it was a mistake, so I contacted their support team and got a reply saying that they can only consider my application if I live in the US.
Did you change your location to some city in the US?
How did you get your application accepted?
On the same note, Hired.com which seems to be a direct competitor, also has restrictions on the locations available for their recruitment process. In Canada, for example, they only accept people living in Toronto. Or people lying about their current location because they are planning to move there.
Hi,
I just went through the interview questions; they were fun (they said that I did exceptionally well, but gave me nothing concrete, like a percentage). I'm not looking for a job, just thought I'd try it out.
What I was interested in is whether the questions get harder, if I answer well, but they seemed random.
You could use logistic regression to estimate the level of an interviewee and adjust the questions, reaching the same accuracy in less time (or improving accuracy with the same number of questions/time).
So you're right that the quiz does try to be harder if you're doing well, but it'll also give you easier questions if an incorrect answer lowers its confidence in your ability estimate. We have a pretty sizeable bank of potential questions to ask a candidate, but the quiz tries to strike an optimal balance between appropriate difficulty and maximum informativeness. For example, we wouldn't want to ask you a particularly difficult question unless we're confident that it's a) a good fit for your estimated ability level, and b) will give us more information about your ability than any other question in the bank.
You're right that tailoring question difficulty to ability level can drastically increase a test's accuracy. But while a logistic regression model works well when you have a fixed quiz or a low number of questions, it isn't flexible enough to work with a fully adaptive system like we have at Triplebyte. Our models are loosely based on the kinds of systems that the GMAT or GRE use, but we've implemented significant extensions on top of those approaches to fit our needs.
Thanks for the answer.
When I was implementing a language learning program (who hasn't? :) ), logistic regression worked quite well to adaptively find a user's vocabulary level within the 10,000 most frequently used words in about 10 questions (I re-ran a full logistic regression on the user dataset after each new data point, mapping word frequency rank to the user's estimated level), and the questions just felt right. So I'm not talking about a multiple logistic regression model, just one variable, which works when you have lots of questions (as long as question hardness is well calibrated).
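For the curious, here's a minimal sketch of that kind of single-variable adaptive estimation, using a Rasch-style one-parameter logistic model. The function names and the grid-search fit are my own illustration, not Triplebyte's actual system:

```python
import math

def rasch_prob(theta, difficulty):
    """Probability of a correct answer under a 1-parameter logistic (Rasch) model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_ability(responses, lo=-6.0, hi=6.0, steps=200):
    """Grid-search maximum-likelihood estimate of ability theta,
    given a list of (item_difficulty, answered_correctly) pairs."""
    best_theta, best_ll = lo, float("-inf")
    for i in range(steps + 1):
        theta = lo + (hi - lo) * i / steps
        ll = 0.0
        for difficulty, correct in responses:
            p = rasch_prob(theta, difficulty)
            ll += math.log(p if correct else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

def next_item(theta, bank, asked):
    """Pick the unasked item whose difficulty is closest to the current
    ability estimate -- for the Rasch model this also maximizes the
    Fisher information of the next question."""
    candidates = [b for b in bank if b not in asked]
    return min(candidates, key=lambda b: abs(b - theta))
```

After each answer you would re-run `estimate_ability` on the responses so far and feed the result to `next_item`, which is essentially the "re-fit after each data point" loop described above.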
Although I'm happy that you're trying to predict the most informative question, for me some questions near the end felt trivial, so either my feeling wasn't right about the hardness of a question, or the algorithm has lots of space to improve, or the question hardness levels weren't calibrated optimally.
Anyways, congrats on your startup's success (I just hope you prioritize people who don't have a U.S. visa)!
Yeah, the GRE does that. They use your answers for a sort of binary search: they keep giving you harder (or easier) questions until you're answering ~50% of them correctly.
My interactions with Triplebyte have been less than good and honestly am concerned about the company as a whole (perhaps my thoughts are misplaced).
In a prior conversation on HN (link below), I brought up some aspects of my interview (interviewer late, argumentative, smug, etc.). Then the interviewer came on to HN and PUBLICLY SHARED PORTIONS OF MY INTERVIEW. Honestly, he should have been fired on the spot, but nope.
To the interviewer's credit, after my comment sat at the top for most of the day, he deleted that portion of his comment. I am grateful (looking back now) that it was removed, but I think it speaks volumes.
My two cents, is the idea is good - there is some room for improvement. What's scary is putting one company as a wall between you and the employer. I hope it never comes to pass where they control even 5% of the market. No one should be able to interview better than the company itself and employees shouldn't use a service which upon being declined blocks them from other companies. I don't believe that's the case (yet), so no qualms for the time being.
Given my experience, I hope they've improved and would happily change my view if I had reason to.
It's really easy to game Triplebyte. It's probably really easy to game all the online interview platforms. Maybe it's hard to game Interviewing.io, because they have you chat with an actual person.
For Triplebyte they have a two hour interview with a real engineer who has passed their own interview. Even if you cheat on the live interview, you still have to do an on-site at the company that might hire you, so it’s not a huge benefit to cheat.
1. If you are out of town and trying to relocate to SF, they handle all the scheduling, travel, lodging, and transportation for you.
2. Skips past the resume filter, HR phone screen, and technical phone screen. After you've gone through the TripleByte technical interview, you skip directly to on-site final interviews at the companies. If your resume is non-traditional (self taught, bootcamp, etc) this is pretty significant.
You don’t get skipped past phone screens, you still have to do those.
I don’t think you skip resume filter either. They claim they don’t share the resume, but on multiple occasions I had someone from the company magically view my LinkedIn soon after I was paired with them.
It’s not difficult for them to find you, there’s not really much anonymity built into TripleByte after you’re accepted.
You do get to skip remote technical screens, which is the primary benefit imo. That might be the only benefit to the job seeker, actually.
Using a burner because I don't want to tie this to my company.
You can send your CV to $bigcorp directly, but $bigcorp is inundated with CVs, and many CVs have a casual relationship with reality.
I do a lot of interviews on the hiring side. I look at a lot of resumes. And other than "worked in a similar position at a big-5 tech company or well regarded unicorn," nothing on a resume provides much signal. There are a handful of universities that make me pay attention, and a few particular programs outside that handful, but even that is weak signal. A candidate saying "expert in [whatever]" is useless.
I spend about 10% of my work week doing technical phone screens. This comes after a recruiter reads a candidate's resume, talks to them on the phone about their experience, and decides they're a plausible candidate. I get on the phone with them, we fire up coderpad, and I ask them to start coding. Nothing requiring exotic data structures or algorithms, just a straightforward "make a class with a couple of properties" type thing. Something along the lines of:
Part 1: Make a class that represents a playing card. It should have a suit (clubs, diamonds, hearts, or spades), and a rank (two through ten, jack, queen, king, or ace). Make a way to print out the card as a string. And make a comparison function that can tell whether two cards are the same (same suit and same rank).
Part 2: Make a deck of all 52 possible cards.
Part 3: Make a 5-card hand by picking 5 random cards from the deck.
and one or two more similarly straightforward parts after that.
There are a few places where a candidate can show off (e.g. override the default string representation for the class, anticipate that we might want a comparator that does more than check for identical cards, use a special data structure, or use an optimized algorithm for picking 5 random cards), but none of that is necessary to pass the interview. More than half the candidates I talk to can't get through part 1. Half of the remainder can't get through part 3. Most people who get through part 3 move on to an onsite interview.
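For reference, here's a quick Python sketch of what a passing answer to parts 1 through 3 might look like (the dataclass approach and naming are my own choices, not the interviewer's expected solution):

```python
import random
from dataclasses import dataclass

SUITS = ("clubs", "diamonds", "hearts", "spades")
RANKS = ("2", "3", "4", "5", "6", "7", "8", "9", "10",
         "jack", "queen", "king", "ace")

@dataclass(frozen=True)
class Card:
    suit: str
    rank: str

    def __str__(self):
        # Part 1: print the card as a string, e.g. "ace of spades"
        return f"{self.rank} of {self.suit}"

# Part 1 (comparison): the frozen dataclass auto-generates __eq__ and
# __hash__, so two cards compare equal exactly when suit and rank match.

def make_deck():
    # Part 2: all 52 possible suit/rank combinations
    return [Card(suit, rank) for suit in SUITS for rank in RANKS]

def deal_hand(deck, size=5):
    # Part 3: pick 5 distinct random cards from the deck
    return random.sample(deck, size)
```

Nothing exotic: a class with two properties, a string method, equality, a list comprehension, and a standard-library sampling call. Which is exactly the point of the screen.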
70 or 80 percent of candidates who say on their CV that they can code, and have convinced a recruiter that they can code, can't write a class and a couple of functions. So a lot of companies do the easy thing, and just toss out (or at least don't fast track) all the resumes that don't include particular schools or companies, or come from referrals. And in the process, companies lose out on tons of qualified candidates, and those candidates lose out on jobs.
If Triplebyte can reliably identify candidates who can pass a phone screen, when I'm sorting through resumes I'd be happy to treat a Triplebyte stamp of approval the same way I'd treat a degree from a top school.
> 70 or 80 percent of candidates who say on their CV that they can code, and have convinced a recruiter that they can code, can't write a class and a couple of functions.
I work for a large non-Big-5 and our failure rate at that level is far lower. Is this only or predominantly for juniors?
70 or 80 percent of candidates who say on their CV that they can code, and have convinced a recruiter that they can code, can't write a class and a couple of functions.
Any other hiring managers here see this? It just seems hard to believe. Are you sure it's not the case that they can code just fine, but can't balance a red-black tree in 5 minutes in a language that was never mentioned as a requirement, with a stopwatch shoved in their face, while jumping through rings of fire to the background noise of people sighing disapprovingly?
The number is honestly not surprising, particularly if it's a genuinely large and well known tech company.
When we've put out the call for Senior Developers experienced with language X (pick any: C#, Javascript, Objective-C), we get people who can't even write a basic hello-world statement being sent through by recruiters.
Years ago we built a basic coding test which is effectively "Here's a class, now modify it according to these sets of requirements" - something that anyone modestly competent in those languages can do in 20 minutes at most. We deliberately avoided writing trick questions, and tried to write it in clear basic language to avoid any issues with language barriers.
Applicants are asked to either do it at home, or if they don't have the ability to do so, they can come into our offices and sit in a meeting room with a laptop to complete it.
Keep in mind, this is for people who've already submitted a CV (directly, or through a recruiter) and report they have years of experience with these languages.
We've had people write back saying that it was too difficult, others who submit complete garbage that not only doesn't compile/run, but doesn't even have vaguely correct syntax for the language.
We had someone who took up the offer to do it in our offices, who we had to kick out after they sat there from about lunchtime until we were closing the offices at 7pm, and all they had done was copy down our model class from the questionnaire (incorrectly) and write some comments about how they might implement it.
I thought that kind of overstating of skill was limited to just development roles, but having seen the quality of the people applying for Senior-level roles as DBAs, Sysadmins, and BI folks - it's all terrible.
$50m in job offers last 6 months at $130,000 average salary, with 70% of job offers being accepted/signed = 267 people got hired in the past 180 days after completing roughly 30,000 interviews (based on the 5000/month quoted in the article).
That means the chance of being hired after doing a TripleByte interview is slightly under 1%, if my back-of-the-napkin calculation is accurate.
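Spelling out that back-of-the-napkin math (the figures are the ones quoted in the article and parent comment; rounding accounts for the small difference from the 267 above):

```python
# Figures quoted in the article / parent comment
total_offer_value = 50_000_000   # dollars in job offers over ~6 months
average_salary = 130_000         # dollars, average offer
acceptance_rate = 0.70           # fraction of offers accepted/signed
interviews = 5_000 * 6           # ~5000 interviews/month over 6 months

offers = total_offer_value / average_salary   # offers extended, ~385
hires = offers * acceptance_rate              # people hired, ~269
hire_rate = hires / interviews                # chance an interview leads to a hire

print(f"offers ~ {offers:.0f}, hires ~ {hires:.0f}, hire rate ~ {hire_rate:.1%}")
```

The hire rate comes out just under 1%, consistent with the estimate above.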
I took the backend test expecting programming and SQL, but it was hard-core DevOps and SysAdmin. Had nothing to do with programming a backend system. I know a little about working with command line and admin, but using tools to orchestrate 100 servers isn't something I know about.
My impression is that it's designed with a very high failure rate built in, targeting people who have a very specific, easy-to-hire background. I'm not really sure someone at that talent level would really need Triplebyte.
AFAIK when most recruiters talk about placement rate, they're not talking about "engineers who we worked with who happened to get jobs," they're talking about "engineers who we placed and received payment for the placement." While Triplebyte hasn't explicitly stated that distinction here, I think a good faith reading of that phrase would suggest that the 40% is actually people who are getting jobs through Triplebyte.
Things may have changed but when I did it I think ~2 years ago the projects and in person bit were much harder than the initial quiz, I'd expect the drop off to be higher.
Edit: I admit I didn't pass the in-person part, and I have some issues with things I think were held against me when it was really confusion about what they were looking for, but overall I thought it was a great process and have recommended it to people.
Only 267 people in 180 days? Sounds low. I guess it might not be based on the number of interviews as I am not sure how selective the hiring companies are being.
My experience with triplebyte was positive, but the interviews with the companies I matched with were not.
The first thing every company asked for was my resume, clearly they had not bought into the triplebyte process. Some seemed entirely unfamiliar with triplebyte.
Interviewing can be a sad process. Triplebyte gave me a taste for what things could be like, but didn't give me any advantage in the application process.
The companies triplebyte matched me with resulted in some of my worst interview experiences. Think disinterested ceos, hostile line of questioning, and a focus on my previous job experience vs things I would have liked to talk about (open source, personal projects)
As a job seeker, how is TripleByte different than the other companies that spam my email inbox with their own "exclusive" coding tests? It seems to me that when you peel back the fancy website, TripleByte is not functionally different from the hiring agencies on other job boards (cough cough, Dice) that would advertise an "exciting opportunity" with their nameless client in order to hook you into signing a contract with them. ("Hey, while you're not qualified for this role, we have others that come by our desk constantly. So how about letting us sell you to other companies?")
EDIT: Their website layout is a classic agency layout.
> header with giant "sign up" button
> "top tech companies" in big print as a selling point
> huge section with the most "famous" companies in their client pool
> free cost (you're the product they're selling, so they're not looking out for a best fit - they're looking to get paid for placing you)
> testimonials
> blogroll that reads like it was built solely for SEO
Do programmers love numbers and algorithms so much they want to be reduced to them? Programmers are people and should be treated as such.
Also, shouldn't we be concerned that giving one company's algorithms control over who gets hired will be too much power in too few hands? And algorithms are not neutral. The people that make the algorithms have biases and discriminations just like regular people do but at least if your company does its own hiring you can work on figuring out what those are and how to address them. How can you do that if you depend on some proprietary algorithm?
And what about disabilities? How does your algorithm handle those? Racial bias? So many unanswerable questions.
I have many issues with the way most companies interview but giving up that process to a proprietary algorithm seems like the worst solution. This is not news to be celebrated.
This quote from the TechCrunch article was to me the most striking element of this story. They already yield double the rate of good candidates:
""The metric that companies care most about is what percentage of on-site interviews convert into hires, and the industry standard is 20 percent. Triplebyte’s placement rate is 40 percent," says Taggar."
That's nice, but averages aside, if fewer than 50% of your on-site (or final-round) candidates get an offer, your process is very likely broken somewhere upstream and you have work to do on your interviewing and sourcing funnel.
I wonder if these numbers are comparable. For the 40% to be true, their candidates would have to interview at a maximum of 2.5 of their clients on average. That doesn't seem to fit with their model.
This is great news. Hiring needs to be better solved.
The current screening process provides a low signal of competence, and so companies have to rely more on credentials (degrees, previous company brands) during screening, which means that a lot of skilled people still can't get their feet in the door at companies if they don’t “look right”, and companies fight over a restricted talent pool.
Lack of hiring data for smaller companies means they copy larger companies’ interview processes, but there’s no strong forcing function to drive innovation in larger companies’ hiring processes (i.e. their success could be despite a bad interviewing process - because they have a brand and offer a lot of perks, hence attracting the best talent, and so they aren’t in a “we have to fix hiring or we will die” mode).
This also really hurts startups - who aren’t in positions to take risks with hiring, and with a lack of good evaluations, have to rely on credentials, which restricts their pool, and makes them compete with the big cos for that talent.
Another important implication of fixing hiring is that it will introduce a powerful forcing function on higher education institutions. If students know that they can get jobs without having “traditional” credentials, but if they can pass, say TripleByte’s, or some other company’s, assessment which is more aligned with what’s required on the job, and is a signal that companies believe in, then students can use money that they would have spent on college to instead actually learn the skills that would be useful on the job.
This movement of money out of higher education, would fund a lot more experiments in learning and education.
I can’t stress enough how important I think this problem is to solve, and I’m glad companies like TripleByte and interviewing.io are working on it. We need more companies, more approaches, more experiments in this space.
> We started Triplebyte because we were frustrated by the noise present in every step of the hiring process.
This is largely just a software/technology problem. In all other professional industries there are means to validate a candidate's competency before they are allowed to interview for a position: licensing, required internships, legal certifications/authorizations, authorized relationships, and so forth.
Technology doesn't have this. The big difference is that in those other professions they are using the interview to actually interview the candidate, as in the person. In software and technology the entire interview is used to gauge basic competency and even then the trust relationship is inherently broken.
Contrary to what technologists will tell you the problem isn't the hiring process or low salaries (preposterous answer unless you live in the bay area). These are symptoms of a broken trust relationship. Hiring companies inherently do not trust the people they are interviewing as basically competent unless they have been told otherwise by somebody they know personally.
Hiring companies shouldn't trust that a candidate is minimally competent, because there is no means of establishing a standard baseline against which competency is measured. That is the primary problem. Solve this problem, and the resulting symptoms are easily addressed by the marketplace as a matter of economics.
---
The problem is very clear to see when you have two simultaneous careers: one as a software developer and a different one in an unrelated industry that has professionally addressed these concerns with required professional education and accreditation/licensing.
I believe that this problem is far, far, far more pervasive than just a software/technology problem.
> there are means to validate a candidate's competency before they are allowed to interview for a position: licensing, required internships, legal certifications/authorizations, authorized relationships, and so forth.
The problems with credentials that you mention:
1. They are often weak signals of actual competence, and in the case that they are decent, there is still a lot of room for improvement through experimenting via a data driven process (current credentialing is, in many cases, outdated, and doesn't map to what actual work is like).
2. They are not accessible by everyone. This is problematic as the means to learning is becoming more accessible (through online education, etc.), but the credentialing is still restricted - since the institutions that hand them out haven't scaled credentialing. There is a lot of opportunity to provide signal for competence that scales... and measures skill that is actually used on the job (which is also changing as technology matures and penetrates other industries - we'll need a credentialing system that can adjust to those changes quickly).
In fact, I'll go as far as to say that this is a bigger problem in non-software industries. At least in software, there is a more objective way to measure a candidate's competence independent of the path they took to gain that competence. This means that people who might not have had a formal education / credentialing have a sliver of an opportunity to prove their skill. In other industries, if you don't have the credentialing, you have no shot.
> They are often weak signals of actual competence
I disagree. They are weak at separating the top 10% from the rest of the qualified people, but they are excellent at removing the people who have no business being there in the first place.
The first two that come to mind for most people are law and medical licenses. These licenses don't exist as a job qualifier; they exist as a legal qualifier. That means gross abuse of the license requirements is cause for lawsuits and serious criminal offenses, even though most lawyers and doctors are corporate employees.
If programmers had the realization that gross negligence could land them in jail or cause them to lose their career and property in a lawsuit, I suspect they would take their jobs more seriously than merely writing code.
Programmers don't just write code, just as doctors don't just prescribe painkillers and soldiers don't just shoot people. They make numerous critical decisions that have real-world implications. Examples of gross failure are simplistic, well-known security breaches that allow the theft of millions of credit card numbers and PII. Other examples include discriminatory and accessibility-violating software products.
These are basic foundational qualities of competence. In any other industry, negligence of this magnitude would put you in prison. Since the baseline is so ridiculously low for hiring developers, these are considered advanced qualities, often outsourced to third-party firms and only after threats of pending legal action. All we care about when hiring developers is whether they are literate and have a pulse.
Be serious, no change to any hiring process will fix that.
> They are not accessible by everyone.
Don't care. If a person wants access to a given career, they will find a way through their own internal motivation. If the industry wants to make careers more accessible, it will promote a desirable education path. This isn't some secret legendary arcane black magic.
Law and medicine are not relevant examples at all.
In both cases, licensing did not arise because of hiring issues, but rather because both law and medicine directly involve human lives and livelihoods.
Any field where this can be the case on a day-to-day basis ends up having strict licensing and/or training requirements. Examples include civil engineers, pilots, soldiers, sea captains, heavy machinery operators, and so on.
The guys who work at Lockheed Martin programming vehicular data sensors, or who work for medical device manufacturers on MRI machine interfaces - they aren't directly involved in human lives and livelihoods? What about the massive security failures at Equifax? That's not putting people at risk?
Once again, these kinds of industries adapt to the kind of work they do. There are numerous bureaucratic and access-control procedures in place at defense contractors. The FDA has a ton of requirements for anyone working on medical devices.
Equifax was a monumental shitshow. I think SSNs are also a shitshow, but I digress.
> but rather because both law and medicine directly involve human lives and livelihoods.
How does that not describe software? When was the last day you were completely without software? Software powers all manner of our gasoline vehicles and the various traffic signals we encounter. It powers many hygiene products and kitchen utilities. Soon all of these will be part of the internet of things, if they aren't already.
> Any field where this can be the case on a day-to-day basis ends up having strict licensing and/or training requirements.
It seems like you're not understanding my point at all.
> Software powers all manners of our gasoline vehicles and the various traffic signals we encounter.
The automotive sector has its own complex procedures and policies in place for working on vehicular control software. You and I would likely not even be able to land an interview for an automotive firmware design position without prior experience and/or certification.
In many cases, there are entire programming standards that dictate how such systems need to be written.
In other words, the software that runs inside your vehicle is nothing like the software powering our favorite websites.
> It powers many hygiene products and kitchen utilities.
The chance of a kitchen appliance endangering a human is much lower than a car going haywire or a doctor making a mistake due to inadequate training.
> The chance of a kitchen appliance endangering a human is much lower than a car going haywire or a doctor making a mistake due to inadequate training.
The chance of any device on the internet of things leaking your personal habits online is not low. Furthermore, the risk of death or injury from electrical devices is low only because of regulation imposed on such devices before commercial software was ever imagined. I would also say the hiring and performance of truck drivers is far more regulated than that of the firmware developers who write the code that powers that very truck.
> the software that runs inside your vehicle is nothing like the software powering our favorite websites.
So website software doesn't need to be written by competent people with regard for your privacy or security? Is insecure online software not harmful? Are credit card data breaches not harmful?
So lawyers shouldn't have to be licensed or certified? They cannot take a kidney from you. Neither can truck drivers, real estate agents, or police officers.
That tech is open to people of all backgrounds, even the self-taught, is awesome, but it has its drawbacks: it's harder for the interviewer.
I disagree that there are objective measures and the vast amount of e-ink spilled on interviewing practice debates is proof of that (tech interviewing also doesn't correlate to the actual work being done!).
I'm on the "tech should be open to all" side but it _is_ harder for companies to filter out potential bad candidates. That's why they make up their own filters like "we only want seniors".
> the vast amount of e-ink spilled on interviewing practice debates is proof of that (tech interviewing also doesn't correlate to the actual work being done!)
You're right that CURRENT interview screening practices are far removed from actual work, but that will change. I don't think that it's proof that there aren't better assessments (which should be aligned with how work is actually done on the job). If anything, this is an opportunity to experiment more, and find better assessments.
> I disagree that there are objective measures...
I agree that there aren't fully objective measures, and there may never be. What I do believe is that you can get SOME signal about someone's competence at building something right now from, say, their code, which is a more objective, impartial signal than what happens in other industries, which rely mostly on behavioral interviews (where a lot of bias can creep in).
On the contrary, I think software/technology is a field where objective assessment of candidate's competency is relatively easy, and the professions requiring accreditation/licensing are relatively rare.
Do you think requiring those credentials would benefit software industry as well? Is that enough for top companies to base their hiring decisions on?
> and the professions requiring accreditation/licensing are relatively rare.
I am thinking that is every professional career other than software. Truck drivers without any education are substantially more regulated than a software developer writing life saving applications in an MRI machine.
> Hiring companies inherently do not trust the people they are interviewing as basically competent unless they have been told otherwise by somebody they know personally.
Is this why it's so hard to find work by applying through a job portal?
I decided to comment after seeing a number of negative comments here. I went through the process and it was a positive experience for me. I ended up interviewing at 5-6 places and didn't receive an offer. I liked getting the interview feedback. The time saved in skipping the usual application process seemed worthwhile to me (you spend 2(?) hours on the Triplebyte interview, then a short introductory call with each company you are matched with, and then the onsite interview). My only complaint was that I was looking for larger companies and my matches were all 5-50 employee companies. I guess not many large firms are using them. Overall I would recommend Triplebyte for anyone who is interested in startups and who is currently in a full-time job search.
"Evaluating" 5000 engineers a month almost seems low to me - they've been going full bore with advertising and have been at the top of Reddit for the last month or two (and if I remember correctly I think I've seen them on Twitter and FB). With this much spend I would have thought they'd have more candidates. Maybe that's a lot!
Google says there are 3,500,000 software engineers in the USA. If they are doing 5,000 a month, that means they are interviewing about 2% of all the engineers in the US each year. That's not totally accurate because it doesn't account for those breaking into the field or if international candidates are going through the process. Either way, that's a lot of candidates.
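The parent's ~2% figure can be checked with quick back-of-envelope arithmetic (a sketch; both inputs are the commenter's own estimates, not official statistics):

```python
# Rough check of the numbers cited in the thread.
engineers_in_us = 3_500_000     # Google's figure cited above
interviews_per_month = 5_000    # Triplebyte's claimed rate

interviews_per_year = interviews_per_month * 12   # 60,000
share = interviews_per_year / engineers_in_us     # ~0.017
print(f"{share:.1%} of US engineers per year")    # prints "1.7% ..."
```

As the parent notes, this ignores people entering the field and any international candidates, but the order of magnitude holds.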
Define "software engineer"? I know, for example, the US gov puts basically all tech jobs in the same bucket of IT, lumping hardcore low-level assembly programmers in with people who set up Office on business machines.
I would speculate the total number of people the average HN reader would consider a software engineer is much lower.
I get that way with any ad I see or hear too much. The more I experience your advertisement, the less satisfied with your company or product I am going to become.
The only advertisements I see on Reddit anymore are from triplebyte. I'm not even looking for a job (still a student), and I won't be for at least another 9 months.
Further down the comment thread, the interviewer came on and shared information about my interview (essentially because I called them out for being rude and smug - which IMO they were).
I had a great experience with Triplebyte as a candidate (now hired employee), but I'd love to see them expand to other categories. Remote-friendly companies is a big one, but even more important is different skill levels. Right now triplebyte is oriented around finding the best, instead of finding everyone and helping employers get a candidate whose career path matches their needs.
Their process is fantastic. I can see them replacing first round interviews entirely at some companies if they can look for all the candidates companies need, not just the most senior.
Not nearly enough people in the hiring chain understand the importance of this.
Whether an employee can get along with co-workers is probably by far the most important metric in most jobs, yet it's mostly ignored because it's hard to test for. I guess companies hope they'll figure out if someone's a bad fit while they're still in a probationary period or something.
The most skilled worker in their field is useless if they can't cooperate and communicate effectively with others.
With the loyalty, willingness to learn and work ethic you can usually expect from an initially lower-skilled employee (along with lower wages/cost), a little training could turn them into a hugely valuable team member in fairly short order if a company makes an effort to ramp them up properly. Giving more of these people a shot will dramatically increase your odds of finding team members who work together well and become greater than the sum of their parts.
This idea of ignoring everyone who doesn't fit a ridiculously narrow criteria causes a whole lot of missed opportunities across the board. You end up hiring a bunch of elites-on-paper all trying to outmaneuver each other into the most possible money who will be gone in 2-3 years, while 'less attractive' candidates who would actually care about the work and tend to stick around get tossed in the garbage without even being seen.
I went through the process & got some attractive offers. I had mixed thoughts overall. Initially, they told me the benefit was I could avoid whiteboarding & on-site coding challenges, however all the companies had their own whiteboarding or coding challenges. The hotel they put me in was in the Tenderloin, a bad part of town. I had lunch with Triplebyte & they were all very nice. Overall it was a positive experience, so much so that I did not submit my urban fee/Uber costs for reimbursement, seeing as I got offers on the upper end of their range & still turned them down. Going through their process has easily made me 2x as strong of a developer & helped me recognize my own worth.
It might mean something like the test predicts better than a randomly-selected interview whether a candidate will eventually be hired. The real question is whether the interview here is an unstructured interview or a structured interview. Industrial psychologists have been pointing out for decades that unstructured interviews are awful, even though everyone keeps doing them, so it wouldn't be surprising if they can be outperformed by even a simple checklist or test. (There are similar, somewhat infamous, results for things like judges and parole - where simple linear models can outpredict the experts.)
I'm one of the co-founders at Triplebyte. What this means is that our statistical model now significantly outperforms our own interviewers (and me) at performing a structured interview, and then making a judgment about whether a candidate will or will not go on to do well at interviews at other companies. It is slightly meta, because we're using an interview to predict other interviews. But I find it very exciting, because a) it means we're more accurate, and b) we know exactly what features we're feeding into our model, and can avoid a lot of the bias in the process.
> does the test successfully identify the people who have the best portfolios of things they’ve built previously
Even with an algorithm doing the identification, any data produced is still going to be highly subjective (based on the "best portfolio" ideas of TripleByte and whoever worked on the algorithm).
Anyone care to comment on how the whole Triplebyte process goes as a person who wants to be hired? I'm interested since I never got past the set of questions due to some kind of bug in the beta.
I went through the process about 2 months ago, and I really enjoyed it.
After the initial quiz, I had a ~2 hour interview with a human, which included a 1h "pair programming" challenge, random technical questions in my field, general CS questions, and architecture (system design) type questions.
Once I passed that step, my talent manager (the person who helps facilitate the discussion with the companies) told me, IIRC, that about 1 in 5 passes the human interview.
Based on my skill set and preferences, the system "offered" some 30-something companies, and I chose to have an introductory call with ~10 of them. Each company has some background information, what they're good for (in TripleByte's opinion), their general size and their engineering size. Some companies (the bigger names) have additional steps before the on-site, like another pair-programming session or take-home exercises.
From those calls, 5 on-site interviews stemmed, and 4 of those resulted in an offer.
TripleByte also helped arrange the on-sites all in the same week, so if you're remote you don't have to fly back and forth all the time.
I'll answer, made a throwaway because I don't want to be identified. I did the quiz, moved straight to remote Triplebyte interview. Then to on-sites with 3-4 companies. Got along great with everyone, rocked many of the algorithm and system design questions (and some not). Got zero offers.
I am a swe with 2-3 years of experience, maybe four depending on how you count experience, speak at lots of large conferences, contribute to open source, etc. I feel (and have data to back this up) that Triplebyte sold me as a 5-7 years of experience person, so I only got interviews with companies looking for senior people. I really got along great with all the companies, and I have historically been a good judge of how my interviews have gone, but I think Triplebyte overselling me essentially caused me to waste a week of my life. Feel free to handwave and say I am blaming Triplebyte for my failures (of which I have many), but I do think I would have had 1-2 offers otherwise. Although to be fair, the 130-140k comp most companies mentioned would be a significant pay cut and may not have been doable for me even if I had received an offer.
I had a similar experience, but, having interviewed with several companies outside of Triplebyte, I'm not too upset about it. The old saw about companies "looking for someone with 10 years experience in Go" applies. In the end it seems like the only way I can get a job is the really old-fashioned way (via my network of friends).
Around the middle of 2017 I interviewed with them to be a part time interviewer. It was something they promoted on HN for a while.
They put me through the standard interview process along with additional discussion about the interviewer position.
Overall they're much better than most startups but have room for improvement.
The good:
They kept me well-informed throughout the process and set proper expectations. They were prompt with followups and stuck to the schedule they set. My interviewers were knowledgeable, clearly software engineers.
The bad:
Too much focus on algos and CS fundamentals, not enough on higher level concepts and what makes someone a good fit for an available position.
I understand it's really hard to have a generic evaluation that covers multiple potential roles. That said, I believe that these questions do not select for good employees for most roles. They are biased to select for recent college grads and people who are willing to study before interviewing.
For example, they asked me about bloom filters. 95%+ of startup software jobs will never have to deal with bloom filters. Why would you ask about them? Ask something that will actually be encountered on the job.
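For readers unfamiliar with the data structure in question: a Bloom filter is a compact probabilistic set that answers "definitely not present" or "probably present". A minimal sketch (just an illustration, not Triplebyte's actual question):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-bit array.
    Lookups may give false positives but never false negatives."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = 0  # a big int doubles as the bit array

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

bf = BloomFilter()
bf.add("alice")
print("alice" in bf)    # True (never a false negative)
print("mallory" in bf)  # almost certainly False
```

They show up in caches, databases, and spell checkers, but the commenter's point stands: most startup jobs never touch one.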
To add a little background I was responsible for our hiring process in my last position, including interviewing, so I'm a bit opinionated about this.
I had a great experience, even as someone primarily working in embedded development. They only targeted SF & NY when I went through their process, which wound up unexpectedly being a dealbreaker for me, but if it weren't for that I absolutely would've taken one of their offers.
I have a very similar situation. I started taking pitch calls and then ran the numbers on what it would cost our family to move to San Francisco / Silicon Valley, and it didn't work out. But their process was fantastic, and if my situation is different in the future I'd absolutely use them again.
Edit: By the way, you should email them about the bug with your profile, they'd probably want to know about it / give you another shot.
Same here. The bay area has now priced out professionals with families. I was looking at $40-$60k more per year for housing depending on where in the bay area I would live and I'm currently making over $120k...
I ended up not taking a job through them (I did the math and relocating to SF meant I would have a longer commute and only slightly more pay after housing my family of 6), but the process was great.
After the initial test, there is a fairly long phone/remote desktop interview that consists of:
* Writing code (on your actual machine with the tools and language of your choice!) to solve a simple problem.
* Debugging (a smallish program with 5 failed unit tests)
* General knowledge questions (databases, web (both html and http), data-structures algorithms)
The phone interview then ended with them giving you a couple of tips on answering the non-technical interview topics that a lot of engineers flub (why do you want to work here, when can you start, compensation).
The next day I got a list of over a dozen positions with the recommendation that I pick at least 5 to move on to phone screening.
The phone screenings went well (3 of them were just variants of "all the candidates Triplebyte has sent us were great, so we just want to talk about our company"). This was also my first hint that compensation would be an issue; one company was immediately ruled out because they were early stage and I can't pay a mortgage with equity.
Then triplebyte scheduled the on-sites all in the same week so that I wouldn't have to go back-and-forth to the bay area.
Ultimately Apple was the only company on my list paying enough to get me to relocate, and they passed on me.
I did the Triplebyte interview around six months ago. It was pretty pleasant. (I used Triplebyte because I'm self taught, without a college degree.) I really like the company I ended up getting hired at. I only interviewed at my current company, though, so it's very possible I just got lucky.
I find interviewing unpleasant. Triplebyte's interview wasn't an exception, but it was less unpleasant than most. Even though they chose not to move forward with me, I thought the feedback was valuable and I don't regret having gone through the process.
I passed the quiz and moved on to the phone interview.
Overall I don't think it was a good experience for me.
The interview was pleasant but long. Since they are more of a recruiting firm than anything, they are able to ask questions in a way that an organization looking to hire won't. It allows the candidate to be more candid and detailed with their knowledge and expectations. I thought it went well enough but I did not proceed to the company matching phase.
It is _very_ clear that they are looking for a specific type of developer - a type which I reckon probably doesn't need Triplebyte in the first place. I am not a web/rails/whatever developer, nor am I a senior engineer with a very niche skillset like low-level systems, etc.
The feedback I got was mostly positive, but contradicted itself in odd ways (pro: "we like your DB skills" / con: "work more with DBs") and really just translated to "you aren't marketable to startups and don't have the pedigree / experience needed to throw at our larger clients."
I suppose if you have a popular or incredibly niche skillset that is in demand, but are having trouble getting the attention of companies for whatever reason (it happens), TripleByte is a decent shot. But if you have more general experience / a more common skill-set, it's probably less useful.
Ultimately they are a business and they have to operate this way, so I understand being turned down. I was, however, disappointed that I was 'let down' in a way by the vision I was pitched of what TripleByte claims to do / be.
I was interviewed by them. Best process ever. Easy, straightforward, had options but I wasn't really interested in working for other people anyway. They let me code in ClojureScript for the snake game if memory serves. Also, this was many years ago when they initially started out so I am assuming they only got better.
I agree, I was one of their early people (they even met me on a holiday weekend at their offices across from the stadium as that was the only time I could fit in for their "long form" interview). Was best process I've ever been through.
My only critique is that it seems a bit too front-end oriented these days for someone like me who is basically a deep backend person. They did have long-form infrastructure questions, but as I get asked to take their prototype tests every so often, it seems many of the new tests are just front-end oriented (i.e. language types). I'm guessing this is because that's where the market has taken them.
I worked with them roughly a year ago, and it was incredibly straightforward and enjoyable! I went through the online test and two video conference interviews before being matched with a generous list of companies that matched my preferences, both small and large. The people I worked with (shoutout to Michelle and Buck) also kept in constant contact throughout the entire process, ensuring that things were going smoothly from the company-side after matching. All in all, I heartily recommend Triplebyte to anyone who's looking to streamline their search process, and to see how much better the interview procedure can become.
Took the quiz just over a year ago, and passed the phone interview. I stated my preference to avoid very small / early stage startups, and got interviews at Asana, Apple, and a medium-sized startup. Accepted the Asana offer, where I have been working for a year.
I think Triplebyte was really useful for me because I didn't have a CS degree and didn't have any work history directly in tech, so I couldn't get through resume screens via direct applications and didn't really know many people who could refer me. So Triplebyte had high value in getting me to actual technical interviews. I'm not sure how valuable it would be for others who don't have trouble with this, but given the minimal time investment (a few hours for the phone interview), it's probably worth a shot regardless.
Disclaimer: I am building a startup in this problem space.
Does anyone think that social proof could work here? If 15 peers endorse Sally for React Native and those 15 people are likewise found to be credible, could such a network effect be more valuable than a coding test?
Obtaining endorsements is a different skill from RN. Those on LinkedIn with all the endorsements are not the ones doing the work but the social operators who value networks and symbols more.
I 100% agree about LinkedIn endorsements. But those seem pushed on users without their consent in order to drive engagement metrics. I often get garbage endorsements for vague skills like "mobile development".
But what if you could self-identify your skills (Erlang, Vue.js, etc) and there was a low-friction way for your peers to +1 those skills?
Late reply, but I think it could be exceedingly valuable, though outside of the highest-quality endorsers you'll get a lot of noise. There are a lot of engineering organizations that put out bad code.
I'd consider recommendations from CTOs or former Senior or Principal engineers to be the #1 indication that the person can do what they say they can do. Verify a bit more, but honestly the college-level micro-optimization whiteboard-only BS questions are mainly only useful for checking new grads.
Do they have lots of jobs on offer?
I passed the quiz and was accepted after the technical interview in late 2017 (I must say the interview was very well done and the feedback was very constructive), but they had 2 matches for me in NYC, of which neither progressed to a company interview. I heard of others who had a similar experience.
I got my current job with Triplebyte. I’m incredibly happy with how it all worked out and couldn’t recommend it enough.
I got to interview with some pretty exciting/interesting companies.
The only problem with Triplebyte, in my opinion, is that I don’t think they track job success AFTER the hire. I imagine this is probably a problem they’re working on. But it’s hard to build a successful recruiting company if you don’t know what happens to the employees once they actually get hired.
I hope we move towards recruiting being a culture/attitude problem and not really a skills data problem.
Most jobs are repetitive and most engineers adapt. So what's important is identifying culture and attitude fit.
The problem with raising the bar for interviewing engineers is that the work they end up doing isn't moving at the same pace.
With more frameworks, better languages, and open source, building stuff is getting easier.
While I can totally understand the need for this by hiring managers who need to get people onboard without wasting too much time, I am a little bit skeptical of being reduced to a simple number in some sort of a machine learning algorithm. Especially if this algorithm is then being used by half the companies in my area.
I think hiring is a difficult process because we need to work with others and people are different in general.
I have personally worked with people who started programming just because they were interested in it - they had no knowledge of algorithmic complexity but they were very open-minded, had a great perspective on the domain and were a pleasure to work with.
This is very anecdotal of course but I sincerely hope that they would have been able to make it past the online quiz...
(If you're thinking they should be smart enough to be able to game the quiz, then my question would be - why not just screen everyone in person then? Of course, that's not scalable and not worth the 50 million then...)
Thinking about it, I think I understand their insight now. The idea is that, even if you got Triplebyte-d, the startups will still pass you through their own interviewing process - let's be realistic for a sec. So Triplebyte is not about getting you hired, I mean, not directly. It's about something else.
Thing is, a startup can't do like Amazon and actually dive into every random applicant. It's too much work. So a startup is limited and can only use recommendations to even consider interviewing someone.
Now, what if a company would do the grunt work, select a few of those random applicants, and submit them to the companies? That would bring a shitload of value, because now startups would have a new source of relevant applications to tap into.
I'll bite: Isn't using scientific methods in HR similar to stock trading? I.e. you need to predict how well a given person would fit within a company; you can't really capture significant soft abilities like who-knows-whom, which might have significantly bigger impact on profitability of the project than any individual/technical contribution if you purely optimize for profit. You also need to take into account company's strategy, environment that is changing etc.
Alright :) But it won't help you detect emerging trends until it's already too late, and in hiring you'd likely cut away people like Steve Jobs if you measured them scientifically. But if your goal is predictability, hiring replaceable corporate drones, then maybe...
So I just tried to signup as an engineer and it says... "Unfortunately we're currently only able to work with people with legal status permitting employment in the US. We hope in the future to help set up visa sponsorships." So looks like it's only for US companies at the moment.
>We'll also be expanding to support engineers and companies in new locations.
This is the most exciting thing to me. I would love to use Triplebyte to try to find a position, but relocating is just not an option for me right now.
When she joined Yahoo in 2012, Microsoft had been willing to pay $51 billion for Yahoo; that's how high the valuation was. But after she ran it for 4 years, the valuation reached a low of $5 billion. In 2016 she said she would turn Yahoo around in 3 years, and very soon Yahoo was sold; she misled a lot of stockholders with that statement. Also, the blame for a sinking ship goes to the captain, just as the credit for the loot of a pirate ship does.
If she didn't run Yahoo into the ground, then who did?
I do actually want to give credit to this point. I have no strong feelings at all about Marissa/Yahoo, but if you join a company as CEO and it loses 90% of its value, then from a business perspective, at the very least, you failed.
If you know the ship is sinking, then you should sell it right away (at $50B), invest the proceeds in index funds, and end up with +7% each year instead of -90% over 4 years.