On the surface, software engineering interviews look much the same at any company. You usually have a phone call with a recruiter, followed by either a short technical round or a take-home exercise. If you pass, you're invited to several rounds of on-site interviews focusing on your technical abilities and on behavioral cues that could predict your job performance and how well you'll get along with the team.

However, if you simply take this format, pick a set of technical questions for the specific role, and go with it, you may soon find yourself thinking about how to improve the process. You might find it slow and tedious for the interviewers; you could start seeing evaluations for the same candidate that differ widely depending on the interviewer; you could get feedback from candidates that the process requires too much of a time investment on their part.

In this post we'll summarize lessons shared on the official engineering blogs of several companies. The advice falls into three sections:

  1. Know what you are looking for
  2. Be ready to iterate
  3. Techniques to avoid bias

If you're curious and want to find more stories about engineering interviews and the hiring process, try Blogboard search.

Know what you are looking for

You want to hire an engineer, and you want to hire a great one. But the exact definition of "great" can depend on many factors, so a one-size-fits-all list of traits is unlikely to exist.

Are you a big company or a startup? What's the current state of the project and the team? Are you hiring for a junior or a senior role? These are some of the obvious things to consider when thinking about the best fit for the role. Once you're clear on this, you can work out the details and the exact qualities you're looking for in a candidate. Finally, all of this will determine what you screen for and what your interview scorecards need to look like.

In the Lyft Engineering blog post How Lyft Designs the Machine Learning Software Engineering Interview, Hao Yi Ong poses three questions to ask in order to understand what you actually want from the role you're hiring for:

1. What are Lyft’s challenges (and can a specific role help)?
2. What should the role be with respect to the organization’s goals?
3. What are the desired skills, knowledge, and talents given the expectations for the role?

Companies are more or less transparent about detailed requirements for their roles. The above-mentioned article describes Lyft's as follows:

Our desired talents are recurring patterns of thought, feeling, and behavior that can be productively applied in the context of Lyft’s ML SWE role. What we’re looking for here is a bit more complicated than simply work done in the past by a candidate. Faced with the same stimuli, people react and behave differently. When we look for role and values fit, we do mean just that. Beyond skills and knowledge, will a candidate’s unique way of responding to the problems thrown up in Lyft’s business context help that candidate succeed? So while conventional wisdom might suggest it, we’re not always looking for the Michael Jordans of machine learning (be it I. or J.). The narrow sort of talents associated with celebrated excellence can be important but in most cases the interviewers are listening for predictive clues of how a candidate will react when posed Lyft-specific problems on the job.

Speaking about high level qualities of successful engineers, here's what Ted Tomlinson of Databricks shares in his article Engineering Interviews — A Hiring Manager’s Guide to Standing Out:

At a startup like Databricks, the most important quality I’ve seen in successful engineers is ownership. We are growing quickly, which brings a lot of new challenges every week, but it’s not always clear how responsibilities divide across teams and priorities get determined. Great engineers handle this ambiguity by surfacing the most impactful problems to work on, not just those limited to their current team’s responsibilities. Sometimes this means directly helping to build the solution, but often it’s motivating others to prioritize the work.

The second quality we focus on, particularly for those earlier in their career, is the ability to learn and grow. The derivative of knowledge is often more important than a candidate’s current technical skills. Many of the engineering problems we are solving don’t have existing templates to follow. That means continually breaking through layers of abstraction to consider the larger system – from the lowest level of cpu instructions, up to how visualizations are rendered in the browser.

Going back to tailoring the requirements to the needs of your company, notice that Ted points out that ownership is the most important quality at a startup like Databricks, suggesting it wouldn't necessarily be equally important elsewhere.

Once you've figured out the qualities you're looking for, you can go deeper and break them down into fine-grained categories that you can actually screen for in an interview. Medium Engineering shares a great example of how this can be done. In Engineering interviews: what we screen for, Jamie Talbot explains that at Medium they look for three things: (1) Can they build software? (2) Can they learn and teach? (3) Are they aligned with our values?

Each of these high-level requirements is then broken down into six sub-categories. For example, the ability to build software covers the following:

  • Problem solving
  • Code fluency
  • Autonomy
  • Basic computer science knowledge
  • System design
  • Resoluteness

If you're curious, each capability is described in detail in the blog post. In addition, the team at Medium devised a detailed grading guideline for each category, helping interviewers decide on a scale of Strong No > No > Mixed > Yes > Strong Yes. They describe it in another article.
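To make the idea concrete, here's a minimal sketch of how such a scorecard could be encoded. The category names follow Medium's list above, and the five-point scale follows their labels, but the numeric mapping and the averaging are our own assumptions for illustration, not Medium's actual system:

```python
# Hypothetical scorecard sketch. The labels follow Medium's scale;
# the numeric weights and averaging are our own assumption.
RATINGS = {"Strong No": -2, "No": -1, "Mixed": 0, "Yes": 1, "Strong Yes": 2}

CATEGORIES = [
    "Problem solving", "Code fluency", "Autonomy",
    "Basic computer science knowledge", "System design", "Resoluteness",
]

def score_candidate(ratings: dict) -> float:
    """Average the per-category ratings into a single number."""
    missing = set(CATEGORIES) - set(ratings)
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    return sum(RATINGS[r] for r in ratings.values()) / len(CATEGORIES)

example = {
    "Problem solving": "Yes", "Code fluency": "Strong Yes",
    "Autonomy": "Mixed", "Basic computer science knowledge": "Yes",
    "System design": "No", "Resoluteness": "Yes",
}
print(round(score_candidate(example), 2))  # 0.67
```

The point of a rubric like this isn't the arithmetic; it's that every interviewer rates the same named categories on the same scale, which makes evaluations comparable across interviews.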

With any problem, hiring being no exception, it helps to know why you're doing it and what your environment and constraints are. Only then can you come up with creative and more effective solutions.

Speaking of constraints, they're likely to change over time, rendering your existing interview process inefficient.

Be ready to iterate

There are several reasons to stay flexible and be ready to adapt your technical interview process.

In the previous section we discussed tailoring your interviews to fit your company, project, and team needs. Inevitably, these things will change over time, and your interview process will likely need to change with them.

On that note, a recurring theme across company blogs is the redesign of the interview process. It usually happens because circumstances change, but often (as with any process) you can find room for improvement even when the environment hasn't changed much.

At Medium they noticed more than a few things that needed improvement. As elaborated in Engineering interviews: refining our process, they wanted to address a lack of clarity in capability requirements, inconsistencies in candidate evaluation, disagreement over which traits are more important and which less so, and how they approach personality traits and non-technical qualities.

At The New York Times, they recognized the need to standardize the hiring process across the company, seeing it as a key determining factor for the culture. One area where a consolidated interview process helped was trust in internal mobility: previously, due to a lack of trust among teams, an engineer could be required to pass a technical interview just to change teams. You can read about this on NYT Open, the official behind-the-scenes blog of The New York Times (namely in these two articles: How We Designed Our Front-End Engineer Hiring Process and How We Hire Front-End Engineers at The New York Times).

In Refactoring Backend Engineering Hiring at Slack, Slack engineers share the story of why and how they optimized their take-home exercise. Although it had many points in its favor, the exercise was a bottleneck in their hiring process. Candidates, wanting to show off the best of their skills, would take too long to complete it. Slack, on the other hand, was in a state of rapid growth, and the projected time for staffing all the necessary positions was simply too long:

The end result was that, by our estimates, it would have taken a year to fill our existing open headcount, future growth aside. This timeframe clearly would not allow us to grow at the speed we needed. However, we were also unwilling to sacrifice quality. We needed an approach that would give us good signal and help us hire great engineers, but at a reduced time cost to the candidate and to us.

To satisfy these needs, we decided to create two new take-home exercises: an API design exercise and a code review exercise. In creating these exercises, we sought to create a problem that was not an onerous time investment on the part of the candidate. We wanted something that would give us good signal on the attributes we cared about while taking at most two hours to complete.

Finally, the team at Slack came up with a new format for the challenge as well as internal apps and GitHub automations to streamline the process, resulting in significant measurable improvement:

In the end, we saw tangible improvements against our goals. We saw a decrease in our time-to-hire — the time from when a recruiter first reaches out, to the candidate’s first day in the office. The time-to-hire metric decreased from an average of 200 days to below 83 days — and it continues to drop. We’ve seen positive feedback from candidates and employees in all parts of the process.

At SoundCloud, the team tackled the same part of the interview funnel, the take-home exercise, after noticing it often took far too long to complete. As they point out in this article, there's a subtle reason to be careful about the time investment on the candidate's side:

Many great candidates have good jobs and busy personal lives. We want to talk to as many qualified candidates as possible, but to do that, we need to minimize the chances that our interview process itself gets in the way.

So not only does a tedious process slow down your hiring, it might also cause the best candidates to simply give up because they're too busy.

How to avoid bias and variance?

It's no secret that interviewers can easily fall victim to all sorts of biases. A good first impression can make you give a candidate's technical abilities a better score than you otherwise would.

In 7 Practical Ways to Reduce Bias in Your Hiring Process, Rebecca Knight outlines the ways in which bias can hurt your hiring and offers several ways to overcome this inherently human problem:

Unconscious biases have a critical and “problematic” effect on our judgment, says Francesca Gino, professor at Harvard Business School. “They cause us to make decisions in favor of one person or group to the detriment of others.” In the workplace, this “can stymie diversity, recruiting, promotion, and retention efforts.”

In more concrete terms, the Medium Engineering team shares its means of fighting bias by standardizing the hiring process. As described in the first section, they've laid out in detail all the qualities they're seeking, as well as those they don't find predictive of work performance, such as school, GPA, previous employers, and open source contributions. Not only are these poor predictors of performance, they're also the usual suspects in causing unconscious bias. Having identified these categories, interviewers avoid penalizing anyone based on them. You can read about this in Engineering interviews: what we don’t screen for.

Having clearly defined qualities and grading rubrics helps standardize evaluations and decisions across interviewers and candidates, which works against both bias and variance in the process. Simply put, eliminating bias ensures that a single interviewer will evaluate two candidates with the same skillset equally. Eliminating variance among interviewers, on the other hand, ensures that a candidate is evaluated the same regardless of who interviews them.
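The distinction borrows from statistics: bias is a systematic shift in one interviewer's scores, while variance is disagreement between interviewers on the same candidate. A minimal sketch of how you might monitor both from historical scorecards (the scores below are invented for illustration):

```python
from statistics import mean, pstdev

# Invented historical data: candidate -> {interviewer: score on a 1-5 scale}
scores = {
    "cand_a": {"alice": 4, "bob": 2},
    "cand_b": {"alice": 5, "bob": 3},
    "cand_c": {"alice": 3, "bob": 3},
}

# Bias check: does one interviewer systematically score higher or lower?
per_interviewer = {}
for ratings in scores.values():
    for interviewer, s in ratings.items():
        per_interviewer.setdefault(interviewer, []).append(s)
avg_score = {i: mean(v) for i, v in per_interviewer.items()}

# Variance check: how much do interviewers disagree on the same candidate?
disagreement = mean(pstdev(r.values()) for r in scores.values())

print(avg_score)     # here alice averages 4.0 vs. bob's 2.67
print(disagreement)  # average within-candidate spread
```

If one interviewer's average sits well above or below the rest, their calibration (bias) needs attention; if within-candidate spread is large, the rubric itself is probably too vague to apply consistently (variance).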

Writing about their take-home test in How to prepare for engineering interview assignments, Intercom engineers Lorcan Coyle and Alex Mooney point out that at this first stage of the technical interview they don't care about anything but the solution itself:

Unconscious bias is a well-researched problem in our field, and it’s important for reviewers to eliminate as many potential sources of bias as possible. It’s crucial to be clear about what we’re looking for when reviewing an interview assignment, and it’s just as important to know what we’re not looking for. When assessing a technical submission, we don’t care about:

- The candidate’s experience level.
- The position they are applying for.
- Their CV or professional history.

None of these details are relevant at this stage. All we assess is the take-home test itself – we only care about your code!

At Slack, they automate parts of the take-home task review so that a script converts a GitHub pull request into an anonymized markdown file, ensuring that graders are unaware of the candidate's identity on GitHub.
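Slack hasn't published the script itself, but the core idea, stripping identifying metadata from a submission before graders see it, can be sketched. Everything below is our own illustration (including the header patterns), not Slack's actual tooling:

```python
import re

# Hypothetical sketch of anonymizing a code submission before review.
# The header patterns are our assumption about where identity typically
# leaks in a git-formatted diff; Slack's real pipeline is not public.
IDENTITY_HEADERS = re.compile(r"^(From|Author|Date|Signed-off-by):")

def anonymize_diff(diff_text: str) -> str:
    """Strip authorship headers from a diff so graders see only the
    code changes, not who wrote them."""
    kept = [line for line in diff_text.splitlines()
            if not IDENTITY_HEADERS.match(line)]
    return "\n".join(kept)

raw = """Author: Jane Doe <jane@example.com>
Date: Mon Jan 1 2024
--- a/app.py
+++ b/app.py
@@ -1 +1 @@
-print('hi')
+print('hello')"""

print(anonymize_diff(raw))  # diff lines only, no Author/Date headers
```

A real pipeline would also pull the changed files via the GitHub API and scrub usernames from commit messages and file paths, but the principle is the same: remove every field the grader doesn't need.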

Finally, at The New York Times, the engineering team makes sure that at each step of the process a candidate is evaluated by multiple interviewers to prevent blind spots. They've also tried anonymized resume reviews, where you look at a resume with some fields removed, such as the candidate's identity and the names of their previous employers. They share their thoughts on this in How We Designed Our Front-End Engineer Hiring Process.

Dig deeper with Blogboard search:

Engineering interviews
Technical interviews
Hiring engineers

blogboard.io - Engineering blogs from top tech companies. Search, discover, follow.