In this episode of Alooba's Objective Hiring podcast, Tim interviews Chris Byington, Head of Analytics at Superhuman. Chris shares deep insights into the key attributes he looks for in data hires, emphasizing the importance of the right motivation and coachability over technical skills. He explores how rapid advancements in AI are reshaping roles within data teams, predicting an increase in strategic problem-solving and stakeholder engagement. Chris also discusses the evolution of technical roles, the impact of AI on hiring processes, and methods to make hiring more objective and fair. Additionally, he offers tactical advice on how data leaders can align their work more closely with business goals.
TIM: We are live with Chris. Chris, welcome to the show. Thank you so much for joining us.
CHRIS: Thanks for having me.
TIM: It is absolutely a pleasure to have you on at the start of 2025. I'm pumped to have a conversation with you. Where I'd love to start is just to get your thoughts on your data hires, your data team: what are you looking for now in candidates? How are you thinking about the development of technology? AI is changing things so quickly. Is that changing how you think about the scope of roles? And more broadly, how do you think about hiring someone for now versus what they're going to be doing in one or two years' time?
CHRIS: Yeah, definitely. There are a couple of points in there that I think are all really important. The first one you asked was: what do I look for in candidates? Obviously there's a long list of things I look for, but there are two that really float to the top of the list and are in a different tier; they're more like meta things. The first one is the right kind of motivation. I think it's very hard to coach motivation, to convince someone to want something that they don't want. So I look for people who really want to have a positive impact on the business, who are more focused on helping the business be successful as opposed to their own personal gain, as painful as it is to say, or anything else like that, and who want to make the team successful. If you ask questions that have the goal of learning that about candidates, I think it's pretty straightforward to learn whether they have that motivation. I find that those are the people who come batteries included, who will go out and proactively do the stuff with stakeholders, do the coaching of the less senior team members, et cetera, and who just make your job a lot easier and create the most value. So: the right kind of motivation. That's the first thing I look for. It's a meta skill; SQL is not on the list, for example, and Python's not on the list. The second thing I look for is also a meta point, which is coachability. That gets a little bit into the point you were making about how you hire someone for six months from now, twelve months from now, in terms of their own personal growth and the growth of technology. I'm a big fan of growth mindset.
That's not an uncommon thing, of course, but I think the people who join a team and can grow with the team through the growth phases of the company and the data science team are the people who are hungry for feedback, who seek it out, and who have a goal of making themselves better than they were yesterday, better than they were a month ago, a year ago, rather than people who have a more pointed and rigid view of who they are and what they want to accomplish. So if I find someone who really wants to prioritize the business above everything else and make the team successful, that's motivation. And then someone who's curious to learn, who has curiosity rather than a scarcity or fixed mindset. Those people are, number one, surprisingly hard to find, even though those things sound uncontroversial. And number two, those people tend to succeed really well in any kind of role; I don't think it's specific to data. Those are the things I have in mind. That was your first question. The second topic, I think, was around the depth of AI usage on data teams. Did I get that right?
TIM: Well, you sort of touched on part of it already, which is that we're in this state of rapid change. It feels to me like if you were to sit down and think about at least the technical skill set of an analyst, you could argue that in a year's time the idea of writing SQL from scratch, for example, might no longer be relevant; you're going to be prompting Claude or ChatGPT, arguably, to do it. So in your thought process around what you're looking for, does the rapid change of AI impact that? It certainly wouldn't with the first two things you mentioned; motivation and coachability have nothing to do with the progress of AI, maybe. But is there anything in how LLMs and AI are developing that changes what you look for now compared to, let's say, if you were hiring three years ago?
CHRIS: I do think that generative AI tools for data analysis are improving at a very fast rate, to the point where, like you said, within one to two years I think they will likely be able to write the queries or do the analysis of a low- to mid-senior-level data scientist. The framing I have of that is that I would expect the types of problems and questions and projects that people on data teams work on will, on average, go up in altitude and go up in the level of strategy. The reason is that you're spending less time writing the SQL or the Python. If you give ChatGPT a file and have it summarize some stuff, it'll actually give you the Python back, which is amazing. It can do that today if you know the exact prompts to give it, and I've been playing with that a lot; it's been pretty helpful. But it is nuanced how to do the prompting, and I don't think it's productized for data analysis. Claude is further along, but it's still not there in the way that BI tools are there today; not with AI, just BI tools as an interface for a user. So data scientists will spend a larger share of their time on framing the question and partnering more deeply with stakeholders, which, in my experience and opinion of the industry, is arguably more important. I would argue it's more important than writing great SQL: are you asking the right questions of the data? And then when you get the answer from the report, the most important thing in my experience, the biggest breaking point, is whether it gets acted on and whether it has an impact on the company.
And so I see those things I just mentioned, the question framing as well as the actionability, increasing as a share of data teams' energy, which I think is a great thing. It's really exciting, and I'd argue it's more fulfilling work, at least, as a result of LLMs. What I don't believe is that LLMs are at the point of, or close to, being able to execute that whole value chain, from asking the question to doing the analysis to acting on the data, which would be a true agentic data scientist, so to speak. I don't think we're there. I don't think we're close. But replacing that middle step of doing the reporting? Yeah, I do see it in the next year or two.
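As a concrete illustration of what Chris describes, the Python an LLM hands back for a "summarize this file" prompt is typically a small script like the stdlib-only sketch below. The file contents and column names here are invented for illustration, not taken from the conversation:

```python
import csv
import io
import statistics

def summarize_numeric_column(csv_text: str, column: str) -> dict:
    """Basic summary stats for one numeric column of a CSV: the kind of
    snippet an LLM typically returns when asked to summarize a file."""
    rows = csv.DictReader(io.StringIO(csv_text))
    values = [float(r[column]) for r in rows if r[column] != ""]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "min": min(values),
        "max": max(values),
    }

# Tiny inline file standing in for an uploaded one.
sample = "user_id,revenue\n1,10\n2,30\n3,20\n"
summary = summarize_numeric_column(sample, "revenue")
print(summary)  # {'count': 3, 'mean': 20.0, 'median': 20.0, 'min': 10.0, 'max': 30.0}
```

Writing this by hand is the "middle step" Chris expects to be automated; framing which column matters and acting on the numbers is the part that stays with the analyst.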
TIM: Is there something to be said for... well, let me frame it this way. I'm caught between two minds when I think about this myself. Part of me thinks that, for the average data analyst or data scientist, the blend of their skills, soft versus technical, is maybe getting slightly softer. Because if we get to a point in a year where, let's say, the coding is done by an AI, then you're left with more time to interface with the business, to ask the right questions, to interrogate things, to get the action that actually drives value, et cetera. So part of me thinks there's a higher soft-skill component. But on the other side, I feel almost the exact opposite: if you really know AI well and you have strong programming skills, that could be an explosive skill set that might be worth investing even more time in. So I'm caught between the yin and the yang of these two opposing forces. How do you think about this? If you were, let's say, a junior analyst now, would you focus on your soft skills as much as possible, or on AI knowledge as much as possible?
CHRIS: Yeah, I think it probably depends on the types of problems you want to solve for the business. If you're in the realm of business intelligence, product analytics, product data science, the more descriptive work where you're making recommendations for what product to build or how to engage with customers, those I see following the form of: allow AI to take over more and more of your technical work so that you can spend time on the high-impact business stuff. That's one side of the coin. On the other side of the coin, if you're more in the realm of a machine learning engineer, someone building a deeper set of technology where there's an automated system of intelligence that does churn prediction or determines which users should receive which email at what time, if you're building production machine learning models, then I could definitely see an argument for going deeper on the technology, and specifically the statistics, and working alongside an AI copilot (copilot is a decent term for it) that would let you do that work much more quickly while staying at the technical altitude. So I can see it both ways, like you're saying.
TIM: Is there something to be said, if we think a step further ahead in the pipeline, about who becomes a data scientist or a data analyst? Will there almost be a change in who's attracted to these roles if, let's say, a subset of them suddenly becomes a lot less technical? Where instead of spending, I don't know, 70 percent of your time writing SQL and building reports, that component is automated and you're doing a lot more business-interfacing work, and suddenly the source of candidates might end up changing a little bit, or we have to rethink the early stages a bit more?
CHRIS: Yeah, I can see that transition happening, I would say in a minor way, or maybe a moderate way. The analog I would give is that 20 years ago, and still today in some areas, there was a job function called a DBA, a database administrator. They were doing the low-level work, like creating indexes on a Postgres database, or at that time not Postgres but MySQL, Oracle, whatever it was. They were really fine-tuning the database to make sure it ran properly. In my opinion, that role has evolved today into data engineers, who are more about architecting the platform and the system. The reason is that the low-level hooks, like indexes and partitions and stuff like that, have for the most part been abstracted away from the user; they've essentially been automated by the technology, more or less. You could draw an analog with some of the SQL or Python coding that you're describing. So I do see that changing: the people who would be attracted to the more detailed SQL and Python work (maybe analytics engineer is a better example of the role type) may no longer be interested in data science. But what I see as the majority of the draw for business-facing data scientists, business intelligence people, et cetera, is the idea of: I want to be in a job where I can learn about the business, where I can offer suggestions to the business about what's happening, what happened yesterday; where I'm building a visualization to help us understand how we're performing; and where I can also be a trusted advisor to the executive team, to my stakeholder, et cetera, on what they should do based on how the business is performing. That, for me, is the core draw, and I don't see that going away. So it's a little bit of both, but mostly I think the curiosity-driven person, someone who loves looking at numbers and trends, is here to stay.
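The low-level tuning Chris attributes to the DBA era can be sketched with Python's built-in sqlite3 module. The table, column, and index names below are invented for illustration; the point is only the before/after effect of hand-adding an index, the kind of detail modern platforms increasingly handle for you:

```python
import sqlite3

# In-memory database standing in for a production system a DBA would tune.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click", f"2025-01-{i % 28 + 1:02d}") for i in range(1000)],
)

# Without an index, filtering on user_id is a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

# The kind of hand-tuning a DBA did: add an index on the hot column.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

print(plan_before)  # plan shows a scan of events
print(plan_after)   # plan shows a search using idx_events_user
```

Managed warehouses and tools like dbt sit above this layer, which is why the role has shifted from tuning individual indexes to architecting the platform.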
TIM: Yeah, exactly. So for these folks, for someone who's maybe been software engineering or coding for 20 years and considers themselves a coder ("my job is to code" is a formative part of their identity), and suddenly this is being radically changed, potentially taken away from them in some sense: is there almost a mindset shift that they would need to go through? If you were interviewing someone like this, what would you be looking for in them, to see that they could progress and keep up with the changes to their role?
CHRIS: Yeah, I do think that's a risk, especially for some people, though probably not for everyone. I think there are a lot of different framings, depending on the person, on what someone gets energy from and what they find compelling. Part of being a manager is understanding that really deeply. If you don't understand that about the folks on your team, then it becomes challenging to incentivize them, because you don't know what they care about. So I think having an explicit conversation, almost like an intake, is what I typically do: an hour or two of "Hey, tell me about projects that you loved; tell me about projects that you didn't love. What's the difference between those two?" Try to tease out some of those patterns. Who was the best manager you ever had? What did they do that was so great? Often it facilitates the person learning something about themselves, so doing that with them, I think, is important. Through that, what I've typically found is that in the new paradigm of "this is what the business needs, these are the new tools available to us," you can find the place where the magic happens, where you put together what the business needs and what the person finds really exciting, even if it isn't exactly what they've done before. An example that has already happened really successfully, in my mind, is the rise of analytics engineering as a function. We didn't really have analytics engineering before dbt, and it's only been, what, six or eight years since analytics engineering has been a thing. Before that, it was just data analysts building tables, right? And now they have a whole set of tools.
It's something that a lot of people, including myself, are really, really passionate about. It is a role evolution, but it's for the better. And I think that comes back to the points I was making about coachability and having the right motivation. If, as a person, you say, "Okay, I'm a software engineer and my job is to provide value for the company, and I know that's going to look different in different environments," and if you're coachable, it means you're not rigid about using the tools you've always used and you're open to learning new skills. That may be AI, especially when it can help you be 20, 30, 50, a hundred percent more productive in terms of what you're able to ship at a certain quality bar in a given amount of time. So I think you can have it both ways.
TIM: You'd mentioned really digging in and trying to learn these motivations in, I guess, the onboarding process, really delving in there and asking those questions. But of course, by that point you've already made the hiring decision. I guess you have to do almost a light version of that in the interview process. How do you really evaluate someone's motivation and coachability when you have maybe a few hours with them to interview? What are you focusing on? Is it an interview, or some other evaluation method? How do you get to the bottom of that?
CHRIS: There's not a perfect way to do it; you're not doing it with perfect confidence. You're stopping the A/B test at, you know, 90 percent statistical significance instead of a hundred. It's hard to get to a hundred, like you said, in a few hours. During the interview, it's at a much higher altitude. It's: hey, do they want to do the right thing for the company, or are they out for themselves? And then for coachability, it's understanding how they take feedback. Do they, ideally, seek it out? Do they have a demonstrated history of acting on feedback from other people in order to improve themselves? I think that's the altitude that's okay to be at. The intake discussion is more about the factual details: "Hey, what types of projects do you prefer to work on?" You're a few levels deeper, so that as the manager, over the coming months, as you see projects come up, you can say, "Oh, this project would be good for this person." Just for example, we had someone at Superhuman, which is where I'm the head of analytics, who was working as a data scientist but wanted to transition into an analytics engineering role. Having known that, we were able to orchestrate it, because it was something the team needed and it was also something this woman was interested in. Had we not known that, we wouldn't have been able to make that connection. But more specifically, going back to the interview process: I just try to ask really pointed questions that don't leave too much room for the person to say something different from the actual truth.
And so for coachability, the question I ask is: tell me about a time that you made a mistake. Tell me what caused it, how you determined that it was a miss, and what you changed to improve and stop it from happening the next time. The person's body language, their reaction to that question, whether they lean into it and say, "Yes, I have so many examples; I love making mistakes because it means I can learn more in the future": I think it's relatively straightforward to see if they're being genuine. Do they take accountability? Do they say, "That was a miss for me; it's not a reflection on me as a person, but rather an opportunity to improve"? I myself, Chris, as I sit here, not the interviewee, have made so many mistakes over the course of my career, a hundred percent, and I'm really happy to have made them, because they helped me get to where I am today. I don't see that as a personality flaw; I see it as just part of being a human being. Can they show, "Hey, I made a mistake, I fixed it, it didn't happen again because I took these steps"? That's awesome ownership. That's really what I want to see. If they say, "Oh, I made the mistake, but it was this other person's fault," or it recurred, it happened twice, then it's not even a gray area; that would be, I think, a poor response, because it would show that the person doesn't have, under their own power, the ability to learn from their mistakes and to improve. It's hard to fake the answer to that question. So that's coachability. Another way of framing it is: tell me about a time when you got feedback from your manager that you agreed with, and then what did you do? If they say, "Hmm, I've never gotten feedback that I agree with"... and you'd be surprised, people say that.
So I think having that, and then an equally important question: tell me about a time that you got feedback that you didn't agree with. It's okay not to agree with it; the key is why. Is it because you're blaming someone else, or is it because you factually, logically didn't agree with the feedback, you worked through it, and you were able to convince your manager that they were wrong? I'd love to hear an example of that. I've never heard one, but you know what I mean. So that's coachability; I usually ask just one of those questions, and it's usually enough. Then for motivation, what I usually do is talk through a project example. I don't ask the person to talk about a project that had a huge company impact; it's not a leading question. It's: "Hey, what was one of your favorite projects, or the project you are most proud of?" And then there's the choice the person makes of which project to talk about. Was it the one that got them promoted? Was it one they were really interested in as a side project, like a side hustle, that had nothing to do with their job? Or was it the one that had a huge company impact, or something that helped their team or their cross-functional partners? With those last ones, the fact that they choose it, and that they explain that that was the thing that got them most energized: it's them volunteering whether their priority is the company or themselves.
TIM: Right. So basically you're getting them to talk about their past decisions and past behavior, which then reveals the answers you want through the way they convey it and what they did in the past. And you'd take that as an indicator of what they'd be interested in in the future as well, if we assume that the best predictor of the future is what's happened in the past. Is that a fair comment?
CHRIS: It's one piece of the puzzle, yeah. But yeah, that is the approach.
TIM: The way you were describing that intake process, where you dig in and help the new employee think through their motivations and what might be interesting to them, makes me think that's an exercise candidates should go through themselves before they've even started the job search, because it could help them refine what exactly they want and what they look for. If you were looking for a new role right now, how would you think about that process? Would you step back and ask yourself the same questions that you would as a hiring manager, in terms of what you actually want? Because this is interesting: if I think back to my own job search experience, I don't think I had that almost meditative approach. I was just going bang, bang, bang, without really meditating on it.
CHRIS: I think I would, yeah. I hadn't fully made that connection, but I think that's a good point. A lot of the example questions I gave were focused around projects; they're a little more detailed. But I would abstract it to the job level. If you haven't had that many jobs (I've had five, maybe, including this one), it's: hey, which one was my favorite? Why was that? Which was my least favorite, and why? Our brains are great at pattern matching, so thinking about that, and then trying to distill it into a very small number of things you want to get out of the new role, is something I've seen be helpful in the past. At a previous job, when I was switching roles, I knew I wanted to build a team from scratch, I knew I wanted to sit close to the profit center, and (this is a little more detailed) I knew I wanted a team where data engineering and analytics sat on the same team. Those were all meditations I did on things I was unhappy with in my then-current job before I switched. That was helpful because they were very clear: you either have it or you don't; they're not fuzzy. And yeah, it led me to get not the job I have today, but the one I had previously. So it can be helpful.
TIM: And I think it's probably just part of overall interview prep. If I think back to myself as a more junior candidate, and other junior candidates, they would think of interview prep almost as "I'm going to memorize answers to these common types of questions," which I think is off by a few degrees. It's more like: no, you need to actually have thought about these themes and why they matter to you, because then you'll be able to answer coherently. With your example questions before, about motivation and coachability, I could see how a badly prepared candidate who hadn't really thought about the process and rocked up in one of your interviews would absolutely drown in those two questions. But if they'd sat down for a few hours and thought about it, they probably could have meditated on it and come up with examples that were fresh in their mind, as opposed to trying, on the spot, to pluck out something from five years ago. So a little bit of thoughtful preparation must go a long way, which now, with something like an LLM, is a lot easier. You can have a conversation with it; you could mock up an interview very easily, almost for free, using ChatGPT. You could bounce ideas back and forth, and that could be, I think, a really helpful process to go through.
CHRIS: Yeah, I agree. Definitely.
TIM: Which I shall go through if I ever apply for another job one day. I guess one thing we like to talk about on the show, as you would expect, given it's called Objective Hiring, is how to make hiring a little bit fairer, a little bit more objective. I'll tell you my view off the bat: I feel like often the best candidate might not be the one who ultimately gets hired. I think there are a lot of layers of subjectivity in the way hiring is done. How do you think about objectivity? How do you think about trying to make the process a little bit fairer?
CHRIS: Yeah, in general I agree with you that in hiring there's a high risk of not doing it fairly, and that also has a negative impact on you as the manager: it means not getting the best candidate in the seat, like you said. In terms of how to make it less subjective and more objective, the thing I've seen be probably the most powerful, the thing I tell people, is to treat it like an A/B test. What I mean by that is: define the metrics or the evaluation criteria upfront, before you see any candidates, and then define what good looks like for each metric or evaluation criterion upfront, so that you're not going off the vibe in the pit of your stomach, but rather an agreed-upon set of rules. That's often why I try to have relatively few questions in the interviews, frankly, and have them be exactly the same between candidates. That part is pretty straightforward; most people do that these days. But it's still important to note, and then to be very clear about what a good response is. The heuristic I would give is this: you should have not only the questions, but also a written description of a good, a great, and a poor response. Someone else should be able to run the interview, or at least listen to the recording, and arrive at the same evaluation as you do. There shouldn't be that much subjectivity in it. The person does need to know about data, so it should be someone on your team, or maybe your boss, if your boss knows about data. But there shouldn't be anything special about your evaluation that's specific to you, other than that you wrote the questions.
I do think it's hard to write questions for interviews, but I don't think it should be hard to evaluate interviews. Something else I do to make the evaluation a little more objective: I've found, especially when you have a panel of interviewers, that it can be too abstract to just ask yes or no, did the person pass the interview. So, of course, you can give them specific things to evaluate, but I also like to give each interview panel member one level lower of heuristics to answer, to get their point of view. Sometimes people say, "Well, they kind of passed it, but..." and usually that's a no, right? A soft yes basically means no. I'm definitely in the camp of: if it's not a heck yes, then it's a heck no. So I try to give some heuristics to make it a little more tangible. I typically have four questions. The first one is: would you be excited to work with this person? It's either yes or no; that's really straightforward. Often the people who are soft yeses will get a no on that question, and we don't want to hire those people. The second is: do you think they would be a good cross-functional partner (assuming the role is cross-functional; if it isn't, you can remove that one)? The third is: would this person create meaningful impact for the company? And the fourth is: would this person raise the average level of talent at the company? That one, I think, is probably the hardest. So yeah, those are some of the things I think about. In general: call your shots before you make them, similar to an A/B test.
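Chris's "call your shots before you make them" approach can be sketched as a pre-registered rubric, where the criteria and the pass bar are fixed before any candidate is seen. The criteria names and the two example candidates below are invented for illustration:

```python
from dataclasses import dataclass

# Criteria and what "good" looks like are fixed BEFORE any candidate is
# seen, like pre-registering the metrics of an A/B test.
RUBRIC = {
    "motivation": "Prioritizes business/team impact over personal gain",
    "coachability": "Seeks feedback; demonstrated history of acting on it",
    "excited_to_work_with": "Panel member answers an unambiguous yes",
}

@dataclass
class Evaluation:
    candidate: str
    scores: dict  # criterion -> True (heck yes) or False (anything softer)

    def decision(self) -> str:
        # "If it's not a heck yes, it's a heck no": any soft yes fails,
        # and criteria can't be added or dropped after the interview.
        return "hire" if all(self.scores.get(c, False) for c in RUBRIC) else "no hire"

strong = Evaluation("A", {"motivation": True, "coachability": True,
                          "excited_to_work_with": True})
soft = Evaluation("B", {"motivation": True, "coachability": False,
                        "excited_to_work_with": True})
print(strong.decision())  # hire
print(soft.decision())    # no hire
```

Encoding the rubric this way also guards against the moving-the-goalposts problem: a criterion that wasn't in `RUBRIC` upfront simply can't affect the decision.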
TIM: Yeah, that's so important. Otherwise you get what I'd call the moving-the-goalposts problem; I don't know if that saying translates across countries. You inevitably, I notice, get to a step in the hiring process where someone provides feedback that seems absolutely fine and reasonable but doesn't really connect with the actual criteria of the job. I'll give you an example. You might get a candidate to, I don't know, a third interview or something, and someone comes in and goes, "Ah, you know, I'm just not sure they're a good team fit. I'm not sure they're going to partner with us well enough or blend well enough with our team." Something like that kind of seems reasonable, but it's a bit vague and doesn't necessarily connect to the criteria you'd established previously. Or you could have something else, where someone says, "Oh, well, I didn't think their Python skills were good enough; I gave them a quick Python quiz in the interview." But that wasn't part of the criteria to begin with, so why are you selecting for it? It may be reasonable now, but if you'd decided from the get-go that it wasn't a criterion, it's just so easy to deviate at any point in the process. So yes, setting all that stuff up from the get-go is essential. I wonder if sometimes it's not done because it's just a lot of effort to really think through exactly what you need and what you don't; maybe it's the time and effort involved. Maybe less experienced hiring managers also don't know the details of the role exactly, so there's a little bit of guesswork for them, perhaps. I don't know. Are there any particular reasons you see this go astray sometimes?
CHRIS: Yeah, I think it is more challenging than people give it credit for to be really rigorous about decision making, whether that's in hiring or in making business decisions, which I think is a hard part of working in data with stakeholders, as people often want to make gut decisions. It's really the same thing, whether it's about hiring or about business, launching a marketing campaign or sales or whatever it is. That's why I try to equate A/B testing and hiring, because I think it's all part of the same decision making. So I think that can be challenging; it takes a lot of mental energy. I do think it comes back a little bit to philosophy: do you want to make decisions based on logic and facts, or do you want to make decisions based on gut? And both can be valid. There are very senior executives at Fortune 500 companies who are paid millions of dollars a year to make decisions based on gut, and if you're right, go for it. I love that. It's just not where I want to work as a data person, because I don't think I can add that much value there. And so when you're joining a company, that's one of the biggest things, as you were saying in my own interview prep, that I test for with the executive team: do you actually make decisions based on facts and logic, or, well, it's on some spectrum, right? But what share of the decisions are made just based on gut, and in what scenarios can gut overrule logic, and the other way around? I think it's important to establish what type of culture you're working in along those lines. Both can be valid, but my personal preference is to make decisions based on data.
TIM: I also feel like when it comes to hiring, again, you're thinking of this spectrum from data-driven to gut feel, or maybe of the proportion of decisions in a hiring process that are data-driven versus gut feel; often there's a combination. I often feel that if we just unpacked the gut-feel component and really drilled into it, we could probably come up with some reasonably objective criteria out of it, because a lot of the time it's a vague sense of, I don't think they're going to be a good team fit. Okay, why not? Why not? Why not? If you dig down deep enough, you probably get, oh, I'm looking for someone who's detail-oriented; I noticed they made this sloppy mistake; I remember this analyst 10 years ago who I hired who made all these sloppy mistakes and my life was hell. There's some kind of iterative process where, if you just dug enough, you could probably make it a 95 percent data-driven process, with maybe a little bit left over for the ultimate gut call. But I feel like you could almost science-ify the gut-feel bit with a bit of thought.
CHRIS: Yeah, I think that's fair. That makes a lot of sense.
TIM: Are there any bits of the hiring process that you would put in the bin that you think this has got to go, like, this is just worthless. We shouldn't be doing this anymore. It's of no real use.
CHRIS: I guess I would say long take-home assessments. I think they're unreasonably onerous on candidates, and I think there's a quicker way to get the same signal. I'm probably okay with brief take-home assignments, to give the person a chance to think on it rather than in a live setting, because I think there's a whole set of accessibility concerns with asking someone to do a case study on the spot. I also think case studies can be valuable. But when you hear about companies doing take-home assignments that take more than four or six hours to complete, that seems a bit unreasonable, and it's just kind of asking the person to work for free, which is not something I agree with. Yeah, that's probably the one.
TIM: What about reference checks? I'm always interested in this one, because I had personally viewed them, at their worst, as a bit of a tick-and-flick exercise that was adding no value. But then I've spoken to a few people recently about how they do them, and they gave me a slightly different perspective. Do you value checking references?
CHRIS: I value back-channeling for candidates, to understand other people's experience working with them over long periods of time. I wouldn't call that a reference check per se. The way I'd distinguish them: back-channeling is what happens before you make an offer, and a reference check is what happens after the person accepts. Back-channeling, I think, is a good way to learn about candidates, especially senior people and executives, where there's just so much nuance. The more senior people get, the more they tend to have deep spikes, which are challenging to understand in a short time period like an interview. So I think back-channeling with people who've worked with someone for a long time is valuable. We do that quite often, but it's not a requirement; if you have a mutual connection with someone, then yeah, of course you would get that information. And then reference checks: I have found reference checks to be valuable after the person accepts, obviously not to evaluate their candidacy, but to get input from people they've worked with in the past about what has been helpful and what we should look out for, given that we've already decided to work together. Not evaluative; it's more like, how can you accelerate the getting-to-know-you process?
TIM: One thing we've touched on just slightly in this conversation is AI, those two little letters that are transforming the world in every way we can possibly think of. What's hiring going to look like in a year or two? Are we on the precipice of some kind of complete transformation, or is it going to be more of an iteration on how it's done at the moment? What are your thoughts?
CHRIS: You know, I don't have a good answer for that, to be honest. My initial reaction is that it's more incremental. It's probably a four-letter word these days, but I believe in a human being reviewing resumes. And I do believe in face-to-face, synchronous, human interviews. I don't believe in one-directional interviews or interviews evaluated by AI or anything like that. I also believe in getting a diverse set of points of view about a candidate by having hiring panels, all of which is human. So I guess I do have an answer, which is that I still trust human-to-human hiring for the foreseeable future.
TIM: And why is that? What do you think would be lacking in some kind of AI approach? What is the main value-add that humans bring to the table at the moment, do you think?
CHRIS: I think when you're hiring someone, you're trying to form an opinion about the impact the person would have in the future at the company. It's more about that than it is about evaluating a set of checkmarks, which I do think an AI system probably could do: hey, they have this skill, they passed this coding test. It's kind of similar to what I mentioned before about evaluating for motivation and for coachability; those things, I think, are pretty nuanced to understand a candidate's competency for. And so I would not trust an LLM to fully evaluate candidates and make a decision on them today.
TIM: One thing that I personally imagine would happen is that there would still be a final human decision, but from the start to the middle to nearly the end of the process, AI would be progressively more involved, just because I think there'd be so much upside to doing that around speed and cost savings, and, in theory, a way to reduce a lot of existing biases. Because of all the myriad reasons we could interview a candidate and go, you know what, I don't like them, down to what side of the bed we got out of that day, whether the coffee was spilt on us, whether our partner yelled at us; that's going to influence our decision so much, in a way that it won't affect the AI. A well-programmed AI could eradicate some of these long-standing issues, but yeah, it's going to miss that nuance. I kind of think of it as, what's the aggregate or net effect? There seem to me to be some obvious benefits and some obvious downsides, but I'm not sure where that nets out. I certainly would have thought, though, that AI will start to chip away from the start to the middle of the process onwards.
CHRIS: I can, I can see that. Yeah, I could definitely see that.
TIM: If there was one bit of the process that you'd love AI to take care of, so you don't have to do it anymore, or where you think it would add value above the way a human currently does it, is there any step that springs to mind?
CHRIS: I suppose it's probably, like you're saying, the earlier parts of the process, the initial screen, things like that. Even just taking care of the recruiter screen would mean you don't have to schedule it, other than for the candidate to schedule it, and things like that. Things that are lower risk, because you're further away from the decision. I can imagine that making sense. Yeah.
TIM: I wonder if candidates will be pumping their own AI into these AI interviewers soon. It'll be some weird AI-versus-AI standoff in the early stages of hiring. I don't know how that plays out, though. What's the answer there? I'm not sure.
CHRIS: Right. Definitely.
TIM: We're in a weird spot actually, aren't we? I hear from a lot of people that the current state is, you put up a job ad and you very easily get a thousand applications or more. It seems as though a lot of them are written using ChatGPT; the CVs look almost perfect against the job ad. But then candidates are looking at this market, being inundated with all these competitors, thinking, oh my god, how am I going to get a job? What's going to be the breakthrough to this problem? Is it just going to be that companies implement AI, candidates have AI, it all cancels each other out, and we're back to where we were a few years ago? Or where are we going to get to in the next year, I wonder?
CHRIS: I mean, I think you're right in your description of the problem. The way I would frame it is: a very high noise-to-signal ratio. And the thing that makes an application or a resume stand out for me is not so much the formatting or the language or matching keywords in the job description; I typically look for a demonstrated history of doing the things that will make the role successful, like project examples. I would much rather have 50 applicants to a job where half of them have great project examples that are relevant to the role than a thousand applicants where 800 of them are kind of boilerplate and don't really contain any meat. So I would hope to see the burden reduced on applicants, where they're applying to roles that they're more closely suited to, but fewer roles, which hopefully is less work. And then hopefully that would have the same effect on employers and hiring managers, where the average quality of the applicant pool increases, not because the quality of the people is increasing, but because a higher share of them are well suited for the role.
TIM: Yeah, I wonder about that. It's almost like a matching problem, isn't it? The candidate has a set of attributes, and the job and company have a set of attributes. I wonder if part of the unlock is just more and better data, maybe on both sides. I was thinking recently that for the candidate, we really only have their CV and their application, which is not much to go off. For the company, you have, let's say, the job description. Maybe there just need to be some kind of new data sets that would make that matching more accurate. I'm not quite sure what they would look like, though.
CHRIS: I think LinkedIn is well positioned to do that, because they often have information about both the job and the candidate without any application. If you have some bullets on your LinkedIn, I think that goes a long way.
TIM: Yep, time will tell. We could speculate all day, and I do love to speculate. Chris, if you could ask our next guest one question, what question would that be?
CHRIS: Hmm, that's a good question. I hadn't thought about that one before this. Something that I think about a lot is how to move myself and the team closer to the business. So, and maybe this is tactical, but: what advice do you have for other data leaders on moving their work closer and closer to the business? I think that kind of really deep partnership is one of the main precursors of creating really great impact at companies, and it's what I try to do. So that's something I'm always curious to hear about: what has worked and what hasn't worked for other data leaders in forging those really deep relationships and partnerships. Yeah.
TIM: Wonderful. Well, we will put that question to our next guest, whoever that may be. In the meantime, Chris, it's been a great conversation today. I've really enjoyed it. Thank you so much for coming on and sharing all your insights and experience with our audience.
CHRIS: Yeah. Thanks for having me, man. It's been great. Love it.