Alooba Objective Hiring

By Alooba

Episode 104
Kai Chen on Objective Hiring, The Future of Work & AI: Finding Top Talent in a Changing World

Published on 2/21/2025
Host
Tim Freestone
Guest
Kai Chen

In this episode of the Alooba Objective Hiring podcast, Tim interviews Kai Chen, Global Head of Analytics, Insights, & Transformation

In this episode of Alooba’s Objective Hiring Show, Tim interviews Kai Chen, a Canadian management consultant turned analytics leader in Frankfurt, Germany, who shares insights on the evolving landscape of hiring in analytics. Kai discusses the balance between intuition and data-driven decision-making, the importance of structured recruitment processes, and the evolving roles of analysts in light of AI advancements. He emphasizes the necessity of critical thinking and adaptability in candidates, particularly in a rapidly changing technological environment. The conversation also delves into the future of analytics roles, the impact of AI, and the essential skills for upcoming generations.

Transcript

TIM: We are live on the Objective Hiring Show. Today we're joined by Kai. Kai, welcome to the show.

KAI: Thanks. Thanks for having me.

TIM: It's our pleasure. And where I'd love to start is just by learning a little bit about yourself. Who is Kai? Who are we listening to today?

KAI: Yeah, it's a great place to start because it's quite the complex story. I'm Canadian to begin with, but I find myself calling from Frankfurt, Germany today. It's a classic story, right? You meet a girl, and you have to move overseas. So that's why I find myself in Europe. But my career has been in management consulting and investing, starting in North America with Deloitte, the big D everyone understands, doing M&A advisory and M&A strategy, before moving to Europe with Simon-Kucher & Partners doing commercial transformations. And that's me from a professional sense. Most recently I moved to Kramp, which is the largest agricultural wholesaler and e-commerce business in Europe. So yeah, that's me professionally: I manage the analytics department. Personally, I'm married, and I have a seven-month-old daughter, which has taken up all of my free time.

TIM: And a few sleepless nights, I should imagine.

KAI: Yes, more and more now that she's teething. But it is what it is. Yeah. And I'm Canadian, to begin with. I don't know if I said that, but it's an important part of my identity.

TIM: Yeah. And I guess a connoisseur could probably pick it up just from the accent. I have to say I'm not particularly familiar with North American accents and the differences, but I would have guessed you were Canadian.

KAI: I think the distinction is a little bit less clear than, for instance, an Aussie accent, which is really clearly distinct. I can understand it'd be a little bit more difficult for non-Canadians.

TIM: Yeah. As long as you never confuse me with a Kiwi, then I'm happy.

KAI: No promises. That's harder to tell.

TIM: And you mentioned, yeah, starting and spending the majority of your career so far in consulting and M&A. And I'd love to start there, actually, because, you know, that was the start of my career also. I remember, certainly at the graduate and intern level, it was a notoriously challenging candidate experience, often driven by the fact that it was ludicrously competitive. It was what everyone wanted to get into, so each company had their pick of the talent, and you're competing against thousands of other people for a small number of jobs. So the balance of power is with the companies: a lot of hoops, a lot of steps, a lot of interviews, sometimes quite difficult ones. In consulting, you typically would have the kind of case challenge thing; I'm not sure if that still happens now. But I'd love to hear more about your experience in those early days, and whether any of that has shaped the way you think about recruitment now that you're in analytics.

KAI: Yeah, quite a bit, actually. If I were to think back, and this is way back, right? Back in business school times. I would presume that Australia is similar to Canada in that you have one really large business market where a lot of the activity happens. I would presume Sydney would be that for Australia; in Canada, that's Toronto.

TIM: Yep.

KAI: And then there's Montreal and Vancouver and Calgary, but they're distant second, third, and fourth places compared to Toronto, which has the bulk of the business activity. So all the major firms, whether it be Deloitte, PwC, or another big consulting advisory firm, are headquartered in Toronto. And it's very tense and very competitive, right? Because you have all the business schools in the area: maybe 2,000 or 3,000 business students a year competing for maybe 50 consulting positions a year across all the really top branded firms. So it was incredibly competitive. From my class of 70, I think I was one of two picked up by Deloitte in my year, and that's not to speak for all the other schools and their candidates. But what I did like about the consulting recruiting process is that it's quite standardized, right? And for better or for worse, I've drunk the Kool-Aid, and I like the quality of candidate that you're getting at the end of this process. But it does filter out to a type, right? You want your sharp thinkers, people who can really make decisions on their feet, who can structure the work and the problem-solving and do a case interview. And what you get is, well, some might call it bland, but you do get a certain quality of candidate at the end of this process. And if I were to think about what I look for now in analytics, I don't think that professionalization is there yet, because analytics is such a broad field, I would argue much broader than what it means to be a management consultant. Analytics can be just about anything, to be honest, and until the industry has figured out what that more precise definition is, I think it's up to us as business leaders to come up with the profile that we want and establish a really structured (not rigid; structured is probably the better word) process to get the candidate that we want.
So, for instance, we do a case interview as part of any candidate interview in analytics at Kramp to make sure that they've got the business and problem-solving sense. And then, of course, there's a technical component as well. We want to understand what the candidate is equipped with from a tool perspective, maybe even a programming language perspective. But actually, for a junior candidate, I care most about the critical thinking, because at this point in their life, early 20s, mid 20s, I can't really teach them to be a better critical thinker. That's why we have who we have. I can teach them a tool; I can teach them a new programming language. That's possible. But we need both the structured problem-solving that a case interview will tell us and inform us about, as well as the technical knowledge tests that we do thereafter. So that's perhaps one takeaway I still have from the consulting world.

TIM: Okay. So you've basically set up a clear process that's structured and consistent, which then presumably gives you a reasonably consistent outcome and result. You'll always have outliers; you'll never have 100 percent accuracy in picking candidates, but it's reasonably predictable. And is it that you get almost a certain type? Is that what was happening in consulting, that there's a certain character that would normally be hired?

KAI: Yeah, indeed. And actually another benefit, now that you make me think about this, is the professionalism of a case interview, plus the fact that a lot of my team are ex-consultants, and I myself am one. That's also a good selling point in recruiting, because in recruiting you're both buying and selling, right? I would love a candidate, but they also have to love me. And the professionalism that you introduce in a consulting-style interview, with the case and then the technical component, also lends a degree of credibility, right? To whom you want to work for, and to what your prospective employer actually values. It's night and day, by the way, the success of my candidates through our recruiting pipeline versus an IT candidate applying for an IT job at Kramp. I have the statistics; the HR folks were telling me about this. They were wondering why our pipeline was so much more successful. And I think part of it is because this feels like a more, I don't know, business-oriented, in a sense more professional engagement about a job, versus a traditional technology role.

TIM: And the professionalism comes from the fact that a candidate could see that you've thought through this in a lot of detail in the way the case is set up. I assume it's somehow related to Kramp, customized in some way, and the professionalism is in the level of detail of your thoughts and how you're thinking about the problem. Is that part of it?

KAI: That, and I'm glad you're pushing me on the nuance, because I'm trying to think through it myself. It's also the fact that I ask each analytics professional to think about their impact on the business, and a case makes that very clear. If you think about it: how do you spend your time in relation to how the business grows and how you add value to the business? That's, in a way, more responsibility than a pure technical interview that you might get for a more technology-oriented role, right? And that's the difference: I want you to understand, through the example of a case, that you are a participating member of this business, and consequently you feel that you're involved and you have agency over where we're going and the impact that you're having.

TIM: I imagine for, let's say, a traditional analytics candidate, that experience might be atypical. And I could easily imagine an almost stereotypical, super technical data scientist floundering in that kind of case study, a business-focused case study. Is there a common pattern emerging of where candidates do well or don't do well in that case study step?

KAI: Yeah, that's a great question. I have a very long corporate title, but I'm the Global Head of Analytics, Insights, and Transformation, and my department is set up in those same slices. You've got the really technical folks on the analytics side; they love working with raw data, converting raw data into something useful. On the insights side, you've got the people who take those basically processed insights and enrich them with customer, market, and external sources, and whatnot. So these are the people who really turn it into something useful. And then ultimately you've got the transformation people and the business analysts, who are more like traditional management consulting and change management: you take the insight, and you go fix a part of the business with it. And so actually, depending on how the candidate presents him or herself in the interview process, I already have a sense of whether they're more truly technical or more of a people manager, change manager, with some dabbling in analytics. And the key is, all of the work that we do across the entire spectrum of the department has to be data-driven. I don't want intuition. Intuition is just one data point; let's try and prove what's actually behind it. And the case interview is a great divider in that, okay, immediately you sense who has a natural acumen for business logic in a business context, and who is more resting on the laurels of a technical competency. And that's great; the beauty of the department is I can filter them to where I think they might sit best. But it is a great divide, right? You do have folks coming from business school who are dabbling in analytics, who are clearly more change management and project management types. You also have folks coming from more computer science backgrounds who are indeed more analytical, but are also curious about business.
And in the future, there's no reason they can't be cross-trained and cross-pollinated to explore more of the business.

TIM: So they sound like reasonably fresh, junior, grad-esque candidates that you're currently hiring into roles. And is that why the kind of centralized pipeline suits them well, because they're almost a blank sheet in a way?

KAI: Yeah, indeed. And you'll know better than most that the industry is moving so fast. For instance, I learned programming in C; how relevant is that nowadays? Very irrelevant, right? And if I wait five more years, and this is a bold statement, perhaps programming itself as a skill set might no longer be relevant. Actually, what we need in the future are sort of HR managers for AI agents, right? And so am I going to get that from someone who's been in their career for 20 years, performing well, but who has never been incentivized to learn and pick up the latest and greatest? So actually, in our field of analytics, I think we are more incentivized, to use that word again, toward the younger talent, because the younger talent really is fresh with the latest and greatest. And I don't believe for a second that I won't be redundant at some point, because if I don't continue my own education, I really will be irrelevant at some point. So I think there is a good sweet spot in this particular profession of analytics to really nurture and pick up young talent. But there's a great war for young talent, right? The best and brightest always shine, and everyone wants to get them. Agriculture, to be blunt, is not a sexy industry, right? Coming from management consulting, and now to agriculture: you really have to do some stuff with the compensation and the benefits to make that attractive for a young person, and I fully get that it's a challenge for a lot of people out there in recruiting. All this to say, I do have a preference for young talent; I would much prefer young, energetic, full of ideas, yes, able to be structured, and malleable in some sense. That's, I think, where the industry requires us to be reactive. But we do need a few very good, experienced folks to really be the people managers, to lend the guiding hand along the way.
And those are also very key hires that I made much earlier on in the process, to make sure that we had set up the skeleton of the team really well. Now we're trying to fill that in with some young energy.

TIM: You'd mentioned earlier on that you had had more success than some other teams in the company, and you mentioned the HR team and shared some data points with me. How are you measuring that? Is that based on the conversion rate of the interview or longevity in the role? Like, how do you think about the success?

KAI: Yeah, conversion rates, very simply. So think of it like a funnel, right? How many people viewed the vacancy, how many applied, how many interviewed, second round, ultimately converted. Fun fact: for the three business analyst roles that we opened up last summer, we had something like 400 applicants. So we had a less than 1 percent acceptance rate. And it was very funny; HR was comparing this to the acceptance rate of Harvard that year, which was something like 1.2%. So it was harder to be a successful business analyst candidate with us than to get into Harvard, because the pipeline for whatever reason was just very popular with the market in that period. So it's about, I think, presenting the role correctly to the correct audience. And as soon as you get a lead, also recognizing there's a selling experience here, right? Making sure that first impression is a good one, that we talk not just about the role requirements, but also about what the culture of the company is. Because, to use the North American cliché, that'll be the new work family for the next few years. As much as it is a buying experience, it's very much a selling experience, and I think it's very important that hiring managers don't forget that.

TIM: Is it fair to say that the way you've set the process up, well, I'm sure it is fair to say this, is driven by your own experience? This is where we started the conversation, in consulting. Have you ever had feedback from candidates or maybe peers who feel like you've almost over-indexed, in a sense, on your consulting experience? That it's now slightly too consulting-y, and you might end up with slightly less technical candidates, or some kind of outcome like that?

KAI: That's a good question. And I think the friction point here, where it is really visible, is with the data science candidates. These are the folks who, by and large, have computer science backgrounds and are really technical, much smarter than myself on technical topics, I readily admit. And here, whereas in other roles I might require 50 percent business acumen and 50 percent technical skills, it's going to be more like 95 percent technical acumen and 5 percent business context. You always need that small contextual percentage, because otherwise you don't know what you're working on or what space you're working in, and then the validity of your solution suffers. But I do recognize that for a really specialized role like a data scientist, they might tackle a problem entirely differently from how my business-logical mind wants to structure it. This is the new reality, right? The model itself might be asking questions that I'd never imagine or conceptualize. I find that really fascinating. It also means that I have to treat these candidates very differently. I have to approach this with a different mindset, and I have to present the role from a different viewpoint, because everyone has different motivations. For them, the sheer technical challenge of the data science problem matters more than the business impact. It's funny to say: we're hiring them for the business impact, but they themselves are motivated by how complex the technical challenge is. For instance, network optimization across a European distribution concept, right? Quite the challenge, technically. And if they enjoy that, by all means, come to us. But then I understand that they're not a consultant, and I have to treat them as such.

TIM: Yeah, I think that's a really important shout, because I fully advocate for having a structured hiring process: trying to think really carefully about what you're looking for, measuring as much as possible, and designing it for success in the first place to get this kind of predictable outcome. But then I think back to some of the, let's say, more interesting characters I've worked with over the years who, in any kind of normal hiring process, would not have gotten any jobs. They were far out on the end of a spectrum of ease to deal with and communication style, if I can put it that way, and they were able to get hired because they dealt directly with a technical co-founder who could speak their language. He didn't care that they were going to rub everyone up the wrong way, because the value they were going to bring to the business was so astronomical that it was, who cares? Is there something to be said for almost carving out a place for the outliers in the hiring process? Or maybe it's a case of: if you're a bigger, more established business, do you really need the outliers? Is that more of a benefit if you're a startup trying to do a moonshot kind of project?

KAI: Yeah, there are a lot of ways to take your question, but I'll interpret it this way. There absolutely is space for outliers, right? And I'm a full believer in this; neurodiversity is what we're talking about, but diversity generally. So long as the friction of diverse stakeholders can be managed, I think diverse teams always deliver more realistic outcomes. And if it means that I need to have a broader dynamic range of people and relationship management as a leader, so that I can manage and have a successful diverse team, I think it's well worth it, even if my job is a tiny bit more complex. And again, I keep thinking about the data scientists, but these folks are a different breed, right? They're in the office in T-shirts and sweats and sneakers, and then I've got ex-management consultants wearing white dress shirts all the time and sometimes a suit jacket. But the fact that both sit within the same department, and actually sometimes work toward the same projects and the same objectives, makes the ultimate solution, I'm a firm believer, much, much more valuable. The management consultant alone is going to produce a lot of PowerPoint slides. The data scientist alone will produce something that no one understands how it works, or whether we should believe the outcome or not. But conceptually, pair the PowerPoint wizard with the data science wizard, and finally you have something that business leadership believes, understands, and finds impactful, as a very dumbed-down example. You do need space for those outliers, as you call them. Absolutely. And any team that tries to create a homogeneous environment in analytics, I would challenge the long-term sustainability of that particular decision. You might get a team that moves faster, but without sufficient context.
Or you might have a team that loves the context but has no hands-on-keyboard power to actually come up with a sophisticated solution. So you need a balance of both, I think, particularly in analytics.

TIM: You'd mentioned earlier on almost favoring, let's say, fresher candidates with maybe less experience, but with, dare I say it, a digital-native, AI-native kind of skill set and mindset. Is it just easier to learn new things when you're younger? Does it get harder as you get older? What are your thoughts around that?

KAI: So there's something to be said about neuroplasticity, right? Indeed, I think physiologically it's easier to learn things when you're younger. However, the bigger part of it, I think, is context: if you're coming out of school, your primary job was to learn, experiment, try things, and fail in a safe space. But if you're mid-career, 20-something years into your job, you're actually incentivized not to fail: follow the rules, operate within the guidance, meet the requirements. And your requirements are given to you. Neuroplasticity is the physiology of learning, and it is different when you're older, but I think the environmental context is more important. For 20 years, I have been incentivized to follow the mold. Now, the young folks coming into their careers are incentivized to experiment, and experimenting in particular is going to be very important when it comes to analytics and all the AI developments that we have nowadays. So I think it's a combination of both that creates the perfect ingredients for the young folks of this generation to really be the rising stars in the field of analytics.

TIM: Is there also something to be said for, and I'm trying to think back to when I was a fresh grad just starting jobs, the fact that I was not leveraging any experience in trying to solve a problem, because I had no experience? Almost everything was first principles: intuition, I'll Google it, I'll ask someone. I had no preconceptions, no prejudices, no biases about how to solve things. Okay, I'll just build this financial model; I don't know how, I'll Google it. Is there some liberation in not knowing a bunch of stuff?

KAI: Yeah, liberation is a perfect way to say it, and perhaps not overthinking, right? Absolutely, that's true. I will say, however, and this is quite anecdotal, my team knows I am terrible at going offline. Even when I'm on holidays, Christmas holidays, once in a while they'll see an email from me. And the one caveat I'll put in front of this liberation from overthinking: if I know that I've been down this path of overthinking before, or if I know that there's a shortcut because I've seen it before in another context, I will suggest that perhaps you can already start at step C, as opposed to figuring out A and B yourself. Sometimes there is learning in letting young professionals in particular go through their own trials and tribulations of A and B, however difficult. But then, as a people leader, you have to make that decision: is their learning at that particular point worth the expense of their time? Or should you just save their time and tell them what the shortcut is? You have to balance it; a healthy combination of both is necessary.

TIM: Yeah, you remember the lessons you've learned through your own trial and error much more than you do from a well-meaning bit of advice from anyone. No matter how good a learner you are or how receptive you are to feedback, there's nothing like getting punched in the face yourself to realize, Oh, that's not going to work.

KAI: For sure. But I'm learning with my seven-month-old now, right? Like, how many times am I going to let her head smack the floor before I go, okay, maybe we should. Maybe we should move.

TIM: Yeah, it's an extra level of stress beyond helping your team navigate analytical challenges. What I'm also struck by is taking this, let's say, structured approach to hiring and setting up almost like a hiring engine to give us a pretty good chance of selecting the best candidate. I always come back to and think about sports. The amount of data you would have in football or baseball or whatever is exponentially more than what we'd ever have in hiring an analyst, and the risk, the investment, is like 10,000 times higher. Yet I've heard so many anecdotes of managers in football or other sports going, yeah, we spent a hundred million pounds; by the first training session, we knew they were done. An hour in, and every player is like, nah, they're crap. So even at that level, they can still get it wrong. I'm always humbled by that: no matter how bad my hiring decisions are, at least I haven't spent a hundred million pounds.

KAI: Yeah, for sure. Not yet.

TIM: Not yet. One day, maybe. Actually, I should ask you a related thing. So you do investing as well. I'm interested: when you're looking at entrepreneurs and their pitches, and they're outside of the industry with almost no experience in a domain, do you ever view that in a similar way to a fresh candidate who almost doesn't have the baggage of working in that domain or industry?

KAI: That's a pretty interesting question. There are indeed parallels between what you look for in a founding team of a startup versus a candidate for a corporate job. I'll say it depends on the stage of the startup, right? Through my investments, we do very early-stage stuff, pre-seed-type startups, and there, years of experience in the particular industry the company chooses to operate in, across all founding members, is not necessarily a prerequisite. I actually agree with you that diversity of experiences at the early stage is perhaps more of a critical factor in the long-term success of a startup than 30 years of experience in one particular field. I will say, however, that in a founding team you really want a builder, typically a technical person; a marketer, a communicator; and a good people person, a leader, the one who will pitch in front of you. Very rarely do you get all three in one; most of the time you might get two of those skills in one person, but I care for a well-rounded founding team. You do look for deep experience, however, on the board of advisors, which is very in vogue nowadays; all startups have a board of advisors, or something by another name. This is where I will look for really deep industry knowledge. If it's going to be a startup working in the Medicare space, for instance, or the medical devices space, I do want to see that they've got doctors or professors, or people with 30 or 40 years of experience in that particular medical niche, so I know that this team knows what they're talking about. Converting it back to the corporate world: indeed, I think for the young candidates you're looking more for the skills and the energy, and the most important skill is the ability to learn, rather than in-depth knowledge.
This takes me back to my consulting days, but we had a brilliant consultant whose background was astrophysics. She could really solve challenging commercial problems, commercial strategy problems, for companies, yet her background was astrophysics. I think she landed a little outside the box versus the typical mold, but on the teams she was on, it probably made the experience a whole lot richer for everybody involved, including the client. And there are plenty of examples like that. My undergrad was in human geography, right? It was only in my master's that I started dabbling in business. And I would like to think that the undergrad was not wasted: you think about how people relate to the land, the physical geographies around them, and you see certain tendencies. Why, stereotypically, are Italians always flexible with timing? You can grow a tomato any time of the year on the Italian peninsula, right? So it doesn't really matter; we don't need to plan for winter. Anecdotally, why do the Nordic countries prefer more socialist governments? They need to prepare for winter; you need to stock up as a community on all the grain of the year so that you can survive it. Just anecdotal stuff like that, which I hope still has some relevance in a business context. Long story short: yeah, I think there are good parallels to be drawn between startup investing and corporate recruiting, particularly around whether we're looking for skills or for depth of experience in certain fields.

TIM: You mentioned speed of learning, willingness to learn, as being this important factor. And indeed, when we were thinking about who we were hiring into our company, we spent a year and a half hiring different people with varying levels of success, or lack thereof. So I sat down and thought about, okay, what are we really doing here? Who are we trying to attract? What are our values? We meditated on it, and we settled on: the single most important thing for us was, could they learn new things quickly? Because they were inevitably going to have to do things they hadn't done before, again and again. So we just wanted to really over-index for that. And I feel like that's maybe now even more important if we're in a state of ludicrously fast technological change; if you're still sitting there writing Python from scratch, line by line, in a year, that'd be insane. You have to change your ways of doing things and ways of thinking, which is hard. How do you evaluate that in the hiring process, though? That's really what's difficult. How do you get a clue that they're willing and able? Because it's so easy to talk: I'd learn about this, I did this online course. But it doesn't really mean you've changed your mind.

KAI: Yeah, let's answer this from two perspectives: one from the candidate's perspective, one from the team's perspective. From the candidate's perspective, again, how they present themselves in a case interview, or answer your questions, gives you a hint of how sharp and agile their thinking is, right? Particularly if you force a problem set on them in a case, you really see their thinking. So that's relatively easy on the candidate's side. But how do we tell if someone can do that from a CV? Reading a CV only gets your foot in the door, and there are so many great CVs out there now. There are a lot of online tools; ChatGPT will help you write a beautiful CV, tailored per industry, tailored per geography. It'll be perfect. We see a lot of ChatGPT CVs now, German ones, Serbian ones. I think the other factor we need to think about as hiring managers is on the team side. Are you familiar with American football?

TIM: Very vaguely.

KAI: Okay. The difference from, like, FIFA soccer, European soccer, is that in American football everyone is super specialized, right? Different body types are trained for different roles. Your quarterback has to have a great throwing arm. Your linebackers need to hold the back of the line; they'd be big dudes that hold the line, essentially. There's that degree of specialism. Now, translating back to what we're discussing, we also have to think, okay, what gap in my team am I trying to fill? What role am I trying to fill? It's not that I need everybody to be quarterbacks, and it's not that I need everybody to be linebackers or running backs or wide receivers. I need to think, okay, can this person fit this particular gap in the line that I have, given their degree of neurodivergency, their degree of education or experience? And consequently, the best answer I can give you is that it depends, right? It depends on the candidate and the hole you're trying to plug. There are a lot of things to consider, and both are dynamic, right? Maybe you can shift the hole you want to plug a little bit left and right to fit the perfect candidate. But also, if learning is his or her greatest talent, maybe you can expect that candidate to fit more roles along that line than just the one gap. My analogy is failing here, but you get what I'm saying: learning really is a force multiplier in that it gives us, as the hiring manager, flexibility in the long run to deploy a person.

TIM: I wouldn't mind drilling down a bit further on the candidate evaluation bit, because I feel like there's something to be discussed there. You mentioned you felt that wasn't that difficult a challenge, figuring out how adaptable they are, how quickly they could learn, because it almost came about as part of the case process. Is it that you're changing the parameters and seeing how they adjust? How do you get to the bottom of that? And you would have understood this phenomenon when you were in consulting yourself: when a partner or someone very senior walks into a room, or a CEO from the client side walks into a room, you know that they're the CEO or the managing partner, even if they haven't said anything, right?

KAI: Why is that? What is it about their executive presence that suggests they know what's going on? It's almost ethereal, right? You feel it, but you can't touch it. And what is that magic sauce, that X factor, in an interview? I also couldn't tell you directly, but I know it when I see it. If you forced me to describe it, it would be a combination of how sharp they are, sharp meaning jumping accurately and quickly between the points they want to make, and also how comfortable they are in an uncomfortable situation. The consulting world's got the airport test, right? If I sat at a bar at an airport with this guy or girl for 15 minutes, would I still be entertained? There's a lot to be said about that, but I think there's something there: if I can't see myself working with this person, or at least being entertained for 15 minutes and, extrapolating, working with this person for many years, then I also don't think they're the correct profile to be in my department. So it's not the perfect answer you're looking for, but it's probably the best one I can give you. It's a combination of things. Ultimately, my first consulting partner told me this: we want the client to trust us with their baby, right? The quicker you can be trusted with their baby, the quicker you can have a real impact, because up until that point, all the tests, all the questions, are just assessing whether you can be trusted with the baby. Only after you get the trust do you actually get to have an impact. Applying that to our context: what does it take to get that trust that you know what you're doing? It's different in data science, different for the project managers. But there is always going to be that element of, okay, you know what you're doing, I like what I see. And I'm an individual with my own experiences.
Perhaps I see different factors, and I value different factors, than you or any other hiring manager might. But ultimately it has to pass some sort of gut check, which is: you know what? I'll trust this person with my baby, with the department, with the responsibility.

TIM: I wouldn't mind sharing how we thought about this problem last time we were trying to hire software engineers specifically. We're of the view that actions speak louder than words, and that, certainly in a not particularly good interview, it's very easy for a blagger to just say a lot of things without really backing them up. Especially those who come across as especially confident, extroverted, handsome, who happen to like the same things as us; that almost unlocks a little bias. I can think of candidates over the years who've, in the first five minutes, spoken about football and a certain football team or whatever, and suddenly I like them. They've got a big, nice smile. I could see how those who are good at unlocking that could game it quite easily. But what we came up with was: could we give our software engineers a little algorithm challenge, but in a language that we knew they definitely wouldn't already know? So we chose R, the statistical language, because no software engineer would ever know that. And we gave them this reasonably simple challenge, because we thought, what could happen? Some of them might go, "Why are you asking me to write a program in R? R is stupid; that's for statisticians. I don't want to do this," which would be a negative signal because they're not even interested in learning. Others would try to do it and just fail, which is worrying because it was a fairly simple problem, and you could just Google R, you could go on YouTube; it wasn't that hard. This is pre-ChatGPT. And yeah, we gave them this challenge under the idea that if, in a day or two, they could figure out enough about a new language to build a tiny bit of software that functioned, amazing. They're just going to repeat that again and again working for us, and that's all that we needed. So that was how we came up with, at least for software engineers,
getting them to demonstrate an ability to learn. But I don't know any way to scale that across n roles, because you can't give a marketer an R test, certainly not in the days of ChatGPT.

KAI: R, by the way, is a very valuable tool to learn, I'd say. We do a lot of statistical modeling nowadays, so it's quite a flexible language. I like your example; I think it's appropriate in the context of software engineers. And the concept isn't so different for business folks: a case. I totally see the value you guys have with that approach.

TIM: Where are we going in the next few years? Because I feel like the tech is changing and developing so quickly. What is an analyst, or even a data scientist, going to be doing in a couple of years? Is it the case that the mix of technical versus business skills is going to change? You could argue that if programming is now something you do with an LLM, and you don't have to write a line of code from scratch, maybe that bit's gone, and maybe the softer skills are relatively more important. Do you have any view of what's going to happen? And if so, does it inform the profile of the candidate you're currently looking for?

KAI: It's a great question. It does not yet inform the profile of the candidate we're currently looking for, except for the fact that I'm biased towards more fresh grads. Funny enough, one of our data scientists just asked me this question this morning in chat, just curious. My opinion is, well, let me start on one particular topic: AI, and what's going to happen with it in regards to technology roles. I think AI will, in its essence, replace quite a few process-oriented roles. However, it'll create some new ones in response, right? The technology I'm really bullish on is agents, which are more or less like a really scalable first-year analyst today, but in the future might be a really scalable college professor, I don't know, right? But agents will require trainers. And trainers, in skill set, are actually more like HR people than they are programmers, right? That'll be the really interesting paradigm shift: is programming now going to be fully the responsibility of the OpenAIs or DeepSeeks, these AI factories? And then what's the role of technology teams within businesses? I think a huge new role that will be created is these agent trainers, these agent managers, because the productivity of the agents still needs to be managed and contextualized in the context of each company, even though we buy them from OpenAI or, God forbid, DeepSeek someday. That's quite new, right? Someone specialized in managing AI agents. I absolutely think that will be a role within the next 10 years, if not five, or an even shorter time frame than that. In regards to the impact on analytics: indeed, you have a very scalable first-year analyst. If I look at the questions that most of my department of 30 gets on a daily basis, I would say 70% to 80%,
I'm spitballing the math here, but those are the kinds of questions an AI agent should be able to answer. "Show me the turnover of this country in this year." My goodness, if you just spent five minutes on the intranet and searched, you could find that yourself, but it's easier to ask one of my analysts on Microsoft Teams, right? By all means, in the near future an AI chatbot will be the one that responds to those types of questions, and already that empowers the organization quite a bit. Do first-year analysts have to fear for their jobs? In some sense yes, but in others no. I'm bullish, I'm optimistic on this technology, and I believe that if AI answers the more routine, mundane, boring questions, then the analysts are freed up to go deal with higher-value activities, to do a transformation project over several months. If you're freed up from answering these daily questions, go pursue a path of exploration, which could lead to a lot of value that you otherwise would not have the headspace to pursue, because, again, you're freed from the mundane. And we're always going to need business analysts, because the context of a business will never be, well, never is a strong word, but it will be very difficult to duplicate and clone in full fidelity in any digital way, right? Anywhere there's a human relationship, that will be hard to replicate and model fully in any sort of AI. So we're always going to need analysts, but I think analysts need to be more specialized in exploration, in high-value-creating activities, and less in maintaining the mundane. One more anecdote: in another life, I was a Canadian Army cadet, so I still have some purview on military matters. Look at the U.S. Air Force's approach to drone warfare; drones in Ukraine, of course, are very prominent nowadays. The U.S.
Air Force says: we're going to use a drone instead of a pilot, and the buzzword now is the optionally manned fighter, when the task is mundane, when it's repetitive, and when it's dangerous. Now, we work in agriculture, and I'm in analytics; I don't know if dangerous will ever be a thing. But if the task is mundane and repetitive, like the Air Force says, I would also believe that in due course it could be automated away to AI agents. That frees up the pilot to go do more high-value things, just as it should free up our analysts in the future to go do more high-value things.

TIM: Yeah, I feel like one way to think about it is: what proportion of all the decisions that have ever been made in our business in the last year were perfect? I would say much closer to 0 percent than 100. So there's a lot of upside for better analytics. And how many things are we not even measuring, like, at all? That is going to take, I would have thought, years of additional analytics work. If we liberate our analysts from the mundane, the "what is X, what is Y" when it literally already exists in a dashboard and we just need to point the person towards it, surely there's a pretty substantial unlock there. And these exact same people, with the same current skills, could just do a lot higher-value work.

KAI: For sure. And there's a small paradigm shift that I think will happen for the field of analytics in general because of the widespread adoption of AI. What are the stages of analytics? You've got descriptive, then you've got predictive, then you've got prescriptive. So I'm not telling you what's happened; I'm not just telling you what's going to happen; I'm telling you what you should do because of what's going to happen. And now you take it even a step further: I'm going to do it for you; you just have to hit approve. And ultimately, fully automated decision-making, in areas that don't matter so much or where humans cannot participate, for instance, high-frequency trading in stock markets or other commodities. In the business context, it's scary to let the AI decide all these things, although in lots of examples the automated analysis is much more accurate and quicker than what humans can do. But there'll be a role for humans for many more decades, which is AI whisperers, right? AI managers, asking: is what the thing is saying actually relevant and correct? I don't know how many centuries it will take to automate that out of our human psyche, really.

TIM: One framing that I think is always helpful for people like you who have small children is to think: what are you going to recommend to your daughter? If you had to pick one of these two, "Hey, Amelia, you really need to focus on your AI skills" or "your people skills," and you had to make a prediction about which is actually going to end up being more valuable for her, where would you put your money?

KAI: People skills. But I would do people skills within the context of some sort of industry. This sounds funny, right? The dominance and prevalence of AI in the near future will make programmers less and less relevant, except if you work for OpenAI or the other AI factories, so to speak, because programming will be offloaded to them. But for Amelia, in 20 years' time, when she's looking at her career, AI will just be the fabric of how we work, right? It'll be so embedded in our day-to-day that she won't even think about it. She won't know a world before AI in a very real sense. Just like we take the internet and Google for granted, I think she'll take AI for granted. So does she need to be an expert in AI? I'm not so sure; in 20 years I think she just needs to be a participant in the digital world of that time. I could be drastically wrong about that, but I would hope that she gets her passion from a subject matter, and AI is then the solution, or an enabler of her realizing that solution, ultimately. She could also be a nerd and love AI in and of itself, I don't know. But I think for most folks it's a bright future: if you're passionate about a topic, AI in the future should help you reach it.

TIM: Yeah, I'm certainly very optimistic and bullish myself, although there are some issues we'll have to resolve, but I'm a net optimist, I'd say, on AI currently. Kai, if you could ask our next guest one question, what question would that be?

KAI: About anything? Okay, presuming they have some background context in AI: I want to know what they think about the possibility of Skynet. We just talked about the really obvious positive elements of AI, but there is actually an underlying element of risk in how we, as a society, coexist with something that one day may be "smarter," quote unquote, whatever that means, than us. It already processes faster than us, right? But faster processing plus agency in the, quote unquote, physical world might also be part of Amelia's adulthood, 20 years from now. What might that look like? Science fiction has taken lots of attempts at answering this question, for better or for worse, but I'd love to have an expert take a shot at what this means in their context.

TIM: I will level that question at a guest of ours next week and see what they say. It's been a great conversation today. We've covered a lot of different ground. And I think our audience is going to be a little bit richer for having heard your thoughts today. So thank you so much.

KAI: No worries. It's my pleasure. Thank you for the opportunity. Always happy to chat with folks passionate about analytics.