Alooba Objective Hiring

By Alooba

Episode 12
Yoav Reisner on Talent Acquisition in Israel: Challenges and Best Practices

Published on 11/15/2024
Host
Tim Freestone
Guest
Yoav Reisner

In this episode of the Alooba Objective Hiring podcast, Tim interviews Yoav Reisner, Head of Data.

This conversation between Tim and Yoav explores the many challenges and shifts in the Israeli tech job market influenced by global events such as COVID-19, the war in Ukraine, and the evolving landscape of remote work. The discussion touches on key areas like the importance of structured interviews, the impact of military experience on job candidacy, and the shift from prioritizing high salaries to seeking job stability. Yoav provides an in-depth analysis of how to attract the right talent by showcasing company stability, the necessity of structured interviews to mitigate bias, and the importance of being prepared to fire quickly if a hire isn't the right fit. The episode also delves into the implications of tools like ChatGPT in both enhancing and potentially complicating the hiring process.

Transcript

TIM: What do you think the biggest challenges are right now in attracting talent to your roles in Israel, and have you found any strategies that are especially important or relevant in the Israeli market compared to other countries, perhaps?

YOAV: So the Israeli market, I think, is a good reflection of the global market in many ways, in the sense that we had a lot of layoffs here, especially in high tech, around coronavirus times in 2022, and then the war in Ukraine. Generally speaking, a lot of investors pulled out, which caused a lot of layoffs, and then those layoffs caused a culture of layoffs within companies. Without dwelling on that too much, I think there are two things that make Israel unique. The first is that we're currently, without expanding too much on the subject, in the middle of a war, so the war is just another wedge in the economy, a kind of continuation of recent global events, which causes even more layoffs, more instability, and less investment. Although I will say that investors who are used to investing in Israel are continuing, so that saves some of it. Interestingly, investors moved from a mentality of investing in growth to a mentality of investing in profitability. Employees, too, moved from a mentality of finding the highest salary to a mentality of finding stability, in a sense. So lately I've been seeing more and more employees asking in the initial interviews, which is new to me, questions along the lines of: Is the company stable? Did you have layoffs recently? These are very interesting shifts, and I think you can attract talent by showcasing stability as a company, and you need to think about how you can do that. The second thing unique about Israel, although this might be a global trend, I'm not sure, is that we're getting more and more non-academically oriented candidates, meaning they either did boot camps to learn a certain profession, plus maybe some kind of experience; it really depends on whether you're hiring a senior employee or a junior one. And this is very new to me personally.
I'm getting more and more people that have just been released from the army. In Israel, military service is mandatory; without going into the war, it's always been mandatory, right? Even in peacetime. And I'm getting more and more people just finishing their mandatory term, and they've been doing high-tech-relevant positions in the military, which is very new to me. They know how to phrase it, and their positions within the military are spoken of as high-tech positions, right? This is the army's way of trying to become more advanced, so I get more people coming into interviews saying, Look, I'm going to be released from the military in a few months, and I'm a product analyst in the army, or something along those lines. And this actually poses a big challenge for me, because it's already hard enough. I'm from an academic background, which is my bias in a sense. It's easier for me to get along with and understand people that come from academia, because I know what they've been through; it's more of a holistic approach to knowledge. It's hard enough for me to understand what people that went to boot camps know or don't know; with people coming from the military, it's a huge challenge. Sometimes they're not even able to speak about what they did, so you start this conversation about different skill sets, and you're trying to understand. So I feel I'm answering several questions at the same time here, but as a company, if I'm trying to attract someone, I think the biggest differentiator right now is stability and options. I feel like we can get away with much more on salary than we used to be able to. It's also important to say that Israel is very geographically sensitive to where the company is: if you're in the center of Israel, the salaries are much higher, and real estate is much more expensive, et cetera.
If you're more in the periphery, which is the case with my company, whose main offices are in the southern area of Israel, you can get a lot more with somewhat lower salaries while promising growth, et cetera. So that's, I think, my best answer on how to attract; how to assess is a totally different story.

TIM: That's a great analysis of the market. One thing that struck me as you were discussing that was the labor pool; I was trying to compare it to a country like Australia. In Australia, I haven't done the analysis, but I would have thought that probably the majority of people who work in tech and data are from overseas: they've come here as students and gotten a skilled visa, or they've come on a skilled migration program. And so we had this strange period throughout COVID where we had complete lockdown and no migration at all for two years, which created a very unusual situation where there was basically 0 percent unemployment in the market and a massive increase in salaries. Now I've seen a return to normalcy as more migrants have come in to address that imbalance. What about in Israel? Are most of the tech workers in Israel Israeli, or are they coming from other countries?

YOAV: So, I don't have statistics about this, and it's important to say that I'm giving my personal experience here, but my experience is that the vast majority of tech employees in Israel are not outsourced. So internal programmers, analysts, data scientists, not salespeople, not customer support, et cetera: the vast majority are Israeli, I think by a large margin, so COVID in that sense had less impact. The biggest impact COVID had was as an accelerator of the concept of working from home in Israel, and I do think it pushed some companies to outsource certain aspects of the company, but obviously not the main competency of the company, which in tech is usually R&D. So R&D is, I would say, 99 percent Israeli in Israel. We don't have a lot of migration, and I think that makes sense, because most people that decide to migrate to Israel are not doing it for work.

TIM: It's for cultural reasons.

YOAV: For cultural reasons. The cases I've seen of people actually moving to Israel for work are the exceptions. I do know that there are some academic programs that accept a lot of people from abroad, but I don't think there are a lot of them in the job market. I think most people that work in the Israeli market are people that, for cultural reasons, for religious reasons, for any kind of reason, decided to do what in Israel is called making aliyah, which is essentially to move permanently to Israel and become an Israeli.

TIM: You mentioned earlier a really interesting problem I'd never thought about. When you're interviewing candidates who've come from a work environment that's inherently secretive, where they can't really divulge the details of what they've done, how do you go about evaluating their skills and their experience in an interview in that kind of environment?

YOAV: Build a structured interview. I think it's very important that you give the same interview, mostly, like 80 to 90 percent the same, to all candidates. Otherwise, I feel it becomes very hard to compare the interviews. And with the assessment of skills, talking about what these candidates did and their experience and what they know how to do is very hard, so I just become much more reliant on knowledge questions and assessment. I do still go over past projects, and right now I'm trying to figure out a better way to do it, because I don't want to treat them as people that don't have work experience, which is what happens when you don't understand what they did. I think it's a little bit unfair, but the army itself, which is where most of the cases of people that cannot really divulge what they did come from, has not found a good solution for this. I think the military is still getting used to seeing itself as an employer of tech workers, so I think it's going to take some time, even culturally, for Israelis and Israel to get to a point where we're able to hear about a certain role. There are very specific signals: you're in a certain unit in a certain part of the army, and anyone who really knows what that means gets a good grasp; if you're in intelligence, it's a different story than if you're in the navy, right? But the big mess right now is that most of these roles are inside intelligence, and most of them don't have anything to do with intelligence. You could be helping other units in the military develop their BI or something, and it's very hard to understand what that means, especially because the situation in the Israeli army in terms of information systems is, honestly, in my personal experience, a big mess. Each unit has something different. It's a huge organization; it's very important.

TIM: Like any other bureaucracy in a...

YOAV: Exactly, exactly. In any huge bureaucracy, things become very complicated and very inefficient, so the people that work in these environments struggle a lot; a lot of their experience is struggling with inefficiency in a huge organization, which I assume might be relevant for certain companies, but not for me personally. So they can talk a lot about that, but it's not so interesting to me.

TIM: You touched on something earlier. You mentioned that for your interviews, in general, you like to follow a more structured approach because you found it easier to compare candidates. I feel like this is worth unpacking a bit because I see a lot of companies still doing a very ad hoc kind of approach, and so I would be interested to hear more about why you do structured interviews, what that really means, and how you feel it's better than an unstructured approach in some cases.

YOAV: It is important to say that I have some history in recruitment. Just to give you a brief overview in one or two sentences: I was a recruiter in the military, responsible for interviewing people that were going to be recruited and determining whether or not they were fit for certain roles. This was a non-technical interview, a first personality interview, you could say, and I learned a lot about interviewing in that period. I did a four-month course in the military just about how to interview people, so I have some background in that regard, and it was of course a natural transition for me to work at a big recruitment company after I finished the military. At some point I realized that I'm much more attracted to the technical aspects, and I ended up studying industrial engineering with data science as the main focus, but I still have that background. And I think having a structured interview is extremely important. I feel that the ad hoc approach just allows you to sneak in so much more bias without noticing. I think bias is not something that you need to run away from completely; having a good feel for whether this is someone you'd like to work with or not is important. Bias is a very negative kind of word, right? But I also pull into it the feeling of whether or not this is the kind of person you would have a good time working with, which is a good thing. The less structured the interview is, though, the more you give certain biases a chance to sneak in unchecked.
Let's call it like that. So you might have had a bad morning or a good morning, or the coffee machine didn't work, or the person going into the interview was a little bit tired or stressed or whatever, and that will take a much bigger chunk of your judgment, and you won't be able to say that's what caused you to decide these things. So I think the important part is to understand why you think what you think about the candidate. It's okay if you say, I don't like that candidate because I think I would not have a good time working with him, but you need to be able to say in the same breath that he's pretty good technically, right? If you just have this vague idea that this guy is not good for the company, you might be missing a good reason why maybe he is the best candidate if you do need to recruit right now. A structured interview gives you a much better way to keep yourself in check and a much better way to compare candidates to each other. I do think you need to keep it semi-structured, in the sense that you should not be robotic: if a candidate says something interesting that's unique to that candidate, you should ask a follow-up question. You should try to understand that person specifically; you should not avoid having uniqueness in an interview. But having a skeleton of an interview, I think, is crucial. I personally really don't like people that freestyle and then just give you the halo effect of how they felt in an interview. And I will say that while I like a structured interview, I do like questions that are built into the interview, part of the structure, but not really structured within themselves. So I like logic questions. Let's talk about an analyst, for example: I will ask some SQL questions, where there is a right answer, although I will say that even in SQL questions there are many aspects I'm looking at apart from whether the output is correct or not.
So I like to look at how the person organizes the code; I like to look at how much time it took to get to the answer; I like to look at how the person approaches the problem. Personally, when I'm writing an interview, and this is also important, I write every part of the interview before I start a round. I have a document with all the questions that I want to ask, and for each question I have a set of parameters that I want to measure, let's call it, and I like to take it one step further and define a scoring mechanism for each parameter. All of these things exist to safeguard against, essentially, in a big sense, bias; really, the bottom line is that I'm trying to make as many objective decisions as possible. It's impossible to make it completely objective, but it helps to have very focused, very well-defined parameters that I'm scoring. I feel that Boolean scoring is very good in this sense: you ask, Okay, was this person's SQL organized or not? and you say yes or no, and then you have five parameters, and you go yes or no on each. The thing this solves, and I found this personally from experience, is that if I'm giving a 1-to-10 score, I find myself going back after five interviews really wanting to change the scores of the past candidates, simply because I'm still calibrating. I might have my own concept of what's good or not good, and then after three or four candidates I realize the test I've built is probably much harder than I thought it was, and I was much harder on the first candidates than on the last ones because I calibrated a little bit. That's okay if you're aware that you need some calibration once you've seen a few interviews, but if you keep the parameters and the scoring mechanism really simple and really on point, you need less of it.
So, as I was saying, I do have SQL questions, et cetera, but I also really love logic questions, and I can give you two questions from my last round of interviews that I think are a good example. One type of question was popularized in Israel by an app attribution company. Before I run interviews, I ask around, friends or colleagues or past colleagues, to understand what questions are being asked in the market, and a question that I heard a lot from analysts in Israel goes like this, and it's a nice question: how many apps do you think a person has on their phone on average? Now, this is a highly vague question, right? I would put it in the same category as: are there more toilets or humans, right? That type of question, which I personally like. And I think this one is particularly brilliant because it's very easy to define statistical moments for it: you cannot have zero apps on average, because phones come with apps, and you cannot have an infinite number of apps, because there is a certain amount of storage, so you can talk about the distribution, about minimums, maximums, medians versus averages. So I think that's really nice to do, but the important bit for me is that when you give a question that doesn't have one right answer, there are still right directions of thinking about it, and knowing a little statistics lets you talk about things statistically; but as the interviewer, you still need to define very well-defined goals of what you expect in order to score. So in that regard I won't have a parameter like "the person got the right answer"; instead I would ask things like, Did the person know by himself to speak in a statistical way? I do have an anecdote here: I had a candidate in the last round that I asked a variation of this question.
I didn't want to ask that specific question, because it was very popular at the time in Israel, so I asked how many pieces of furniture there are in a room on average. It's the same story, and I had this one candidate that just started looking at the rooms around us and counting furniture, right? That's a good example of someone not trying to look at the big picture, not trying to think about which types of rooms exist in the world and, in those different types of rooms, what types of furniture you could find. So, abstracting this idea, you could, for example, score: did the person think abstractly about the question, or was he highly influenced by the surrounding environment? These are really measurable scoring parameters for a very vague question. Did the candidate know to say that the average must be higher than the median? These are things I'm able to score objectively even though there's no right answer.
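[Editor's note: the yes/no rubric described here can be sketched in a few lines of Python. Every question name, parameter name, and example answer below is illustrative, not taken from the episode; this is just a minimal sketch of the Boolean-scoring idea, where every candidate is checked against the same fixed checklist.]

```python
# Each question has a fixed set of yes/no parameters, so every
# candidate is scored against the same checklist (names illustrative).
RUBRIC = {
    "sql_question": [
        "code_was_organized",
        "reached_correct_output",
        "finished_in_reasonable_time",
    ],
    "estimation_question": [
        "reasoned_statistically_unprompted",
        "discussed_distribution_not_just_average",
        "abstracted_beyond_the_room",
    ],
}

def score_candidate(answers: dict) -> float:
    """Return the fraction of rubric parameters marked True.

    `answers` maps question name -> {parameter: bool}; any parameter
    not explicitly marked counts as a no.
    """
    total = sum(len(params) for params in RUBRIC.values())
    passed = sum(
        1
        for question, params in RUBRIC.items()
        for p in params
        if answers.get(question, {}).get(p, False)
    )
    return passed / total

# Example: a candidate who hit 3 of the 6 parameters scores 0.5.
candidate = {
    "sql_question": {"code_was_organized": True, "reached_correct_output": True},
    "estimation_question": {"reasoned_statistically_unprompted": True},
}
print(score_candidate(candidate))  # 0.5
```

Because each parameter is a plain yes/no against a pre-written checklist, scores stay comparable across a round without the mid-round recalibration that 1-to-10 scales invite.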

TIM: That is an outstanding description of structured interviews, and such a clear process that you have. There are so many things I want to pick at. One is that you mentioned you could have almost an expectation of what a good answer would be, checking off whether the candidate mentioned this or went in this direction. What I've found quite difficult in the past is that if it's a brand new question that I've never asked anyone, I almost have no idea what candidates are going to say, and it's not until I've interviewed several of them that I start to notice the patterns; it's almost like I can only get my criteria after five interviews. Have you experienced that, and if so, do you have any ideas of how to solve it?

YOAV: Yeah, I've experienced it. It's a little bit similar to what I mentioned about calibration: it's not only calibrating the scoring of who's good or bad; it's also calibrating the scoring mechanism itself, you could say. So I've definitely encountered that, and the recent round of interviews is actually an interesting use case of this. I've gone ahead and tried to be a little bit creative when I can, and I decided to ask about the subject of statistical bias. There is a very famous case of statistical bias called survivorship bias, and just for anyone hearing this who is not aware of it: there's a very famous case from World War Two where the Allies sent planes to fight Axis planes, and a lot of people lost their lives in these airplane fights, and those in charge wanted to understand how they could strengthen the airplanes that returned to base so they would survive those fights better. So they started marking where they found hits on the planes, bullet holes, right? They diagrammed all the hits, and they went to the army's analyst, showed him all the hits, and said, Okay, where should we focus on increasing armor? The survivorship bias here is that most people tend to say, Look, I see a big chunk of hits in this area, a highly dense area of hits, so we should focus on that, because people are used to saying: where I see more data, this is a more important area. It's a classic bias, and actually, in this case, you really need to think critically about how the data is collected and understand that you can only collect data from airplanes that actually returned to base.
And so the relationship between survivorship and density of hits is actually reversed: where you have blank areas, the chances of getting hit there and surviving are the worst, and the less dense the area, the less likely you are to survive a hit there. So what I did in the last round of interviews was count on people not having this general knowledge, and generally speaking, I was correct; only one or two candidates out of around twenty actually knew about this. What I did in the interview was just show this image, the famous image on Wikipedia of the plane with the markings of the hits. I showed them this image, told them the story that I just told here, and said, Okay, so you're the analyst, and you need to tell me where to strengthen the airplane, and I just wanted to see where it went and whether or not they figured it out. And the interesting part was that I had a very good follow-up question. After they figured out that you need to strengthen the less dense areas (if you think about it like a scatterplot, you need to strengthen the areas where there are no dots, right?), my follow-up question, which I really liked, was: how could survivorship bias happen in app analytics? Because we're an app company, right? I wanted to see whether they were able to extrapolate their knowledge and understanding of the problem to our domain. What happened was that in the first interviews, when candidates didn't figure it out at all, I gave a few hints if they were completely wrong about it, but I did not force them to understand the issue at hand. If they didn't understand it, I was fine with it; I moved on. But then a few of them did figure it out, and so in the later interviews, if they didn't figure it out completely, at some point I tried to gently push them in the right direction until they did.
I did note in my mind that they needed my help to get there, but I did want to ask the follow-up question, and in order to ask it, I needed them to understand what survivorship bias is, and so that may have made me a little bit unfair towards those first interviews where I didn't do that. I don't know if that's a great example, but that's what I thought about when you asked, and this is a change to the interview itself in a sense, after calibrating and understanding what kind of answers I should expect from the candidates. Do I have a way to make this less burdensome? I think the only way, maybe, is to rely on benchmarked questions: either you have a track record of questions that you've asked before, or you've been doing this for a long time and have a set of favorite questions where you already know what people know or don't know. That's one way. Or, if you don't have that experience, you could rely on speaking to other recruiters; maybe you have some friends whose companies have people recruiting for data positions and who don't mind talking to you a little, so you could rely on others' experience to build something with a benchmark already built in. Personally, I will say that while this is good practice, I will probably always try to test one or two new questions each round, simply because as someone who recruits people, I need to find ways not to burn out.
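[Editor's note: the survivorship-bias story can be made concrete with a toy simulation. The section names, hit counts, and fatality rule below are all invented for illustration; the point is only that uniform damage plus selective survival produces exactly the misleading "dense areas" pattern described above.]

```python
import random

# Planes take hits uniformly at random across four sections, but any
# hit to the engine is fatal, so engine hits are never observed back
# at base: we can only record hits on planes that return.
random.seed(0)
SECTIONS = ["engine", "fuselage", "wings", "tail"]
FATAL = {"engine"}

observed = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    hits = [random.choice(SECTIONS) for _ in range(3)]  # 3 hits per plane
    if any(h in FATAL for h in hits):
        continue  # plane is lost; its hits never enter the data
    for h in hits:
        observed[h] += 1

# Although hits were uniform, the survivors' data shows no engine
# hits at all: the blank area is exactly the one to armor.
print(observed)
```

Running this always yields `observed["engine"] == 0`: the sparse region of the diagram is the deadly one, which is the reversal Yoav asks candidates to spot.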

TIM: Yeah, okay. Another approach to the lack of data at the start, the sort of empty-restaurant problem, might be asking some of your team: if you have a few current analysts on the team, you could ask them and get a starting point for some reasonable answers. Thinking about it now, maybe a large language model could help: if you just asked ChatGPT and Claude five or ten times, you might get a few ideas, and you can start to build a picture of what candidates might actually reply with.

YOAV: Yeah, that's a very good point. ChatGPT is a very interesting subject, and interestingly, one of the ways I've found myself adapting to the ChatGPT era in recruitment comes from this: I feel that it's a very important tool, and personally I feel that people who know how to utilize different AI tools are better employees. They're better at doing their job. But there's a catch, because I've encountered many people that are one-trick ponies: any problem they have, they throw at ChatGPT or Claude or whatever your favorite LLM is, and if they're not able to solve the problem that way, they just get lost. They don't know how to Google things; they don't know how to use Stack Overflow, and this is a huge problem.

TIM: I'm interested in that; just to interject for a second. These tools are so new, and you're basically saying that you already know people who, I don't want to say are addicted to them, but rely on them completely to do their jobs. Are these fresh graduates who've only ever known these tools, like they're almost AI natives, or are you also noticing people who have lost some facility with the tools they used to use?

YOAV: Yeah, both. With the people that are just now finishing their degree or just entering the market, it's very clearly starting to be the case; I've met more people like that than I'd like. And I also see it, to a lesser extent, elsewhere. I haven't interviewed for a senior position for a while, so I can't speak to that specifically regarding ChatGPT; I should be able to soon. But I see it in my colleagues, right? With some of the people in R&D at our company, you can see that the first thing they do is ChatGPT, and then they might come to me or to the CTO when they can't find the solution, and we ask them, Did you check Stack Overflow? and they're like, Ah, no. Okay, so why did you come to me at this point? So I definitely see people becoming lazier, and I definitely see people just entering the market feeling like ChatGPT is the savior that will solve all their problems, and this is a huge problem. One of the ways I've changed how I do interviews in the ChatGPT era, let's call it, is that when I build questions, especially questions that I'm going to give as a home assignment, I try to build them in such a way that when you give them to Claude or ChatGPT, they give an incomplete answer, and in order to get a complete answer, you need to push them in the right direction. So I might start with a first draft, ask ChatGPT to solve it, see how it tries to answer, and then try to insert something that would trick it, and I use that as the home assignment. That allows me to catch those that exclusively use ChatGPT, especially if it's an analytics assignment where I give some data and ask them to analyze it. I usually like to insert some big mistake in the data, but not so obvious that ChatGPT understands it: a big missing chunk, or an outlier, just something a little bit strange that's not a pattern, the...

TIM: sort of thing that a well-trained analyst should pick up but that for AI, for whatever reason, is impossibly difficult

YOAV: Yeah, because right now an AI does what you ask it to do, and I rely a little bit on psychology here: I assume these candidates are just going to copy the question that I ask and paste it into ChatGPT, so I specifically phrase the question so as not to point the AI in the direction of finding anomalies in the data, for example, and then it won't look for them. If the person is a little bit smarter than that, he rephrases the question, or looks at the data by himself first, or does anything else that avoids the trap; but I do try to keep a minimum guard against people that just throw the issue at ChatGPT.

TIM: It's really interesting what you've observed, because it sounds like the people who are, let's say, AI natives, the really young people who adopted this tool toward the end of university and are now working for people who are a little bit older and can remember two years ago when it didn't exist, have already had a profound change in their mindset and their behavior and their habits, such that their first port of call is now ChatGPT. That's quite interesting and probably, on average, better for them. I feel like, for myself, I've had the opposite bias, where I should use it more; it's my lack of patience every time I use it, where I'm like, Oh, I need to keep prompting, and eventually, Oh, it's too difficult, I'll just do it manually. So I feel like the people who have made the full switch have maybe made it slightly too early, but it's still going to be ultimately to their benefit, I suspect, as long as they still remember the rest of their tool set; they can't just use the hammer for everything.

YOAV: Exactly, so that's the problem, and I completely agree with you, by the way. Interestingly, and I don't know if this is actually related, I'm trained as a machine learning engineer, so I understand the nuts and bolts of how ChatGPT works, but I still use it less than some of these new candidates. I like to think of myself as just getting a little bit old, in the sense that it's getting a little harder to learn new tricks, so I'm completely with you: I should be using it more than I do. But the main problem with the kinds of candidates and people that have really gotten addicted to it, in a sense, is that they either never learned any other approach to solving issues or they've forgotten that there are other approaches, and I see that happening, unfortunately, faster than I would have assumed. Don't get me wrong, ChatGPT is a very impressive tool if you know how to use it; it can be a huge time saver. But you should still take anything you do with ChatGPT with a grain of salt, and a lot of people don't seem to understand that. They use it as if it were the same as a Google search, which is very much a mistake.

TIM: Yeah, I feel like the challenge with ChatGPT is that if you ask it something about a topic you're not an expert in, you will never be in a position to really critically evaluate its answer, and as a first pass it's, Oh wow, this sounds great. But if it's a topic you know deeply and you start to really dig into it, you're like, Hang on, you missed this; you missed this; this is bullshit; this is a lie; this is a misrepresentation. You almost have to be an expert in the area to be able to validate its output.

YOAV: And honestly, I think the only way you should use ChatGPT is if you're an expert. Where ChatGPT excels, in my mind, is where you're able to save time by handing it the dirty work. Say you're trying to write some function, and you know what that function should look like, or you need to write a configuration file, a huge JSON or YAML file with a bunch of parameters, and it's going to take you some time to write it. That's where ChatGPT excels, right? It can write that file much faster, and you can validate it, so that's exactly where you should use it. Or maybe you're trying to explain something to someone, and you already know how to explain it yourself, so you can validate what ChatGPT produces. Unfortunately, I see a huge trend of people doing the opposite: using ChatGPT for exactly the things they are not able to validate. The better ChatGPT becomes, the less of a problem this is, but it's still a huge problem. People forget that ChatGPT is trying to give the most plausible-sounding answer, right? And that's a very tricky thing. Even though it limits how you can use ChatGPT, I think you should definitely use it only in areas where you have enough basic knowledge.

TIM: Oh, it's very good at convincingly lying and seeming shiny and correct. It's almost like the world's biggest psychopath or something like that.

YOAV: Exactly, because I like to think of ChatGPT and Claude and all these LLMs as the guy who gives the nicest-sounding answer in a forum, and that doesn't necessarily mean it's correct; it's just the nicest-sounding answer. It always sounds really good; it always sounds logically correct. It's very coherent, but it could be totally wrong. People are not validating ChatGPT; they're just blindly trusting it, which is becoming more and more the case even for data-oriented people, who should, I feel, be more inclined toward critical thinking and questioning the underlying data: How was this data collected? What's happening in the data? And still, these people also just push things at ChatGPT, get some insights, and feel that ChatGPT has spoken.

TIM: Our new god, basically.

YOAV: Exactly, exactly our new deity.

TIM: So let me ask you this: you've worked without a formal talent acquisition team and done the recruitment yourself, obviously leveraging the recruiting experience you'd already built up over your career, so that's helped you a lot. What advice would you give to someone in, let's say, a smaller company, maybe a startup or a scale-up, who has to build out their own team without a talent team in place to help them? Is there any kind of approach you'd recommend they take? I guess you've already talked about the structured interview approach and trying to measure things there, but are there any other suggestions you can make?

YOAV: Yeah, unfortunately the biggest suggestion I'm going to make is to free up a lot of your time. It's going to be a hard, long process, and because you don't have anyone to do an initial filtering for you, you're going to have to do that somewhat yourself. What we do in the company is try to delegate this process each time to a different manager. In the company we're three managers: there's me, there's the CTO, and then there's the CEO, and we do the recruiting mostly ourselves. So when it's a hardcore programmer role, I'll do the initial filtering, and then the CTO will do the technical interview. When it's an analyst, I might let the CEO or the CTO do the initial filtering, just a ten-minute phone call or something like that, and then I'll do the technical interview. There are two benefits to this. One: I do think it's very good to make a candidate go through at least one more manager in the company apart from yourself during the process, because you might have a very big bias in terms of whom you like to work with, but another person might really balance you out, saying, Look, I realize he's very much your type, but you should pay attention to x, y, and z. So if you don't have a talent acquisition team, do try to pull in a manager you trust, even if they're in a very different kind of role from the one you're hiring for, and make them the HR, because any of us can, better or worse, just talk to someone and understand whether they're an okay person and whether you should let them go on to the technical interview. I do think there is art in that as well; I'm not saying HR doesn't have a very important skill set in that regard, but if you don't have an HR person, you should sit down with someone you trust who is different from yourself, explain what you're looking for, and let them do it.
You should not do both the screening and the technical interview, so that's one thing I will say. And you should talk to that person, or have them write a rather verbose summary of what they thought about the interviewee. Then, again, clear out a lot of time, because I feel it's hard to screen in the initial interview, which means you're going to do a lot of technical interviews, and that's life. My last recommendation: I see that a lot of interviewers have the tendency to force their way through the entire interview, when sometimes you ask the first question, maybe the second question, and you already realize it's not going anywhere; this is not what you're looking for. So I think it's a very important skill to know how to elegantly stop an interview early without the interviewee feeling like they're not good. Personally, I like to finish every question with the candidate feeling that perhaps they had some good things in their answer, even if they didn't. I don't like to finish a question and move on if the answer is "I don't know"; I try to coax them into some type of answer, just because I don't like to leave them feeling like they truly don't know anything. Granted, if they started their answer with "I don't know," they're already in a bad position, but I want them to try and see what they can come up with. Still, you need to learn to finish an interview elegantly. Maybe ask some final questions, like, What kind of expectations do you have from us? Do you have any questions about the company? These give the interviewee a sense of natural closure to the interview. But learn to cut interviews short; if it's a no-go, it's a no-go.

TIM: Yeah, there's no point wasting each other's time, especially since, as you said, you can exit it elegantly. That's a great list of suggestions. One I would add, based on our conversation and the other people I've spoken to over the years, is that your approach to hiring is extremely well thought out. I wouldn't say it's just that it's structured; it's that you put a lot of thought into each detail, and I feel it's that planning that sets you up for success once you're actually doing it. A lot of companies might jump into hiring: Oh, we need to hire this engineer quickly; let's whip up a job ad and start doing something. It's a chaotic process that inevitably fails more often than it should. But it's clear that if you have a really structured plan for what you're looking for, what your questions are, and what a good answer to each question looks like, you've planned out the whole process. You have to have enough patience to do that; you have to carve out enough time and, to your point, expect that it's going to take a lot of time. Then, once you go, go as quickly as possible, but with a really solid plan you can execute and be more successful.

YOAV: I will say, by the way, as a last touch, that everything I said is important. You need to interview thoughtfully; you need to think about it; you need to plan. But hiring is very complicated. It's very hard to really know, even if you do several interviews, and I try not to make the process too complicated, with too many steps; I feel the candidate gets burnt out, and you get burnt out. I'm not really in favor of doing more than two, or a maximum of three, steps in a recruitment process. But I will say it's often impossible to know for sure, and I think people need to be better not only at hiring but also at knowing when to fire someone. Sometimes you hire someone you think is a good candidate and figure out very quickly that that's not really the case. So either put into the contract that the first month is a tryout period for both sides, or whatever works for you, but people need to get better, I feel, at learning how to let go of employees who are not a good fit, because hiring is just half of the deal.

TIM: Awesome.