Alooba Objective Hiring

By Alooba

Episode 122
Sankara on Improving Interview Practices by Balancing Thoroughness and Candidate Comfort

Published on 3/6/2025
Host
Tim Freestone
Guest
Sankara Prasad Kondareddy

In this episode of the Alooba Objective Hiring podcast, Tim interviews Sankara Prasad Kondareddy, Head of AI and Data Science at Amplify Health Singapore.

In this episode of the Objective Hiring Show, Alooba's founder, Tim Freestone, interviews Sankara Prasad Kondareddy, who dives deep into the nuances of effective interviewing. With over 15 years of experience in AI and ML, Sankara shares valuable perspectives on improving the interviewing skills of recruiters, balancing technical and cultural fit, and the potential role of AI in enhancing the hiring process. They discuss the importance of crafting a balanced and candidate-friendly hiring process, the pros and cons of remote interviews, and the ethical considerations of using AI for screening. The conversation also touches on inclusivity and practical solutions for eliminating bias from hiring, drawing on Sankara's extensive experience conducting hundreds of interviews.

Transcript

TIM: We are live on the Objective Hiring Show today. We're joined by Sankara. Sankara, thank you so much for joining us.

SANKARA: Thank you. Thank you for inviting me. It's a topic well worth having a chat about, and I'm hoping a lot of insights come out of this interview today.

TIM: Yes, I'm sure they will. And I think a nice place to start is always just to hear a little bit more about our guest. Who are you? Who are we speaking to today? I think that'll help us contextualize the conversation.

SANKARA: Yeah. I am Sankara Prasad, an AI/ML leader. I have about 15 years of experience applying data science and analytics tools to solve real-world business problems. I started my leadership journey about four years back and have been involved in the hiring process since. I've done hundreds of interviews so far, and I wanted to share my experience and thought leadership in this area.

TIM: Excellent. And yeah, you mentioned interviews, and I think that's a great place to start, because for most companies, for most roles, the interview is really their key evaluation tool, even if they might have some skills tests, IQ tests, or personality questionnaires. At the end of the day, most of the time, it's the candidate's performance in the interview or interviews that decides whether or not they get the job. Now, I feel like one big factor in how effective the interviews are is the interviewer, the person asking the questions. And it's a skill that's maybe overlooked in some ways; I never received interview training or hiring training. What's your view on the quality of interviewing? And do you have a sense of what good versus great versus okay interviewing looks like?

SANKARA: Yeah, look, I think interviewing is a skill, right, that needs to be continually improved like any other skill. You typically don't get a course or structured training on it, but effective interviewing requires preparation, an understanding of the role, and the ability to evaluate not only a candidate's technical fit but also their cultural fit, right? So it's a skill that benefits from ongoing training and practice. In my experience, effective companies have frameworks in place so that they rely on a process and not on a single person's judgment, right? That eliminates the blind spots of any one person. The ability to interview well is a learned skill: effective interviewers know to ask behavioral questions on top of assessing technical fit, and to use behavioral techniques like the STAR method, trying to understand whether the candidate can break a situation down into an effective STAR-based framework, and so on. For example, a hiring manager might ask a generic question like, "Tell me about a time when you faced a challenge," but fail to dive deeper into specifics. A strong interviewer will ask probing follow-up questions to understand how the candidate thinks, how they handle pressure, and how they would work in a team, right? So that is more important. Having holistic questions about the person, their experience, and their cultural fit helps, right? And that comes with practice.

TIM: I'm not sure if you can cast your mind back to when you did your first interview as an interviewer, and the difference in quality between that learned skill you now have and where you started.

SANKARA: Yes. Back then I used to focus a lot more on technical skills, and not really on whether he or she would be a cultural fit for the company, or on trying to understand the person's personality. It was not holistic at all, right? Then over time I realized, after onboarding some of those candidates, that there were a lot more things I should have checked during the interview process and had failed to do, right? So I started improving my interview style to make sure it was both horizontal and vertical. I would pick a specific technical area and deep dive to understand whether the person has a holistic understanding of the technical skills. But then I would also go horizontally, right? I would probe their motivation. I would try to understand how they work with other teams and with stakeholders, and ask about situations where they have been successful, where they have not been so successful, and how they overcame that, and so on.

TIM: I've sometimes heard interviewing described as like a fact-finding mission, trying to learn more about the candidate. And maybe part of the challenge is that an interview is quite an artificial experience. It doesn't really represent what you do on the job; interviewing as an interviewee is a skill unto itself. So part of the challenge is just getting past that initial mask and helping the candidate relax. How do you think about that side of things?

SANKARA: Yeah, I think that's why it is important to start the conversation on a lighter note, right? Try to understand where they come from, and so on. Having a conversational question to start with would definitely help the process. Otherwise, I think it becomes like an interrogation rather than an interview, right? So I think it is very important to cheer up the candidate and make sure they are comfortable in the conversation rather than being, like you said, too mission-oriented.

TIM: One of the guests I chatted with last week said something interesting that struck me, which was that maybe there's something to be said for remote interviews, where the candidate is at home in the comfort of their home, their bedroom, or their home office. They're relaxed. It's an environment they can control; they know everything's fine. And maybe that might help get a bit more out of them, especially in those early stages, rather than bringing them into an office where two or three people are staring at them, in what my guest described as an almost sterile environment. He said interview rooms sometimes almost feel like a doctor's waiting room: some generic artwork and these kind of gray walls. That environment itself is just a harder environment to be your best in. So maybe in that sense there's almost this upside to remote interviews. What are your thoughts on that?

SANKARA: Yeah, look, I think things are evolving, right? With that, we should evaluate this both from the candidate's side and from the organization's side. In today's world, if the candidate wants to take an interview remotely, for whatever reason, the company should accommodate that. Yes, there are some merits in seeing the person physically in an office and trying to assess whether he or she is the best fit for the role. But the crux of the interview process should be trying to understand who the candidate is, what they have done in the past, and how they fit the role, right? Those kinds of assessments can definitely be done online as well. So the only merit for me in having someone come into the office would be to gauge some of the other things, like body language: are they nervous, how are they doing, and so on. But I believe those should not be the primary focus. If the candidate is comfortable online, fine; and there are people who want to have a conversation face to face. People have changed, right? Their needs and their personalities have changed; we should respect that, and the interview should be done wherever they are comfortable. Unless, say, you want to meet the CEO of the company; in that case, from the candidate's side, he should respect that it's the CEO and probably want to come in, show his face, and have a conversation. But for the initial rounds, unless there is a greater need, I think we should let the candidate choose whether it's an online or an offline interview.

TIM: One of the other potential upsides of an online interview, I think, is that it makes it a lot easier to record, which then adds a really rich data set. If you record the interview, you have the transcript, and you have the summarization with tools like Fireflies or Assembly, all those kinds of products. And then in theory, if you had hundreds of interviews over the past few months across your teams, you'd have the data set to search across candidates, to go back to exactly what was discussed, to match new candidates to other roles. I feel like there's going to be a huge unlock soon in hiring, in a way that would be quite hard to replicate face to face, because it would be quite a jarring experience to record an in-person interview; it would just seem weird, no? So I feel like that's going to be maybe another upside of online interviews. One thing I want to ask you about also is how structured or unstructured you make your interviews. Do you have a scorecard where you're trying to measure specific things? Is it purely conversational, or somewhere in between? How do you think about that?

SANKARA: Yeah, just before answering that one: yes, what you have highlighted, in the world of data-driven decision-making, makes the process a lot easier, by having tools to record the interview, store the data, and then use AI and automation tools to automate the hiring process, right? And it eliminates a lot of that inherent human bias that we have. But I think we also need to be very careful in the way we use those tools and technologies, right? Ethical consideration of these tools is essential. On your other question, I prefer a conversational type of interview rather than having a scorecard. It's conversational, but the conversation would be around how the candidate is doing on, say, a programming language if I'm hiring a software engineer, right? And if it's a data scientist, I would want to understand how they are doing on machine learning. But it won't be a scorecard at the end. It would be about how he or she has done on the four or five categories where I, as the hiring manager, believe the candidate should be strong, the ones that will set them up for success in the role, right? And I will keep it conversational on those categories.

TIM: And just so I understand correctly: at the end of the interview, do you come out with a yes or no? Or a good, bad, or okay for those different categories? What is your end evaluation of an interview?

SANKARA: I think it's either yes or no. If I have a score, then the process means I have to look at other candidates and so forth. I'd rather say: I have a role, here is a candidate, he or she comes with this kind of experience, and decide whether they are the right fit or not. That way the process can be sped up, rather than putting down a score and then saying, let me look at other candidates, and so on. If you know this is the right fit, then you just go ahead, right? In the interest of process efficiency as well as respect for the candidate.

TIM: When I think about structured versus unstructured, what I would normally hear from people who have been through almost overly structured processes is that they become almost robotic. You're not really learning anything; you're ticking off boxes. And then if it's completely unstructured, it's conversational, but then how do you compare five candidates when you've had five completely different conversations with them? Is there something to be said for having consistent lines of questioning, so that each candidate has a similar experience and you can then grade them and compare their performance?

SANKARA: Yeah. I think if you are an experienced interviewer, you come with a lot of judgment. You have done this multiple times, so you have that thought process of asking similar questions and seeing how candidates do. I typically draw those questions from existing work, so that, given an actual problem the company or the organization is trying to solve, I know how the candidate does against it. That gives me a baseline for how different candidates are doing. I would also try to compare a candidate's approach with the best-in-class approaches that the existing team, whom we have been comfortable with, has come up with. If they are able to reach that level of sophistication and innovation, then I would say that, yes, he or she is the right fit for the organization, right? From a technical perspective, and also from a behavioral perspective, it can be similar questions as well.

TIM: What about having a variety of different interviewers involved in the process versus having just the hiring manager? What's your view? Maybe you're getting a more holistic picture, but maybe from people who aren't necessarily experts themselves in the role they're interviewing for?

SANKARA: So I believe in having a panel. Like I said, there is still human bias, right? While I know there is a lot of judgment involved, having one person in the interview process will not be efficient. Having a panel of experts will: give them clear direction on what the role is about, and a checklist of at least broad questions they have to ask, and then let the panel judge the fitness of the candidate for the role. It helps, and different people will come up with different perspectives within the panel. I would try to keep the panel to people who are experts in the area rather than people unrelated to it, so that I can, one, get a consistent picture of the candidate's fitness, but also eliminate any individual human bias, right? The bias that we all know is there in any one interview. So that would help: having a panel, giving them clear guidelines on what the role is and what we are expecting from the candidate, giving them some pre-read in terms of the questions they can ask so that the benchmark is clear, and having at least two or three people on the panel.

TIM: Yeah, I think you touched on something that's so important there. Where I've seen interview processes with numerous different interviewers go downhill quickly is when they're not all looking for the same thing, and each has their own concept of what a great analyst or a great data scientist is, rather than what you've described, which is: no, we're going to set up the groundwork, this is what we're looking for, this is what good is, and we're all assessing against those same molds. Otherwise it can go into disarray very quickly. At least I've seen that.

SANKARA: Yeah, agreed. Companies and organizations should have a very streamlined process, not only for their own benefit but also for the candidate experience. At the end of the day, the people who interact with you, whether they are selected or not, should come away with a very good impression of the organization as well.

TIM: Yes. And nothing would be more jarring for a candidate than having a consistent hiring process where the first three interviews are around similar kinds of things, and then suddenly the fourth interviewer comes in and asks them some random thing about some random skill they haven't even thought about before. I can remember, a couple of years ago, an agency presented a product analyst candidate to a company. The candidate flew through the process and then got to the final-stage interview with a stakeholder who was leading the marketing function. They just had their own view of what a good analyst was, and for whatever reason they had it in their head that the candidate had to be amazing at Excel. So they started asking Excel questions, but the candidate hadn't really used Excel for years; they used Python and SQL and some other BI tools. And so the candidate got rejected, even though the hiring manager and everyone else didn't really think Excel was a required skill. I think it's so easy for hiring processes to derail if we're not all on the same page from the get-go.

SANKARA: Yeah, I totally agree on that. Look, I think this is where HR should come in and do retrospectives: try to understand the experience of the candidates who were selected, and then maybe have informal chats with the people who were not selected. Understanding their view and then improving the processes would help.

TIM: That's a great suggestion, and that is not something I have ever seen companies do. You're right: what about the candidates who don't get hired? The feedback from them is essential, and you could learn a lot by just giving them a five-minute buzz to get their feedback. Someone else yesterday was recommending an NPS score for companies: they should at least be surveying all candidates who are in the process. It's a really simple tool to use. But yeah, the second-, third-, and fourth-place candidates are clearly a key data set that we should understand more about, because it's not just about the person who ultimately gets hired. Especially if, as you say, you want to keep that employer brand. You want those candidates to hopefully come back and apply again soon for another role; they were so close. So why burn bridges by not getting their feedback?

SANKARA: Yeah, I think NPS is great. Verbatim comments there would help, but also, depending on how much time the HR departments have, selecting a few people and having a 10-minute catch-up would also help.

TIM: Yeah, you've nailed the practicality there. I've never seen more overworked teams in my life than internal talent acquisition teams. I can remember one global bank we were working with a couple of years ago where I spoke to the Italian team. They had, I think, maybe three or four hundred people in their Italian office, and one talent acquisition person did 100 percent of the recruitment for that company, all of it herself, right? So she had, I don't know, 15 or 20 roles live at any point in time. Can you imagine the amount of work involved in that, especially given how tediously manual a lot of recruitment is, with scheduling interviews, manually reading resumes, all that kind of stuff? So yeah, I think what we're talking about is really valuable, but I'm also sure talent people listening to this think, yeah, I'd love to do that, I agree, but I've got a million things on my plate at the moment. This is where I think there's such an upside for automation in hiring, because there are so many time sucks in the process. Adding technology doesn't dehumanize the process; I think it humanizes it. It gets rid of the time-wasting stuff so that we can do things like this: actually speak to candidates who just missed out, speak to candidates who got the job, and really understand them in more detail. Speaking of automation, we touched on it earlier: AI is, of course, now everywhere, or feels like it's everywhere. What are your thoughts on AI and hiring? Have you started to dabble in using it yourself? Have you seen candidates use it on their side?

SANKARA: Yes. I believe AI and other automation technologies can be valuable in hiring processes, right? Especially for screening a potentially good candidate pool for interviewing, or taking the transcript and summarizing it to understand whether there are any areas to improve on, and so forth. Some of those things you have touched upon can help speed up the process, reduce human bias, and keep the focus on key skills and qualifications. However, there is a risk of introducing new bias, especially if the AI models are not properly trained on diverse data sets covering multiple domains, geographies, and skill sets, right? So to improve decision-making, AI should be 100 percent an augmentation tool, right? There's no doubt about it. But it shouldn't replace human judgment. I believe we need to instruct these LLMs properly, with appropriate fine-tuning and prompt tuning, to minimize any new bias that comes up, and test them on multiple benchmarks. Once a model is curated to be very good for the specific use case, it can help remove inefficiencies in the process. Also, making sure these models are trained and benchmarked on multiple data sets will help win the trust of the different stakeholders involved, which will also help with adoption. And I've seen companies start using this for multiple use cases within the HR space.

TIM: Yes, and I think that caution is warranted and fair enough. But a slight devil's advocate to that would be that the current way recruitment is done, as you've mentioned, is so flawed, with so many biases to begin with. And we don't in any way hold humans to that same level of account currently, because for the typical company it's: I've applied to a job ad, I've submitted a resume, a human recruiter has looked at that resume, maybe for a few seconds, maybe not at all, and then I get some kind of generic rejection. That's the typical experience for most candidates. And at the moment, as a candidate, you would have no idea why you got rejected. If I went and surveyed a company I applied to three or five years ago and said, hey, why didn't I get that data analyst job, they would not be able to tell me. There's no oversight or transparency at all. So I feel like some of the expectations of AI are maybe almost too high, because why should we expect a perfect, perfectly transparent, benchmarked process when we don't have that at all for humans?

SANKARA: Yeah, I agree. I think we need a larger debate on some of these concepts, and a clear view of our guidelines around where and how we can use AI. But it all boils down to this: an organization has a goal, and if they believe using AI will help improve their processes and their brand image, absolutely, they should go ahead and do it. All I'm saying is it should be an augmentation tool to improve the process efficiently. I use it every day, you know, for a lot of things, but these tools need to be properly tuned. For example, HR needs to be trained properly on how to use the AI and how to prompt it properly. Some of those guidelines should be there. But I have seen immense value in using some of these tools, not just in the hiring process but also in other places. I use it the way that I want, right? At the end of the day, if I agree with the output, then I just go ahead, right? But I've curated it first and then started using it. And yes, like I said, people wouldn't have time to look at resumes word for word, and AI can help there and summarize, when you tell the AI what you are looking for.

TIM: Yes.

SANKARA: Yeah, I agree.

TIM: As long as, in the training of it and the prompting of it, we don't then introduce the same biases we're trying to eradicate in the first place. I could almost imagine that happening: oh, I'm a recruiter, I just want to replicate my current process, which, as we know, is flawed and biased; this is what I look for, just give me more of this. I could easily imagine companies rolling that out.

SANKARA: I would imagine companies, or whoever is doing this, would create a lot more data sets, right? Better-curated data sets. Ideally you could use historical data sets, but I would suggest these companies use data sets created specifically for this use case, like I said. That eliminates the bias in the first place, so that in the end we are only using AI to eliminate the human bias.

TIM: So let's just focus on resume screening for a second, because I feel like that's an obvious use case that will be getting rolled out soon. To be fair, of all the people I've surveyed in the last four months, only two companies I've spoken to have even tried to use any kind of automation at the resume screening stage. The rest, the 99-point-whatever percent, are still doing it manually. So I feel like there's almost this disconnect, where it seems like a lot of it is happening, but in reality maybe not a huge amount yet. Partly, I think, because of legislation: it's quite complicated, you're dealing with data privacy and with automated-employment rules in different countries. So there's some natural hesitancy, to not be sued, which is fair enough. But let's say we focus on resume screening as a concept. I feel like a sticking point is going to be the fact that, say, an LLM is going to try to match a resume to a job description, and it's going to come up with some kind of match score based on the text comparison. That itself, I feel, is not going to work that well, because we've all interviewed candidates who look amazing on their resumes, and then you get to that first interview and you're like, who am I even speaking to? There's no connection at all. So is the problem here going to be the same problem as with any analytics, which is that the value of it depends on the quality of the data? And the data set on both sides, actually, is pretty crap: a job ad might not represent the job, and the resume might not represent the candidate. Do we need better-quality data, perhaps?

SANKARA: Completely agree on that. I have faced it in my own experience, right? Whether I'm using automation or doing it myself: I've screened a resume and found the candidate to be very good, and then in my first interview with the candidate it's like, what? Like you said, who am I talking to, right? So that's an inherent problem; like I said, that's a data problem on the resume side. But even assuming the resume side is fine, that candidates are honest and putting down a true representation of what they have done in the past, we still have the model-training side. If the data set used to say, hey, this is what my ideal analyst or data scientist would look like, is not accurate, then again it all goes for a toss, right? There are multiple ways around this. You can curate better data sets on your side, but you don't have any control over what a candidate puts on a resume. On your side, you can curate a lot more data sets, and I think there need to be governance processes around whatever data sets are used to train these automation tools; that should be mandatory. But to deal with the candidate side, I think we should use other automation tools, right? I'm hoping AI interviewers will come in and do a 10- or 15-minute test for all the screened candidates, and then only the people above a certain threshold talk to a human, right? Because I've received hundreds of resumes, and it's become overwhelming for the recruiters, so having a replica of myself able to go and do that would help. But I would lower the bar for this at first, because it's just a resume screening tool, right?
So I would have one tool that just selects the resumes, and then I would have an agent or something that does a short interview with these candidates. And whatever the interview scores come out as, I would roll this in very slowly so that, if there are any issues with the model training and so on, I don't miss out on good candidates, right? These are some of the things we can do, but again, AI is rapidly changing. I'm hoping a lot of disruption will come even in this space, and that it enhances both the effectiveness and the accuracy of the whole process.

TIM: Yes, and I have to say I'm fully expecting that, because I think this whole space is absolutely ripe for innovation. Really, has hiring changed that much in 30 years? You have a job ad; you apply with a resume, which, by the way, was invented by Da Vinci over 500 years ago, so it's pretty old. You apply with a resume, someone screens it for a few seconds, and you do a bunch of interviews. It's so old-fashioned. Surely we can do better now with these superintelligent beings we're creating.

SANKARA: Yeah, definitely. I think we can at least retrospect on what has happened and see what the new ways of doing it are. In the past we might not have had the tools we have today, and now that we have them, I wouldn't disturb the current process, but I would carefully introduce new processes that are, again, data-driven, right? You introduce a new process and see how it does. You have a bunch of metrics that help you understand whether you are going in the right direction, and then over time you incrementally see the progress and move toward a much more holistic process.

TIM: I hope that's how it pans out.

SANKARA: AI is a way forward.

TIM: Yes.

SANKARA: In what proportions, and in what form, time will tell, but I think that's the way forward.

TIM: I was playing with an idea recently, speaking of humans and AI together. I was thinking, if I started a new company today, I would think about it very differently. I would say I expect the vast majority of my colleagues are going to be AI, not humans, and so I'd be thinking: how do we have to set up our organization to get the most out of our future AI colleagues, as opposed to everyone being human? Are we at that point yet? Or do you think there's still a long way to go?

SANKARA: I don't think it's a long way off. In the past, when machine learning was emerging, the question of capturing data to improve future processes was always there. I was working for one of these banks where we had a full change-management program to design for data, right? It's not just about current data; you have to start thinking about the new data you'll need. Like you said, if I need AI colleagues in the future, they need data, right? And if you don't capture it today, the whole process of utilizing this gets delayed. So I don't believe it's very long into the future. We will have this, but we should at least start thinking about it, because in a year or two some of these things will start to become very real: much more collaborative approaches, and curated data sets like you said. Starting to think about them today would definitely help and speed up getting there. Human plus AI, very soon.

TIM: We've talked a lot about AI potentially helping with, let's say, the accuracy of hiring, this kind of matching problem, scoring, resume stage, interview stage, and what have you. There are other big problems in hiring, though, that I'm not sure AI would immediately help with. One of those is just the length of processes: some companies have really long processes with many rounds of interviews. I remember seeing some research from Google years ago where they had done the analysis and realized that, after a certain point, the value added by an extra interview is pretty low; they weren't learning anything new about the candidates. I think they cut their process off at a certain point, but other companies still think that having a sixth or seventh interview is worth it. What's your view on this? Is there a certain number of interviews that's enough? Do you feel like you can just keep learning more? How do you view this?

SANKARA: I think a company should strike a balance, right? A comprehensive interview process, but with respect for the candidate's time, aiming to make decisions in a reasonable time frame. While thoroughness is important, a lengthy process can cause candidates to lose interest or feel undervalued. So the company should balance how many rounds it has against the candidate's experience. For a very senior role, a thorough process is important, but for junior roles I think it should be at most one or two rounds. And then you should structure those interviews so that you get a holistic view of the candidate, right? That way it's faster, and it's sufficient for both the organization and the candidate.

TIM: Yes. Yes. And if I think back to some of the longer processes I've seen companies have, they would also typically pair that with what I call a one-strike-and-you're-out policy. So let's say the candidate has six rounds of interviews. The first five, tick, tick, they're doing a great job, amazing. Then they get to the sixth one; the interviewer wakes up on the wrong side of the bed, doesn't really like them, and says no. And the candidate gets dropped at that point. Which I find really strange. Wouldn't it be better to have, as you say, a more holistic picture, where the candidate's performance is looked at across all six interviews combined, as opposed to them failing the last interview and therefore being out?

SANKARA: I think that could be one good strategy: rather than a funnel view, you have a panel view, right? You can still think of each round as advancing the candidate to the next, but the final decision should be based on a weighted average across all the interviews. The process might start with technical rounds, then behavioral, then HR; or it might be the hiring manager and then their boss. Whatever a company is following, if the different interviews represent different checkpoints, then taking a weighted average of all of them works. Again, different companies follow different methodologies, and the idea is to make it more efficient for both sides. An interview is like a marriage process between the candidate and the organization, and it has to be treated with that seriousness.
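The weighted-average panel view Sankara describes can be sketched in a few lines. The round names, weights, and scores below are hypothetical, chosen only to illustrate the idea; the episode does not specify any scoring scheme.

```python
def panel_score(scores, weights):
    """Combine per-round interview scores into one weighted average.

    scores and weights are dicts keyed by round name; weights need not
    sum to 1, since they are normalized here.
    """
    total_weight = sum(weights[r] for r in scores)
    return sum(scores[r] * weights[r] for r in scores) / total_weight

# Hypothetical candidate: five strong rounds and one weak final round.
weights = {"tech1": 2, "tech2": 2, "system": 2, "behavioral": 1, "hr": 1, "final": 2}
scores = {"tech1": 9, "tech2": 8, "system": 9, "behavioral": 8, "hr": 9, "final": 4}

overall = panel_score(scores, weights)  # → 7.7
```

Under this view, one bad final round lowers the overall score rather than vetoing the candidate outright, which is exactly the contrast with the funnel's one-strike policy.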

TIM: I've had the chance to ask you lots of questions today. If you could ask our next guest one question about hiring, what would you ask them?

SANKARA: I would ask how they ensure diversity and inclusion, right? How do they prioritize that in the hiring process, and do they measure its success? That's something I would definitely want to hear another guest's thoughts on, because I think it's a topic that hasn't been discussed much in the context of hiring. It has been discussed within organizations, how to have diversity and inclusion at different leadership levels, but I haven't seen much debate on diversity and inclusion within the hiring process itself. I just want to understand what their thought process would be.

TIM: That's a great shout, and that's a question I'll ask our next guest. I'm really interested to hear what they say. Just to give you my quick two cents on this: I feel an over-fixation on interviews as an evaluation tool does tend to marginalize certain candidates. Extreme introverts and people on the spectrum maybe aren't going to perform as well in an interview as a really happy, smiley extrovert. So I think even being aware of those types of hidden biases is really important.

SANKARA: Understood, thank you for that.

TIM: It has been a great conversation today. I've really enjoyed it. Thank you so much for sharing all your thoughts, insights, and experience with our audience today.

SANKARA: Thank you, Tim. I thoroughly enjoyed the chat, learned a few things, and got a few ideas of my own. Hopefully I can start using some of these insights in my own interview process going forward.