Alright. Welcome to LEK's webinar on artificial intelligence and transformation.
I'm so pleased to see so many attendees already, with people still rolling in. If you're joining just as we're speaking, you're gonna miss the housekeeping rules, but the housekeeping rules are pretty simple.
During the webinar, you can drop in questions. There's a Q&A section at the bottom. If you drop a question in there, we'll endeavor to answer it at the end in the Q&A section.
Otherwise, we'll reach out and answer it later.
Just as the webinar finishes, about fifteen minutes later you'll receive a copy of the webinar in your inbox, and feel free to share that with people you know that might find it interesting. So please do share the webinar afterwards; you'll have a copy so you can go back and reflect on the things discussed. And if there are any topics you want to hear in future webinars, please also let us know. This topic in particular is one that lots of people have asked us about, hence we've got a webinar today.
So please, if there are other topics you want to hear about, do let us know.
The topic is artificial intelligence, especially around transformation, hence the title.
You know, the last few years have been very exciting in AI transformation. And for those that have been around slightly longer and have now lost their hair, you know, twenty years of digital transformation, AI transformation does feel different. It feels like things have changed quite a lot, and it's a topic where we want to hear from people on the ground, what they're experiencing and what they're seeing.
I'm going to share screen and share some slides as we can't help ourselves.
First, just to introduce... is that coming through? Yep. Just to introduce the two panelists to start. So you've got myself, Rob Wilde. I lead Digital and AI at LEK Consulting.
But I've also got two esteemed guests who I have worked with at points in my career, or at least, they've had to work with me at some point. We've got Steve Delisle, who is CDO at SHL.
This is a private equity backed business that Steve's been leading for some time, and previously he had lead roles at places like Monster Jobs. I'm sure there's much more, Steve, that I'm missing out, but he's experienced in AI transformation and digital transformation in businesses of that size.
Nikki Christofi as well, and thank you for joining, is an engineering manager at a VC-backed business here in London, and has also worked in some very large organizations in the past, Nikki, including FTSE one hundred and S&P companies. So I think with the panel here, we've got a range of experiences across different sizes of organizations, different types of teams, etcetera. So it's really great to hear views across those different types of businesses.
Yeah. So thanks for joining, guys. Really, really, really great to have you.
What I'll say is you've already got the questions. Obviously, I sent the slides around.
Feel free to, you know, share honest views, and disagree with each other cordially, hopefully.
But, yeah, it's really about what you're really experiencing. Just share what you think and what you feel.
In terms of the audience, you don't fully get off the hook either. You also get to join in and answer questions, and we're gonna start with that. Actually, we're gonna start with a poll question to the audience. There are only three in the session. Before kickoff, a question for the audience, which should come up on your screen just now.
We asked a hundred leaders that are leading AI projects and AI transformation. We asked them, you know, what is the success rate of the AI projects you're doing?
You know, is it successful so far? Has it reached its potential?
Do you think AI is leading to success in your business? And we've got a number. Let's see what the audience think the success rate is.
If you want a second. While that's coming in, actually, I'll just mention the sections for today. About half of you have answered. So we're gonna talk about success rates in AI transformation, and, we couldn't help ourselves, success rates at different stages of business and transformation. We're then gonna talk about the challenges those businesses are facing, and we're gonna talk about solutions, etcetera, that help with those challenges. And finally, we're gonna have a bit of Q&A. So that's the plan for the day.
Right. We've got enough answers, I feel.
The number is thirty six percent. So one in three, which is, as you can see, quite similar to what people have said, I think. Can you see those results as well, Steve and Nikki?
I can. Yeah. Yes, I'm seeing them. And the poll is pretty close. Yeah.
So I think, you know, one in three is what we hear from a hundred people that are actually doing AI transformation and AI projects.
The audience obviously know that's the right number, so that's fantastic. You know, when I got this stat, and when we kind of worked it through with businesses, it was actually lower than I would have expected, to be quite honest. So my first question I'm gonna ask to Steve, if it's okay, Steve, first at least.
I think that's low. And I just wanted to, you know, from your point of view going through transformations, why do you think it's at that number? Why do you think it is only one in three that are currently showing success?
Yeah. And I love the way that this question is phrased. Right? The perceived success rate.
I do think, you know, AI is everywhere. And the marketing around AI is that it's gonna do everything for you. And I think what that has a tendency to do, particularly for those a bit removed from the digital organizations, you know, CEOs and key leaders within private equity firms and engineering teams, is that the assumption is it's gonna solve everything, and why doesn't it just code and do all the phases of your workflow? I think the reality is it's not quite that straightforward, and what it requires then is proper expectation setting for the organization about what we should really see, what expectations we should really have for this, and how it's really gonna impact the business.
It's not going to happen immediately, you know, with very little investment. Right? That's the other problem that we do see: the perception is we're just gonna turn it on, and it'll just magically work, and you'll save fifty percent of your coding time, and everything will get better and faster and higher quality.
And it is effort. And so, you know, the learning curve is high. The cost isn't zero. It does require dedication.
It does require time. And, therefore, again, it really is about properly setting the expectations upfront to define what success really is. Right? And I think that's why the perception of failure is much different than, you know, the reality of the successes.
And I do think, you know, even ten, fifteen percent better quality, faster turnaround times, easier deployments, all of these things don't necessarily get the headlines per se in the boardroom as a great success that AI returned to me. But those are meaningful successes for organizations that are, you know, on the ground, on the front lines really fighting the battle.
Yeah. Yeah. And you're right, I put perceived in here partly for that reason. You know, we are seeing in our projects people that are getting fifteen percent gains in areas. We're shooting for thirty or forty, and that's why it's a difficult question for someone.
So, yeah. Without even really any foundation or reason to think forty is the right number. Right? And I think, again, that's the reality of what you should expect to get versus what you do get out of it. I think that's the key. Yeah.
Yeah. And I think we'll touch on some different challenges. I do think people are hitting a lot of challenges, which we're gonna go through as well. So I do think it is challenging, as you said: the investment side, the kind of amount of change that's needed.
But, Nikki, if it's okay, we'll keep on moving and come back to another question.
Just before we do, I couldn't help myself, coming from the consulting world a little bit, even doing lots of AI transformation.
You know, we took those hundred, it's actually more like a hundred and twenty, businesses we've been working with, and thought through where they fit on the kind of AI transformation scale. And the reason we've been doing that is that what we're seeing is the challenges businesses face in their AI transformation evolve over time. So the challenges they faced much earlier in AI transformation are quite different to, you know, when they've evolved later in their AI endeavors.
I'm not gonna talk to bystanders and self-sustainers, partly because with bystanders, you know, there are very different challenges there to get businesses to motivate themselves to start an AI transformation.
And self-sustainers, I think we're quite early, unlike digital transformation where more businesses have reached that phase. In AI transformation, I think we've got many more businesses that would classify themselves in the scaler and progressive stages. So I'll just describe those two phases. You know, the scalers, they are starting to do AI projects or AI transformation within their business, and they're seeing maybe lead indicators, but they're not seeing the translation into commercial success just yet.
And the progressives, you know, they've really now had some time to push forward their AI transformation. They're starting to see some of the benefits come through, but, generally, maybe they're not rolling it out across the business just yet, or maybe they're seeing pockets of success but they haven't translated into bigger teams, or maybe they're seeing success in one element of AI but not within a wider set of projects. So that's how we've classified those businesses, and it'll lead into the challenges.
But the question, which you've seen, and I'll pass this one to Nikki first, is: if you think through that progression within your business today, the one you're focused on today, where would you place yourself?
And at the same time, I'm gonna ask the audience the same question, Nikki, if that's okay. So, yeah, if those in the audience could give an answer, but, Nikki, the question is to you as well.
Yeah. Sure. So I thought about this for quite a while because I wasn't sure where to place ourselves between scalers and progressives. And I think we ended up seeing ourselves as scalers, and that's mainly because we have adopted several AI tools in the engineering team to help us with productivity and development. We've got Copilot business subscriptions. We've got a business license for Cursor. We're using ChatGPT and Claude.
And then for the rest of the business, we've rolled out several AI projects. So we've got a customer support chatbot that we're rolling out currently. We've got an internal AI platform that we've rolled out to the business called Pen AI, which uses an open source software called Onyx to connect with our knowledge bases and help people search and find answers quickly. And we've also started to add it a little bit into our product, in our pension transfer process, using a sentence similarity model to determine what action we need to take for a transfer.
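To make that last point concrete, here is a minimal sketch of how a sentence similarity model can map an incoming transfer message to the nearest predefined action. It assumes the sentence-transformers library; the model name, action labels, and threshold below are illustrative assumptions, not a description of the actual Penfold implementation mentioned above.

```python
# Hedged sketch: classifying a pension transfer message by sentence similarity.
# Model choice, action labels, and threshold are illustrative assumptions only.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

# Hypothetical actions, each described in plain language.
actions = {
    "request_missing_info": "The ceding provider needs more information before the transfer can proceed.",
    "confirm_transfer": "The transfer has been accepted and the funds will be sent.",
    "reject_transfer": "The transfer request has been declined or cannot proceed.",
}

action_embeddings = {name: model.encode(text, convert_to_tensor=True) for name, text in actions.items()}

def classify_transfer_message(message: str, threshold: float = 0.4) -> str:
    """Return the action whose description is most similar to the incoming message."""
    msg_embedding = model.encode(message, convert_to_tensor=True)
    best_action, best_score = "manual_review", -1.0
    for name, emb in action_embeddings.items():
        score = util.cos_sim(msg_embedding, emb).item()  # cosine similarity between embeddings
        if score > best_score:
            best_action, best_score = name, score
    # Fall back to manual review if nothing is clearly similar.
    return best_action if best_score >= threshold else "manual_review"

print(classify_transfer_message("We require a completed discharge form before releasing the funds."))
```

The key design choice in this kind of setup is the fallback: anything below the similarity threshold is routed to a human rather than auto-actioned, which matters in a regulated pension process.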
But all of these things don't really equate to that kind of commercial success. We haven't seen those benefits. We're, you know, not really sure. There's no uptick in the efficiency metrics.
It's hard to understand whether it's truly helped us or not. And, also, I'd say these projects are not part of the main roadmap. We haven't got enough time to really delve into these things. We do have executive buy-in in the sense of excitement and motivation to do these things, and our CTO is actively doing a lot, you know, to try and push it forward. But I think there is maybe not an understanding of how much time we can really spend on this, you know, moving the needle within our actual roadmap.
And then I think one of the reasons for that is maybe the AI literacy of the team, which I know you're gonna get on to, you know, in the challenges. If we don't have the internal capability to move fast and fail fast, then it's gonna be really hard to get that buy-in to then do these projects. And so I think that's where our real challenge is, and I think that's what kind of puts us in that scaler category.
Yeah. I'd put you on the more progressed end with the amount that you're trying. And, you know, sometimes, and this might not be the case, so take this as a good thing, sometimes people are doing a lot and it hasn't got the focus. There are other challenges as well that we've actually not got on the challenge list, from one of the businesses we've been working with recently. But before we go too far on that, yeah, Steve, it would be really good to hear, on reflection, where would you put yourself?
Yeah. It's great to hear, Nikki, your progression on this as well. Right? Because I think it's probably a very standard path, so to speak, in the adoption of AI. We've been at this now for probably a year and a half.
We started back in January of twenty four. And for the first six, eight, nine months, you know, we did a lot of what you sort of described as well. We bought a lot of tools. We kinda rolled them out.
We had pockets of successes. We had pockets, frankly, of failures where I think trying to adopt the tools set us back as opposed to accelerating us. We've seen a little bit of stubbing of our toes and starts and stops. And, you know, after about six, nine months of that, we saw enough successes to kind of engage, quite honestly, with Rob and the team at LEK, to help us understand what the industry was doing and how to formalize this more, to start to get those commercial returns and to track the value that we're getting from some of these tools.
And I think the big shift for me from the scalers into the progressives section was really more about, you know, kind of starting to formalize the approach across the entirety of the organization. So now we have a very consistent approach to it. It is a change management problem as well as a toolset problem. And I think it wasn't until we got into that second phase that we became much more mature around the consumption of the tools.
So I would put us, you know, probably on the early end of the progressive set. We're certainly not on the right hand side quite yet, but we are seeing that return. We are seeing commercial benefit. We are able to track the value and the return for the tools that we're using, and the consistency and usage of those tools, as well as, you know, the value that we're getting from them.
So Yeah.
And I think we touch on that a little bit later, but I think that's exactly where we wanna get to in terms of the change management rollout, a formalized standard process, and then the measurement.
And I think that's the next thing for us, which we definitely haven't got yet.
Yeah. Should we share the results?
Can you see those?
Yeah. Okay. So we've got quite a few people in the bystander bucket. Hopefully, this might convince you to move into the scaler bucket if you're in that one. But, yeah, a lot of businesses, fifty percent, have put themselves in the scalers, so starting to do something, maybe more where Nikki described, and fourteen percent more in the progressives, Steve, where you were describing yourself to be.
So it's interesting that nobody put themselves in the self-sustainer bucket.
Yeah. Well, we've been reflecting on this. I don't think there are as many businesses there, and those that believe they are still have a lot to do. It's interesting, in digital transformation I do think there are a lot of businesses at that stage now, but that's a longer conversation to have. But, yeah, so that was section one.
We're gonna just talk about some of those challenges. So I think, Nikki and Steve, you were both touching on the talent challenge at different phases. Again, what we did here is, across the businesses we've worked with and across people we know are doing AI projects and AI transformations, we got them to say their challenges. Right?
What are the points in your AI transformation that held you back? We couldn't help ourselves but kind of rank them. So we've got the top three for the scalers, and we've got the top three for the progressives. Now, obviously, all businesses are different, and it's gonna be slightly different between sizes and other things, but at least there's a lot of commonality.
You know, some things just repeat across all projects and transformations.
And number one for the scalers in particular has been a lack of exec buy-in. I have to just caveat this one, because sometimes the verbal buy-in is there, but then, when you actually speak it through with people, it's actually the kind of actions and follow-on support that might be the challenge. So it's not that the execs don't believe in AI. Sometimes they do, you know, I really do believe it's gonna change things, but the follow-on support might not be there. That's the number one challenge for scalers that we see.
Number two is AI-literate resource. I think you were touching on this, Nikki, in what you were saying.
This, again, has to be kind of caveated. We've seen things fail here as well, not completely fail, but not lead to the success people are looking for, where people have hired new heads of AI and, you know, new CIO roles, and it has not really unlocked this. It needs to be more team-wide. It's not a hiring problem for a lot of businesses; it is education within the current enterprise.
So that's number two, but, again, it needs a bit of explanation, as I've just given. The last one, poor targeting and right-sizing. Actually, Steve, I think you were touching on this. There's an expectation you can spend, you know, fifty thousand on licenses and, in a few-hundred-million business, really change the dial by just buying a few licenses.
Or there's... It doesn't quite work that way.
Yeah. It doesn't quite work that way. Or people have kind of not really business-cased it for bigger investments, and that's really made it a struggle to get through the organization, especially in bigger businesses.
Or the proof of concepts are more visual, but haven't actually proved out the kind of business points that would allow for the unlocking of bigger investments and things like that. So, again, there are a number of things in here, but it's the right-sizing of investment point, in the scalers.
So they're the scaler ones. I'm gonna ask you the question in a second. You know what's coming. But just for the progressives, it is different.
Change management comes up in nearly all progressives. It's: we've got quite far, and now getting the organization, people in the organization, to change behaviors, rolling that out, change leadership just comes up. You know, we might have a tool now that's working. We've got one squad or one team using it really well. We've got to scale that across the business. That comes up as the highest, by far, actually.
Second, and sometimes linked, I think, but people mention them differently, is the measurement of AI impact. And my reflection on this one is that AI seems, from what we've seen live in the field, to be making the teams that were already more difficult to measure more productive, right, with better output. So if you think about areas of digital, areas of marketing, some of the complex servicing of customers, they're the ones that are probably seeing the biggest impacts from AI, but, fundamentally, they're the hardest ones for CEOs to measure in the first place.
You know, you guys understand stories and velocity and all these types of metrics, but a lot of senior leaders we talk to are not as comfortable with those metrics or aware of them. So I think measurement, or at least lead indicator measurement, has proven difficult, and I'll stop talking in a second. And the last one is just the pace of change.
We've seen it ourselves. Even I'm struggling to keep up with the pace of change; the models I'm using day to day are changing so fast.
That's kind of the final element. So, yeah, they are the three. So you get these six, slightly different for the scalers and progressives.
Of course, I'm gonna ask the audience. Of course, I'm gonna ask you both.
Audience, there's a question coming your way, for those that were in the sixty percent at least.
And, yeah, Nikki, back to you, actually.
Of those challenges, I can put them back up if it's helpful.
But of those challenges, what have you seen? And if it's not one of the six, by the way, it can be a different one; it doesn't have to be. Yeah.
Yeah. Sure. So I think, definitely, you know, when I first thought about this, my first thought was it's time. Like, you know, we don't have enough time to do these things.
Then I was like, what do I mean by that? And it's like, okay. We have a laser focus on what we wanna do this year. We're a small Series A company, with about sixty people.
So we have so much to do to get to where we need to be, and it's hard to take that time away to research and implement and potentially fail as well, which I think is important. Like, we need to be able to test things, and some things work and some don't. And so part of that is the kind of executive buy-in, and I would say even investor buy-in if you go up a level, of, you know, having that in our roadmap, so we can carve out that time. I do think that the executives at Penfold are very pro AI.
I don't think it's a case of, you know, the motivation. It's more how do we actually build it in. And I really do think it's because, without the right kind of talent and knowledge and education amongst the company, we won't be able to deliver quickly, and we won't be able to fail quickly. So for me, it's the AI-literate resources that kind of lead into not getting that buy-in for some projects, because we don't actually know how long it will take.
We don't know, you know, what we're measuring to say we've got a success. And I think it's hard for people to sign up to something like that because, you know, why would you? And so I'd say that was the biggest challenge of the top three.
On this list. Yeah. Just in general, there's a lot going on, as in every business. Right? There's so much going on. Yeah. It's balancing the change versus the day-to-day. Is that kind of a fair playback of that?
Yeah. Absolutely.
Yeah. Yeah. Steve, you came off mute; are you coming in on Nikki's point, or sharing your own thoughts?
Yeah. I was biting my tongue on some of the things you were saying there. You know, I feel them; I feel almost all six of these.
And even when we're on the progressive side, I still think there are some scaler challenges that we're growing through. Right? So, you know, just a few comments on this. Definitely the, you know, even my opening comments on lack of executive buy-in.
I think this isn't necessarily just, you know, the CEO or the board or your funding investment group buying into this. It really is that almost every part of the business needs to understand what investment it is gonna take. Right? So the CFOs in the financial world need to be willing to fund some of this and find monies and budgets for it, and product teams and engineering teams need to be willing to carve off time for it, because it isn't something that comes for free.
It is work.
And I think that's really, really important, you know, to stress that one again. It's a continued struggle even on the progressive side because it's always something that requires care and feeding. And, as is human nature, the second you stop focusing on it, you sort of lose focus on it and you start to regress. So I think that's really, really important.
For me, the other really big thing, and I noticed that there's a Q&A question about speaking to successful change management: I do really see our transition from the scalers into the progressives being about managing the change in behavior more broadly across the organization. And it's easy to get an engineer or a squad very excited about using an AI tool. And we've probably played with, bought, purchased...
I probably have licenses for just about any AI tool out there, probably very similar to you, Nikki, and lots of the attendees as well. What we found was that what works for us is to stop sort of chasing the next great tool and really kind of lock in on a tool for some period of time. And not to say that we stopped looking at the market and trying to understand innovation, because clearly that's the death of you if you do that. But really to say, instead of always saying it's gonna get better with the next tool, really embracing how we are gonna roll this out systemically.
What are the things that we need to do as an organization to be able to, as we had said, you know, track the value that we're getting from this? So we've really invested a lot of time in understanding what the usage models are.
The fact that, you know, we're in the thousands of lines of code a day being AI-generated and accepted by our developers. Is that good? Is that bad? What is the output of that?
You know, so really tracking not only the output of AI tools, but then how does that pull through all the way to the end? Did that result in higher quality for my customers? Did that result in faster deployments, you know, faster resolution of issues? So really kind of embracing all of the facets of the engineering workflow, not just the front end one, which again kinda ties back to why I think a lot of AI projects are perceived to fail, because it isn't just about coding faster.
It's really about being better at our jobs, delivering more innovation and more products that are compelling to our customers.
And that's how I think we turned the corner; we really didn't get that value until we started to embrace that change management and driving a consistent consumption model for it. Now, once we've done that, we're in a place where we're doing exploration. Right? So, by the way, full disclosure, we decided to land on Copilot as our tool of choice. We had a lot of pushback, definitely over the last three, four, six months, about whether we wanted to use Cursor as well, but we chose Copilot and locked in on that one. Now, what we've started to do, having built out those processes, is take a squad or two. We've got about twenty squads, just to give everyone sort of an understanding of the magnitude of my teams.
They're all engineering teams?
Yeah. Exactly. All engineering squads. We've got about twenty, twenty-two squads.
So now that we've rolled this out consistently, we have measurement and tools, and part of our sprint playbacks all talk about how we used AI in the development process in the sprints.
We now have the ability to kinda inject a new tool and see how those squads did relative to the others. So we've established baselines. We've established expectations, and we can kind of see that.
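As a rough illustration of the kind of per-sprint tracking Steve describes, the sketch below pairs AI-assist usage (such as accepted AI-generated lines) with downstream delivery outcomes, so a squad piloting a new tool can be compared against an established baseline. The field names, metrics, and numbers are illustrative assumptions, not SHL's actual measurement framework.

```python
# Hedged sketch: pairing AI-assist usage with downstream delivery outcomes per sprint,
# so a pilot squad can be compared against a baseline. Fields and values are illustrative.
from dataclasses import dataclass
from statistics import mean

@dataclass
class SprintRecord:
    squad: str
    ai_lines_accepted: int      # lines of AI-generated code accepted by developers
    escaped_defects: int        # defects found after release
    deployments: int            # successful deployments in the sprint
    cycle_time_days: float      # average story cycle time

def summarize(records: list[SprintRecord]) -> dict:
    """Aggregate a set of sprints into averages for baseline comparison."""
    return {
        "avg_ai_lines_accepted": mean(r.ai_lines_accepted for r in records),
        "avg_escaped_defects": mean(r.escaped_defects for r in records),
        "avg_deployments": mean(r.deployments for r in records),
        "avg_cycle_time_days": mean(r.cycle_time_days for r in records),
    }

# Example: an established baseline versus a squad piloting a new tool.
baseline = summarize([
    SprintRecord("squad-a", 1200, 3, 4, 5.1),
    SprintRecord("squad-a", 1500, 2, 5, 4.8),
])
pilot = summarize([
    SprintRecord("squad-b", 2400, 2, 6, 4.2),
])
print(baseline)
print(pilot)
```

The point of the comparison is the pull-through Steve mentions: accepted AI output only counts as a success if the downstream columns (defects, deployments, cycle time) move in the right direction relative to the baseline.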
As is the case with all engineering work, it is not an exact science. Sometimes the tool will accelerate us for a sprint or two.
Other times, it will, again, you know, limit our successes and pull us back. But I do think the fact that we spent that time building out that consistent change management program has now given us the ability to accelerate and understand better as well.
Right? So you kind of get the right rhythms and mechanisms, and then whatever you bring in just rolls out much faster.
Absolutely. It's become kind of, you know... we've engineered our use of AI in many ways. Yeah.
Yeah. Yeah.
That's true. Go ahead. Right.
Sorry, Nikki.
Sorry. I was gonna say, well, I think we're deciding to go down the Cursor route, so I'll let you know how it goes. Yeah. Versus Copilot.
Exactly.
Exactly.
But I'd like to talk about... sorry.
In a queue, please. Sorry.
I'm sorry. I was just gonna say, I think that's the thing. We've rolled out the stuff, and we're not really tracking how it's helping us and how it's helping the engineers every day. And we've got the traditional metrics we use, cycle time and things like that, to look at our efficiency. But I think we need to, like you said, play back a little bit on how we've used AI to help us and try and measure that, maybe even a little bit separately to our normal metrics. Yeah.
Well, I'll share these results with the audience. You guys can see those, right?
So keeping pace and AI-literate resources at the top, but, you know, there's a fair spread there across all the different challenges. People could answer more than one on this one as well, by the way.
So, yeah, that's really interesting. I'd like to get to one more question before I go to the Q&A. And it is the last one. Just more of a future-looking question.
Sorry, Rob. Before you jump into that, it was interesting: one of the items that I didn't talk too much about, but it seems to be the highest one here, is the lack of AI-literate resources. Right? And, you know, I do think, coming from a company where we do assessments and we kinda try to understand the skills of candidates looking for jobs...
Hiring people that are AI-literate certainly is very, very difficult and very, very expensive. Right? But what we've also found is, so we do have an AI team that is responsible for our central AI tool builds.
Yeah.
But the other thing is kind of investing in your organization.
Right? Yeah. Investing in training in that, because, you know, again, every engineer that we have... we have a lot of our teams based in India and the UK.
Everybody is hungry to learn this stuff as well. And what we found is we didn't even need to push people too hard to motivate them to take some of these internal courses and the internal knowledge and skill set training.
So I just wanted to, you know, jump in there for a second to stress the fact that I don't think that transformation comes by turning over your entire engineering staff and hiring...
Yeah. Or hiring in the market, getting the AI expert in... I completely agree on hiring.
It's not been the magic bullet. I've also seen, you know, people that have outsourced their AI, brought in someone else big from outside, and that doesn't work either.
So... And just to... sorry.
Just to add to that: when I mentioned AI-literate resources, I was not thinking in terms of hiring new people. It was more the education of the existing team. I mean, we don't have, you know, the funds to do that. So I think we really only have one choice.
Yeah. So even if you're not thinking that way, some people are trying to solve it with one or two hires. I think that has not yet, from what I've seen, led to significant change at all, actually, because it's a broader problem, and I think that's what people read from that one.
Yeah. And that's why I figured it would be interesting to, you know, make that point for the audience here, because it certainly is important to have people who understand it, but it's not gonna come from... you're not gonna go to the store and buy that. It's gonna take investment and time and cultivation within the organization. And then even we as leaders, right? There was a big learning curve for myself, and I'm sure, Nikki, for you as well.
And, Rob, you and I spent lots of time talking about how to even educate your leaders on what it means, too. Yeah.
Yeah. Yeah.
Perfect. Alright. Let's take a quick minute on the last question.
Nikki, back to you. But, you know, think forward five, ten years. How do you see AI fitting into your whole organization? You know, the whole business; how do you feel it will be different?
Yeah. Sure. I mean, we've just started to kind of roll out an AI task force. It's going across the whole business to help with education and implementation and risk management, because we're obviously in a very highly regulated industry. So what I see, you know, in the immediate term is the process improvement. So kind of supercharging our employees.
How can we scale our business without having to scale proportionally in our operational costs? That's really where I think it's gonna help us reach, you know, our goals as a small company, and that's where we should, I think, start: on the process improvement side. But if we look ahead, you know, ten years, what I'd really like is for it to be built into the product more, and to actually use AI to help get our pension to the stage where we are achieving our mission, which is helping everyone save enough to live comfortably in retirement. And I don't actually have any idea how we're gonna do that right now, and I think it's definitely a more long-term thing, so I'll, you know, put my hands up there. But that's where I'd love to see it: going beyond the process improvements and really actually building it into our product.
So in five, ten years, do you feel like things will be very different, or do you feel it'll be progressive? I... yeah.
I definitely think it'll look different, but I think it'll look different in one year versus five years. Things are moving so quickly, and the maturity of the AI tools that are coming out is just, you know, blowing my mind, some of them. So, yeah, as long as we can harness it and we've got that good baseline of how we're using AI, like Steve said earlier, we can add a new tool and see how that helps. So I think, definitely, the next five, ten years is a huge timeline in my eyes, you know, from a company that's seven years old.
So Yeah.
That makes a lot of sense.
And, Steve, your answer: five, ten years from now, how does it fit into the whole business? You know, across all the areas of your business?
I mean, look. I really like, Nikki, the way you said that, you know, five, ten years is a lifetime from now. You know, even one or two years, I think, is gonna be significantly different for us. And, to be a bit tongue in cheek, right, I think the question really here is how won't it fit into the organization? I think, realistically, a lot of what we're looking at now and a lot of what we've discussed here is how AI fits into the engineering process and how to build, you know, building tools and fitting it into products. We are also at the point, and I think this is where we start to get further to the right on that progressive scale, of rolling out AI tooling in the non-engineering parts of our product... no...
Yeah. Of our organization, right? So not just in the product that we offer to our candidates and in efficiencies for engineering teams, but we're looking at how do we roll this out for our deployment teams, how do we roll this out for customer support and customer service teams?
Our Salesforce implementations are starting to bake in some of the tooling there, Einstein. And, of course, we're building our own tooling to kinda help drive and make our sales organizations more efficient, targeting higher-probability-to-close customers. And, again, to your point, Nikki, it won't be five years. This will be, you know, eighteen months. I look back eighteen months to where we were versus where we are now, and it's a drastic, drastic change.
So I'm really excited for it, but also a bit cautious. Right? Because I do think AI, while it is fantastic and it does amazing things, also doesn't replace the need for, you know, good old thinking, right, and hard engineering work and hard processes, and applying a bit of, you know, human intelligence and logic on top of it as well. It's not going to architect your next platform.
Right? You still need architects, and we still need a lot of those pieces as well.
And, again, it comes back to, you know, what is it good at and what is it not?
Right? I think that's important. Sorry. I talked over you. Go ahead, Rob.
No, no, it was worth saying. Should we just spend, you know... we're just about at time, and we've got roughly five minutes.
Should we just spend some time on some of the questions that have come in, if that's okay, guys? And I like the future-proofing one. Maybe people could think about that. Can you see some of these questions as well?
I can. Yeah.
Yeah. Yeah. The future-proofing one I like. Maybe I'll take the one that came in quite early in the conversation, because it was when we were talking about the adoption curve. Maybe I'll take that one while you two think about whether either of you wants to answer the future-proofing one.
Well, because I do like that question.
There was a question right at the beginning, sorry, which was for me. It said, Rob, where does LEK fit on the AI adoption curve, which I didn't answer. So, a good question.
And I think, in the world of consulting, whether it's transformation or wider strategic consulting, it is impossible not to be using AI tools today. It's already sped up so much how we do secondary research, how we understand trends, how we understand what information exists, and also the way we do primary research, with the AI tools there. The speed at which we've adopted AI tools has been fantastic. But I would still put us as pretty late scalers or early progressives. I do think we've got more we can do as well within LEK.
But back to the last question: it's transforming our industry as much as anyone's. I mean, deep research is amazing, and, yeah, so is some of the sentiment analysis, some of the tools we're using.
Has that given you time to have a think about the future-proofing one? There are a few others that are really interesting here, but has anyone got a good answer for future-proofing investment?
You know, I think anytime we use terms like future-proofing, it's a bit difficult, right, because it's impossible to tell where these tools will be going.
So what I do like, and it's kind of back to the way that I answered the previous questions: breaking the change management component down, the process, you know, creating a habit of using these tools and frequently examining the benefits and the value of the tools that you're getting. I think that is the future-proofing.
Right? That allows you then to pivot and adapt quickly, and kind of building that muscle of using tools and then holding yourself accountable to the value of those tools and being willing to pivot very quickly if you don't see that return. And I think that's, to me, what future-proofing means in this space. Right?
Yeah. So, responding and adapting. Absolutely. Yeah. Okay.
And I think, also in the hiring process, when you're hiring engineers, you know, we allow them to use whatever they wanna use in the interviews.
So if they wanna use AI, and that's what they use every day, then that's what they can use. And I think it's important not to make it seem that it's, like, cheating in some way to use AI to help you, because, you know, that's what you're gonna be doing day to day in your role. So I think just making sure that you hire AI-forward-thinking people as well, and have that as part of your process, is gonna be a way to build your team, and, you know, it'll be natural to adopt tools when they come. Yeah.
Yeah. Yeah.
There's one that came in just a minute ago, which is at the top or bottom, depending on which way you look at it, or second from bottom. Sorry.
Exploring AI opportunities is one thing, but successful execution is another, and often the harder part. We'd all agree with that.
Execution is harder. Agree with that.
Yeah. That's right.
And the rest of this one: in your experience, how frequently do AI initiatives struggle or fail due to poor underlying data and database readiness?
Does anyone want to take this? I've got quite a few examples of, you know, data not yet being in existence, or, you know, people getting quite excited but actually finding out they've not got the data there. And I've got a few exact examples of AI allowing businesses to start collecting data that they didn't collect in the past, and that's helping them.
But does anyone have a particular answer on... I guess I just wanna say, I think that we struggle with this in every initiative, not just AI initiatives.
It's hard to, you know, find the right data to drive your prioritization, and I think it's the same for AI initiatives, but it's also the same for our scaling initiatives. You know, what do we focus on first to improve our operational processes that's gonna, you know, make sure we're not hiring fifty more people to do something we could be just scaling with our product. And so that's something we're focusing quite a lot on at the moment at Penfold: trying to get our database readiness there so we can measure that change and the success of it.
In terms of practical steps, we have hired a data engineer recently to help with this. And then also trying to use tools like the kind of inbuilt analytics from third party tools that we have, like Intercom.
We're building our data warehouse and then trying to use QuickSight on top of that.
Sorry, I realize we're nearly out of time, so I don't wanna go on too much. But I think there's lots of stuff we can do to, like, get to that point where we're like, yeah, now we can measure this.
Yeah. No, I was just realizing we can actually see each other but not see the whole Q&A; I think that was why. Yeah. We've got another minute, but should we try one more, or should we call it?
There's one that I saw in there, if I could: dealing with questions from people about AI impacting their jobs. And I think that's probably one where all of us, you know, even outside of anybody working right now, I think, are kind of looking over their shoulder, worrying about: is AI gonna come and steal my job? Right?
And I do think, you know, Nikki, you touched on this one a little bit: trying to demystify AI a bit and make it less scary to organizations, and that's why I do stress it isn't a replacement for thinking. You know, what we've found with this, and the way that we've presented it to organizations, is: look at all the boring, repetitive, laborious tasks that we've been able to remove.
We've been able to accelerate, you know, the heavy thinking. And one of the things that we've looked at, and, Rob, you know this well, at SHL, is to understand the impact at SHL of what is the amount of time that our engineers get to spend actually doing deep thinking, deep work, as opposed to, you know, that repetitive stuff. And that's a really interesting way to kinda pivot this AI piece, but people are always gonna be worried that the next thing is gonna come and put them, you know, on the unemployment market.
But I think it's really important to stress to people that the goal here isn't to remove people; it's to create more time to do more efficient, high value work and remove, you know, that repetitive stuff that, frankly, nobody wants to do. Right? And I think what we found is people have really embraced that. Demystifying it, giving people training and knowledge and experience and access to these tools also, I think, helps people to realize that there is value here, and it's not so scary.
Right?
Yeah. I'm just realizing the time, and I'm sure people have got their diaries booked up to this point as well.
No, I think that's a great answer, Steve. I think people have always been worried about change and, you know, new tools coming along. And the world of consulting has changed a lot in the last forty, fifty, sixty years.
You know, no longer is it acetates and going down to the library. You know, it's a very different world.
But let's, yeah, let's close there. Really appreciate you spending the time and talking through the on-the-ground challenges, and I hope the audience enjoyed it; it's great to have the different views. But just thanks to both of you.
Thank you to the audience for taking part and doing the polls. And, yeah, as I said, the webinar will be shared, and please do share it with people that might be interested. And let us know some of the topics you want us to explore. But thank you, everyone.