Video: Why traditional analytics are failing modern teams | Duration: 3752s | Summary: Why traditional analytics are failing modern teams | Chapters: Welcome and Introduction (12.32s), Speaker Introductions (83.815s), Session Overview (164.895s), Evolving Dashboard Expectations (250.845s), Data Visibility Challenges (415.28s), Decision Latency Costs (552.195s), Data Quality Impact (761.35s), Proactive Analytics Shift (1026.52s), Analytics Speed Challenges (1314.285s), Build vs. Buy (1582.29s), Build vs. Buy Challenges (1829.52s), Embedded Analytics Evolution (2135.8s), Embedded Intelligence Experience (2419.33s), Shifting Analytics Paradigms (2713.46s), Change Management Strategies (3112.47s), Structured Data Querying (3181.415s), Scaling AI Solutions (3272.285s), SaaS Analytics Challenges (3367.2s), AI Hallucination Mitigation (3542.065s), Concluding Thoughts (3666.64s)
Transcript for "Why traditional analytics are failing modern teams": Okay. Sure. Should we go for take two? Good afternoon, good morning, good evening, wherever you're joining us from in the world today. Welcome to the webinar between the Product Led Alliance and ThoughtSpot, where we'll be covering why traditional BI is failing modern product teams. And just to give a little bit of explanation of what we mean by that: it seems like today, product teams everywhere are rushing to add AI to their analytics stack. But largely, that means most of those efforts are focused on making dashboards better. So today, we're gonna talk about why that's not gonna work in a modern, AI-enabled world, and what it actually takes to build intelligence into a product. I, for one, certainly have a viewpoint on that. My name is Mark Hillsmith, and I lead the embedded analytics business for ThoughtSpot in EMEA. But I'm joined by two esteemed colleagues who are much better qualified than I to comment on these things. So please, would you welcome Jane Smith and Kim Fora. Kim, Jane, would you like to introduce yourselves and give a little bit of background on your experience? Sure. Of course. Well, I've been leading data teams for as long as I can remember. That's quite a while now. Up until very recently, I was leading Gumtree as managing director. Before that, I was at Checkatrade for four years, leading product and marketing. And before that, I started and led Gift Gap for over a decade. I'm originally from Barcelona, Spain. I'm aware that doesn't explain my accent. I don't know where that comes from. I've lived in many different countries, but it does explain why I'm in such a good mood, as this weather brings me very close to home. Nice. Thanks, Kim. Jane? Hello, everybody. I'm Jane Smith. No relation to Mark. I'm not related to him, but I do work closely with him. And, occasionally, we coordinate outfits. I'm chief data and AI officer for EMEA at ThoughtSpot.
So I work with our customers. I advise their chief data officers and chief AI officers on the future of data analytics, AI, and agentic AI. I'm a former chief data officer myself, of an insurance company, and I've worked across pharma, retail, and SaaS. Great. Amazing. I told you they're much better qualified than I. So thank you both for that. Really looking forward to today's session. So just a quick run-through of the running order and a couple of housekeeping rules. What we're gonna cover today: we're gonna talk a little bit about the status quo and, I guess, what you'd class as the dashboard era. We're gonna dig into what's actually broken, the shift away from that, and how we're going to do things a little bit differently in the future. Then we're going to close out with maybe ten minutes of Q&A, and then finish up with a link back to the ThoughtSpot website if you've got any particular follow-up questions. If you have questions as we run through the session, just type them into the chat window and we'll pick them up. And I think you can probably vote on there as well, so the questions with the most votes we'll answer first. And we'll do that in the last ten minutes. So unless there's anything else from you, Jane, or Kim, before we get started, we can just kick off by going into a little bit of debate around dashboards. So for me, I worked at Tableau for ten years. We broke a lot of boundaries in terms of data access at the time. And dashboards provided us with pretty good visibility into data. So I guess my opener is: what's the point of this conversation? Like, doesn't that work okay? Why is this so problematic? Kim, can we start with you? Well, happy to go first, straight in on dashboards. So I guess from my perspective, they still play a role. I think they will always play a role. Right? If we think about La Liga tables, for example, you'll wanna see those. Right?
If you think about your OKRs, how you're performing, I think you would expect to see a dashboard and see how you've evolved over time. I think every organization will require that visibility, and dashboards will play a role. They deliver on that. But maybe where the problem is, is what we expect a dashboard to deliver. Because they are not analytical tools. They will tell you what happened. They were never designed in the first place to explain why a number has shifted. And I think that's the gap that we're here to talk about, because with the new tools that are emerging, we're far more likely to get closer to that why than what a dashboard could ever deliver. Yeah. Great. I think that's a really interesting perspective. And Jane, from your perspective or, I guess, from the market perspective, are you seeing the same shift in expectations from customers? Definitely. I mean, what I'm seeing from customers is that, basically, everybody wants to get from their B2B tools what we now get from our B2C tools. So if I can track a pizza order in real time, you know, why can't I see where my shipments are, my inventory, my stock levels? Why can't I see all this in real time? I have also seen customers thinking differently in terms of dashboards. If you are a company that provides data as part of what you do (maybe you provide pharma consumption data or market research data), your customers don't just want live boards. They want to be able to query that data, and not just have their data scientists query it, but their business users too. And, yeah, finally, if you're a company that has a SaaS product offering, what we're seeing is that your customers will want a best-in-class analytics experience from your product even if you're not in the analytics business. That's what I'm seeing, Mark. Yeah. Good. It makes a lot of sense.
And I guess it tags on nicely to Kim's point. I guess the other thing that's imperative here is that companies are generating more data than ever. And most companies that I talk to, anyway, believe that they're data driven. So why are people still unable to make decisions until after the impact has already hit the P&L? Why isn't it faster? I think that there are gaps here for sure. Right? So for me, one thing is visibility, and something very different is understanding. And these things are often mixed up in boardrooms. Right? So you can have a dashboard that shows you that revenue dropped last week. You can see the data, but you still don't know what to do about it, or when you will have to do something about it. Right? The data is there. The signals and actions are not necessarily following in an automated fashion. And often, by the time that someone's noticed a drop in, let's say, a metric like revenue or profit, and they've investigated and escalated, the impact has already hit. The focus has to shift more and more to the leading indicators, which will give you those early signals on what's about to happen. Yeah. Nice. And the same question, I guess, to you, Jane. Fully agree with Kim. You know, it's that retrospective, backward-looking piece that you're getting. And just to add to that, I think when you are reliant on a central analytics team or specialist analytics skills to get your answers, you start to have this time delay to visibility almost built in. I'll take a pharma example, because I used to work with GlaxoSmithKline. If you have a leader who wants something like "show me the prescription volumes per region," you can see that view, but then you automatically think, well, which prescribers are driving that growth? So you have to go back to the analytics team. They have to either build you something else or do more discovery.
And then you think, okay, this is great, but can I see this now by, I don't know, prescribing segments: oncologists, hospitals, cardiologists? You've got another week's delay. And then you think, well, okay, this is great, but I need to know which marketing interventions against those prescribing segments are working. You know, you've just got this back and forth, and there's this constant time delay. And, I mean, that's even just your structured data, isn't it? If you want to pull in things like your unstructured data, your world data, things like maybe external prescribing trends or regulatory trends, you've got more delay again. Yeah. Got it. And so, just to round out the bit about dashboards, it's probably fair to say that they don't really give you a unified view. Right? It tends to be a single pane of glass, a fairly narrow view. And it feels like timing is probably an issue there as well. So, Kim, you've been running a marketplace at scale. What is that problem? Let's give it a name: decision latency. What does it actually cost, not just in theory, but in real impact? Oh, it's huge. Yeah. It costs in every possible direction that you can think of. Right? So I'll give you some examples. Conversion. So conversion is shifting, and you're often still running an experiment that could have been delayed, and you're not really monitoring that closely enough. That is gonna affect your business down the line, almost instantly, if you don't monitor it all at a real-time level. You start to see feature adoption dropping, and sometimes you're not gonna catch it for weeks, because the dashboard has not been set up or does not update regularly. A similar story I could tell you about pricing. Right? So with all the marketplaces I've been involved in, pricing is quite central, and there's a lot of shifts and changes. And at the same time, competitors may be shifting and changing and impacting you.
If you're not monitoring all of that, almost with alerts (I think that's what's most important for these key metrics), then you are going to start seeing a lag in the data, and that's gonna hurt, right, inevitably. And I'm sure this is across industries, across categories. It doesn't only impact tech. But the root cause for me has always been the same. In my experience, it's that the data underneath isn't structured to surface the problems quickly. And sometimes the data underneath has not been, let's say, built. It's not been set up in such a way that you can query it and work with it. And that is a bit of the danger of the situation we are in, because AI tools will very happily do that. They will close the gap for you. But without those solid layers underneath, without the quality of the data, the right resources, the right taxonomy, you won't be in a position where you trust what you're hearing, right, let alone act on it fast. And what's really important right now, I think more than ever, is that those foundational layers of data are built, and built in a very solid fashion. Because that's what the LLMs, the AI tools, will be querying, and that's what they will be guiding users from. And I would have to emphasize that, because it sometimes feels like there's a lot of work, and sometimes a lot of manual work, first before you get there. But I would say that's the key. That's the area where I would expect all businesses to focus their attention. Yeah. Definitely, the criticality of the underlying data has to be central, whether it's analytics or AI or however you're interrogating it. Right? I mean, the rubbish-in, rubbish-out premise is still incredibly accurate in terms of the quality of content you get out of whatever systems sit on top. Right? Another
perspective is that the analytics industry spent the last decade solving for data freshness: faster pipelines, real-time dashboards, faster updates. But the real bottleneck was never the data itself. I think the real issue is the gap between when a data point appears and when a team actually acts on it. So, Jane, when you see this across many organizations, where does that latency usually come from? Yeah. So picking up on what Kim said about underlying foundations: the underlying data has to be good. And I think there's a point between data and users where, like you say, there is latency. There are several points, but two big ones that I see probably the most. One is organizational. If you have a big centralized team to go through, you get that situation that I articulated earlier with the back and forth. Whereas when business users can directly query, in their words, in their language, without specialist skills, you remove a chunk of that latency. I think the other kind of point between data and user is that data layer. We have metrics that aren't consistently well defined, and I'm talking here obviously about semantic discipline. If we have a situation where customer churn means, I don't know, as one pharma example, inactive prescribers for three months in commercial, but in medical affairs it means healthcare professionals disengaging from the material or something, your data isn't going to make sense. You're not going to have consistent answers to the questions you asked. And that is as true for dashboards, and it's sort of 10x when we move into agentic and into GenAI for analytics. Yeah.
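Jane's point about semantic discipline lends itself to a concrete sketch. Below is a minimal, illustrative metric registry in Python (this is not ThoughtSpot's or any vendor's actual semantic layer; every name, definition, and SQL expression here is hypothetical) showing the idea that one shared definition of "churn" keeps two departments from computing two different numbers:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    """A single, agreed definition of a business metric."""
    name: str
    description: str
    sql: str  # canonical expression every consumer reuses

# One shared registry instead of per-team spreadsheets and scripts.
METRICS = {
    "customer_churn": MetricDefinition(
        name="customer_churn",
        description="Prescribers with no activity in the last 3 months",
        sql="COUNT(*) FILTER (WHERE last_activity < NOW() - INTERVAL '3 months')",
    ),
}

def get_metric(name: str) -> MetricDefinition:
    """Commercial and medical affairs both resolve 'churn' here."""
    try:
        return METRICS[name]
    except KeyError:
        raise KeyError(f"Undefined metric '{name}': add it to the registry first")
```

The design choice is the point, not the code: a metric that is not in the registry simply cannot be queried, which forces the definitional argument to happen once, up front, rather than surfacing later as two dashboards that disagree.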
I think you referenced garbage in, garbage out. I think it's sort of garbage in, catastrophic disaster out, once you have a situation like this. Yeah. And I don't think there's anyone we speak to, on any footing really, that isn't aware of the value of getting the underlying data right. To your point, the associated risk has now grown exponentially. If you get the underlying data wrong in the first place, then the actual outcome from that could be enormous in impact and incredibly problematic. So another view, maybe, is that product teams are still fundamentally reactive, just with better tooling now, like the AI and everything's improved. But are we trying to tackle the problem the wrong way? Like, is there a better way to approach this? There's gotta be. Yes. I think there definitely is a better way. The tools got better, but the workflow didn't. So the dashboards still need someone to go looking. Most teams discover problems when a stakeholder notices something in a board meeting, or a crisis has already hit, not when the system tells them proactively. So that's quite key. I think this is really important because, just to your point here, Mark, the stakeholders won't really know what's going on. They could be blaming the solution. It could be the dashboard. It could be the AI solution. But they don't know that, very often, underlying it is the data and how the infrastructure has been built in the first place. So the shift that needs to happen is to go from "go look at the dashboard" to "the system tells you what has changed." That's what I'm expecting analytics teams, data teams, to be focusing on. Right? So most of your KPIs should be in alerting mode. There should be an expectation of proactivity. You shouldn't have to ask the question, pretty much. You would wake up in the morning knowing this has changed, and this is why. Right?
And that's the gap between reactive reporting and proactive signal detection. Most teams have not made that jump yet. Yeah. Really interesting. And, Jane, you must see this a lot. Right? You must have a point of view on this. Yeah. Absolutely. I mean, don't take it from me. You heard it from Kim, the managing director of Gumtree. But exactly as Kim described, you want to move into this proactive era. And the technology is really here now. It's this shift to agentic analytics, really, isn't it, from reporting to proactive detection, and also action. So let's not take a pharma example; let's take a SaaS company, say, like ours. You want something like, "show me all Q3 deals where product usage in the trial has dropped this week." I mean, that's great. That's your question, and you can get a really good answer to it, but it's reactive, isn't it? It's looking back. And what I want, working in a SaaS company, is agents that are running in the background, and when they spot deals where usage is dropping, something pushes me, or the account executive, a Slack message saying: warning, company X's trial usage has dropped by 35%. I reviewed their Zendesk tickets. They've hit a bug. I've dropped an email to your founder. You can start to do these things. That's the shift that we need to see. That's what analytics needs to do for us. Yeah. So it's not just about reflecting on a particular data point, like something's changed, but it's understanding, or starting to understand, what has happened and why it's happened. But also pointing us in a direction of what to do next. Arguably, actually acting on that as well. Right? Yep.
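The detection half of the background-agent behavior Jane describes can be sketched in a few lines. This is an illustrative check only, not ThoughtSpot's implementation; the Slack push, Zendesk review, and email steps are left out, and the account names, data shape, and 30% threshold are assumptions for the example. The point is that it runs on a schedule rather than waiting for someone to open a dashboard:

```python
def detect_usage_drops(accounts, threshold=0.30):
    """Flag trial accounts whose usage dropped by at least `threshold`.

    `accounts` maps an account name to (baseline_usage, current_usage).
    Returns warning strings that a background agent could forward,
    for example as Slack messages to the account executive.
    """
    alerts = []
    for name, (baseline, current) in accounts.items():
        if baseline <= 0:
            continue  # no baseline yet, nothing to compare against
        drop = (baseline - current) / baseline
        if drop >= threshold:
            alerts.append(f"Warning: {name}'s trial usage has dropped by {drop:.0%}")
    return alerts
```

For instance, `detect_usage_drops({"Company X": (100, 65)})` would produce a single warning about a 35% drop, matching the scenario in the discussion, while an account down only 10% stays quiet.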
Interesting. I ask this a lot. Maybe we can put it in the frame of your users or your customers interacting with the AI tools that are available on the market, whether that's ChatGPT or Copilot or whatever it is you use. When they come back to analytics dashboards, what suddenly feels different to them in that exchange of experience? It feels like going back to the dark ages, if I'm honest with you, Mark, in so many ways. Right? It's just slow. I mean, the thing is that we've all set up an expectation, or AI has set an expectation for all of us, where you just ask a question in plain English and you get a clear answer in seconds, and then you expect the rest of your life to function in the same way. And inevitably, no. Analytics is still yet to catch up to deliver that quality of response. Can you really trust this? Can you really believe it? Because the ability of AI tools to give you a plausible but fabricated answer is still quite high. And, you know, as much as you are gonna ask for sources, it is a space that needs a lot of maturity, and AI needs to get a lot better at being honest. But, yeah, the contrast, I would say, is still brutal, and the expectation inevitably has shifted. People now expect to ask "why did this number drop?" and get an answer right away, not a dashboard. And this expectation will only widen. Every month, consumer AI tools are getting better, and the bar for what feels acceptable inside your own product just goes up. And if your analytics experience is still "well, here are 15 filters and here's a pivot table," your users will have already moved past you to get an answer to their question. Yeah. Yeah. I think that's entirely right. Jane, I'll throw it over to you. Yeah. I mean, where I've seen expectations change is because, again, we've got used to these tools in our own lives.
People want to be able to ask a question and get an answer that pulls from their organization's structured data, their unstructured data, and, of course, world data, without going into three different user interfaces, or probably more. So this, of course, is the MCP protocol I'm talking about, which has huge implications for analytics. But, you know, take a financial services example: if an analyst wants to ask something like, "which of my high-value clients are at risk of churn this quarter?", you want to be able to ask that question of a single analytics interface and have it pull on your CRM interaction data, your customer support transcripts, all the stuff that's saved there. So that's your unstructured data, of course. External market data: trends, things that are going on, shifts, impacts. And, of course, your structured data: your transactions, your customer data from your lakehouse. So you want to be able to ask that question and get your answer in the one interface, like we do in our ordinary lives. That's the change that I've seen. Yeah. People don't wanna jump around. Right? I think the days of having to leave the application flow that you're in, go to an analytics tab or something to get an insight, and then go back into the main application, should be gone by now. Feels that way. Like, we should be able to do something better, more sophisticated than that. I think the other point on top of that is speed. Right? So if we go back five years, product decisions were kind of quarterly. Now they seem to be continuous. What does that mean in terms of demand and expectation on the analytics layer underneath? I wouldn't say it's getting any easier. It's changing the model. And I think traditional analytics was very closely tied to financial cycles in the past. Right?
So you would have your annual planning, and things started breaking down in the world of agile, because you realize that one year is a very long time, and goals should not be set that far out. You should move to quarterly OKRs, and then you start building your frameworks around that. But now, you're right. I mean, even quarterly OKRs are being put to question. They're not moving fast enough, and many businesses are moving to a more regular cadence. Could be monthly. Could be weekly. But, yeah, that puts a lot of pressure on analytics because, crucially, you wanna ensure that you've got the right infrastructure in place, the right setup, allowing teams to then run in an autonomous fashion. And that takes time. That takes resources. That takes a significant team focus to ensure that everything's designed around that rhythm. But today, product teams are making feature-level decisions daily. And we always ask these questions of ourselves: Should we roll this out wider? Is this segment responding? Should we kill this experiment? The analytics layer can't run on a quarterly cycle when the product team is running on a daily one. So you end up with teams making fast decisions on slow data, or worse, making decisions on gut feel, because the data is not ready. And, yeah, that cadence mismatch, I think, is one of the most underappreciated problems in modern product teams. It is a very common issue. I don't think it has an easy answer, by the way, but inevitably, we need to keep working on ways to close that gap. Yeah. Right. And I think this is a question people would throw at us at ThoughtSpot. I know you spend a lot of time, Jane, talking about these types of things. What's your view? I'm really sorry. I got sucked into answering a really good question in the chat, even though you did ask people to put their questions in the Q&A. So I won't do that again. Do I skip that question, Mark? Yeah.
Of course. Yeah. No problem at all. I mean, I think it's just an interesting viewpoint. Right? Because there is always gonna be a slight mismatch in terms of cadence. But I think what we're increasingly seeing is demand on product teams to be able to react as quickly as the business requires. And that's just always gonna be difficult. Right? So we're gonna have to rely heavily on technology in order to fulfill that, and to accelerate the human efforts to work within a cadence that matches the business, to make all of those things work in sync. But let's move on to a bit of a step change. I think we've probably covered off what we were doing technology-wise previously, and, I guess, the underlying reasons as to why we need to make a change. If we start to look at the shift now: product teams, when they realize that their analytics isn't keeping up, that they can't match that cadence, most people, certainly in the conversations that I have, look at trying to build their own. If you try and build your own, what does that look like, kinda eighteen months into that cycle? Is this something you've got some experience of, Kim? Oh, yeah. For sure. I mean, inevitably, you've got to build some layers on your own. Right? If you are a tech organization, data will be very central to you, and you will expect the data engineering team to have the right pipelines built, so that whatever layer you put your questions to, whether you build it or buy it, will be able to surface answers. But there's a lot of foundational work that needs to be done. Often, this starts as a three-sprint project, and then it turns into a platform team.
And then, a few days later, you've got dashboards and a data pipeline, and you're maintaining not only that active pipeline but a cache layer and an access model, and you've got a team as well: you've hired maybe two engineers to keep the lights on. And that engineering debt is real, and it does compound over time. Right? So every bit of spend on analytics infrastructure is incredibly valuable, but, of course, it comes at a trade-off with other areas as well, like building your core product. And that's a challenge that I expect will evolve and needs to continue evolving. Because when you talk to stakeholders, I've seen these cycles happen so many times. First of all, it would be: what's the number? How many sales are we getting? We need to know the conversion rate for our sales funnel, as an example. Right? And once you've managed to get that right, then you shift very quickly to: but what's changed? What's the change versus last week or last month or last year? Right? And that's what dashboards do very well. But then you start moving, especially when you see that distinction between KPIs and OKRs, to just wanting to know what's changed. You know? Just tell me when something important happens. And those requirements, that cycle, are moving quicker and quicker. Right? The expectation of moving from "I build something" to "alert me if something breaks," which is more of an operational type of report: you would expect to see that from day one. Right? And still, many businesses don't have that, by the way, because it does take significant effort to build all that infrastructure and tooling.
But going back to your question, the build-versus-buy decision is one that I think every business I've worked with does need to make. Often, we end up buying tools that help us do that, but clearly not at the very basic layer; there's a lot of piping work that will be happening internally. But at the visualization layer, absolutely right. What I tend to think when making these decisions is: build what makes you different, but buy what makes you fast. And in this case, the data visualization and AI layers that come on top of that, in all cases, I would expect to outsource. Yeah. Good. Got it. Yeah. I think that's really good insight from your side, Kim, from a person that's been in the position and gone through that cycle. I mean, I talk to people about this a lot, and an oft-used line is: analytics is iterative. Right? Trying to answer the problem in a build is really difficult, because for every data point, for every answer that you provide, there are always more questions. That's an ongoing cycle of iteration, of asking and answering, which is a really tough thing to build. But, Jane, I think building analytics in-house often feels like the company that's taken on that decision is taking control of the problem for themselves. What teams quickly realize is that they're also taking on the full infrastructure cost, the burden, the security, the governance, the scalability, the reliability, all of those points as well. Right? So when companies decide to build internally, what responsibilities do they actually end up owning? Yeah. Quite a lot. I really like what Kim said about build what makes you different and buy what makes you fast. And I'm referring back to that because I think that's hard for those of us who work in product, and I assume pretty much everybody on the call works in or around or very close to product.
We like to build things. Right? We like to try things. We like to own things. And we have this kind of mentality, I think, of "how hard can it be? We'll build it." So I think, when it comes to analytics, what can start as a quick analytics feature can mutate into a big tech-debt piece very quickly. Because, and I think, Kim, you referenced something like this, your product team are not focusing all their energy on their core product. They're spending probably about 60% of their sprints maintaining what, honestly, I'm going to say is probably a very mediocre, hard-coded BI tool. They're either going to hard-code something where, every time there's a new request or a new feature, it's going to need a two-week sprint from the engineering team, or they're gonna start caching data or moving it back and forth. Now your data is stale, and it's really going to hit your costs. So that is gonna put you into the realms of super expensive. Now you might say to me, we're not gonna hard-code; we're going to build something dynamic. But then what you're doing is really building a boutique analytics company inside your business, which you may want to do. You may feel fine about that, or you may think, well, this isn't really our business. Either way, whichever way you go, it's very hard to have enterprise-grade, multi-tenant, row-level security in a homegrown tool like that. It really only takes one missed WHERE clause in a custom script for customer A to accidentally see customer B's revenue data, and then you're in a whole world of pain. And one thing about build versus buy that somebody said to me a few years ago, I always keep in mind.
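Jane's "one missed WHERE clause" failure mode is easy to illustrate. Here is a minimal sketch of the usual mitigation (table, column, and function names are all hypothetical, and a real system would enforce this at the database or platform layer): route every query through one helper so the tenant filter cannot be forgotten, and parameterize it so it cannot be injected:

```python
def tenant_revenue_query(tenant_id: str):
    """Build the revenue query with the row-level filter applied centrally.

    In a homegrown tool, each custom script has to remember its own
    `WHERE tenant_id = ...` clause; forget it once and customer A sees
    customer B's rows. Funnelling every query through this helper makes
    the filter impossible to skip.
    """
    if not tenant_id:
        raise ValueError("tenant_id is required; refusing to run unfiltered")
    # Placeholder plus separate params, not string interpolation,
    # so the tenant value can never rewrite the SQL itself.
    sql = "SELECT SUM(amount) FROM revenue WHERE tenant_id = %s"
    return sql, (tenant_id,)
```

The helper returns the statement and its parameters separately, in the style of DB-API drivers, so the calling code hands both to the database; there is no code path that produces an unfiltered query.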
In the AI, GenAI, agentic era, you're buying reasoning. So you don't buy your tools anymore the way that you maybe previously did. You almost have to select them the way that you would select a lawyer. You're looking for that component of reliability and ethics. And I think the same can be said when you want to build something: you have to be prepared to take on that level of assurance. That's where I stand on it, Mark. Yeah, fascinating. And again, you've spoken to a lot of people about this problem statement, and I think that's a really good point. The sense of ownership that comes with building it yourselves is the reassurance most product builders look for. It feels like: we can own this problem ourselves. But then there's the reality of actually trying to create an analytics solution and own it and maintain it. It's not just upfront cost. It's the ongoing maintenance of the deliverable, and keeping pace with the market. And certainly in an AI world, at that accelerated pace, it's almost impossible. That said, teams that went the vendor route didn't necessarily escape the problem. What's the failure point for embedded analytics, do you think, Kim? Well, I guess on one side it does move that visualization layer into your product. And sometimes that will be the right call. You may have your own dashboards. In the case of Gumtree, you will want to see, let's say, the exposure of your ad; similar story at Checkatrade, if I'm honest. So there's some internal analytics that you will want to work on. They tend to be more on the dashboard side of the spectrum, if you want. They're usually very simple metrics that you want to showcase. So you will have users that are staring at a chart, interpreting the metrics, and deciding what to do on their own.
But while the analytical layer is improving, the decision layer isn't necessarily improving as much. Right? So you've got the data closer to the user, but you didn't get the user closer to the action. And often that is where I'd expect teams to focus their attention. So analysts need to be thinking about outsourcing the visualization layer as much as they possibly can. It's not about making pretty graphs and pretty charts; it's a lot more about thinking what are the underlying challenges or insights that this organization needs in order to move their goals and their OKRs forward. Right? That's usually where the problems lie. And often analyst teams confuse that. They think, well, I'm just working for a stakeholder. The stakeholder wants this dashboard, so I will provide them with the dashboard. I would expect them to shift that approach to: why? What's the need for this dashboard? What's the underlying question you're trying to answer here? That, I think, is quite key in order to build a business that's far more mature and data oriented. Yeah, that's a really good point. A lot to unpack in there. I mean, Jane, this is the world we live in daily. Right? What do you think? I'm going to say a cheeky thing. I don't know if I'm allowed to, but I'm actually going to ask you what you think, Mark, because you are ThoughtSpot's embedded expert. What do you think? I'm just here to ask the questions, Jane. Yeah, sure. I mean, it's a really valid point. Right? The obvious thing to do is to outsource it. On one hand, the motivation is to own it yourselves, but I think most people these days can look at the problem and say: that's not a core competency. We ought to outsource this to a third-party vendor.
I think the criticality is choosing the vendor that enables the strategic direction of your business to maintain the right trajectory. And I think it's really easy to look at where the market is today and say: dashboarding solutions are still right, that's a valid way of presenting data, without really looking into what you actually want to be able to deliver. The north star for embedded analytics, certainly since I've been involved in the market space, is: can I help my customers make better decisions? If I can help my customers be more successful using my data, then that's what wins. Because, ultimately, if they can get more value out of my platform, that makes my platform incredibly sticky. It means they'll use it more. They'll arguably spend more money with me. Transactional volumes will go up. Churn will decrease. And so if you keep that core, central north star at heart and then start to look at who is the right embedded analytics vendor, that moves beyond a dashboarding solution. It's got to move away from point one that we covered today — why dashboards are substandard in today's market — through the shift we've just spoken about, through natural language and AI capabilities, so that now I can get my business users, my domain experts, really close to the data. So they can ask specific business questions and not just get an answer to the question, but also decisioning, thinking, predictions, recommendations, and next best action, which is ultimately where everyone wants to be. If you can have that on top of your data, then that's the solution that really works with you and will keep pace with the market, rather than falling into the trap of: what are other people doing that looks like a dashboarding solution? I think that's where the procurement of a third-party solution kind of falls down. Good answer, Mark. Thank you very much. You
can mark my homework. You can tell me how I did. I have to keep you on your toes. Yeah, of course. Thank you. Don't want you getting too comfortable. Yeah, of course. I'm going to go back to asking questions. We're going to get some kind of revenge question now. Yeah, that's right. I'm going to ask a difficult one now. So what does embedded intelligence actually look like inside a product? Not conceptually — what does a user experience differently, is my question. I'll give you some time to think, Jane. I'm feeling generous. I'll come to you first, Kim. Well, I was going to pass it over to Jane. This is definitely tricky, so feel free — it's Jane that deserves this one, not me. But let me think about this. So what does embedded intelligence look like inside a product? Right? Maybe the best way to phrase this is that the insight is coming to you. It's inside your workflow. So it's less "here's what happened," and more "here's what changed, here's why it matters, this is what you should consider doing." Right? It's almost like the product is becoming the analyst, in a way. It goes far beyond just telling you the number. It's almost like the dashboard is coming to you, but it's already done the thinking. Let me give you an example of this. I advise a business called EarthMark, a sustainability data business that takes raw public data, normalizes it into a source, and embeds it directly into ecommerce checkout flows, across tens of thousands of retailers. The retailer doesn't want a sustainability dashboard. They want the scores right there in the customer journey, at the point of decision. That, to me, is what I see as embedded intelligence, and that's how consumers perceive the product. There's a lot of data work happening in the background just to surface that EarthMark score.
And the architecture behind it — collection, normalization, scoring, delivery — is the same pattern that I see across many verticals and across many product and tech organizations. Yeah, really interesting. I love that EarthMark story; it's a fascinating business. Jane? Yeah, sorry, I was just googling EarthMark. That's really cool. I'm going to pick up on Kim's point. I like that. For me, what does embedded intelligence look like? I think it's when the product's learning system underneath becomes the product, rather than the interface. That's how I'm thinking about it. So maybe similar to Kim: if we take something like an online retail checkout, traditionally everybody really sees the same grid, don't they? The add-ons, the bundles. But what drives the revenue growth isn't the UI. It's not what they're seeing. To an extent, obviously, if it's terrible it's going to have an impact. But generally, it's not going to make that much difference. What is going to drive revenue, what is going to make a difference, is the learning system underneath that's able to predict, for that customer, which two items are going to maximize their basket value for them specifically. And it's going to be able to adapt to each click, each dwell, each decline as the customer goes through their journey. Now, sure, we have some personalization today. We do. But I think it's mostly rules based, segments based. And maybe going back to Irina's question that she put in the chat, which is really good: it's nothing like the kind of autonomous, agentic, background learning system that we are going to see. Nice. Okay. Really interesting points there. As I said, I love that EarthMark story.
And I think some of the points you picked out there, Jane, are really relevant in today's market, and they tag on to a lot of the conversations I have day to day. I'm conscious that we're running against time a little bit, so I'm going to move into the last phase, which is a little bit more of a free hit, I think. So: what do we do differently? And let's phrase this in a way that makes sense to the audience. If a product leader listening right now has a roadmap full of more reports and dashboards — kind of the status quo that we've spoken about, more granular features based on existing architecture and technology platforms — what's the first thing you think they should challenge? Let's make this a lightning round as well; I want to make sure we allow some time for questions from people attending this webinar. So I'll go quickly. I'd say reframe the metric as an action. What would we do differently if we saw this number move? And if you cannot answer that, you probably don't need a dashboard. You may need a completely different product in the market, but that's quite key. Reframe that metric as an action. I'm going to ride the coattails of Kim's answer. That's alright. I'm also going to reject this idea of starting from the dashboards and the roadmaps. I'm going to say: what we should start with is, what's one question that, if you could get good answers to it immediately, would really move the needle on your business? It would really materially change how you make money, how you do business. And then I would think, what is the workflow so that you have an analytics agent that can be monitoring that data for you and proactively bringing that insight, so you don't even have to go to a dashboard. It's coming to you. That's where I'd start. I'd leave a whole lot of stuff behind and go there. The technology is here. Yeah. The technology is here today. Right?
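[Editor's note] Kim's "reframe the metric as an action" and Jane's "agent that monitors the data and brings the insight to you" can be caricatured in a few lines. This is an illustrative sketch, not from the webinar; the metric names, thresholds, and actions are all invented. The point is structural: each metric carries the action you would take if it moved, and a watcher pushes the insight out instead of waiting for someone to open a dashboard.

```python
from dataclasses import dataclass

@dataclass
class MetricRule:
    name: str
    threshold: float  # alert when the latest value drops below this
    action: str       # what we would actually do if it moves

def check_metrics(rules, latest_values):
    """Return proactive alerts: what changed, why it matters, what to do."""
    alerts = []
    for rule in rules:
        value = latest_values.get(rule.name)
        if value is not None and value < rule.threshold:
            alerts.append(f"{rule.name} fell to {value} "
                          f"(threshold {rule.threshold}): {rule.action}")
    return alerts

# Hypothetical metrics: a rule with no attached action would fail Kim's test
# ("what would we do differently if this number moved?") before it gets here.
rules = [
    MetricRule("checkout_conversion", 0.025, "review latest checkout release"),
    MetricRule("weekly_active_users", 12000, "trigger re-engagement campaign"),
]
print(check_metrics(rules, {"checkout_conversion": 0.019,
                            "weekly_active_users": 15000}))
# -> one alert, for checkout_conversion only
```

In a real deployment this check would run on a schedule against live data and deliver alerts into the workflow (chat, email, in-product), which is the "insight comes to you" shape both speakers describe.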
I think it is a mindset change, across the board. For more than a decade, probably twenty years, we've had the answer to data questions largely wrapped up in dashboards. Right? Technology has now moved on, and it's time for a different approach. I think the market is going to demand that as well. Expectations have now dramatically changed in terms of what people can achieve, and that's probably what's going to drive product delivery forward anyway, because the customers, the consumers, will demand it. Any final comments just before we move into Q&A? Anything you want to pick up on from the discussions we've had so far, or comment on further? Otherwise, we'll just move into some questions. Happy to move to Q&A. Good. So I'm just going to take these from the top down, I think. If I just click on a button, there's a question. So, from Katie: what does good look like twelve months after making the shift? How do you know it's actually working? Can I take that one, Jane? Yeah, all yours. I was going to offer it to you, actually. Oh, sorry. So, what does good look like after twelve months? I think it looks like an analytics team that are not answering what I think of as first-line, second-line questions. They're not answering the basics that business users can ask themselves: what's my churn? What are my sales? Why did this drop? Why does this look different? Business users can just do that for themselves. Instead, your analytics team are getting into the much more substantial questions about your business. They're almost acting like consultants to the business. They're spending their time on the harder things, like: has our investment in Amazon cannibalized our website sales?
Something like that. That's what I think it looks like. Yeah, I agree. And specifically, I'm talking about customer-facing analytics here. I would always advocate for getting something out into the market and iterating on it. Your customers will feed back pretty effectively on what works and what doesn't. So I'd always advocate for the approach of: get something out there, do the appropriate testing with the market audience. You can always do that pre-release, in betas. But get something out there and check it live with some friendly customers initially. Do small releases, iterate, and then launch fully from there. My feedback is always that the customers will let you know what's good and what's bad pretty quickly. So yeah, I always advocate for that approach, because perfection is not quite achievable. Right? Okay, next one down. Who wants to take this one? If most teams are still reacting despite better tools, what structural change actually moves an org to proactive decisioning? Is it process, ownership, or something else? What a tricky one. It is a tricky one, but I think it's a very cultural one. Right? And it probably comes from the top. It comes from leadership teams. If you ask a team to "build me a dashboard," that gets them far removed from the actual thinking, the decision process. Right? And often the way I try to challenge myself and challenge leaders to think about it is: well, what are we trying to do? What's the goal? What's the objective? And try to get data analyst teams very much involved in that decision process.
Because once you do that, they're going to be the first ones craving better tools, a better understanding of your product, a better understanding of the design experience — getting very close to the build and influencing the business strategy as a whole. So I'd probably say it's a cultural shift, which is the nature of this question, and one in which moving away from your traditional siloed way of thinking is going to be quite key, towards a more collaborative, performance-led culture. Yeah, nice. Anything you want to add to that, Jane? Yeah. I think good old-fashioned change management here is incredibly helpful. It's creating champions. It's leading by example. It's some of the things we do at ThoughtSpot. We basically run our entire business off Spotter, which is our analytics agent. We rarely use slides anymore; the expectation is that when everybody goes into a meeting, you don't use slides. You just connect your laptop to Spotter, you ask your questions there, and you get your data. Things like that make change management a shorter job than it would be otherwise. Nice. Okay. We've got a lot of questions, so I'm going to keep moving on these. Always wary of a question that starts with "devil's advocate." So, first off: devil's advocate to building your own. Why shouldn't companies rely on privately run, guardrailed Claude Code with MCPs for analytics, where they can safely grant access to code repos for interpreting products, infra, and tooling? Can I take this one? Please, be my guest. Thank you, Katja. This is a great question. So, LLMs find it very easy to query unstructured data. They find it a lot harder to query structured data. When an LLM is looking through your rows, your columns, your tables, if it sees something like "NE," it doesn't know: is that Northeast, or is it Nebraska?
So querying structured data has historically been difficult and tricky. And it's also something you generally want to do very, very safely, rather than just have an LLM running over it. Where you get straight natural-language-to-SQL translation, then unless you have special skills, you can't necessarily understand it. It goes into a black box, and you don't know what's being returned to you from your structured data. What ThoughtSpot does is a tokenized search. Your natural-language questions are translated into tokens, and the tokens generate the SQL. So it's highly deterministic SQL, and you do not get any hallucinations back from your structured data. That also means that anybody who's asked a question and is looking at an answer can always look back through the tokens to see how the system arrived at that answer. So that's what I consider to be the best enterprise-grade security, and that's my answer to why I wouldn't run something like the suggestion you've made — a private Claude Code — on top of your structured data. Can I add a bit of nuance to that, Jane? Sure. Because I've started a business. I've been running it for a couple of years, but very focused for the last three months on AI. It's called Early Phoenix, and Early Phoenix is an AI-powered business intelligence platform. So I'm supporting retailers and classifieds businesses in understanding how they're competing in the marketplace and where they're positioned versus their competitors. And when you're running on your own, you probably will want to consider approaches like this one. If you're starting from scratch — for me, Claude Code is running pretty much everything I do, from building a website to sending an email. Right? And that may be what gets you started.
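[Editor's note] The tokenized-search idea Jane describes can be caricatured as follows: free text is first resolved against a governed vocabulary of tokens, and only those tokens are compiled to SQL, so unknown words are rejected rather than guessed at. This is an editor's sketch of the general pattern only, not ThoughtSpot's actual implementation; the vocabulary, table name, and columns are invented.

```python
# Governed vocabulary: each recognized token maps to a vetted SQL fragment.
VOCAB = {
    "sales":  ("measure",   "SUM(sales_amount)"),
    "churn":  ("measure",   "AVG(churned)"),
    "region": ("attribute", "region"),
    "by":     ("keyword",   None),
}

def compile_question(question: str) -> str:
    """Deterministically compile a natural-language question to SQL.
    The same question always yields the same SQL, and words outside the
    vocabulary raise an error instead of producing a guess."""
    measures, groups = [], []
    for word in question.lower().split():
        if word not in VOCAB:
            raise ValueError(f"unknown token: {word!r}")  # no hallucination
        kind, sql = VOCAB[word]
        if kind == "measure":
            measures.append(sql)
        elif kind == "attribute":
            groups.append(sql)
    sql = "SELECT " + ", ".join(groups + measures) + " FROM facts"
    if groups:
        sql += " GROUP BY " + ", ".join(groups)
    return sql

print(compile_question("sales by region"))
# -> SELECT region, SUM(sales_amount) FROM facts GROUP BY region
```

The token list doubles as the audit trail Jane mentions: because every clause traces back to a named token, a user can inspect exactly how the answer was derived, which a free-form text-to-SQL black box can't offer.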
But the more you scale, the more you grow, the more you're going to want to start thinking about specialist tools that take you a level further, and which are, I guess, easier to work with across multiple teams, or shareable with stakeholders who maybe don't have that technical background. Right? So I think it's still a valid solution, especially as you're starting. But as you scale, I'd probably say it may have reached its limits. Yeah, agreed. Really interesting. I think there's a lot wrapped up in that, but I'm conscious we've got a lot of questions here, so I'm going to move on quickly. Okay, next one. We're building these tools for SMEs at Luca — how do you see these challenges being different? More SaaS use, but a large number of providers, for example? Okay. I have less experience with SaaS than you, Jane. Sorry, just to understand the question: "we're building these tools" — does that mean analytics tools? That's my assumption, yeah. Okay, that's fine. So: how do you see these challenges being different, more SaaS use, but a large number of providers, for example? I think it's maybe talking about long-tail delivery, is what I'm taking from it. SaaS applications are typically low cost, delivered at large scale, is my reading of it, but I might have misinterpreted that. I have a viewpoint on that, because embedded analytics is simply the perfect foil for SaaS delivery, for me. We probably do more business with SaaS vendors with really scaled-out, larger estates of customers than with anybody else.
And so it's a really perfect fit, because often companies focus a lot of their attention on their larger enterprise clients, who warrant human touch points. That's much more expensive, but they can afford to have human time and materials spent on serving the needs of those types of clients, whereas the typical longer tail needs to be serviced by technology, simply because of the cost point. Right? And so I would argue that the best use cases — where the most value is derived — are arguably those scaled-out, lower-cost-point, more SaaS-type deliveries: delivery at scale, at lower cost, but maximum value, because they're also typically the types of organizations that can't afford or don't have their own analytics teams. They haven't got people in the business to interpret data points for them, so they're reliant on things like third-party application software having really good data capabilities to service their needs. I'm hoping I've interpreted the question right, but if anyone's got any commentary on that, please feel free to pitch in. Cool. Okay, let's have this one. It looks like quite an interesting one. How do you deal with issues of AI hallucinations with huge data volumes? I.e., what is the AI cost of going over billions of records? Oh, please, can I have this one? Yeah, you shouted first, so all yours. Thanks. Thank you, Tavik, for the question. So, how do you deal with the issue of hallucinations? Well, on structured data, you have a solution that goes through a tokenized search, like we mentioned, so you get deterministic SQL back rather than a black box, which could hallucinate. As for your unstructured data — well, we tend to have a lot of discipline around how we govern structured data, but not unstructured. So I'd never recommend that you point your LLM at your entire network drive. Point it judiciously. Oh my gosh.
You've got six seconds left, Mark. I don't know if we're going to get cut off, so I'll finish really quickly. Yeah — so, always be judicious with your unstructured data. Be mindful about what you point your LLM at. And as for running costs: you have to be very thoughtful in how you allow your business users to go through LLMs. I would always advise having a central gateway that people are allowed to go through, and keeping your FinOps very close to you. I've been in situations before, in some early experimentations with LLMs, when they were much less powerful than they are now, where one ended up misunderstanding the schema and running an unoptimized query across millions and millions of policyholder records in an insurance company, and it burned through credits really, really quickly. So whatever you can do to put yourself in a position where that's not happening is going to make your CFO happy. That's great. I'm going to wrap it up, because I don't know whether things blow up if we overrun. So thank you, Kim, I really appreciate your time and input today — really fascinating points. And thank you, Jane. I think we've heard some really fascinating points of view today. We've established that dashboards probably don't do the job we want them to do in the market today. We've talked through how that's shifting, and arguably what some answers are for the future. Embedded analytics was really successful in helping your users see what has happened — the dashboard approach. Embedded intelligence is helping them decide what to do next. And if your roadmap doesn't reflect that shift, you're probably not building for modern-day expectations, is how I'd wrap it up.
There's a QR code on the screen there. If you want to follow up on this conversation, please scan that and reach out to us. I hope you found it useful. We didn't quite get a chance to finish all of the questions, so apologies for that. But hopefully it was an energetic and interesting session. Thank you for your attendance. I wish you all well. See you again soon. Thanks. Thank you.