Video: Future-proof marketing measurement for CMOs | Duration: 2792s | Summary: Future-proof marketing measurement for CMOs | Chapters: Welcome and Introduction (6s), Marketing Intelligence Evolution (85s), Digital Attribution Era (212s), Tracking Signal Loss (361s), Performance Marketing Imbalance (506s), Last-Click Attribution Challenges (637s), Marketing Intelligence Solutions (813s), Data Integration Foundations (970s), Marketing Mix Modeling (1094s), Incrementality Testing Explained (1384s), Digital Attribution Methods (1644s), Triangulation and Optimization (1860s)
Transcript for "Future-proof marketing measurement for CMOs": Hi, and welcome everybody to our live session. It's great to have you all joining us. It would be really awesome to see where you're joining from and who you are, so please feel free to pop a message in the chat and let us know. We love meeting all of you from all over the world. My name is Emma. If you don't know me, I'm the community manager here at PMM Alliance, and I am super, super excited to be introducing our session today on future-proof marketing measurement for CMOs. It's brought to you with our wonderful partner, Funnel, and we have a really, really exciting guest speaker as well. But before I introduce him, I just want to remind you all to continue using the chat throughout the session. And if you do have any questions, please feel free to pop them in the Q&A tab throughout. So we've got János here from Funnel. He's the VP of measurement. He's got over twenty years of experience in marketing intelligence and AI-based attribution modeling. So I think it's safe to say you're in pretty good hands for the next hour or so. So on that note, I'm gonna hand over to you. Enjoy the session, everyone. Thanks a lot, Emma, for the introduction. And thanks, everyone, for joining. I'm very happy to talk about why the future of marketing is intelligence. Yes, as Emma said, I have quite a bit of experience in measurement and AI and building data science teams. And what I would like to talk about today is, of course, with all the hype that we have in AI and specifically in generative AI, which is, of course, a huge lever to optimize our marketing holistically, we mustn't forget that we also need real, actual human intelligence to really make the most use of all the awesome artificial intelligence tools that are out there. So without further ado, let's dive directly into it.
And what I'd like to do for the start is go through a couple of paradigm shifts that we as marketers went through in the past couple of decades, which allowed us to arrive at this sort of renaissance of the brand, as we like to call it. So, this recognition that it's actually a really worthwhile exercise to invest into building our brand equity and invest into brand advertising. I'd also like to discuss how marketing intelligence, and specifically marketing measurement, facilitates this development. But let's first jump back a couple of decades into the era of the Mad Men, the era of brand, where iconic brands such as Microsoft, Nike, Nintendo, and of course also Apple were built, all completely without any sort of cookie tracking, without any sort of Google Analytics, without last-click attribution. And what we also have to keep in mind is that marketing mix modeling had already been used back in that era by big FMCGs such as Procter & Gamble, for example. So marketing mix modeling, even if some maybe see it as the holy grail and the new shiny thing that will solve all our measurement problems, is not really that new a thing. And then in the 2010s, we really got into digital attribution and everything seemed trackable. If a marketing activity wasn't trackable, it was probably not worth doing, at least that's what we thought back then. And we kind of neglected building our brand. We believed that last click is a good way to measure the effectiveness of our marketing activities, and there were also times where we thought that retargeting is one of the most worthwhile exercises to do. From my personal experience: I was working at an agency, and we had a client that actually thought they should put the whole marketing budget into retargeting. And that, of course, was flawed, because they believed the retargeting numbers were correct.
So there were retargeting webinars out there that proclaimed that they could increase conversion rates by 600 or even 1,000 percent. And one of the outcomes of that era was that we had thousands of different vendors in that space that all kind of profited from this tracking data that was available. If you, in 2015, went on any sort of website of, for example, a newspaper or publisher such as the New York Times, you would probably be tracked by roughly speaking 30 different companies, each of which set a third-party cookie on your device. And then in around 2018, tracking restrictions came into place more dominantly. For example, GDPR was implemented. And within that time, we realized as marketers that we need other methods to measure our marketing holistically. We couldn't just rely on tracking data and multi-touch attribution and things like that. And with that we also realized that, of course, we should include offline marketing activities and non-marketing effects into our marketing measurement and marketing intelligence. That helped us to really realize that we can also measure the impact of our brand activities and upper-funnel marketing activities. So with that, we realized that there is a future of a more balanced era, where we balance out brand activities and performance marketing activities holistically. I'd like to dive into a bit more detail on why this shift is happening, and of course, it's all driven by this dramatic loss of tracking signal, starting as I mentioned in 2018 with the GDPR as one of the main events, so to say, that sparked that whole development. But also companies and organizations such as Mozilla, which announced tracking protection for Firefox, and then of course Apple, which was probably leading that from a technological perspective. In 2020, Apple announced dramatic changes to the accessibility of the IDFA.
The IDFA was basically a user ID that advertisers could use to track users on iOS devices, so on Apple devices. And that loss of tracking signal on iOS devices led to some significant share price reductions for companies such as Facebook and King. Then in 2023, Apple announced further reductions in the availability of tracking data on iOS devices and also announced link tracking protection. So it would be more difficult to understand where exactly a user was coming from, which kinds of campaigns a user had clicked upon. And then Google, of course, kind of went back and forth with starting to restrict third-party cookies at the beginning of 2024. So we have these two major forces: state regulation, such as the GDPR from the EU, but then also, and I think this is even more important and played a bigger role, technological restrictions by companies such as Apple under Tim Cook. And then, as I just said, Google, under the CEO of Alphabet, Sundar Pichai, decided in April that they're actually gonna let the third-party cookie live on in Chrome, at least for now. So there's maybe some backtracking, and tracking data will be more available, or there is some sort of future for more tracking signals. Let's see what's gonna happen there, and of course, everybody in this space should keep an eye on that, because in the end, tracking data is quite valuable. But with this loss of tracking signal, as I said, we also realized that this whole emphasis on short-term initiatives and performance marketing led to a decrease in effectiveness, as we can see in these numbers provided by System1: with the increase in the share of short-term marketing activities, the effectiveness, measured by the average number of very large business effects that marketing activities drive, actually decreased.
And one very prominent example of this imbalance between investing into performance versus brand activities was Adidas, who did an econometric modeling exercise with which they found out that the 23% of the marketing budget that went into brand activities was actually responsible for 65% of the revenue, whereas the 77% of the marketing budget that went into performance advertising activities was only responsible for 35% of the revenue. So a huge imbalance in terms of the budget allocation, specifically if you look at Les Binet's recommendation of investing 60% into brand and 40% into performance, and of course we'll reference this in a bit more detail later when we talk about how to measure brand equity. And I found it really interesting that Adidas was brave enough to talk about this openly, because I think a lot of marketing organizations are actually in this position where they rely too heavily on performance marketing, because it's so seductive to believe the last-click numbers in Google Analytics, which of course are not the whole picture, because they do not account for the impact that views on social media or display advertising have. They also completely disregard all non-trackable activities in offline marketing campaigns, and they do not account for all the non-marketing effects that we should incorporate. Nike is another prominent example often mentioned in the same vein, because they have been criticized for focusing too much on D2C and not investing enough in building the brand. One sort of exception to that is that Nike has now invested into being the sponsor of the German men's national football team, four-time world champion, just to mention that, and outbid Adidas by paying more than double what the German national team got before from Adidas.
So it's weird as a German to imagine that the German national team won't have the three stripes but the Nike swoosh, but I think it looked beautiful as well. So Nike was quite good in investing heavily into brand, in that regard at least. Here are some further numbers in terms of how prominent last click still is. This is from studies and research done by Meta: 78% use last click for campaign decisions, while 77% don't think last click is a good way to measure campaigns. So if you do the math, roughly speaking, 55% use last click but don't actually think it's a good idea. And that might be related to the 53% of business leaders that are the biggest believers in last click. And of course, it's hard to argue with your CMO or maybe even CFO that they should not believe in last click. Maybe it's because we as marketers, the sort of evangelists of that era, the multi-touch attribution evangelists, managed to convince the marketing managers to not believe last click, but now we need to convince the business leaders in the organizations to also not believe in last click. Again, last click seems very deterministic. It's a very simple and straightforward way to track marketing effectiveness, but of course not to measure it holistically. And some further numbers: 75% of global CMOs are unable to quantify and optimize their marketing, 35% do not measure marketing impact at a tactical level, and 37% do not have timely measurement in place. That's maybe the reason why 61% want better and faster media mix modeling. And one astonishing number is that more than two thirds, 68%, don't act on the insights consistently. So they invest all that money in getting a marketing mix modeling system in place, and of course also into potentially buying the solution of a vendor.
But then they don't act upon the insights, which basically makes all their investment useless. So why is it so seductive to believe the last-click numbers or the performance marketing measurement? It's because if you look at the AIDA funnel, going from awareness and interest to desire and action, the lower part of the funnel is so much easier to track. It's much easier to track a retargeting click, a direct type-in, or a branded search click, compared to measuring the effectiveness of an out-of-home advertisement or a print mailing. But that doesn't mean it's easier to measure, because just being able to track it doesn't mean that we are measuring it correctly. And also, the lower parts of the funnel usually don't drive that much incrementality if we compare them to the upper parts of the funnel, where we usually have high incrementality and maybe also reach the 95% of people that are not in the market currently. And that part of the funnel is much harder to measure, as we just discussed when looking at, for example, measuring out-of-home or print advertisement. So what is the solution? That is, of course, marketing intelligence, which will help us to measure and understand the effectiveness of our marketing activities holistically. According to Gartner, marketing intelligence is a category of marketing dashboard tools that help an organization to gather and analyze data to determine its market opportunities. And one thing that we at Funnel would really like to add to that definition is that it's not only about determining the market opportunities, but of course also about then acting upon these insights. So we don't just want to look at these opportunities, we actually really want to make the best out of them and realize them. And our idea is that, after we've generated these insights, we really need to drive forward collaboration, adoption, and activation. So that's the third point that we'd like to add to this definition by Gartner.
And it all starts with gathering high-quality data, making sure that we are really working with the correct data. Then we are able to apply advanced measurement on that data and generate these awesome insights, which we then have to act upon to really leverage the value that sits within them. So let's look at these steps in more detail. First of all, getting all the data into place. As we know as data people: garbage in, garbage out. If we have wrong and flawed data, data that is not mapped correctly, then no matter what AI or GenAI process you apply, you'll get garbage out. And also, as any data scientist knows, the integration, extraction, transformation, and loading of the data, the ETLing of the data, usually costs us 80% of the time in a data science project, and marketing measurement and marketing intelligence is a data science project. So it starts with unifying online and offline data, APIs and spreadsheets. We want to gather data from online marketing platforms: Google, Facebook, TikTok, etcetera. We want to include first-party data from the brand's data warehouses and backend systems. We want to include all offline marketing data: how much has been spent on different out-of-home campaigns, TV, and print mailings. Then we need to understand baseline effects and non-media effects. So we wanna know about specific events that are relevant to a brand's activities, such as sports events, but maybe also the back and forth with the tariffs of the US administration currently. We need to incorporate these into the models as well. And then we'd like to include further data such as the cost of goods sold, fulfillment cost, and, for example, also lifetime value and maybe even predictive lifetime value data. Because if you think about it, what we really want to optimize for is the lifetime value of a client, for an organization and a brand.
Then we want to store and source that data securely, normalize the data, manage the data quality, and combine it, all of that in the concept that we at Funnel like to call a data hub. Once we have all the data in place, we can go to the fancy data science and modeling part, which might be more interesting to some, where we, for example, might start with marketing mix modeling. So what is marketing mix modeling? Marketing mix modeling essentially tries to correlate and understand the interdependence between our marketing investments and how they drive sales, revenue, profits, and, ideally, lifetime values. And for that, we include data about our marketing investments for the different marketing activities, for example on a daily level or on a weekly level. So we might, for example, have the daily spends for Instagram and Google search ads. We might know what kind of discount campaigns and vouchers are in place on each day. This could, of course, also be on a weekly granularity level, but ideally we work with a daily granularity level. And then with that, we can understand the incremental impact that our marketing activities have on our sales. We'll understand what kind of sales would have happened anyway, for example because of our brand equity, or because of non-marketing effects such as specific events, PR activities, or overall demand driven by seasonality. And then marketing mix modeling allows us to estimate the incremental impact of all the marketing activities that we drove in the past. What's important is to account for the hierarchy of the different effects. So something such as seasonality is not only driving conversions directly by increasing the sales in our stores, for example, but it also increases the number of organic searches, branded searches, and direct visits.
And upper-funnel marketing activities such as TV campaigns also usually don't drive digital conversions directly, but they increase the direct visits to our website or the branded searches. Similarly, external factors such as specific events and seasonality, or interest rate changes for a financial institution, for example, might also increase the branded searches and organic searches, and of course maybe also direct hits. So it's quite important to account for this sort of hierarchy between the marketing and non-marketing effects. So if we've done all of this in our marketing mix modeling, it might seem that we have a pretty good system in place for measuring the impact of our marketing activities holistically and accounting for the incrementality. But, and this is a big but, marketing mix modeling has a small-data problem: it is inherently based on a very small amount of data. And I'd like to explain this in a bit more detail. Even if we look, as I explained, at daily-level data, so spend data, for example how much we spend on Instagram campaigns on a daily level, then we have for one year 365 data points and for two years hence 730 rows of data. And that is a very low amount of data for industrial and for any sort of practical data science application. If you do credit scoring or churn modeling or lifetime value prediction, you usually have a couple of thousands, if not a couple of hundred thousands or maybe even millions of data points, customers, examples, to do data science modeling on. And with this small amount of data, the issue is that we might have contradicting results. So we could have two equally good marketing mix models in terms of statistical fit, so they could have great R-squared values and very low mean absolute percentage errors, things like that, but in essence have contradicting results.
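To make the modeling idea and the small-data point concrete, here is a minimal marketing mix modeling sketch in Python. All of the numbers (spend levels, effect sizes, the adstock decay and saturation parameters) are invented for illustration; a real MMM would use far more careful priors, validation, and robustness checks than this toy regression:

```python
import numpy as np

def adstock(spend, decay=0.5):
    """Carry over a fraction of each day's media effect into following days."""
    out = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for i, s in enumerate(spend):
        carry = s + decay * carry
        out[i] = carry
    return out

def saturate(x, half_sat):
    """Diminishing returns: simple Hill-style saturation curve."""
    return x / (x + half_sat)

rng = np.random.default_rng(0)
days = 365                               # one year of daily data = only 365 rows
tv = rng.uniform(0, 100, days)           # simulated daily TV spend
search = rng.uniform(0, 50, days)        # simulated daily search spend
baseline = 200                           # sales that happen anyway (brand, seasonality)
sales = (baseline
         + 80 * saturate(adstock(tv, 0.6), 50)
         + 40 * saturate(adstock(search, 0.2), 20)
         + rng.normal(0, 5, days))       # noise

# Fit a linear model on the transformed media variables.
X = np.column_stack([np.ones(days),
                     saturate(adstock(tv, 0.6), 50),
                     saturate(adstock(search, 0.2), 20)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"baseline~{coef[0]:.0f}, tv effect~{coef[1]:.0f}, search effect~{coef[2]:.0f}")
```

With 365 rows this recovers the simulated effects reasonably well, but with real, noisier data and many more channels, several quite different parameterizations can fit almost equally well, which is exactly the contradicting-results problem described above.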
So one telling you that you should invest more into TikTok, and another marketing mix model telling you to invest less into TikTok. And that's something that every data science practitioner, every marketing mix modeling practitioner, will confirm. Of course, there are good and required ways to make the marketing mix modeling more robust, but it's something that you need to account for if you think about just using one of the open-source marketing mix modeling solutions or software packages out there. And then also with marketing mix modeling, because of this low amount of data, you have a lack of granularity and with that a lack of actionability. To really act upon the insights of an MMM, it's not enough to just see that, oh, I should maybe invest more into TikTok and less into Google; you need to know exactly which Google campaigns you should pull money out of and which TikTok campaigns you should invest that money into. And campaign-level insights are usually something that an MMM doesn't provide. There are a couple of tricks that you can do to get an MMM to provide campaign-level insights, but they usually don't increase the robustness and reliability of the insights. So it's a tricky thing, and that's why we need integration with testing and MTA. We cannot do MMM in isolation; we need to integrate it with testing and multi-touch attribution. So let's look into what incrementality testing is. Incrementality testing, roughly speaking, is based on having a marketing group and a control group, so a test group and a control group. The test group is being exposed, for example, to your marketing activities and the control group isn't. And then you can compare these two groups: for example, you have $50 worth of sales per exposed customer in the marketing group, and the control group only has $42 of sales per control customer.
So the incremental impact of your marketing activity is hence $8. That's a very simplified view on it. What can be done in addition to that, and what's nowadays kind of state of the art, is that we do GeoLift tests with synthetic controls. So, for example, we can divide up Germany into different areas and states, where in the test group we expose the regions to the marketing activity. And then the control group, the green area here, is not really our control group, but it's being used to build a synthetic control group. We use the unaffected regions to forecast what would have happened in the affected regions. So we try to understand what ultimately would have happened in the test regions if we had not exposed them to the marketing activities. The benefit of this is that we don't need user-level data, because access to user-level data for more holistic and complex tests is usually not easily available. We can, of course, do a test within Google or within Facebook, which has its own limitations and things we need to worry about, and which will allow us insights into, for example, whether a specific Google Ads campaign works better than another one, or than not running that Google Ads campaign at all. But to test, for example, a combination of a Google and a Facebook campaign together, or campaigns across different marketing sources and channels, we need to resort to something such as geo lift testing. And what you see here in detail could be the result of such a GeoLift test, where we, for example, see the original time series, so the sales per date. And then you see this dotted area here in the original time series, which shows us what we would have expected based on the control group, or the synthetic control group: what would have happened if we had not exposed the test group to the marketing activity.
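As a toy illustration of the synthetic-control idea behind such a GeoLift test, the sketch below simulates a few unaffected control regions, fits weights on the pre-campaign period so the weighted controls track the test region, and then uses those weights to forecast the counterfactual. The region count, the simulated 12-sales-per-day lift, and the noise levels are all made up for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
days = 90
# Daily sales for three unaffected control regions (shared weekly seasonality).
controls = np.stack([100 + 10 * np.sin(np.arange(days) / 7) + rng.normal(0, 2, days)
                     for _ in range(3)])
# The test region moves with the controls plus its own noise.
test = controls.mean(axis=0) + rng.normal(0, 2, days)

campaign_start = 60
test = test.copy()
test[campaign_start:] += 12          # simulated true lift of 12 sales/day

# Fit weights on the pre-period, then forecast the counterfactual
# ("what would have happened without the campaign").
pre = slice(0, campaign_start)
w, *_ = np.linalg.lstsq(controls[:, pre].T, test[pre], rcond=None)
counterfactual = w @ controls

lift = (test[campaign_start:] - counterfactual[campaign_start:]).mean()
print(f"estimated incremental lift ~ {lift:.1f} sales/day")
```

The dotted line in the slide corresponds to `counterfactual` here, and the gap between it and the actual test-region series after `campaign_start` is the measured incrementality.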
And then with that, of course, we see the effect of the intervention, and we can also look at the cumulative impact that the marketing activity had. And then the third component: so we spoke now about marketing mix modeling and incrementality testing, geo lift testing in particular. The third pillar of our triangulation framework is digital attribution. And digital attribution should be based on data-driven attribution. We don't wanna rely on last-touch or first-click attribution. We also don't wanna look into static multi-touch attribution. So we don't wanna use linear, position-based, time-decay, or any similar methodology, because they are static: they don't adapt to specific users' preferences and attributes. And secondly, they also do not account for all of the non-converting marketing touchpoints, which is quite essential, because we don't want an inherent bias in the measurement and attribution framework. So think of it like this: if you consider last click wrong because it's like only paying the goal scorer in a football game, and I'm talking about actual football, not American football, then static attribution, such as linear and time-decay attribution, is wrong because we might only look at all the successful passes that a midfielder has played and not at all the unsuccessful ones, which we need to do to really understand the conversion rate, or the effectiveness and efficiency, of a specific player. And that's what we do with data-driven attribution. With data-driven attribution, we also incorporate all the marketing journeys, so all the user journeys, that have not ended up in a conversion: all the non-successful user journey touchpoints. We can then use this data to calculate conversion probabilities. So here we have an example. We have two types of user journeys. One where we have a Google ad click and a Facebook ad click and then a retargeting click.
And we have a 6% conversion probability. So if we imagine we have a thousand of these journeys, we might have 60 conversions within the thousand journeys. And then we have another thousand journeys where we only have a Google ad click and a retargeting click. Here we only have 40 conversions within these thousand journeys, so the conversion probability of this journey is only 4%. And that means that we have an uplift of 50%, from 4% to 6%, and that is then mostly due to the Facebook touchpoint that we have been tracking here. We do all of this data-driven multi-touch attribution by accounting for the sequence and time differences between the touchpoints. There might be a different conversion probability, or different incremental impact, if the Facebook touchpoint happened before the Google Ads click, which is quite important, and of course also if the time difference between the Google Ads and the Facebook touchpoints were maybe one week instead of one day. This is fully automated. We use a specific type of deep learning algorithm for this, long short-term memory networks, and the models are updated on a daily basis, so the reports are updated on a daily level. Now we have everything in place. We have marketing mix modeling, which is great at capturing non-marketing effects and offline channels, but as we discussed, it's inherently limited to a very small amount of data, which comes with significant problems regarding granularity and actionability, and it might also provide contradicting results. Digital attribution is great at getting actual data on click and order level, but of course, it's not holistic. It doesn't include view-level data, it doesn't include non-trackable data, we have this huge loss of signal, and it's mostly for online activities only.
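The uplift arithmetic from the journey example can be written out directly. The journey counts and conversion numbers below are the ones from the talk (1,000 journeys per path, 60 versus 40 conversions); the dictionary structure itself is just illustrative:

```python
# Conversion outcomes per observed journey type, including
# the journeys that did NOT convert (the "unsuccessful passes").
journeys = {
    ("google", "facebook", "retargeting"): {"count": 1000, "conversions": 60},
    ("google", "retargeting"):             {"count": 1000, "conversions": 40},
}

def conv_prob(path):
    j = journeys[path]
    return j["conversions"] / j["count"]

with_fb = conv_prob(("google", "facebook", "retargeting"))   # 0.06
without_fb = conv_prob(("google", "retargeting"))            # 0.04

# Relative uplift attributable to the extra Facebook touchpoint.
uplift = (with_fb - without_fb) / without_fb
print(f"uplift attributable to the Facebook touchpoint: {uplift:.0%}")  # 50%
```

A production data-driven attribution model additionally conditions on touchpoint order and time gaps, as described above, rather than just comparing aggregate path-level conversion rates.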
And incrementality testing is great to really prove causality between marketing activities and increases in sales, or drops in sales, hopefully not, but increases in our business KPIs. But it comes at a high opportunity cost. If our marketing activity works, then the group that has not been exposed to our marketing activities potentially loses out on revenue increases, profits, or both. And it also provides results in a very slow manner: incrementality tests and geo lift tests usually have to run for a couple of weeks, if not more than a month. So we are going to combine these methodologies in something that we call triangulation. We want to mitigate the weaknesses of each of these methodologies and really leverage the strengths that each of these approaches has. And the way we do that is, for example, we use marketing mix modeling to provide a good overview of the incremental conversions of a channel such as Meta overall. Then we can use digital attribution, or in-platform attribution, to provide campaign-level performance data for Meta. So we might, for example, understand that the split between a Meta brand and a Meta prospecting campaign is 30/70. We can use that information to get more granular insights from the marketing mix modeling. And then the marketing mix modeling might provide us really good insights into what the next test should be that we drive, in terms of how certain we are about the ROAS or CPA of a specific channel. And that allows us to inform our GeoLift testing strategy, or overall testing strategy. And then, of course, incrementality testing provides us a kind of ground truth for specific channels, such as YouTube or Google Ads, that we can then incorporate into the model. To be able to realize this system, we work with something that we like to call an AI-based multi-objective optimization system. And I'm not going to go into all of the details here.
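As a small worked example of this triangulation step, assuming hypothetical numbers (2,000 incremental conversions for Meta from the MMM, and a 30/70 brand/prospecting split from in-platform attribution), the disaggregation is just a share-weighted split:

```python
# MMM output: channel-level incrementality (hypothetical figure).
mmm_meta_incremental = 2000

# In-platform attribution: campaign-level shares within the Meta channel.
campaign_share = {"meta_brand": 0.30, "meta_prospecting": 0.70}

# Triangulation: use the platform shares to disaggregate the MMM result
# into campaign-level incremental conversions.
campaign_incremental = {c: mmm_meta_incremental * s
                        for c, s in campaign_share.items()}
print(campaign_incremental)
```

This is, of course, a simplification: it assumes the in-platform split is a fair proxy for the split of true incrementality, which is itself something a geo lift test can later confirm or correct.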
There's a blog article that I'll mention in a bit. But in essence, what we do is we run through thousands of iterations of different models, adjusting the hyperparameters. And we are then not just optimizing for the statistical KPIs, such as MAPE, root mean squared error, or R-squared, but also for our benchmark and business KPIs. So with each of these thousands of marketing mix models, we are able to calculate the attribution results: how many conversions, how much revenue should be allocated to each marketing channel. And with that, we then are obviously also able to calculate the ROAS and CPA values, because we have the cost data. And we can compare that to, for example, results from incrementality tests, from data-driven multi-touch attribution, from platform attribution, or maybe even from surveys. And then we select, or filter for, the models that optimize the trade-off between benchmark and machine learning metrics, while going through these thousands of iterations, before we arrive at the final marketing mix modeling and attribution insights. For that, we use quite sophisticated AI infrastructure that's also being used by leading AI companies. It's an open-source framework called Ray, which is maintained by the people at Anyscale. We also published a blog article on the website, and I'll invite you to read more about that. It's quite interesting, and Ray is definitely a framework to be recommended if you have more advanced machine learning model training requirements. So now we've talked about all these beautiful things, how we can build the best triangulation framework and come up with good, holistic insights into how we can really drive revenue and profit and maybe lifetime value improvements. But one thing that we really need to account for is that we've so far talked more about short- to mid-term optimization; we also need to understand the impact of our marketing activities on brand building.
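The system described in the talk runs on Ray; as a dependency-free illustration of the multi-objective idea, here is a plain-Python random search over hypothetical hyperparameters, scoring each candidate model both on a statistical error and on its gap to external benchmarks. Both objective functions are stand-ins invented for this sketch, not Funnel's real scoring:

```python
import random

random.seed(42)

def statistical_error(params):
    """Stand-in for a model's fit error (e.g. MAPE) for given hyperparameters."""
    return abs(params["decay"] - 0.5) + abs(params["half_sat"] - 60) / 100

def benchmark_gap(params):
    """Stand-in for disagreement with external benchmarks (lift tests, attribution)."""
    return abs(params["decay"] - 0.4) + abs(params["half_sat"] - 80) / 100

# Thousands of candidate models with different hyperparameters.
candidates = [{"decay": random.uniform(0, 1), "half_sat": random.uniform(10, 150)}
              for _ in range(5000)]

# Score every candidate on BOTH objectives, then pick the best trade-off.
scored = [(statistical_error(p) + benchmark_gap(p), p) for p in candidates]
best_score, best_params = min(scored, key=lambda t: t[0])
print(best_params)
```

The selected model lands between the optima of the two objectives, which is the point: rather than the statistically best-fitting model alone, you keep the one that also agrees with the ground truth from testing and attribution. Ray Tune parallelizes exactly this kind of search across many workers.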
Because in essence, a short-term focus on sales activation only increases sales, revenue, and profits in a sort of short-peak manner, as you can see here with this blue line, where there is not really a huge level shift in terms of the overall sales uplift compared to our baseline. To really get the best from our short-term and mid-term marketing activities, we also need to invest into brand. And that's something that I already talked about at the beginning. And of course, with the system that I just introduced, with the triangulation framework, we'll have a much better understanding of our upper-funnel marketing activities. But we also want to understand how much of our TV investment this week, for example, increases sales within maybe six months or twelve months. And to really understand that, we need to understand how our marketing activities increase brand activity and brand equity. Because brand building is quite important, as we can see from this chart here, and this is from the famous book The Long and the Short of It by Les Binet and Peter Field, where they essentially argue for a balance of 60% of investment into brand activities and 40% into more short-term performance marketing activities. By the way, I don't really like this distinction between brand and performance, because in the end, all marketing, of course, needs to perform. And there are these sort of merged terms such as brandformance or performance branding, whatever. But in the end, all marketing needs to perform, all marketing needs to increase sales and revenue. Marketing is not an end in itself.
So with these brand building activities, we will actually manage to get out of this saturation in the lower parts of the marketing funnel, where we each try to outbid our competitors on the same generic keywords and pay very high click prices. Because we have built our brand, we benefit from higher user intent; we manage to leverage the awareness and recognition of our brand with each user that dives into the marketing funnel. So how do we do this? How are we able to improve our brand equity and brand recognition? Well, first of all, we need to be able to measure what the impact of our media investments is on brand equity. Brand equity is quite an elusive concept, and we'll talk about how to measure it. One way is, for example, to use share of search and Google Trends data. Then we can build marketing mix models which help us to understand what the impact of our media investment is on increased brand equity. For that, we also need to include all the contextual variables that we talked about a bit earlier, such as macroeconomic factors, volatility indexes, tariff changes, interest rate changes, and specific events: sports events, world championships, etcetera. And once we are able to create robust models that explain the impact of our media investments on brand equity, we can use the same model and the same approach to understand the impact that media has not only directly on sales, but also on sales through increased brand equity. That really helps explain the impact that media spend has on sales in the long run, within maybe six months' time. And it will answer the question: how much of our TV spending this week will increase sales in six or twelve months? So, as I already said, brand equity is an elusive concept. How do we actually measure it?
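The two-stage idea, media driving brand equity and brand equity in turn driving sales, can be sketched in a few lines (a deliberately minimal illustration with made-up data and single-variable regressions; a real model would include adstock, saturation, and the contextual variables mentioned above):

```python
# Stage 1 explains brand equity (e.g. a share-of-search index) from media
# spend; stage 2 explains sales from brand equity. Media's long-run, indirect
# effect on sales is then the product of the two slopes.
def ols_slope(x, y):
    """Single-variable least-squares slope."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

media = [10, 20, 30, 40, 50, 60]                 # made-up weekly spend
brand_equity = [1.1, 2.0, 3.1, 3.9, 5.2, 6.0]    # made-up share-of-search index
sales = [105, 122, 138, 155, 171, 188]           # made-up weekly sales

b1 = ols_slope(media, brand_equity)   # media -> brand equity
b2 = ols_slope(brand_equity, sales)   # brand equity -> sales
indirect_effect = b1 * b2             # media -> sales, via brand

print(f"media->brand: {b1:.3f}, brand->sales: {b2:.3f}, "
      f"indirect media->sales: {indirect_effect:.3f}")
```

The point of the decomposition is exactly the question in the talk: how much of this week's TV spend shows up as sales months later, flowing through brand equity rather than through an immediate response.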
And this is just a sort of motivation to explain why share of search can actually be used. We have brands that also work with, for example, social media mentions and direct traffic included in their proxy for brand equity measurement. But here's a good illustration, which shows that relative increases in share of search, on the x axis, correlate actually quite nicely with increases in market share. This is, of course, not 100% scientific proof that share of search is the best indicator of brand equity and market share. But it does show that share of search should be included when building a brand score or brand equity measure, and we have had good experiences with that. So that was the more or less technical part of the webinar: what needs to be put in place to solve the technological challenge when it comes to marketing intelligence. But one thing that I'd really like to talk about is that, with all this hype around AI, we also need to understand that different types of businesses need different approaches to implementing systems with as much complexity as a marketing intelligence solution. You might have a D2C business model, which is more straightforward and has a high data maturity, where we can rely more on a self-service approach. And then we have maybe more complex business models with omnichannel activities, so they need to understand how online marketing activities drive footfall in their stores. They might have a lower data maturity, or more complex data structures in place, because they have to create data such as the footfall into the different stores. And that, of course, requires a more consultative approach, which any vendor or solution provider in that space should provide. As I said, artificial intelligence alone isn't enough to really foster collaboration, adoption, and activation.
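As a rough sketch of how share of search is computed and then compared against market share movements (all numbers here are invented for illustration; real inputs would come from Google Trends or search volume data):

```python
# Share of search as a brand-equity proxy: a brand's search volume divided by
# the category total, then checked against how market share moves with it.
def share_of_search(volumes):
    total = sum(volumes.values())
    return {brand: v / total for brand, v in volumes.items()}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

searches = {"brand_a": 5200, "brand_b": 3100, "brand_c": 1700}
print(share_of_search(searches))  # brand_a holds 0.52 of category searches

# Invented year-over-year changes in share of search vs. market share:
sos_change = [0.04, -0.01, 0.02, -0.03, 0.05]
market_share_change = [0.03, 0.00, 0.01, -0.02, 0.04]
print(f"correlation: {pearson(sos_change, market_share_change):.2f}")
```

A high correlation on real data is the kind of evidence the chart in the talk presents: suggestive support, not proof, that share of search belongs in a composite brand equity measure.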
We need real human intelligence to maximize the value that these marketing intelligence solutions provide. The technical solution is only one piece of the puzzle, really, and teams must adopt and collaborate to activate the AI insights. So the solutions, first of all, must make it really easy to understand the insights: how to interpret them and how they have been generated. There's a huge educational task there. But the solution should also give stakeholders enough transparency to understand how the algorithms work. There should be no black-box approach, and no argumentation along the lines of "we have the most PhDs in our organization; only smart people will understand this." It needs to be broken down in really simple terms, so that everybody can understand the basics. So please feel free to ask your vendor enough "why" questions, so that you feel comfortable that you have really understood the basic principles of the marketing intelligence solution being put in place. And we have been proven to really increase media-driven sales for our clients. We have lowered CPO, and we have helped businesses increase ROAS in specific channels, for example by recognizing higher social media ROAS for FlixBus. We've shown a brand that their TV ROAS was actually 256% higher than previously thought, which allowed them to grow significantly through TV in the future. And we showed a global fashion brand that they had been able to increase revenue by 77% while maintaining a constant blended CPA across all marketing activities. That matters because, usually, if you invest that much more into marketing to drive significantly higher revenue, you end up with a much higher blended CPA, or a lower ROAS. These are the cases that prove this actually drives value in a significant way.
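The blended CPA point is easy to see with a quick worked example (the numbers below are hypothetical, chosen only to mirror the 77% revenue growth figure; blended CPA is simply total spend divided by total conversions across all channels):

```python
# Hypothetical before/after illustration: revenue grows 77% while blended
# CPA stays flat, because spend and conversions scale in proportion.
def blended_cpa(total_spend, total_conversions):
    return total_spend / total_conversions

before = blended_cpa(total_spend=500_000, total_conversions=10_000)  # 50.0
after = blended_cpa(total_spend=885_000, total_conversions=17_700)   # 50.0

revenue_before, revenue_after = 1_000_000, 1_770_000
growth = revenue_after / revenue_before - 1
print(f"CPA before={before:.0f}, after={after:.0f}, revenue growth={growth:.0%}")
```

The usual pattern is diminishing returns: conversions grow more slowly than spend, so blended CPA creeps up. Holding it flat while scaling is what makes the case notable.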
But of course, it's not about generating one-time insights, because marketing measurement, marketing intelligence, should be seen as an evolutionary process on a road to steadily improving marketing activities holistically. We work with very different types of brands: major omnichannel businesses and major brands such as Montblanc and Douglas, and some of the biggest publishers in Europe, such as Axel Springer and Bild, and we are very proud of our clients. With data integration and campaign reporting, we have some of the major agencies working with our solution, as well as some of the biggest brands worldwide. We have more than 2,500 customers globally. We process more than 10 terabytes of data every day. We have more than 100,000 users working with our solution, and we track $70,000,000,000 in digital ad spend, which is, roughly speaking, more than 10% of the overall digital ad spend worldwide. And we have offices in Boston, Hamburg, Stockholm, City, and Dublin. With that, I'd like to thank you so much for your attention. I invite you to connect with me on LinkedIn, and feel free to reach out if you have further questions that might not have been answered in the Q&A. Thanks so much for joining.