The AI Revolution in Brand Advertising
Show notes
In this episode, I'm joined by George London, CTO at Upwave. We talk about how AI is reshaping the brand advertising landscape and making measurement more effective. George shares insights into Upwave's AI analyst, Bayes, and how it's helping advertisers understand campaign performance. He also discusses the challenges of scaling an engineering team and the importance of a strong cultural fit. Plus, George gives us a peek into his approach to software architecture and design decisions.
Full transcript
So welcome back to the podcast. Today we're joined by George London. George, welcome to the podcast. Happy to be here, thanks for having me. Why don't we start, George, by you telling us a little bit about yourself and what you're up to? Yeah, so I'm the Chief Technology Officer at Upwave. We are the brand outcomes measurement company, which means that we help large advertisers understand the impact and effectiveness of their brand advertising campaigns and help them get the best value for their money. Nice. And if you had to say, what kind of trend are you currently observing in brand measurement right now? Probably like many other industries, AI is completely reshuffling the deck in everything to do with marketing and advertising. Upwave is investing very heavily in using AI to make measurement more effective and more understandable. We actually just announced the release of our AI analyst, Bayes, named after the Reverend Thomas Bayes, the founder of Bayesian statistics, which helps advertisers understand the performance of their campaigns and how to make their campaigns more effective. Brand advertising can be a pretty complicated field. For those not familiar, brand advertising is any kind of advertising that's not intended to drive a click or a conversion in the short term. So it's things like car commercials that are not trying to get you to jump off your couch and buy a car today, but instead trying to make sure that next time you're in the market for a car, you remember that, say, Toyota exists and you consider a Toyota for your next car. Toyota's advertising, for example, is very complex, across many different channels, many different ad formats, many different creative messages, and so on. For customers to understand the effectiveness of these complex multimillion-dollar campaigns can take a fair amount of statistical and quantitative fluency. So AI really helps us dial in on what's most important and tell a clear and coherent story to our customers that makes the data really understandable and actionable.
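(Editor's aside, for readers unfamiliar with the analyst's namesake: Bayes' theorem, the rule at the heart of Bayesian statistics, describes how to update the probability of a hypothesis $H$ after observing data $D$:

$$
P(H \mid D) = \frac{P(D \mid H)\,P(H)}{P(D)}
$$

In measurement terms, it's the machinery for revising beliefs about what's driving campaign outcomes as new data comes in.)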
Great, great. Thank you for the explanation. Is there any other specific use case of AI that you could share with viewers that is very beneficial for you guys? Yeah, I'd say we're pretty all in on using AI for development of our software internally. Our whole team has access to Claude Code and makes heavy use of it, and I think it's pretty materially accelerated our development efforts and meant that we can be a lot more ambitious with the types of products and features and integrations that we offer. It gets a lot easier to say yes to people's wild and crazy dreams and ambitions when we can build almost anything we want at a prompt. We still have to apply rigorous engineering discipline to make sure that the quality and reliability are there, but getting at least to that first draft of a new feature or product has gotten dramatically faster for us. Is there any interesting technical challenge you've tackled recently at Upwave?
I'd say building this AI analyst has been a pretty interesting technical challenge. It's about using the technology effectively, using AI itself to produce a fluid conversational interface. And it's not just giving answers, it's giving correct answers quickly. People are making multimillion-dollar decisions about how to spend their advertising money on the basis of the measurement results that we provide. So it's super important that when the AI answers a question about what worked best, the answer is correct and believable, not hallucinated. But it's also important that it be quick and low latency, because people don't want to stare at a thinking spinner for minutes at a time. Finding a way to deliver this fluid experience in a way that is both rigorous and capable and useful has required us to really think carefully about how to put together all the components of the modern agentic AI stack. And then to think about things like how to evaluate quality: we've built an internal evaluation framework to really understand how the analyst performs in different scenarios, interacting with different personas, so that every time we make a change or an improvement, we know it actually is an improvement, not just on the specific thing we're trying to improve but on the overall performance of the agent.
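(Editor's aside: George doesn't describe Upwave's evaluation framework in detail, but a minimal sketch of the regression-style eval he alludes to might look like the following. All names here are hypothetical, and the agent is a canned stub standing in for the real agentic stack.)

```python
# Toy agent-evaluation harness: each case pairs a question with facts the
# answer must contain, and the suite reports the fraction of cases passed.
from dataclasses import dataclass

@dataclass
class EvalCase:
    question: str
    required_facts: list[str]  # substrings the answer must contain

def answer(question: str) -> str:
    # Hypothetical stand-in for the real AI analyst; a production harness
    # would call the live agent here.
    canned = {
        "Which channel performed best?": "CTV drove the largest lift in awareness.",
        "Did the campaign move consideration?": "Consideration rose among exposed audiences.",
    }
    return canned.get(question, "I don't know.")

def run_evals(cases: list[EvalCase]) -> float:
    """Return the fraction of cases where the answer contains every required fact."""
    passed = 0
    for case in cases:
        response = answer(case.question)
        if all(fact in response for fact in case.required_facts):
            passed += 1
    return passed / len(cases)

cases = [
    EvalCase("Which channel performed best?", ["CTV", "awareness"]),
    EvalCase("Did the campaign move consideration?", ["Consideration"]),
]
print(run_evals(cases))
```

Running a suite like this before and after every change is what makes "we know it actually is an improvement" checkable rather than a judgment call: a regression anywhere in the suite shows up as a drop in the pass rate.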
Is there any approach that you use when you make architecture or design decisions in your role as CTO? I believe that quality matters with software. I remember my fourth-grade teacher had a sign on his desk that said, "If you don't have time to do it right, how are you going to have time to do it again?" That's why I think it's worth taking the time upfront to really plan out what you're doing, to break it down into subcomponents, to think about the corner cases and what might go wrong, where it might break down at scale, and so on. Doing the upfront thinking and planning to answer those questions means that it's more likely that the first version of something you build is going to be the correct one. As a result, it's pretty rare that we have to rebuild or rework something, because we really put the emphasis on getting it right the first time. What's been the biggest challenge in scaling your engineering team so far?
Making sure that the team has both a high talent bar and a high cultural bar is always a challenge. I really believe that software engineering is a team sport, and we make sure to have a culture on our team that is both rigorous and supportive. We want to make sure that we are doing very high-quality engineering all the time, and that we have a free-thinking and open environment where the best ideas are put forward and the best ideas win regardless of who promotes them, but also where people interact with each other respectfully and supportively, so that Upwave is a place where people want to show up for work every day and want to interact with their colleagues. So finding just the right people, who have both that high talent and high fluency in the human side of engineering, requires us to be very thoughtful about how we grow the team. Is there anything specific that you look into when hiring a new software engineer for your team?
I mean, it's always the traditional things: people have to demonstrate that they know how to program and know how to think through and architect systems. Although it's an interesting open question, in the era of AI, exactly how you should best go about letting people demonstrate their core capabilities. But on top of that, I really want to see a track record of substantial success at some point in the past. My first job was at a hedge fund, and my first role was putting disclaimers on slides that said "past performance is not necessarily indicative of future results." But at least with humans on teams, I've found that past performance typically is indicative of future results. So people who have naturally found their way to leadership on teams, because they've been the type of person people look to to answer the hardest questions or deal with the hardest problems: if they've done that in the past, they'll typically do it again. We also have a rigorous cultural section to our interview, where we ask about times when people made mistakes in the past, or times when they received feedback that led them to change their behavior. We're looking for humility and self-awareness, because, again, we want people who can both propose great ideas and also hear great ideas from other people, and accept those ideas when they're better than the ones they have themselves.
That's great. What's one thing that you would change about the brand measurement industry as of now? I would encourage people to wake up a bit more to the impact that AI is going to have on the space. I've talked a little bit about the impact AI is already having on measurement, but consider the entire process of delivering advertising: planning campaigns; creating what we call the creative in the industry, which is the actual video or JPEG asset; the planning of where that asset is going to be shown, which we call the media planning process, deciding, say, that 30% of the money will be spent on Facebook and 40% on TV and so on; and then the day-to-day work we call trafficking, which is actually making sure that the right ads show up in the right place. Each of those processes traditionally has required a lot of human attention and labor, but those are all things that AI is already very capable of accelerating, and it's just going to get better. So I think that people who really care about performance and getting the most impact for their money should be leaning into thinking about end-to-end AI-powered advertising, where you give the AI a goal, either to drive a specific amount of sales or, in the case of brand advertising, to drive awareness or consideration, and then let the AI work backwards from that goal to how to accomplish it most effectively. That kind of back-propagation of results to strategy is something that deep learning has been shown to be extremely good at. Anybody who thinks that that's not going to be how most advertising is done in the relatively near future is probably a little bit head-in-the-sand at this point.
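(Editor's aside: for readers new to the terminology, a media plan like the one George sketches is essentially an allocation of the campaign budget across channels. A toy illustration, with all figures hypothetical:)

```python
# A media plan as a budget split: channel shares must sum to 1, and each
# channel's dollar spend is its share of the total budget.
budget = 10_000_000  # total campaign budget in dollars (hypothetical)

media_plan = {
    "Facebook": 0.30,  # the 30% Facebook share from George's example
    "TV": 0.40,        # the 40% TV share from George's example
    "CTV": 0.20,
    "Audio": 0.10,
}

# Sanity check: the shares must cover exactly the full budget.
assert abs(sum(media_plan.values()) - 1.0) < 1e-9

spend = {channel: budget * share for channel, share in media_plan.items()}
for channel, dollars in spend.items():
    print(f"{channel}: ${dollars:,.0f}")
```

The end-to-end AI picture George describes would treat shares like these as decision variables, adjusted automatically based on measured results rather than set by hand in a planning meeting.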
Okay, great. I think that's a good conclusion. Wrapping this up, George, I'd like to ask: for listeners who want to know more about what you guys are building, or just want to chat with you, where can they go? They can go to upwave.com and learn all about the services and products that we offer, and submit a contact form, and we'll be in touch soon to tell you how Upwave can help. Great. George, thanks for joining the podcast, and we'll see you in the next one. Awesome, thanks for having me.