Most DTC brands are making million-dollar channel decisions based on attribution data that's fundamentally wrong. Olivia Kory — CSO of Haus and the incrementality expert Brett references on stage more than almost anyone — breaks down what it actually takes to know if your ads are working. Spoiler: if you've been writing off YouTube based on MTA, you owe yourself a retest.
Inside the episode:
- Why YouTube's true ROAS is 3.4X what the platform reports — and how Haus's 190-test study across 74 brands proved it (plus why your D2C numbers alone are only half the picture)
- The right time to start incrementality testing — it's not when you're huge, it's when your business gets complicated enough that turning off ads won't give you a clean answer
- How StockX went from barely spending on YouTube to making it their #2 acquisition channel — by running geo holdout tests and acting on the results
- Why Meta's optimization might be too good — and how brands like Jones Road are improving their iROAS by making changes that look worse in-platform
- The surprise winner: AppLovin — Olivia came in skeptical of mobile game ad inventory and got data she didn't expect
- Haus's new DTC Basics tier — a lower-cost entry point so more brands can stop guessing which channels are actually driving growth
—
Sponsored by OMG Commerce - go to (https://www.omgcommerce.com/contact) and request your FREE strategy session today!
—
Chapters:
[00:00] Intro clip — Olivia on treating incrementality as a report card vs. a growth tool
[00:22] Introductions & background — Olivia's path from Starcom → TubeMogul → Netflix → Quibi → Sonos → Haus
[06:55] What is incrementality? — The randomized controlled trial analogy; geo holdouts vs. click-based attribution
[10:45] When should a brand start using incrementality? — The low-to-mid 8-figure inflection point; multi-channel complexity as the signal
[15:34] Native platform lift studies (Meta & Google) — Are they worth it? Signal loss, CAPI, iOS 14.5 limitations
[17:25] Geo holdout vs. user-level testing — Why Haus was "born out of the ashes of iOS 14.5" and went all-in on geo
[19:37] How a Haus geo holdout test actually works — Data ingestion, experiment design, market matching, results
[23:20] Actioning on incrementality data — Coaching leadership, making reallocation decisions, improving channel performance over time
[26:02] How long should you run a test? — Why 2-week YouTube tests fail; 4–6 week minimums and the role of consideration cycles
[27:19] Incrementality as an optimization loop, not a report card — Connor from Ridge, Cody from Jones Road Beauty, and the StockX story
[30:42] Key metrics defined — iROAS, iCPA, incrementality factor, and why in-platform ROAS can mislead you
[32:47] Branded search — Is it incremental? Simple Modern's 5% read, when Amazon bidding on your terms changes the math
[35:57] Treatment window & post-treatment window (PTW) — How Haus structures tests for YouTube, Meta, and CTV; lagged effects explained
[39:36] Consideration cycles & post-purchase surveys — Why your path-to-purchase report is probably shorter than reality
[41:00] Halo effects: Amazon & retail — Why omnichannel brands that only measure D2C are understating YouTube's impact
[41:58] The Haus YouTube study findings — The 3.42x incrementality factor; halo effects that doubled lift when Amazon/retail pulled in; Demand Gen vindicated
[44:10] YouTube vs. Meta: how the channels differ incrementally — Meta's short payback window, the "too good at intent" problem, and why YouTube wins on halo effects
[46:53] Surprises from the data — YouTube (not surprising to Olivia), AppLovin (very surprising), and why TV results swing wildly based on inventory type
[50:16] The biggest levers to improve incrementality — Creative first (30–50% wins), then account structure, traffic composition, and spend level
[51:46] A DemandGen campaign running on Gmail — A real audit story and why traffic composition can make a channel look broken when it isn't
[53:13] Haus's new DTC Basics tier — A lower-cost entry point to measure D2C and Amazon across core ad channels
[54:54] Wrap-up & where to find Olivia — Part two teased around the next Haus YouTube report
—
Connect With Brett:
- LinkedIn: https://www.linkedin.com/in/thebrettcurry/
- YouTube: https://www.youtube.com/@omgcommerce
- Website: https://www.omgcommerce.com/
- Request a Free Strategy Session: https://www.omgcommerce.com/contact
Relevant Links:
- Olivia's LinkedIn: https://www.linkedin.com/in/olivia-kory-50230812/
Past guests on eCommerce Evolution include Ezra Firestone, Steve Chou, Drew Sanocki, Jacques Spitzer, Jeremy Horowitz, Ryan Moran, Sean Frank, Andrew Youderian, Ryan McKenzie, Joseph Wilkins, Cody Wittick, Miki Agrawal, Justin Brooke, Nish Samantray, Kurt Elster, John Parkes, Chris Mercer, Rabah Rahil, Bear Handlon, JC Hite, Frederick Vallaeys, Preston Rutherford, Anthony Mink, Bill D’Allessandro, Stephane Colleu, Jeff Oxford, Bryan Porter and more
Transcript:
Olivia Kory (00:00.35)
The brands who treat incrementality like purely as the report card are having a lot less fun. They're driving like a lot less impact honestly on the business.
Brett Curry (00:22.552)
Well, hello and welcome to another edition of the E-commerce Evolution Podcast. I'm your host, Brett Curry, CEO of OMG Commerce. And today I'm super excited to welcome to the podcast, the one, the only Olivia Kory. She is the CSO of Haus and she's really everyone's favorite incrementality expert. And you may be like, wait, people have a favorite incrementality expert. And I'm here to say, yes, they do and it's almost always Olivia Kory. So can't wait to dive into this discussion on all things incrementality. And so with that, Olivia, welcome to the show and how's it going?
Olivia Kory (01:02.136)
Thank you, Brett. It's going well, other than having some sort of virus that my kids gave to me, so I'm a little bit under the weather. Otherwise great. We were talking before we started recording that you do these events that I hear amazing things about. And I hear that I pop up on the screen, and my podcast with Andrew Ferris. I owe you a huge thank you for all of the help that you have given to the Haus study and the Haus YouTube data over the years because I think we really owe you one for getting it out there.
Brett Curry (01:37.39)
Totally, yeah, I'm a huge YouTube fan. I'm really just a huge fan of marketing that works. That's kind of always been my thing. But love YouTube. This study that you guys did, the 190 incrementality tests across 74 brands, I reference it weekly at least. Anytime I'm on stage, I talk about it. So yeah, I put your mug on the screen a lot and my buddy Andrew Ferris. And so I've probably sent some customers your way, helped grow his podcast, but that's fun. So I enjoy doing that for sure.
Olivia Kory (02:07.062)
What's funny is that I feel like vendors tend to hate on Meta and Google a lot. And I always wonder why, because I'm like, I just like advertising networks and we know that Google works. Why are they the target? It's because they're big tech, of course, and it's like really easy to dislike them. But like, hey, there are not a lot of great places to spend your ad dollars. And so I like to lean into the platforms that I know are going to drive returns.
Brett Curry (02:35.894)
Lean into what's working for sure. So lots of things I want to talk about related to this topic. First though, your background Olivia, I think is really interesting. So you're former Netflix, former Sonos and that kind of led you to Haus. So give us kind of the brief version of that. What'd you do prior to this and how did it lead you to the world of incrementality?
Olivia Kory (02:55.864)
Yep. I actually started off agency side. So I've done a little bit of everything over the years, but started off agency side. That's where I met my now husband. So I thank Starcom, the very large ad agency in Chicago, for bringing us together. And then I got into ad tech. I was at a DSP, a programmatic video DSP called TubeMogul.
And that was like just as kind of programmatic media was blowing up in 2013, 14. And that actually, that experience is what got me the job at Netflix. Reed Hastings, the CEO of Netflix at the time was like very passionate about biddable programmatic advertising because he wanted the marketing to work more like the product. Where like if you think about the Netflix product, highly algorithmic, personalized, no two people see the same kind of experience.
He wanted to take and apply a lot of those principles to the marketing. And so I just kind of like lucked into my dream job because I had that experience that he was looking for. And at Netflix, I got just like an MBA in incrementality. I was surrounded by the smartest people in the world truly when it comes to incrementality testing. And we were running these experiments all over the world. We had a very clear understanding of what marketing was contributing to the business in this world of Netflix where it was like, tons of organic demand, tons of noise, really hard to parse out what paid was delivering on top of all that. And the way we were able to do it, there's like no secret sauce here. It was just that we had a very large team dedicated to helping us figure that out. And so that's like really where the Haus story begins because Zach, our founder, his insight was like, what if we could enable access to like that level of scientific rigor, quality incrementality testing, but instead of having to hire a huge team, let's do it as SaaS for less than the cost of a head count. And fortunately, over — he founded the company in 2021 — but fortunately over the past few years, like incrementality has kind of become widely accepted as the gold standard. So we've had some tailwinds that we've benefited from. But after Netflix, just to kind of complete the story, I moved on to a couple of roles. I went over to Quibi for a little, and then I settled in as head of growth at Sonos.
Olivia Kory (05:14.71)
And I was actually one of Haus's first customers, helping Zach and team build the early MVP. And it was just going so well at Sonos in such a short amount of time that I had to make the leap over. So I joke that I'm a vendor now. I don't know how I got here, but I'm really a growth marketer at heart. Yeah, that's my story.
Brett Curry (05:33.325)
Yeah, yeah.
It's so valuable though, as a, you know, being a growth marketer at heart, because you can help lead strategy — cause that's what the tool is built for, it's who it's built for. And so it's super valuable. I am curious. So your husband, you met him at an ad agency. You guys are both nerds. I'm sure. Do you talk about this stuff at the dinner table? Is it like off limits to talk about marketing stuff at the table? Like what do you guys geek out about together?
Olivia Kory (06:02.382)
Do you want to hear something crazy? It's going to be a bomb. I don't know, I can't remember if I've ever said this publicly, but my husband actually works at Google. So that's hilarious. He's been there for like 10 years. I actually texted him this morning because I wanted to get an updated stat on how much YouTube delivery happens on TV screens, but he hasn't responded yet.
Brett Curry (06:25.71)
It's like north of — it's north of 50%, right?
Olivia Kory (06:27.97)
I'd like the exact number before we — yeah. We met at the agency and then he moved over to Google. It's funny because depending on the day, I feel like I'm either like Google's best friend or worst enemy. So that's been fun. Recently, I think we've gotten into a really good rhythm with Google and it's going really well. And so our paths cross sometimes in a very nice way.
Brett Curry (06:55.386)
That's awesome. And what's interesting to me, incrementality has kind of become the buzzword, the thing that's talked about the most. You talked about some serious tailwinds that are benefiting Haus. A lot of brands we work with — high eight, nine figure brands — they're all either using it, considering it, talking about it right now. But I am reminded that not everybody understands incrementality. I was speaking at an event in Miami a little over a month ago and it was like seven and eight figure brands, really, really smart, active group. And I asked — like 120 people in the audience — I asked for a show of hands, how many of you have heard of incrementality? And it was like a handful. It was like five people, legitimately five people. Then I did the event in LA at the Google YouTube offices — that's the co-branded event we do with Google and with Raindrop. And of course everybody there has heard of Haus and incrementality and are probably using it.
But you've been around this a long time. In your own words, what is incrementality and maybe what isn't it, so that we can kind of frame this and set that frame for the rest of the show.
Olivia Kory (08:03.116)
Yep. It's really funny. I was in Toronto at an event and I gave my incrementality speech and I realized I got like way too deep, way too fast. And I had to rewind like so far to really meet the audience where they were. So yeah, it's easy for me who's like talking about it all day, every day to just like quickly go down that rabbit hole. But to try to like zoom out and talk about it, maybe the way I would describe it to someone who's not in the industry is the analogy I like to give is like randomized controlled trials in healthcare. I think a lot of marketing data is based on correlation. You saw this ad and then you signed up for this service or you bought this thing. And we at Haus like kind of fundamentally believe that in order to establish causation — like true cause and effect, this advertising caused this outcome — you need an experiment with a holdout group. And the holdout group is where you actually turn off advertising in part of the country. We do this based on geo and you see the difference between the control and the treatment. And that enables you to say like, hey, this effect would not have happened without the marketing. And so in that way, that's why I say it's analogous to like randomized controlled trials in healthcare where you're giving some group of people the placebo. And some group of people the drug. And so that is at its core — when you think about incrementality, it's that counterfactual to understand what would have happened anyway in the absence of marketing.
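[Editor's aside] To make the counterfactual arithmetic Olivia describes concrete, here is a toy geo-holdout read in Python. All of the numbers are invented for illustration and are not from the episode; a real analysis also involves market matching and statistical power work.

```python
# Toy geo-holdout read (numbers invented, not from the episode).
# Ads stay ON in treatment regions and are turned OFF in holdout regions.

treatment_sales = 500_000  # sales observed in treatment regions during the test
control_sales = 180_000    # sales observed in holdout regions (ads off)
control_share = 0.30       # holdout regions normally account for 30% of sales

# Scale the holdout up to the treatment footprint's size to estimate
# what "would have happened anyway" without the advertising.
counterfactual = control_sales * ((1 - control_share) / control_share)

incremental_sales = treatment_sales - counterfactual
lift = incremental_sales / counterfactual

print(round(counterfactual))     # 420000
print(round(incremental_sales))  # 80000
print(f"{lift:.1%}")             # 19.0%
```

This arithmetic is what makes a holdout a causal read: the control group supplies the "would have happened anyway" baseline that click-based attribution can never observe.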
Brett Curry (09:45.4)
Right, right. Yeah, it's so good. And it's one of those things where, you know, I think as we progress and get smarter as marketers and as there's more tech powering what we do, there's just been multiple points when I know I have thought, well, this is like, this is it. This is the solution, right? MTAs come onto the scene and we're like, now we can see all the data together and we'll finally know what's driving the impact. But then you realize, well, really every tool that's really click-based or leans on click is biased and it's going to have flaws and really every piece of measurement is kind of some level of wrong. Maybe it's still useful, but it's not complete. And really the only way to tell what would have happened without this advertising or did this advertising cause this result is to do a holdout. And so, yeah, I love it. And I think more and more people are seeing, yes, I need this as part of my measurement stack.
Cool, so let's talk about — and I'm going to kind of bounce around to a variety of topics — but when should someone consider using an incrementality tool like Haus, or there are other tools, but like when should someone say, okay, I've got my MTA, I've got some other tools potentially, when do I need incrementality?
Olivia Kory (11:03.534)
I was just talking to a brand about this on a call right before, and there's kind of an interesting dynamic happening with some of these new AI companies, which I'll talk about, but let's talk about e-commerce businesses just as the example here. When you're just starting out, when you're just starting your business, you don't have a lot of organic demand. Nobody knows who you are. You don't have any awareness and you're building this business on Meta and that's most likely how people know about you and how they're purchasing.
And thus in that way, it's pretty clear like what's driving your business. And the click-based attribution there is like probably a nice proxy for what's happening. And then you start growing and I don't know exactly, I tend to see it around like low to mid eight figures in revenue where that's where you start expanding channels into Google and YouTube, into TikTok.
So you're expanding into channels, you might be expanding your sales distribution into Amazon and into retail, and you are growing organically. Like the word of mouth is starting to take off. Maybe you're partnering with other companies and they're doing co-marketing with you. That's where it's like, at that point, your ad channels are gonna start taking credit for like a whole bunch of growth that's happening elsewhere. And the classic example there is like, you might be spending in all these channels and all of that impact is showing up as search in the click-based attribution model because that is like the last place someone goes before they end up buying. And so again, like when is the right time? I tend to just ask brands I'm talking to like, how much of your business is driven by paid? Do you feel like you have a good signal there?
And like, can you just turn off marketing and easily get that answer? Or is your business so complex and it's so fuzzy that you don't actually think you could get that answer by like a simple on-off test. And that is when you need incrementality. And like, you can think about a Sonos. Like if we shut down marketing at Sonos, like the business is so big and there's so much noise that like it would be really hard to get a clean signal on the top line. And so that's...
Olivia Kory (13:28.942)
That's typically when I'd say folks feel ready. It's like multiple channels, they've expanded into multiple sales channels in terms of Amazon, retail, and they're unclear on whether ads are kind of taking credit for what would have been an organic conversion. I just had an interesting conversation with an AI company. They're doing $100 million or more in ARR.
They grew entirely organically — viral AI product. It's like a B2B AI video generator. It's the inverse. Like they haven't had any paid. And so I was like, wow, you guys are going to need incrementality off the bat here because you have grown so quickly and you've become so viral organically that they're going to need incrementality right away. They just, they're like a year old company.
And they're signing up with us because they just launched with so much momentum. So I don't see that a lot, but really exciting. And like the Baseball Lifestyle 101 guys, I think are probably an example of that where they grew first organically and thus anything they do in paid is going to need incrementality.
Brett Curry (14:33.538)
Yep.
Brett Curry (14:40.536)
Totally, yeah, helping them with that right now on the YouTube side. And it's such a cool brand. They were community first, they kind of slow built for a couple of years and then just took off like a rocket. So yeah, incrementality is, I think, critical once you get to those stages. Just absolutely critical. And so let's think about this. Sometimes when we talk to brands, they're thinking like, okay, when do we need to look at incrementality, like we just talked about? And okay, if I do, then...
What should I start with? What's your point of view on — Google has what they call conversion lift studies. So they'll do actually a user holdout, which we can talk about the benefits of geo holdout versus user holdout, and I think that'd be a fun topic — but they'll do some incrementality studies. Meta, I know, will run incrementality studies on your behalf. What's your point of view on those? And are those like a good way for brands to dip their toe in the water and get some incremental reads?
Olivia Kory (15:34.926)
Yep. I'm a big fan of the native conversion products. I know, again, that's not a popular opinion when you're an independent third party, but I think they're incredibly useful. I think one, it gets you comfortable with the terminology, the nomenclature of like, what is an iROAS? How is it calculated? How is that different from your in-platform ROAS? And it also prepares you for what you might see when you start running more of these tests in terms of perhaps the incremental return is not what you thought. It's lower, and it gets the organization comfortable with this idea that this is a new metric and we're setting new baselines. I like it in terms of, okay, let's get used to the terminology. Let's try to establish some set of baselines. Meta's conversion lift product is, I think,
kind of widely accepted as like, all right, it has the rubber stamp of approval. I don't hear much about Google's and I do wonder if there's like a signal loss thing happening. Like I know Meta's CAPI product is solid. There might be an issue with like being able to track iOS inside of Google and maybe their enhanced conversions product isn't as robust as CAPI. I don't know. I just don't hear about it as much. So these are all hypotheses for like why you don't hear about Google's product. I also think maybe it's not like universally available to all advertisers the way Meta is.
Brett Curry (17:14.902)
It's not. It's not. Yep. You've got to get allowlisted and stuff like that. It's going to roll out more, but yeah, it's not just available in the platform.
Olivia Kory (17:17.336)
Probably why.
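[Editor's aside] On the nomenclature Olivia raised (iROAS versus in-platform ROAS), here is a hypothetical worked example. The definitions reflect common industry usage, and every number is made up.

```python
# Hypothetical worked example of incrementality metrics
# (common-usage definitions; all numbers are made up).

spend = 100_000
platform_attributed_revenue = 150_000  # revenue the ad platform claims credit for
incremental_revenue = 90_000           # revenue a holdout test attributes causally
incremental_orders = 1_000

in_platform_roas = platform_attributed_revenue / spend  # 1.5
iroas = incremental_revenue / spend                     # 0.9
icpa = spend / incremental_orders                       # 100.0

# Incrementality factor: the incremental result relative to the
# platform-reported one. Below 1.0 the platform over-credits itself;
# above 1.0 it under-credits the channel, as in the Haus YouTube
# study's 3.42x factor.
incrementality_factor = incremental_revenue / platform_attributed_revenue  # 0.6
```

The gap between `in_platform_roas` and `iroas` in this sketch is exactly the conversation Olivia says brands need to have with leadership when new baselines get set.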
Olivia Kory (17:25.248)
Yep, yeah, that's — I mean, your guess is as good as mine there. The issue with user level is signal loss, right? Like you're still relying on being able to match a user who was exposed to an ad to a sale that happened down the line. And you know this with just like iOS 14.5. It's just so many people who have opted out of tracking that they're modeling — like Google and Meta are both doing modeling and I'm sure it's working well, but it is certainly not perfect. And so signal loss over the past few years has like really, really challenged those user level products. And then the other issue is like every platform is kind of doing it a different way. And so you then no longer have like the standardized cross-channel comparable methodology where you can —
Brett Curry (18:16.578)
Yeah, you can't compare your iROAS on Google to your iROAS on Meta because they're maybe measuring it differently.
Olivia Kory (18:22.508)
Yeah, so, but like all things equal — if the signal loss wasn't such a big problem and they were all doing it the same way — user level testing is better just like objectively in terms of precision. But with privacy and whatnot, like the signal is too unreliable to really have faith in. So that's why we focus on geo at Haus. We were kind of like born out of the ashes of iOS 14.5 in 2021.
When everybody was trying to like stitch together these user graphs and like use these alternate IDs, we just skated in the total opposite direction. We're like, hey, we don't want any pixels. Don't give us any PII. We're just going to look at the aggregate sales in these regions versus some other set of regions. And that's what's enabled us to like move really quickly through like security reviews with our customers — we don't look at any of that PII in our analysis.
And so privacy durable, and then the other kind of like really good property with geo testing is you do it in the same way across every channel. And that's really helped our customers feel confident making reallocation decisions because they're evaluating them all on a level playing field.
Brett Curry (19:37.678)
It's the same math, it's the same methodology. So comparing Google to Meta to TikTok to AppLovin, it's the same measurement applied to everybody. I really like that. So, walk through a couple of things. Walk through like, how do we set up a geo holdout test? What is this usually looking like? So I come to Haus and I say, hey, I want to do an incrementality study on YouTube for this brand. How are you structuring that geo holdout?
Olivia Kory (20:02.19)
Yep. End to end, I'll start from the very beginning. This might get boring. I hope your listeners are excited about some of the technical details, but I'll just...
Brett Curry (20:13.292)
I think they are or they're heavily caffeinated, but people nerd out here. So I think we're good.
Olivia Kory (20:17.388)
Okay, so we start with data ingestion. So we take the customer — if they're on Shopify, really easy, simple plugin. But otherwise, if your sales data is in Snowflake or GCP or Redshift, we ingest sales by day by geo, or like whatever the customer cares about in terms of KPI — like new customers, subscription starts, for mobile apps sometimes they'll pass us like player value. And so whatever kind of the customer cares about — we're flexible, they set up that data ingestion and then we can basically configure their instance of the app. And then they can log into their app and based on their own historical data, we can configure an experiment across any channel, online, offline, YouTube in this example. We would set it up and then basically our tool would tell you how much you need to spend. Well, first we want to understand your business question.
"What is the incrementality of YouTube?" is like a different question than "How much should I spend on YouTube?" And so we'll recommend a cell structure and a test design to best kind of answer the question you're...
Brett Curry (21:22.91)
Answer the question you want to get answered — that's going to inform how you design the test.
Olivia Kory (21:27.02)
Yep. And then we do basically all the data science work to help you understand how much should I spend? How long do I need to run for? How large does my holdout group need to be to kind of like sufficiently power the test from a statistical standpoint? And then we launch the test — and that just involves turning off YouTube in like some percent of the country — and we give you those regions, we do the market matching. And then because we're plugged into the data that I mentioned in terms of that data ingestion, we can start populating results pretty quickly once that test launches. And customers can kind of view how those results are changing as they come in day by day. And so that's like the tool part. Now we have like a pretty heavy services model that we attach to that to make sure — like, all right, how do I interpret these results? What do I do next? How does this compare to other customers? How do I really like operationalize this in a way where I'm...
decisioning on these results in a way that's going to improve business outcomes. And that's where like, I think the tool alone is fantastic, but there's still a pretty high touch services model here to help you like really extract and maximize the most value you possibly can with like, what do I do with this result? Like, how do I operationalize this? And how do I explain it to my leadership? Like that's one of the most underrated parts of incrementality — picture this, these companies have been handing leadership a Google Analytics report for decades. And now we're saying like, that Google Analytics report that we've been showing you, like maybe that number has been wrong. And so we coach our clients on like how to have that conversation in the most productive way and how to really like ease your leadership into this idea. And so that's like where the hard work happens, I would say.
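[Editor's aside] The end-to-end flow Olivia walks through (ingest daily sales by geo, choose a holdout, match markets, then read results) can be sketched roughly as follows. The region names, data, and naive matching rule are all invented for illustration; Haus's actual methodology is far more rigorous.

```python
# Rough sketch of a geo-holdout workflow (illustrative only; the data,
# region names, and naive matching rule are invented, not Haus's method).
import statistics

# 1. Data ingestion: daily sales by geo for a pre-test period.
pre_period = {
    "region_a": [100, 110, 105, 120],
    "region_b": [98, 112, 101, 118],
    "region_c": [40, 55, 38, 60],
}

holdout = "region_b"  # ads get turned OFF here during the test

# 2. Market matching: find the treatment region whose pre-test sales
#    track the holdout most closely (naive rule: mean absolute gap).
def mean_gap(a, b):
    return statistics.mean(abs(x - y) for x, y in zip(a, b))

gaps = {region: mean_gap(sales, pre_period[holdout])
        for region, sales in pre_period.items() if region != holdout}
matched = min(gaps, key=gaps.get)  # best pre-period match for the holdout

# 3. Results: compare the matched treatment region against the holdout,
#    adjusting for their pre-period baseline ratio.
test_period = {
    "region_a": [130, 128, 140, 135],  # ads ON
    "region_b": [108, 104, 115, 110],  # ads OFF
}
baseline_ratio = (statistics.mean(pre_period[matched])
                  / statistics.mean(pre_period[holdout]))
counterfactual = statistics.mean(test_period[holdout]) * baseline_ratio
incremental_daily_sales = statistics.mean(test_period[matched]) - counterfactual
```

The design step Olivia mentions (how much to spend, how long to run, how large the holdout must be) is a statistical power question layered on top of this, which is the data-science work she says the tool handles.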
Brett Curry (23:20.288)
Yeah, it totally makes sense. Yeah, data is only as good as what you can action upon. The last thing any DTC brand or any brand needs is more data. Like we've got data everywhere around us. It's — can you give me some clear insight? Like some insight into what's happening inside of my business, into my customer, and then help me do something. Like help me shift something. Either help me lean into a channel or lean out of a channel or switch creative or like something, but help me action towards it. And it's a really good call out where you say like, yeah, for leadership, a lot of times this is the first time they've seen an incrementality read, right? And so imagine the first time you saw Google Analytics, you probably had a million questions. You didn't really know what to do with it. And so I think a few things to maybe touch on is how do you advise people? And I've gotten to talk to several of your strategists now. They're awesome. They're very smart marketers. They know the tool. They've been in the game a while. So super helpful. Who's your favorite?
But the only name I can think of right now is Noah, and Noah's great. Shout out to Noah.
Olivia Kory (24:27.968)
Noah is definitely a crowd favorite.
Brett Curry (24:31.626)
Yeah, yeah. Great, articulate, smart dude. Doesn't mind if I pick on him. Yeah, good guy. So I like him. Hey, thanks again for tuning into the E-commerce Evolution Podcast. I want to take just a minute and talk about my agency, OMG Commerce. We've been helping e-commerce brands for 15 years, and that's like 100 e-commerce years. And our specialty is finding opportunities for growth that other people miss and unlocking channels that you're not currently maximizing. For example, YouTube.
Most brands are sleeping on YouTube and my belief is it's the biggest untapped opportunity for your brand. We're also good at adding up to eight figures in growth for Amazon brands. And so if you're looking for scale and growth profitably, that's what we do. We'd love to chat with you. We'd love to review your current marketing efforts, show you where there's missed opportunities and craft a specific plan for you. So visit us at omgcommerce.com. Click the Let's Talk button and we'd love to schedule a complimentary strategic review with you. With that, back to the show.
What do you advise people? Like probably the result they're going to get is going to be wildly different than they were thinking, right? So like how do you help them? Because it's not like we should get one incrementality test and completely change our business, right? We want to make sure it's actionable, but just one test is one test.
How do you kind of coach people — when you're doing your first test, here's what to expect or what to do once you get that read.
Olivia Kory (26:02.862)
Yep. So we, and we've learned a lot here as well of like, how long should you test for? Like a lot of these heuristics we're kind of updating as we have more information available to us. So like what we do is we really try to make sure the test is set up for success. YouTube is an example. I think I have it in a slide somewhere. Like in the beginning folks were running like two week YouTube tests and we've just had like —
Brett Curry (26:29.87)
Updates.
Olivia Kory (26:30.286)
Updates, yeah, like — now we have enough data and we've seen enough tests to know like, okay, well, that's not going to be successful. So if you don't have time to run this for four to six weeks, then let's do it another time. So just like making sure we're using these best practices to set the test up for success out of the gates. Results come back, they're not where you wanted them to be. Let's go test some other channels. Getting baselines on all of your channels is important. And then the question is,
Brett Curry (26:55.982)
Getting a baseline, yes.
Olivia Kory (26:59.982)
All right, how do I improve upon this? And that's where the fun begins. Like I think the brands who treat incrementality like purely as the report card are having a lot less fun and they're driving a lot less —
Brett Curry (27:15.694)
It's just like pass or fail, right? Like you get this report and you're like, I failed. I totally failed.
Olivia Kory (27:19.958)
Yeah. And like they're driving like a lot less impact honestly on the business than these brands. Like you hear Connor from Ridge talking about how many tests that he ran. They are constantly running a test and like all he's trying to do is like make that Meta number better. And he just posted about this as well where — got a Meta read, wasn't happy with it, made a bunch of changes, retested, saw a huge step change win after the improvements. And so that's where I'd say like the —
Brett Curry (27:28.942)
He's a legend.
Olivia Kory (27:49.838)
The first kind of moment a customer will have is they'll get all their reads back and like we might make some reallocation decisions of, all right, let's move budget from channel A to channel B based on what we're seeing. But then the next thing — and that's where we spend most of our time — is like, how do we improve each of these channels and how do we go in and kind of really make some big swings? And so that's always my guidance: like, why would you expect incrementality to be great out of the gates? You've never optimized for it. So how do we make that number better? And those are the wins. Like we have this awesome success story of a brand out there — they 4x'd their YouTube iROAS over the course of like a year. And it's not easy. Again, this is not easy. There's no like simple switch you flip in YouTube to like make it incremental. But it's an always-on channel right now for them and it's like their number two channel. And it was definitely like worth investing in to unlock that. And StockX just spoke about this publicly so I can talk about it. But the StockX team was barely spending on YouTube based on MTA. They ran some Haus tests and now it's their number two channel. It's like one of their top acquisition channels. So that's where we try to take it — hey, like this is the baseline upon which we will improve.
Brett Curry (29:18.798)
Yeah, let's get some baselines and now let's continue to try to step up there. And one other interesting thing — you referenced Cody from Jones Road Beauty. I read his post today, it's an excellent post. But he also said that the Meta improvement they made — so they optimized to try to get more incremental with Meta. The changes they made, actually several of them looked worse in platform. So in platform and MTA, some of the numbers got worse, but the incrementality read improved, right? Because they were able to reach net new people. That's kind of what they were prioritizing just to like really condense it. But they got more incremental. And so yeah, it becomes a game you can play too, right? Which is really fun for me. Like we've been a part of these conversations on the brand side with the Haus team, with my team, about okay, why did this read turn out the way it did either better or worse than we expected? And what do we do next? And it's kind of fun.
Olivia Kory (30:10.286)
I think some of us are in denial that the in-platform number is not a good proxy for incrementality. Because once you kind of accept that, then the concept of an incrementality factor becomes a little bit confusing — if there's no correlation between the in-platform ROAS and the incremental ROAS, then the factor...
Brett Curry (30:42.424)
Yeah, it's a little dicey, right? How do you apply that with any confidence? Which actually let's do this. So let's define a couple of things that we haven't defined yet, which I think will be really helpful. So iROAS and incrementality factor, iCPA and other ones. So what are those? And then how are we using those in our business?
Olivia Kory (31:05.046)
Yep. The audience probably knows CPA and ROAS. This is what you see in the ad platforms in terms of KPI success criteria — cost per acquisition or return on ad spend, without the I. When you add the I into these acronyms, you are basically throwing away all the conversions that would have happened anyway.
So you're saying that the translation from the CPA to the iCPA is, okay, now I'm throwing away everything that wasn't a direct result of this advertising. And in some cases, like YouTube, it actually gets bigger because you're saying, no, now I'm pulling in a bunch of conversions that I didn't actually attribute to YouTube that were showing up elsewhere in terms of channel attribution.
Brett Curry (31:57.73)
Yeah, all my click conversion measurement tools in platform and MTA hated YouTube, but incrementality generally likes YouTube.
Olivia Kory (32:07.63)
Yep, so you're basically just looking — you're moving from attributed conversions to incremental conversions and you are recalculating those metrics based on that. And now the factor is: of the conversions that the platform is reporting, what percent of those are incremental? And so your factor can be below one, it can be over one of course, but that's just the equation of — okay, of those...
Or like a Northbeam — some people will do factors on an MTA or like a third party tool. Of the conversions that Northbeam is reporting, what percent of those are truly incremental. That's the factor.
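To pin the definitions down, here's a quick sketch in Python — the numbers are made up and the helper function is ours for illustration, not anything from Haus or Northbeam:

```python
def incremental_metrics(spend, revenue_per_conv, attributed_convs, incremental_convs):
    """Recompute CPA/ROAS on incremental rather than attributed conversions."""
    return {
        "CPA": spend / attributed_convs,        # what the ad platform shows
        "iCPA": spend / incremental_convs,      # cost per *incremental* acquisition
        "ROAS": (attributed_convs * revenue_per_conv) / spend,
        "iROAS": (incremental_convs * revenue_per_conv) / spend,
        # The incrementality factor: of the conversions the platform (or an
        # MTA tool) reports, what fraction are truly incremental?
        "factor": incremental_convs / attributed_convs,
    }

# Illustrative numbers only: the platform reports 100 conversions at $50 each
# on $1,000 of spend, but a holdout test says only 60 were truly incremental.
m = incremental_metrics(spend=1000, revenue_per_conv=50,
                        attributed_convs=100, incremental_convs=60)
print(m["factor"], m["ROAS"], m["iROAS"])  # → 0.6 5.0 3.0
```

As Olivia notes, the factor can also exceed one — plug in more incremental conversions than attributed ones (the YouTube case) and the same arithmetic holds.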
Brett Curry (32:47.458)
Yeah. And so as a comparison, something that's not very incremental — which our friends at Google might not like us mentioning — is branded search, right? And there's a place for brand protection and things like that, brand defense. But branded search, not very incremental, right? By and large. I know I talked to my friends at Simple Modern, had Brian on the podcast. I think he said branded search for them was like 5% incremental, right? So if a reporting tool — I think this was on Amazon and on Google — showed that it drove 100 conversions, it probably actually only drove five, right? Like 95 of those would have happened anyway, it probably only actually drove five conversions. So then that incrementality factor would be 0.05, correct? For branded search. Yeah.
Olivia Kory (33:39.35)
And branded search is a fun one. We love to pick on it. A couple of things with branded search — it might just be so cheap that even if only 5% of it's incremental, the math works. And then the other thing with branded search, we actually published some data on this. It's one of my favorite data pieces, so we can link to it. We've seen it be incremental more than I thought in terms of success rate.
Brett Curry (33:48.877)
Exactly.
Olivia Kory (34:09.038)
There are certainly cases where branded search can be incremental and the number one case there is like when Amazon is bidding on your terms. And so there's this share shift phenomenon that's really annoying. It's like quite annoying that like you've got to pay the Google tax and the Amazon tax here. But when Amazon and some of these retailers are bidding on your terms, you like might want to incur that tax to take the margin back into DTC. So we've seen some interesting data here where it is not universal that branded search is always a failure.
Brett Curry (34:42.414)
Yeah, totally. And we, I mean, we almost always have branded search on for our clients. Usually I think the error is it's not as efficient as it should be, right? So again, Simple Modern's approach was, hey, if it's 5% incremental, then I'm willing to spend here as long as I'm getting a 20X return. I'm going to bid for a 20X ROAS, right? And if I get that, I'll spend all day long, it's great. I think for other brands, maybe it's like a 10X ROAS or whatever, but if you're bidding too aggressively on branded search, you're probably just giving too much of your budget to Google at that point. So yeah. But I think there's still a lot of people that will just click on whatever ad they see first, or if someone's new to your brand and there's not much loyalty there, you probably want to show up. Also a competitor might swoop in and grab it. So yeah, a case to be made there. But in comparison, the incrementality factor is way lower on branded search than on other campaign types.
Olivia Kory (35:42.35)
Many Haus customers would say their biggest win was having the data and the evidence to feel confident turning off branded search and seeing like no negative impact to the business and just taking a bunch of money in terms of savings.
Brett Curry (35:57.294)
Right? Right. Yep. I think that's smart in a lot of cases for sure. So let's talk about something else — you mentioned this a little bit, but I want to unpack it a little bit more. So if I'm running a YouTube test, it's got to be longer than two weeks, probably four weeks or longer. If I'm running a Meta test, maybe it can be shorter. Non-brand search may be different. Describe treatment window, post-treatment window and how you're structuring these for success.
Olivia Kory (36:24.622)
Yep, and this is another area where as we learn, as we get more data, as we run more experiments, we're able to kind of like update our priors and just get smarter here. So in terms of test length — we have our test window. This is where you kind of like turn off ads in part of the country, and that's when your test is running. And then we'll say, okay, the test is done, go back, revert back to national, or let's end the test. But then let's just observe the behavior of these markets for some period of time to understand any lagging effects.
Brett Curry (37:01.627)
Observe the control and the test areas. So still observe those after the fact.
Olivia Kory (37:05.858)
Yep. We'll still observe those as distinct test cells. But we're just going to watch what's happening to understand any lagged effects. We call that a post-treatment window. The industry has started just using like PTW as the acronym, which we're proud of, but like don't expect everybody to know that. So when we say PTW or post-treatment window, we're just talking about like — test is over, but we're continuing to look at the treatment and the control and see how those sales are changing over time. And with YouTube, as an example, that's just one channel where most of the impact is coming on the back half of that test window, in the treatment window. We see this with CTV as well, and we have some data on the delayed effects of these more view-based channels. So that is just defining the terms. We have the treatment window, post-treatment window. I'd say with something like a YouTube, you also want to factor in the brand's consideration cycle. Like we were talking before we started recording about a furniture company that is selling high-end rugs and sofas. Like you have to factor in the consideration cycle. It's about the channel, but it's also about the brand and their purchase windows. So we try to match those two things up as closely as we can in terms of like — big infrequent purchases get longer tests, these more impulse purchases can be shorter. We try to ask the brand if they have any data on like time from exposure to purchase based on post-purchase surveys. So any data we can get here to help us design the right test is helpful. But for a view-based channel like YouTube, I mean, we're typically running like six to eight week studies and we're seeing a lot of that lift come on the back half of that test.
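For intuition, here's a toy version of reading lift out of a geo holdout over a treatment window plus PTW — real designs use matched markets, synthetic controls, and confidence intervals, so treat this purely as a sketch with invented numbers:

```python
# Toy geo holdout: ads run in treatment geos; the holdout (control) geos see
# none. Daily sales below cover the treatment window plus the post-treatment
# window (PTW) — note most of the gap opens up on the back half, the lagged
# effect Olivia describes for view-based channels.
treatment_sales = [120, 118, 125, 130, 135, 140, 138, 142]
control_sales   = [119, 117, 120, 121, 122, 121, 120, 122]

# Scale control to the treatment group's size, then read incremental sales as
# treatment minus the counterfactual (what treatment would have done anyway).
scale = treatment_sales[0] / control_sales[0]
counterfactual = [c * scale for c in control_sales]
incremental = [t - cf for t, cf in zip(treatment_sales, counterfactual)]

spend_in_treatment_geos = 500
lift_per_dollar = sum(incremental) / spend_in_treatment_geos
print(round(lift_per_dollar, 2))  # → 0.16
```

End the series too early — say, after day four — and you'd miss most of the incremental sales, which is why a two-week YouTube test can look like a loser.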
It's tough because, you know, so like one of my strategists and I talk about this all the time — like the way a finance team is forecasting and the way that this business is running, you're not really modeling like spend today on March 5th as returning on like June 5th. Right.
Brett Curry (39:26.148)
It's not easy to do. It feels very risky to do that.
Olivia Kory (39:29.479)
So we're working with a lot of these teams to like kind of rebuild the marketing model because these channels have very different payback windows.
Brett Curry (39:36.984)
Yeah, it's a really good point. Every product life cycle is a little bit different. I'm really glad you mentioned post-purchase surveys as well. I was talking to like a disaster preparedness company recently — they sell like generators and stuff like that. And they were saying, you know, if you look at like their path to purchase reports, like how long after clicking or seeing an ad does it take someone to purchase? Almost all of those are going to still say within a few days or a few weeks or whatever.
But they did their post-purchase survey and said, hey, when did you first hear about us? And it was like 50% of the respondents were longer than three months. And a lot of them were a year. And I've even seen reports like for True Classic Tees where that's kind of the thing. And so probably whatever path to purchase report you're looking at is shorter than what's actually happening.
And so you do have to keep that in mind when you're running an incrementality test, especially for those channels like Meta for prospecting and YouTube and Connected TV where you're reaching someone really early in the process. It's going to take time, right? Like for home furnishing, you're probably going to have to discuss that with your spouse, there's probably going to be a little bit of shopping there to see, am I going to actually buy this sofa or this rug? And so you got to keep all those in mind.
Olivia Kory (41:00.91)
But the other thing — just the big thing with a channel that's more view-based — is if you're selling on Amazon, if you're selling in retail and you're not pulling those sales in, it's going to look like YouTube can't work. It's because people are going to buy where they want to buy. They're not clicking. So clearly you're not able to guide them to your store. So they're going to go buy where they want to buy and that's likely going to be Amazon or retail.
If you're not pulling those sales in, you're probably understating the effects of that media.
Brett Curry (41:35.734)
Yes, talk about that a little bit — and we'll actually reference, I've referenced on the podcast a few times but I need to fully unpack it — but like the YouTube study you did, like what were some of the findings there in terms of how incremental was it and then the difference between D2C and Amazon and retail? Kind of tie those together. And then we'd love to kind of compare that to Meta or CTV for any numbers you have kind of top of mind.
Olivia Kory (41:58.21)
Yeah, man, we've done so many meta reports since — like I'm using meta in the other sense right now, not the ad channel Meta — but we've done so many reports, I have to dust off the YouTube data. High level, high level summary, do you want me to?
Brett Curry (42:18.68)
High level's great. Yep.
Olivia Kory (42:21.166)
Okay, so the big headline coming out of the YouTube report — this is now almost a year old, so I think we're due for a refresh. You can refresh it for your content in the usual sense. But the big bombshell was that YouTube, the incrementality factor that we discussed — based on what the platform is reporting, what percentage is incremental — was actually 3.42x. So that means if you
Brett Curry (42:30.018)
Yeah, you guys gotta reboot this.
Brett Curry (42:48)
Yeah, yeah.
Olivia Kory (42:51.342)
thought you were getting a one in terms of ROAS on YouTube, it's more like 3.4. And so that completely changes the calculus on like whether you decide to build this channel. So that was huge. And then the halo effects across physical retail and Amazon basically doubled the lift. So if you were seeing a one in terms of ROAS on D2C, when we pulled in Amazon and retail, that one turned into a two.
Olivia Kory (43:20.654)
So basically assume that you're under-reporting by around 100%. And then most of the effects were actually happening in the second half of the experiment and into the post-treatment window. So if you're running a two-week test and you think it's a loser —
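As back-of-napkin arithmetic (the 3.42x factor and the roughly 2x halo are the study's headline numbers; whether and how they compound depends on what the 3.42 already includes, so the two adjustments are shown separately):

```python
# Adjustment 1: platform-reported ROAS vs. incremental ROAS on YouTube.
platform_roas = 1.0
incrementality_factor = 3.42          # headline from the Haus YouTube study
iroas = platform_roas * incrementality_factor
print(iroas)                          # → 3.42: a reported 1.0 behaves like ~3.4

# Adjustment 2: D2C-only measurement vs. total (D2C + Amazon + retail).
d2c_lift = 1.0
total_lift = d2c_lift * 2.0           # halo roughly doubled the measured lift
print(total_lift)                     # → 2.0: D2C alone understates by ~100%
```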
Brett Curry (43:43.362)
You're going to hate them. Yeah.
Olivia Kory (43:45.134)
For sure. And there are also some insights around Demand Gen. I think at the time we published that data, Demand Gen was very unpopular. People didn't like that Google was moving people over from Video Action campaigns. And Demand Gen ended up looking really good in our data. And I think it continues to look good. So I think maybe we owe Google an apology for being upset.
Brett Curry (43:58.936)
Totally.
Brett Curry (44:10.721)
Yeah, yeah. What's interesting is, you know, we noticed this actually before I even knew about true incrementality studies. We had a hair regrowth client and we got them from zero to a million dollars a month in spend on YouTube. And over time they had a data scientist team, a couple of wicked smart guys. I remember one day they came to me and they said, hey, so these numbers look pretty good, but
we believe that for every one sale we're getting D2C, we're actually getting two on Amazon. And we were actually helping them on Amazon. So we're kind of able to help cross-reference that. But yeah, just a reminder that like people buy where they want to buy. Especially for view-based campaigns, you've got to look everywhere. And that's what you guys do. If you're not, then you're missing a good chunk of the results for sure. So how does that compare to Meta? Because I know Meta maybe is a little more mid-funnel so you can structure the test a little bit differently. Post-treatment window can look a little bit different. How are you approaching Meta?
Olivia Kory (45:13.218)
Yep, and this is what — when I mentioned like having to rebuild the marketing model to kind of model out a channel paying back over a longer time horizon — Meta is the opposite. And that's why people love Meta so much, it's really easy in that model because it hits quick. It is a very immediate impact. But we actually see like that can hurt incrementality often because when you're looking for kind of like a short-term outcome, you tend to over-deliver to people who are really down funnel and already in market for your product. You have to be careful. Like Cody and I — referencing Cody from Jones Road — we nerd out about this all the time. We talk about how like maybe Meta's conversion optimization is too good. What's great about YouTube is like, you know, it's prospecting. You're introducing the brand to new audiences. You just know based on the nature of that channel.
Brett Curry (45:59.062)
Yeah.
Olivia Kory (46:09.038)
With Meta, you could get stuck in this like pocket that you can't seem to break out of. It's like, well, Meta's optimization model is like seeing a lot of opportunity here because these people are going to buy. But the problem is your ads aren't influencing them to buy. They were going to buy anyway, and you're just spending on them. And so that's the challenge. It's like the blessing and the curse of Meta — that system is so good at identifying intent, but sometimes it might be too good at identifying intent and you have to be careful there. But like yeah, we see pretty short-term impacts there and we don't see as much of a halo on Amazon retail and we don't see as much of a lagged impact either.
Brett Curry (46:53.26)
Interesting. So then that shifts the way you do the treatment window, the post-treatment window, shifts your expectation, all of those things. Let's talk about some surprises. So now you've gotten into — you work with a ton of brands, you guys have done these big studies. What have been some of the big surprises for you? Either things that are very incremental or things that are not incremental. What have you found?
Olivia Kory (47:17.794)
YouTube wasn't surprising for me, but I know it was surprising for a lot of folks. I came in with the — we were spending as much on YouTube as we did on Meta at Netflix on the acquisition team. We had proven that out time and time again. So none of what we've been seeing with YouTube lately has surprised me, but I know it has surprised many who've been evaluating it on MTA.
Brett Curry (47:39.301)
In the D2C space, yeah, just was kind of a bombshell.
Olivia Kory (47:41.612)
Yeah. So that one's been great. AppLovin — I came in super, super suspicious of AppLovin. You just think about the inventory itself and just the nature of like a programmatic ad network.
Brett Curry (47:56.695)
Videos in mobile games — seems kind of like, sometimes when you're running those ads on Google you like avoid that placement.
Olivia Kory (48:03.18)
I, at least coming from ad tech, just have a lot of baggage and kind of trauma of like what I've seen on the open web when it comes to ad networks. And so I was skeptical. It's looking really good. And I know that's an unpopular opinion with the short sellers, but data doesn't lie. And it's another channel where —
Brett Curry (48:20.686)
That's what I've heard across lots of clients.
Olivia Kory (48:30.254)
The impacts are short term, which these performance marketers love, and it makes their lives easier when you can see that impact pretty quickly. On the other hand, TV has been interesting. I think a lot of brands will come to us in this moment where they're on Meta and Google and they're about to go diversify into YouTube and television. And almost all of them start with TV with one of these partners who does like — it's like a TV ad network for all intents and purposes. It's like linear remnant cable inventory or CTV. And what I've seen there is like results can be really mixed. And the reason there is because you get what you pay for. So many people say like, we're buying linear remnant cable because it's like a $1 CPM or $2 CPM.
And again, I'm here to say that that will show up in the results. You get what you pay for in terms of quality. And so I think it's really interesting when folks are testing not just these more remnant-based TV buys, but they're also going direct to the source. And you see that with YouTube — there's no intermediary here, you're buying YouTube on YouTube. If you go to Hulu, to NBCUniversal, to these networks directly, I've seen a lot of interesting results when you go direct to the source. And you just see what that looks like as compared to a more programmatic remnant buy. So I'm seeing just inventory type on television really swing the results quite a bit — like I can't make generalizations the way I can with YouTube.
Brett Curry (50:16.566)
Yeah, super interesting. So really the game here is we're creating baselines, we're understanding how incremental our different channels are, right? And we're comparing them. And then we're saying, okay, how do we improve these? How do we improve these scores? Because if we action on this data, we're going to get more and more incremental, we'll get more from our ad dollars month in and month out. And so what are some of the biggest levers to pull? So I've got my baseline numbers. Sure, I'm never going to be satisfied with the baseline, but maybe I get disappointing baseline numbers. What are the biggest levers we're pulling to get more incremental on our key channels?
Olivia Kory (50:54.126)
Yep, so far and away, the biggest lever you have is creative. That is where we see the most kind of like step change performance improvements — like 30, 40, 50% wins. Creative is your biggest lever. But I'd say there's a lot you can do in terms of how you're buying the channel. We work with a wide range of brands in terms of size. We work with a lot of Fortune 500 companies.
And then we work with like these kind of smaller, more nimble DTC teams. And so with the smaller teams who really pride themselves on media buying, this isn't as big of an issue. But sometimes when we're auditing accounts, we'll see that like 60 or 70% of the account is retargeting. Like that's very low hanging fruit.
Brett Curry (51:44.302)
And maybe somebody didn't know that. Maybe they were just doing it.
Olivia Kory (51:46.85)
Yeah, and like we audited an account the other day and their Demand Gen campaign was delivering like primarily to Gmail placements.
Brett Curry (51:56.206)
Traffic composition — that's why we almost always turn off Gmail in Demand Gen. It can work for remarketing, not as much for prospecting.
Olivia Kory (52:04.718)
They thought YouTube wasn't working and then we went in and audited it and it's like, you weren't buying YouTube, you were buying Gmail. I just didn't even know there was like that much scale on Gmail. So that was a learning for me. So like the account structure, depending on the team and how much time they spend in the ad account, can really make a big difference. I'd say that one thing on Meta is like the conversion optimization we talked about can kind of get you stuck in a bottom of the funnel hole where you're not prospecting for new users. So the teams that we work with are doing a lot to kind of figure out what is the right event to point Meta toward to get these like net new customers and introduce our brand to new people. On YouTube, I think it's you know, it's probably playing with these different campaign types and settings and the types of placements that you're buying inside of YouTube. And of course, spend level is huge as well in terms of figuring out where you sit on the point of diminishing returns.
Brett Curry (53:13.09)
Totally. Yeah. All those things are huge. And we've seen this play out now with test results that were good and then maybe follow-up tests that weren't so good. It almost always ties back to a big creative shift — either just new creatives or creatives focusing on a different product. Maybe a brand was focusing on their hero product initially, then they moved to some secondary products and it did not work as well. So yeah, totally agree. And then that traffic composition piece is so important.
If you're running YouTube, run YouTube. Don't look at Gmail and some of those things with it. So we are officially out of time, but I do want to hear from you. A lot of people ask me — hey, should I use Haus? Is it time for me to use Haus? Am I too small to use Haus? You guys are releasing a new level of service, correct? That's more widely applicable to DTC and other brands. Can you talk about that?
Olivia Kory (54:10.298)
I can. Yeah, it's on our site now, live. So if you go to our pricing page, it's called DTC Basics. And this is because we want more companies to be able to access Haus. We've heard that it can be expensive relative to other SaaS tools. And so this is a lower cost, lighter service starter tier where you can measure D2C and Amazon across your core ad channels, the ones we're plugged into.
And you can really get your feet wet here. You can see like a really nice side-by-side comparison in terms of what it includes and what it doesn't. But we just want — if our mission is to make incrementality testing more accessible and to really democratize it, then I think this is a first step toward that.
Brett Curry (54:54.446)
It's a huge step. I've been waiting for it. You talked about it on the Andrew Faris podcast. It's finally here. I'm so excited. So I'm recommending this a lot, talking about this a lot. So Olivia, this was super fun. This time went by so fast. I really wanted to nerd out on more. So I'll have to do a follow up.
Olivia Kory (55:13.166)
Pencil me in for a part two. Maybe when we have our next YouTube report, I can come share it with you.
Brett Curry (55:19.19)
Yes, next YouTube report. All right, we're penciling it in, we're planning on it. Next YouTube report, we're gonna talk about that here. So super excited. Olivia, you're also an amazing follow online. So X and I believe LinkedIn as well. Where can people find you?
Olivia Kory (55:32.622)
On X and LinkedIn — Olivia Kory.
Brett Curry (55:36.317)
Awesome. All right, Olivia, thank you so much. This was super, super fun. Awesome. As always, thank you for tuning in. We'd love to hear from you. If you found this show to be helpful, please share it with someone else. With that, until next time, thank you for listening. Hey, as we wrap up this week's episode, I want to mention — if you're a great brand, if you're scaling high seven, eight, nine figures in D2C or omnichannel, we should potentially talk.
Olivia Kory (55:40.27)
Thanks, Brett.
Brett Curry (56:04.194)
We've worked with some of your favorite brands and we'd love to consider working with you as well. We are masters at unlocking new channels like YouTube, unlocking new scale on platforms like Amazon where we can add up to eight figures in new growth. We've got multiple ways we can work with you. We can do the full service thing and work like a partner with your team and really run everything. Or we can offer consulting. So maybe you've got an internal team that really knows their stuff, but there's an area they don't know really well and they'd like to get some consulting — we can do that. We also have tons of free guides, free resources, free materials you can check out. All of that gets started at omgcommerce.com and we can't wait to help you scale profitably.