The Great Testing Tempo Myth
Ryan Deiss
Lessons
1. Class Introduction (18:01)
2. Why You Need a Growth Team (05:13)
3. What Is a Growth Team? (07:09)
4. The Mission of the Growth Team (15:31)
5. Types of Growth Teams (08:45)
6. 8 Critical Skills Every Growth Team Must Have (08:25)
7. How To Establish Communication and Accountability (08:44)
8. Growth Acceleration Process (08:00)
9. Focus on the Goal (05:29)
10. Analyze the Opportunity (08:40)
11. Brainstorm Possible Solutions (18:46)
12. Prioritize the Team's Ideas (05:25)
13. Demo: The Growth Idea Sheet (09:34)
14. Run the Test (16:51)
15. Report the Results (02:53)
16. The Tools We Use (06:13)
17. How To Establish a Culture of Optimization (17:50)
18. The Great Testing Tempo Myth (05:47)
19. How To Run a Growth Team Meeting (19:40)
20. Who Should Attend a Growth Team Meeting (12:50)
21. The Importance of Reporting and Documentation (07:51)
22. How To Pilot a Growth Team in Your Company (06:46)
23. Auditing Your Existing Capabilities (09:03)
24. Make Any Essential Hires (16:38)
25. Get a Quick Win (03:55)
26. Get an Integrated Win (02:17)
27. Decide Where the Team Should Live (03:04)
28. Formalize and Announce (02:57)
Lesson Info
The Great Testing Tempo Myth
Now let's talk about testing tempo. Again, we're still thinking about this culture of optimization. We've established that we really need a culture of optimization, that it needs to be baked into the values, right? Now I want to be careful that we don't swing to the other extreme of, all right, test, test, test, it's gotta be this constant thing. On one hand, some people don't have a culture of optimization at all; on the other hand, you get this faux-bizarro, always-be-testing kind of thing that doesn't sustain itself very well.

So I read something recently, I think it was a blog post, and I've seen this claim so many times: a five percent improvement in conversion rate every month nets an 80% improvement over a year due to the nature of compounding returns. Now, this sounds really, really good. It sounds really, really, really good. In my experience, this idea of "do a lot of little tests and over time they will compound" sounds good, but it's a logical lie. It sounds incredibly logical: oh, if we just do a whole lot of little-bitty tests, a whole lot of them, they're eventually going to compound. It's a logical lie. You can pull it off if you're a great big company with a whole lot of traffic and a whole lot of testing budget, but in general, the fact is, most tests lose or deliver no significant results over time. That is the reality of optimization. That is the reality of testing. Most tests lose over time. Our director of optimization, Justin Rondeau, the person who really is in charge of this, and who, once we had the realization that we don't want siloed optimization, is now beginning to lead our entire growth team, says, "A 30% win rate is damn good." That's what he says all the time. You're right 30% of the time. It's like baseball: get on base 30% of the time and you're an All-Star, right? That means you struck out a lot.

And the key here is over time. When you're talking about a five percent winner, something that got you a little five percent improvement, a five percent win in January is rarely still a five percent win in December. These little wins, these little slivers of a win that you think are compounding, have almost always reverted fully to the mean within a matter of weeks or months. The win that you thought was there wasn't there, and instead what you have is an insignificant increase and a colossal, colossal waste of time.

So most tests lose. It's not about doing a whole lot of little tests, and that's why the volume of tests should not be your goal. I don't recommend that your growth team have "how many tests did you run" as a key metric. We did that, we tried that. It failed, miserably. We incentivized our people to do a lot of tests, which meant they picked a lot of little tests. A lot of little ideas, a lot of things that could be implemented quickly. Which meant no breakthrough whatsoever. The volume of tests is not the goal. Testing tempo is not the goal.

If you go out and read a lot of stuff about this, what I'm saying right now is a very contrarian idea, and I have no doubt there are conversion rate optimization people in the chat who are thinking, that's crazy. And you know what they're arguing with? Math. They're using math to argue, and if you just go with the math, you're right. But I have this pesky thing called experience, and we've done this, and I can tell you, it just doesn't work. It assumes that you have an endless amount of resources to run as many tests as you want, which you never do. It assumes you have an endless amount of data to reach statistical significance at a moment's notice, which you almost never do. And it assumes that these little-bitty wins hold over time, which they almost never do.

So it's not about volume. The goal is to run tests that truly move the needle. We are looking for needle-movers. We are looking for the things that actually have a chance of delivering a breakthrough. If you're going to invest the time, the energy, the effort of rolling through that giant Growth Acceleration Process we talked about, and doing all the reporting, I mean, it's a pain in the butt. Fast forward to the end, when you're putting together a report on what just happened. There better be something kind of cool in there, something people look at and say, "Oh, that's neat." Not, "Oh, we just did that? We changed the background color? We gotta report on that now?" Come on, right?
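To put rough numbers on that, here is a minimal sketch in plain Python. The first calculation is the blog-post math itself (a 5% lift every month really does compound to roughly 80% over a year); the second is an illustration, under assumed figures, of what a year looks like when only about 30% of tests win a 5% lift and most of each win reverts to the mean before year end. The 30% win rate and 5% lift come from the talk; the 25% retention figure is purely an assumption for illustration.

```python
# Illustrative sketch only: the retention figure below is an assumption,
# not measured data from the course.

# The blog-post claim: a 5% lift every month, compounded over 12 months.
naive = 1.05 ** 12
print(f"Naive compounding: ~{(naive - 1) * 100:.0f}% improvement")  # ~80%

# A rougher picture: 12 tests a year, ~30% of them win a 5% lift,
# and each win reverts most of the way back to the mean by year end.
WIN_RATE = 0.30    # "a 30% win rate is damn good"
LIFT = 0.05        # 5% relative lift per winning test
RETENTION = 0.25   # fraction of each win still holding at year end (assumed)

expected_wins = 12 * WIN_RATE
realistic = (1 + LIFT * RETENTION) ** expected_wins
print(f"With losses and reversion: ~{(realistic - 1) * 100:.1f}% improvement")
```

Under those assumptions the "compounding" year nets something closer to a 5% improvement than an 80% one, which is the gap between the math on paper and the experience Ryan is describing.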
So what I want you to do is ask yourself: is this a variable that whispers, or is this a variable that screams? That's the terminology. Again, when we think about establishing a culture of optimization and growth, it helps to have this set of shared terminology and ideas. We begin to ask that question: "Okay, guys, sounds good. Is this a variable that whispers or is this a variable that screams?" I want to make sure it's screaming. And these different phrases I'm giving you are there because so often people want to just throw math at it, they want to throw numbers at it. I'm not opposed to math, right, I really am not. Being opposed to math would be kind of strange, but I'll tell you, you can use statistics to prove dang-near anything you want to prove. At some point, you have to let logic and wisdom intervene.

So let's pull back a bit. Think about this, guys: is this a variable that whispers or is this a variable that screams? If it's screaming, then we can move forward. If it's a variable that whispers, then we shouldn't. So don't worry about testing tempo. Don't worry about the rate at which you're testing; don't worry about the volume of tests that you are doing. You can get there. You can grow over time. As you have the capacity to do more tests, do more tests, but don't make tempo and volume the goal in and of themselves. Make sure that when you look back, you can say those were worthwhile tests that actually moved the needle. We're constantly looking for variables that scream.
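One way to make the whispers-versus-screams question concrete is with standard A/B-test sample-size arithmetic. The sketch below uses the generic two-proportion approximation at 95% confidence and 80% power; it is not a formula from the course, and the 3% baseline conversion rate is an assumed example.

```python
from math import ceil

Z_ALPHA = 1.96   # two-sided 95% confidence
Z_BETA = 0.8416  # 80% power

def visitors_per_variation(baseline: float, relative_lift: float) -> int:
    """Approximate visitors needed in each variation to detect the lift."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((Z_ALPHA + Z_BETA) ** 2 * variance / (p2 - p1) ** 2)

baseline = 0.03  # assumed 3% baseline conversion rate
for lift in (0.05, 0.30):
    n = visitors_per_variation(baseline, lift)
    print(f"{lift:.0%} relative lift: ~{n:,} visitors per variation")
```

At a 3% baseline, a 5% "whisper" needs on the order of 200,000 visitors per variation to be called a winner, while a 30% "scream" needs roughly 6,500, which is why tests that only whisper rarely reach significance on the traffic most teams actually have.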
Ratings and Reviews
Scott Nelson
Course was amazing! I'm a startup founder & the content was perfect for the stage of our company. Ryan is a fantastic presenter & I found both his delivery as well as his ability to answer pointed questions to be extremely helpful. I'd recommend this to any company looking to build a growth team!
Marvin Liao
This was an incredibly helpful class & I found many of the frameworks & suggestions immediately useful. Well worth it.
Ato Kasymov
Amazing class!! Really complex issues are simplified and put together in a systemic and practical way!! Just take it into work and reap the benefits!!