Show Notes
- Dan Norris’ Blog Article on Startup Validation
- My Analytics
- Dominic Patmore of Pilifter.com
- The Lean Startup
- Inform.ly
Transcript
[00:00] Rob: In this episode of Startups for the Rest of Us, Mike and I are going to be talking about whether or not startup validation works. This is Startups for the Rest of Us: Episode 142.
[00:09] Music
[00:17] Welcome to Startups for the Rest of Us, the podcast that helps developers, designers and entrepreneurs be awesome at launching software products, whether you’ve built your first product or you’re just thinking about it. I’m Rob.
[00:26] Mike: And I’m Mike.
[00:27] Rob: And we’re here to share our experiences to help you avoid the same mistakes we’ve made. What’s the word this week, Mike?
[00:31] Mike: I got an email from KISSmetrics regarding the app they develop called My Analytics. It allows you to take a look at all your Google Analytics data on your iPhone. Have you seen this yet?
[00:41] Rob: I saw the app. I had an early beta version of that, and it was pretty stripped down at the time. It looked cool but it didn’t do much. Is it pretty cool?
[00:50] Mike: I mean it’s probably pretty similar to what you saw. It does allow you to see at least the basics and stuff. What I liked was it allows you to go back and compare your data from yesterday or the current week against previous data points that they kind of hard code in. So you don’t have a lot of options in terms of saying, well, I want to compare three days ago versus 14 days ago or something like that.
[01:12] Some of those data points are hardcoded in. So you’ve got options for yesterday, this week and then you can go back either one week or two weeks etc. But you can’t just do like a date range or something like that and compare those. But it does allow you to do at least some minimal comparisons and pull the data right up on your iPhone, which I didn’t have a way to do that before.
[01:33] Rob: Yeah. It looks really nice. The screenshots looked nice and it’s free.
[01:37] Mike: Yup.
[01:37] Rob: No fee to use it. Pretty cool. We got an email from Dominic Patmore and he’s at Pilifter.com. He says, I’ve been listening to Startups for the Rest of Us since December. It’s comforting to know I’m not the only one seeking to create and maintain small businesses. I wanted to say that your advice helped me keep another entrepreneur from making a common mistake. I met with him yesterday and he was explaining his mobile app, which he felt was heading to version 2.0. And his aim was to get venture funding and get bought out.
[02:04] However, when I asked him how many users he had, he said zero. But he said he had a number of interviewees who would be willing to pay him. And then he gets to how we became involved. He says, hearing your voice in my head, I told him that the best thing would be to go back to those users and get them to pony up on a beta release. My reasoning was that if they so loved the idea, they would pay to see it go to version 2, 3 or 4. If no one would, then maybe it would be better to go back and see if the solution is actually addressing a real need.
[02:33] This was in a TechHub space, I think like a co-working space. And there was another developer next to us who jumped in to confirm what I’d said. Thanks to you, hopefully, I was able to steer someone away from putting more money into building something for no audience. Thanks again and keep up the great work. So thanks for writing in, Dominic. I love hearing stories like this. It’s kind of like second generation. We didn’t directly help someone, but it’s like an indirect relationship.
[02:52] Mike: Very cool.
[02:53] Rob: The only other thing is Drip continues to move along. We have 11 early access customers. There are 8 or 9 of them that are actually active. It’s getting to the point where we have enough people in there that it’s hard to tell what everyone is doing now. But we’re moving forward, and we keep getting emails from people wanting to use this. So that’s good validation for me.
[03:13] And we did push a big feature live today and that’s going to help with onboarding. It does something no other email app does. It allows you to do one click and have this whole template for an email sequence spit out, and then you just kind of fill in the blanks. It’s a neat feature and it’s finally out. And then we’re working on split testing next, which is really the last major feature we’re going to implement before launching to the world. So we’re about a month out. Every week the early access users grow, and so I’m excited to move forward.
[03:41] Mike: That’s cool. It’s great to hear that you’re moving forward. Personally, for AuditShark, I feel like I’m moving backwards cause things are just going sideways.
[03:48] Rob: Is that right? What’s the problem?
[03:49] Mike: It’s not even just one thing. If it was just one thing I could deal with it; maybe two or three would be all right. But it just seems like all these things are moving in all different directions all at once right now. It just feels like nothing is going right. For example, the automated upgrade system and all the different things that I had in place that used to work are now just not working for whatever reason. So whenever I go to do a deployment, basically the entire system stops functioning. And then I had a system for sending emails so that when audits would run, you’d get emails from them. That’s not working anymore.
[04:24] It’s hard to get to the bottom of all of it because there are so many different, you know, I call them dependencies for lack of a better way to put it. All these different things that I didn’t necessarily write and that I don’t fully understand how they work. So debugging them and troubleshooting them just takes forever.
[04:39] Rob: Yeah. That’s a real bummer. So is it stuff that developers who are still working for you wrote, or is it past developers?
[04:45] Mike: Well, some of it is libraries. So for example, there’s a scheduler library that is supposed to run things on a scheduled basis. And when it fails, I don’t really get any feedback about why it’s failing or whether or not it’s continuing to run, and it’s just throwing exceptions in the middle of it.
[05:00] So, I still have to dig into it and try and figure out what’s going on. Like I said, it’s all these different moving parts. They’re all in the air, and trying to figure them out is taxing to say the least.
[05:11] Rob: And the bummer is you’re like you’re in early access and so you’re trying to manage that process as well as this technical side.
[05:17] Mike: Right.
[05:18] Rob: I mean it underscores how many things you have to juggle as a software entrepreneur that you wouldn’t have to juggle if you were launching, say, an information product or just a new blog or something that relies more on content. You’re not going to tend to hit all these snags that we do, because software is very complex.
[05:35] Mike: Right.
[05:36] Music
[05:38] Mike: This week’s episode is kind of inspired by a blog post that Dan Norris wrote about whether startup validation is legitimate or not. You know, he’s got some issues with Eric Ries’ book called The Lean Startup. You look at the startup community and generally, I’m kind of paraphrasing and overgeneralizing here, but the startup community generally says if you read this book and do what it says, you’re generally going to be successful. But there are a lot of startups out there who’ve tried this approach and just failed miserably. And the question is, does that mean it doesn’t work? Dan has got a pretty lengthy blog article where he’s basically calling it out and saying, I tried this and I tried this and I tried this, and this stuff just doesn’t work.
[06:21] Rob: Right. He gives a bunch of numbers. He shows how he sent out surveys and gives you exact percentages. He shows the press coverage that he got and talks through how many people he got to sign up for a launch list and all that. So it’s a nice in-depth piece. We’ll obviously link it up in the show notes. But the title is “Is Startup Validation BS?” So we want to kind of talk through those points.
[06:40] I have to be honest, my views on the whole Lean thing are constantly evolving. I’m still critical of people in a movement who are doing a lot of teaching but they’re not launching startups. In any field, there are academics who are studying and researching and then there are people on the ground who are actually doing, and there needs to be collaboration between those two sides.
[07:01] But the startup space has never really been that way. Academics have always tended to be behind. Like the MBA programs tend to be behind where the actual founders are. Maybe the Lean Startup movement is a divergence from that, and we’re going to fit the model that psychology and medicine and computer science, you know, those fields have fit for years. But it’s still a challenge, as someone with their feet on the ground who knows the tactics that work and the ones that don’t, to hear this movement that I definitely think has positive points.
[07:33] But to hear some people take it on as like the gospel or the way to do it and that if you do this, you’d be successful; if you don’t do it, you’ll fail type thing. That’s where I say like my views on Lean are evolving. I’m not pro Lean or against Lean. I love some of the tenets of it. And I think other parts of it are questionable and I don’t do them in my own startups.
[07:52] Mike: I think one of the things that comes up in my mind is that there are a lot of these classic mistakes that you can make when launching a product that have absolutely nothing to do with startup validation. And although, you’ve gone through the process of validating your ideas, it doesn’t necessarily mean that you’re still doing the right things. Back in episode 121, we’d talked about seven catastrophically common launch mistakes.
[08:15] And you go through that list and you look at those, and even if you validated your startup ideas, if you start going through and you commit all of these launch mistakes, it doesn’t mean you’re going to be successful with your products. I mean it’s very difficult to be successful when you’re shooting yourself in the foot every time you try to take a step.
[08:30] Rob: Think about building a successful product as kind of this triangle or three-legged stool: you need product as one of the legs, you need market as another, and you need execution as the third. So you can build an amazing product, but if it doesn’t actually solve a problem that anyone needs, you’re going to fail, period. If you use Lean Startup or another method to validate it upfront, then hopefully you have found a problem that actually needs solving.
[08:58] And then you actually need to find a market that you can market to. So that’s the second part of it. Then you need to actually execute well. In other words, don’t make the common startup launch mistakes, execution mistakes, marketing mistakes. So there are a lot of elements that go into it. Validation just says, hopefully, that you have a good chance of people actually wanting this problem solved, and hopefully the solution that you’re looking at is going to solve it.
[09:24] But that’s how I view it. I don’t think you can get to a point of 99% certainty just by doing pre-validation. It gets you above the 5% certainty we used to get to. Maybe it gets you to 60% or 70%, so that if you execute really well on both the product and the marketing side and don’t make a bunch of common mistakes, then you have a pretty good chance of success.
[09:46] Mike: Yeah. I mean I think it just points out the fact that there are no guarantees and that’s really what it boils down to. Even if you do all the right things, you’re not necessarily going to be successful. There’s always this asteroid that can come out of left field and blow your startup away. But even accepting that you can go through this process, try to validate your idea and then still fail at it because just not enough people are interested in buying whatever it is you have to offer.
[10:08] So one of the things that I wanted to point out is a kind of classic assumption: that if you have a very large mailing list when you launch, you’re going to get lots of customers very, very quickly. And I don’t believe that’s the case. I mean, if you have 4,000 people on a mailing list, a 1% conversion rate will get you 40 customers, or 20 customers at a half-percent conversion rate.
[10:34] But I think that assumption is not true. There are some fallacies associated with assuming that having a large mailing list is going to equate to a large number of customers. And part of that is because of your conversion process. You have to be putting in good calls to action. You have to be telling them exactly what it is that you’re offering, why it solves their problem, how it solves their problem, how much money it’s going to save them, how much it’s going to be worth to them.
[10:58] You can’t just blast this out there to a giant mailing list and expect all these people are going to buy your product on day one even if they said they would originally. You really have to educate them about what the product is going to do for them and how it’s going to benefit them. You know blasting out this email is not enough.
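The back-of-the-envelope math Mike walks through can be sketched as a quick calculation. The list size and conversion rates are the episode’s illustrative numbers, not real launch data, and the flat-rate assumption is exactly the fallacy he is warning about:

```python
# Naive launch estimate: "list size times conversion rate", the assumption
# discussed above. Real launches depend heavily on the conversion process.

def expected_customers(list_size: int, conversion_rate: float) -> int:
    """Assume every subscriber converts independently at one flat rate."""
    return round(list_size * conversion_rate)

list_size = 4000
print(expected_customers(list_size, 0.01))   # 1% conversion -> 40 customers
print(expected_customers(list_size, 0.005))  # 0.5% conversion -> 20 customers
```

Even taken at face value, the arithmetic shows how modest the payoff of a “large” list is at typical conversion rates.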
[11:15] Rob: I don’t think Dan just sent out kind of a random email and said, hey, buy the product. I don’t recall if I was on the launch list for his original product, which is called Informly. But he’s a good content marketer. He’s a good copywriter and he executes pretty well on this stuff. So I wish I knew a little more in depth. That’s actually one thing about his post: I like the way he framed it and I like the way he gives a lot of data, but he doesn’t really talk about what he did to turn people into paying customers.
[11:44] So some questions that come into my head are, did he launch with a single launch email? Did he build up this list of 1,000 or 2,000 people and then just send them one email saying, here’s the product, here are the benefits, come and check it out? Or did he do a long sequence? I typically do a four-email sequence during launch. I also typically email over the 6 to 12 weeks beforehand. If you’re working on a product for 8 months, I would probably touch base with them every couple of months during that time with a progress update to build anticipation and to actually see the response from that.
[12:16] If no one emails you back, then it’s not quite a dead list, but it’s people who are not really interested in what you have. But if you get 20 responses with people saying, this looks fantastic, I can’t wait to get in, then that’s a whole other story, right? I think the other thing that comes to mind is that Dan has a decent personal brand. He has a podcast and an email newsletter of 5,500 people. He has a lot going on. And I’ve found that sign ups from my personal brand audience don’t typically result in trials or sales of my software.
[12:47] I mean, Mike, we have the podcast. I have the blog. We have MicroConf. We have the Micropreneur Academy. All of that is a pretty sizeable footprint, and I would say I have tens of HitTail customers out of it. There are four figures of HitTail paying customers, and I have literally in the tens, I don’t even think it’s 100 people, from my personal brand audience who signed up.
[13:10] So if Dan’s list was seeded with a lot of people from his own audience and not from the exact target audience who would actually need this, then that could have been another issue, because he had a reasonable-sized list. I think it was a thousand. Actually, that’s another question. He mentioned that he has a 1,000-person beta list and a 1,200-person launch list.
[13:30] Do you know what that means? Because I’ve never separated those two. I think that you should have them all as one 2,200-person list, and you should pick out a handful of beta testers from that who can run through it. And everyone else is just given basically the early bird, kind of pre-launch, discount to come use the app.
[13:47] Mike: I think that what he meant by that is that he had a 1000 people sign up for the beta over the course of three months and go through it. And then there were another 1200 people that entered their email address to be notified of the launch. I mean I’m kind of in your boat. I’m not real clear what he meant by that. But I got the impression that he kept two separate lists. He drew the line in the sand at some point and said okay. Everyone before this point is going to be a member of the beta. Everyone after this point is going to be notified of when the launch actually happens.
[14:17] Rob: Got it. That makes a lot more sense. So, that would be something I would have done differently then. I’m presuming that during the beta that perhaps it was free or he had a free plan or something like that. And that’s one of the catastrophically common mistakes we talked about. He did mention a free plan in the post. He had 4,000 free users and only 15 converted to pay. And that’s the problem with free plans is you give people an alternative. So even if they need their problem solved, if you give them a kind of way to half solve it then they would tend not to want to pay for it.
[14:48] Mike: One of the things that Dan points out is that he went through and did some customer development and started asking people questions in surveys. And one of the things that strikes me is that listening to people who aren’t paying you money is not going to get you anywhere. Because nobody wants to hurt your feelings about what your product is, everyone is going to tell you, this is a great idea, I would pay for that.
[15:09] But the fact is until you ask them for money, there’s this barrier to entry. If they’re willing to give you the money then you’re probably going in the right direction. But if they’re not willing to cut you a check or give you a credit card then you’re probably on the wrong track or you’re talking to the wrong person. And my inclination tends to be that you’re probably talking to the wrong person as opposed to you’re on the wrong track with it.
[15:33] Because you can usually talk to those people and say, “If you’re not willing to give me a credit card, why is it that you’re not willing to?” It could very well be that, for them, it’s not that big of a pain point. And that’s the hurdle you have to overcome. If you can’t overcome that hurdle, then either it isn’t a real pain point or you’re talking to the wrong person, and you need to figure out how you get to talking to the right people.
[15:54] Because until you do, you’re not going to really find out what their pain point is. You’re talking to people who may have an ancillary pain point, but it’s not the one you’re trying to solve. Your marketing messages are getting confused, you’re talking to the wrong people, and you’re never going to get anywhere when you do that.
[16:10] Rob: I don’t think I feel as strongly as you do about having people pay you money upfront. I mean, look at what we’ve both done with Drip and AuditShark. I have a list of emails put together and I’ve talked to people about the potential price of what that might be. And I’ve shown them the potential features and the potential value proposition, and there’s a lot of enthusiasm. And there are people wanting to get into the early access.
[16:34] When I mention the price it will be after the trial, some people have said that’s too expensive, and then you let them go, basically saying, that’s fine, you’re not a good fit. But others, you know, they haven’t written me a check yet and they haven’t put their credit card number into the app yet, and I don’t think you need to go that far. I think in an ideal world you would. But I just don’t think that’s feasible when you’re doing kind of a larger launch like this and you’re trying to get thousands of people on the list.
[17:02] The best practice I would lean towards, the way I look at it, is that those first 10 customers are kind of the people that I interviewed and got on the list in order to validate the idea enough to start coding. Those people I would consider asking, all right, would you send me a check or would you give me a credit card number?
[17:20] Mike: I totally agree with all of that. But what I mean by listening to people is, say you’re asking them questions like, what would make you pay for this? And they say, well, if you build feature X. And then you go off and build it for them, and they say, okay, now I’ll give you a credit card. You’ve solved that particular problem for them, but you haven’t necessarily solved a problem that is the same for a lot of different people.
[17:46] Really, what I meant by trying to figure out who’s paying you money versus who isn’t is that when you start listening to people who aren’t paying you money, and you listen to too many of them, then you’re not necessarily solving any one person’s problem, or a group of people who all have the same problem. You’re solving individual problems for different people, and you’re not solving the same problem for lots of different people.
[18:06] Rob: Yeah. That totally makes sense. And that’s where problem-solution fit and then product-market fit need to come in sequence. Because if you try to go sell to a bunch of people before you’ve solved the major pain point that people are willing to pay for, you’re doing things out of order and you get scattered. As far as I know, Dan is not a developer, so I think he was outsourcing development. You just can’t move fast enough, unless you have a whole team of people, to hit multiple markets and solve multiple problems at once. And so that could potentially be something that impacted him here.
[18:37] Mike: I think we’ve talked a little bit before about asking for credit card numbers upfront. I just kind of want to emphasize the point a little bit. It’s really more of – I’d call it more of a psychological barrier than anything else. Because when you’re trying to convert people from a free plan into a paid plan, or you’re trying to convince them that after signing up for the product that’s when they should give you the credit card after like a 14 or 30 day trial or something like that.
[19:02] Basically, what you’re doing is putting another artificial barrier in there for them to continue using the product. And really, this is a big mistake, and the reason why, like I said, is rooted more in psychology. I’ll use an analogy here. If you look at enrollment in 401(k) plans, you’ll find that the participation rate is commonly more than 90% where automatic enrollment is used. And that’s very analogous to asking for a credit card upfront.
[19:27] If you look at companies that have the opposite, where you have to actively try to enroll, you have to go to the HR department and say, I’d like to sign up for this, I think the statistic is that enrollment is 26% to 43% after six months. And that’s a huge difference. I mean that’s almost four times the difference between them. So what you’re really doing here is trying to give the user value and show them the benefits, and not have them do anything extra to maintain those benefits. Your software shouldn’t be any different.
[20:00] And stop asking them if it’s okay for them to continue receiving the benefits of your software. They’ve already made the decision to sign up, so put that in as part of the barrier to making that decision and couple them together so they don’t have to make another one down the road.
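The gap Mike describes can be put in numbers with a small sketch, using the participation figures quoted above (illustrative figures from the episode, not a formal study citation):

```python
# Participation under automatic enrollment vs. active opt-in, using the
# 401(k) figures quoted in the episode.

auto_enroll = 0.90                     # participation with automatic enrollment
opt_in_low, opt_in_high = 0.26, 0.43   # six-month participation with opt-in

print(f"{auto_enroll / opt_in_low:.1f}x")   # ratio against the low end
print(f"{auto_enroll / opt_in_high:.1f}x")  # ratio against the high end
```

Against the low end the ratio is roughly 3.5x, which is where the “almost four times” figure comes from; against the high end it is closer to 2x.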
[20:13] Rob: Yeah. I think that’s a part of execution, and I think it’s not the only way you can do it, but it’s definitely the rule of thumb that when in doubt, ask for a credit card upfront. Another thing, as I was looking through Dan’s blog post, I saw that I think he might have pivoted too soon. It looks like Informly was launched maybe six weeks before he pivoted into something else.
[20:32] And as I thought back, I realized that when I acquired HitTail and re-launched it, it already solved the problem. And even with the marketing knowledge that I have, it took me five months to really learn how to market it and how to find the people that really needed this product. And that doesn’t even include the problem-solution portion. While I don’t think it’s going to take five months for everyone, I was at it five months full time, and that’s what I was focused on, aside from a couple of other smaller tasks I was doing.
[21:06] But in general, it was pretty much five months of my time, full time, to really get HitTail to where it started to scale. It was growing slowly during that time. But if you expect something to take off fast in six weeks, and you have a free plan and you’re not asking for a credit card upfront, and you’re not forcing people to make a decision, and you’re not trying to convert them into a paying customer, then it’s likely that even if you have built something good or almost good enough, you’re not going to convert people.
[21:30] I actually have a friend who’s in my mastermind group. He wrote me an email and said, I tried the content marketing version of Informly and the product itself was pretty rough. So, meaning that the product execution wasn’t where it needed to be for this person to pay. And then he said, there was zero effort made to convert me into a paying customer. I didn’t get any emails. There were no calls to action. There was no credit card upfront. Basically, it was some classic mistakes that we see in the startup space.
[21:59] So again, the point of this podcast is not to sit here and point out all the things that Dan did wrong, because that’s not that helpful. I think what we’re trying to say is, in light of whether startup validation works or not, it’s a really hard question to answer. But keep in mind that validation really only speaks to the product, right? Does it solve someone’s problem? Everything else, the marketing and then the actual execution, is another thing entirely.
[22:26] And if you don’t execute really well on that stuff, then you can have all the validation in the world but the product still might not work. Or it may work, but not as fast as you want it to.
[22:35] Mike: We’ve mentioned this in the past, but there’s the long, slow SaaS ramp of death, from the CEO of Constant Contact. She spoke at the Business of Software conference this past year. And that’s just accurate as all heck. I mean, it takes a long time to get to where you want. There are certain things that will give you a step-function increase, and there are other things that will give you a very, very minimal increase; they work, and they work well, but because it takes so long to get customers until you achieve massive scale, those things are not going to measurably move the needle in very, very small time slices.
[23:11] Rob: And that’s why SaaS is so brutal, right? And that’s why we talk about having small wins where you do the one-time software download, the WordPress plugin, the mobile app, or maybe the info product, the training course, the e-book. Because those are things that you can kind of get out the door. They don’t require a ton of support, and you can get early wins to build your confidence, so that you can later take on the long slog of actually building the SaaS app, which, as we’re finding more and more, takes months if not years to really grow into a sizeable business.
[23:44] Mike: You know, one of the things that Dan mentioned was that coverage in the tech press doesn’t work. If you’re looking to press coverage to get your customers, that’s probably not going to help very much. It will give your product and your website some validation, and it will give you some SEO juice. But I don’t think that the traffic you get from some of those types of sources is going to convert into paying customers very quickly.
[24:05] I mean, it’s great for those backlinks. It’s great for the SEO. But in terms of getting targeted traffic, what you’re basically ending up with is all these people who are interested in new startups and new technologies. They’re going to come check it out and they may sign up, but they’re not necessarily really interested. It’s almost like saying, hey, check out this tech demo, and sending it out to Reddit. People are going to come and you’re going to get this massive spike of traffic, but they’re not going to stick around, because it’s not part of their job. It wasn’t a problem they were trying to solve to begin with. They were just interested in something to use as a diversion from their daily life.
[24:42] Rob: Remember when I got quoted in the – was it The Wall Street Journal?
[24:44] Mike: Yeah.
[24:45] Rob: As best as I can tell, because they didn’t actually send a direct link, it was just a mention, that resulted in right around seven paying customers for HitTail.
[24:53] Mike: That’s awesome.
[24:55] Rob: Which is a rounding error. It’s completely inconsequential. I’m not saying press doesn’t work. I have heard of businesses, there was one that was kind of a keyword tool and analytics tool for the iOS App Store, and they got mentioned on TechCrunch. And they got $25,000 in recurring revenue based on that mention. That basically built their business. So yes, it can happen if your audience is heavily, heavily aligned with that press outfit.
[25:22] But I’ve heard several entrepreneurs talk about this, and I’m the same way: I would rather be on Lifehacker, or for me it’s the SEOmoz blog, than be in the New York Times, because those first two are so much closer to my audience. Getting the small tech audience, and by small I mean tens of thousands of people, is infinitely more valuable than getting hundreds of thousands or millions from the general public or the general press.
[25:50] Now, Dan wasn’t on Good Morning America. He was on Mashable, The Next Web, This Week In Startups, Shoestring and some other startup-related outlets. So they were reasonably targeted. They were at least in the tech space. And I don’t know how many sign ups he got. He said he had zero paid users. See, that’s a problem though. In the post, he said, I had zero paid users from those traffic sources.
[26:13] Now, if he was already launched, then you can tell that, but if you aren’t launched, then you really want to know just how many email addresses you got. Because getting from that email address to the paid customer then depends on your execution and your ability to close those sales. It doesn’t depend on just the validation piece of it.
[26:30] Music
[26:33] Mike: One of the things that I really liked about Dan’s blog post is that he showed some results from his targeted surveys. He showed the number of people who said yes, I would pay for this, or no, or I would possibly pay for this, and then he showed percentages of how those things stack up. And the one thing that struck me about the survey was that the numbers were, I’ll say, fairly low. It looks like fewer than 100 people were surveyed as part of this.
[27:00] I don’t think that going and asking a hundred people, or fewer than a hundred people, is a bad way to go. But I think you also have to keep in mind that when you’re asking fewer than 100 people, would you pay for this, yes or no, people are going to tend to lean towards “yes, I would pay for this” or “maybe.” So if you start combining the yeses with the maybes, you end up with a number that is, I’ll say, artificially inflated.
[27:25] Because the possiblys, or the maybes, you can’t count them one way or the other. Maybe you can split them down the middle, but you’re still just guessing. I feel like surveys should guide your views and test your assumptions, but you also have to be very, very cautious about the conclusions you draw from some of these responses. If people are paying for it, then that helps out a little bit.
[27:46] Because if they’re paying for the product, then any new features they’re requesting are things that are going to keep them. Versus people who are not paying for the product, you really need to find what their problem-solution fit is to get them and convert them into a paying customer. And testing some of those assumptions would be helpful, but you have to figure out exactly what the pain point is that you’re solving. And I feel like that’s what the survey should be for. And it looks to me like the surveys were, I’ll say, a little bit squishy in terms of what they were asking.
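Mike's point about splitting the maybes down the middle can be made concrete with a little arithmetic. This is a hypothetical sketch, not anything from Dan's post: the weights and the respondent counts below are illustrative assumptions.

```python
# Rough sketch of weighting stated survey intent, since "maybes"
# can't be counted one way or the other. Count a "yes" fully,
# split the maybes down the middle (as Mike suggests), and ignore
# the "nos". All numbers here are hypothetical.
def estimated_interest(yes, maybe, no, maybe_weight=0.5):
    total = yes + maybe + no
    if total == 0:
        return 0.0
    return (yes + maybe_weight * maybe) / total

# Hypothetical survey of 80 respondents:
rate = estimated_interest(yes=17, maybe=25, no=38)
print(f"Estimated interest: {rate:.0%}")  # (17 + 12.5) / 80 -> 37%
```

Even with a conservative 0.5 weight on the maybes, the estimate is still stated intent, not behavior, so it should be treated as an upper bound rather than a forecast.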
[28:16] Rob: Yeah. And for the record, this is when Dan pivoted from Informly, which was an analytics aggregator, into a content marketing analytics app. And I actually loved this idea, to be honest, because we’re doing content marketing with HitTail and I’m working on it for Drip. And frankly, it is hard to track. Google Analytics does not do a very good job of that.
[28:35] So I would absolutely pay for something that was easy to set up and that worked. When I looked at Dan’s record of people who paid for it, he said that 22%, which is like 17 people, said that they would pay for it. And then only three people signed up. One canceled and two weren’t using it. That’s where you basically hit the pavement and you email every one of these people and say, why didn’t you sign up? What doesn’t it do? What kept you from doing it?
[29:00] Because I think that that one was a winner. I really do. It’s my personal opinion, so it’s not worth very much, but I would pay for something that helped me do a better job of tracking content marketing than Google Analytics does. And so, at that point, I wouldn’t have stopped. I would have continued with the pursuit of emailing everyone and finding out if I had not communicated the value proposition properly.
[29:21] Meaning that if I only got three sign ups out of this whole list, then perhaps my positioning or the description, my headlines, you know, the value I was going to offer, wasn’t good enough, or the product itself wasn’t good enough. It wasn’t solving their problem yet. And then figure out what to do, what to iterate on, in order to get to the point where I was solving one of those things. Personally, like I said, I think this is a problem worth solving, the content marketing analytics thing.
[29:48] And I don’t know of a small, inexpensive SaaS app that does it. I’m sure there are $500-a-month packages that do it, but there’s no one in the startup space that’s doing it.
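The gap Rob is describing between survey intent and actual sign-ups can be read straight off the numbers he quotes. The implied survey size below is an inference from "22% was about 17 people," not a figure stated in the post:

```python
# Back-of-the-envelope funnel from the figures Rob quotes:
# 22% of respondents (about 17 people) said they would pay,
# but only 3 actually signed up.
said_would_pay = 17
stated_rate = 0.22
implied_respondents = round(said_would_pay / stated_rate)  # inferred, ~77 surveyed

signed_up = 3
intent_to_signup = signed_up / said_would_pay  # share of "yes" respondents who converted

print(f"Implied respondents: ~{implied_respondents}")
print(f"'Yes' respondents who actually signed up: {intent_to_signup:.0%}")
```

Roughly one in six of the people who said they would pay actually did anything, which is exactly why Rob's next step would be emailing every one of them to find out where the drop-off happened.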
[29:58] Mike: So what are your thoughts on Dan’s comments about building the MVP? He says that it’s a lot harder than it sounds.
[30:03] Rob: I think he has a valid point. He basically talked about how Eric Ries talks about building a minimum viable product and kind of implies that it’s easy, or maybe that’s just what it has come to be known as: an MVP is something you can do over a weekend and then iterate on. And that’s the thing. If you take that very literally, then maybe you do release before it’s a solid product. And maybe your failure on the product side is because you released too early.
[30:28] So, yes, I actually think he has a very valid point here that it’s very hard to know whether an MVP is good enough. To be honest, I think this is why you have to let a few people in at a time. And you have to find the people who, when you talk about the single problem you are going to solve, and there should be only one when you start, are the ones whose eyes light up, or who reply to your email when you send it and are really excited.
[30:55] And then you say, okay, Person #1, I’m going to let you in and you’re my first early access customer. Try the app. Does it suck? Is it good? Tell me what it needs to do. Then let Person #2 in once you’ve built all of Person #1’s features. And to do this, it requires a lot of time, a lot of iteration, and quite a bit of patience, as well as the ability to tell someone, you know, I’m not going to go in that direction, I’m not going to build the features you want, you’re not my ideal customer.
[31:25] So, like Dan says, it’s not as simple as a lot of people expect, this whole building the MVP and then iterating on it. It’s not as simple as building up a big list, throwing everybody at it and then iterating, because by that time you’ve burned through your list. You had a 2,000-person launch list and you sent them all there, but if the product didn’t solve the problem, then poof, you’re done. You have to start over.
[31:46] And that’s where I think you really need to trickle it out very slowly, which is the way I’m doing it with Drip right now. After a couple of months of basically having a fully functional MVP, we still only have 10 active people on the system, because we want to build out those features that these people really need before letting in that next group.
[32:04] Mike: Yeah, I totally agree. I mean, this is probably the most frustrating part about it. And I feel like Lean Startup, the book itself, really glosses over the fact that it’s such an iterative process and it’s time consuming. That’s the part that I think is so frustrating. It’s just so time consuming to work with individual customers in a way that in no way, shape, or form scales. And developers are like, the code is done, it works, let me just throw it out there, and I don’t have to worry about bugs in the product.
[32:35] Maybe you don’t. Maybe everything is functionally complete. But if it doesn’t do what people expect or what they need it to do, then it doesn’t really matter. And you need to somehow get through that process, take a step back, and work through things slowly with people, making sure that you’re solving the broader problems. And once you do that, then you can throw it out there. But until you get to that point, it’s very difficult, because as you said, you’ll burn through your list very quickly and then you’ve got to start over.
[33:05] To me, I think that’s probably the frustration a lot of people run into when they try to build an MVP and say, it’s working, let me throw it out to people and try to get as many customers as I can as quickly as possible. It doesn’t really work that way. And I don’t think I’ve ever really seen it explained as, hey, you really need to work through this slowly as opposed to just throwing it out there.
[33:26] Rob: And I think, in Dan’s defense, that’s the point he’s raising. You read the Lean Startup book, you follow the kind of Lean Startup movement and the folks who are teaching it, and it’s very high level and it’s academic. They’ve never said the things we are saying right now on this podcast. They never say titrate it out. They never say build the MVP and have a few people using it, you know, exactly the steps we’ve just laid out, because a lot of them are not actively on the ground developing products.
[33:52] And that’s okay. It’s just not as tactical as it might need to be for someone like Dan, who isn’t an experienced startup founder. He probably hasn’t launched a startup, like a software product, before. When he reads something that is as high level as The Lean Startup, he still needs much more detailed guidance. And frankly, I don’t know where you go to get that. I mean, I think that’s why we started the podcast, that’s why we have the Academy, that’s why we have MicroConf: to provide that kind of education.
[34:19] Not just in the Lean Startup space but in general, for software founders who want more detailed, actionable, step-by-step stuff. And there are certainly courses out there on Udemy and that kind of thing. But there’s no real single place that I know of where you can say, all right, I’ve read The Lean Startup, now how do I actually implement it? And I think Dan took some hard knocks himself basically learning that, right?
[34:42] He learned step by step that parts of Lean Startup are harder than they might appear, or harder than they were presented in the book. I really have to thank Dan for putting this blog post together, for doing the experiments, for putting all the detail in here, and basically for raising the issues so that we can discuss them intelligently on the podcast. Cause I think this raises the level of dialogue about the whole topic. Both of us wanted to thank Dan for putting this post together, for sure.
[35:09] Mike: Yeah, definitely hats off to Dan. If you look at all the different things that have been written about the Lean Startup, there’s all this stuff that says it’s great: you do this, you do that, everything is hunky-dory. The reality of the situation is that once you start putting feet on the ground, it’s not that simple. It never is when you’re trying to go from a book to reality. There’s a difference between theory and reality. So going through it, Dan has, I don’t want to say an alternate point of view.
[35:36] But a lot of the things Dan expressed are points that I share, and some are frustrations that I share, in looking at the Lean Startup, because it’s not that simple. And the book tries to portray it as, I don’t want to say a step-by-step guide, but it’s like, if you just do these things, take into account what your customers or potential customers are saying, pivot as quickly as possible, and iterate, then everything is fine.
[36:03] And it is just not that simple. It glosses over all of the details and all of the tactics that you actually need to execute on. And I feel like it’s just too high level in many cases to be applicable to all startups, even though there are definitely pieces of it that apply to very many of them. It’s just not tactical enough, and it doesn’t provide you enough information to decide when what you’re doing is completely off the rails from what they recommend.
[36:32] But at the same time, that doesn’t mean you’re doing the wrong things. There are no hard guidelines in there for when you should listen to it and when you should ignore it.
[36:38] Rob: Yeah. As I’ve said from the start, my views on Lean are certainly evolving. I think there’s a lot of good that can come out of it, and there’s some good high-level information. There’s also that lack of detail you mentioned. And then there are some things that I think are questionable or have not worked in my experience at all. So if you’re interested in checking out Dan’s app, it’s at inform.ly, and he’s basically pivoted to something that I think he’s now having success with: beautiful client reports for freelancers and web agencies.
[37:07] So inform.ly connects to Google Analytics and other apps, and then it provides a monthly report that you can use as a freelancer or web agency to keep your clients happy. So we definitely wish Dan the best of luck with this product moving forward, and we hope he continues to update us on his blog at thedannorris.com.
[37:26] Music
[37:29] Mike: And if you have a question for us, you can call it in to our voicemail number at 1-888-801-9690, or you can email it to us at questions@startupsfortherestofus.com. Our theme music is an excerpt from “We’re Outta Control” by MoOt, used under Creative Commons. You can subscribe to us in iTunes by searching for startups, or by RSS at startupsfortherestofus.com, where you’ll also find a full transcript of each episode. Thanks for listening. We’ll see you next time.
JC
I have begun using a service called Celery, which integrates with Stripe and allows “pre-orders” for product/service validation seamlessly.
They started as a bridge from Kickstarter campaigns to official launch.
Simple landing/sales pitch page with easy order form.
Do you think something like this helps or hurts startups?
JC
Erik Starck
Of course it works, but you’re never guaranteed anything in business. What would be the opposite of validation working? That you should not validate at all whether you’re on the right path?
Let’s compare a set of startups that actively work with validation with another set that hides in a basement for 12 months and then release a mega-product to the world and see what group has the largest percentage of success.
I know where my bet is.
But maybe I missed the point of the argument.
I do agree though that MVPs are d*mn hard. I think it’s easy to miss that the main purpose of an MVP is to validate your business assumptions – not generate sales. And, even if that sounds good and fine, getting a good grasp of your business assumptions is not as easy as it sounds.
Mike Taber
The point was that it’s easy to think you’re doing startup validation when you are involving customers in the process, only to find out later that you didn’t ask the right questions.
The problem I see with startup validation is that it’s a lot more difficult than it sounds. “Just go talk to customers and ask what they want.”
It’s not that simple. It almost never is. There’s a big difference between what people will tell you they do and what people will actually do.
Desire is not the same thing as behavior. Many of the tenets of startup validation sound like common sense on the surface, but there are an infinite number of subtle nuances that are difficult or impossible to convey until after you’ve been in the trenches.
Hindsight is (unfortunately) the best teacher for startup validation. Every time you test something and make decisions based on that test, if you find out later that it didn’t work, then you need to go back to your startup validation process and figure out what the right questions would have been, because you apparently didn’t ask them.
This is where the rubber meets the road, and this is where I have a problem with the Lean Startup. It’s not that it doesn’t work. It simply doesn’t convey the difficulty and complexity of this step and people take it as gospel without digging in.
Of course, the other side of the coin is that every single product is different, and for someone to try to come out with a definitive guide for that kind of stuff is not only enormously difficult, it borders on impossible. The Lean Startup is great for what it is, but there’s room for improvement, and I think that boils down to looking at specific situations and types of products.
Jian
Seconding Mike’s opinion on startup validation, well said!
So the best way to validate an idea, I guess, is just to do it. Have people vote with their wallets. If the idea is making money, then it is valid.
Terry Lin
Hey Rob/Mike –
I have used Celery (and know the founders) but in an e-commerce fashion. I had a sample product made for $40 US or so, took some pictures, linked it with Stripe and collected about $1,000 worth of orders.
Celery does allow you to post some HTML into their default landing page and lets you collect the credit card info at the end.
You can actually charge the credit cards later, charge just a deposit, and even offer discount codes. It’s not great for recurring apps, as it’s mostly made for e-commerce folks, but as far as simplicity in collecting pre-orders, nothing really beats it.
You can see the sample landing page I used here: https://www.trycelery.com/shop/balla
– Terry
na
Startup validation never claimed to be ‘the answer’; it is a method for minimising risk and refining the product to move it closer to a valuable solution.
If people are using it with the assumption that it is a recipe for success the problem is with them misunderstanding the purpose of the recipe, not with the recipe itself.
The premise of the podcast is flawed; it’s like asking if a shelf works. It does what it does. You shouldn’t complain if you attempt to use it as a time machine and discover it doesn’t work.