When I changed from being a startup executive to an EIR at a venture capital firm, I didn’t realize how ill-equipped I was for my most important task: market validation. I had a general idea of where I wanted to start a business, and some good direct experience that suggested it was a viable market opportunity. But what I lacked was a clear process for validating assumptions with market data, and then adapting the idea based on the data.
I won’t pretend to be an expert, but I wanted to capture some tips that have worked for me so far, in the hope they can help another Boston entrepreneur.
#1 - Develop a Plan
Those of you who have worked with me know that I like to plan. Planning reduces risk, and reducing risk ships products. I’ll save you time reading and rereading The Four Steps to the Epiphany and Lean Startup in search of a process by telling you this: you won’t find one. So your first step is to clearly define your own process.
To start, define the following (the examples below are from SilverBack, circa 2000):
- Customer hypothesis - e.g. MSPs servicing SMB customers in financial and public sector markets.
- Problem hypothesis - e.g. deliver an appliance-based management platform to enable the remote delivery of IT services.
- Leap of faith assumptions - e.g. reduced margins driving VARs/resellers to seek recurring revenue, resellers seeking service & technology to enable delivery of recurring revenue services, SMB customers looking to outsource IT to reduce costs.
- Criteria for validating assumptions - e.g. 5 MSP interviews, 10 SMB customer interviews, 2 analyst interviews, 1 concierge-delivered service/experiment.
Once you have the what defined, you need to define the how. Define a pipeline of potential people and companies that can validate/invalidate your assumptions, and assign time frames to validating these assumptions. I also suggest you define metrics that allow you to judge progress over time. My primary metrics have been: # of assumptions validated/invalidated per week and % complete toward validating/invalidating hypotheses.
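For readers who like to track the plan in something more structured than a spreadsheet, here is a minimal sketch of how those two metrics could be computed. The `Assumption` record and field names are my own invention for illustration, not anything from a particular tool.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record of one leap-of-faith assumption and its status.
@dataclass
class Assumption:
    description: str
    resolved_on: Optional[date] = None  # date it was validated or invalidated; None while open
    validated: Optional[bool] = None    # True/False once resolved

def weekly_resolved(assumptions, week_start, week_end):
    """# of assumptions validated/invalidated in a given week."""
    return sum(1 for a in assumptions
               if a.resolved_on is not None and week_start <= a.resolved_on <= week_end)

def percent_complete(assumptions):
    """% complete toward validating/invalidating the hypotheses."""
    if not assumptions:
        return 0.0
    resolved = sum(1 for a in assumptions if a.resolved_on is not None)
    return 100.0 * resolved / len(assumptions)
```

Reviewing these two numbers each week makes it obvious when the pipeline of interviews has stalled and it is time to lean on your connectors again.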
#2 - Seek Support
If validation efforts were movies, there would be a long list of credits at the end. There is no way to run a validation process without talking to hundreds of people, most of whom you have never met before. Unless you are an industry icon, you are going to need help. There are two types of people you will need to support your work:
- Connectors - You know who they are in your network. You’ll need introductions to lots of companies and people, and so connectors will be essential to your success.
- Advisors - You’ll need trusted advisors to help make sense of the data flowing in. Find advisors that compensate for gaps in your skill set, and are willing to provide honest/direct feedback.
Be transparent with your supporters. In general, I've found that in addition to wanting to help you succeed, supporters often find the validation efforts intellectually engaging. But it’s hard for them to be intellectually engaged if you are secretive with your idea.
#3 - Use Both 1x1 and 1xN Validation
Jeff McCarthy at North Bridge recently drove this home for me. Some assumptions are best validated with 1x1 interviews, and others by people who can distill the views of 100s/1000s of customers. The latter group can include industry analysts, experts, leaders, and even some high-end sales reps. You should identify which assumptions require one or both types of validation, and seek out the right people to support the effort.
I’ll confess to having started with an almost missionary commitment to 1x1 validation, which I soon realized was a derivative approaching zero. Be smarter than me and use both 1x1 and 1xN validation from the start.
#4 - Identify Free Validated Learning
Free validated learning is to a validation effort what steroids were to Barry Bonds. Failed and recently exited startups are often mulch piles of free validated learning. Identify startups that have recently failed or been acquired that may have validated learning for you to leverage. They do not need to be directly related to your problem hypothesis, but should at least have been selling to the target market of your customer hypothesis.
#5 - Delay the Solution Hypothesis
I still struggle with this one. It’s easy to want to jump in and define the solution. But the simple fact is, until you fully validate your customer and problem hypotheses, it's likely that one or both of them are going to change. In addition to detracting from your validation effort, investing in the solution hypothesis can also skew your objectivity by making you emotionally invested in a point of view.
If you have figured out how to consistently apply this tip, please call me and tell me your secret. ;)
#6 - Define Clear Success Criteria
For years I have used two simple techniques in managing project risk: 1) define a plan B for my plan A, and 2) select a tripwire date and criteria by which I will decide whether to continue with plan A or switch to plan B. The upfront definition of a date and criteria ensures good risk management by allowing you to be dispassionate when it comes time to change directions. I've applied a similar concept to my validation effort, by defining clear success criteria upfront and a date by which I will pivot or persevere.
#7 - Be Open
When I first entered the Boston startup market, I was always intrigued by the “stealth mode” startups. There was something cool about stealthy startups, like they had been recruited for a secret mission by the CIA. But in 2012, while I understand the need to be secretive about technology and business practices, I do not understand the need to be stealthy about a startup idea. If your startup idea relies on being unique, you’re doomed (loose paraphrase of Eric Ries).
I’ve adopted a policy of being open with my ideas. There are probably dozens of people with the identical idea for my current or previous pivots. A person or company who may be a potential threat to my idea now may be a potential business partner in a future pivot.
#8 - Don’t Rely on Reports & Surveys
It’s tempting to want to send out a survey or gather analyst reports to validate assumptions (confession: I did it). While these may support a validation effort, they can in no way be used to validate or invalidate assumptions. I too wish there was a cheat code for this process. I think it was Guy Kawasaki who said that every analyst projects a new market to be $50B in five years. So my advice is: assume you have a $50B market and now go gather the real data. If analysts could accurately predict the market five years out, they would be managing hedge funds instead of being analysts.
#9 - Don’t Rush It
Antonio Rodriguez used a firefly analogy recently for my validation effort. Fireflies congregating in one area of the woods will eventually scatter in many directions. The key is to congregate with the other fireflies long enough to identify the best direction in which to head. I am impatient by nature, and want to get on with building cool stuff. But there is nothing less motivating than building the wrong cool stuff. So take a deep breath, relax, and keep validating.
I knew I hit Publish too soon...
#10 - Run Field Experiments
You can talk and talk, but at some point you are going to have to get in the field and run experiments. Since the area I am investigating is not teenagers chatting with 3D avatars, I unfortunately can't get by with A/B testing on the consumers of a website. Instead I've resorted to running concierge services in order to get validated learning.
I've been striving toward running experiments for the riskiest assumptions, but then devising the experiments in a way that pulls in as many other core assumptions as possible. Many thanks to the local companies that have been supporting me in this effort.