It was a million-dollar mistake.
Several years ago, I ran a startup that offered citation and writing tools to help students with research papers. At one point, my co-founder and I had an idea: What if we built a research search engine to help students not only at the end of the writing process, but also at the beginning?
We invested over $1 million indexing our citation data, researching search technologies, and building out the service. Launch day came, and we proudly introduced the new feature by placing it prominently on the website. We waited for the crowds to roll in and…
Less than 1% of our users clicked on it.
As it turns out, most students who used our site weren’t looking to do research. And when they were, they’d go somewhere else—a little ol’ website we all know called Google.
What I’ve learned since then, and what I do now every time I have an idea for a new business or product, is to find the easiest possible way to learn whether people actually want it.
I don’t build the feature. I don’t invest in the service. I simply place a link or button on the site that leads to a theoretical feature and see how many people click on it to validate whether it’s even worth pursuing.
In product management, this is called a “painted door test.” The concept is this: You don’t build a door, hoping people will walk through it; you take the much easier route of painting one on a wall and watching to see whether people even attempt to open it. You can introduce a painted door test to a small portion of your audience to get the key insights you need.
In our case, we could have simply put a link to the new search feature on our homepage. Anyone who clicked on it would see a simple landing page that said “Coming soon!” The vast majority of users who didn’t click on it would have revealed to us that this product wasn’t worth building.
This approach has saved me an immeasurable amount of time and money, but you don’t have to be a product manager to try it out. Anyone who wants to pursue a concept, or who’s trying to get a lot done with little time or resources, can use this approach to test and validate ideas quickly.
Think about the last time you sat in a meeting debating one approach versus another, or the last time you had an idea for an initiative, but you weren’t sure it would work out.
Oftentimes, these ideas are based on hypotheses. That’s great—those “I wonder if…” moments are where the best innovations come from. But how much time could you save, and how much could you learn, if, instead of debating the concepts, you collected data that revealed whether or not they would actually work?
Say you’re working on marketing copy for an email campaign. You could go back and forth with your co-workers on which approach or version to go with, or you could put two iterations of the text on your website and see which one users engage with more. Or, you could run a few Facebook ads that simultaneously test different versions of the copy. Those insights could help you quickly determine which copy is best for your email.
This can work for bigger decisions, too, such as making a call on which technology to use. Of course, you should lean on experience and rely on what a chief technology officer recommends. But in cases when there’s not one clear right answer, create a small prototype that uses the tech you’re evaluating and try it out for a week or two. Does it work as well as you thought? Does it scale? What are the unexpected glitches? With that sample set of data, you can make a much more informed decision.
Implicit in all of this is that some—in fact, many—of the concepts that seemed brilliant at first won’t have the result you wanted.
And while some may call those ideas failures, I look at them as huge successes. You’re learning what works and what doesn’t, without wasting time or money, and you can quickly move on to the next idea. The real failure would be sinking resources into a product or idea that no one wants.
Thankfully, I learned from my mistake many years ago. Here’s a recent example of how I put what I learned into practice: My current company, Solitaired, ties simple card games to brain training, and we had a hunch that users would want multiplayer functionality. That’s a pretty complicated build, so instead of embarking upon it, we simply added a button to the site. Turns out, less than 2% of our users clicked on it.
Was it a bad idea? The data showed us it wasn’t big enough to matter. But it wasn’t a failure at all. It was a sign that we can move on to the next idea without looking back—or losing $1 million in the process.
Neal Taparia is CEO and co-founder of brain-training company Solitaired.