Idea Validation and Analyzing Results


Wow – it’s been a long process, but honestly, it’s been a blast to go through it with you, step by step.

I’ve said before that this example of Testing the Muse in Practice could have been done in a fraction of the time, as evidenced by my work on the Dogs of the Dow app in 2014, where an idea was hatched, tested and validated in less than one month.

However, to make this a true real-time experiment, allowing for time to implement and report on the process, we really needed that extra time to tease out the fine details of the validation process.

But here we are, at the end of the experiment, with one question remaining: was the This Car or That Car app idea validated?

That’s the question we’ll answer in this final installment, Part 12 of our real-time experiment series, Testing the Muse in Practice.

But first, let’s look back on what we had set out to achieve.

 

Validation Starts with One Idea

We began this series with the story of how, in 2012, I was in the market for a used vehicle. While doing the necessary research and going out to view used cars, I kept running into the same obstacle: making genuine apples-to-apples comparisons between multiple used cars was extremely difficult.

Some of the best ideas are born from the need to solve personal problems; humans can be really quite inventive when we need to scratch our own itch.

After identifying my own personal problem, I set out to find out how other people compared used vehicles – particularly when all other variables remained equal. Through market research, I quickly realized that this was an issue faced by many second-hand car shoppers.

Upon further research, I found a suitable solution that I could customize and systematize to use over and over again to quickly and reliably find the “best deal” out of several seemingly similar vehicles. Recognizing that this solution could be productized to help others, I began to brainstorm how such a product would function, detailing features and benefits, as well as how it would be designed.

Once we had a mock-up of the product, we set up a sales page to begin dry-selling the product as if it were already on the market, to determine whether others would buy it too. The final validation of the idea was completed through advertising campaigns on both Google AdWords and Facebook Ads, with the results analyzed based on units “sold”.

Phew! That’s the process in a nutshell, folks! Let’s see how our “product” did on the market….

 

Advertising Campaign Results

Whether a product is real or a “mock-up”, the true test of a business is whether people are willing to buy what you’re selling.

In order for people to buy from you, they need to know your product or service exists. The quickest way to bridge the gap between your offering and your target audience is via advertising.

So let’s start by reviewing the results from our two phases of advertising, to examine:

  1. If we generated enough traffic to our website to make an educated decision on validation (at a minimum we wanted to exceed 100 visitors); and
  2. If our product is marketable and/or our ads resonated with our audience.

Based on the traffic generated through these campaigns, we can then use Google Analytics data to translate these advertising results into an answer to the validation question.

For a detailed account of how these ads were created, or how these campaigns were set up and executed on Google AdWords and Facebook Ads, start by visiting Part 10 and Part 11 of this series.

 

Advertising Results from Google AdWords (Phase 1)

Utilizing the Google AdWords advertising platform, we targeted 13 keywords through a total of 21 advertisements.

Interestingly, of the 13 keywords, only 4 yielded nearly all of the clicks (used car comparison, comparison of used cars, second hand cars, used car search).

After 6 days, these ads had received 99,801 impressions but only 386 clicks, resulting in a 0.39% click-through rate (CTR). Not stellar. At an average cost per click (CPC) of $0.63, the total cost of this phase of the experiment was $241.38.

 

Advertising Results from Facebook Ads (Phase 2)

Of the 21 ads run on AdWords, we took the 5 with the highest CTR (our top performers) and combined each with three images, resulting in a total of 15 ads on Facebook.

Using these 15 ads, we ran a 3-day campaign on Facebook Ads. This campaign reached 9,384 people and generated 35 visitors (for a 0.37% CTR – again, not so hot) at a total cost of $20.99, or $0.60 per click.
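As an aside, if you want to sanity-check these figures – or run the same arithmetic on your own campaigns – the math is simple: CTR is clicks divided by impressions, and average CPC is total spend divided by clicks. Here’s a minimal Python sketch using the numbers from both phases (the helper function and names are mine; note that Facebook reports “reach” rather than impressions, and it’s used as the denominator here, just as above):

    # Quick sanity check of the ad metrics reported above.
    def ad_metrics(impressions: int, clicks: int, spend: float) -> dict:
        """CTR (%) = clicks / impressions; average CPC ($) = spend / clicks."""
        return {
            "ctr_pct": round(100 * clicks / impressions, 2),
            "avg_cpc": round(spend / clicks, 2),
        }

    # Phase 1: Google AdWords (6 days)
    print(ad_metrics(impressions=99_801, clicks=386, spend=241.38))
    # -> {'ctr_pct': 0.39, 'avg_cpc': 0.63}

    # Phase 2: Facebook Ads (3 days; "reach" stands in for impressions)
    print(ad_metrics(impressions=9_384, clicks=35, spend=20.99))
    # -> {'ctr_pct': 0.37, 'avg_cpc': 0.6}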

 

Translating these Advertising Results into Validation Answers

As detailed in Part 9 on using Analytics, the stats from your ad campaign only tell you how effective your ads are – such as how many times they were displayed versus how many people clicked on them. These interactions don’t tell you what visitors did on your site after clicking your ad – and more specifically, whether or not you have a prospective customer.

The real indicator of idea validation is your product’s conversion rate – in other words, of the ad clicks that brought unique visitors to your website, how many entered your sales funnel by clicking one of your “Buy Now” buttons (constituting a “sale”).

If you need a refresher on how we set up our Analytics to define and record these goals, refer back to our post in Part 9.

 

Validation Analysis using Google Analytics

Over the entire period of this experiment, the project website saw 3,868 unique visitors. However, that covers the prolonged stretch between setting up the website, running the advertising campaigns, and now reporting on the results. It’s a healthy number, but it distorts the picture a tad.

For a more refined view of the validation process, we want to isolate the traffic experienced only during the advertising campaigns. This will provide a much more focussed reflection of the market’s actual response to our product offering.

During the ad campaigns, there were 559 visitors to the website, with a total of 621 page views, or 1.11 pages per session. Visit durations were short, and the bounce rate was generally high.

Nonetheless, of these 559 visitors to the website, 22 clicked on one of our “Buy Now” buttons to reach the sales page, resulting in a conversion rate of 3.94%. Not too shabby after all…
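As a quick worked check, the conversion rate is simply goal completions divided by campaign visitors – a two-line sketch using the figures above:

    # Conversion rate = visitors who clicked "Buy Now" / campaign visitors.
    campaign_visitors = 559
    buy_now_clicks = 22

    print(f"{buy_now_clicks / campaign_visitors:.2%}")  # -> 3.94%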

Additional details collected during this process indicated that, of the 22 visitors who were “sold” on the product:

  • 32% selected the Android app (including one email requesting the app once it goes live);
  • 14% selected the Apple app; and
  • 54% were undefined (e.g. they entered the sales funnel through a general “Buy Now” button and didn’t select their preference).

This is great information to have if you’re testing an idea for an app, because now you know which platform the majority of your target audience uses and/or which platform to develop for first. However, the 54% undefined is tricky. Next time I would close this loop by eliminating this option, requiring visitors to choose between Apple and Android from the very beginning of the funnel.

 

Validation Lessons Learned

There is much to be gleaned from the results of our two-pronged advertising campaign.

Firstly, the CTRs were not impressive. There are ways to increase these rates with more effective ads, particularly now that we’ve identified the top keywords and ads that did the heavy lifting in attracting visitors to the site.

Generally, this audience appears to be relatively hard to capture, which translates into higher customer acquisition costs. This was a red flag we raised early in the process (Part 2), when the keyword research results were weak, and again in Part 3, when the Facebook audience proved difficult to target.

Nonetheless, we pressed forward, because this is not a desktop modelling exercise: we test in the marketplace to get evidence-based data and make better decisions.

A more obvious issue that could explain the weak results may simply be that the product doesn’t jump off the shelf at people. It may solve a problem people don’t generally have; require too much education; or its presentation may be all wrong for this target audience. That leaves a lot of variables that could be tweaked or split-tested to improve the results.

 

Validation Results: To Build or Not to Build?

The numbers are in and we’re looking at a conversion rate of 3.94%.

Perhaps that sounds low to you, but is it? Is this enough to validate our idea, and justify further development of this app?

Scouring the internet, you’ll find a wide range of opinions on what’s a “good” or acceptable conversion rate.

The 1% figure comes up most often: if 1 out of every 100 visitors converts to a customer, the thinking goes, you have a viable product. That sounds fair, but only if the economics work.

Using our example, it cost approximately $265 to attract roughly 560 visitors and acquire 22 customers. That might be acceptable if the product retails for over $20; but spending $12 to acquire one customer for a $1 app purchase…the dollars don’t make sense – you would lose money on every sale. That said, there are ways to improve the economics, for example by increasing the perceived value and price of your product, or by reducing the cost of acquiring your customers.
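To make that unit-economics argument concrete, here’s a rough sketch of the same back-of-the-envelope math. It is deliberately simplified – it ignores app-store commissions, refunds and customer lifetime value:

    # Customer acquisition cost (CAC) vs. product price.
    ad_spend = 241.38 + 20.99  # AdWords + Facebook, ~$262 (rounded to ~$265 above)
    customers = 22

    cac = ad_spend / customers
    print(f"CAC: ${cac:.2f}")  # -> CAC: $11.93, i.e. roughly $12 per customer

    # Margin per customer at a few hypothetical price points:
    for price in (0.99, 19.99, 24.99):
        print(f"price ${price:.2f}: margin ${price - cac:+.2f} per customer")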

These are definitely considerations to take into account upfront when thinking about pricing and product platform.

But back to the quest for the Golden Conversion Rate…there is none!

If 1% to 3% is acceptable to you, and you can find cheaper ways to advertise and scale the business, then a 1% conversion rate is certainly workable.

Use the numbers to help make informed decisions, but don’t ignore your gut, previous experience and level of interest/passion for a project. All of these elements together will help give you indications of potential success or failure.

By no means is this set in stone, but based on working through this process numerous times with various product ideas, I tend to use the following conversion rate ranges to guide my decision-making on whether to build or not to build (codified in a short sketch after the list):

  • 0-0.9%: Very little interest; Drop it and move on!
  • 1-4%: Some attention but not overwhelming; Consider, but don’t prioritize.
  • 5-9%: The market is responding; Strongly consider.
  • 10%+: We have a winner; Give the people what they want!
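For what it’s worth, here is that rule of thumb codified – the thresholds are mine, taken from the list above and treated as contiguous ranges, not any kind of industry standard:

    def build_decision(conversion_rate_pct: float) -> str:
        """Map a dry-test conversion rate (%) to a rough build-or-not call."""
        if conversion_rate_pct < 1:
            return "Very little interest; drop it and move on!"
        if conversion_rate_pct < 5:
            return "Some attention but not overwhelming; consider, but don't prioritize."
        if conversion_rate_pct < 10:
            return "The market is responding; strongly consider."
        return "We have a winner; give the people what they want!"

    print(build_decision(3.94))  # -> "Some attention but not overwhelming; ..."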

 

Wrap-Up of our Real-Time Experiment, Testing the Muse in Practice

We’ve tried to pack A LOT into this one post.

But to summarize, the validation process for the This Car or That Car app idea yielded a 3.94% conversion rate. It cost $265 in advertising, plus approximately $10 for the domain name and $10 on graphics – for a total price of just under $300 to test and validate this concept.

To me, spending that $300 to find out whether or not you have a potential business idea is priceless – it helps expose the downside risks while uncovering the upside potential. Whether or not the costs are “worth it” can only be measured in future time and money saved, or in the comfort of moving forward and taking action, creating something that you know has already generated market interest.

For the This Car or That Car app, relative to other projects we’re working on, the results from the testing were not overwhelming. Although there was some market interest, this idea is being shelved as a low priority (but not dumped entirely). However, if this app sounds like something you’re interested in pursuing or developing – contact me and let’s chat.

You may have caught on by now that this series was less about the specific app idea we chose to experiment with, and more about using it as a “vehicle” (pardon the pun) to release a series of pillar posts on the various stages of idea validation, using a real-life example.

As stated in Part 1 of this series, the value of this experiment was the over-the-shoulder view as we embarked on the process of Testing the Muse – from idea, to research, to product concept, to testing, to analyzing results and beyond.

Stay tuned for more experiments, as we use this process (and various improvements/refinements as we go) as the basis for validating ideas, working towards building businesses and products, and ultimately developing real-world sources of passive income.

Best Always,

Jonah
