In my last post I shared a process for companies to consider when evaluating new products for new markets. The core idea is to test “customer readiness” before making significant investments in Marketing and Sales. The vehicle for testing is a cross-functional Customer Development team that reaches out to prospects to create and expedite learning moments. Uncover the facts, test your assumptions, and iterate on the product, if required, before executing the big launch. In this post, I’ll share an experience I had using this approach.
Ideally, first meetings with targeted prospects are done face-to-face. It wasn’t always feasible for my team and me to do this, so instead we scheduled a number of conference calls. I hired a college intern to help us reach out to targeted interviewees, found through LinkedIn, who matched our profile. We were deliberate in requesting just 25 minutes with each person, which means you need to be well prepared and focused. Asking for more time causes people to second-guess the intent of the interview. Plus, people are just plain busy. Our stated purpose for the meeting was to conduct product research. We shared that we were “considering” bringing a product to market and valued their expert opinion on whether such a product would be useful.
By this stage, we had built a minimum viable product and could both describe and demonstrate it. But we never led with a demonstration in the early calls. If the interviewee expressed interest in seeing it, we’d schedule a longer second call. In our outreach we found a number of reputable professionals who wanted to help us. Many were curious, and most, given the nature of their jobs, wanted to understand, and even influence, new products entering the market.
While one hopes that a new product will be a hit out of the gate, the first encounters with prospects for highly innovative products are typically learning experiences. We came into the meetings indicating to our guests that we had formulated a few hypotheses we wanted to run by them.
Before each meeting, I’d have a series of questions in front of me. We were consistent in our questioning from meeting to meeting and took scrupulous notes. The first questions centered on what “we” believed their business problem or bottleneck to be that prevented them from getting the most out of their workflow (in this particular case, it dealt with data analytics), followed by our thoughts on how best to tackle those problems. Stating the hypotheses and asking for validation generated some very lively conversations, and typically over 80% of the meeting was spent with my team listening and asking clarifying questions. While we were respectful of the time commitment, a number of the people we interviewed wanted to stay on the calls longer and contribute. We were careful not to come across as being in sales mode straight away, but if an interviewee was genuinely interested, we pursued next steps.
We learned a great deal from the interviews. In our case, while what we were contemplating was interesting, for the markets and applications we were considering it didn’t strike users, in its present form, as being of paramount importance in solving their core problems. In fact, some expressed that putting the product out to market would produce questionable outcomes because our targeted users generally didn’t have the skills or background necessary to prepare data to be ingested and processed. In other words, there were upstream activities in their workflows, before our application would be used, that had to be performed correctly in order for our outputs to have real value, or even to avoid negative consequences.
Looking at our product in isolation made us think we were in good shape, but gaining an appreciation of the complete workflow, and of the skills required (and available) to administer that workflow, caused us to rethink. As we interviewed more people, we noted very consistent themes. We now understood the facts. Before burning too much time and cash, we caucused with key stakeholders and made the necessary corrections to the product and how it was positioned. This led to a successful outcome. Had we launched without this process, the odds of success would have been far lower.
The process does work. The ROI is so much higher when companies execute proper validation as they innovate. It may take a little more time, but it’s time well spent. The time, effort, and money required to correct a poorly positioned product post-launch are far higher! And nothing is more satisfying than knowing the product is right and equipping the marketing and sales teams with messaging that you know will resonate with prospective customers.
Topics: Product Strategy, Marketing Strategy, Strategic Insights, Market Research, Product Testing
Fri, Jul 28, 2017