Sometimes our clients are used to making decisions based on large quantitative studies and wonder how we can possibly gain the confidence we need to make decisions in the early stages of a new venture without statistical significance. Instead of relying on surveys that give us quantitative data (which, we agree, have their time and place), we focus on evolving our research design from measuring what people say to measuring what they actually do. And it gets really interesting – and valuable – when those two things contradict one another.
The Differences Between ‘Say Data’ and ‘Do Data’
So what do we actually mean by “say data” and “do data”?
“Say Data,” simply put, is when users vocalize how they feel about a concept or guess how they would act in a certain situation. It’s hypothetical, informed at best by past experience, and therefore not a perfect predictor of future behavior. Collecting say data is a fast and affordable way to get early feedback on a new product or service, and it can point you in a general direction.
“Do Data” is behavioral evidence, and a more reliable predictor of future behavior than say data. To collect it, you need to get into the market, put your prototype into your users’ hands, and observe and measure their behavior. The best part is that you don’t need to run a large pilot or spend tons of money to get do data.
As you continue to iterate on your concept, and increase your investment accordingly, it is important to start measuring not just what people say, but what they actually do. Below are a few ways we quickly and affordably helped clients do just that.
Examples of How to Move from ‘Say’ to ‘Do’
We employ different methods to test our assumptions with customers, depending on the context. As our understanding advances, we start to use higher-fidelity methods for gathering more reliable behavioral data.
An energy company wanted to understand what users valued in an app and how to sustain their engagement with it over time.
SAY: Users said they valued being able to monitor and understand their energy use, but couldn’t answer whether it was compelling enough for continued engagement.
TEST: We provided a small group of users with a pilot app that focused on this one core feature.
DO: They lost interest after only a few visits but showed us some new possibilities in the process.
WHAT WE LEARNED: Monitoring their energy usage wasn’t enough to sustain use of the app, but debrief conversations with the pilot users let us test alternative features. Ultimately, users needed a feedback loop that let them act on what they saw, so we added the ability to control their smart home devices, which increased adoption and retention in the next iteration of the app.
An apparel company wanted to understand what people do with old shoes, because returning the shoes was a make-or-break component of a potential new offering.
SAY: People said they hand old shoes down or give them to Goodwill, but in-home observations revealed piles of old shoes in closets and garages.
TEST: We created pre-paid mailers and handed them out in shoe stores to test whether people would package and ship their old shoes.
DO: People followed through, and even requested additional mailers for more old shoes they had lying around.
WHAT WE LEARNED: While people weren’t ready to part with their old shoes at the moment of purchasing a new pair, they would mail in their shoes at a later date, which allowed us to move forward with piloting the concept.
A large non-profit association wanted to improve the initial enrollment process for new members.
SAY: Users said they were ready to let go of the traditional plastic card that signifies their membership and instead use a digital card on their smartphone, but we weren’t confident they would choose a digital experience, download the app, and complete the registration.
TEST: We created a simple website that gave people a choice between a digital card or a physical card, then set up “lemonade stands” so we could observe people going through the online join process and interview them afterwards.
DO: Users did pick a digital card when signing up, but they didn’t download the app or log in.
WHAT WE LEARNED: Members liked the idea of a digital card, but needed more education and support to make the connection between the digital card and the app, so the entire “join flow” was adjusted accordingly.