Video

MRMW Q&A

Britt Calvert, Senior Account Manager at Ipsos Marketing, took the time to explain how mobile and video analysis can provide added depth and impact to your research.

Q: There has been a lot of buzz about mobile and video. Why are these important and why should researchers care?

Britt Calvert: Mobile respondents are more engaged! They’re less likely to speed through a survey, straight-line through questions, or give garbage responses to open-ended questions. Mobile samples are also more representative. For example, certain demographics like younger males have always been tougher to recruit, and mobile surveys are more relevant to their lives. This lets us better connect with them – in an environment in which they’re comfortable. Researchers should also look into mobile because it opens doors to new research applications, particularly within Product Testing! With mobile diaries, product experiences are captured in the moment, and we can track how consumers react to the product over time and across a variety of usage occasions.

The Video capability built into mobile phones is an added bonus. It lets us get even closer to consumers as they experience our clients’ products. Video provides authenticity: video data gives us a window into actual – not claimed – behaviors. We get to witness the nuances of how consumers interact with the product, and, as we’re learning, those ways are often surprising! Together with quantitative learnings, Video and Images help us build powerful narratives and bring the results to life for our clients’ stakeholders. Video content can also have a life beyond the initial project. Many clients create a searchable bank of visual insight. This is a great resource for future deep dives and inspiration – so the research continues to add value long after the report is finalized.

Q: What kind of results can mobile and video deliver to clients? What’s a real-world example where this approach achieved client objectives?

Britt Calvert: To demonstrate how Mobile Diaries and Video can come together with powerful results, we’ll share a recent product test conducted for the Spreads team at Unilever. Our client’s category was in decline: new competition was coming in, and users were exploring and switching to other options like spreadable butter. Unilever had made extensive use of traditional market understanding and product testing techniques. However, they felt there were other dynamics at play that weren’t apparent through the lens of traditional research results. Unilever challenged themselves to look for new approaches that were more “real” and “in-context” in order to uncover the truth.

The goal was to understand what Unilever’s own brand users think of key competitors: In what ways is the competition better? Worse? How long does it take for opinions of the competition to solidify? Are opinions locked in after the initial trial, or does it take a week or two with the product? And how would we characterize the degree of ‘switching risk’ when Unilever’s buyers try the competition?

We listened to Unilever’s business questions. We knew that Mobile Diaries and Video would be a great match, and could deliver insights above and beyond the learnings from a traditional one-week Home Use Test.

The study we designed was a quantitative, extended Home Use Test. Our respondents were heavy users of Unilever spreads brands. We placed each heavy user with a competitive spread product and asked them to use it as they normally would for a period of several weeks. During the usage period, we asked them to complete a Mobile Diary entry with Video using the Ipsos App any time they used ANY spread in-home, whether that was the test product, another brand of margarine, or butter. In addition, respondents completed a weekly check-in survey on the App. This let us track their overall, non-occasion-based impressions of the product. In all, we gathered several weeks’ worth of in-the-moment, true-to-life product reactions and got a window into how impressions develop and solidify over repeated exposure to the competition.

Q: What specifically made the mobile and video diaries especially useful for Unilever? What implications did they have for quantitative research?

Britt: Our occasion diary was a mix of standard quantitative questions and video uploads. The quantitative questions – things like which specific spread they used, how they used it, and overall satisfaction – became the backbone of our eventual analysis. The video captured insights in a more open-ended way and helped us bring the results to life.

Setting up our study, we learned some tips and tricks for designing a diary:

  1. Keep the respondent burden minimal to encourage frequent participation. Diary entries should be very brief – if an entry takes more than a few minutes, it’s too long. This isn’t as tough as it sounds: to some extent, the Video you’ll gather can replace traditional questions.
  2. Make sure to word each question as efficiently as possible to respect the mobile format.
  3. And write your diary script in a light, fun tone – sound conversational!

And when it comes to respondent-generated video, there’s an art to getting the best results.

  1. Give detailed prompts to help set your respondents at ease. If your prompt is too brief or general, respondents freeze and aren’t sure what to say. Instead, make it clear what you want to hear about. You could ask: “Please record a short video telling us what you think of the product. Were there any surprises for you – either good or bad? What are the pros and cons compared to your usual brand?”
  2. And for your own sake on the back end, offer clear instructions for HOW you want respondents to film. Should they use portrait or landscape mode? Consistency will be better for your eventual report. And do you want respondents to record their faces as they speak, or turn the camera on the product? We learned the hard way how critical it is to provide explicit instructions. We hadn’t realized how important this would be, and got back some unintentionally funny videos. In one case, we got a pretty insightful voiceover about margarine while the camera was trained on a Honda steering wheel! We would have loved to use it in our report, but had to set it aside.

After the data collection, we needed to bring all of this together. The more standard questions from the diary allowed us to quickly grasp the overall learnings and pointed us toward interesting areas worth deeper investigation. The videos added richness and filled gaps – they helped us understand “the whys”. The software used to transcribe and tag the videos allowed us to more readily identify not only clips that supported or challenged the data, but also “conversations” we could tap into to really listen to the messages consumers were sharing. We also learned things just by watching consumers interact with the product – things we would never have known to ask about in a quant survey. And because our mobile diary entries needed to be lean, the video data filled in the gaps around usage, context, and even the presence of others in the household.

Q: In the end, what impact did mobile and video provide Unilever?

Britt: Altogether, our study generated over 1,200 respondent videos, totaling almost 20 hours of footage! We relied on new technology and the support of our partner Big Sofa. They helped us manage and analyze the many hours of video we generated – a task that had previously been completely manual, consuming hours upon hours of our teams’ time. Their online platform let us seamlessly store, organize, and share the videos we collected. Every video was transcribed and tagged with searchable keywords based on its content, which meant we could quickly pull up videos by theme or usage occasion.
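To make the idea concrete, here is a minimal sketch in Python of a transcribed, keyword-tagged video bank that can be searched by theme or usage occasion. This is purely illustrative: the class, field names, and example clips are hypothetical, not Big Sofa’s actual platform or data model.

```python
# Illustrative sketch only: a transcribed, keyword-tagged video library
# searchable by theme or usage occasion. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class VideoClip:
    clip_id: str
    transcript: str                          # auto-generated transcription
    tags: set = field(default_factory=set)   # searchable keywords
    occasion: str = ""                       # e.g. "breakfast", "baking"

def search(clips, keyword=None, occasion=None):
    """Return clips matching a keyword tag and/or a usage occasion."""
    results = []
    for clip in clips:
        if keyword and keyword.lower() not in {t.lower() for t in clip.tags}:
            continue
        if occasion and clip.occasion != occasion:
            continue
        results.append(clip)
    return results

# Example: pull up every clip tagged "spreadability" filmed at breakfast.
library = [
    VideoClip("v001", "It spreads straight from the fridge...",
              {"spreadability", "taste"}, "breakfast"),
    VideoClip("v002", "I tried it for baking instead of butter...",
              {"baking"}, "baking"),
]
print([c.clip_id for c in search(library, keyword="spreadability",
                                 occasion="breakfast")])   # -> ['v001']
```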

Quantitative data from our study was appended to the videos as well. This allowed us to search the videos based on results from the diary. For example, we could instantly pull up all the videos from respondents who had bottom three box Overall Satisfaction with the product, and dive deeper into what might be driving their dissatisfaction.
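As a sketch of how that kind of filtering can work once survey results are appended to the video records, the short Python example below joins hypothetical diary scores onto hypothetical video metadata and pulls the clips from dissatisfied respondents. The column names and the 5-point satisfaction scale are assumptions for illustration, not the study’s actual data.

```python
# Illustrative sketch only: join diary survey scores onto video records,
# then select clips from respondents with bottom-three-box satisfaction
# (assumed here to mean a rating of 3 or lower on a 5-point scale).
import pandas as pd

videos = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "clip_id": ["v001", "v002", "v003"],
    "occasion": ["breakfast", "baking", "snack"],
})

diary = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "overall_satisfaction": [5, 2, 3],   # 1-5 scale (assumed)
})

merged = videos.merge(diary, on="respondent_id")
bottom_three_box = merged[merged["overall_satisfaction"] <= 3]
print(bottom_three_box[["clip_id", "occasion", "overall_satisfaction"]])
```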

Video really brought out the ‘why’ behind our traditional data and beautifully captured the occasion context, the consumer’s emotional experience, and their mood. All of this is critical when you want to take action on the findings. Video draws out much more from consumers than we’d get from traditional text-based open ends: spoken responses are much more nuanced and evocative, and consumers provide about six times as many words when speaking versus typing.

There’s a real beauty when data can be experienced and felt by the client, versus merely seen during a presentation or in a report. Landing insights was easier because brand users literally did the talking. The consumer could tell her story her way, with her conviction and her honesty. The data wasn’t just data any longer; it was Mary Jane from Nebraska making breakfast with her son. Mary Jane and her son spoke their truth – real people with real stories that will spring to our client’s mind as they contemplate changes. Most importantly, the deeper insights made possible by this new approach helped Unilever take action on the findings and numbers coming out of the traditional data.

Q: New research methodologies aren’t without their quirks. Were there any lessons learned? Is there anything that you would have done differently?

Britt: For both our team and our client, this was the first project ever conducted using these new approaches, and there were indeed a few bumps and surprises along the way.

First, we took away some lessons about what kind of long-term cooperation we can realistically expect. The study ran for a month, and only half of our starting sample was still with us at the end of week two.

Next time, we’ll do the following:

  • Start with a larger sample size – if the budget can support it.
  • Consider making video suggested (not mandatory) at each diary entry.
  • Structure incentives more strategically. We had paid out only upon completion of the full month; paying a smaller portion partway through, with the remainder at the end, may have helped us sustain respondent engagement.
  • Be more disciplined with the length of our diary survey. Seeing our cooperation results, we came to think the burden on respondents was a bit too much.

Another ‘aha’ moment we had was around the deliverable. With a project like this, the approach is cutting-edge, and that means the deliverable should be too. Your client will be expecting something above and beyond the typical report, even if it’s unspoken. The first draft we shared with our client was similar to a traditional product test report, with some video clips. A great effort, but it felt too familiar, and we decided to go back to square one. We worked together closely to land our final deck, which was very brief: only about 20 slides end to end. While our results were quantitative, the report included just four numbers! Several well-produced video streams were embedded to bring our key insights to life, and everyone walked away with the key insights without feeling bogged down by too much data. The first time you conduct research like this with a given client, be sure to clarify the vision for the final report. Don’t make assumptions – and don’t just rely on what you’ve done for past deliverables!

Another thing we learned was the importance of advanced video editing capability. A polished video stream with impact words will leave a MUCH bigger impression on your audience than raw individual video clips. In most cases, the raw clips shouldn’t be considered a client-ready deliverable – plan for a video editor to help produce the final product so the data is presented succinctly and with impact. In our case, Big Sofa helped with this.

Our final lesson learned is to give yourself time. We live in a fast-paced world, and increasingly it seems the expectation is to “get results overnight” for anything and everything! While it’s true that the time it takes to get to data has been greatly reduced, the old adage “garbage in, garbage out” still holds: as with any study, objectives and outputs must be clear and aligned. Our original timeline was aggressive and required quick work from both us and our client to stay aligned. We thought we could move very quickly, and – for both setup and analysis – that turned out to be unrealistic. These kinds of studies are so different from what we – both Ipsos as a supplier and most of our clients – traditionally work on, and they’re pretty involved to execute, analyze, and report, particularly the first time. Build in a little extra time during setup to make sure you get your surveys just right, and expect a few more iterations than a typical study during reporting before the final deck is crafted and ready for prime time. When you try something new, it takes a bit of time to get it right, so do yourself a favor and set realistic timing to manage expectations. In the end, the extra time investment will be well worth it.

By utilizing video analysis to its fullest potential, Ipsos looks to be at the forefront of the paradigm shift under way today in the power of video. With implicit reaction time tests, augmented reality, and virtual reality becoming available to nearly every consumer with a smartphone, the possibilities are endless.