
Spotlight: Clare McDermott of Mantis Research on creating great survey-based content

Content marketers love a good survey. What better way to demonstrate authority and insight within an industry than a fact-filled report of stats and figures? And then there’s the earned media value when other publications and content creators quote and reference your findings, extending your brand’s authority to new audiences.

Clare McDermott is a co-founder of Mantis Research, specialising in survey-based research for marketing projects. She cautions marketers against rushing to SurveyMonkey without a lot more planning up front.

JC: These days there’s an app for everything. Do tools like SurveyMonkey mean just about anyone can produce an effective survey?

The answer, unfortunately, is no. I say this with a tremendous amount of humility because I was there once. I know the mistakes people make; in part, because I’ve made those mistakes as well. Survey design really does take experience.

One of the things that can go wrong is what I call the mistake of taking inventory. This is when you’re in a particular industry and you want to find out everything your audience knows about this particular topic. And you really haven’t thought about what’s going to make an interesting story. How do you design questions so the results will be something your audience cares about?

Inventory is good in some cases. But you have to find a balance between storytelling and simply taking inventory.

And then there are tonnes of little mistakes that people make: asking a question with two or three questions embedded inside it; not providing the right answer choices; using faulty answer scales; not testing your survey. I could go on, but the mistakes that can be made are broad and diverse.

Often, when we’re going through survey design, I’ll say to the client, “What do you really want to know here, because I think your question isn’t getting at the real nut you’re trying to crack.”

What about sample size? How big does a survey group need to be to become statistically relevant?

Let me put out there first that I’m not a statistician, but this is based on my many years of experience working closely with the statistician on our team.

So, there are a few issues. Number one is credibility. If you have a sample size of 50, a journalist isn’t going to read or publish your survey.

The minimum sample size depends on who the target market is. If it’s a CEO survey, 200 is okay. That’s a lot of CEOs. But if you’re talking about entry-level marketers, for example, I’d say 500 to a thousand is a credible number.

From a statistical standpoint, one thing to keep in mind is how you’re going to stratify the data.

Say you want to segment your sample by company size (e.g. small, mid-size and enterprise). To do that comparison, you’ll need to create three cohorts. With a small sample size, it’s difficult to achieve statistical significance when making these types of crosstab comparisons.

We use a tool that tells us whether differences between one segment and another are statistically significant. You’re going to have a much higher chance of having a significant difference if your sample size is larger.
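
The interview doesn’t name the tool Mantis Research uses, but the underlying check is typically a two-proportion significance test between segments. As a rough, hypothetical sketch (in Python with statsmodels, using made-up segment counts), it might look like this:

```python
# Illustrative only: a two-proportion z-test comparing how two survey
# segments (e.g. small vs. enterprise companies) answered a yes/no question.
# The counts are invented; this is not the tool described in the interview.
from statsmodels.stats.proportion import proportions_ztest

yes_counts = [62, 118]     # respondents answering "yes" in each segment
sample_sizes = [150, 400]  # total respondents per segment

z_stat, p_value = proportions_ztest(count=yes_counts, nobs=sample_sizes)

if p_value < 0.05:
    print(f"Difference is statistically significant (p = {p_value:.3f})")
else:
    print(f"Not significant (p = {p_value:.3f}); a larger sample may help")
```

With small cohorts, even a visible gap between segments will often fail a test like this, which is why a larger overall sample makes crosstab comparisons more likely to hold up.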

Like you, most marketers aren’t statisticians. Do marketers risk drawing the wrong insights from their survey results because they’re not equipped to interpret the data correctly?

It’s a big issue. Sadly, I sometimes read surveys where the person just doesn’t understand statistics very well. But they’re not getting called out for it because I’m not sure the audience understands it very well, either.

It’s important to understand that, just because you see a difference, you don’t necessarily know the why. There’s a project we’re working on right now where people on the team wanted to make certain statements. And I had to point out that they can’t actually say that because we don’t know why this is true.

You can hypothesise and say that you’re speculating this might be the reason, but we don’t know.

Marketers also like to curate insights and stats from external research to put in articles or give weight to a white paper, etc. Is there a risk that the research might be misrepresented?

The thing that really ticks me off is when people cite research and link to something that’s not the original source. The page they link to is also just citing the same research. And then you go back in time, following links to find the original source, and the research is 10 years old.

I don’t cite anything that’s more than two years old, unless it truly is the definitive study of that industry. And I never link to something that isn’t the original company or organisation that did the research. It undermines your credibility.

What are some of the tricks when it comes to presenting the findings so they can be clearly understood?

What drives me nuts is people who use colour for no reason, like a bar chart where every bar has a different colour. As the reader, I’m left wondering what yellow means versus red. It distracts me from what’s important.

Sometimes, if you want people to focus on three of the bars in a bar chart, what works nicely is to use a light grey on the others and a bright colour on those three. Hey, this is what you should pay attention to. These are the ones that are actually really interesting.
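
As a hypothetical illustration of that greying-out technique (the channels and percentages below are invented, not survey data), a matplotlib sketch might look like this:

```python
# Illustrative sketch: grey out most bars and highlight the three that
# carry the story. All labels and percentages here are invented.
import matplotlib.pyplot as plt

channels = ["Email", "Blog", "Webinars", "Podcasts", "Print", "Events"]
usage_pct = [72, 65, 58, 30, 12, 44]
highlight = {"Email", "Blog", "Webinars"}  # the bars readers should focus on

colors = ["#d95f02" if c in highlight else "#d3d3d3" for c in channels]

fig, ax = plt.subplots(figsize=(7, 4))
ax.bar(channels, usage_pct, color=colors)
ax.set_ylabel("Respondents using channel (%)")
ax.set_title("Highlight the bars that matter; leave the rest in light grey")
plt.tight_layout()
plt.show()
```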

In general, avoid pie charts. They make it hard to compare segments: how big one slice is versus another. A bar chart makes it easy to see differences that are extremely hard to make out in a pie chart.

Is it always bars and pies and so on? Are there more effective ways to visualise the data and get the point across?

We use a lot of different types of charts and graphs. Obviously, some chart types suit certain kinds of data better than others. It’s really important not to get ahead of yourself by using a fancy chart just for the sake of it.

Before choosing a chart, talk about what’s interesting about the data and what you want to highlight. What’s the best matching chart for that?

We did a really cool project for Sinch [gated content] where we used spider charts, showing by country what channels people use for messaging. I’m not really concerned what the number is. I’m not trying to show that 80% of people in India use SMS. What’s interesting are the different shapes. For a marketer, it’s challenging to scale a global strategy when the channel usage in each country is so unbelievably different. So, this is a really nice way to visualise those different trends.
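
The Sinch report itself is gated, so the figures below are invented placeholders, but a spider (radar) chart of this kind can be sketched in matplotlib roughly like this:

```python
# Illustrative sketch of a spider (radar) chart comparing messaging-channel
# usage across countries. The countries, channels and percentages are
# invented placeholders, not figures from the Sinch report.
import numpy as np
import matplotlib.pyplot as plt

channels = ["SMS", "WhatsApp", "Email", "In-app chat", "Voice"]
countries = {
    "Country A": [80, 40, 55, 20, 35],
    "Country B": [30, 85, 45, 60, 25],
}

angles = np.linspace(0, 2 * np.pi, len(channels), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close each polygon

fig, ax = plt.subplots(subplot_kw={"polar": True}, figsize=(5, 5))
for country, values in countries.items():
    data = values + values[:1]
    ax.plot(angles, data, label=country)
    ax.fill(angles, data, alpha=0.15)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(channels)
ax.set_title("Channel mix by country: the shapes tell the story")
ax.legend(loc="upper right", bbox_to_anchor=(1.3, 1.1))
plt.tight_layout()
plt.show()
```

The point of the shape-based comparison is that two countries with similar overall usage numbers can still have very different channel mixes, which is exactly what the polygons make visible.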

To wrap up, what are your top dos and don’ts?

My number one “Do” is that, if you are inexperienced, get someone to help you with survey design. If you make a mistake at this stage, it’s going to ripple through the entire project. Survey design is the one place to spend your money.

My biggest “Don’t” is long surveys. Be respectful of survey-takers’ time. Keep your survey under seven minutes; even shorter if it’s highly technical. People just get fatigued, and giving them a really good survey experience is super important. With each extra minute, you are lowering your completion rate and guaranteeing a smaller sample size. Survey length will kill you. It will kill you.
