In-product NPS survey vs. email survey

For a long time, my team and I have sent surveys to our customers by email. Each survey contains a number of questions plus the famous NPS question.

Recently we decided to do it differently and run the NPS question directly inside our product, to learn more about our detractors.

As a result, we got many more replies to our NPS question, but also a much higher score: we went from 30 to 66.

Did you and your team build your own tools to run this kind of survey in your product, or do you use existing products such as Appcues? Which channel do you prioritise (email or in-product)? And what, other than the number of answers, could explain that huge gap in NPS between the two methods?

miko1993mp
2 years ago

Hi @GulliGulli_,

That's an interesting question. As for the email survey, would you care to clarify how many questions your survey consisted of?

I'm asking because it's usually not a good practice to nest the NPS question into a larger survey. In fact, you should keep the NPS survey question short, pithy and to the point.

What we usually do at Survicate is just ask two questions:
1. On a scale of 0 to 10, how likely are you to recommend us to a friend or colleague?
2. A follow-up depending on whether they're a detractor, passive or promoter: "We're sorry to hear that. What can we improve?", "That's not bad. What's the main reason for your score?", or "We're happy to hear that! Would you care to leave us a review on G2?" or similar.
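For reference, the standard NPS definition classifies 9-10 answers as promoters, 7-8 as passives, and 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (the thresholds are the standard NPS definition; the example answers are made up):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical batch of 0-10 answers:
answers = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(answers))  # -> 30 (5 promoters, 2 detractors out of 10 answers)
```

Note that passives still count in the denominator, which is why a few lukewarm 7s and 8s can pull the score down without ever registering as detractors.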

So, could it be the case that you had a lower NPS score with the email surveys because the survey had too many questions? By the time users reached the NPS question, they may already have been fed up with the number of questions (survey fatigue).

Did the in-product survey also have that many questions, or did you limit it to the NPS question only?

Also, when you run in-product surveys, people are already using your product, so it seems natural to them that you ask for their feedback. This could encourage higher NPS scores. Whereas when you send NPS surveys by email, you're sort of encroaching upon people's inboxes. But I don't think this would account for the huge gap you mention, and you could just as well make a case for using email surveys over in-product surveys.

I can only speak from my experience at Survicate, and that is:
- We send the NPS survey in-product first, and then follow up by email after 7 days
- We didn't notice any meaningful differences in the scores we get across the two channels
- We target our NPS surveys precisely, i.e. don't send NPS to the entire customer/user base at once, but do it in batches (paying customers / long-standing customers etc.)

Could it be that your in-product NPS survey targeted a different user/customer base (e.g. fresh signups) than your email survey did (e.g. customers who had recently experienced product issues)? I think this could partially explain the disparity between the scores.

As I said earlier, you could also make a case for running NPS surveys only by email. Here are some of the arguments its advocates use:
- In-product NPS surveys can be seen as interrupting the customer's workflow and disrupting them during a task
- They might therefore have a higher abandonment rate
- Even if you customise in-product surveys, they don't feel as personal as email.
- Email is said to encourage more honest responses when done right
- When you send in-product NPS surveys, the customer sentiment might be influenced by the action they're currently taking in the product
- Rather than measuring overall brand sentiment and loyalty, you would be measuring satisfaction at a specific interaction point or workflow, which is not what you want.

@GulliGulli_ (replying to @miko1993mp)
2 years ago

Hi @miko1993mp, thanks for your reply — you're quite right.

In our email survey, the NPS question was mixed in with other questions (I would say there were more than 10; some asked for basic information we could have gathered inside the product instead).

Both the in-product and email survey were sent to our entire customer base.

It was my initiative to run the NPS question inside the product, without any extra questions attached, to get a better idea of our score.

One thing worth mentioning is that the email survey got 500 replies, while the in-product one got almost 4,000 answers out of our 17,000 customers.

My theory was also that with an email survey, only customers with a strong opinion (good or bad) are tempted to go through it. And as with many things, it is easier to complain than to praise.
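That self-selection theory can be illustrated numerically: if detractors respond at a higher rate in one channel than another, the measured NPS diverges from the customer base's true sentiment even though the base itself is unchanged. A sketch with entirely hypothetical numbers (none of these figures come from the thread):

```python
# Hypothetical illustration of response bias: the same customer base,
# but each group responds at a different rate per channel.
population = {"promoters": 5000, "passives": 8000, "detractors": 4000}

def measured_nps(pop, response_rates):
    """NPS computed only over the customers who actually respond."""
    responses = {g: pop[g] * rate for g, rate in response_rates.items()}
    total = sum(responses.values())
    return round(100 * (responses["promoters"] - responses["detractors"]) / total)

# Email: low turnout overall, but unhappy customers are motivated to reply.
email = measured_nps(population, {"promoters": 0.03, "passives": 0.02, "detractors": 0.06})
# In-product: higher, more even turnout across all three groups.
in_product = measured_nps(population, {"promoters": 0.25, "passives": 0.20, "detractors": 0.20})
print(email, in_product)  # prints: -16 12
```

The point is only directional: a modest skew in who bothers to reply by email can move the measured score by tens of points, without any change in how customers actually feel.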
