How-To · 6 min read · Updated

How to Set Up CSAT Surveys People Actually Complete

Most CSAT surveys get a 5 to 15% response rate. With the right timing, channel, and format, you can hit 25 to 35%. Here's how.


You set up a CSAT survey. After every resolved ticket, the customer gets an email: "How would you rate your experience? 1-5 stars." A month later, you check the data. 47 responses out of 600 resolved tickets. That's an 8% response rate. Your sample is so small that the data is almost meaningless.

Low response rates are the norm. They're also fixable. The problem usually isn't that customers don't care. It's that you're asking at the wrong time, in the wrong channel, with too many questions.

Timing Is Everything

The single biggest factor in CSAT response rates is when you ask.

Ask immediately after resolution: 25 to 35% response rate. The customer just got their answer. The experience is fresh. They're still in the conversation context.

Ask 1 hour later: 15 to 20% response rate. Still okay. They remember the interaction.

Ask 24 hours later: 8 to 12% response rate. They've moved on. The email sits in their inbox with 40 other emails.

Ask 48+ hours later: 5% or below. You've lost them.

The ideal place to ask is in the message that resolves the ticket. Not a follow-up email. Not a separate survey link. Right there in the conversation: "Glad I could help! Quick question: how would you rate this interaction?" with clickable emoji or star options.

One Question. Maximum Two.

Every additional question cuts your response rate by 30 to 50%. A 5-question survey that takes 2 minutes to complete will get a fraction of the responses that a single-question survey gets.
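
The arithmetic compounds quickly. A rough sketch (the 40% per-question drop-off is an assumed mid-range of the 30 to 50% figure above, and the 30% base rate assumes an in-chat single-question survey):

```python
# Sketch of the compounding effect: if each extra question cuts
# responses by ~40%, a 5-question survey keeps only 0.6^4 of a
# single-question survey's response rate.
base_rate = 0.30          # assumed single-question, in-chat rate
cut = 0.40                # assumed per-question drop-off
five_q = base_rate * (1 - cut) ** 4
print(round(five_q, 3))   # → 0.039, i.e. roughly 4%
```

Four extra questions turn a 30% response rate into about 4%.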

Your primary CSAT question should be: "How would you rate your support experience?" with a simple scale. Stars (1-5), emojis (sad to happy), or thumbs up/down.

If you want a second question, make it optional and open-ended: "Anything we could do better?" This captures qualitative feedback without requiring it. About 30 to 40% of people who rate will also leave a comment. That's enough to spot patterns.

Don't ask for their name. Don't ask for demographic info. Don't ask them to categorize the issue. You already have all of that from the ticket.

Channel Matters

In-chat surveys (shown at the end of a chat or widget conversation) get the highest response rates. 30 to 40% is common. The customer is already in the conversation interface. Clicking a star or emoji takes half a second.

In-email surveys (embedded in the resolution email with clickable ratings) get moderate response rates. 15 to 25%. The customer has to open the email and click, but the friction is low if you embed the rating options directly in the email body (not a "click here to take a survey" link).

Separate survey links (a URL in a follow-up email that opens a survey page) get the worst rates. 5 to 10%. Two clicks of friction. Most people won't bother.

If you're using a chat widget (like Supp's), ask for CSAT at the end of the conversation. If you're doing email support, embed the rating directly in the resolution email. Never send a separate survey email if you can avoid it.
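
One common way to embed the rating directly is to make each star a pre-filled link, so a single click records the response. A minimal sketch, assuming a hypothetical rating endpoint and per-ticket token (both names are illustrative, not a real Supp API):

```python
# Hypothetical sketch: build an HTML fragment for the resolution email
# where each star is a one-click link carrying the score in the URL.
# The base_url and ticket_token are assumptions for illustration.
from html import escape


def rating_block(ticket_token: str, base_url: str = "https://example.com/csat") -> str:
    """Return an HTML fragment with five one-click star links."""
    links = []
    for score in range(1, 6):
        url = f"{base_url}?t={escape(ticket_token)}&score={score}"
        # &#9733; renders as a solid star in most email clients
        links.append(f'<a href="{url}" style="text-decoration:none;font-size:24px;">&#9733;</a>')
    return "<p>How would you rate this interaction?</p><p>" + " ".join(links) + "</p>"


print(rating_block("abc123"))
```

Clicking a star hits the endpoint with the score already attached, which is the zero-friction version of "embed the rating options directly in the email body."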

What to Do with the Data

A single CSAT score is a data point. Trends are what matter.

Track CSAT weekly, not daily. Daily fluctuations are noise. A bad rating on Monday and a great one on Tuesday mean nothing. Weekly averages smooth out the randomness.
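
The weekly rollup is a few lines of code. A sketch, assuming ratings arrive as (date, score) pairs:

```python
# Bucket ratings by ISO week and average each bucket, so daily
# fluctuations smooth out. The (date, score) input shape is an
# assumption about how your ticket data is exported.
from collections import defaultdict
from datetime import date


def weekly_csat(ratings):
    """ratings: iterable of (date, score). Returns {(year, week): average}."""
    buckets = defaultdict(list)
    for day, score in ratings:
        year, week, _ = day.isocalendar()
        buckets[(year, week)].append(score)
    return {wk: round(sum(s) / len(s), 2) for wk, s in sorted(buckets.items())}


data = [(date(2024, 3, 4), 5), (date(2024, 3, 5), 3), (date(2024, 3, 12), 4)]
print(weekly_csat(data))  # → {(2024, 10): 4.0, (2024, 11): 4.0}
```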

Segment by agent. If one agent consistently scores 4.8 and another scores 3.2, that's actionable. Review the low-scoring agent's tickets, identify what's going wrong, and coach them.

Segment by ticket category. If billing disputes score 2.5 but feature questions score 4.5, the problem isn't your agents. It's your billing experience. Low CSAT on a specific category is a product or policy signal, not a support signal.
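
Both cuts, by agent and by category, are the same group-and-average operation. A sketch, assuming tickets are dicts with `agent`, `category`, and `rating` fields (field names are illustrative):

```python
# Average CSAT per segment: pass "agent" to spot coaching needs,
# "category" to spot product or policy problems. The ticket dict
# shape here is an assumption.
from collections import defaultdict


def csat_by(tickets, key):
    """Average rating grouped by the given ticket field."""
    buckets = defaultdict(list)
    for t in tickets:
        buckets[t[key]].append(t["rating"])
    return {k: round(sum(v) / len(v), 2) for k, v in buckets.items()}


tickets = [
    {"agent": "ana", "category": "billing", "rating": 2},
    {"agent": "ana", "category": "feature", "rating": 5},
    {"agent": "ben", "category": "billing", "rating": 3},
]
print(csat_by(tickets, "agent"))     # → {'ana': 3.5, 'ben': 3.0}
print(csat_by(tickets, "category"))  # → {'billing': 2.5, 'feature': 5.0}
```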

Respond to low ratings. When a customer gives 1 or 2 stars, follow up. "I saw you rated your recent experience poorly. I'm sorry about that. Can you tell me what went wrong so I can make it right?" This does two things: sometimes turns a detractor into a promoter, and always gives you specific feedback on what to fix.

Benchmarks

CSAT benchmarks vary by industry, but here are general ranges for support interactions:

4.5+ out of 5: Excellent. You're doing great. Don't get complacent, but celebrate.

4.0 to 4.4: Good. Room for improvement, but you're above average.

3.5 to 3.9: Below average. Something is consistently going wrong. Investigate.

Below 3.5: Red flag. Dig into the data immediately. This usually means systemic issues (long wait times, incorrect answers, rude agents) rather than one-off problems.

Response rate benchmarks:

Above 25%: Good. Your data is statistically meaningful.

15 to 25%: Okay. Usable but improve timing and channel.

Below 15%: Your survey methodology needs work. You're making decisions on too little data.
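
To see why small samples are shaky, you can estimate how wide the uncertainty band around your average is. A rough sketch using a normal approximation (the standard deviation of 1.2 for a 1-5 scale is an assumption, not measured data):

```python
# Rough 95% confidence half-width for a mean CSAT score from n
# responses. With 47 responses (the 8% example above), the average
# is only pinned down to about +/- 0.34 stars.
import math


def ci_half_width(n: int, sd: float = 1.2, z: float = 1.96) -> float:
    """95% confidence half-width for a mean score from n responses."""
    return z * sd / math.sqrt(n)


for n in (47, 150, 400):
    print(n, round(ci_half_width(n), 2))
# → 47 0.34
# → 150 0.19
# → 400 0.12
```

At 47 responses, a "4.1" and a "4.4" are statistically indistinguishable; at 400, they are not.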

The Non-Response Bias Problem

Here's something most CSAT guides won't tell you: the people who respond to surveys are not a random sample of your customers.

Angry customers are more likely to respond (they want to vent). Very happy customers also respond (they want to praise). The middle (satisfied but not thrilled) tends to skip surveys.

This means your CSAT distribution is naturally bimodal. You'll see more 1s and 5s than 3s. The "true" average satisfaction is probably slightly higher than your CSAT score suggests, because the silent middle leans positive (they didn't have a problem worth complaining about).

Don't obsess over the absolute number. Obsess over the trend. If your CSAT drops from 4.3 to 3.8 over two months, something changed, regardless of where the "true" average sits.

And don't compare your CSAT to other companies unless you know they're measuring the same way. A 4.5 with a 5-point scale is different from a 90% on a binary thumbs up/down scale. Even the question wording affects scores. Standardize your own measurement and track it over time.

See Supp Analytics

$5 in free credits. No credit card required. Set up in under 15 minutes.

CSAT survey setup · customer satisfaction survey · improve CSAT response rate · customer feedback survey · post-support survey