Automation · 8 min read

We Automated 70% of Our Support Volume. Here's What Actually Happened.

The good: massive cost savings and faster response. The bad: edge cases that made customers angry. The honest numbers from 12 months of automation.


The Promise and the Reality

Every automation vendor will tell you their tool handles 70-80% of support volume. That's become the standard claim. We hit 70% automation after about four months of tuning. But "70% automated" doesn't mean "70% of your problems are solved." Here's what actually happened over 12 months.

Months 1-2: The Optimistic Phase

We started by automating the obvious stuff. Password resets, order status lookups, "how do I cancel" questions, billing inquiries. These categories made up about 45% of our volume, and the intent classification was straightforward.

Results were good. Response time for these categories dropped from an average of 47 minutes to under 10 seconds. Customers didn't complain. The agents were relieved to stop handling the same five questions all day.

Cost savings were immediate. We were processing about 3,000 tickets per month. At 45% automation with Supp's $0.20/classification and $0.30/resolution pricing, the automated portion cost us around $675/month. Our previous fully-human cost for those same 1,350 tickets was roughly $2,700 in agent time. Net savings: about $2,000/month.
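The arithmetic above can be sketched out directly. This is just the post's own numbers in code; the $2.00 human cost per ticket is an assumption implied by dividing $2,700 of agent time by 1,350 tickets.

```python
# Cost model from the post: Supp's per-ticket automation pricing vs. human handling.
MONTHLY_TICKETS = 3000
AUTOMATION_RATE = 0.45
COST_CLASSIFICATION = 0.20    # Supp pricing per classification
COST_RESOLUTION = 0.30        # Supp pricing per automated resolution
HUMAN_COST_PER_TICKET = 2.00  # assumption: $2,700 agent time / 1,350 tickets

automated_tickets = MONTHLY_TICKETS * AUTOMATION_RATE                      # 1,350
automated_cost = automated_tickets * (COST_CLASSIFICATION + COST_RESOLUTION)  # $675
human_cost = automated_tickets * HUMAN_COST_PER_TICKET                     # $2,700
savings = human_cost - automated_cost                                      # $2,025

print(f"automated cost: ${automated_cost:,.0f}/month")
print(f"net savings:    ${savings:,.0f}/month")
```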

Months 3-4: Expanding and Breaking Things

Encouraged by the early results, we pushed automation to cover more categories. Refund requests, feature questions, integration troubleshooting. This got us from 45% to 70%.

And this is where things got messy.

The edge case problem

A customer wrote in saying "I need to cancel my subscription." The automation classified it correctly as a cancellation intent and triggered the cancellation flow. Problem: the customer actually wanted to cancel a specific add-on, not their entire subscription. The message was ambiguous, and the system picked the most common interpretation.

This happened eight times in the first month at the expanded automation level. Eight customers had their full subscriptions cancelled when they only wanted to remove an add-on. Eight angry emails. Two of them churned because the experience felt so careless.

The tone-deaf bot

A customer whose payment had been double-charged wrote in frustrated. The automation correctly classified it as a billing issue and sent the standard response: "I can see your payment details. Here's how to update your billing information." The customer wasn't asking to update anything. They wanted their money back. The response was technically related to billing but completely missed the emotional context.

The false confidence trap

When automation handles most tickets successfully, the failures become invisible. They get buried in the 30% that goes to humans anyway. It took us six weeks to realize we had a systematic problem with a specific intent category because nobody was reviewing the automated responses that preceded human escalation.

The Fixes That Actually Worked

Confidence thresholds

We stopped treating every classification as equally reliable. Messages with high confidence scores (above 95%) got fully automated. Messages between 85% and 95% got automated with a softer touch: the response included a "Did this help?" button, and a "no" click routed to a human immediately. Messages below 85% went straight to a human.

This single change cut our false-positive automation rate by about 60%.
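A minimal sketch of that routing logic, assuming the classifier hands back a confidence score between 0 and 1. The `route` function and tier names are hypothetical; only the thresholds come from the post.

```python
# Three-tier routing by classification confidence (thresholds from the post).
FULL_AUTO = 0.95   # above this: fully automated response
SOFT_AUTO = 0.85   # between the two: automated, but with a feedback check

def route(confidence: float) -> str:
    """Decide how a classified message should be handled."""
    if confidence >= FULL_AUTO:
        return "automated"              # send the automated response outright
    if confidence >= SOFT_AUTO:
        return "automated_with_check"   # include a "Did this help?" button;
                                        # a "no" click escalates to a human
    return "human"                      # too uncertain: straight to an agent
```

The point of the middle tier is that the customer, not the model, gets the final say on whether the automated answer was good enough.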

Confirmation before consequential actions

For any action with real consequences (cancellation, refund, account changes), we added a confirmation step. "I understand you'd like to cancel your subscription. Just to confirm, you want to cancel your full account, not a specific add-on?" One extra message. Dramatic reduction in errors.
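The same idea as a sketch: consequential intents are gated behind an explicit confirmation before any action runs. The intent names and `next_step` helper are hypothetical illustrations, not Supp's API.

```python
# Gate: intents with real consequences require an explicit customer
# confirmation before the action executes (hypothetical intent names).
CONSEQUENTIAL_INTENTS = {"cancel_subscription", "issue_refund", "change_account"}

def next_step(intent: str, confirmed: bool) -> str:
    """Return what the bot should do after classifying a message."""
    if intent in CONSEQUENTIAL_INTENTS and not confirmed:
        # e.g. "Just to confirm, you want to cancel your full account,
        # not a specific add-on?"
        return "ask_confirmation"
    return "execute"
```

One extra round-trip per consequential action is the whole cost; the accidental-cancellation class of error goes away.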

Weekly automation reviews

Every Friday, one person spends an hour reviewing a random sample of 50 automated conversations. They flag anything that feels wrong: a correct classification with the wrong tone, a technically accurate but unhelpful answer, missed nuance. These reviews feed back into the system configuration.
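Pulling the Friday sample is a one-liner; the sketch below assumes you can export the week's automated conversation IDs as a list.

```python
import random

def weekly_sample(conversation_ids, k=50, seed=None):
    """Pick a random sample of automated conversations for the weekly review.

    Assumes conversation_ids is the week's exported list of IDs; a fixed
    seed makes the sample reproducible if two reviewers need the same set.
    """
    rng = random.Random(seed)
    return rng.sample(conversation_ids, min(k, len(conversation_ids)))
```

The `min()` guard just keeps the sampler from crashing in a slow week with fewer than 50 automated conversations.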

The Honest Numbers After 12 Months

Here's where we landed:

Automation rate: 68% (down from the initial 70%, because we tightened confidence thresholds and pulled some categories back to human handling).

Average automated response time: 8 seconds.

CSAT on automated conversations: 79%.

CSAT on human conversations: 86%.

Overall CSAT: 81% (up from 74% pre-automation, because the speed improvement on simple questions more than offset the slightly lower satisfaction on automated interactions).

Monthly cost for automated portion (roughly 2,100 tickets): about $1,050 at Supp pricing.

Monthly cost if those same tickets were human-handled: about $4,200.

Net monthly savings: approximately $3,150.

False positive rate (automation gave wrong answer): 4.2%.

Escalation rate from automated to human: 11%.
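The cost lines above check out against the same per-ticket rates used earlier in the post (the $2.00 human cost per ticket remains an assumption implied by the month-one figures):

```python
# Sanity check on the steady-state numbers: ~2,100 automated tickets/month.
tickets_automated = 2100
per_ticket_auto = 0.20 + 0.30   # classification + resolution (Supp pricing)
per_ticket_human = 2.00         # assumed agent cost per ticket

auto_cost = tickets_automated * per_ticket_auto    # $1,050
human_cost = tickets_automated * per_ticket_human  # $4,200
print(f"net monthly savings: ${human_cost - auto_cost:,.0f}")
```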

What We'd Do Differently

We'd start with confidence thresholds from day one instead of adding them after failures. The early "automate everything" approach was tempting because the numbers looked great, but the customer experience damage from edge cases took months to repair.

We'd exclude high-emotion categories from automation entirely. Billing disputes, complaints about service quality, anything where the customer is already upset. Speed doesn't help when someone's angry. Empathy does, and we haven't found a way to automate empathy convincingly.

We'd invest in the review process earlier. The weekly audit should have started in month one. Instead, we flew blind for three months and only added reviews after the cancellation incident.

The Real Takeaway

70% automation is achievable. The question is what that 70% looks like. If you automate carefully, with confidence thresholds, confirmation steps for consequential actions, and regular human review, it's a genuine win. Faster for customers, cheaper for you, more interesting work for your agents.

If you automate aggressively and treat the number itself as the goal, you'll hit 70% faster and spend the next six months apologizing for it.

See Supp's Automation in Action

$5 in free credits. No credit card required. Set up in under 15 minutes.
