Every CS vendor will tell you they want to help you implement best practices.
They'll send you implementation guides. They'll assign you a customer success manager — yes, your CS vendor has a CS team — who will walk you through the recommended workflows. They'll host webinars about how leading CS organizations use their platform. They'll have a certification program. A community. A conference.
All of it designed to get you operating the way they need you to operate in order to use their product.
And they call it best practices.
Here's the thing that should make every CS leader pause.
Every CS platform has its own set of best practices. And they're all different.
Gainsight's best practices are not Totango's best practices. ChurnZero's recommended workflows don't look like Catalyst's. The way one platform thinks about health scores is fundamentally different from how another one does it. The data model that drives one tool's playbooks is built on different assumptions than the one driving its competitor.
If best practices were actually best practices — if they were derived from some universal truth about how Customer Success should work — they would converge. They don't. They diverge. Because they're not derived from universal truths about Customer Success. They're derived from the specific architecture of each product.
"Best practices" is what vendors call the changes they need you to make to your operation so their product works.
The Compromise You've Been Making
When a vendor tells you to implement their best practices, what they're really telling you is: modify your operation to conform to our product.
And because the framing is "best practices" rather than "product constraints," you've been doing it. Not because you were naive. Because it seemed reasonable. Because everyone else was doing it. Because the alternative — building your own tooling from scratch — wasn't viable.
So you compromised. You adapted your onboarding motion to fit their onboarding module. You restructured your health scoring to match their available data points. You ran QBRs the way their QBR template was designed, not the way your customers actually needed them to run.
And when it didn't quite work, someone from the vendor's CS team explained, very professionally, that you just needed to lean into the best practices a little more.
The Best Practices Test
Here's a simple test. Take any CS best practice your vendor has recommended to you. Then ask: does this make my customers more successful, or does this make me a better user of this product?
Most of the time, the honest answer is the second one.
That's not a criticism of the vendors. They're building products for a broad market. Their tools have to work for thousands of different CS operations. To do that, they need you to operate within their schema. Of course they're going to call that schema best practices. What else would they call it?
But you don't have to pretend the compromise isn't happening.
What Actual Best Practices Look Like
Real best practices in Customer Success start with the customer's desired outcome and the appropriate experience required to achieve it. They're derived from what works for your specific customers, in your specific market, with your specific product.
They look different for every company. That's not a bug. That's the point.
Your onboarding motion should be built around what your customers need to achieve value, not around what your CS platform's onboarding module can accommodate. Your health signals should reflect what actually predicts churn and expansion in your book, not what your platform can measure natively. Your playbooks should capture how your best CSMs actually work, not how a product team at a CS vendor imagined CSMs should work.
Agentic workflows make this possible for the first time, because the workflow is built around your operation, not around someone else's schema. The tool fits you. Not the other way around.
That's not a best practice. That's just a practice. Yours.
