“Does your current post-call IVR survey prevent you from collecting multiple customer comments?” is one of the 26 items (25 mistakes plus a bonus item) outlined in the 25 Mistakes to Avoid with Post-call IVR Surveys e-book and self-assessment. Answer the diagnostic assessment questions to uncover issues with your own post-call survey program; you can even use it to build a program that exceeds all expectations. Customer Relationship Metrics has documented the common mistakes we have seen since inventing and providing post-call IVR surveying programs in contact centers 20 years ago. To fulfill one of our missions, bettering the contact center industry, we freely share the insights we have learned with everyone.
Why is this a problem?
The act of “collecting” customer feedback with a post-call IVR survey is not extremely difficult, and that is part of the problem. It is not uncommon for contact center managers to fulfill the requirement to have a customer feedback tool by simply activating a software module to collect the data. Turn it on and the data starts to pour in, right? Well, that is true, and like every other area in your contact center, you end up with too much useless data accumulating. There are 25 other points in this self-assessment to help you stop garbage data from coming out of your post-call IVR survey program.
Everyone knows cheap and simple solutions do not deliver value in the long run. Does your customer experience measurement program feel like a glorified comment card? Unfortunately, you are not alone. That is the evil side of benchmarking: do what the average do and you are what…average? The top-performing contact centers do not focus on data collection; they focus on insight capture. If collecting data points is all that is desired, your post-call IVR survey will be little more than a comment card and the results will be just as worthless. Simple solutions cannot effectively collect explanations from callers about why a particular score was given, or a description of the unresolved issue. A score alone is the bare bones; a comment is the much-needed meat.
Everyone who presents customer experience results is quickly asked questions that start with “why is…?” or “why does…?”. Those who provide a summary of “comment card” survey results lose credibility fast. Without the ability to explore the reasons for the scores, the value of the data diminishes. So, if your technology solution blocks you from branching to open customer verbatims at multiple points within the survey script navigation, your risk of comment card foolishness just went way up.
The best of the best contact center leaders have measurement programs that are research solutions delivering insights. Sure, they use technology, but that technology complements their needs rather than dictates them. If your survey is limited in the number and placement of customer comments because of the software, you are doomed to comment card hell. The opportunity to leave a comment must be driven by the response to the corresponding question, for example, “you indicated that you are less than satisfied with the progress toward resolution of your issue…” Each opportunity is customized to fit the response (here a low score, but there would also be a complementary high-score version), and the customer is asked to tell you more about the score (the meat on the bone). The opportunity must also come immediately after the question, not be thrown in at the end of the survey. It is a bad practice to ask a customer to provide comments on a question asked three questions (or more) earlier.
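To make the branching idea concrete, here is a minimal sketch of a score-driven comment prompt. The question ID, the 1–5 scale, the thresholds, and the prompt wording are all illustrative assumptions, not any specific vendor's implementation:

```python
# Illustrative sketch: the comment opportunity is tailored to the score and
# comes immediately after the rated question, not at the end of the survey.
from typing import Optional

LOW, HIGH = 2, 4  # assumed thresholds on a 1-5 satisfaction scale

def comment_prompt(question_id: str, score: int) -> Optional[str]:
    """Return a score-specific verbatim prompt, or None to skip the comment."""
    prompts = {
        "resolution_progress": {
            "low": ("You indicated that you are less than satisfied with the "
                    "progress toward resolution of your issue. Please tell us why."),
            "high": ("You indicated that you are very satisfied with the progress "
                     "toward resolution of your issue. Please tell us what went well."),
        },
    }
    if score <= LOW:
        return prompts[question_id]["low"]
    if score >= HIGH:
        return prompts[question_id]["high"]
    return None  # mid-range scores skip the open comment

def run_question(question_id: str, score: int, recordings: list) -> None:
    """Play the tailored prompt and capture the verbatim right away."""
    prompt = comment_prompt(question_id, score)
    if prompt:
        # Record the customer's comment before moving to the next question.
        recordings.append(f"[{question_id}] {prompt}")
```

The design point is that the prompt is evaluated immediately after each rated question, so the verbatim is captured while the score is still fresh in the customer's mind.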
Data collection tools that do not constrain customer comment capture add incredible value to customer experience analytics. The best contact centers want analytics because analytics is what makes a post-call IVR survey program actionable. Let me be clear: gathering customer comments is only the first step. The second step is converting the comments to text. The final step is reporting the agent-specific comments to each recipient and analyzing the content of the customer comments.
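The three steps above can be sketched as a simple pipeline. Here `transcribe` is a hypothetical stand-in for a real speech-to-text service, and the word-count "analysis" is only a placeholder for genuine text analytics:

```python
# Hypothetical sketch of the pipeline: collect verbatims, convert them to
# text, then route agent-specific comments and analyze their content.
from collections import Counter

def transcribe(recording: bytes) -> str:
    # Placeholder: a real program would call a speech-to-text service here.
    return recording.decode("utf-8")

def build_agent_report(verbatims):
    """Group transcribed comments by agent and tally recurring words.

    verbatims: list of (agent_id, recording) pairs collected by the survey.
    Returns (comments grouped per agent, word-frequency counts across all comments).
    """
    by_agent = {}
    for agent_id, recording in verbatims:
        by_agent.setdefault(agent_id, []).append(transcribe(recording))
    themes = Counter(
        word.lower().strip(".,")
        for comments in by_agent.values()
        for text in comments
        for word in text.split()
    )
    return by_agent, themes
```

The per-agent grouping supports the reporting step (each recipient sees the comments about their own calls), while the frequency counts gesture at the analysis step that turns raw comments into themes.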
So, as you see, collecting data is easy. Collecting the data you need to be better than average is not. Just remember: if you do not have any meat on your bones, you're dead.