2012 Year in Review: Top 10 Post-call IVR Survey Mistakes




It’s that time of year again when we all begin to reflect on the past year and make resolutions about the things we want to change in the coming year.  I hope you are as excited as I am about all the possibilities the new year could bring.

Many organizations are doing things right.  And we celebrate them.  But the majority will get a lump of coal in their stocking because of their mismanaged Customer Experience measurement programs.

While the Chinese Zodiac tells us it will be the Year of the Snake, let’s proclaim 2013 to be the Year of Customer Experience Transformation!

Post-call IVR surveys can be an extremely valuable tool in your customer experience measurement program.  And we all know service cannot be transformed without a valid customer experience measurement program, yet too many post-call IVR survey programs out there are merely collecting data (and a lot of it is bad data).  I suppose a lump of coal is a lot better than the results some customer experience measurement programs will bring: lots of measurement with no changes, changes not supported by valid analysis, and the malpractice of holding contact center agents accountable to the scores.  If you are guilty of running one of these ineffective, outdated programs, it is definitely time to ask Santa to bring you the right post-call IVR survey program for your customer experience measurement program! Get a free example RFP

To ensure that your 2013 truly is the Year of Service Transformation, let’s take a look at the Top 10 post-call IVR survey mistakes from the Ghost of the Past:

  1. Do you think any survey is better than no survey? A survey is a survey, right?  Wrong!  If you are measuring the wrong things, in the wrong way, and/or not analyzing the results correctly, it’s better not to be measuring at all.  If you ask a customer a simple Yes or No question like ‘Were you satisfied with the service you received?’ what can you really do with that information other than count the yes and no answers?  Now think about asking ‘How satisfied were you with the service you received?’ on a 10-point scale.  That answer is more meaningful and can be used to do the right kind of analysis.
  2. Do all of your caller types receive the exact same post‐call IVR survey script? Even if your company only sells one product, do all of your customers contact you with the same question?  If you asked a person calling about their warranty the same questions that you asked somebody who called to gather information prior to making their purchase, what did you learn aside from whether or not they were satisfied with their experience?  By tailoring your questions to the type of contact/customer you are able to obtain information that will help you transform processes and policies within the center and beyond.
  3. Do you only have questions about agent performance?  You have a live customer who has agreed to give you their feedback – why would you only ask how they felt about the agent they spoke with?  How is that going to help you find out how the customer feels about the changes you made recently to the return policy?  Wouldn’t you like to quickly know if the changes are going to potentially impact future sales?
  4. Is your post‐call IVR survey missing conditional skip patterns?  If a customer answers ‘No’ when asked if they used a particular specialty service that you offer, do they still get asked questions pertaining to that service?  Conditional skip patterns not only let you tailor the survey to the customer you are speaking with, but also reduce the amount of time the customer has to spend answering your survey questions.  Not to mention, customers are more likely to continue answering questions when those questions actually pertain to them and the services they have used.  (A minimal sketch of skip logic follows this list.)
  5. Do you block or filter customers from participating?  Many organizations set filtering rules about who is invited to participate in a survey, and too often those rules exclude the wrong customers.  What if the people you excluded (for whatever reason) were the only ones who could tell you whether a recent change was seen as the improvement you intended?  Participation is not mandatory, so give them the option to participate.
  6. Do you have a list of reasons that cause you to throw out surveys? Manipulating the data because someone doesn’t like it introduces bias and will prevent you from obtaining useful information.  For example, if you throw out a survey because the customer was calling the third time about their issue and a supervisor thinks that the customer is being unreasonable, you are missing out on what you can learn from this customer.  Why did they have to call back?  Was it due to agent error or process breakdown? If you don’t ask them, you don’t really know.
  7. Have you failed to implement the Survey Calibration process? If a customer doesn’t follow the instructions, chooses 1 as the top score instead of 9, and leaves a comment telling you about the error, do you leave it or correct it?  If a customer leaves a comment stating that they were actually rating Johnny, whom they spoke to earlier, and not Suzie, the last agent and the one being evaluated, do you leave the survey with Suzie or move it to Johnny?  The Survey Calibration process handles all of that and more.  By sanitizing the data to ensure that surveys are linked to the proper agents and that scores reflect what the customer intended, you instill confidence in the information.  That confidence is required for Service Transformation because agents and management must trust the results.
  8. Are real‐time service recovery alerts MISSING from your post‐call IVR survey program?  Do you have to wait until a report is produced to see that survey scores were low in a particular area on a certain day?  Real-time alerts let you address the issue while it is happening instead of days or weeks later when a report is published.  By setting thresholds, you can be notified whenever a customer responds above or below the threshold (and opts for a call-back) so you can quickly evaluate the situation and initiate an effective service recovery.  (A sketch of a simple threshold alert also follows this list.)
  9. Are you unaware of a survey participant’s request to be called back?  What does it say about your customer service if a customer leaves a comment during the survey requesting a call back and nobody ever calls?  Being made aware when a customer has requested a call back is not only an opportunity to resolve that issue; it’s also an opportunity to keep that issue from snowballing into a much bigger deal than it needs to be.  Not every customer will qualify for an alert (see #8), and those who don’t will fall through the cracks if continuous attention is not paid to the comments they leave.
  10. Are you afraid to use your post‐call IVR survey findings to make operational and strategic decisions? If you aren’t using the data you are collecting to transform the service of your organization, why are you bothering?  If it isn’t actionable, it’s just noise.  And it may be worse than noise if you are measuring the wrong things in the wrong way and not doing the right analysis – survey malpractice is extremely costly and dangerous.
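
To make mistake #4 concrete, here is a minimal sketch of conditional skip logic, written in Python rather than any particular IVR vendor’s scripting language.  The question IDs, prompt wording, and the skip_unless rule format are hypothetical illustrations of the idea, not an actual platform API.

```python
# Hypothetical post-call survey definition: each question can declare a
# skip_unless rule naming an earlier question and the answer that must be
# given before this question is asked.
QUESTIONS = [
    {"id": "used_specialty",
     "prompt": "Did you use our specialty service? Press 1 for yes, 2 for no.",
     "type": "yes_no"},
    {"id": "specialty_sat",
     "prompt": "How satisfied were you with that service, on a scale of 1 to 10?",
     "type": "scale_1_10",
     "skip_unless": ("used_specialty", "yes")},
    {"id": "overall_sat",
     "prompt": "How satisfied were you with the service you received, on a scale of 1 to 10?",
     "type": "scale_1_10"},
]

def run_survey(ask):
    """Walk the question list in order, skipping any question whose
    skip_unless condition is not satisfied by an earlier answer.
    `ask` is whatever plays the prompt to the caller and returns their answer."""
    answers = {}
    for question in QUESTIONS:
        condition = question.get("skip_unless")
        if condition:
            depends_on, required_answer = condition
            if answers.get(depends_on) != required_answer:
                continue  # the caller never used the service, so don't ask about it
        answers[question["id"]] = ask(question["prompt"])
    return answers
```

A caller who answers ‘no’ to the specialty-service question is never asked to rate it, which is exactly what keeps the survey short and relevant.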
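
And for mistake #8, here is an equally rough sketch of a threshold-based service recovery alert.  The field names (overall_sat, callback_requested), the threshold value, and the notify_supervisor helper are all assumptions made for illustration, and the sketch only handles the low-score case of the thresholds described above.

```python
ALERT_THRESHOLD = 4  # hypothetical: scores at or below 4 on the 10-point scale trigger an alert

def notify_supervisor(alert):
    """Placeholder notification hook; in practice this might send an email,
    an SMS, or open a ticket in whatever system the center already uses."""
    print(f"SERVICE RECOVERY ALERT: {alert}")

def check_for_alert(survey):
    """Evaluate one completed survey the moment it arrives and, if the score is
    at or below the threshold and the caller asked for a call-back, raise an alert."""
    score = survey.get("overall_sat")
    if score is not None and score <= ALERT_THRESHOLD and survey.get("callback_requested"):
        notify_supervisor({
            "caller_id": survey.get("caller_id"),
            "agent_id": survey.get("agent_id"),
            "score": score,
            "reason": "low score with call-back requested",
        })
        return True
    return False
```

Run against a completed survey with a score of 2 and a call-back request, this fires immediately instead of waiting for next week’s report.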

Did you answer yes to any of these? Be sure to check out our free e-book, 25 Post-call IVR Surveying Mistakes to Avoid with Self-Assessment in our resource library to help you get a better understanding of what you need to do to ensure that 2013 outshines 2012.  Unless, of course, you are banking on Santa bringing you that IVR survey program that is the foundation for your Service Transformation.  Be sure to let me know if Santa comes through for you. Get a free consultation

Happy Holidays!

About Vicki Nihart

Vicki is Six Sigma certified and has more than 15 years of experience in contact center operations. She leverages contact center operations subject matter expertise and deep analytic capabilities to support clients and build relationships. Analyzing and interpreting data is a passion she converts into purpose.
