Overall Working Definition and Tips:

Working Definition:

Focus percentage is the default suggestion for the percentage of focus/effort/time we expect an executive to spend on a particular category of measures.


  • Focus percentages are meant to be guidelines that you can and should adapt for your particular situation.
  • Explain the context and intent of the measures so people can understand what you are trying to accomplish.
  • The two most difficult transitions for most executives will be:
    • Moving from measurements as a means of ‘control and compliance’ to one of ‘listening and learning’.
    • Letting go of measures you have historically relied on and starting to capture and use harder-to-measure categories like Knowledge/Collaboration and Acceleration.
  • Some organizations have found it useful during the transition period to keep the old way of measuring running for a while, in parallel but in the background, until it became clear that the old measures weren’t adding enough additional value.
  • An activity-based measure is easy to measure and easy to manipulate. An example is commenting on, or creating, two knowledge base articles a week: an agent can game it by simply posting ‘I agree’ as a comment on an existing document.
    • Don’t put goals on activities.
  • This contrasts with an ‘outcome-based’ measure, which is much harder to measure and much harder for a single individual to manipulate directly. A good example is the Customer Effort Score*.
    • It’s OK to put a goal on an outcome as long as you can provide your team with specific actionable behaviors you want them to demonstrate that will help them achieve that outcome.

Working Definitions and Tips for Customer Category

Working Definitions

  • Emerging measures may not initially yield highly valid or reliable data, but they are important to begin using. As measures become more valid and reliable, they become metrics.
  • Serviceability Suggestions are service-related suggestions from customers or employees. For example, a customer who has been offered an upgrade by Sales attempts to use the new features and contacts Support. Support, however, has no idea the offer was made, which results in many extra steps before the customer can make use of the new features. The employee’s suggestion? “Notify us when Sales makes an offer to an existing customer so we can make sure the customer is able to make full use of the additional functionality immediately.”
  • % of Serviceability Suggestions from Customers Accepted is the number of service-related suggestions from your customers that are accepted (implemented), expressed as a percentage of those submitted.
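As a back-of-the-envelope sketch (the function name and numbers are illustrative, not part of the framework), the acceptance percentage reduces to a simple ratio:

```python
def acceptance_rate(accepted, submitted):
    """Suggestions accepted (implemented) as a percentage of those submitted."""
    if submitted == 0:
        return 0.0  # nothing submitted yet, so no meaningful percentage
    return 100.0 * accepted / submitted

# e.g. 12 of 40 customer serviceability suggestions implemented
print(acceptance_rate(12, 40))  # 30.0
```

The same ratio applies to the employee and knowledge-driven suggestion measures defined later in this document.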


  • Customer Support has the broadest set of interactions with customers of any group in the organization, yet we see only 50-60% of the total demand for our services, and only a tiny percentage of those customers make a suggestion. So when someone does make a suggestion, we should listen carefully and do something with what they say.
  • A simple rule of thumb is that for every time a customer encounters a problem, there is a 20% hit to loyalty. In other words, for every five customers who have problems, one will leave the next time they have a purchase opportunity[1].
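The rule of thumb above is simple arithmetic, sketched here as a hypothetical illustration (the function name and inputs are made up):

```python
def expected_defections(customers_with_problems, loyalty_hit=0.20):
    """Rule of thumb: each problem encounter costs roughly a 20% hit to
    loyalty, i.e. about one in five affected customers defects at their
    next purchase opportunity."""
    return customers_with_problems * loyalty_hit

print(expected_defections(5))    # 1.0 -> one of five leaves
print(expected_defections(250))  # 50.0 expected defections
```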


  • Net Promoter Score* and the Secure Customer Index* are based on periodic surveys and are a measure of the ongoing relationship.
  • The Customer Effort Score* and Customer Satisfaction score are often based on a survey which is sent to a customer after they interact with your Support team.
  • First Contact Resolution and First Day Closure are highly correlated with Customer Satisfaction, and they are more likely to be something your managers can impact, which is why they are included on the manager side.
  • If you don’t systematically collect and act on service-related suggestions from your customers, start doing so now.

[1] John A. Goodman, Customer Experience 3.0

*  Customer Effort Score, Net Promoter Score, Secure Customer Index, Q12 and KCS(SM) are trademarks of CEB, Satmetrix, Burke, Gallup and the Consortium for Service Innovation, respectively.

Working Definitions and Tips for Employee Category

Working definitions:

  • Employee Engagement is a periodic survey that measures how emotionally connected an employee is to the organization. There are a number of commercial options, or you can create your own, as many organizations have done.
  • Employee Satisfaction is a periodic survey that measures how happy an employee is with their current job and conditions.
  • % of Serviceability Suggestions from Employees Accepted is the number of suggestions from your employees that are accepted (implemented), expressed as a percentage of those submitted.
  • Time to Proficiency is a measure of how quickly a new team member is ‘proficient’ in their job or how quickly an existing team member learns a new skill.


  • Employee engagement is a leading indicator of financial performance[1].
  • If you focus on the needs of the business and the needs of the customer at the expense of your team, you will get a high turnover rate.


  • Research shows that 70 percent of the variance in employee engagement is influenced by direct managers[2].
  • Leaders at industry-leading customer support organizations do not accept a high turnover rate as a byproduct of the high stress job their teams do.
    • If your organization has a 10 percent annual turnover rate, you will lose roughly half of your experienced workers in five years, even if total headcount stays the same.
    • Turnover costs between 100% and 150% of an employee’s annual salary.
  • If you don’t systematically collect and act on service-related suggestions from your employees, start doing so now. One of the most common frustrations of customer support teams is that while they know more about the internal workings of your entire organization than any other group, their opinions/ideas for improvements are rarely solicited or acted on.
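The turnover arithmetic above can be checked with a small sketch (a hypothetical illustration, assuming departures fall evenly across tenure each year):

```python
def experienced_remaining(years, annual_turnover=0.10):
    """Fraction of the original (experienced) cohort still present after
    `years`, assuming departures fall evenly across tenure each year."""
    return (1.0 - annual_turnover) ** years

# At 10% annual turnover about 59% of the original cohort remains after
# five years; the 'half of your experienced workers' figure is the simpler
# linear reading (5 years x 10% = 50%).
print(round(experienced_remaining(5), 2))  # 0.59
```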



Working Definitions and Tips for Business Category

Working Definitions:

  • Klever’s Law: Customer Time to Value = Time to Value (before sale) + Time to Value (after sale) + Time to Smile (after interruption).
  • Time to Smile is the total elapsed time between when a customer’s ability to use the product/service is interrupted and when they get back to a happy state. Note: This is often called ‘Time to Resolution’ or ‘Time to Restore’.
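Klever’s Law is a straightforward sum of elapsed times. A minimal sketch (the day values are made up for illustration):

```python
def customer_time_to_value(ttv_before_sale, ttv_after_sale, time_to_smile):
    """Klever's Law: total customer time to value is time to value before
    the sale, plus time to value after the sale, plus the cumulative Time
    to Smile (elapsed time from interruption back to a happy state)."""
    return ttv_before_sale + ttv_after_sale + time_to_smile

print(customer_time_to_value(14, 7, 2))  # 23 (days, illustrative)
```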

Leading support organizations have realized that we have fixated on cost and efficiency at the expense of a superior customer (and employee) experience and the value delivered. This is one category where we should pull back on some of what we currently measure and report on, to free up mindshare to think about and act on some of the other categories of measures.

Shift the emphasis away from internally focused numbers to those that are focused on end-to-end processes, from the customer’s point of view, like Customer Time to Value and Time to Smile.  This will require leadership and discussions with other groups.

Working Definitions and Tips for Knowledge/Collaboration Category

Working Definitions:

  • Level Zero Solvable
    The percent of incidents resolved by the Support organization that could have been resolved by the customer using self-service.
  • Time to Publish
    How long it takes for knowledge to go from known internally to available externally. ‘Publish most of what you know, quickly, to your customers and it will dramatically improve customer success with self-service.’
  • Ratio of new to known incidents being handled by the support organization.
    • Ideal state, 85% new.
  • Attach Rate
    The percentage of incidents that have at least one knowledge article attached.
  • Participation Ratio
  • Article Quality Index
  • Collaboration Effort Score
    This is an emerging measure that is best handled by a survey question. Something like “Department X makes it easy for me to collaborate with them.”
  • Knowledge-driven suggestions are suggestions for improvement that are made by analyzing the data from your knowledge articles.
  • % of Knowledge-driven suggestions Accepted are suggestions for improvement that came from analyzing your knowledge articles that are accepted (implemented) as a percentage of those that were submitted.


Knowledge-sharing is a set of behaviors, supported by best-in-class methodologies (like KCS(SM)*) and enabled by technologies. Fundamentally, people have to want to share knowledge and it has to become part of their workflow.


  • One of the struggles with knowledge-sharing is that most executives don’t understand how to manage it and tie it into business objectives. Here is one approach, built on four key measures, to get you started. The first two address getting most of what your organization knows into the hands of your customers as quickly as possible. The third looks at how easy it is to collaborate with other groups. The fourth looks at how well you are harvesting actionable information from your knowledge articles.
  • The first measure is Level Zero Solvable – take the actual words your customers use when they contact you and run them as search terms on your website to see what percentage of incidents your customers could have solved themselves if the information had been available online.
  • The second measure is Time to Publish: how long it takes for knowledge to go from known internally to available externally. (Think minutes, not weeks.)
  • The third measure, Collaboration Effort Score, will give you a sense of how easy or difficult it is for teams to interact across groups.
  • The fourth measure, % of Knowledge-driven Suggestions Accepted, closes the loop between actionable information from knowledge articles and your doing something about it as an executive.
  • A big opportunity is to embed knowledge-sharing techniques beyond just the support organization. (This will help Time to Smile and Customer Time to Value.)
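The Level Zero Solvable step above can be sketched in code. This is a hypothetical illustration: the naive keyword match stands in for whatever search engine actually powers your self-service site, and all names and sample data are made up.

```python
def level_zero_solvable(incident_phrases, published_articles):
    """Percent of resolved incidents a self-service search could have
    answered. `incident_phrases` are the customers' own words; `published_articles`
    maps article titles to their text. A phrase counts as solvable if every
    word in it appears in at least one published article."""
    if not incident_phrases:
        return 0.0
    hits = 0
    for phrase in incident_phrases:
        words = phrase.lower().split()
        for text in published_articles.values():
            body = text.lower()
            if all(word in body for word in words):
                hits += 1
                break  # one matching article is enough for this incident
    return 100.0 * hits / len(incident_phrases)

incidents = ["password reset", "license expired"]
kb = {"Resetting your password": "How to do a password reset in three steps."}
print(level_zero_solvable(incidents, kb))  # 50.0
```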

Working Definitions and Tips for the Acceleration Category

Working Definitions:

  • Measures rationalization project
    Perhaps one of the first projects would be to take an inventory of what you currently measure (and why) and what you do with it, and then see how to make a transition to the measures in the Open Customer Metrics Framework.


In the interrupt-driven world of customer support, we rarely have the luxury of uninterrupted time to make big improvements. In order to do that, we have to slow down and think, try intelligent experiments and adopt a learning loop that allows us to make significant improvement in outcomes.

Focusing on Acceleration (the rate at which improvements are realized) as a measure is a way to stay on track.

This category is the hardest to pin down, but it is also perhaps the most important, because it makes sure you are always applying what you learn about your customers, your business and your employees.


  • Focus on the long-term view, which may span multiple years, while delivering value each quarter.
  • Make sure you get top management sponsorship and communicate early and often to ensure that they still support the project.
  • As you choose which Acceleration projects to start with, consider the capacity of your team/organization to do the work that is needed, and the capacity of the organization to adopt the changes. Sometimes you will have to figure out what you can stop doing in order to free up capacity.
  • Give clear ‘guiderails’ that people should not go outside of; within those boundaries, they should be free to make radical improvements (and make mistakes).
  • A really good way to make an impact at the company level is to target improvements in which support plays a key role. For example, end-to-end projects involving product quality or process quality across multiple departments in a company.
  • Check to see if your company has a Quality team, Six Sigma team or a Project Management Office.  They often have talent and resources that can help your Acceleration project(s) significantly.