(If you have not read the previous entry, the introduction to this methodology, you can do so here)
In this second part I want to cover the Strategic Measurement Framework I use.
Most organizations approach the customer experience in the same way: isolated islands of processing, each completing a function or portion of a process, that somehow interconnect. Each of these processes has its own reports, metrics, and goals. These reports tie the goals for each function to the efficiency of the actions (i.e. how well did we do what we are supposed to do). These reports are tactical in nature, and so are the metrics associated with them: number of executions per time slot, speed of execution, and compliance rates, among others.
This was acceptable in the old days, when organizations had different departments that dealt with different customer interactions. There was a specific accounting department, for example, that dealt with accounting issues, and other departments for finance, warranties, and so on. Each of these functions was isolated from the others and executed within its own shell. If an accounting problem arose, that department would receive either a request to act or a transferred telephone call. They would solve the problem as best their rules and operations allowed, and that was all they ever saw of the customer. If there were other problems, they would transfer the request to another department. Connections between departments and cross-departmental operations and reporting were non-existent. The following figure represents how this worked.
As time passed, customers demanded more of customer service organizations. They expected not to be transferred and for all of their issues to be handled by the first person to interact with them. This coincided with the advent of CRM solutions and early integration between the back office (ERP systems) and the front office (CRM solutions). That integration gave organizations the visibility they needed to begin handling end-to-end process interactions.
The problem they ran into most often was that the underlying systems were the same as in previous generations; even though they could see and use the data to solve more complex interactions, the reporting structures and data models were not modified for this new operating model. Metrics changed slightly to reflect the new way of working and began to accommodate processes rather than functions – albeit still focused mostly on efficiency. For example, the total handle time for a transaction was no longer how long each department took to act on a specific action, but rather how long the entire process took. New metrics like First Call Resolution, Service Level Agreements, and end-to-end efficiency metrics began to emerge. The following picture denotes this generational growth.
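To make that metric shift concrete, here is a minimal sketch with made-up records and field names (not any particular system's schema): once each department's work is keyed to a single interaction, end-to-end handle time and First Call Resolution fall out of the same data.

```python
# Hypothetical per-department records keyed by a shared interaction id:
# (interaction_id, department, handle_minutes)
interactions = [
    ("INT-001", "accounting", 12),
    ("INT-001", "warranty",    8),
    ("INT-002", "accounting",  5),
]

# End-to-end handle time: sum every department's time for the same interaction,
# instead of reporting each department's time in its own isolated report.
end_to_end = {}
for int_id, _dept, minutes in interactions:
    end_to_end[int_id] = end_to_end.get(int_id, 0) + minutes

# First Call Resolution: share of interactions closed by a single contact
# (here approximated as interactions touched by only one department).
contacts = {}
for int_id, *_ in interactions:
    contacts[int_id] = contacts.get(int_id, 0) + 1
fcr = sum(1 for n in contacts.values() if n == 1) / len(contacts)

print(end_to_end)  # {'INT-001': 20, 'INT-002': 5}
print(fcr)         # 0.5
```

The key move is that the interaction id, not the department, becomes the unit of measurement.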
Although the systems were meant to accommodate entire processes, the reporting became more convoluted as metrics from each function were combined – usually manually, sometimes in a data warehouse or data mart – to create cross-functional reporting. The areas where the functions cross in the diagram above (for example, between call type and cost per transaction) would result in a customized report being created and executed. Although this created the appearance of cross-functional reporting, these reports were often outdated by the time they were produced due to the lengthy processing time and effort involved in preparing them.
In these systems we start seeing effectiveness and customer-centric metrics emerge. The first stand-alone, isolated customer satisfaction reports begin to appear. There is little, if any, relationship between these and the specific functions and efficiency metrics – it is just another island of reporting. This creates the false impression that organizations are concerned with the customer experience, and coincides with the first Customer Experience Management (CEM) deployments and the appearance of surveys in enterprises – ultimately leading to Enterprise Feedback Management (EFM). In this second generation we still see isolated, function-specific reports and some cross-functional and end-to-end efficiency metrics, as well as the early adoption of effectiveness metrics and customer satisfaction.
Alas, there is still no focus on customer experience – just an acknowledgment that it exists. There are no formal processes or reports in place to measure satisfaction with the processes or specific portions of them, nor are organizations taking any action to improve processes based on the feedback collected.
Customers begin to push organizations to recognize that stand-alone customer satisfaction measurements are not a good way to find out how to provide better experiences, and a few organizations begin the difficult task of convincing their management ranks that CEM is more than a passing fad. They hear about organizations that, by focusing on the experience and its effectiveness as opposed to the efficiency of functions or processes, achieve great results: reduced customer turnover, lower costs of customer acquisition, and lower costs of customer maintenance. In other words, we enter the era of CEM as a mainstream discipline.
The problem is that virtually all organizations are trying to provide new solutions on the same old data model and process-centric measurement “framework” (if one exists at all). Most organizations begin to realize that their existing data structures cannot support CEM, as there is no effectiveness measurement that spans or integrates the entire process. In addition, as most data models and applications came together separately, and data warehousing has not yet accomplished integration between these data models, there is no cohesiveness between the data used in these applications. This is where a framework like the SMF works best. The following picture depicts this better.
The SMF is a graphical representation of a customer-centric data model focused on both operational excellence and customer loyalty. Because neither of those concepts can be measured in isolation from the other, this framework creates an opportunity to manage end-to-end processes by correctly aligning the data elements embedded in the model with the specific processes and functions implemented. By implementing a framework like this, organizations can
- Optimize operational efficiency by merging customer and process feedback (whether direct from the customer or via SLAs) with the actual agent or system work, resulting in a cradle-to-grave vision of how an interaction performs. Weak spots are easily identified either by examining performance records or by listening to feedback on what works and what doesn’t.
- Strengthen their Customer Experience implementations by always aiming to improve the experience for the customer – whether it comes from internal functions (not visible to the customer) or external functions (which the customer can provide feedback on).
- Better understand the relationships between their existing reports and their entire operations
- Analyze all collected and existing data together and draw insights that are valuable for managing future implementations and solutions; integrate this into an analytical package or strategy
- Structure their existing reports within an end-to-end framework, making it simpler to understand specific performance, inflection points, and weak points, and which resolutions will work for any given problem
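The benefits above rest on one structural idea: every process step (the efficiency side) and every piece of feedback (the effectiveness side) is keyed to the same interaction. Here is a toy sketch of such a customer-centric data model – hypothetical class and field names for illustration, not the actual SMF schema:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:               # one function's work on an interaction (efficiency side)
    function: str                # e.g. "accounting", "warranty"
    duration_minutes: float
    met_sla: bool

@dataclass
class Feedback:                  # customer or agent feedback (effectiveness side)
    source: str                  # "customer" or "agent"
    satisfaction: int            # e.g. on a 1-5 scale
    comment: str = ""

@dataclass
class Interaction:               # one end-to-end customer interaction
    interaction_id: str
    steps: list = field(default_factory=list)
    feedback: list = field(default_factory=list)

    def total_duration(self):    # end-to-end efficiency, cradle to grave
        return sum(s.duration_minutes for s in self.steps)

    def weak_spots(self):        # steps that missed their SLA
        return [s.function for s in self.steps if not s.met_sla]

interaction = Interaction("INT-001")
interaction.steps.append(ProcessStep("accounting", 12, met_sla=True))
interaction.steps.append(ProcessStep("warranty", 30, met_sla=False))
interaction.feedback.append(Feedback("customer", satisfaction=2, comment="too slow"))

print(interaction.total_duration())  # 42.0 minutes end to end
print(interaction.weak_spots())      # ['warranty']
```

Because both kinds of records hang off the same interaction, a weak spot found through performance data can be read next to the feedback that explains it, instead of living in a separate island of reporting.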
There are some considerations in adopting this framework, such as changes to processes, data models, data warehousing, and integration strategy. A full adoption will take a long time and require many changes in the organization. However, merely adopting the theories behind it (end-to-end measurement and metrics, a common and consistent data model, and integration of feedback across the entire process) as organizations evolve toward customer experience will yield valuable benefits by structuring processes and metrics and aligning strategies and actions.
A key aspect of using this framework is that feedback is collected at different stages across the process. We can collect feedback from the user or the agent on the entire end-to-end process, or we can collect metrics from the user or the agent on the efficiency or effectiveness of each specific action. Because each interaction is identified and measured across its entire life-span, we can relate the feedback and metrics collected to each interaction and act on what we see as an issue, or on what the feedback has identified as one. We are empowering the user and the agent to indicate areas for change, and we are looking at process-specific, efficiency-driven metrics to do the same. By mixing efficiency and effectiveness metrics we can focus on a win-win situation: working on items that need an efficiency boost and seeing the corresponding improvement in effectiveness.
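That mixing of the two metric families can be illustrated with a short sketch. The stage names, numbers, and thresholds below are all assumed for illustration; the point is that when an efficiency problem and an effectiveness problem land on the same stage, that stage is the win-win target:

```python
# Hypothetical stage-level records: an efficiency metric (minutes) and an
# effectiveness metric (satisfaction, 1-5) collected for the same stage.
stages = [
    {"stage": "intake",     "minutes":  3, "satisfaction": 5},
    {"stage": "diagnosis",  "minutes": 25, "satisfaction": 2},
    {"stage": "resolution", "minutes": 10, "satisfaction": 4},
]

# Assumed thresholds, chosen only for this example.
MAX_MINUTES, MIN_SATISFACTION = 15, 3

# Flag stages where slow execution coincides with poor feedback: fixing the
# efficiency problem there should show up as an effectiveness improvement.
targets = [s["stage"] for s in stages
           if s["minutes"] > MAX_MINUTES and s["satisfaction"] < MIN_SATISFACTION]
print(targets)  # ['diagnosis']
```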
Finally, there is one element that emerges when adopting this model that also ties back to a previous post I wrote in this blog. I said that benchmarking between enterprises will make them mediocre, as they are always striving to do just enough to measure up to their competitors. Internal benchmarking – the ability to compare results for specific actions from one time period to the next – is actually a best practice in CEM. The adoption of this framework comes with an end-to-end process effectiveness index that can let organizations know exactly how well they perform across the entire customer experience – not just one portion of it – and compare over time. This index will get more attention in the last entry in this series.
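Since the index itself is covered later in the series, here is only a sketch of the general shape such an index can take (an assumed weighted average, not the SMF's actual definition): score each stage's effectiveness on a 0-1 scale, weight it by its share of the process, and compare the resulting single number across periods.

```python
def effectiveness_index(stage_scores, weights):
    """Weighted average of per-stage effectiveness scores, each in [0, 1]."""
    assert len(stage_scores) == len(weights)
    return sum(s * w for s, w in zip(stage_scores, weights)) / sum(weights)

# Assumed stage weights and scores, for illustration only.
weights = [1, 2, 1]                                   # intake, diagnosis, resolution
q1 = effectiveness_index([0.8, 0.5, 0.9], weights)    # last period
q2 = effectiveness_index([0.8, 0.7, 0.9], weights)    # this period

# Internal benchmarking: the same index, compared from one period to the next.
print(round(q1, 3), round(q2, 3), q2 > q1)  # 0.675 0.775 True
```

One number per period covers the entire customer experience, so an improvement in a single stage (diagnosis here) is visible without losing the end-to-end view.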
For now, I welcome comments on this framework and any ideas on how to improve it. This is a summary, and there is a lot more material associated with the SMF, so if you want more details please feel free to ask. Please let me know what you think by leaving comments below, or by sending me an email (contact information is in the Contact tab above).
See you next week for the third installment of this methodology, when we start diving into the different areas of Crafting Awesome Customer Experiences.