The Satisfaction Quarterly Report, Q1 2008

I recently mentioned the American Customer Satisfaction Index (ACSI) to someone and was surprised that she had not heard of it. It’s a great research effort, led by Claes Fornell at the University of Michigan, that tracks customer satisfaction on a quarterly basis. Here’s a chart of the national average since the index was created in 1994:

ACSI National Satisfaction Scores

As you can tell, satisfaction scores have been generally on the rise over the last few years.

The ACSI provides both company-specific and industry-specific data for a different set of industries every quarter. The Q1 2008 ACSI looked at the following industries: hotels, restaurants, hospitals, cable & satellite TV, cellular telephones, computer software, fixed line telephone service, motion pictures, network/cable TV news, newspapers, wireless telephone service, airlines, express delivery, U.S. Postal Service, and energy utilities.

Here are some of the highlights from that Q1 2008 data:

  • Best & Worst Organizations:
    • Top rated: FedEx Corporation (express delivery), UPS (express delivery), Olive Garden (restaurants), and Southern Company (energy utilities)
    • Largest improvement (since last year): Ameren Corporation (energy utilities), Reliant Energy (energy utilities), Energy Future Holdings (energy utilities), and McDonald’s (limited service restaurants).
    • Lowest rated: US Airways (airlines), Charter Communications (cable & satellite TV), Comcast Corporation (cable & satellite TV), and Sprint Nextel (wireless telephone services).
    • Largest decline (since last year): US Airways (airlines), Continental Airlines (airlines), Sprint Nextel (wireless telephone services), and Northwest Airlines (airlines).
  • Best & Worst Industries:
    • Top rated: Express Delivery and Ambulatory Care.
    • Largest improvement (since last year): Hotels and Fixed Line Telephone Services.
    • Lowest rated: Airlines, Cable & Satellite TV, and Newspapers.
    • Largest decline (since last year): Newspapers and Broadcast TV News.

The bottom line: This should be a wake-up call to many firms (are you listening, airlines and cable & satellite companies?).

Lessons Learned From Chief Customer Officers

I just published a report called “The Chief Customer/Experience Officer Playbook.” To research the report, I interviewed executives whose responsibility for customer experience cuts across normal product and/or channel boundaries (we call them Chief Customer/Experience Officers, or CC/EOs) from several different organizations, including Air Transat, Alaska Air Group, Bank of America, Bombardier, the California State Automobile Association, Century Furniture, the Colorado Rockies, and Symantec. In addition, I spoke with Jeanne Bliss, author of the book Chief Customer Officer: Getting Past Lip Service to Passionate Action.

The research identified five categories of things that CC/EOs should do:

  1. Make sure that you’ve got the right environment.
  2. Prepare to take on a broad change agenda.
  3. Establish a strong operating structure.
  4. Kick off high-priority activities.
  5. Look ahead to the future.

The report goes into much more detail for each of these items. While I can’t share the whole report in my blog (that’s reserved for Forrester clients), I did want to share some of the most interesting quotes from the CC/EOs:

  • “It takes massive support from senior management. This role can destruct careers.”
  • “What’s more important, but less tactical and takes longer, is the realization that customer experience is culture. It’s the mindset of our associates and their empowerment. Not stuff, but attitudinal. We’ve recognized that this is a journey.”
  • “Each of the groups in our company already had some customer experience efforts, so I wanted to make sure that they were on board and not threatened. I needed to talk to each of those groups individually. It’s an ongoing issue – and it’s an ongoing effort for me.”
  • “We focus on employees first. Happy employees make a happy customer. They were very skeptical – so much of our communication is internally driven. We need to support the hell out of them.”
  • “I do a read out to the leadership team every month and tell them my perspectives on how we’re doing (fact-based); a no-holds-barred discussion. No attempt to keep any of that stuff under the rug.”
  • “Customers want one relationship with us and we’ve given them about 10. Our data sources and systems are isolated; the organizations are isolated. We’re trying to break down the silos.”
  • “We’re changing metrics in the call center to eliminate focus on average talk time.”
  • “If I did it over again, I would have focused earlier on consolidating our customer listening posts and voice of the customer efforts. We now look at the perception of reliability, not the actual reliability.”
  • “We’re looking for line of sight between our initiatives and NPS, which is a lagging indicator. We’ve worked on projects that have taken three quarters to improve the NPS.”

The bottom line: CC/EOs shouldn’t “own” customer experience, but they can really help support the organizational transformation required to improve it.

Let’s Learn From Delta’s [Continuing] Customer Experience Miscues

Let me start this post with a clear disclaimer — I never consider my personal experiences when evaluating customer experience in my research. Every large company periodically delivers subpar experiences, so anecdotes aren’t necessarily indicative of a company’s overall customer experience efforts. 

Having said that, I feel the need to share my experience with Delta Air Lines over the last two days, because there’s something to learn (or maybe unlearn) — and, to be completely honest, I feel the need to vent.

The summary: It took me nearly 13 hours to get from the airport in Richmond, Virginia to Boston’s Logan Airport. Along the way, Delta found many ways to make the experience miserable.

The painful details:

  • Yesterday, a colleague of mine and I boarded our plane to New York (JFK) in Richmond, Virginia and the plane pulled away from the gate at 6:15 PM – right on time.
  • Minutes after pulling away from the gate, the pilot said that we were on a ground hold and would need to wait there for a while. No real details. 75 minutes later we were brought back to the terminal and allowed to get off the plane.
  • By the time we got off the plane, there were no more options for us at the airport — either on Delta or on any other airline. (Note: JetBlue flight #1076 left ON TIME for JFK while we were sitting on the tarmac.)
  • The agent at the counter in Richmond was completely unhelpful. All she said was that we were now booked on a flight out of JFK for the following day. She was completely unwilling to explore any alternatives — even those that I suggested. She didn’t seem to care — even a tiny bit — that our 4-hour trip was now going to span a couple of days.
  • Well, we finally got to the JFK terminal shortly after 10:00 PM. It turns out that our connecting flight left at about 9:49 PM (Delta apparently didn’t think we were “important enough” to hold the connection). Now, on to the Delta customer service agent at JFK.
  • The agent told us we had no options to get home that night (although I have since found out that there was a JetBlue flight #1028 that left later that evening). We were booked on a 10:15 AM flight. Luckily I know about the Delta Shuttle — and was able to push him to book us on the 6:30 AM flight.
  • I asked him which hotel Delta was going to put us up at. He then informed us that Delta was not going to provide a hotel because it was not responsible for the problem. He used some technical terms that (in his mind) absolved Delta from all responsibility for our situation. Then I mentioned that it was, of course, Delta’s fault — the JetBlue flight that left after we pushed back from the gate seemed to get to JFK without a problem. His response was precious — “How do you know that?!” (as if I must be either mistaken or lying, neither of which was true). It reminded me of a Seinfeld episode. He was obviously well trained in the techniques of avoiding responsibility.
  • Well, the agent did give us a phone number of a service that helps Delta’s stranded customers find hotels in the area. So we called the number. The guy on the phone gave us the phone number for one hotel. We called the hotel and they had no vacancies. Thanks for the help Delta!
  • Well, we found a hotel in the area (on our own) and actually made it to LaGuardia the next day a bit early. So we tried to get on an earlier flight (6:00 instead of 6:30). You’ll never guess what the agent told us — “that will cost an additional $150.” That’s right, she wanted to charge us more money to get us home a day late! When we told the agent about the terrible experience that we had been through, she did a little research on her system and then said that it looked like our plane from Richmond left on time. The implication: Delta doesn’t need to go out of its way for us because it pushed the plane away from the gate at the scheduled time.
  • We finally “convinced” the agents at the desk to let us on the earlier flight (which was completely empty) without any additional charges.
  • Then, finally, we landed at Logan Airport at 7:00 AM today, 12 hours and 45 minutes after we first pushed back from the gate in Richmond.

The analysis: Delta’s records probably show that we were on 2 flights that left on time — and therefore had a successful trip. Obviously, though, our experience doesn’t match that assessment. Hopefully Delta (and other firms) can learn to avoid the following customer experience miscues that we ran into:

  • Poor communications. I understand that delays happen. But the situation gets much worse when customers are left in the dark. We did not get a lot of accurate information about the status of our flight as we were waiting — raising our anxiety level and making it difficult for us to formulate potential solutions to the problem.
  • No accountability. Along the way, every Delta employee seemed to be trained in mechanisms for denying responsibility. The tone of our interactions might have been different if Delta trained its employees to recognize that stranding customers at an airport is ALWAYS its problem.
  • No empathy. Throughout the entire ordeal, we did not run into a single Delta employee who said “I’m sorry” or even acknowledged our inconvenience. Maybe Delta can just teach agents to start interactions with stranded customers like this: “I know this is really inconvenient; let me see what we can do…”
  • No advocacy. All of the agents that we met were just trying to get rid of us. Not one of them asked what we wanted to do — and they certainly didn’t go out of their way to explore alternatives. A good lesson to learn: the most important time for helping customers is when they are in need. These moments of truth can build or break loyalty. In this case, Delta clearly achieved the latter.

The bottom line: You need to look at interactions from the standpoint of your customers (note to Delta and other airlines: “on-time departure” is not a good customer experience metric). It can provide a dramatically different view!

Epilogue: I sent Delta’s customer service group a link to this blog post via its complaint form. But rather than reading it, they sent me an email that said:

“…We appreciate the e-mail you sent. However, please send us your experience in a text form or letter.” 

Looks like Delta doesn’t really care what happened to me — but it is finding every possible way to avoid taking responsibility.

Epilogue #2: I finally got a response from a representative who seemed to have glanced at the feedback that I had to cut and paste into an email. So the airline decided that I qualified for a $75 credit, which it promised to send via another email. But two weeks later — there’s still no credit. The ineptitude of Delta’s customer experience efforts is truly comical. Where’s Ashton Kutcher? I must be getting Punk’d.

Don’t Neglect Your “Welcome Experience”

My wife and I just got back from golf camp at Stratton Mountain (it was actually called Stratton Golf University, but we liked to think of it more as “camp”). As we drove up on the first day, we were greeted by one of the instructors, who was standing in front of the parking lot. He showed us where to park, took our clubs, and showed us where to go next to sign in. Wow – what a welcoming experience!

Let’s dissect what went right:

  • We had no anxiety about what we needed to do.
  • We received an immediate “personal” connection.
  • We felt like the “University” was ready for us.
  • We had a great feeling about the week. 

Notice how I discussed what went right in terms of how my wife and I felt about the experience. When I work with companies, I don’t evaluate interactions based on my personal feelings, but in this case I was actually the target audience.

Some lessons learned about a good Welcome Experience:

  1. Assume customers don’t know as much as you think. We typically spend 40 hours or more per week at work — and many more hours thinking about work and our company when we’re not even there. So we know a whole bunch about our products, services, and processes. But, alas, customers don’t spend nearly that much time thinking (or caring) about our business. So firms have a tendency to assume that customers know more than they actually do — like where to park and what to do with your golf clubs.
  2. Make sure that customers know exactly how to start. If customers don’t know where to go first, then there’s a higher likelihood that they’ll get lost. But as obvious as that sounds, we still find that many experiences fail to get customers going in the right direction. What do these flaws look like? Website homepages that don’t provide clear evidence that the user can accomplish her goal; IVR menus that don’t offer a match to what a customer wants to do on the call; and large airports that don’t provide clear signage to the check-in locations for all of their airlines.
  3. Set the tone right away. If you want your customers to think that you are helpful — establish that context right away. Good or bad — the Welcome Experience shapes how customers view every interaction after that moment. As they say: you only get one chance to make a good first impression. 
  4. Provide feedback along the way. Don’t think of the Welcome Experience as a facade — it’s just the beginning of a continuous experience. Make sure that you provide customers with clear signals and insights as to what they should be doing next. The golf instructors didn’t just point to a building and say “go there and register”; they took us to the door and pointed to the registration table. We’ve all seen what happens when this goes wrong. Think about a detour you were forced to take while driving, only to find a sparse set of detour signs along the way. Even if you were heading in the right direction, you still wanted to see a sign confirming that you were on the correct detour route.

How can you tell if you have a good Welcome Experience? I can think of two great ways:

  • Ask your customers. Why not ask customers in your post-interaction surveys about specific elements of the Welcome Experience? Or even interrupt a few people early in the process and ask them what they like or dislike about the experience.
  • View the experience through your customers’ eyes. As you’ll find out in many of my posts, I often recommend that companies internalize the concept of Scenario Design. Think of your target customer and ask these questions: Who is that person? What are her goals? How are you helping her accomplish those goals?

At this point in my post, you’re probably waiting for me to get to the bottom line. So here it is: We had a great time at golf camp — and my wife and I are both hopeful that we cut at least 5 strokes off of our golf scores (which were pretty high to begin with).

Net Promoter And Satisfaction Battle For King Of The Ring

Let’s start with a confession: I’m a big professional wrestling fan, so I really enjoy a good battle. One thing that I’ve learned from the WWE is that it’s the storyline that makes a battle come to life. And the Net Promoter vs. Satisfaction debate has all of the story trappings of a great tag team match!

On one side of the ring, in the blue trunks, is the tag team of Fred Reichheld, “father” of the Net Promoter Score (NPS) concept, and Satmetrix Systems, implementor of NPS-based survey systems. On the other side of the ring, in the red trunks, we find Claes Fornell, “father” of the American Customer Satisfaction Index (ACSI), and ForeSee Results, implementor of ACSI-based survey systems.

Both of these teams are fighting for their approach to be recognized as “THE” measure for tracking customer relationships. To put this into perspective, this type of measure represents only one of the five levels of a voice of the customer program (see my earlier post on voice of the customer programs).

Let’s start by handing out some awards to the teams:

  • Best marketed: Net Promoter (Reichheld is very good at touting his concept — and at writing compelling books about it)
  • Most mature: Satisfaction (The ACSI has been tracking data since 1994, and satisfaction has been around as long as I can remember)
  • Most quantitative: Satisfaction
  • Sexiest: Net Promoter (it’s caused a lot of hoopla)

Net Promoter has gained a lot of momentum over the last few years as many large companies have adopted it. The methodology is pretty straightforward: ask people how likely they are to recommend your firm. Based on their response, they get categorized as a Promoter, a Detractor, or neither. You take the percentage of Promoters, subtract the percentage of Detractors, and that leaves you with a Net Promoter percentage.
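To make the arithmetic concrete, here is a minimal sketch of that calculation in Python. It assumes the commonly published cutoffs on the 0-to-10 recommendation scale (9-10 = Promoter, 0-6 = Detractor, 7-8 = “Passive”, the “neither” group), which the paragraph above doesn’t spell out; the function name and the sample responses are hypothetical, not part of any official Net Promoter toolkit.

```python
def net_promoter_score(scores):
    """Compute a Net Promoter score from 0-10 'would you recommend?' responses.

    Assumes the standard cutoffs: 9-10 = Promoter, 0-6 = Detractor,
    7-8 = Passive (counted in the total, but in neither group).
    Returns a percentage that can range from -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one survey response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Hypothetical responses: 4 Promoters, 3 Passives, 3 Detractors
responses = [10, 9, 9, 10, 8, 7, 7, 3, 5, 6]
print(net_promoter_score(responses))  # 40% - 30% = 10.0
```

Note that the Passives still count in the denominator, which is why a flood of lukewarm 7s and 8s drags the score toward zero even though none of those customers is a Detractor.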

This debate was fueled by a recent study published in the Journal of Marketing, which found that…

Using industries Reichheld cites as exemplars of Net Promoter, the research fails to replicate his assertions regarding the “clear superiority” of Net Promoter compared with other measures in those industries.

Well, if you’re wondering what I really think about this Battle Royale, here it is. Just like in wrestling, the storyline is much more exciting than the reality of the battle. Here’s my take on the contest:

  • Net Promoter is not the “ultimate” measure for a customer relationship.
  • Then again, neither is satisfaction.
  • But companies are better off when they have more satisfied than dissatisfied customers and more Promoters than Detractors.

My recommendations:

  • Don’t expect any single measure to be a panacea. Both measures are good, but neither one provides enough information to fully gauge customer relationships or enough diagnostic detail to make all of the necessary improvements.
  • Focus on one measure to build alignment. Picking a single measure to focus on (whether or not it’s perfect) can be very valuable in aligning the organization. If you can get your entire company focused on either raising satisfaction or increasing the number of Promoters, then you will likely see significant improvements in the really important metrics: retention, sales, etc. So, if in doubt, pick one and move on.
  • Evolve your metrics over time. The previous two bullets may seem to contradict each other, but they don’t when you look at them over time. The value of locking into a single measure like Net Promoter comes as much from aligning the organization as from the perfection of the metric itself. But after the organization gets aligned, firms will need to build out a portfolio of metrics — and find out for themselves which measures are both predictive and diagnostic.
  • Look at Customer Advocacy. The ring was too crowded to add another contestant to the match earlier in this post, but for some industries we’ve found another measure that is a powerful indicator of loyal customer behavior. So, in the purple trunks is Customer Advocacy, the perception that the firm does what’s best for customers, not just what’s best for its own bottom line. We strongly recommend that financial services and healthcare firms take a very close look at this measure.

The bottom line: Don’t get too caught up in determining the winner of this battle. Just make sure that you do something and are prepared to learn and evolve over time.

If you’re a client of Forrester, then I also recommend that you read these two research documents: Building Your Voice Of The Customer Program and Voice Of The Customer: Five Levels Of Insight.

Are You Listening To The Voice Of The Customer?

Voice Of The Customer (VoC) is a term that many people use, but few people can define. That’s the type of environment in which I love to do research. So I ended up writing two research documents on the topic: Building Your Voice Of The Customer Program and Voice Of The Customer: Five Levels Of Insight (as always, only Forrester clients can read the full reports). To start with, I developed the following definition for a VoC program:

A systematic approach for incorporating the needs of customers into the design of customer experiences

This definition contains three key elements:

  • A systematic approach. Most companies take an informal approach to gathering customer feedback. A VoC program should augment — not replace — those ad hoc approaches with a more structured way to gather and use customer insights.
  • Customer needs. Companies often have access to a great deal of customer data — but customer insights don’t automatically surface from data. A good VoC program uncovers the current and emerging needs of key customers — and helps identify areas where those needs are not being met.
  • Experience design. Gathering customer insights is only an interim step to improving customer experience. Why? Because VoC programs deliver the most value when companies actually make changes to better serve the customer needs uncovered by the research.

My research also identified five distinct levels of activities in a VoC program:

  1. Relationship tracking. Organizations need to track the health of customer relationships over time. That’s why companies often ask customers to fill out surveys — typically quarterly or annually — about their perception of the firm. Using this feedback, companies can create metrics that are simple to understand and easy to trend; see the sketch after this list for a toy example. Why is this important? Because an easy-to-grasp report card helps align everyone in the organization around a common purpose. (Note: I won’t get into the debate between “satisfaction” and “Net Promoter” metrics in this post, but I’ll definitely be touching on that in the future.)
  2. Interaction monitoring. Every customer interaction — from an online transaction to a call into the call center — is important. Firms need a way to monitor how effectively they handle these customer touches. That’s why many companies do post-interaction surveys — asking customers for feedback on recent interactions.
  3. Continuous listening. Structured feedback through customer surveys provides enormous opportunities for analysis. But one of the strengths of these approaches — providing data — is also a limitation. To avoid this data-only view of customer relationships, companies put in place processes for executives to regularly listen to customers. There are many opportunities to hear what customers are saying, such as listening to calls in the call center, reading blogs, reading inbound emails, and visiting retail outlets.
  4. Project infusion. The following statement is probably not too controversial: Projects that affect customers should incorporate insights about customers. Despite the clear need for this type of effort, many companies lack a formalized approach for infusing customer insights into projects. To close this gap, some firms are incorporating customer insight steps into the front end of their Six Sigma processes.
  5. Periodic immersion. Every so often, it’s valuable for all employees — especially executives — to spend a significant amount of time interacting directly with customers or working alongside frontline employees. These experiences, which should be at least a half day, provide an excellent opportunity for the company to question the status quo.
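Since the “easy to trend” report card in level 1 is the piece people most often ask about, here is a small illustrative sketch in Python of what that rollup can look like. Everything in it is hypothetical: the quarter labels, the 0-100 ratings, and the function name are invented for illustration and are not drawn from any particular survey tool.

```python
from collections import defaultdict

def quarterly_relationship_trend(responses):
    """Roll raw relationship-survey scores up into one average per quarter.

    `responses` is an iterable of (quarter, score) pairs, e.g. ("2008-Q1", 75).
    Returns {quarter: average score} in chronological order: a simple,
    trendable report-card metric of the kind described in level 1.
    """
    totals = defaultdict(lambda: [0.0, 0])  # quarter -> [sum, count]
    for quarter, score in responses:
        totals[quarter][0] += score
        totals[quarter][1] += 1
    return {q: s / n for q, (s, n) in sorted(totals.items())}

# Hypothetical survey data: (quarter, 0-100 relationship rating)
survey = [("2007-Q4", 72), ("2007-Q4", 68), ("2008-Q1", 75), ("2008-Q1", 81)]
print(quarterly_relationship_trend(survey))
# {'2007-Q4': 70.0, '2008-Q1': 78.0}
```

The point isn’t the code; it’s that the output is a single number per period that anyone in the organization can read and trend.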

Here’s a graphic that shows more detail on the five levels:

Five Levels Of A Voice Of The Customer Program

Hopefully this helps to create some common language around the Voice Of The Customer.
