Saturday, December 23, 2006

While walking through the streets of Hermanas, a small town outside of Cape Town, South Africa, I wondered to myself, "How much has the PM movement infiltrated a country so far removed from the average American's frame of reference?" (To settle those of you who just read that statement and felt your hackles go up: this is usually the case, unless those Americans come from the financial, medical or telecom sectors of corporate America. Admit it...you still think there are lions, elephants and giraffes roaming the streets of South Africa, which, for those of you who do not know, is a country -- yes, a country...not just a reference to the SOUTHern part of the continent itself.)

OK, now that we have covered our geography lesson for the day, let's get back to the question at hand...What do you think? Have BI and performance management made as much headway down here as they have in the US, especially over the course of the last few years?

At first, I will admit, I didn't think so...There are still many regions in the US that wouldn't have the first clue what PM and BI stand for, let alone how to employ the strategic advantage each presents. But I was wrong...

Dichotomy is defined as "a division into two non-overlapping or mutually exclusive and jointly exhaustive parts. They are often contrasting and spoken of as 'opposites'. The term comes from dichotomos (divided): dich- ([in] two) + temnein (to cut)" (en.wikipedia.org/wiki/Dichotomy).

Business intelligence, on the other hand, is defined as "the process of gathering information in the field of business. It can be described as the process of enhancing data into information and then into knowledge. Business intelligence is carried out to gain sustainable competitive advantage, and is a valuable core competence in some instances" (en.wikipedia.org/wiki/Business_intelligence).

If you want to think about the ubiquitous impact of BI in your workplace, all you need to do is ask yourself, "Have I ever opened or reviewed a spreadsheet?" Many people think that just because numbers appear in cell A3 of an Excel worksheet, they are looking at data. And while, in its most raw format, a singular number in a cell could be seen as data, how often is that the case in Excel, versus, say, a formula in A3 -- something like =A2+B2 -- which results in a singular number displayed for the end user, or you?
OK, that was a bit difficult for the non-analyst to follow. To break it down, simply take the following case:

If you have 2 apples and 3 oranges and you want to know how much fruit you have in total for your restaurant, you might enter the 'data' into a spreadsheet that looks something like this:

In A1, B1 and C1 cells, enter 'Apples', 'Oranges' and 'Total', respectively. These serve as your headers.
Underneath those headings in A2, B2 and C2, enter '2','3' and then stop. What do you enter in cell C2?

You could easily eyeball 2 + 3 and enter a hard-coded '5', but is that the most useful way to enter DATA into a spreadsheet? What happens if the farmer who serves up your fruit makes a mistake and sends you 3 apples and 4 oranges? When you enter the new numbers, does the 'Total' column change along with the new values?

The answer is no. But if you were to enter the following formula into C2: =A2+B2, the end result is still 5 in the first example; yet when you modify A2 and B2 with the new values of 3 apples and 4 oranges, it automatically recalculates the total to equal 7. This is the relativity of information, or the non-absolute transformation of data.
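For the programmers in the audience, the same hard-coded-versus-formula distinction can be sketched outside of Excel. This is a toy illustration in Python, not anything Excel-specific:

```python
# A hard-coded total is a snapshot; a formula recalculates from its inputs.
apples, oranges = 2, 3

hard_coded_total = 5  # typed in by hand, like entering '5' directly in C2

def formula_total(a, b):
    # Behaves like =A2+B2: always derived from the current inputs
    return a + b

# The farmer sends different quantities...
apples, oranges = 3, 4

print(hard_coded_total)                # still 5 -- stale
print(formula_total(apples, oranges))  # 7 -- recalculated
```

The hard-coded value never notices the fruit changed; the formula does, which is the whole point of the relative reference.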

This is a simplistic example, but it serves an illustrative purpose. In one case, the absolute reference to 5 total pieces of fruit is, in my eyes, useless for most organizations that utilize long-term or big-picture strategy maps to manage performance. Why?
Because the analyst manpower required to continually update cells as the data changes is huge and unproductive, while a relative reference (using the formula rather than the hard-coded value) is a one-time effort that, utilized properly, frees analysts to actually analyze rather than just report on information. Most companies use their analysts mostly as reporting analysts, rather than as the data mining experts or statisticians they should be, because of the time involved with manual ETL (the extraction of the data from the source systems, followed by the compilation, computation and validation of the data's integrity -- all of which still doesn't guarantee there won't be calculation errors). By the time this process is done, the analyst is usually racing to meet some arbitrary or pre-determined deadline and hasn't the time to actually draw conclusions from the data they have now transformed into information.
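To make the manual ETL grind concrete, here is a hedged sketch in Python; the source rows and field names are invented for illustration:

```python
# Manual ETL in miniature: extract raw rows, transform text to numbers,
# validate integrity, then compute. Every step is a chance for error when
# an analyst repeats it by hand each reporting cycle.
raw_rows = [
    {"unit": "apples", "count": "2"},   # source systems often hand back text
    {"unit": "oranges", "count": "3"},
]

def extract_and_transform(rows):
    cleaned = []
    for row in rows:
        count = int(row["count"])       # transform: cast text to integer
        if count < 0:                   # validate: reject impossible values
            raise ValueError("bad count for " + row["unit"])
        cleaned.append({"unit": row["unit"], "count": count})
    return cleaned

def total_fruit(rows):
    return sum(row["count"] for row in rows)

print(total_fruit(extract_and_transform(raw_rows)))  # 5
```

Automating this once, instead of re-keying it every cycle, is what buys the analyst time to actually analyze.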

If you think about the two definitions that I led off with, does this dichotomy within most companies translate into analysis strength or weakness within an organization?

I would say the latter...

There are only a few companies that get "it" and invest in performance management systems, balanced scorecarding, and strategy maps tied to performance indicators, with line of sight through the business units, tied to employees and downstream, through to the end transaction and the customer's experience and perception of how well you serve their needs. Whew...that was a mouthful. And as I always stress, one should never go down that path without having first built the system manually and felt the pains of manual ETL and aggregations; who better to help decide which product/platform you should RFI during the software selection process than the analysts who do the work itself? Also, if you automate without a manual framework, you have no roadmap from which to build your rules or requirements; you are left flying blind.

This would be like driving on a winding road at night with a blindfold on...Figuratively, it is corporate suicide.

Yes, there are those companies that buy PM or BI software, only to later complain that it 'doesn't work'. This complaint is usually driven less by the BI product or platform itself and more by how the organization decides to implement that platform within its IT infrastructure and the subsequent processes and business rules built into the BRE (business rules engine, or 'if this, then what').
But when you do have your manual system in place and you do start your PM process -- say with a product like Microsoft's Business Scorecard Manager (BSM), which is currently being productized as 'PerformancePoint 2007' (don't get me started about how excited I am to have the power of an OLAP administrator and report designer (thanks to the ProClarity acquisition) all in one, with a nice little scorecarding package, and, did I mention, integrated with the Office platform -- what more could a BI dork like myself ask for?) -- you, the end user or analyst reading this blog, become the organizational fire preventer rather than the fire fighter of the past, enabled by technology to do your job better.

When data is transformed into information systematically, you are left with a resource pool that can actually utilize those Harvard degrees, modeling out 'what-if' scenarios based on information served up at the click of a button. It is the difference between data and information that motivates the cream-of-the-crop companies to achieve stretch goals and BHAGs...
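A 'what-if' scenario is just a model whose assumptions you can flex once the underlying data arrives systematically. A minimal sketch, with invented numbers:

```python
# Toy what-if model: vary one assumption and compare against the baseline.
def projected_revenue(bookings, avg_order_value, conversion_lift=0.0):
    return bookings * avg_order_value * (1 + conversion_lift)

baseline = projected_revenue(10_000, 85.0)                        # no lift
scenario = projected_revenue(10_000, 85.0, conversion_lift=0.05)  # 5% lift

print(scenario - baseline)  # incremental revenue under the 5% lift scenario
```

The analyst's value shows up in choosing and challenging the assumptions, not in re-typing the inputs.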

All in all, a dichotomy exists between the manual and systematic approaches to analysis of information and transformation of data into such information as what serves up PM for any organization. By conquering and dividing out this duplicity, you are left with the strength and knowledge that outweighs what most of your

Monday, October 23, 2006

Sales folks present an interesting dichotomy in nature: these hunting pariahs, tracking their consumer prey, asking whether you like the Sonics. You fall victim only once, but painfully so that one time. Here's how to avoid the same: when you see the air pistol, followed by the audibly annoying sizzle their forefinger makes when pressed to their "hot" shoulder (think the 'stzzzzzz' sound one makes after doing something awesome -- like you are the sh*t), just know that they are about to tell you they scored two court-side tickets through their self-proclaimed "connections in the biz," waiting for you to bite at the bait before asking you to tag along.
OK -- seems harmless, no? Absolutely not!! Here are the aforementioned clues to look out for:
1. First, the air pistol and "drop it like it's hot" finger sizzle are both played out;
2. Second, sales folks can be disingenuous at a disproportionately higher rate than many other, equally popular industries;
3. Third, while you may think they will leave work at work -- come on now, they are sales people, after all -- they often carry the belief that one must "sell" everything in their life, at all times. Playing devil's advocate, one might note this isn't all bad: they tend to "fake it 'til you make it," and in some situations that tactic, when employed tactfully, can be extremely powerful for the people involved.
4. Fourth, people skills beyond the norm, which for the good ones are innate, intrinsically generated, and often represent a skill-set that cannot be learned or taught to others (at that level);
Sorry to break it to ya...

Wednesday, July 19, 2006





Saturday, July 08, 2006

I am often amazed when I find out how advanced we are at EXPE with regard to our Scorecard program -- Over the course of the last 2 years, I have presented at a handful of conferences, including those of CFO and CIO magazines, ICPQ Quality, and BSCOL, where multiple people would approach me afterwards to ask how they could mirror our work.

Well, let me start by saying that it has taken us almost 3 years to work through the issues that come when any company deploys a brand new quality program or measurement system -- Often, you'll have integrity issues requiring ETL, or transformation/cleansing of data that is hidden within disparate databases siloed across your company. Next, you have the problem of 'jumping into the weeds' too soon, vis-à-vis the overexcitement that comes with having clean data ready to extract and publish in a balanced scorecard format. I find that people, when presented with the option of what data should constitute their most important performance indicators, or KPIs, for their scorecard, will act as if it is a 'Chinese Menu' and try to order 1 of everything, thus bogging down the scorecard with too much information. While anyone who reads my blog knows how I feel about business metaphors like 'jump into the weeds', it is apropos to mention at this point that 'staying at the 30,000 or 60,000 foot view' for as long as you possibly can is critical. I recommend not having the project manager or program manager who will eventually maintain the ongoing balanced scorecard from an administrative perspective be involved in the strategic decision-making around relevant and strategic KPIs that cover all 4 perspectives outlined by Kaplan and Norton's balanced scorecard methodology. And you do not have to launch out of the gate with the perfect version of your scorecard with all 4 perspectives included. At EXPE, we started with the VOC, or Voice of the Customer, perspective, followed by our finance perspective, then internal voice (i.e. your employees)/learning and development.
While we weren't truly balanced when we launched, nor were our KPIs perfect from the get-go, we took the next 3 years to manually create and iterate on the scorecards while we went across the company on a 'road show' to build support and executive buy-in for our program. This is the next piece that is critical for any quality program to be successful. As Jim Collins discusses in 'Good to Great' and 'Built to Last', what is the point of quality if it isn't tied to the bottom line? Quality for the sake of quality is a slippery slope (oops...another metaphor), often causing companies financial pain if they go in with guns blazing on any new quality program without first understanding how the VOC impacts the top and bottom lines.

Being the solutions-oriented person I am, at this point I would be asking, 'This is all great -- but how exactly do you tie VOC to the strategic vision of your company?'

Answer: it's not easy...and while I may sound redundant, it took us 3 years before we A) got tied into the strategic planning process, B) got the entire company engaged and enthused by the methodology, all leading up to my 3rd point, C) even thought about automating the process.

After 3 years, we finally automated as part of the early adopters program for Microsoft's BSM product (code-named Maestro during the beta program) -- For a fraction of the cost of Cognos or Business Objects, other software vendors who offer scorecard modules, Microsoft developed a flexible and user-friendly tool for connecting to multiple different data sources and data types (i.e. relational (SQL/Oracle), multidimensional or cube data (i.e. Analysis Services), or even flat file and manual entry). No matter how technical you or your company are, you can use the software to help you.
But I stress...Do not automate at the same time as you build your program, even if your executive sponsor or boss approves buying the software ahead of time. If you do not have both executive buy-in and company-wide adoption of your scorecard program, you will end up with a 'cool tool that people find interesting' that, over time, becomes less and less important to the stakeholders, especially if you see a lot of turnover or mergers/acquisitions within your org. In the end, in order to be a truly balanced view, it must roll all the way up to the top of the food chain; you will know you have achieved nirvana when CEO-level strategic planning and execution is based on KPIs measured on your Scorecard. There is no one view that everyone must look at: CEOs will want an overall company/brand/lines-of-business view; but would a call center agent find that useful? Not so much. They care about their AHT and adherence to schedule, so their view would be very different. But it is all very doable with MS's Business Scorecard Manager (or BSM) product.

For reference, check out BSCOL, or the Balanced Scorecard Collaborative (bscol.org) -- They have a great website full of information, and they run conferences during the year that bring together some of the leading experts in the scorecard space, including the fathers of the methodology, Norton and Kaplan (my personal heroes)!

Saturday, February 25, 2006


Interestingly, I have seen in my travels through the random world of Operations and Contact Centers (a forcibly implemented change from the vanilla version, 'call center') a true need for process excellence. Yet, of late, 'process excellence' has become a trend, rather than a disciplined methodology or philosophy.

This 'flavor of the month' mentality has been around forever, but with business book publishing so popular, the methodologies of TQM, CMMI, CRM, CEM and others have breezed through many contact centers like a hurricane, often accompanied by SWAG ('stuff we all get', or the PC definition of the 'S'), agent rewards, and other external motivators. Often, centers are dedicated and a large kickoff celebration usually occurs.

The executives think that the call center agents (get the subtle joke?) are into the pomp and circumstance, and while some agents definitely buy in (mostly for the free giveaways -- and why not? I was once an agent too!), my experience has shown me that most, including myself, are skeptical or simply do not care. If they have any tenure under their belt, agents will give you an earful about the 1-to-2-year trends that have swept through their call centers. And when a new program pops up spouting 'Customer is #1' or 'We love our Agents'...'Customer Retention is Most Important', the agents know that under the covers are still the demands of 'Reduce your Handle Time' or 'WE HAVE 50 CALLS ON HOLD' (which, incidentally, is often also spouted by a ticker-tape monitor relaying the current calls holding, as if we were monitoring stock values in a broker's office at the NYSE).

And while most centers have seen short-term financial gains or have been able to quantify the 'soft savings' from increased customer retention or propensity to repurchase, over the long haul these programs have bordered on being a 'flavor' rather than a true cultural shifter.

And then, without notice, Six Sigma swoops down on the centers. And believe me, I support the initiative...I am one of the black belts, but I fancy myself different, having started from the ground floor as an agent at a different company and ridden the corporate wave to a new, Internet-based travel company in a new role. This is where the term 'Ignorance is Bliss' originated, I'm sure: people not wanting to know both the blessings and the curse of launching a new quality program in an existing center.

It is best to start with Design for Six Sigma when you first launch the centers. But when that isn't an option (probably 98% of the time), we have other things to do to win over the agents.

1) Schedule a meeting with one or two agents (one with high quality scores), a new hire (for a fresh-eyes perspective), a trainer and a supervisor. The objective is to discuss ways to reduce waste in the center (don't focus only on handle time...it is the inverse of First Call Resolution).
2) Ask for ideas about things that delay them on the call and afterwards. Note them on the 3M flip boards that come with the sticky side for easy attachment of the sheets to the wall.
3) Create 3 columns: one that says 'Noise', one that says 'Controllable' and one that says 'Standard Operating Procedures (SOP)' -- Add each of the ideas from Step 2 to one of these columns. Circle those that are in the agents' control (Controllable) in one color -- Circle those in the executives' control (SOP) in a different color, and cross off those that cannot be controlled by the agent. Focus on the ones that are under the agents' control and the business' control (unless you have items in the Noise column that you later find can be addressed by agents or management; if so, move them to the corresponding column and out of 'Noise'). Take the items in 'Noise' and rewrite them onto a new sheet -- Title this sheet "Issues Out of Agents' Control" and move it to the side -- You will be revisiting this sheet in Step 9.
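Step 3's sorting exercise can be sketched as a simple grouping; the issue names and their column assignments below are hypothetical:

```python
# Sort brainstormed issues into the three flip-chart columns.
# 'Noise' items get parked for Step 9; the rest get worked on now.
classified = [
    ("CRM screens load slowly", "SOP"),           # in the business' control
    ("agents skip call-control phrases", "Controllable"),
    ("regional phone-line outages", "Noise"),     # out of everyone's control (for now)
]

columns = {"Noise": [], "Controllable": [], "SOP": []}
for issue, column in classified:
    columns[column].append(issue)

actionable = columns["Controllable"] + columns["SOP"]  # focus here first
parked_for_step_9 = columns["Noise"]                   # the sheet set to the side
```

The payoff of the three-column split is exactly this: a short, actionable list now, and a parked list you commit to revisiting.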
4) Ask for ideas and ways you can help. Note them on the flip boards.
5) Encourage the team to visualize every call as having three steps in a storyboard -- Think of yourself as a screenwriter and the caller is Hollywood -- This is key:

A. What does Hollywood want? An action, romance, comedy...(In terms of the center: what does the customer want? A refund, a change, a recap...)

B. What did we agree to do? (List what you did in Step A along with the associated written procedures -- NOT the actual steps you [the agent] took. That comes in Step C.)

C. What did we do? (List what you actually did for what you wrote in Step A.)

The gap between B and C presents an opportunity for improvement that is both measurable and scope-able.

Agents who focus on these basic steps will listen more closely, ask enough questions and keep promises. The result is likely to include reduced handle time, fewer repeat calls and transfers, and increased FCR.
6) Ask for ideas to test this concept for reducing handle time. Conducting a pilot with one team may be a great place to start.

7) Develop a Communications Plan and Strategy -- One idea I have read about: communicate this approach to one team, share the results of your meeting, and ask them to pilot the concept for 30 days. We actually work with our Communications Manager for internal Operations/Contact Center broadcasts and with our internal PR team for anything that goes company-wide.

Let your center director know about this project and promise to inform her about the results. You may want to incorporate this new philosophy into your training and coach supervisors on using this new approach with their respective teams. A good communications strategy is key to building confidence in all of your center management groups.
8) Then look at the results. ACD reports may reflect an increase in handled calls by each agent and a reduced TT, or talk time.
9) Last but not least, we are not ignoring the Noise items. As the final step, take that flip chart titled "Issues Out of Agents' Control" and address the items one at a time! Stack-rank them in order of risk -- How? Group the agents together and create an FMEA (Failure Modes and Effects Analysis) -- The result? RPN scores, or risk indicators.

These are based on taking the Occurrence (rated 1 to 10, 10 meaning a lot of occurrences) x Severity (same scale as Occurrence) x Detectability (rated 1 to 10, where 10 means NOT DETECTABLE -- note this scale reads a bit differently than 'O' and 'S'). The product of these 3 numbers is the RPN, or risk indicator. Sort these in descending order for a stack-ranked list that helps you work first on the issues that impact the agents the most.
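The RPN math can be sketched directly; the issues and their ratings below are made up for illustration:

```python
# RPN = Occurrence x Severity x Detectability, each rated 1-10
# (10 = very frequent / very severe / NOT detectable).
issues = [
    ("phone system drops calls",   7, 9, 3),
    ("knowledge base out of date", 8, 5, 2),
    ("billing system lag",         4, 6, 8),
]

ranked = sorted(
    ((name, occ * sev * det) for name, occ, sev, det in issues),
    key=lambda pair: pair[1],
    reverse=True,  # descending: highest-risk items first
)

for name, rpn in ranked:
    print(name, rpn)
```

Note how the hard-to-detect billing lag outranks the more frequent, more severe issues: Detectability is what surprises people about the RPN formula.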

All in all, including the agents up front in the planning and strategy of a Six Sigma deployment is far more effective than swooping in with a new quality program. SWAG, however, is still good!

~Laura

"If you can't describe what you are doing as a process, you don't know what you're doing" -- W. Edwards Deming