“Confidence… thrives on honesty, on honor, on the sacredness of obligations, on faithful protection and on unselfish performance.” –Franklin D. Roosevelt
When asked, “Do you have confidence in your bank system services providers?” many bankers I work with would launch into the issues, problems, frustrations and tribulations encountered with one or more of their service providers. While bank service and product providers have been confidently selling “partnership” for years, they’ve been considerably less vocal when it comes to defining its meaning or pinpointing when it is actually achieved.
One ingredient of a successful partnership is confidence, as FDR’s quotation makes clear. In particular, I want to focus on the sacredness of obligations and on performance.
Obligation: a legal duty that a company is required to perform or be penalized for neglecting to perform. Note that the definition of the word “obligation” includes the word “performance.” So, it stands to reason that the service provider that wants to instill confidence begins by upholding its obligation to perform with some penalty if it does not.
Aha, one of the problems immediately becomes obvious. Most of the contracts for services or products very clearly indicate what is being provided (or performed), but rarely do they define what constitutes “acceptable” or indicate the penalty for neglecting to perform. When a service provider is asked what can be done if the contracted service is not performed, the typical answer is, “Sue me.”
I don’t know what it’s like in your hometown, but pursuing a service provider for “failure to perform” in Arizona can take years and empty a wallet even faster.
Bankers, here’s the fix: Ensure service level agreements are made a part of every executed service or product contract. A bank is not required by the Federal Financial Institutions Examination Council to include SLAs in its contracts, but the council’s recommendation is quite clear and pointed. FFIEC’s “Outsourcing Technology Services Booklet” recommends the following:
“Financial institutions should link SLAs to provisions in the contract regarding incentives, penalties, and contract cancellation in order to protect themselves against service provider performance failures.”
Note that the FFIEC’s recommendation doesn’t require that SLAs be in a contract; it only states that they should be. Over the years, I have examined many initial contracts provided by banking vendors, and in most cases, they did not contain any mention of SLAs. In every case, when asked, the vendor produced its “standard” wording for a few SLAs. If you accept my assertion that confidence is a necessary ingredient of a partnership, then I’m sure you will agree it makes no sense to leave the SLA section out of the initial contract. It is simply good business to communicate clearly what constitutes acceptable performance.
First, let’s define an SLA and determine how to write one.
SLA: A documented agreement (read: the contract) between two parties (e.g., the bank and the service provider) that specifies 1) a metric (a measure of performance), 2) a threshold value for the metric, such that meeting or exceeding it constitutes “acceptable” performance, and 3) a consequence for failing to meet the threshold.
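The three-part definition above can be sketched as a tiny data structure. This is purely illustrative; the article defines SLAs in contract language, not code, and every name here is my own invention:

```python
from dataclasses import dataclass

@dataclass
class SLA:
    """One SLA clause per the definition above: metric, threshold, consequence."""
    metric_name: str              # 1) what is measured
    threshold: float              # 2) the value that defines "acceptable"
    consequence: str              # 3) what happens when the threshold is missed
    lower_is_better: bool = True  # response time: lower is better; uptime: higher

    def is_met(self, measured_value: float) -> bool:
        """Return True when the measured value is acceptable performance."""
        if self.lower_is_better:
            return measured_value <= self.threshold
        return measured_value >= self.threshold

# Hypothetical clause: teller response time with a 5-second threshold
response_sla = SLA("teller response time (seconds)", 5.0,
                   "credit against next month's service fee")
print(response_sla.is_met(4.2))  # True: a 4.2-second average is acceptable
```

Notice that the consequence is part of the clause itself; an SLA with no defined consequence is just a measurement, which is exactly the “Sue me” problem described above.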
Determining the metrics to be used can be tricky. Some really lousy metrics would include “enhance employee productivity” or “reduce customer turnover.” In either case, it would be difficult, if not impossible, to effectively measure the impact of a service or product.
Good metrics should contain the following:
• What is being measured, clearly defined
• How it will be measured, including the point of measurement
• When, and how often, it will be measured
• Who will perform the measurement
The most common SLA seen in a servicing contract is response time. Response time seems very clear: it is simply the elapsed time between initiating a request for service and the computer first responding with a result. Since that clearly defines what is to be measured, most contracts will pass the first test. The problem is that the point of measurement is rarely defined. Most vendors intend the measurement to be taken at their computer site; that is, the elapsed time between when a transaction first enters their computer and when the response first leaves it. This is nice to know, but it doesn’t account for wide area network transit time, local area network transit time and the time it takes to actually display the result on the user’s desktop – all factors that determine how much of a lapse the front-line teller experiences between hitting “enter” and seeing a response on the screen.
Still feeling good about having a response time SLA in your contract?
Here is an example of a well-defined response time metric:
Response Time is defined as the average amount of time to process a teller financial transaction. It is measured at a teller terminal device and is the elapsed time between striking the enter key and when the response is displayed on the teller’s computer screen. Response time will be measured by the QA department once weekly in five randomly selected branches. Average time will be computed as the arithmetic average of no less than 10 financial transaction response times.
This SLA clearly defines what, how, when and who as noted earlier. To complete the service level agreement, there must be a threshold that defines acceptable performance. For the measure of response time, an acceptable threshold may be five seconds. Therefore, any time at or below five seconds is considered acceptable.
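The weekly check the sample metric describes is simple enough to express directly: average no fewer than 10 measured transaction times taken at the teller terminal, then compare the arithmetic average against the 5-second threshold. A minimal sketch, with all names and sample values invented for illustration:

```python
def weekly_response_check(samples_seconds, threshold=5.0, min_samples=10):
    """Average measured teller transaction times and test them against the SLA.

    samples_seconds: elapsed times measured at the teller terminal, in seconds.
    Returns (average, acceptable) where acceptable means at or below threshold.
    """
    if len(samples_seconds) < min_samples:
        raise ValueError("SLA requires no fewer than 10 measured transactions")
    average = sum(samples_seconds) / len(samples_seconds)
    return average, average <= threshold

# Ten hypothetical measurements from one branch visit
samples = [3.8, 4.1, 5.2, 4.6, 3.9, 4.4, 5.0, 4.2, 4.7, 4.1]
avg, acceptable = weekly_response_check(samples)
print(f"average {avg:.2f}s, acceptable: {acceptable}")  # average 4.40s, acceptable: True
```

Note that individual samples above 5 seconds (the 5.2 and 5.0 readings) do not by themselves breach this SLA, because the clause as written measures the arithmetic average, not the worst case. That is itself a drafting decision worth making consciously.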
Ten years ago, computer-related service level measures were modest, such as, “Computer up time will meet or exceed 97% of available time.” Measured on a 24-hour-by-7-day basis, that threshold would still be met even if the system were unavailable for as much as 21.6 hours in a single month (3% of roughly 720 hours). Many vendor “standard” contracts still contain this level of “performance” even though today’s hardware and software make much higher thresholds easy to attain.
Metavante has been a pioneer in this arena and now provides comprehensive measures for each product type. Its threshold for “up time” noted above is 99.9%. Metavante has recognized the importance of both including the measures and providing thresholds that reflect the capabilities of today’s technology.
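The gap between those two thresholds is worth computing. Assuming a 30-day month measured 24x7 (about 720 hours), the arithmetic looks like this:

```python
HOURS_PER_MONTH = 30 * 24  # 720 hours; a 30-day month assumed for illustration

def allowed_downtime_hours(uptime_pct):
    """Monthly downtime permitted by an uptime threshold, measured 24x7."""
    return HOURS_PER_MONTH * (1 - uptime_pct / 100)

print(f"97.0% uptime allows {allowed_downtime_hours(97.0):.1f} hours down per month")
print(f"99.9% uptime allows {allowed_downtime_hours(99.9) * 60:.0f} minutes down per month")
```

A 97% threshold tolerates 21.6 hours of outage per month; 99.9% tolerates roughly 43 minutes. Same word, “up time,” wildly different obligations.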
FFIEC has suggested SLAs should be used to monitor the following:
If this is what the FFIEC wants to see, it will probably be a long time before it is very happy with the results. As noted earlier, it is likely that a contract will contain no mention of an SLA. If one is present, it will be one to three measures of “Availability and timeliness of services.” Presence of an SLA for any of the other items is extremely rare.
Service and product providers need to get serious about this “partnership” thing and create confidence in their client base. Here is an action plan for vendors:
Here is an action plan for bankers:
• Review current contracts to determine if they contain SLAs. If so, do they meet the what, how, who and when test?
• Work with current vendors to explore reasonable SLAs to include in renewal contracts.
• Have a set of prepared SLAs for inclusion in any new contract.
Consequences for not meeting thresholds will be my next topic, Gonzo fans. If you think stiff financial penalties are the answer, stay tuned. They may not be the best way to build this mythical partnership.