4.2 Measure Performance of the IT Department

Basics

Customer satisfaction is one way an IT Department measures its service performance. IT Departments can also create other measures, called metrics, that help them determine whether the services and activities they offer, and the processes and strategies they use, are helping them operate at the highest possible level.

You should know the following terms:

  • IT Department metrics
    • Technology metrics
    • Process metrics
    • Service metrics
  • Objectives
  • Critical success factor
  • Key Performance Indicator (KPI)
  • Metrics or measures
  • Correlation versus causation

Determining What to Measure

In addition to customer satisfaction, IT Departments may want to validate whether initiatives or strategies are working as expected, whether Service Level Targets (SLT) are being met or exceeded, whether those SLT and Service Level Agreements (SLA) are appropriate, and whether any area or process might be improved.

Three common types of metrics your IT Department may use to measure performance include:

  • Technology metrics. How well are the applications we use and our infrastructure, such as the network, operating? How reliably are they available?
  • Process metrics. How well are our standard operating procedures (SOP) supporting the organization?
  • Service metrics. What is the quality of the end-to-end experience with every service the department provides?

Each Department should set objectives for technology resources, processes, and services. For example, one objective most IT Departments have is providing 100% Internet availability for all school system staff and students. A service metric might be that 90% or more of the customers who interact with the Department rate their experience as Excellent.
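As a minimal sketch of how a service metric like the one above might be checked, the snippet below tallies survey responses against a 90% "Excellent" target. The ratings are hypothetical examples, not real survey data.

```python
# Sketch: checking a hypothetical service metric target -- at least 90% of
# survey respondents rated their experience "Excellent".
# The ratings list below is made up for illustration.
ratings = ["Excellent", "Excellent", "Good", "Excellent", "Excellent",
           "Excellent", "Excellent", "Excellent", "Excellent", "Excellent"]

excellent_pct = 100.0 * ratings.count("Excellent") / len(ratings)
meets_target = excellent_pct >= 90.0

print(f"{excellent_pct:.0f}% Excellent; target met: {meets_target}")
# 9 of 10 responses are Excellent, so the 90% target is met.
```

In practice these ratings would come from your ITSM system's satisfaction surveys rather than a hard-coded list.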

To meet this objective, a critical success factor must be determined and described. In the case of the 100% Internet availability objective, the critical success factor may be that the district’s network connection to the Internet, and all devices authorized to access it, are available 24 hours a day, seven days a week.

You can measure levels of performance across your department to determine whether you are meeting your objectives based on the critical success factors. These measures are referred to as key performance indicators (KPI). KPIs should be aspirational: something the department has to work to achieve. If they are too easy to achieve, performance can dip as staff become complacent. The KPIs for Internet access can include Internet uptime, or the percentage of each day that Internet access is available; network infrastructure uptime, because Internet access requires a functioning network to deliver it; and application uptime, the percentage of each day that Internet-reliant applications operate successfully.
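Uptime KPIs of this kind are usually computed as the share of a period during which the service was available. The sketch below shows the basic arithmetic; the outage figure is a hypothetical example.

```python
# Sketch: computing an uptime KPI from logged outage minutes.
# The 43-minute outage figure is hypothetical, for illustration only.

MINUTES_PER_DAY = 24 * 60

def uptime_percent(outage_minutes: float, period_minutes: float) -> float:
    """Percentage of the period during which the service was available."""
    return 100.0 * (period_minutes - outage_minutes) / period_minutes

# A 30-day month with 43 total minutes of Internet outage:
month_minutes = 30 * MINUTES_PER_DAY
print(round(uptime_percent(43, month_minutes), 2))  # prints 99.9
```

Note that even a seemingly high figure like 99.9% represents real downtime; over a month it is roughly three-quarters of an hour during which staff and students could not get online.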

To measure these key performance indicators, the IT Department uses different metrics or measures. Technology metrics associated with the Internet objective may include Internet logs, reports on network bandwidth utilization, network infrastructure logs, and application availability alerts. There can be additional metrics and key performance indicators depending on the objectives of the department.

Metrics table
Objective: Provide 100% Internet availability for all school system staff and students.
Critical success factor: The Internet and all associated network electronics and applications are available 24 hours a day, seven days a week.
Key Performance Indicators:
  • Internet uptime
  • Network infrastructure uptime
  • Application uptime
Metrics:
  • Internet logs
  • Network bandwidth utilization
  • Network infrastructure logs
  • Application availability alerts

Objectives, critical success factors, and KPIs can be applied to all three types of IT Department metrics, including process and service metrics. Every staff member should understand their role in providing information to help the IT Department meet its performance objectives. This may include following SOP for common actions or clearly and accurately documenting work in the ITSM system and Knowledge Base.

Metric Examples

Common request fulfillment metrics that your department may review can include:

  • Service Level Agreement (SLA) performance
    • Requests resolved by the service and support center within service level targets (#, %)
    • Operational Level Agreement (OLA) performance regarding service and support center requests (#, %)
  • Individual performance measurements regarding requests (#, %)
    • First Level Resolution (FLR) (#, %)
    • First Contact Resolution (FCR) (#, %)
  • Time to respond to contacts (e.g., Average Speed to Answer (ASA) for phone, chat, e-mails in seconds/minutes)
  • Request backlog
    • Open (backlog) requests (#, %)
    • Age of open (backlog) requests (e.g., hrs/days, % within goals)
  • Customer Satisfaction
    • Customer satisfaction with each technician or representative
    • Customer satisfaction with the support center
    • Customer satisfaction with the request fulfillment process
  • Quality Monitoring
    • Ticket quality (completeness and accuracy of each technician’s tickets)
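Several of the request fulfillment metrics above reduce to simple percentages over ticket records. The sketch below computes a First Contact Resolution (FCR) rate, an SLT compliance rate, and a backlog count; the ticket data and field names are hypothetical, and a real ITSM system would supply these records.

```python
# Sketch: computing a few request fulfillment metrics from ticket records.
# The tickets list and its field names are hypothetical examples.

tickets = [
    {"resolved_on_first_contact": True,  "within_slt": True,  "open": False},
    {"resolved_on_first_contact": False, "within_slt": True,  "open": False},
    {"resolved_on_first_contact": True,  "within_slt": False, "open": True},
    {"resolved_on_first_contact": False, "within_slt": True,  "open": True},
]

def percent(part: int, whole: int) -> float:
    """Guarded percentage to avoid dividing by zero on an empty set."""
    return 100.0 * part / whole if whole else 0.0

# FCR is measured against closed tickets only.
closed = [t for t in tickets if not t["open"]]
fcr_rate = percent(sum(t["resolved_on_first_contact"] for t in closed), len(closed))

# SLT compliance and backlog are measured across all tickets.
slt_rate = percent(sum(t["within_slt"] for t in tickets), len(tickets))
backlog = sum(t["open"] for t in tickets)

print(f"FCR: {fcr_rate:.0f}%  SLT compliance: {slt_rate:.0f}%  Backlog: {backlog}")
# prints: FCR: 50%  SLT compliance: 75%  Backlog: 2
```

Whether FCR should count only closed tickets, and which timestamps feed an ASA calculation, are definitions your department should fix once and then apply consistently.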

Collecting and Analyzing Performance Data

The first rule of data collection is to collect only what you need and use the data you collect. You must measure what is important to your school system, but only measure what matters. If you are collecting data you are not using, stop. Data collection, analysis, and reporting should be easy and support your work rather than being a burden. Regardless of which measures you collect, make sure they are strategically aligned to your department’s objectives.

Measures shouldn’t be overly complex. If you can’t explain a measure in a way that anyone in the department, or anyone in the district office without an IT background, can understand, it’s probably too complex. If measures are too hard to explain, people won’t be able to act on the data you collect or help you reach your department’s objectives.

Be consistent and don’t change metrics or your methodology for analyzing them midstream. Complete a normal data collection-analysis-reporting cycle, and if new or better methodologies are necessary, then change after the cycle is completed. Consistency allows you to track and compare performance over time, like from quarter to quarter or year to year. When you change your methods, you lose the ability to compare with past performance.

Data is objective; be careful about being subjective when you analyze it. If Internet uptime drops from 99% to 95% over a given period, that’s an objective measure. There may be objective reasons for the drop, such as your ISP losing connectivity for a day or a major router malfunctioning and needing replacement. Avoid letting frustration or disappointment color your reading of data that isn’t what you hoped for. Avoid making assumptions, and never skew your analysis of the data, either positively or negatively. Many people can have theories about why the data looks the way it does, but unless you can show a direct cause, it’s just a correlation. Correlation doesn’t imply causation and isn’t sufficient on its own for making sound decisions.
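The correlation-versus-causation point can be made concrete: two metrics can move together very strongly while neither causes the other. The sketch below computes a Pearson correlation for two hypothetical monthly series that both rise through the school year, driven by a third factor (the calendar) rather than by each other.

```python
# Sketch: two metrics can correlate strongly without one causing the other.
# Both series below are hypothetical monthly figures.

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Ticket volume and enrollment both climb through the fall semester, so they
# correlate strongly -- but neither need cause the other; the school calendar
# plausibly drives both.
tickets_per_month = [120, 135, 150, 170, 190, 210]
students_enrolled = [5000, 5100, 5250, 5400, 5600, 5800]

print(round(pearson(tickets_per_month, students_enrolled), 2))
```

A correlation near 1.0 here would tell you nothing about whether enrollment causes tickets; establishing cause requires digging into the actual ticket records and events behind the numbers.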

Sharing Performance Data

If your department is in the beginning stages of collecting and analyzing measurements, you may only have part of the data you need. It’s a start! You can still use that data to inform decisions and improve your department’s bottom line, whether that’s Internet connectivity, customer service, or return on investment.

When a staff member (even you!) receives data that is not as positive as they’d like, the data can serve as a point of reference for setting personal improvement goals. These goals are typically set during a private conversation with a supervisor, often during a performance review conference that should be held at least once a year.

Don’t let one bad review skew the results for a staff member or the department. Remember, most people submit a service request after an experience that has likely left them upset or frustrated. Look holistically across the interactions and ratings received from multiple customers to get a better understanding of what is going well and what might belong in an improvement plan.

Complete the following task or self-assessment:

  • What types of metrics does your department have in place?
    • Are they thorough enough to provide you with an indication of your department’s strengths and areas for improvement?
    • Are they fully formed with objectives, critical success factors, KPI, and associated measures?
  • What additional metrics might you be interested in to help improve the effectiveness and efficiency of your department?