Verax Consulting

A Visual Guide to ITSM Maturity Assessment

How do I carry out the best possible ITSM Maturity Assessment?

For years, ITSM maturity assessment and IT process assessment have rightfully been a cornerstone of process improvement projects. Given that, it's strange that so little has been written about the strategy, tactics and techniques of assessment, especially from a non-vendor perspective.

This article lifts the lid on how professionals plan and implement ITSM maturity and IT process assessments. In the course of a wider series of posts, we will go into great detail to create a complete guide to the subject. This post is an introduction to what an IT assessment is, the techniques employed, the skills required, and whatever else we meet along the way.

My intention is that this article will become the go-to resource for anyone needing to learn about assessing ITSM maturity, and all that entails. Over the coming months many resources will be added such as sample assessment statements / questions, sample reports, links to more in-depth articles on a specific subject, etc. So please check back and let me know in the comments exactly what information you need!

Related Article: A Guide to IT Assessment Interview Preparation

But first, a quick definition of what is meant by an ITSM Maturity Assessment

Definitions

An ITSM maturity assessment rates the performance of IT elements against an industry-standard 'maturity scale'. It does this by interviewing stakeholders, observing work in progress, and identifying material evidence. The intention is to improve the effectiveness and efficiency of the process, service or other element being assessed.

"IT elements" include IT processes, services, value streams, supporting tools, functions and teams, and specific activities carried out by the IT organisation.

Because ITIL is the dominant IT Service Management framework, ITSM assessments are usually based on ITIL questions and statements and the accompanying ITIL maturity model.

An 'IT process assessment' refers to an assessment of one or more IT processes. A maturity scale is also used, but both the scale and the framework which it relates to may be non-ITIL. For instance, an IT process assessment could be carried out using the COBIT framework. This uses the COBIT Process Assessment Model.

Another common maturity model is CMMI for Services, a variant of CMMI, which was originally designed for software development.
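
To make the idea of a maturity scale concrete, here is a minimal sketch in Python of the five-level structure most of these models share. The level names are illustrative assumptions (ITIL, COBIT and CMMI each label their levels slightly differently), not quotations from any one framework.

    # A minimal sketch of a generic five-level maturity scale. The labels
    # below are illustrative; each framework uses its own wording.
    MATURITY_SCALE = {
        1: "Initial",      # ad hoc, undocumented, reactive
        2: "Repeatable",   # basic discipline, but inconsistent
        3: "Defined",      # documented and standardised
        4: "Managed",      # measured and quantitatively controlled
        5: "Optimizing",   # continual improvement is embedded
    }

    def describe(level: int) -> str:
        """Return a readable label for a maturity level rating."""
        return f"Level {level}: {MATURITY_SCALE[level]}"

    print(describe(3))  # -> Level 3: Defined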

Basic Overview of an ITSM Maturity Assessment

Let's start with a graphical overview of an assessment project:

CAIR Overview - an ITSM Maturity Assessment approach

The CAIR graphic shows a basic map of an assessment divided into four main phases. The mnemonic I use to remember this is 'Create An Improvement Revolution'. Let's take a slightly closer look before we go into detail.

CAIR: "Create An Improvement Revolution"

'C' is for Context

The Context phase is about getting agreement from the relevant stakeholders on how the assessment will be carried out, and then doing some planning. It covers areas such as the following (a sketch of recording these decisions appears after the list):

  • understanding the business objectives
  • deciding which IT elements will be assessed
  • establishing the methods for an ITSM maturity assessment and who will carry them out
  • understanding any organisation-specific customisations needed in the assessment process
  • defining the details of the project
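
As a sketch of how those Context-phase decisions might be recorded, here is a hypothetical Python structure. Every field name and value is an invented example, not part of any formal assessment standard; the point is simply that capturing the scope explicitly makes it checkable later.

    # A hypothetical record of Context-phase decisions. All names and
    # values are invented for illustration.
    assessment_context = {
        "business_objectives": ["reduce incident resolution times"],
        "elements_in_scope": ["Incident Management", "Change Enablement"],
        "methods": ["interviews", "job shadowing", "documentation review"],
        "assessors": ["lead assessor", "tooling specialist"],
        "customisations": ["map standard questions to in-house terminology"],
        "project_details": {"duration_weeks": 6, "sponsor": "CIO office"},
    }

    # An explicit scope makes it easy to verify, at the end of the
    # Assessment phase, that every in-scope element was actually covered.
    for element in assessment_context["elements_in_scope"]:
        print(f"In scope: {element}")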

'A' is for Assessment

The skillful work of finding evidence, interviewing people and observing work in progress happens in the Assessment phase. This is what we call the 'solid foundation' of an assessment project. The skills needed are not just technical - they include interview and listening techniques, empathy, and the ability to formulate and test hypotheses.

'I' is for Improvement

The improvement phase is partly about communication. Reports and recommendations should be accessible, attractive and mindful of the needs of various audiences. Without those qualities it's difficult to gain broad support for improvement activities.

Once communication is clear, more technical aspects come into play, such as designing CSFs and KPIs, managing change and building roadmaps.

'R' is for Re-calibration

Re-calibration is about managing the new environment created by the improvements that have been made. One critical feature is to ensure that no negative unintended consequences are taking place.

The ITSM Maturity Assessment Big Picture

CAIR is the skeleton of an assessment and improvement effort. Let's start to put meat on the bones with this graphic:

The 15 Elements of ITSM Maturity Assessments

Let's dig a little more deeply into each of the elements of a great IT Service Management assessment. Many of the points here will be expanded upon in separate posts.

Context 1. Customized and uniquely valuable

There is no such thing as a 'cookie cutter' assessment. Each organisation will have different requirements at different times. They will be responding to unique internal and external (market) pressures. The job of the assessor is to understand the environment and ensure that the assessment approach is fit for purpose (but see 'Ethical, Professional and Open', below). There are several aspects to this...

- What is the business context for the assessment? What are the objectives? 

Some organisations have a straightforward desire to improve processes, functions or services. Others might have a more sophisticated approach and be looking to improve an IT value stream. 

More broadly, at this point it's essential to understand:

  • problems the organisation faces and the objective of the assessment
  • behaviours which need to change. An ITSM maturity assessment should unearth the reasons for such behaviours and recommend how to change them
  • perceived competency gaps

- What are the business drivers and expected business value?

But there could be other reasons to assess. An organisation may want to measure the capability of its IT organisation to compare it with other organisations or outsourcers. Some will want to ensure that they are obtaining value for money from their current IT provider. Or an IT provider might want to prove that its services are meeting or exceeding industry standards.

Each of these scenarios can introduce constraints into the assessment. Some also involve professional and ethical issues: just who are we assessing for, and should it matter? We'll come back to those questions later.

- Ask open questions

Never underestimate the value of big, open questions. Sometimes senior stakeholders will have a strong opinion about what kind of IT issues exist. Their views will strongly influence the scope of the ITSM maturity assessment. But ask other stakeholders their opinions before the scope is agreed: "What is working well around here?" "What's not working well around here?"

- Be aware of internal politics

You won't be surprised to hear that internal politics sometimes colour the objectives or scope of an IT assessment. If this seems to be the case in the organisation being assessed, try to understand the big 'political' issues that may be in play without getting drawn into them.

- Agree the scope of the assessment

Once these factors are understood, it's time to agree the scope of the assessment. There is more context to consider though, because there are now decisions to make about exactly what will be assessed. Which is coming up next...

Context 2. Context specific

Once the scope has been agreed, it will be clear what kind of assessment strategy is required. For example, the assessment of a specific process will involve different techniques from an assessment of an IT value stream.

Context 3. Perfectly prepared

Timely and effective project management is essential for a successful ITSM maturity assessment. Part of the skill here is to make sure that a good sample of stakeholders is selected for interview. Depending on the objectives, for example, a mix of senior managers, associates and customers might be appropriate. All of these stakeholders will have limited time, and this needs to be managed with some sensitivity.
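
By way of illustration, here is a minimal sketch of stratified sampling: selecting interviewees from each stakeholder group rather than at random from the whole organisation. The group names and sizes are invented.

    import random

    # A minimal sketch of stratified stakeholder sampling. Groups and
    # placeholder names are invented for illustration.
    stakeholders = {
        "senior managers": ["SM1", "SM2", "SM3"],
        "associates": ["A1", "A2", "A3", "A4", "A5", "A6"],
        "customers": ["C1", "C2", "C3", "C4"],
    }

    def sample_interviewees(groups: dict, per_group: int) -> list:
        """Select up to `per_group` people from each stakeholder group."""
        selected = []
        for group, people in groups.items():
            selected.extend(random.sample(people, min(per_group, len(people))))
        return selected

    print(sample_interviewees(stakeholders, per_group=2))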

Assessment 4. Built on a solid foundation

This is where four of the five fundamental skills and activities happen ('evidence based' is covered separately). Assessment interviewing, job shadowing, tools reviews and documentation reviews are usually all essential in understanding how the work gets done.

Assessment 5. Aligned with specific frameworks

Because ITIL has become the dominant framework for ITSM, most assessments will relate to ITIL processes. ITIL has an accompanying maturity model with defined maturity levels, and of course a set of specific questions / statements which are used as the fundamental assessment tool. How to use those questions is a big subject and one which we'll go into in a separate post.

Of course, other service management-related frameworks and approaches are available. It is possible to carry out COBIT, DevOps and other types of assessment. Each will have its own set of questions and statements and will use an appropriate maturity model. Again, this is a big subject and one we'll go into elsewhere.

Assessment 6. Methodologically sound and statistically valid

This element applies to how we conduct interviews, which questions we ask, and how we record and 'score' the answers. It's important that there is consistency in the way we rate responses to statements.

The standard for ITSM maturity assessment is to read a statement and record a 'yes' or 'no' response as to whether or not the statement is true. There are numerous reasons why this is NOT a good idea. We will go into this at great depth in another post.
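	
As a hedge against that binary trap, one alternative (a sketch, not a prescription) is to rate how fully each statement holds on a graded scale and then aggregate the ratings. The statements and ratings below are invented; published ITIL question sets are licensed material and are not reproduced here.

    # A sketch of graded (rather than yes/no) scoring: rate how fully
    # each statement holds on a 0-4 scale, then normalise the average.
    # Statements and ratings are invented for illustration.
    responses = {
        "Incidents are logged in a single system of record": 4,  # fully true
        "Major incidents follow a documented procedure": 2,      # partly true
        "Incident trends are reviewed monthly": 1,               # rarely true
    }

    def statement_score(ratings: dict) -> float:
        """Average the 0-4 ratings and normalise to a 0-1 compliance score."""
        return sum(ratings.values()) / (4 * len(ratings))

    print(f"Compliance: {statement_score(responses):.0%}")  # -> Compliance: 58%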

Assessment 7. Open to non-statistical information

Responses to formal statements and questions are important and are the basis of maturity scores. The customer (i.e. the party that commissioned the project) is often focused on these scores, but it is the job of the assessor to listen carefully to information that comes from other sources. 

By listening to anecdotal stories and comparing them with other, more formal information sources, we start to form a picture of what is really happening in the organisation. In effect, we form explanations for the evidence and its effects, and note whether new evidence supports or contradicts those explanations.

The bottom line is: if the ITSM maturity assessment sticks to formal questions, you may be missing vital facts and perspectives that will help the organisation improve. 

Assessment 8. Psychologically in-tune

Assessments are about people. It's vital that our 'respondents' are treated with respect.

Some people are positively happy about being asked to contribute. Someone remarked to me recently that he had worked in the organisation for fifteen years, and this was the first time anyone had ever asked for his opinion and valued his experience in this way.

Other people will be fearful or negative. Some will have political agendas or try to control the interview. Some will be introverts and others extroverts. The job of the assessor is to make sure that each of these kinds of people is able to make as significant a contribution as possible. 

Assessment 9. Evidence based

Some of the testimony gathered by assessors is opinion. Some of it is fact. The assessor's job is to substantiate the most important findings. This can play out at the level of overall process performance, or in the details of an activity, and everywhere in between. 

Assessment 10. Ethical, professional and open

There are two main aspects to this. The first is to respect the anonymity of respondents. People should feel free to speak their minds without fear of recrimination. The principle of anonymity should be established and agreed in the Context phase.

The second aspect is to ensure that what we find is what we report. An IT service provider might well want the best possible maturity results; but a new CIO has good reason to hope for the worst possible maturity results. We should already have discovered this sort of bias in the Context phase.

Neither of those desires nor expectations may be allowed to affect an ITSM maturity assessment in any way. They are absolutely not part of the scope, and should not change in any way the questions that are asked or the way they are reported. Of course, whoever commissioned the assessment report is in a position to change it or refuse to publish it. That situation is usually beyond the control of the assessor.

In the short term, this stance may seem suicidal for anyone specialising in assessment. In the longer term, the attempt to establish an objective, neutral position is fundamental to success and the trust of the organisations who want to assess and improve themselves.

Improvement 11. Specific and clear recommendations

One of the delights of assessment reporting is that very often the number of actionable recommendations can be huge. I say 'delights' because the recommendations are an indicator that our knowledge of whatever is being assessed has increased, and that there is obvious room for improvement.

It's important that these recommendations don't become overwhelming. They should be categorised and prioritised at a minimum, and arranged into a logical sequence in a roadmap (see below).
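
As a minimal sketch of that categorise-and-prioritise step, the following orders invented recommendations by priority and effort so that quick, high-priority wins surface first in the roadmap sequence. All fields and values are illustrative.

    # A minimal sketch of categorising and prioritising recommendations
    # into a roadmap sequence. Fields and values are invented.
    recommendations = [
        {"text": "Define a major incident procedure", "category": "process",
         "priority": 1, "effort_weeks": 2},
        {"text": "Automate incident categorisation", "category": "tooling",
         "priority": 2, "effort_weeks": 6},
        {"text": "Train the service desk on the new workflow", "category": "people",
         "priority": 1, "effort_weeks": 3},
    ]

    # Sort by priority first, then by effort, so quick high-priority wins
    # come earliest in the roadmap.
    roadmap = sorted(recommendations,
                     key=lambda r: (r["priority"], r["effort_weeks"]))
    for step, rec in enumerate(roadmap, start=1):
        print(f"{step}. [{rec['category']}] {rec['text']}")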

Improvement 12. Visually engaging, built for sharing

There is a strong case for presenting the plain facts in a plain way. But the reality is that the large amounts of information produced by an assessment need to be ordered and presented for different audiences.

Visual engagement is vital for some audiences, though we have to sound a word of warning here: make sure that any visuals, and particularly graphs, don't distort the data. At Zeno, we happen to have a tamed statistician with a brain the size of Pluto, and he keeps us on the straight and narrow. More about this in another article.

Typically, different audiences want to consume information differently. The obvious example is that for some people an executive summary will be appropriate; but for a process owner, an understanding of the detail is required.

Improvement 13. Clear on the roadmap and objectives

Producing a roadmap is not usually something an assessor would do on their own. It's important that when prioritising actions based on the recommendations, we go back to the business drivers that were established in the Context phase. Roadmaps are especially important when different 'streams' of improvement effort will be carried out simultaneously. The roadmap should literally provide the 'vision'; a project plan should provide the detail.

Re-calibration 14. Monitored

Once a process improvement plan is agreed and up and running, it should be monitored. Those familiar with Lean IT and the Theory of Constraints will understand that process improvements (or function, or service improvements, etc.) can have unintended consequences.

Put at its simplest, if activity A gets much more efficient and increases throughput, what's the effect on activities B and C? The overall effect could be negative.
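
A toy calculation makes the point. In a sequential flow, overall throughput is capped by the slowest activity, so doubling a non-bottleneck step changes nothing overall (and can simply pile up work-in-progress in front of the constraint). The capacities below are invented.

    # A toy illustration of the Theory of Constraints point above:
    # a sequential workflow moves only as fast as its slowest activity.
    capacities = {"A": 10, "B": 6, "C": 8}  # items each activity handles per day

    def overall_throughput(caps: dict) -> int:
        """End-to-end throughput is the minimum of the stage capacities."""
        return min(caps.values())

    print(overall_throughput(capacities))  # 6 - limited by B

    capacities["A"] = 20                   # double A's efficiency...
    print(overall_throughput(capacities))  # still 6; B remains the constraint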

This should be considered as early as the Context phase, and again when CSFs and KPIs are designed: the measures should address desired outcomes at the level of business value, not at the level of localised process improvements.

Re-calibration 15. Focused on action and accountability

The best possible reason to assess ITSM maturity is that it provides fantastic insights into how to improve the way that services and IT in general are managed.

But if there is no clear accountability for making sure that appropriate recommendations are carried out, the assessment will have been a waste of time.

Equally, if the organisation's senior management does not strongly and visibly support the improvement efforts, again - the assessment will have been a waste of time.

Finally!

You may want to consider bookmarking this page. I intend to update it with links to other detailed posts and resources.

If there are particular assessment and improvement-related resources you would like access to in the future, please let me know in the comments below or contact me via LinkedIn.

About the Author: Dan McCarthy

Dan helps IT leaders to assess and improve their organisations and processes. He writes about improving working life, processes, and efficiency, with some left-field perspectives from his anthropology background. He has been sighted lurking near pianos and guitars, as if to play them.
