Tips for being a lead assessor for a service standard assessment

Of the assessment panel roles, the role of lead assessor for a government service standard assessment is both the most daunting (any difficult questions, everyone looks straight at you) and the least obvious (to be the design assessor, it helps to be a designer; to be a lead assessor, it helps to, um, know a lot about government services). 

Somewhere back in the mists of the 26-point service standard, I started acting as lead assessor on cross-government assessments. In the spirit of working in the open, here’s what I’ve learned. I don’t think there’s anything in here that’s controversial, but to be clear: this is nothing more than personal experience, your mileage may vary, and the CDDO/your department’s assessment team are the only official arbiters of what should or shouldn’t occur in an assessment.

1. Do your homework

In the early days of assessments, I was once part of a service team at an assessment where the lead assessor was happy to point out that they had no idea what was being presented to them because they had had no time to read any background material. That assessment was not a good experience for anyone involved.

The only equivalent I’ve encountered to a service standard assessment is a Gateway Project Assessment Review (PAR). This is a different exercise, conducted with a different outcome in mind [1], but it does aim to review the work of a defined project – including digital projects – at a point in time. It takes a minimum of three days. Three days is also roughly how long a typical service standard assessment might take if it weren’t managed well, or if it didn’t have to end because people need to go home at some point.

If you can’t cover everything in 3.5 hours, then you need to prioritise. A key part of the preparation a lead assessor needs to undertake is identifying the priority areas to cover in the assessment. This is no more complex than setting up a 30-minute pre-meeting with the rest of the panel where you ask where the panel’s concerns lie, based on the background material you’ve got (hopefully including any previous assessment reports from that service, or even from the department responsible for it).

From this, the lead should have two lists: one of areas where the panel are content there are no concerns and which don’t need to be covered in depth (e.g. “this is a standard tech stack already used for other services at scale, we don’t need to concentrate on that”), and one of areas where you collectively have concerns or gaps (e.g. “I can’t see what user research has taken place”) where you’ll need more information from the service team in order to have enough evidence to issue a pass. Those two lists will give you a fighting chance of finishing in 3.5 hours.

2. Set the tone at the beginning

Assessment outcomes can have big impacts: there are many people out there – me included – who have felt them first-hand. As a result, service teams, and especially those undergoing a service standard assessment for the first time, can be nervous. Not unlike the chair of the panel for a job interview, it’s the job of the lead assessor to put the service team at ease from the start.

The first time you speak is also the only chance you’ll have to set some boundaries about what happens next. The intro is the only thing I’ll rehearse as an assessment lead beforehand, because I know it’s vital I get it right. I usually take a post-it with a list of bullets on it to remind me what to say, and it looks something like this:

  • Check whether the panel have all been involved in an assessment before. (If they haven’t, check what their understanding of the process is.) 
  • Make it clear that there are no trick questions: as a panel, we’re here to record enough evidence to show why we can give you a pass. 
  • Underline that I’ve been involved in an assessment on both sides of the table, and that I’m here to make this as easy as possible. 
  • We will have a break, but if you feel you need to stop for five minutes at some point, please say so and we’ll stop. 
  • As lead, I will interrupt you on a number of occasions. I don’t like doing this, and it can feel rude, but don’t take it personally: there is no way we will finish on time unless I do. 
  • Timings and order on the agenda will typically move about a bit. The first few sections usually take longer than you think but cover material from the later sections, so don’t worry: we will finish on time.
  • If you want to use a slide from a PowerPoint presentation to answer a question, that’s fine, but once you’ve done the initial presentation, the sections will be in a Q&A format – the assessment isn’t a continual presentation from you, and we won’t be treating each section as a mini presentation followed by questions. 
  • No matter what happens, we will make time at the end so that if you’ve remembered anything you want to add – something where you’ve suddenly thought “why didn’t I tell them about that?” in answer to an earlier question – you can add it. This doesn’t often happen, but it does help a service team relax because it takes the stress out of feeling that they’ve only got one shot at answering a question.
  • First up is your presentation. So that you can keep to time, I will ask the panel to keep their questions until the end of your presentation, but I will be stopping you at (check watch, give time): now, over to you. 

3. Ask stupid questions

One of the most important ways you can help the rest of the panel is to ensure that what they’re being asked to assess is as clear as it can be. 

Service teams will probably have lived, breathed and eaten nothing but the service that they’re presenting to the panel for a reasonable period of time. You and your panel are probably the first people that they’ve spoken to who have no idea of the background to the service: why it needs to exist, what’s existed previously, and why users hate/love/tolerate any previous services that exist. 

This all means that, particularly early on, you may need to stop and unpack a few areas – whether that’s legislation, acronyms, existing services or other factors. This usually means asking questions that feel fairly stupid, and playing these back – “so you’re saying that because of the legislation passed in 1875, your department has no choice but to provide five services…”. These are the most important questions that you’ll ask. 

4. Eliminate scope creep

The subject of an assessment is a single service, and how it delivers an outcome for users. Partly because government doesn’t typically structure teams and organisations around services, partly because government services often link up with other services, and partly because relevant user research may have occurred on other services, it’s astonishingly easy to end up in assessments talking about areas that have nothing to do with the service you’re assessing. 

The new departmental approach to building infrastructure platforms, the other services that users of this service will also need, the PMO approach to risk, the board’s appetite for digital transformation: all interesting, but all only useful insofar as they have a measurable impact on the service that a user will experience. When anything else comes up, you need to step in and move the discussion along. 

(This also means that you sometimes have to query the source of the examples presented, as some service teams will re-use examples that have proved successful in previous assessments, rather than examples from their own team’s work. I once encountered what looked like a great example of user research that, on questioning, turned out to be over two years old.) 

5. Use the time wisely

I’ve already talked about prioritising areas for discussion, and interrupting where necessary to ensure that you finish the assessment on time. That’s not what this point is about: you also need to ensure that you use the time within the assessment as well as you can. You, preferably in collaboration with the rest of the panel, have the ability to control this. Here are three examples of approaches I’ve taken to structuring the time in an assessment. 

  • The order. The order in which the agenda lays out the sections to be discussed usually follows the order of the standard itself. This may not map to the areas where you have the biggest concerns, so it may be worth re-jigging it. Starting with user needs is usually a good approach, but performance, for example, can often get jammed into five minutes at the end if you’re not careful and – particularly at live, where you should have a performance analyst on the panel – that’s worth moving earlier in the order. One challenge I set myself as a lead is to put the sections that I’m responsible for (like the Team) at the end of the order, so that if I’m bad at keeping things running to time it’s me that suffers, rather than the rest of the panel. 
  • The whole assessment (!). Twice in my experience, I’ve encountered service teams that were completely unprepared for an assessment. In one case, the team didn’t even know of the existence of the standard or why they were undergoing an assessment; in another, the exercise was more about telling senior staff that the service wasn’t ready to go forwards than it was about actually having anything to assess. In both cases, the panel worked out fairly quickly that continuing to ask standard assessment prompts was not going to achieve anything apart from antagonising the service team with questions they couldn’t answer. In both cases, we pivoted to running a workshop focused on coming up with the best possible recommendations to help the team. We still called it an assessment and there was still a report, but we focused on identifying the best starting points rather than how far short of the standard they fell. 
  • The first presentation. Sometimes, service teams will try to fit too much into the first presentation. Particularly if timings are looking tight, it’s worth intervening to say “this is great information, but it probably fits better into the xx section, so shall we keep it for that point?”. For example, lengthy details about tech choices are probably best kept for the tech section, and videos of user research are often best postponed until later in the session unless they are less than a minute long. 

6. Keep the macro view in mind

Assessments can be intense and there is a lot of detail. Even with the best intentions, it’s easy to fall down a rabbit hole. Sometimes – usually because there is no data to allow an objective discussion – no resolution is possible within the constraints of the assessment. 

As lead assessor, sometimes you have to step back and, in order to move on, either summarise a discussion or simply point out that you’ve taken it as far as is practically possible. The easiest way of working out whether you should move on is to ask yourself whether the topic under discussion relates to the core purpose of the assessment. This changes with the assessment phase, but my rule of thumb is:

  • Alpha is about whether the team have tested the riskiest assumptions in their proposed service, and have a credible plan for a limited private beta; 
  • Beta is about whether the team have done enough work that a service anyone can use will withstand the stress, won’t embarrass government in any way, and is sustainably resourced;
  • Live is about whether the service is good enough and supported well enough to stand firm no matter what happens, and for any old services to be switched off. 

If you’re having a lengthy chat about something that doesn’t relate to this, then move on. A good example of this is the design review of the prototype. This usually picks up a long list of issues. Experienced design assessors will zero in on the issues that might hint at deeper issues, and supply a snag list of minor issues to accompany the assessment report; less experienced design assessors sometimes need to be reminded not to run through every issue in an assessment. 

7. Have a back channel 

We’ve talked about the need for an assessment lead to be able to react and intervene – to move on, pivot, or summarise the discussion at any point in an assessment. But how do you make sure that the rest of the panel understand what you’re doing and align with it, rather than reopening the can of worms you’ve just managed to cram a lid onto? The answer is a back channel. 

You can run a back channel on any number of platforms; you just need to find one that everyone on the panel has access to, or can be invited into. You can do it on Teams, as a separate channel in cross-government Slack, or elsewhere. (And before anyone gets the wrong idea, this isn’t for saying anything about the service team that wouldn’t be shareable – it’s mainly about being able to quickly canvass opinions while conversations are going on, to speed things up.)

I can’t overstate how useful it is to be able to say “I think we can stop here, as this is a recommendation that we all agree on but we’re not going to get more detail” or even “I’d like to stop for a break in ten minutes after the end of this section – is that OK?”. As a bonus, you can also allow the observers access to the back channel so that – if they do spot anything – they can raise it unobtrusively. 

Early on in my lead assessor career, I had to tell someone whose work I’d been a fan of for years that they needed to stop and move on. I could never have done it openly, but I could do it quickly and straightforwardly on the group chat. I apologised to them afterwards and they – typically – told me that I should have told them to shut up much sooner. 

8. Make sure the report is not an afterthought

The assessment itself isn’t what most of those involved in the process will remember. The report that follows it is. It’s tempting to treat the report as an afterthought, particularly as it’s the most time-consuming aspect of the whole exercise. It’s also difficult to retain the details of what happens in the assessment when you’re trying to prioritise eye contact and manage the flow of discussion.

(An assessment is the only occasion in my career when I’d make a case for an official note taker. As it is, I take rough typed notes throughout, with frequent capitalised RECOMMENDATIONs next to paragraphs, and I know that the only way for me to manage those is to work through them within 24 hours of the assessment.) 

The report is what will allow the service team to respond positively and to make changes: for the assessors, the recommendations are always more important than the “pass” or “not pass”. That makes the recommendations the most important part of the whole end-to-end process. They’ve got to be like all good metrics: understandable and actionable. Anything that’s not clear, or not evidenced, can end up blocking the team from progressing. 

I was once lead assessor for an assessment on a major government service. The result wasn’t what the department hoped for. I checked back with the assessment team after a few days to find out whether there was anything that the panel needed to provide by way of follow-up. I was told “no, the minister has read the report and there’s nothing in there that they can find that they could disagree with”. Gulp. I wasn’t expecting my report to be stress tested to that extent, but I have written every report after that with the image of an impatient government minister at the forefront of my mind. 

9. Play it forward

Being an assessment lead is a lot of work. At best, no one thanks you; at worst, people send you questioning emails. It’s also a privilege: it’s impossible to spend the best part of a day enmeshed in the finer details of how another team are overcoming problems to deliver a new service without picking up some insight that will help you with a problem of your own, whether now or in the future. 

If you have a privileged position, then it’s worth thinking about how you can play that forward to others, and particularly to others who might not see themselves as future assessment panel members. 

One of the things I did as a lead was to always try to take someone along to assessments as an observer. Sometimes that was someone who’d expressed an interest in assessments, sometimes it was someone who had expressed scepticism in the process, sometimes it was someone that I’d identified as a potentially excellent panel member. I even took along a representative of a large third sector organisation to one. 

10. TL;DR: make sure everyone goes to the toilet

There’s a lot of detail here. Some of it may be useful, but remember: the most important thing a lead assessor can do is make sure the assessment finishes on time and that everyone gets a chance to go to the toilet at some point. Anything else is a bonus. 


[1] Gateway reviews are undertaken on behalf of a Senior Responsible Owner (SRO), who partly sets the agenda for the review, which concentrates on SRO responsibilities (risk, governance, finance, delivery). Service standard assessments are undertaken on behalf of the user, which means that, done properly, they are a voice for the frequently unrepresented in government. 
