A fresh approach to service assessments

Categories: service assessments, service design, service ownership, software and services

[Image: diagram showing the benefits of mentoring, with an icon for each benefit: goal, coaching, guidance, training, motivation, knowledge, support and success]

We’ve been designing and building a new service to make it easier for schools and education providers to get funding so that teachers can mentor trainees.

We knew we were coming to the end of our alpha phase of software delivery when we heard our delivery manager say, “We need to book an alpha assessment.”

It’s fair to say people are not that keen on service assessments because they take up so much time and need a lot of preparation. But I didn’t need to panic. As a product manager, I know our team has been doing all the right things – investigating assumptions, testing, building prototypes and having conversations with policy and finance colleagues.

Our assessment was not going to be a scary, box-ticking exercise lasting 4 hours. Instead, we changed the format and structure to make it more agile and adaptable to both our needs and our assessors’.

Here’s how we did it.

Preparing for our assessment

We had an initial conversation with a lead service assessor, who told us, “Please don’t make it death by slide deck.”

Four hours of presenting and questioning is daunting for the service team and it’s a long time for assessors to listen and absorb complex information.

On the advice of one of the department’s lead service assessors, we agreed to have 4 separate 1-hour sessions instead of a half day.

We assumed booking multiple sessions in assessors’ diaries would be a challenge, but it was much easier to find 1-hour slots than half days.

Session 1 of the service assessment

This was our opportunity to share the narrative about our service clearly and succinctly. We explained the policy aim and the rationale for designing a digital service.

Fortunately, in our service team, we had policy colleagues working beside us as fully-fledged members of our multidisciplinary team. They acted as an integral link between our service team and policy teams elsewhere in the Department for Education. They shared our progress, findings, and insight with those policy teams who were then able to iterate in step with our user research.

This meant we had a solid narrative about the service’s direction and intent. The panel’s feedback on this first assessment session was really positive.

"The panel were really impressed with the team's thorough understanding of the problem space. Your timekeeping was also exceptional and the slides look great." James Cheetham, Lead Assessor.

Session 2

We knew that the assessors would want to delve into the details, and rightly so. But instead of using this session to focus on the work each person in our team had done, we decided to show how we collectively solved the challenges around:

  • user research
  • service design
  • technology

We focused on how much schools needed funding to cover the cost of training a teacher to become a mentor. Using our research findings, we explained how this need informed the service design, and how we'd use technology to meet that need.

This storytelling approach saved time. And it went down well with our lead assessor who said it was clear our team understood the problem space, and we'd been great at influencing policy.

However, the assessors felt they had not had enough time with the product to fully understand it, so the session felt a bit rushed.

Refining our approach before sessions 3 and 4

We deliberately did not finalise the content for the last 2 sessions – we wanted to shape them based on how the first 2 went.

We had a list of questions the assessors wanted us to answer after sessions 1 and 2.

Session 3

The team focused on the riskiest assumptions and technology choices, and demonstrated the prototype, explaining how user research had influenced decisions.

We also set aside time to revisit the questions the assessors had sent through after session 2.

Session 4

There will be a 2-year gap between alpha and beta phases because:

  • schools and training providers don't need to have mentors trained until September 2024
  • funding is not available until 2024

We used this final session to demonstrate how the 2-year gap would be managed and what the policy team would need to do in that time.

What we learned

[Image: laptop with a screenshot of a GOV.UK page titled 'What is a service assessment']

Four 1-hour sessions are better than half a day

Both assessors and the service team have time between sessions to inspect and adapt.

Space the 4 assessment sessions out over 2 weeks

This was the right amount of time for us to look at what went well and what we needed to prepare for the next session.

Use minimalist slides for greater impact

This was tough. We had so much to share, but we kept slides to a minimum to make them more readable, and therefore easier to digest.

Preparation: not too little, not too much

We focused on preparing the first 2 slide decks in detail and kept the last 2 light touch until we knew which areas the assessors wanted more information on.

Involve the whole team

The assessors' feedback was that it was good to see all team members taking part in the narrative, including our policy colleagues.

"It's clear the team really understands the problem space and have been great at influencing policy as a result."

Next steps

We’re proud to say we passed our alpha assessment and we will move into beta.

Our work on the alpha shows why digital, data and technology multidisciplinary teams are so effective in user-centred service design. And we hope our approach to assessments is a useful example of how to make assessments more accessible.
