2. Iterate and improve frequently

[TODO: Add/revise introductory text]

Guidelines:

2.1 Develop services using agile, iterative and user-centred methods

[TODO: Add/revise introductory text]

Digital Service Standard (Ontario): Design and build the service using an agile and user-centred approach. Agile is an approach to building services that breaks the work into smaller chunks known as iterations. Build one feature of the service at a time until the entire service is complete.

It is a much lower-risk approach than the traditional build-it-all-at-once approach known as waterfall, because frequent iterations expose flaws in the original plan much faster (e.g. missing approvals, insufficient resources, not having the right people on the team).

User-centred methods such as user research and usability testing put the focus on making services that are easy to use. Traditional government services focus on meeting business needs and aligning with policy goals; a user-centred approach ensures those business needs are balanced against user needs. This helps to increase digital service uptake.

Checklist

[TODO: Add/revise checklist items]

Alpha, beta and live stages:

  • work in an agile way, using agile tools and techniques, and continue to do so when the service is live (Digital Service Standard (Ontario / UK / AU))
  • ensure the team reviews and iterates the ways problems are fixed (Digital Service Standard (Ontario / UK / AU))
  • show that your service governance is agile, based on clear and measurable goals (Digital Service Standard (Ontario / UK / AU))
  • explore design options for your prototype and explain why some are discarded (Digital Service Standard (Ontario / UK))
  • When iterating, focus on workable solutions over comprehensive documentation. (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
    • Having a workable solution that can be tested and validated will give you useful information for improving your service. Whenever possible, focus on results rather than unnecessary documentation and reporting (while staying within policy and regulatory limits).
  • When you can, use agile tools and techniques. (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
    • Techniques can include: daily stand-ups, issue trackers, code reviews, rapid prototyping, design sprints, usability testing, user stories and retrospective meetings.
  • make sure you have the ability to deploy software frequently with minimal disruption to users (Digital Service Standard (UK))

Live stage:

  • make sure deployments have zero downtime, in a way that doesn't stop users from using the service (Digital Service Standard (UK)); a minimal health-check sketch follows this list
  • make sure you have enough staff to keep improving the service (Digital Service Standard (UK))
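
For illustration, here is a minimal sketch (in Python) of a health-check gate supporting zero-downtime deployment: traffic shifts to a new version only after it answers health checks reliably, and otherwise stays on the old version. The endpoint, thresholds and timings are assumptions for the example, not taken from the cited standards.

```python
"""Minimal health-check gate for a zero-downtime cut-over (illustrative only).

The URL, thresholds and rollback behaviour are assumptions for this sketch,
not requirements of the cited standards.
"""
import time
import urllib.request

HEALTH_URL = "https://new-version.example/health"  # hypothetical endpoint
REQUIRED_SUCCESSES = 5        # consecutive healthy responses before cut-over
CHECK_INTERVAL_SECONDS = 10


def is_healthy(url: str) -> bool:
    """One health probe; any network error counts as unhealthy."""
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            return response.status == 200
    except OSError:
        return False


def gate_cutover() -> bool:
    """Return True once the new version looks stable, False if it never does."""
    successes = 0
    for _ in range(REQUIRED_SUCCESSES * 3):  # give up after three times the budget
        if is_healthy(HEALTH_URL):
            successes += 1
            if successes >= REQUIRED_SUCCESSES:
                return True   # safe to shift traffic to the new version
        else:
            successes = 0     # any failure resets the streak
        time.sleep(CHECK_INTERVAL_SECONDS)
    return False              # keep traffic on the old version: no user-visible downtime
```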

Implementation guides

Reusable solutions

Similar resources

2.2 Continuously improve in response to user needs

[TODO: Add/revise introductory text]

Once you have designed and launched a service, there is still work to do. Treat the service as a product: it requires regular reviews, usability tests and improvements. Unlike a project, which has a pre-determined start and end date, a product has a life cycle that extends far beyond the launch of the service. Regularly assessing the service and welcoming opportunities for improvement will help ensure that the service keeps pace with evolving client needs and benefits from new or improved technology. (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))

At every stage of a project, we should measure how well our service is working for our users. This includes measuring how well a system performs and how people are interacting with it in real time. Our teams and agency leadership should carefully watch these metrics to find issues and to identify which bug fixes and improvements should be prioritized. Along with monitoring tools, a feedback mechanism should be in place for people to report issues directly. (Digital Services Playbook (US))

Continuously capture and monitor performance data to inform ongoing service improvements.

Measuring performance means continuously improving a service by:

  • learning its strengths and weaknesses
  • using data to support changes

(Digital Service Standard (Ontario))
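
As one illustration of the direct feedback mechanism described above, here is a minimal sketch of a feedback endpoint. Flask is an assumption, and the route and field names are hypothetical; in practice, captured entries would feed the regular reviews described in the checklist below.

```python
"""Minimal feedback endpoint (illustrative only). Flask is assumed;
the route and field names are hypothetical."""
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
feedback_log: list[dict] = []  # stand-in for a real datastore


@app.post("/api/feedback")
def submit_feedback():
    """Accept a free-text issue report tied to the page it came from."""
    payload = request.get_json(silent=True) or {}
    message = (payload.get("message") or "").strip()
    if not message:
        return jsonify({"error": "message is required"}), 400
    feedback_log.append({
        "page": payload.get("page", "unknown"),
        "message": message,
        "received_at": datetime.now(timezone.utc).isoformat(),
    })
    return jsonify({"status": "received"}), 201
```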

Checklist

[TODO: Add/revise checklist items]

  • have a quality assurance testing and rollback plan that supports frequent iterations to the service (Digital Service Standard (Ontario))
  • use a phased approach to test changes to part of the service, when feature-based changes are not feasible (Digital Service Standard (Ontario))
  • Define your testing objective (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
    • Define the purpose of the test and what you want to learn. The purpose of the test is often determined by your business goals and by user needs identified through feedback, analytics and other sources.
    • Identify top or critical tasks to test. Prioritize the main outcomes and features your clients want to achieve.
  • Test under realistic conditions (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
    • Create realistic scenarios that reflect the context and environment in which clients would use the service.
  • test the service in an environment that is as similar to the live environment as possible (Digital Service Standard (Ontario))
  • Commit to regular service reviews (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
  • Identify opportunities to improve the service based on the results of regular tests (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
  • analyze user research and use it to improve your service (Digital Service Standard (UK))
  • Use different types of tests to assess the service (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
    • Identify the best testing method based on your needs. Examples of tests include:
      • Tree Testing - a test in which participants are asked to find an item using only the menu structure.
      • Card Sorting - a reverse tree test in which participants sort items into groups and arrange them hierarchically.
      • First Click Testing - a test that records the first element a participant clicks, using that selection to show whether users are being directed as intended.
  • have a process for testing changes made to the service (Digital Service Standard (Ontario))
  • have a process for monitoring and testing the service frequently even when changes are not being made (Digital Service Standard (Ontario))
  • Create automated tests that verify all user-facing functionality (Digital Services Playbook (US))
  • Create unit and integration tests to verify modules and components (Digital Services Playbook (US))
  • Run tests automatically as part of the build process (Digital Services Playbook (US)); a unit-test sketch follows this list
  • Conduct load and performance tests at regular intervals, including before public launch (Digital Services Playbook (US))
  • Monitor system-level resource utilization in real time (Digital Services Playbook (US))
  • Monitor system performance in real time (e.g. response time, latency, throughput and error rates) (Digital Services Playbook (US))
  • Ensure monitoring can measure median, 95th percentile and 98th percentile performance (Digital Services Playbook (US)); a percentile-monitoring sketch follows this list
  • Create automated alerts based on this monitoring (Digital Services Playbook (US))
  • Track concurrent users in real time, and monitor user behaviors in the aggregate to determine how well the service meets user needs (Digital Services Playbook (US))
  • Use an experimentation tool that supports multivariate testing in production (Digital Services Playbook (US)); a variant-assignment sketch follows this list
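
The testing items above can be made concrete with a small example. Below is a minimal unit-test sketch using pytest; `validate_postal_code` is a hypothetical user-facing function invented for illustration, not part of any cited playbook. A build step that simply runs `pytest` would then block any change that breaks these expectations, which is what running tests automatically as part of the build process amounts to.

```python
"""Unit-test sketch (illustrative only); the function under test is hypothetical."""
import re

import pytest


def validate_postal_code(code: str) -> bool:
    """Simplified Canadian postal-code format check (hypothetical example)."""
    return bool(re.fullmatch(r"[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d", code.strip()))


@pytest.mark.parametrize("code", ["K1A 0B1", "m5v3l9", " K1A 0B1 "])
def test_accepts_valid_codes(code):
    assert validate_postal_code(code)


@pytest.mark.parametrize("code", ["", "12345", "K1A-0B1", "K1A 0B"])
def test_rejects_invalid_codes(code):
    assert not validate_postal_code(code)
```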
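Likewise, a minimal sketch of percentile monitoring with an automated alert, assuming response times are recorded per request. The window size and alert threshold are assumptions for the example; a production service would typically rely on a dedicated metrics stack.

```python
"""Rolling-window percentile monitoring with a simple alert (illustrative only)."""
import statistics
from collections import deque

WINDOW = deque(maxlen=1000)   # most recent response times, in seconds
P95_ALERT_THRESHOLD_S = 2.0   # hypothetical service-level target


def record_response_time(seconds: float) -> None:
    WINDOW.append(seconds)


def current_percentiles() -> dict[str, float]:
    """Median, 95th and 98th percentile of the rolling window."""
    ranked = statistics.quantiles(WINDOW, n=100, method="inclusive")
    return {
        "p50": statistics.median(WINDOW),
        "p95": ranked[94],  # cut point at 95%
        "p98": ranked[97],  # cut point at 98%
    }


def check_alerts() -> None:
    if len(WINDOW) < 100:
        return  # not enough data yet for stable percentiles
    stats = current_percentiles()
    if stats["p95"] > P95_ALERT_THRESHOLD_S:
        # stand-in for a real alerting hook (pager, incident channel, etc.)
        print(f"ALERT: p95 latency {stats['p95']:.2f}s exceeds target")
```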
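Finally, a sketch of the deterministic variant assignment that underpins multivariate testing in production: hashing a stable user identifier keeps each user in the same variant across visits, so results can be compared fairly. The experiment name and variants shown are hypothetical.

```python
"""Deterministic experiment-variant assignment (illustrative only)."""
import hashlib


def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Map a user to one variant, uniformly and repeatably."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") % len(variants)
    return variants[bucket]


# Example: the same user always lands in the same bucket for this experiment.
print(assign_variant("user-123", "homepage-cta", ["control", "short-form", "long-form"]))
```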

Implementation guides

Reusable solutions

[TODO: Add/revise reusable solutions]

Similar resources

2.3 Try new things, start small and scale up

[TODO: Add/revise introductory text]

Checklist

[TODO: Add/revise checklist items]

  • Start with a prototype (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
    • Create a minimum viable product: a version of the service with just enough features to gather insights, test assumptions and inform future improvements. Use the prototype to capture client feedback and then make improvements until you have a version that truly meets client needs.
  • Start small and build upon successes. (General design principles - Digital Design Playbook (ISED))
  • Don’t wait for a fully developed service to start testing. (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
    • Develop a prototype of the service and test it to validate ideas, to challenge assumptions and to identify opportunities for improvement.

Implementation guides

[TODO: Add/revise implementation guides]

Reusable solutions

[TODO: Add/revise reusable solutions]