Digital Standards (draft) - Single page view
- 1. Design with users
- 2. Iterate and improve frequently
- 3. Work in the open by default
- 4. Use open standards and solutions
- 5. Address security and privacy risks
- 6. Build in accessibility from the start
- 7. Empower staff to deliver better services
- 8. Be good data stewards
- 9. Design ethical services
- 10. Collaborate widely
1. Design with users
[TODO: Add/revise introductory text]
1.1 Research with users to understand their needs and the problems we want to solve
Focus on the needs of your users, using agile, iterative and user-centred methods when building a service. Start with extensive research and analysis to understand who is using the service, what their needs are and how the service will affect their lives, so that you can better understand how the service should be designed. The absence of the user voice leads to assumptions that may be incorrect and costly.
A key part of building digital services that work for users is developing a good understanding of who the users are, what their needs are and how the service will affect their lives. It is equally important to understand the different contexts in which users could be interacting, since user needs and expectations can vary depending on where, when and how they use a digital service.
Checklist
- Put in place a plan to pay for user research throughout the design of the service and after it's built
- Interview potential users to help develop the following for the service:
- User goals (e.g., As a [user type], I want [some goal] so that [some reason])
- User personas (e.g., based on habits, personality, attitudes and motives)
- User profiles (e.g., based on demographics such as gender, age, location, income and family size)
- Use a range of qualitative and quantitative research methods to determine people's goals, needs, and behaviours
- Create and maintain a list of priority tasks that users are trying to accomplish (i.e., "user stories")
- Document all end-to-end user journeys, including journeys that involve multiple services and external services
- Understand how users will interact with the service, optimizing the experience for online and offline interactions
- Use plain language that is appropriate and easy to understand for the audience
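The user-goal format in the checklist above ("As a [user type], I want [some goal] so that [some reason]") can be captured as simple structured data, which makes a backlog of user stories easy to maintain, sort and share. A minimal sketch in Python (the `UserStory` class and the example story are illustrative, not part of any standard):

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    """One prioritized task a user is trying to accomplish."""
    user_type: str
    goal: str
    reason: str

    def render(self) -> str:
        # Standard "As a / I want / so that" user-story template
        return f"As a {self.user_type}, I want {self.goal} so that {self.reason}."

# Hypothetical example entry in a team's prioritized backlog
story = UserStory(
    user_type="new parent",
    goal="to apply for a benefit online",
    reason="I don't have to visit an office in person",
)
print(story.render())
# → As a new parent, I want to apply for a benefit online so that I don't have to visit an office in person.
```

Keeping stories in a structured form like this also makes it straightforward to maintain the prioritized task list the checklist calls for.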
Implementation guides
Reusable solutions
Similar resources
- 1. Understand user needs (Digital Service Standard (UK))
- 1. Understand users and their needs (Digital Service Standard (Ontario))
- 1. Understand what people need (Digital Services Playbook (US))
- 1. Understand user needs (Digital Service Standard (AU))
- 1. Understand client needs (Think - Digital Design Playbook (ISED)) (internal to GC only)
- 2. Do ongoing user research (Digital Service Standard (UK))
- 12. Make sure users succeed first time (Digital Service Standard (UK))
1.2 Conduct ongoing testing with users to guide design and development
User needs are constantly evolving, which is why it is important to plan for ongoing user research and usability testing. Engage users at all stages, continuously seeking feedback to ensure the service helps users accomplish their tasks, and keep improving the service to better meet user needs.
Users should be involved throughout the lifecycle of the service, with user research and testing informing the earliest design phases through to continuous improvements after the service has launched.
When designing a service, it is important to determine the problems that the service needs to solve and how it will help users achieve their goals. The focus shouldn't be just on the service itself, but also on how the service fits into the overall user journey. The service should be designed to integrate seamlessly into that journey and be measured regularly to ensure that it is meeting user needs.
Checklist
- Put in place a plan to pay for usability tests throughout the design of the service and after it's built
- Use qualitative and quantitative data to help improve your understanding of user needs and identify areas for improvement
- Make services simple, intuitive and consistent
- For multi-step processes, provide users with clear information about where they are in the process and the ability to exit and return later without losing progress
Beta and live stages:
- Regularly test with users when building the service and after it has been launched, to ensure it meets the needs of users and to identify any parts of the service that users may find difficult
- Continuously measure client experience and create a customer-prioritized improvement plan. (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
- Test with clients and others (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- You need to ensure that the service works from a technical perspective and from the perspectives of the user and the service provider (including the help desk agent who assists clients when they face challenges using the service). By testing with a diverse group and different types of users, you can capture a more comprehensive understanding of how your service is working.
- Make sure the participants are representative of your clients.
- Utilize user experience testing services offered by the Chief Information Office and the Communications team
- Plan and deliver client testing cycles (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- Pilot your test: Make sure it all works
- Implement the test.
- Test often (e.g., at six-month or yearly intervals), apply the findings and keep on testing.
- Regularly assess the service, identifying and fixing problem areas that are degrading the user experience
- Regularly measure how well the service is meeting user needs at each step of the service and for the end-to-end experience
- Provide a mechanism for users to provide feedback and to address service issues in a timely manner (as required by the Policy on Service).
- Focus on the needs of users, using agile, iterative, and user-centred methods
Implementation guides
Reusable solutions
Similar resources
- 5. Ensure users succeed the first time (Digital Service Standard (Ontario))
- 4. Design the service from start to finish (Digital Service Standard (Ontario))
- 2. Address the whole experience, from start to finish (Digital Services Playbook (US))
- 3. Make it simple and intuitive (Digital Services Playbook (US))
- 12. Support those who need it (Digital Service Standard (Ontario))
2. Iterate and improve frequently
[TODO: Add/revise introductory text]
2.1 Develop services using agile, iterative and user-centred methods
[TODO: Add/revise introductory text]
Digital Service Standard (Ontario): Design and build the service using an agile and user-centred approach. Agile is an approach to building services that breaks the work into smaller chunks known as iterations. Build one feature of the service at a time until the entire service is complete.
It is a much lower-risk approach than the traditional build-it-all-at-once approach known as waterfall, because frequent iterations expose flaws in the original plan much faster (e.g., missing approvals, insufficient resources, not having the right people on the team).
User-centred methods such as user research and usability testing put the focus on making services that are easy-to-use. Traditional government services focus on meeting business needs and aligning with policy goals. A user-centred approach ensures business needs are also balanced against user needs. This helps to increase digital service uptake.
Checklist
[TODO: Add/revise checklist items]
- work in an agile way, using agile tools and techniques, and continue to do so when the service is live (Digital Service Standard (Ontario / UK / AU))
- ensure the team reviews and iterates the ways problems are fixed (Digital Service Standard (Ontario / UK / AU))
- show that your service governance is agile, based on clear and measurable goals (Digital Service Standard (Ontario / UK / AU))
- explore design options for your prototype and explain why some are discarded (Digital Service Standard (Ontario / UK))
- When iterating, focus on workable solutions over comprehensive documentation. (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
- Having a workable solution that can be tested and validated will give you useful information for improving your service. Whenever possible, focus on results rather than unnecessary documentation and reporting (while staying within policy and regulatory limits).
- When you can, use agile tools and techniques. (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
- Techniques can include: daily stand-ups, issue trackers, code reviews, rapid prototyping, design sprints, usability testing, user stories and retrospective meetings.
- make sure you have the ability to deploy software frequently with minimal disruption to users (Digital Service Standard (UK))
- make sure deployments have zero downtime so they don't stop users from using the service (Digital Service Standard (UK))
- make sure you have enough staff to keep improving the service (Digital Service Standard (UK))
Implementation guides
- 247 different checklists for usability testing (userfocus.co.uk)
- Agile Principles & Practices (18F (US))
- Scrum (Scrum Alliance)
- The Scrum Framework (Scrum Inc.)
- Lean (internal to Government of Canada)
- Kanban (development) (Wikipedia)
- GCpedia Community of Practice Agile Methods (internal to Government of Canada)
- CRA Agile Framework (internal to Government of Canada)
- Agile delivery (Digital Service Standard (UK))
- Service design and delivery process (Digital Service Standard (AU))
- Manifesto for Agile Software Development (agilemanifesto.org)
- Is your project using ‘agilefall’? (18F (US))
- Agile methods: an introduction (Service Manual (UK))
- How to be agile in a non-agile environment (Government Digital Service blog (UK))
- You can’t be half agile (Government Digital Service Blog (UK))
Reusable solutions
Similar resources
- 4. Use agile methods (Digital Service Standard (UK))
- 5. Iterate and improve frequently (Digital Service Standard (UK))
- 8. Be agile and user-centred (Digital Service Standard (Ontario))
- 4. Build the service using agile and iterative practices (Digital Services Playbook (US))
- 3. Agile and user-centred process (Digital Service Standard (AU))
- 3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)) (internal to Government of Canada)
- 2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)) (internal to Government of Canada)
2.2 Continuously improve in response to user needs
[TODO: Add/revise introductory text]
Once you have designed and launched a service, there is still work to do. Treat the service as a product; it requires regular reviews, usability tests and improvements. Unlike a project, which has a pre-determined start and end date, a product has a life cycle that goes far beyond the launch of the service. Regularly assessing the service and welcoming opportunities for improvement will help to ensure that the service keeps pace with evolving client needs and benefits from new or improved technology. (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
At every stage of a project, we should measure how well our service is working for our users. This includes measuring how well a system performs and how people are interacting with it in real-time. Our teams and agency leadership should carefully watch these metrics to find issues and identify which bug fixes and improvements should be prioritized. Along with monitoring tools, a feedback mechanism should be in place for people to report issues directly. (Digital Services Playbook (US))
Continuously capture and monitor performance data to inform ongoing service improvements.
Measuring performance means continuously improving a service by:
- learning its strengths and weaknesses
- using data to support changes
(Digital Service Standard (Ontario))
Checklist
[TODO: Add/revise checklist items]
- have a quality assurance testing and rollback plan that supports frequent iterations to the service (Digital Service Standard (Ontario))
- use a phased approach to test changes to part of the service, when feature-based changes are not feasible (Digital Service Standard (Ontario))
- Define your testing objective (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- Define the purpose of the test and what you want to learn. The purpose is often determined by your business goals and by user needs identified through feedback, analytics and other sources.
- Identify top or critical tasks to test. Prioritize the main outcomes and features your clients want to achieve.
- Test under realistic conditions (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- Create realistic scenarios that reflect the context and environment in which clients would use the service.
- test the service in an environment that is as similar to the live environment as possible (Digital Service Standard (Ontario))
- Commit to regular service reviews (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
- Identify opportunities to improve the service based on the results of regular tests (2. Product management, not just project management. (Assess - Digital Design Playbook (ISED)))
- analyze user research and use it to improve your service (Digital Service Standard (UK))
- Use different types of tests to assess the service (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- Identify the best testing method based on your needs. Examples of tests include:
- Tree Testing - A test in which participants are asked to find a resource based on a series of menus.
- Card Sorting Testing - A reverse tree test where participants sort through items and group them together in a hierarchical manner.
- First Click Testing - A test that observes the first item a participant clicks on and uses that selection as an indication of whether users are being directed as intended.
- have a process for testing changes made to the service (Digital Service Standard (Ontario))
- have a process for monitoring and testing the service frequently even when changes are not being made (Digital Service Standard (Ontario))
- Create automated tests that verify all user-facing functionality (Digital Services Playbook (US))
- Create unit and integration tests to verify modules and components (Digital Services Playbook (US))
- Run tests automatically as part of the build process (Digital Services Playbook (US))
- Conduct load and performance tests at regular intervals, including before public launch (Digital Services Playbook (US))
- Monitor system-level resource utilization in real time (Digital Services Playbook (US))
- Monitor system performance in real-time (e.g. response time, latency, throughput, and error rates) (Digital Services Playbook (US))
- Ensure monitoring can measure median, 95th percentile, and 98th percentile performance (Digital Services Playbook (US))
- Create automated alerts based on this monitoring (Digital Services Playbook (US))
- Track concurrent users in real-time, and monitor user behaviors in the aggregate to determine how well the service meets user needs (Digital Services Playbook (US))
- Use an experimentation tool that supports multivariate testing in production (Digital Services Playbook (US))
- Ensure that data used by the Automated Decision System is routinely tested to confirm that it is still relevant, accurate and up to date, and follow any applicable policies or guidelines on data management practices in accordance with the Policy on Information Management.
- Ensure quality is considered throughout the Software Development Lifecycle
- Encourage and adopt Test-Driven Development (TDD) to improve trust between business and IT
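The monitoring items in the checklist above call for measuring median, 95th percentile and 98th percentile performance and creating automated alerts from that monitoring. As a rough sketch of what such a check might look like (the nearest-rank percentile method, the function names and the 500 ms alert threshold are illustrative assumptions; a production service would normally rely on its monitoring stack rather than hand-rolled code):

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile of a list of response times (ms)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

def latency_report(samples_ms, alert_threshold_ms=500):
    """Median / 95th / 98th percentile summary, plus a simple alert flag."""
    report = {
        "median_ms": percentile(samples_ms, 50),
        "p95_ms": percentile(samples_ms, 95),
        "p98_ms": percentile(samples_ms, 98),
    }
    # Automated alert: fire when tail latency exceeds the service target
    report["alert"] = report["p95_ms"] > alert_threshold_ms
    return report

# Example: 100 requests, mostly fast with a slow tail
samples = [100] * 90 + [600] * 10
print(latency_report(samples))
# → {'median_ms': 100, 'p95_ms': 600, 'p98_ms': 600, 'alert': True}
```

Tracking tail percentiles rather than averages matters because a slow tail (10% of requests here) barely moves the median but is exactly what frustrated users experience.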
Implementation guides
- Test your service's performance (Service Manual (UK))
- Quality assurance: testing your service regularly (Service Manual (UK))
- Test your service's performance (Digital Service Standard (UK))
- Deployment environments (Digital Service Standard (UK))
- Vulnerability and penetration testing (Digital Service Standard (UK))
- Testing your service (Service Manual (UK))
- TBITS 26: Software Product Evaluation, Quality Characteristics and Guidelines for their Use (GC)
- Exploratory Testing (Service Manual (UK))
- Testing Cookbook (18F (US))
- Using data to improve your service: an introduction (Digital Service Standard (UK))
- Choosing digital analytics tools (Digital Service Standard (UK))
- Measuring digital take-up (Digital Service Standard (UK))
- Measuring user satisfaction (Digital Service Standard (UK))
- Measuring cost per transaction (Digital Service Standard (UK))
- Measuring completion rate (Digital Service Standard (UK))
- Benefits of User-centered Design (Usability.gov (US))
- Measuring success (Service manual (UK))
Reusable solutions
[TODO: Add/revise reusable solutions]
Similar resources
- 1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)) (internal to Government of Canada)
- 3. Review and improve services continually (Think - Digital Design Playbook (ISED)) (internal to Government of Canada)
- 10. Test the end-to-end service (Digital Service Standard (UK))
- 6. Test the end-to-end service (Digital Service Standard (Ontario))
- 10. Automate testing and deployments (Digital Services Playbook (US))
- 15. Collect performance data (Digital Service Standard (UK))
- 13. Measure performance (Digital Service Standard (Ontario))
- 12. Use data to drive decisions (Digital Services Playbook (US))
2.3 Try new things, start small and scale up
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
- Start with a prototype (3. Apply agile principles and be iterative. (Do - Digital Design Playbook (ISED)))
- Create a minimum viable product, that is, a version of the service with just enough features to gather insights, test assumptions and inform future improvements. Use the prototype to capture client feedback and then make improvements until you have a version that really meets client needs.
- Start small and build upon successes. (General design principles - Digital Design Playbook (ISED))
- Don’t wait for a fully developed service to start testing. (1. Test the service before launching the service. (Assess - Digital Design Playbook (ISED)))
- Develop a prototype of the service and test it to validate ideas, to challenge assumptions and to identify opportunities for improvement.
Implementation guides
[TODO: Add/revise implementation guides]
Reusable solutions
[TODO: Add/revise reusable solutions]
3. Work in the open by default
3.1 Share evidence, research and decision making openly
[TODO: Add/revise introductory text]
Identify performance indicators for the service, including the 4 mandatory key performance indicators (KPIs) defined in the manual. Establish a benchmark for each metric and make a plan to enable improvements.
Setting performance indicators allows you to continuously improve your service by:
- learning its strengths and weaknesses
- using data to support improvements you make
(Digital Service Standard (UK))
Share your experiences with colleagues across the Government of Canada, other levels of government, clients and service providers. Sharing experiences and best practices helps to raise the overall service quality. It helps to reduce duplication of effort and save costs. So share ideas, share intentions, share failures and learn together. (Plan - Digital Design Playbook (ISED))
Checklist
[TODO: Add/revise checklist items]
- Procuring goods and services in the open is an important part of an open environment (Open Markets - Open First Whitepaper (GC))
- When appropriate, share your development process and progress publicly (Digital Services Playbook (US))
- Document and show your work. (Plan - Digital Design Playbook (ISED))
- If you are redesigning a service, document the changes and show how they will enhance the client experience when using the service. (Plan - Digital Design Playbook (ISED))
- document where you're getting the data for your metrics (Digital Service Standard (UK))
- set up your analytics package to collect user journey data (Digital Service Standard (UK))
- Publish metrics externally (Digital Services Playbook (US))
- make sure all stakeholders are actively involved in promoting or supporting digital delivery of the new service (Digital Service Standard (UK))
- track people moving from using the offline service to the online one (Digital Service Standard (UK))
- Publish information on the effectiveness and efficiency of Automated Decision Systems annually on a website or service designated by the Treasury Board of Canada.
- When requested, provide information on the achievement of the expected results of the Automated Decision System and on compliance with the Directive on Automated Decision-Making (draft) (GC) to the Treasury Board of Canada Secretariat.
- Publish a Service Level Agreement for each service
- Make an audit trail available for all transactions to ensure accountability and non-repudiation
- Establish business and IT metrics to enable business outcomes
- Apply oversight and lifecycle management to digital investments through governance
Implementation guides
- Using data to improve your service: an introduction (Digital Service Standard (UK))
- Choosing digital analytics tools (Digital Service Standard (UK))
- Measuring digital take-up (Digital Service Standard (UK))
- Sharing data on the Performance Platform (Digital Service Standard (UK))
- Measuring success (Digital Service Standard (UK))
Similar resources
3.2 Make all non-sensitive data, information, and new code developed in delivery of services open to the outside world for sharing and reuse under an open licence
Make all source code open and reusable under an appropriate open source software licence, so that other developers can:
- benefit from your work and build on it
- learn from your experiences
- identify parts of your code for reuse which you might not have recognized yourself
This includes working in the open, sharing any and all data and information produced in developing the solution, and making the final solution available as open source software. Publishing your code and data from the beginning of your technology project or programme will encourage:
- clearer documentation, making it easier for your team to maintain the code, track changes to it and for other people to use it
- cleaner and well-structured code that is easier to maintain
- processes that will allow you to continuously publish code as it is written
- clarity around data that needs to remain protected and how that's achieved
- suggestions about how the code can be improved or where security can be improved
- others to contribute ideas as the project is in progress
Checklist
- Work in the open and make data and source code open and reusable
- Host source code publicly in an open internet source code repository
- Use an Open Source Initiative approved licence
- Offer users a mechanism to report bugs and issues, and be responsive to these reports
- Keep track of changes to the source code using version control
- Release the final results of Algorithmic Impact Assessments in an accessible format via Government of Canada websites and services designated by the Treasury Board of Canada Secretariat pursuant to the Directive on Open Government.
- Make available to the public all of the source code used for the Automated Decision Systems on the Open Resource Exchange.
- In cases where it is deemed that source code should not be disclosed, seek the approval of the Enterprise Architecture Review Board to exempt the disclosure. In these cases, the justification as to why code was not disclosed shall be published according to the process specified in the Directive on Open Government.
- Source code for systems that are classified SECRET or TOP SECRET is exempt from being made available to the public on the Open Resource Exchange.
- Ensure that all licenses required for the Automated Decision Systems are open licenses as listed in the Open Source Software Registry.
- Ensure that Canada maintains the right to have access to foreground intellectual property to respond to any legal challenges.
- Make source code open and reusable under an appropriate open source software license
- Expose public data to implement Open Data and Open Information initiatives
- Share code publicly when appropriate, and when not, share within the Government of Canada
Implementation guides
Similar resources
- Open Source Software Contribution (Open First Whitepaper (GC))
- Open Culture (Open First Whitepaper (GC))
- Logiciels libres et ouverts - Guide de référence (Québec)
- Politique du libre (Montréal)
- 8. Make all new source code open (Digital Service Standard (UK))
- 3. Be open and use open source (Technology Code of Practice (UK))
- DINSIC Open-source contribution policy (France)
- Open Source Software (Federal Source Code Policy (US))
- 13. Default to open (Digital Services Playbook (US))
- 18F Open Source Policy (US)
- 8. Make source code open (Digital Service Standard (AU))
4. Use open standards and solutions
4.1 Leverage open standards and embrace leading practices, including the use of open source software where appropriate
Build technology that uses open standards to ensure your system works and communicates with other products or systems, and can easily be upgraded and expanded.
Adopting and using open standards means you can:
- move between different technologies when you need to, avoiding vendor lock-in
- quickly and easily change your service when you need to
- increase compatibility with all stakeholders
- open up the range of companies you can purchase from as more of them are likely to use the same standard as you
- access a wider range of both open source and proprietary software vendors
Our choices for hosting infrastructure, databases, software frameworks, programming languages and the rest of the technology stack should seek to avoid vendor lock-in and match what successful modern consumer and enterprise software companies would choose today. In particular, digital services teams should consider using open source software, cloud-based, and commodity solutions across the technology stack, because of their widespread adoption and support by successful consumer and enterprise technology companies in the private sector.
Open source software (OSS) tends to use and help define open standards and publicly available specifications. OSS products are, by their nature, built to publicly available specifications, and the availability of their source code promotes open, democratic debate around those specifications, making them both more robust and interoperable.
Using open source software means you can benefit from:
- solving common problems with readily available open source technology
- more time and resources for customized solutions that solve rare or unique problems
- lower implementation and running costs
Checklist
[TODO: Add/revise checklist items]
- Use open standards and open source software at every layer of the technology stack
- Factor in the use of open standards and open source software when calculating the total cost of ownership of a solution, including exit or transition costs
- Avoid lock-in to proprietary solutions where open source software and/or open standards are available
- Ensure that software can be deployed on a variety of commodity hardware types
- Cloud services are identified and evaluated as the principal delivery option when initiating IT investments, initiatives, strategies and projects.
- In considering how to manage security risks, departments and agencies must follow the GC Cloud Security Risk Management Approach and Procedures and the Direction on the Secure Use of Commercial Cloud Services: Security Policy Implementation Notice (SPIN).
- Departments and agencies may deploy solutions that have data-categorization requirements that fall outside of a particular cloud security control profile, as described in the Government of Canada Security Control Profile for Cloud-Based GC IT Services, with appropriate risk-mitigation measures that have been developed in consultation with GC security partners.
- To ensure, to the greatest extent possible, the GC’s continuous access to sensitive data, departments and agencies must comply with the Direction for Electronic Data Residency.
- To ensure business continuity and to manage risks, departments and agencies will develop an appropriate exit strategy before using cloud services.
- Departments and agencies should consider portability and interoperability of services when designing cloud-based solutions.
- Ensure that the relevant employees are sufficiently trained in the design, function, and implementation of the Automated Decision System to be able to review, explain and oversee automated decision-making, as prescribed in the following:
- Level I: None
- Level II: Documentation on the design and functionality of the system
- Level III: Documentation on the design and functionality of the system; training courses must be completed.
- Level IV: Documentation on the design and functionality of the system; recurring training courses; and a means to verify that training has been completed.
- Define program services as business capabilities to establish a common vocabulary between business, development, and operation
- Model business processes using the Unified Modeling Language (UML) to identify common enterprise processes
- Avoid lock-in and seek independence and substitutability where open source software or open standards are available
- Enforce this order of preference: open source first, then platform-agnostic COTS, then proprietary COTS, and lastly custom-built
- Enforce this order of preference: Software as a Service (SaaS) first, then Platform as a Service (PaaS), and lastly Infrastructure as a Service (IaaS)
- Enforce this order of preference: Public cloud first, then Hybrid cloud, then Private cloud, and lastly non-cloud (on-premises) solutions
- Design for cloud mobility and develop an exit strategy to avoid vendor lock-in
- Design for resiliency
- Ensure response times meet user needs, and critical services are highly available
- Support zero-downtime deployments for planned and unplanned maintenance
- Use distributed architectures, assume failure will happen, handle errors gracefully, and monitor actively
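The resiliency items above ("assume failure will happen, handle errors gracefully") are often implemented with retry and backoff logic around calls to dependencies. A minimal sketch, assuming a transient-failure scenario (the function names, retry counts and delays are illustrative, not prescribed by the standard):

```python
import time

def call_with_retries(operation, attempts=3, base_delay=0.1):
    """Assume failure will happen: retry a flaky call with exponential
    backoff, and surface the last error instead of crashing mid-request."""
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except Exception as err:  # real code would catch specific errors
            last_error = err
            time.sleep(base_delay * (2 ** attempt))  # back off before retrying
    raise RuntimeError("operation failed after retries") from last_error

# Hypothetical dependency that fails twice before succeeding
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky))
# → ok
```

Exponential backoff gives a struggling dependency time to recover instead of amplifying the failure, and raising a single, descriptive error after the final attempt keeps the failure observable for the active monitoring the checklist calls for.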
Implementation guides
- Socle Logiciels Libres (France)
- Logiciels libres et ouverts - Guide d'analyse de maturité (Québec)
- Logiciels libres et ouverts - Guide d'analyse du coût total de propriété (Québec)
- Working with open standards (Service Manual (UK))
- Choosing technology: an introduction (Service Manual (UK))
- Australian Government ICT Policy Guides and Procurement (AU)
- Australian Government Open Source Software Policy (AU)
- DoD Open Source Software FAQ (US)
- DoD Memorandum on Guidance Regarding Open Source Software (US)
- W3C Standards (W3C)
- OASIS Standards (oasis-open.org)
- Government of Canada Right Cloud Selection Guidance
- Government of Canada Security Control Profile for Cloud-Based GC IT Services
- Government of Canada Cloud Adoption Strategy
- Government of Canada White Paper: Data Sovereignty and Public Cloud
- Government of Canada Cloud Security Risk Management Approach and Procedures
- Direction on the Secure Use of Commercial Cloud Services: Security Policy Implementation Notice (SPIN)
- Direction for Electronic Data Residency
Reusable solutions
Similar resources
- Open Standards (Open First Whitepaper (GC))
- Open Source Software Use (Open First Whitepaper (GC))
- Natural Resources Canada Free and Open Source Software Licensing Primer (GC)
- Logiciels libres et ouverts - Guide de référence (Québec)
- Politique du libre (Montréal)
- 18F Open Source Policy (US)
- 3. Be open and use open source (Technology Code of Practice (UK))
- 4. Make use of open standards (Technology Code of Practice (UK))
- 9. Use open standards and common platforms (Digital Service Standard (Ontario))
- Three-Step Software Solutions Analysis (Federal Source Code Policy (US))
- 1. Comply with Government of Canada acts, policies, standards and directives (Plan - Digital Design Playbook (ISED)) (internal to GC only)
- 2. Reuse, improve and share technological solutions where appropriate (Do - Digital Design Playbook (ISED)) (internal to GC only)
- 9. Deploy in a flexible hosting environment (Digital Services Playbook (US))
4.2 Design for services and platforms that are seamless for Canadians to use no matter what device or channel they are using
To limit costs, avoid duplication of effort and provide a consistent client experience across services, reuse and adapt existing technological solutions wherever possible. If new solutions must be developed, consider the ability of others to reuse and adapt your work, as this provides additional value at an organizational level.
Using common, proven government solutions, approaches, and platforms will help the government:
- meet the needs of your users by building with proven solutions
- make users' experience of government more consistent, which generates trust
- save time and money by reusing things that are already available
Interoperability is the ability of a product or system, whose interfaces are completely understood, to work with other products or systems, present or future, without restrictions on implementation or access. Interoperability should be ensured through the use of open standards.
Application Programming Interfaces (APIs) are a means by which business functionality is exposed digitally. They are building blocks critical to the successful delivery of government digital services and to expanding service delivery to third-party providers. They can also enable greater interoperability between services, optimize experiences across devices, and even lead to innovative new services by enabling third-party products to work seamlessly with Government of Canada systems.
Checklist
- Ensure content and functionality are optimized for a wide range of devices, including mobile devices and voice assistants, enabling users to be successful with their device of choice
- Build API-centric services, which execute most, if not all, functionality through API calls (e.g., connecting the frontend to the backend through an API)
- Plan out API access from the beginning, designing services to be able to safely and securely expose functionality to other systems and the public.
- Design APIs to be complete but also minimal, ensuring the expected functionality is provided but with as few public members per class and as few classes as possible. This makes the API easier to understand, remember, debug and change.
- Design APIs to have clear and simple semantics that make common tasks easy. Rare tasks should still be possible but not the focus. Avoid being overly general; optimize for specific use cases.
- Design APIs to be intuitive so that a semi-experienced user can be successful with minimal assistance from the documentation and programmers can easily understand code that uses the API.
- Design APIs to be easy to memorize by implementing a consistent and precise naming convention. Use plain language and recognizable patterns and concepts, avoiding abbreviations where possible.
- Work across the entire application lifecycle, from development and testing to deployment and operations
- Expose all functionality as services
- Use microservices built around business capabilities. Scope each service to a single purpose
- Run each service in its own process and have it communicate with other services through a well-defined interface, such as an HTTPS-based application programming interface (API)
- Run applications in containers
- Leverage enterprise digital exchange components such as the GC Service Bus, Digital Exchange Platform, and the API Store based on fit-for-use
- Leverage and reuse existing solutions, components, and processes
- Select enterprise and cluster solutions over department-specific solutions
- Achieve simplification by minimizing duplication of components and adhering to relevant standards
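As a minimal sketch of the API-centric pattern in the checklist above, the following stands up a single-purpose service and consumes it only through its HTTP interface, never through internal calls. The endpoint path and payload are illustrative assumptions; a production service would run over HTTPS behind proper authentication:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    """A single-purpose service exposing one well-defined endpoint."""

    def do_GET(self):
        if self.path == "/v1/status":
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence default request logging for the sketch

# Bind to an ephemeral port and serve in a background thread
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The consumer (e.g., a frontend) talks only through the API
with urllib.request.urlopen(f"http://127.0.0.1:{port}/v1/status") as resp:
    payload = json.loads(resp.read())

server.shutdown()
```

Because the consumer depends only on the interface contract (`GET /v1/status` returning JSON), the implementation behind it can be replaced, scaled or containerized independently.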
Implementation guides
- Canada.ca Content and Information Architecture Specification
- Canada.ca Content Style Guide
- Designing for different browsers and devices (Digital Service Standard (UK))
- API technical and data standards (Government Digital Service (UK))
- Developing cross-government API data and technical standards (Government Digital Service blog (UK))
- How to measure digital take-up (Service Manual (UK))
- Encouraging people to use your digital service (Service Manual (UK))
Reusable solutions
Similar resources
- 3. Be consistent (Digital Service Standard (Ontario))
- 6. Consistent and responsive design (Digital Service Standard (AU))
- 4. Design the service from start to finish (Digital Service Standard (Ontario))
- 2. Address the whole experience, from start to finish (Digital Services Playbook (US))
- 14. Encourage everyone to use the digital service (Digital Service Standard (UK))
- 11. Encourage people to use digital services (Digital Service Standard (Ontario))
5. Address security and privacy risks
Canadians who use government services must have confidence that:
- any information they provide is handled and stored appropriately
- they know how their information will be used by government
- they can easily retrieve information they provide
- their privacy is protected while they use the service, and afterwards
- the system they’re using is safe and secure
If a service cannot guarantee confidentiality, integrity and availability of the system, people will not use it. Effective cyber and IT security is an essential enabler of digital transformation. Securing #GCDigital requires the delivery of government services that are safe, secure and trusted by Canadians.
5.1 Take a balanced approach to managing risk by implementing appropriate privacy and security measures
All organizations face risks, no matter their size, yet one size does not fit all when it comes to risk management. Each IT organization has to make difficult decisions about how much time and money to spend protecting its technology and services. An understanding of the users, data and threats that affect the service will help to inform this risk-based approach to support the delivery of a usable and secure system. Appropriate steps must be taken to identify, assess and understand security and privacy risks to GC sensitive and protected data and the systems that process this data.
A key goal of risk management is to inform and improve these decisions. Making it easy for those responsible for risk management decisions to have access to (and understand) the information they require is important for the effective communication of risks, which in turn helps organizations direct and control risk management activities. Accept that technology and security risks will be realized, and understand what the organization will do to minimize damage, continue to operate, and make improvements based on lessons learned.
Cyber risk cannot be assessed in isolation. It must be assessed while considering potential impacts on other parts of the organization and interactions with other elements such as financial risk and safety. Understanding what an organization cares about, and why it's important, will help to prioritize where to invest when implementing appropriate privacy and security measures into your design with minimal user impact. The level of investment in privacy and security should be based on the perceived or actual value of the assets or information you are protecting. When considering the balance of controls, account for the cost of lost trust: the effort needed to rebuild trust should your service be compromised.
Include security and privacy in innovation
Canadians’ support for open data and digital services is enhanced when their privacy rights are protected: transparency and respect for privacy are complementary goals. The shift to digital government offers opportunities to strengthen privacy rights and safely share more data that can benefit society. Innovation must be matched by conscious responsibility regarding stewardship of users’ personal information and data.
Embedding privacy protection in the design of digital applications or open data increases political legitimacy and public confidence, and privacy safeguards are a necessary condition for a successful shift to a digital Government of Canada. Digital services also have the potential to enhance privacy rights, for example, by facilitating access to and correction of personal information.
Organizations have a responsibility to ensure that the data under their care remains protected at all times, including in the process of sharing with external partners and within their own network. This requires understanding what data is worth protecting, managing who and what can access it, and building effective defenses that both support innovation and protect the investment made in services and associated assets.
The law and governance of cyberspace is not the sole responsibility of, nor under the authority of, any one specific government or group; boundary-less services require a thorough understanding of every jurisdiction in which you operate.
Canadians want to have confidence that government digital services are designed to meet the laws and regulations stipulated in multiple acts protecting the confidentiality, integrity and accessibility of systems and information. Develop a legal and regulatory view of the department for the purposes of designing secure information systems through identifying the business needs for security. A business need for security is any protection or compliance requirement that ensures the confidentiality, integrity or availability of a business activity or information assets supporting a business activity. Business needs for security can also be derived from departmental missions, objectives, priorities, the need to preserve the organization's image and reputation, and various obligations that may have been contracted.
Checklist
- Document the approach to security and risk management, including the types of information and the optimum level of documentation necessary to make timely and effective decisions.
- Ensure that you identify and categorize information based on the degree of injury that could be expected to result from a compromise of its confidentiality, integrity and availability. Determine the type of information collected, how it should be secured, how long it is kept, and how it may be used and shared.
- At the start of designing a new service or feature, the team lead should engage the appropriate privacy, information management, and legal officer(s) to ensure that solutions comply with all requirements for the collection, sharing and protection of personal information.
- Determine what you absolutely need to protect through risk assessment and threat modeling – can you clearly identify what components are essential to the success of your service? Not all assets need the same level of protection. #pragmaticsecurity
- Determine what you’re willing to pay for security – can you adequately protect your assets with the desired level of investment? Is the cost of protection in excess of the value of your assets? #pragmaticsecurity
- Complete a preliminary privacy impact assessment (PPIA) or privacy impact assessment (PIA) if personal information is involved in the service.
- Establish a cycle of re-evaluation to ensure what you’re protecting is actually what you need to protect and make improvements based on lessons learned.
- Develop your system with the 7 principles of Privacy by Design (Information and Privacy Commissioner of Ontario) in mind.
- Ensure your service complies with Canadian security and privacy legislation, government policy instruments, and departmental security policies at all stages.
- Identify and understand the jurisdictional requirements of where your digital service operates, and where your stakeholders are.
- Develop, update, and maintain written cybersecurity policies and procedures, including on governance by both service and organizational management.
- Develop, publish and maintain training and awareness material as required, to establish secure service-user behaviours
- Make sure the service limits access to physical and logical assets and associated facilities to authorized users, processes, or devices consistent with the risk of unauthorized access.
- Maintain logs of user access and system interactions to fully trace a user as they traverse each part of the system
- Implement layered defenses to reduce exposure to cyber threats with increased awareness and understanding to proactively manage such threats
- Plan recurring interactions with the business and information risk teams to ensure ongoing alignment.
- Integrate a security advisor into the delivery team to support IT security risk management throughout the full delivery of the service.
- Document the protective measures implemented to enable the secure processing and sharing of data and information across government
- Document how the service manages information and records (data) in order to protect their confidentiality and integrity, and ensure their availability.
- Ensure all APIs have appropriate authentication and that only authorized users/services are able to access the information; “open data” APIs are explicitly configured to allow access by all by default.
{: .dpgn-standards-hide .dpgn-stage-beta} Note: Beta Stage includes all elements from the previous Alpha stage, plus the following:
- Where collecting personal information, inform users about privacy rights and protections, and about their right to access and correct their own personal information.
- Use appropriate de-identification strategies to minimize the risk of disclosing personal information.
- Establish a data access audit process to provide assurance to users that their data has not been accessed in an unauthorized manner.
- Incorporate privacy safeguards into partnership and data sharing agreements.
- Ensure that privacy breach protocol is implemented and understood. Federal institutions are required to notify the Office of the Privacy Commissioner of Canada (OPC) and the Treasury Board of Canada Secretariat (TBS) of all material privacy breaches and of the mitigation measures being implemented, if the breach involves sensitive personal information and could reasonably be expected to cause serious injury to the individual.
- Establish terms of service to ensure users understand how their data will be used and how it will be accessed
- Ensure your service has properly documented event management processes, in the event of a data breach or compromise of the integrity of your systems.
- Provide users adequate information (Terms and Conditions / Privacy Agreement) to ensure they fully understand the authority they are providing to 3rd party services.
- Ensure all APIs are developed in alignment with secure connection guidance; APIs should be accessed over HTTPS.
{: .dpgn-standards-hide .dpgn-stage-live} Note: Live Stage includes all elements from the previous Alpha and Beta stages, plus the following:
- Establish agreements with 3rd parties who may benefit from receiving data from your service in accordance with guidance such as the TBS Guidance on Preparing Information Sharing Agreements Involving Personal Information, to ensure they will treat your data with appropriate care.
- Conduct risk assessments throughout the development of the system and ensure appropriate safeguards are applied, as per the Policy on Government Security.
- Implement security across all architectural layers
- Categorize data properly to determine appropriate safeguards
- Perform a privacy impact assessment (PIA) when personal information is involved
- Balance user and business needs with proportionate security measures
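One de-identification strategy referred to above is pseudonymization: replacing direct identifiers with stable values that cannot be reversed without a secret key. The sketch below uses a keyed hash for this; the field names, sample record and salt handling are illustrative assumptions, not departmental guidance:

```python
import hashlib
import hmac

# Illustrative only: in practice the key would come from a managed
# secret store, never from source code.
SECRET_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible pseudonym for an identifier.

    HMAC-SHA-256 keys the hash, so the mapping cannot be rebuilt by
    anyone who does not hold the secret key (unlike a plain hash,
    which is vulnerable to dictionary attacks on small ID spaces).
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# A hypothetical record before and after de-identification
record = {"name": "Jane Doe", "client_id": "A-12345", "province": "ON"}
deidentified = {
    "user_ref": pseudonymize(record["client_id"]),  # direct identifier replaced
    "province": record["province"],                 # low-risk attribute retained
}
```

The pseudonym stays consistent across datasets (supporting linkage for analytics) while the name and raw identifier never leave the trusted boundary.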
Implementation guides
- Information Technology Policy Implementation Notices (ITPIN)
- GC Security Policy Implementation Notices (internal to Government of Canada)
- Security and Identity Management Policy Instruments
- Security Resource Centre
- OPC guidance for federal institutions (Office of the Privacy Commissioner of Canada (OPC))
- Guidance on Preparing Information Sharing Agreements Involving Personal Information
- Guidance Document: Taking Privacy into Account Before Making Contracting Decisions
- Guidelines for obtaining meaningful consent (Office of the Privacy Commissioner of Canada (OPC))
- Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) (Office of the Privacy Commissioner of Canada (OPC))
- Direction for Electronic Data Residency
- Canadian Criminal Code, Unauthorized use of computer (Sect 342.1/342.2)
- Canadian Criminal Code, Mischief in relation to computer data (Sect 430(1.1))
Reusable solutions
Similar resources
- Managing Risk through Digital Trust (CSO)
- 10. Embed privacy and security by design (Digital Service Standard (Ontario))
- Privacy Act (Office of the Privacy Commissioner of Canada)
- Personal Information Protection & Electronic Documents Act (PIPEDA) (Office of the Privacy Commissioner of Canada)
- Privacy by Design (Wikipedia)
- 7. Understand security and privacy issues (Digital Service Standard (UK))
- 11. Manage security and privacy through reusable processes (Digital Services Playbook (US))
- 5. Make it secure (Digital Service Standard (AU))
5.2 Make security measures frictionless so that they do not place a burden on users
Digital services need to be designed to provide a rich and streamlined user experience, while also ensuring that sensitive information is protected within a processing environment that remains secure throughout its lifecycle. Service owners must be mindful that users will often find a way to circumvent burdensome security measures for convenience. It is important to make security seamless and frictionless by designing security measures that enable the user experience, through a streamlined user interface and the features with which users interact, helping to improve the overall posture and prevent workarounds. Leveraging enabling services such as digital identity will help to provide users with access to digital services from their preferred device.
Services must be designed to resist attacks. However, security is not one-size-fits-all, and appropriate defenses are best developed to address the “soft spots” in your systems. Thinking about situations in which you could be compromised will help to identify and eliminate design issues. A defense-in-depth approach provides layered security measures to help protect against evolving and existing threats. It allows security to be addressed at multiple layers, hardening your systems where required while providing unimpeded operations elsewhere.
Integrating security from the outset and “shifting security left” in the service design will help to address security and privacy risks earlier in the development process, allowing teams to identify security needs as components are developed, reducing the cost and burden of changes later. A process of continuous review and improvement should be built into the development and maintenance of the service to support the selection of proportionate security measures that will protect against cyber attacks.
Checklist
- Implement an Identity and Access Management (IAM) solution that aligns with trusted digital identity frameworks, such as the Pan-Canadian Trust Framework, for security commensurate to service sensitivity, ID portability across platforms, and authentication and authorization agility.
- Where possible, provide users easily accessible means of authentication (e.g., biometrics) to your service; take advantage of improvements in consumer technologies.
- Use deployment scripts to ensure configuration of production environment remains consistent and controllable.
- Test and certify components in each layer of the technology stack for security vulnerabilities, and then reuse these same pre-certified components for multiple services.
- Ensure all APIs are developed in alignment with secure connections requirements from TBS and CSE; all APIs should be accessed over HTTPS only.
- Ensure all APIs have appropriate authentication and that only authorized users/services are able to access the information; “open data” APIs are explicitly configured to allow access by all by default
- Ensure your digital service offers a quick and easy reporting mechanism that enables security vulnerability disclosure; alerts should be treated with care and consideration equal to internal evaluations.
- Develop robust IT Continuity plans, including infrastructure and data backups, to ensure that your digital service is able to return to operational status with minimal disruption.
- Document the plan and process for technical updates and support for services/system software
- Leverage existing services and frameworks such as the Pan-Canadian Trust Framework to foster multi-jurisdictional service delivery.
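The authentication items in the checklist above can be illustrated with a minimal bearer-token check. The header format and token store are assumptions made for the sketch; a real service would rely on an IAM platform issuing short-lived credentials rather than a static set:

```python
import hmac

# Illustrative only: real tokens would be issued and validated by an
# IAM service, not stored in code.
VALID_TOKENS = {"example-token-issued-by-iam"}

def is_authorized(headers: dict) -> bool:
    """Check an 'Authorization: Bearer <token>' header against known tokens.

    hmac.compare_digest performs a constant-time comparison, avoiding
    timing side channels that could leak token contents.
    """
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    presented = auth[len("Bearer "):]
    return any(hmac.compare_digest(presented, t) for t in VALID_TOKENS)
```

A check like this would sit in front of every API endpoint (except deliberately open “open data” endpoints), so that only authorized users and services can reach protected functionality.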
Implementation guides
Reusable solutions
[TODO: Add/revise reusable solutions]
Similar resources
6. Build in accessibility from the start
Building in accessibility from the start is key to ensuring that your programs, services, tools and applications can be used by everyone: not only people living with a permanent disability, but also those who may have a temporary limitation or disability due to illness, accident, environmental changes or technological difficulties.
Worldwide, over a billion people, about 15% of the world's population, have some form of disability. Between 110 million and 190 million adults have significant difficulties in functioning (World Health Organization).
In Canada:
- 14% of the population (4.9 million Canadians) identify as a person with a disability
- 30% of the population have a disability if you include “invisible” disabilities (e.g., colour blindness, cognitive, mental health or chronic pain-related)
- 50% of the population have a disability if you take into account age-related impairments (e.g., low vision, low hearing or cognitive impairments)
- At least once in a person’s lifetime, they may also have a temporary disability brought on through accident, illness, repetitive strain or lifecycle changes (pregnancy).
Curb cuts are intended to help wheelchairs get up on sidewalks, but they also help bicyclists, parents with strollers, delivery people, and many other non-disabled groups. This benefit to others became known as the “Curb-Cut Effect”.
When inclusive and accessible design is in place, it is a benefit for all and seamlessly meets the needs of individuals across the board, including people without disabilities.
Sometimes people are in situations that limit their ability to hear, see, use their hands, concentrate, understand instructions, etc. Sometimes they are using devices that have limitations in size, input interface, etc. For example:
- Watching TV in a noisy environment (limits one's ability to hear, but closed captioning helps by conveying audio messages through text)
- Driving limits one's ability to concentrate on multiple things and limits the use of their hands. When drivers are lost, they often rely upon their smartphone for directions. To avoid tickets for distracted driving, taking their eyes off the road or taking their hands off the wheel, drivers can use voice recognition to ask the smartphone for directions and have them read aloud.
- Walking around with small children (e.g., curb cuts help with strollers; hands are full; ability to concentrate and follow instructions is limited)
- Having one's hands full may require relying on smart speakers for instructions (e.g., getting recipe details while cooking, getting step-by-step instructions while fixing things around the home)
- Using a small mobile phone in bright sunlight, where the glare reduces visibility, while browsing the Web with only one hand (e.g., carrying a bag in the other hand) on a slow Internet connection
- Having to interact in another language
These limitations are often cited as examples of how accessible design helps everyone, including people without disabilities. Accessible design also improves:
- findability: accessible content is easier to find
- understandability: content that is understandable for all users is also machine-readable
6.1 Services should meet or exceed accessibility standards
The Government of Canada is committed to ensuring that a high level of accessibility is applied uniformly across its service delivery channels. Technologies and standards are constantly evolving and accessibility plays a major role in making the Government of Canada more effective and inclusive. A more consistent, convenient, clear, and easy user experience when using government services online builds trust.
Development of accessible (regardless of ability, device or environment) digital services enhances the overall experience for everyone by improving and simplifying the overall design.
Checklist
[TODO: Add/revise checklist items]
- Meet the Standard on Web Accessibility
- Design the service to be as easy to use as possible. Usability is critical to making a service accessible for people with disabilities and limitations.
- Conduct research and testing to ensure the service is accessible to people of all abilities no matter how they access the service (Digital Service Standard (Ontario))
- Use ongoing research, testing and analytics to continually assess and improve the accessibility of the service.
- Ensure design and development resources have the knowledge and expertise to build accessible services and to resolve accessibility issues.
- Train development staff on use of keyboard-only navigation and ensure that new features are regularly tested (Digital Service Standard (Ontario))
- Ensure testing and quality assurance resources have the knowledge and expertise to identify accessibility issues.
- Ensure support resources are trained to assist persons with disabilities with completing tasks and accessing information.
- Make it easy for all users (including persons with disabilities) to provide feedback, address problems, and request support for using the service.
- Design the service to work with a variety of browsers and devices, including assistive devices.
- Ensure testing processes include testing for conformance to WCAG 2.0 level AA and testing with a variety of browsers, devices and assistive devices.
- Ensure, when technology platforms are considered, that there is transparency about known WCAG 2.0 AA issues and efforts to implement ATAG 2.0, Parts A and B (Digital Service Standard (Ontario))
-
Designing for users on the autistic spectrum (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- use simple colours
- write in plain English
- use simple sentences and bullets
- make buttons descriptive - for example, Attach files
- build simple and consistent layouts
-
Don't:
- use bright contrasting colours
- use figures of speech and idioms
- create a wall of text
- make buttons vague and unpredictable - for example, Click here
- build complex and cluttered layouts
-
Designing for users of screen readers (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- describe images and provide transcripts for video
- follow a linear, logical layout
- structure content using HTML5
- build for keyboard use only
- write descriptive links and headings - for example, Contact us
-
Don't:
- only show information in an image or video
- spread content all over a page
- rely on text size and placement for structure
- force mouse or screen use
- write uninformative links and headings - for example, Click here
-
Designing for users with low vision (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- use good contrasts and a readable font size
- publish all information on web pages (HTML)
- use a combination of colour, shapes and text
- follow a linear, logical layout, and ensure text flows and is visible when text is magnified to 200%
- put buttons and notifications in context
-
Don't:
- use low colour contrasts and small font size
- bury information in downloads
- only use colour to convey meaning
- spread content all over a page, and force users to scroll horizontally when text is magnified to 200%
- separate actions from their context
-
Designing for users with physical or motor disabilities (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- make large clickable actions
- give form fields space
- design for keyboard or speech only use
- design with mobile and touch screen in mind
- provide shortcuts
-
Don't:
- demand precision
- bunch interactions together
- make dynamic content that requires a lot of mouse movement
- have short time out windows
- tire users with lots of typing and scrolling
-
Designing for users who are D/deaf or hard of hearing (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- write in plain English
- use subtitles or provide transcripts for video
- use a linear, logical layout
- break up content with sub-headings, images and videos
- let users ask for their preferred communication support when booking appointments
-
Don't:
- use complicated words or figures of speech
- put content in audio or video only
- make complex layouts and menus
- make users read long blocks of content
- make telephone the only means of contact for users
-
Designing for users with dyslexia (Dos and don'ts on designing for accessibility (Government Digital Service blog (UK)))
-
Do:
- use images and diagrams to support text
- align text to the left and keep a consistent layout
- consider producing materials in other formats (for example, audio and video)
- keep content short, clear and simple
- let users change the contrast between background and text
-
Don't:
- use large blocks of heavy text
- underline words, use italics or write capitals
- force users to remember things from previous pages - give reminders and prompts
- rely on accurate spelling - use autocorrect or provide suggestions
- put too much information in one place
- Conform to both accessibility and official languages requirements
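Several of the checklists above, and WCAG 2.1 linked below, hinge on sufficient contrast between text and background. As an illustrative sketch only (the function names are mine; the formulas are the WCAG 2.x relative luminance and contrast ratio definitions):

```python
def _channel(c: int) -> float:
    """Linearize one 8-bit sRGB channel per the WCAG 2.x formula."""
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG 2.x relative luminance of an sRGB colour."""
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """Contrast ratio between two colours: (L1 + 0.05) / (L2 + 0.05), L1 >= L2."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
# WCAG 2.1 success criterion 1.4.3 (level AA) requires at least 4.5:1
# for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))
```

A check like this complements, but does not replace, testing with real users and assistive technologies.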
Implementation guides
- Dos and don'ts on designing for accessibility (Government Digital Service blog (UK))
- Web Content Accessibility Guidelines (WCAG) 2.1 (W3C)
- Accessible Rich Internet Applications (WAI-ARIA) 1.1 (W3C)
- Diversity of Web Users (W3C)
- Designing for Inclusion (W3C)
- Mobile Accessibility: How WCAG 2.0 and Other W3C/WAI Guidelines Apply to Mobile (W3C)
- Guidance on Applying WCAG 2.0 to Non-Web Information and Communications Technologies (WCAG2ICT) (W3C)
- Standard on Web Accessibility
- 18F Accessibility Guide (18F (US))
- The A11y Project (a11yproject.com)
- Making your service accessible (Service Manual (UK))
Reusable solutions
Similar resources
6.2 Users with distinct needs should be engaged from the outset to ensure what is delivered will work for everyone
[TODO: Add/revise introductory text]
Involving users early in projects helps you understand real-world accessibility issues, such as how people with disabilities and older people use the web with adaptive strategies and assistive technologies.
Involving users early helps you implement more effective accessibility solutions. It also broadens your perspective in a way that can lead you to discover new ways of thinking about your product that will make it work better for more people in more situations.
This applies when designing and developing:
- Websites and web applications
- Browsers, media players, and assistive technologies
- Authoring tools such as content management systems (CMS), blog software, and WYSIWYG editors
- Accessibility standards and policies
- Web technologies and technical specifications, such as HTML
(Involving Users in Web Projects for Better, Easier Accessibility (W3C))
Checklist
[TODO: Add/revise checklist items]
-
Including Users to Understand the Issues (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
-
As early as possible in your project:
- Learn the basics of how people with disabilities use the web by reading online resources and watching videos.
- Find people with disabilities, with a range of characteristics. See Getting a Range of Users and Working with Users below.
- Early on, learn about general issues related to what you are developing, e.g., websites, web tools, standards, or other products. Ask people to show you websites or related products that work well for them. Then, ask them to show you problems in products that do not work well. Ask lots of questions to help you understand the accessibility issues.
-
Including Users in Implementation (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
-
For example, for websites, web applications, and web tools:
- When you are considering a specific design aspect, such as expanding/collapsing navigation, find other products that are already doing it and have users explore with you what works well and what does not.
- Throughout your design and development, ask users to review prototypes. Give them specific tasks to complete and see how the different aspects of the design and coding could be improved. Ask lots of questions.
-
Carefully Consider Input (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
- Caution: Carefully consider all input and avoid assuming that input from one person with a disability applies to all people with disabilities. A person with a disability does not necessarily know how other people with the same disability interact with the web, nor know enough about other disabilities to provide valid guidance on other accessibility issues. Getting input from a range of users is best.
-
Getting a Range of Users (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
- People with disabilities are as diverse as any people. They have diverse experiences, expectations, and preferences. They use diverse interaction techniques, adaptive strategies, and assistive technology configurations. People have different disabilities: auditory, cognitive, neurological, physical, speech, and visual — and some have multiple disabilities. Even within one category, there is extreme variation; for example, "visual disability" includes people who have been totally blind since birth, people who have distortion in their central vision from age-related degeneration, and people who temporarily have blurry vision from an injury or disease.
- Include users with a variety of disabilities and user characteristics. Most projects have limited time and budget and cannot include many different users. Selecting the optimum number of users with the best suited characteristics can be difficult. There are resources on the web that provide guidance on selecting participants with disabilities; for example, determining participant characteristics and finding participants with disabilities (uiaccess.com).
-
Users' Experience Interacting with the Web (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
- A primary consideration in selecting users is their experience interacting with the web. For example, some assistive technologies (AT) are complicated and difficult to learn. A user with insufficient experience may not know how to use the AT effectively. On the other hand, a very advanced user might know uncommon work-arounds to overcome problems in a website that the "average" user would not be able to handle.
- In the early stages when you are first learning how people with disabilities interact with the web, it is usually best to get people with a fairly high experience level.
-
Working with Users (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
- Follow common practices for working with people informally and formally, for example:
- Develop appropriate relationships with your users. For example, spending time talking informally over lunch may help you work together more comfortably.
- Ensure informed consent and other research ethics. For example, participants in studies should be told that they are free to stop at any time.
- Treat people with disabilities and older users with the respect you would any other users. For example, respect their time and provide appropriate compensation.
-
Combine User Involvement with Standards (Involving Users in Web Projects for Better, Easier Accessibility (W3C))
- While including users with disabilities and older users with accessibility needs is key to making your accessibility efforts more effective and more efficient, that alone cannot address all issues. Even large projects cannot cover the diversity of disabilities, adaptive strategies, and assistive technologies. That is the role of accessibility standards.
- For websites and web applications, using comprehensive standards such as Web Content Accessibility Guidelines (WCAG) 2.0 helps ensure that you address all issues. Combine user involvement with evaluating conformance to WCAG to ensure that accessibility is provided to users with a range of disabilities and situations.
- For authoring tools such as content management systems (CMS), blog software, and WYSIWYG editors, follow Authoring Tool Accessibility Guidelines (ATAG).
- For browsers, media players, and other 'user agents', follow User Agent Accessibility Guidelines (UAAG).
Implementation guides
- Involving Users in Web Projects for Better, Easier Accessibility (W3C)
- Diversity of Web Users (W3C)
- Designing for Inclusion (W3C)
- Planning and Managing Web Accessibility (W3C)
- Determining participant characteristics (uiaccess.com)
- Finding participants with disabilities (uiaccess.com)
- Involving Users in Evaluating Web Accessibility (W3C)
- Analyzing Accessibility Issues (W3C)
- Drawing Conclusions and Reporting (W3C)
- Interacting with People with Disabilities (uiaccess.com)
- Assistive Technology and Location (uiaccess.com)
- The RESPECT Code of Practice (respectproject.org)
- Just Ask: Integrating Accessibility Throughout Design (uiaccess.com)
- Incorporating Accessibility Early and Throughout (uiaccess.com)
- A checklist for digital inclusion - if we do these things, we’re doing digital inclusion (Government Digital Service Blog (UK))
- Consider the range of people that will use your product or service (Government Digital Service blog (UK))
Reusable solutions
7. Empower staff to deliver better services
7.1 Make sure that staff have access to the tools, training and technologies they need
We need to evaluate and determine the tools and systems that we will use to build and host the service, as well as prepare for the operation and measurement of the service. We will also need to determine how to procure or build these tools and systems efficiently.
We must seek out and use modern methods such as Agile to ensure that the digital services we own and implement are innovative. We must ensure that our teams are consistently engaged through various opportunities to learn and participate in knowledge-sharing, and that we are successfully collaborating with both internal and external partners.
Checklist
- Ensure team members can achieve their full potential; allow them to participate in knowledge-sharing across the government and ensure they receive the training they require.
- Seek the best results for Canadians by encouraging evidence-based decision-making and extensive collaboration.
- Develop and maintain a core set of principles that prioritize commitment to users.
- Determine the team's decision making/approval processes, support practices (e.g., pairing, peer review), and working practices (e.g., stand-ups, sprint planning, etc.).
- Determine how the team will apply modern methods for project management and design (Agile, co-design, service design).
- Identify and maintain effective methods for engaging users, stakeholders, and advisors/experts.
- Be prepared to adapt to a frequently changing digital environment that is consistently evolving (e.g., emerging tech, emerging threats/security concerns, increased use of artificial intelligence, etc.), and act accordingly.
- Maintain a modern workplace with the professional development, IM-IT tools, and environment necessary for employees to do their jobs effectively.
- Check for risks or restrictions associated with the tools and avoid any contracts that will prevent you from changing/improving your service.
- Ensure that the system you build will be both sustainable and easily maintainable once the service is live.
- Determine technical choices and programming tools.
- Determine how you will get value for money spent on tools and how you will monitor your service.
- Determine how you will manage limits placed on the service.
- Document purchases and value for money, how you will monitor the service, support arrangements, and the specifics and reasons behind third-party decisions.
- Document tech stacks and development toolchain changes made during beta and why.
- Document how you are continuing to get value for money, how you will check the health of the service, the support arrangements that have been set up, and the specifics and reasons behind outsourcing decisions.
Implementation guides
Similar resources
7.2 Empower the team to make decisions throughout the design, build, and operation of the service
We must empower staff to share power and control over projects. This can involve assigning tasks, setting priorities, troubleshooting problems, and assessing issues. As a whole, this will mean balancing a recognition of talent with a frank assessment of results.
We must always be searching for ways to improve service delivery, through review of business processes, user testing, and commitment to best practices for service delivery when designing or redesigning digital services. We need to maintain a strong working relationship with experienced contracting and budgeting officers to facilitate a smooth contracting process.
Every service must have a person who holds the designated authority to make critical decisions (the product owner). Key responsibilities include articulating the project's vision, managing stakeholder and vendor relationships, efficiency, and accountability. This individual will also determine the features of the service.
Checklist
- Encourage learning, even if it is through well-intended efforts that result in mistakes.
- Remember to celebrate employees' hard work, including successes and failures.
- Focus on a strong governance approach in the workplace.
- Emphasize sustainability and innovation.
- Promote information-sharing to build better decision-making capacity.
- Have clear objectives, milestones, and defined roles to assist employees as they navigate interactions with users.
- Allow employees to grow into personal roles, and avoid the limits of hierarchy within team member/manager dynamics.
- Seek opportunities to connect and integrate with relevant existing services to simplify the experience for users and clients.
- Use up-to-date IM-IT management practices and tools.
- Ensure that the budget includes research and discovery activities, as well as prototyping processes.
- Maintain contracts that are structured to request regular deliverables and hold the vendors accountable accordingly.
- Ensure that contracts commit to the evaluation of open source solutions.
- Ensure contracts specify that the software and data to be generated by third parties will remain under our control, with the potential to use, reuse, and release to the public if appropriate and lawful.
- Maintain that contracts specify a warranty period, a transition of services period, and a transition-out plan.
- Use business intelligence and business analytics.
- Identify the policy and data constraints that may restrict the project.
- Build the digital platform once the potential for improvements and relevant considerations have been accounted for.
- A product owner is identified and all stakeholders agree that the owner has the authority necessary to assign tasks and make critical decisions regarding product features and technical implementation.
- The product owner has the necessary qualities to fill the role, including apt knowledge and seniority.
- The product owner has technical experience and a project management background, with strong knowledge of the procurement/development process.
- The product owner has a work plan complete with budget estimates and identified funding sources.
- The product owner's relationship with the contracting officer/development team is strong and able to endure potential challenges throughout the process.
- There is a plan to conduct regular user testing to assess how well the service meets users' needs.
- The particular data sources and data-capture methods are appropriately identified; a roadmap highlights performance analysis and responsibilities for the identification of actionable data insights.
- The service can be improved through iteration with the necessary resources and flexibility.
Similar resources
- Forbes - 6 ways to empower others to succeed
- 5. Structure budgets and contracts to support delivery (Digital Services Playbook (US))
- 1. Better services rather than new websites. Optimize business processes before designing technological solutions. (Do - Digital Design Playbook (ISED)) (internal to Government of Canada)
- 6. Assign one leader and hold that person accountable (Digital Services Playbook (US))
8. Be good data stewards
[TODO: Add/revise introductory text]
The Government of Canada is increasingly looking to utilise technology and automated systems to make, or assist in making, administrative decisions to improve service delivery. It is committed to doing so in a manner that is compatible with core administrative law principles such as transparency, accountability, legality and procedural fairness.
8.1 Collect data from users only once and reuse it wherever possible
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
Implementation guides
[TODO: Add/revise implementation guide items]
Reusable solutions
[TODO: Add/revise reusable solutions]
8.2 Ensure that data is collected and held in a secure way so that it can easily be reused by others to provide services
[TODO: Add/revise introductory text]
Checklist
- Ensure data is well-structured, intuitive, and in a format that is easy for others to integrate and reuse
- Design data to have clear and simple semantics to make common tasks easy. Rare tasks should still be possible but are not the focus. Avoid being overly general; optimize for specific use cases.
- Design data to be intuitive so that a semi-experienced user can be successful with minimal assistance from the documentation and programmers can easily integrate and reuse it.
- Design data with a consistent and precise naming convention. Use plain language and recognizable patterns and concepts, avoiding abbreviations where possible.
- Monitor the outcomes of Automated Decision Systems on an ongoing basis to safeguard against unintentional outcomes and to ensure compliance with institutional and program legislation, as well as the Directive on Automated Decision-Making (draft) (GC).
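To illustrate the points above about structure and naming, here is a minimal, hypothetical sketch (the record type, field names, and sample values are invented for illustration and are not drawn from any GC schema):

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class ServiceRequest:
    """A hypothetical record using plain-language, consistently named
    fields (no abbreviations such as 'svc_req_dt' or 'appl_nm')."""
    request_id: str
    applicant_name: str
    submitted_on: date          # becomes ISO 8601 once serialized
    preferred_language: str     # e.g. "en" or "fr" (ISO 639-1)

record = ServiceRequest("REQ-0001", "A. Example", date(2019, 7, 1), "en")

# Serialize to a predictable format that is easy for others to
# integrate and reuse.
payload = json.dumps(asdict(record), default=str, sort_keys=True)
print(payload)
```

Predictable names and standard formats (ISO dates, ISO language codes) are what make the data reusable by teams other than the one that collected it.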
Implementation guides
Reusable solutions
9. Design ethical services
[TODO: Add/revise introductory text]
9.1 Make sure that everyone receives fair treatment
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
- Complete an algorithmic impact assessment prior to the production of any Automated Decision System.
- Ensure that the algorithmic impact assessment remains up to date and accurately reflects the functionality of the Automated Decision System.
Implementation guides
[TODO: Add/revise implementation guide items]
Reusable solutions
[TODO: Add/revise reusable solutions]
9.2 Comply with ethical guidelines in the design and use of systems which automate decision making (such as the use of artificial intelligence)
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
-
Apply the following relevant requirements as determined by the algorithmic impact assessment:
-
Approval Requirement:
- Level I: None
- Level II: Government of Canada Enterprise Architecture Review Board
- Level III: Government of Canada Enterprise Architecture Review Board
-
Level IV:
- Government of Canada Enterprise Architecture Review Board
- Requires specific authority from Treasury Board
- Consult with the institution’s legal services unit to ensure that the use of the Automated Decision System is compliant with applicable legal requirements.
- Before going into production, develop the appropriate processes to ensure that training data is tested for unintended biases and other factors that may unfairly impact outcomes.
-
Notify affected individuals that the decision rendered will be undertaken in whole or in part by an Automated Decision System, as prescribed in the following:
- Level I: None
- Level II: Plain language notification listed on the program or service website.
-
Level III:
- Plain language notification listed on the program or service website.
- If the service involves an online application, the notice must be made at the time of the application.
-
Website must link to additional information where information about the system is provided, including:
- The role that the Automated Decision System has within the decision process,
- A description of the training data, or a link to the anonymized training data if this data is publicly available, and
- A description of the criteria used for making the decision, including relevant business rules.
-
Level IV:
- Plain language notification listed on the program or service website.
- If the service involves an online application, the notice must be made at the time of the application.
-
Website must link to additional information where information about the system is provided, including:
- The role that the Automated Decision System has within the decision process,
- A description of the training data, or a link to the anonymized training data if this data is publicly available, and
- A description of the criteria used for making the decision, including relevant business rules.
-
Provide a meaningful explanation to affected individuals of how and why the decision was made as prescribed in the following:
-
Explanation Requirement for Recommendation:
- Level I: None
- Level II: None
- Level III: Meaningful explanation provided upon request based on machine or human review.
-
Level IV:
- Meaningful explanation, including the variables in the decision, provided with the decision rendered.
- Explanation can be human or machine generated.
-
Explanation Requirement for Decisions:
- Level I: An explanation provided upon request based on machine or human review. This could include a Frequently Asked Questions section of a website.
- Level II: Meaningful explanation provided upon request based on machine or human review.
-
Level III:
- Meaningful explanation, including variables used in the decision, provided with the decision rendered.
- Explanation can be human or machine generated.
-
Level IV:
- Meaningful explanation, including variables used in the decision, provided with the decision rendered.
- Explanation can be human or machine generated.
-
Recourse:
- Provide affected individuals with information regarding options that are available to them for recourse to challenge the automated decision or recommendation.
-
Subject to the requirements prescribed in the following, ensure that contingency systems and/or processes are available should the Automated Decision System be unavailable for an extended period of time:
- Level I: None
- Level II: None
- Level III: Ensure that contingency plans and/or backup systems are available should the Automated Decision System be unavailable.
- Level IV: Ensure that contingency plans and/or backup systems are available should the Automated Decision System be unavailable.
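The level-based requirements above lend themselves to a simple lookup table. A hypothetical sketch (the dictionary name and the condensed wording are mine, not the directive's text):

```python
# A simplified lookup of the explanation-for-decisions requirements
# listed above, keyed by impact assessment level (I-IV). The wording is
# an illustrative condensation of the checklist, not the directive text.
EXPLANATION_FOR_DECISIONS = {
    "I": "Explanation provided upon request (e.g., an FAQ section).",
    "II": "Meaningful explanation provided upon request.",
    "III": "Meaningful explanation, including variables used, provided with the decision.",
    "IV": "Meaningful explanation, including variables used, provided with the decision.",
}

def explanation_requirement(level: str) -> str:
    """Return the explanation requirement for a given impact level."""
    try:
        return EXPLANATION_FOR_DECISIONS[level.upper()]
    except KeyError:
        raise ValueError(f"Unknown impact assessment level: {level!r}")

print(explanation_requirement("iii"))
```

Encoding the requirement tables this way lets a service team assert, in code review or CI, that its notices and explanations match its assessed impact level.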
Implementation guides
[TODO: Add/revise implementation guide items]
Reusable solutions
[TODO: Add/revise reusable solutions]
10. Collaborate widely
[TODO: Add/revise introductory text]
10.1 Create multidisciplinary teams with the range of skills needed to deliver a common goal
[TODO: Add/revise introductory text]
It’s important to have a strong multidisciplinary team in place, led by one person who is accountable and has the authority to make decisions based on the outcomes of research, testing and prototypes.
The team’s skills and focus need to evolve as the service is designed and developed. The team also needs to adapt its structure based on the needs of the service and the phase of work.
To be successful, build a team with:
- a broad mix of skills and roles from the start
- quick decision-making processes and the ability to change and adapt as the service evolves
- the resources and ability to deliver the product
(Digital Service Standard (Ontario))
Checklist
[TODO: Add/revise checklist items]
- Have a manager with the ability to make day-to-day decisions to improve the service (Digital Service Standard (Ontario))
- Make sure you have at least one user researcher working at least 3 days each week (Digital Service Standard (UK))
- Make sure there is separation of key roles in the team, meaning that nobody is performing multiple roles (Digital Service Standard (UK))
- Member(s) of the team have experience building popular, high-traffic digital services (Digital Services Playbook (US))
- Member(s) of the team have experience designing mobile and web applications (Digital Services Playbook (US))
- Member(s) of the team have experience using automated testing frameworks (Digital Services Playbook (US))
- Member(s) of the team have experience with modern development and operations (DevOps) techniques like continuous integration and continuous deployment (Digital Services Playbook (US))
- Member(s) of the team have experience securing digital services (Digital Services Playbook (US))
- Understand where gaps may emerge in the team and how to fill them (Digital Service Standard (UK))
- Plan to transfer knowledge and skills from contractors to permanent staff (Digital Service Standard (UK))
- Make sure there's a person on your team who's responsible for user research and usability tests (Digital Service Standard (UK))
- Make sure you'll have a team that can keep improving the service after it goes live (Digital Service Standard (UK))
- Make sure the team fully understands the service after it's gone live (Digital Service Standard (UK))
- Involve the maintenance team for the service early on in the project (Digital Service Standard (Ontario))
-
Retain the appropriate expert(s) to review the Automated Decision System, based on the Impact Assessment Level:
- Level I: None
-
Level II: At least one of:
- Qualified expert from a federal, provincial, territorial or municipal government institution
- Qualified members of faculty of a post-secondary institution
- Qualified researchers from a relevant non-governmental organization
- Contracted third-party vendor with a related specialization
- Publishing specifications of the Automated Decision System in a peer-reviewed journal
-
Level III: At least one of:
- Qualified expert from a federal, provincial, territorial or municipal government institution
- Qualified members of faculty of a post-secondary institution
- Qualified researchers from a relevant non-governmental organization
- Contracted third-party vendor with a related specialization
- Publishing specifications of the Automated Decision System in a peer-reviewed journal
-
Level IV: At least two of:
- Qualified experts from the National Research Council of Canada or Statistics Canada
- Qualified members of faculty of a post-secondary institution
- Qualified researchers from a relevant non-governmental organization
- Contracted third-party vendor with a related specialization
- OR: Publishing specifications of the Automated Decision System in a peer-reviewed journal
- Include all skillsets required for delivery, including for requirements, design, development, and operations
Implementation guides
[TODO: Add/revise implementation guide items]
Reusable solutions
[TODO: Add/revise reusable solutions]
Similar resources
10.2 Share and collaborate in the open
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
- Offer users a mechanism to report bugs and issues, and be responsive to these reports (Digital Services Playbook (US))
- Link to the work of others
- Document how you accept contributions and comments on the code (Digital Service Standard (UK))
- Document how you're handling updates and bug fixes to the code (Digital Service Standard (UK))
- Document the code you've not made open and why (Digital Service Standard (UK))
- Determine how a team in another department can reuse your code (Digital Service Standard (UK))
- Identify capabilities that are common to the GC enterprise and can be shared and reused
- Decouple Master Data from applications and host within the appropriate system of record
- Make systems of record authoritative central sources
- Assign data custodians to ensure data is correct, consistent, and complete
- Design data resiliency in accordance with GC policies and standards
- Use Master Data Management to provide a single point of reference for appropriate stakeholders
- Inform the GC EARB about departmental investments and innovations
Implementation guides
[TODO: Add/revise implementation guide items]
Reusable solutions
[TODO: Add/revise reusable solutions]
Similar resources
10.3 Identify and create partnerships which help deliver value to users
[TODO: Add/revise introductory text]
Checklist
[TODO: Add/revise checklist items]
- Develop open and innovative partnerships. Recognize that an organization can't have all the best ideas; create partnerships and collaborate.
-
(The Good Collaboration Toolkit: Checklist (The Good Project)):
- Do you discuss the purpose of the collaboration? Is there agreement about the vision among collaborators?
- Is there a clear leadership/rotation of leadership for the collaboration?
- Is there a documented scope of work, with an associated timeline?
- Do you discuss goals and accountability to achieve these goals?
- Are people clear about their roles in the collaboration?
- Do you discuss the collaborators’ own interests, needs and values?
- Do you discuss methods of communication and decision-making?
- Are all of the voices of the collaborators being heard?
- Are all collaborators invested in the work?
- Is the work getting done?
- Are there obstacles that need to be discussed?
- Are supports communicated and shared?
- Is there agreement about a “product” or outcome for the collaboration?
Implementation guides
[TODO: Add/revise implementation guide items]
-
The Good Collaboration Toolkit (The Good Project)
- Checklist (The Good Project)
- Excellently Executed (The Good Project)
- Leadership Driven (The Good Project)
- Engaging for Participants (The Good Project)
- Mission Focused (The Good Project)
- Ethically Oriented (The Good Project)
- Nurtured Continuously (The Good Project)
- Time Well Spent (The Good Project)
- Solution Inspired (The Good Project)
Reusable solutions
[TODO: Add/revise reusable solutions]