-
Criterion 7 – Do no harm
Criterion requirements
To successfully meet this criterion, agencies will need to:
- Protect users’ digital rights.
- Understand privacy impacts.
- Understand the limits of data.
For existing services, this means that agencies must have mechanisms in place to protect users’ digital rights and understand the privacy impacts of the services and the limits of data.
Checklist items
The digital rights of users are protected.
Best practice approaches:
- Consider how the service might impact the digital rights of users. Identify users facing greater personal risks and make sure they’re provided with the means to access, communicate with and contest the service transparently or anonymously. If rights are breached, move quickly to implement changes that prevent future harm.
- Consider the implications of the service beyond its immediate impacts. Workshop environmental, economic or social impacts and undertake scenario planning to explore unforeseen issues and opportunities.
The service’s privacy impacts are understood and appropriately responded to.
Best practice approaches:
- Undertake regular Privacy Impact Assessments to capture issues. Mitigate unwarranted and unauthorised surveillance, data collection and malicious data breaches and share these actions with users.
- Where required, seek and obtain informed consent from users prior to collecting, storing or disclosing any of their data. Consider opt-out options and ensure the service requires as little user data as possible.
- Communicate how data will be used or may be used in the future at the time of consent. This includes how it may be shared with other people or between services and secondary or less obvious uses.
The limits of data collected and/or used by the service are understood.
Best practice approaches:
- Data should only be collected and used for the stated purpose that the user agrees to. Account for how data models, datasets and algorithms may produce discriminatory results and provide transparent detail to users on how decisions and calculations are made. Before sharing data, apply the DATA Scheme’s Data Sharing Principles to help assess whether it would be safe to do so.
- Quantitative data, which is numeric or measurable, helps us understand what is happening on a service. Qualitative data, which is descriptive or observable, helps us understand why. Use both to fully understand the story and confirm that any correlation reflects a demonstrable cause. Do this before making important decisions.
Optional
- Describe how the digital service complies with this criterion, referencing best practice approaches deployed where possible.
-
Criterion 8 – Innovate with purpose
Criterion requirements
To successfully meet this criterion, agencies will need to:
- Follow guidance on critical and emerging technologies.
- Maintain interoperability in the face of new technology.
- Track adoption of new technology.
For existing services, this means that agencies must demonstrate that they have adopted emerging technologies only when there is an inherent benefit, maintain interoperability where relevant, and have implemented measures to monitor for changes relating to critical and emerging technologies that may impact the service.
Checklist items
There are processes in place to monitor and implement guidance for critical and emerging technologies for the service.
Best practice approaches:
- Stay current: technology can advance at a staggering pace. If available, refer to government guidance on risks, opportunities and developments for up-to-date advice on critical or emerging technology that may impact the service.
- Regularly check the Australian Government Architecture and follow its published guidance for the adoption of critical and emerging technologies.
There are processes in place to maintain the interoperability of the service in the face of new technology.
Best practice approaches:
- Consider whether new technologies will impact the service’s interoperability. Plan for their introduction or implementation in partnership with other affected agencies to prevent further divergence or disconnection.
- Assess the agency’s preparedness for new technologies. Consider the resources and training the agency and team will require to adopt a new technology.
There are processes in place to track adoption of new technology.
Best practice approaches:
- Prior to implementing a new technology, determine whether it aligns with the clear intent of the service and whether it risks leaving certain types of users behind. If implemented, monitor how users respond to the new technology and respond to any accessibility or usability concerns.
Optional
- Describe how the digital service complies with this criterion, referencing best practice approaches deployed where possible.
-
Criterion 9 – Monitor your service
Criterion requirements
To successfully meet this criterion, agencies will need to:
- Establish a baseline for the service.
- Identify the right performance indicators.
- Measure, report and improve according to strategies.
For existing services, this means that agencies must demonstrate that there is continuous monitoring and measurement of services to ensure they operate smoothly, remain secure and cater for users’ evolving needs.
Checklist items
There is an established performance baseline for the service.
Best practice approaches:
- Determine the current state by identifying and reviewing existing metrics for the service. Use this as a yardstick to measure progress.
- Compare the service to similar services or existing standards to identify areas of improvement. Seek out best practices of similar and well-performing services to consider if they can be adopted.
Appropriate performance indicators have been identified for the service.
Best practice approaches:
- Use metrics that accurately capture the service’s ability to deliver the outcomes that users expect. These might include adherence to design standards and privacy legislation, site/app performance, security benchmarks or tasks completed by users.
The service is measured, reported against and improved according to strategies.
Best practice approaches:
- Make sure the service meets the Data and Digital Government Strategy and consider how information collected and reported could improve the service in line with the Strategy’s implementation plan. All digital and ICT-enabled investment proposals must define their purpose, outcomes and methods for measuring, monitoring and optimising them. Find out more in the Benefits Management Policy.
Optional
- Describe how the digital service complies with this criterion, referencing best practice approaches deployed where possible.
-
Criterion 10 – Keep it relevant
Criterion requirements
To successfully meet this criterion, agencies will need to:
- Improve the service across its life.
- Schedule regular assessments.
- Communicate service upgrades.
For existing services, this means that agencies must seek to continuously improve their services, schedule regular assessments and communicate service upgrades.
Checklist items
There are mechanisms in place to make improvements to the service across its life.
Best practice approaches:
- Increase people’s use of the service by continuously optimising performance, enhancing security, introducing relevant features, addressing bugs and increasing compatibility. Use metrics identified in Criterion 9 (‘Monitor your service’) to reveal the biggest opportunities for impact and ground improvements in evidence. Provide ongoing training and materials for staff to support change.
Regular assessments are scheduled to review the performance and experience of the service over time.
Best practice approaches:
- Define the goals and scope of the assessment then observe performance and experience over time. Performance metrics might include load times, responsiveness or bottlenecks. Experience metrics might include entry/exit points, dwell time or task abandonment. Ongoing monitoring should be part of business-as-usual processes and a detailed review part of regular service evaluation.
Service upgrades are appropriately communicated to users.
Best practice approaches:
- Develop an iterative communication plan for how, when and through what channels updates and findings will be shared with users. When writing content, show how users’ feedback informed the actions that have been taken. Highlight key achievements or milestones reached and use real-life stories to demonstrate how users shaped change.
Optional
- Describe how the digital service complies with this criterion, referencing best practice approaches deployed where possible.
-
DX Policy compliance, reporting and exemption information for digital government services.
-
Scope and device lifecycle
-
Comply with the policy
-
Develop a business case for change
Be outcomes-focused: Consider what problems your service needs to solve and why they are important. Share your early-stage assumptions, gather diverse perspectives from stakeholders and take advantage of pre-existing data and resources. Clearly state the risks of action and inaction, who might be impacted, potential barriers to success and your knowledge gaps.
Frame the problem: Form a simple, clear problem statement from the evidence that’s already available. Use it as the basis of further research and validation, and to identify the users you need to engage with.
Don’t jump to solutions: Don’t anticipate a technical or design solution before validating the problems you’ve identified. Evaluate the rest of the Standard’s criteria to understand what else could drive the problem. Consider whether a new solution is required or if an existing platform or service might achieve the best outcome.
Align stakeholders to a vision: Engage key stakeholders to establish a shared vision for success. Ensure clear expectations are set for the project and everyone knows why change is necessary.