-
Links
- Australian Government: Style Manual
- Victorian Government: Make content accessible – digital guide
- Western Australia Government: Accessibility and Inclusivity Guidelines
- South Australian Government: Accessibility toolkit
- Victorian Government: Communicating data with colour
- Victorian Government: Branding guidelines of Victorian Government 2017
- NSW Government: Accessibility in NSW
- NSW Government: Accessibility matters
- Western Australia Government: Digital Services Policy Framework
- gov.uk: Dos and don’ts on designing for accessibility
- Centre for Accessibility: How do I check if my work is accessible?
- W3C: How to meet WCAG (Quick reference)
- W3C: Developing for Web Accessibility
- W3C: Web accessibility evaluation tools list
- Agency for Digital Government: Digital Inclusion
- NSW Government: Your responsibilities
-
-
-
Provide flexibility and choice for how users engage with your digital service
- Incorporate responsive design: Make sure your service has a responsive design that allows for compatibility across various devices and screen sizes, accommodating users who access services through different platforms.
- Incorporate adaptable user interfaces: Design services that can be customised and adapted to allow personalised experiences. This may include flexible layouts, themes that support enhanced day and night vision and tailoring the user interface to meet device-specific considerations. Include preferences for written, audio and visual information and other settings that enhance user comfort and accessibility.
- Be considerate of time: Implement save and resume functionality that allows users to complete tasks immediately, or later. This is beneficial for processes that may require multiple steps to complete or information gathering. Disperse information gradually to prevent overwhelm. Be mindful to provide enough time to complete tasks and avoid time constraints that may pose challenges to individuals with cognitive or motor disabilities.
-
Create seamless experiences across service delivery channels
- Support users to move between service channels with ease: Consider the support users need for a complete service experience and maintain non-digital channels for those who need it. Map user experiences to identify pain points and opportunities and ensure a consistent look and feel across all channels, including websites, mobile apps and in-person interactions.
- Enable real-time data synchronisation: Where possible, use real-time data synchronising across all service channels. This prevents inconsistencies and lets users access the most up-to-date information regardless of where or how they interact with the service.
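Cross-channel synchronisation can be pictured as a merge of per-user records where the most recently updated value wins. The sketch below is illustrative only (the record shape and function name are assumptions, not a prescribed design); a production service would rely on its platform's replication mechanisms rather than hand-rolled merging.

```python
from datetime import datetime, timezone

def sync_records(channel_a: dict, channel_b: dict) -> dict:
    """Merge per-user records from two service channels.

    Last-write-wins: for each record key, keep whichever channel holds
    the most recently updated value, so every channel converges on the
    same up-to-date view. Illustrative sketch only.
    """
    merged = {}
    for key in channel_a.keys() | channel_b.keys():
        a, b = channel_a.get(key), channel_b.get(key)
        if a is None:
            merged[key] = b
        elif b is None:
            merged[key] = a
        else:
            merged[key] = a if a["updated_at"] >= b["updated_at"] else b
    return merged

# Example: a newer web-form update overrides an older phone-channel record.
web = {"postal_address": {"value": "1 New St",
                          "updated_at": datetime(2024, 5, 2, tzinfo=timezone.utc)}}
phone = {"postal_address": {"value": "9 Old Rd",
                            "updated_at": datetime(2024, 5, 1, tzinfo=timezone.utc)}}
print(sync_records(web, phone)["postal_address"]["value"])  # 1 New St
```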
-
-
-
Guidance to provide flexibility and choice
-
Provide flexibility and choice for users to engage with the digital service
People typically use government systems to complete tasks. When designing, consider how to make it easier for the user by taking into account time needs and preferences for interacting with the service online:
- Conduct user research and user testing.
- Use the Digital Inclusion resources as a starting point to consider how to assist the users you are designing for.
- Ensure compatibility across various devices and screen sizes to accommodate users accessing services through different platforms.
- Conduct usability testing across different devices and put mobile-first design principles in place.
- Regularly update and test the service for new devices.
-
Incorporate adaptable user interfaces
Design services that can be customised and adapted to give users a personalised experience:
- Develop user personas to understand diverse needs.
- Create theme options for users to select, with options to customise font sizes and contrast.
- Create flexible layouts that tailor the user interface to meet device-specific considerations.
- Support enhanced day and night vision.
- Include preferences for written, audio and visual information and other settings that enhance user comfort and accessibility.
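Persisted user preferences are the mechanism behind most of the options above. A minimal sketch of a preference model follows; the field names and the `css_font_size` helper are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class DisplayPreferences:
    """Per-user presentation settings an adaptable interface might persist.

    Field names are illustrative, not a prescribed schema.
    """
    theme: str = "light"          # e.g. "light", "dark", "high-contrast"
    font_scale: float = 1.0       # multiplier applied to the base font size
    content_format: str = "text"  # "text", "audio" or "video"

    def css_font_size(self, base_px: int = 16) -> str:
        # The rendered font size follows the user's chosen scale.
        return f"{round(base_px * self.font_scale)}px"

prefs = DisplayPreferences(theme="dark", font_scale=1.5)
print(prefs.css_font_size())  # 24px
```

Storing settings like these against the user's account, rather than a single device, lets the personalised experience follow the user across platforms.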
-
Be considerate of time
Allow users to complete tasks immediately or later when creating processes that require multiple steps or information gathering:
- Gradually disperse information to prevent overwhelm.
- Make sure there is enough time to complete tasks and avoid time constraints that may challenge individuals with cognitive or motor disabilities.
- Introduce auto-save features for ongoing tasks.
- Break down complex processes into smaller, manageable steps.
- Provide clear instructions and time estimations for each step.
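Save-and-resume functionality underpins the steps above. The sketch below persists a user's partial progress so a multi-step task can be finished later; the storage layout and function names are illustrative assumptions, not a prescribed implementation.

```python
import json
import tempfile
from pathlib import Path

def save_progress(store: Path, user_id: str, step: int, answers: dict) -> None:
    """Persist a partially completed multi-step task so the user can resume later."""
    store.mkdir(parents=True, exist_ok=True)
    payload = {"step": step, "answers": answers}
    (store / f"{user_id}.json").write_text(json.dumps(payload))

def resume_progress(store: Path, user_id: str) -> dict:
    """Return saved progress, or a fresh state starting at step 1."""
    path = store / f"{user_id}.json"
    if path.exists():
        return json.loads(path.read_text())
    return {"step": 1, "answers": {}}

# Example: a user leaves after step 3 and picks up where they stopped.
store = Path(tempfile.mkdtemp())
save_progress(store, "user-1", step=3, answers={"name": "A. Citizen"})
state = resume_progress(store, "user-1")
print(state["step"])  # 3
```

Calling `save_progress` automatically on each step change gives an auto-save behaviour, so no work is lost if the session times out.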
-
Support users to move between service channels with ease
Provide a complete service experience by maintaining non-digital channels for those who need them:
- Map user experiences to identify pain points and opportunities.
- Provide consistent look and feel across all channels, including websites, mobile apps and in-person interactions.
- Develop a unified style guide for all service channels.
- Conduct regular user feedback sessions.
- Provide training for staff on all service channels.
-
Document your findings
Document your findings and recommendations on how to apply criterion 5:
- Ensure your proposal supports your decisions and demonstrates that you have considered and applied flexibility in your service design.
- Use the Digital Capability Assessment Process (DCAP) template to report on meeting the criterion.
- Make sure the data is collected and documented in a centralised knowledge repository.
-
Links
- User research | digital.gov.au
- Delivering for all people and business | Data and Digital
- Queensland Government: Consistent User Experience Standard v3.0
- NSW Government: Towards a customer-centric government
- Agency for Digital Government: Digital Inclusion
- OECD Guidelines for Citizen Participation Processes | OECD
-
-
-
Define clear objectives and goals, based on user needs
- Establish a performance monitoring framework: Use a performance monitoring framework to understand the digital platform’s real-world impact and how users interact with digital services. The framework should be established from an end-user perspective, not from the perspective of an agency’s infrastructure. Use clear objectives and goals framed in the context of what users need and expect from the digital service.
-
Choose relevant metrics that align with organisational goals, meet Digital Performance Standard criteria and capture the user experience
- Key performance indicators: Apply measures to achieve the outcome as set out in the Digital Performance Standard and to support your organisational goals. They should be specific and measurable and further your agency’s understanding of how users interact with your agency on digital platforms. Metrics must be meaningful: they are crucial to understanding and improving the user experience and to the overall success of the framework.
- Apply a best-practice approach: Implement a performance monitoring approach that is comprehensive and focuses on the end-user experience. Where best practice cannot be achieved or does not line up with your agency’s other metrics, strive to introduce best practice concepts over time.
-
Articulate how you will implement the monitoring framework
- Leverage analytical tools: Reliable digital analytics tools may need to be implemented to collect and analyse performance data. When designing a framework, consider what data sources you require for successful implementation and consider what can be readily deployed within your ICT environment.
-
Develop processes for continuous improvement based on insights
- Continuous improvement of the user experience: Integrate processes for continuous improvement with a focus on user-centric benefits. Data and feedback should be regularly analysed to find improvement opportunities to enhance overall user experience.
- Use a baseline to measure performance: Establish a baseline for your digital service performance with data gathered from your digital service. A baseline can identify areas to improve a digital service in line with user expectations.
- Share insights and learnings: Share your insights and learnings with the DTA and other agencies. A collaborative approach to digital experience will support whole-of-government standardisation of digital services, build digital and ICT capabilities and deliver a consistent customer experience. The DTA will support agencies by incorporating insights and best practices in its guidance documents and toolkit.
-
-
-
Guidance to implement a monitoring framework
-
Define clear objectives
To develop your monitoring framework, start by setting clear objectives and goals for your digital service based on user needs:
- Conduct user research to identify the needs, expectations and goals of the users of the digital service.
- Use service design methods to further interpret user research and document the user perspective.
- Define the desired outcomes and impacts of your service for your users based on the user perspective.
-
Choose relevant metrics
Choose metrics that align with organisational goals and meet Performance Standard criteria:
- Consider the user experience, service benefits and individual agency objectives.
- Create key performance indicators (KPIs) that are meaningful for your service and your agency.
- Make sure the KPIs can be monitored over time to enable continuous improvements.
- Check your KPIs align with criteria 2, 3 and 4 of the Performance Standard.
- Choose metrics to measure against your KPIs — for example, user satisfaction, completion rates and service availability.
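The example metrics above can be computed simply. The sketch below shows completion rate and user satisfaction; the figures are illustrative, not real service data.

```python
def completion_rate(started: int, completed: int) -> float:
    """Share of users who finished the task they started, as a percentage."""
    return round(100 * completed / started, 1) if started else 0.0

def average_satisfaction(scores: list[int]) -> float:
    """Mean of post-transaction survey scores (e.g. on a 1-5 scale)."""
    return round(sum(scores) / len(scores), 2) if scores else 0.0

# Illustrative monthly figures, not real service data.
print(completion_rate(started=1200, completed=1050))  # 87.5
print(average_satisfaction([5, 4, 4, 3, 5]))          # 4.2
```

Tracking these figures per reporting period is what makes the KPIs monitorable over time.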
-
Choose monitoring tools and methods
Choose appropriate monitoring tools and methods to collect, store, and analyse performance data:
- Research different tools and methods that can capture, store, and analyse data related to your KPIs, referring to guidance for criteria 2, 3 and 4.
- See what tools and methods other agencies use to understand best practices.
- Collate your research and create a shortlist of tools and methods.
- Make sure tools and methods meet the legal and ethical requirements for data collection and analysis.
- Document your choices and rationale to make an informed decision, for example by conducting a cost-benefit analysis, considering factors such as:
- alignment with your chosen performance indicators
- ease of use
- adaptability, compatibility and ease of integration
- ability to produce comprehensive reports
- scalability
- cost
- security and reliability.
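One way to document that rationale is a weighted scoring matrix over the factors above. The weights, criteria names and tool scores below are illustrative assumptions, not a prescribed methodology.

```python
# Criterion weights sum to 1; both weights and scores are illustrative.
WEIGHTS = {
    "kpi_alignment": 0.3,
    "ease_of_use": 0.2,
    "integration": 0.2,
    "cost": 0.15,
    "security": 0.15,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-5) into one comparable figure."""
    return round(sum(WEIGHTS[c] * scores[c] for c in WEIGHTS), 2)

tools = {
    "Tool A": {"kpi_alignment": 5, "ease_of_use": 3, "integration": 4, "cost": 2, "security": 4},
    "Tool B": {"kpi_alignment": 4, "ease_of_use": 4, "integration": 4, "cost": 4, "security": 4},
}
shortlist = sorted(tools, key=lambda t: weighted_score(tools[t]), reverse=True)
print(shortlist[0])  # Tool B
```

Recording the weights alongside the scores keeps the cost-benefit reasoning auditable for later reviews.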
-
-
Implementing the monitoring framework
Articulate the monitoring framework in a document that covers data collection reporting:
- Include information such as data sources, methods, frequency, formats, and the roles and responsibilities of staff and teams.
- Establishing clear rules and maintaining thorough documentation to detail quality assurance mechanisms will provide clarity for staff involved in data processing.
- Data validation (the process of ensuring data is checked for errors and anomalies), data verification (ensuring data is accurate and consistent against source or reference points) and data cleansing (identifying and correcting inaccurate data) procedures will ensure the collection and use of high-quality data.
- Plan how data and insights will be presented and communicated, for example a dashboard, spreadsheet or a report.
- Explain the process for reviewing and acting on the performance data, include the key decision makers, responsible persons and key stakeholders.
- Review and update the document regularly based on the effectiveness of the monitoring framework, new information and changing needs or priorities.
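The validation and cleansing procedures described above can be sketched in a few lines. The record fields and plausibility limits below are illustrative assumptions; real services would define these rules in their data quality documentation.

```python
def validate_record(record: dict) -> list[str]:
    """Data validation: flag errors and anomalies before a record is used."""
    errors = []
    if not record.get("user_id"):
        errors.append("missing user_id")
    duration = record.get("task_seconds")
    if duration is None or not (0 < duration < 3600):
        errors.append("task_seconds out of plausible range")
    return errors

def cleanse(records: list[dict]) -> list[dict]:
    """Data cleansing: keep only records that pass validation."""
    return [r for r in records if not validate_record(r)]

raw = [
    {"user_id": "u1", "task_seconds": 240},
    {"user_id": "", "task_seconds": 180},   # anomaly: no user id
    {"user_id": "u3", "task_seconds": -5},  # anomaly: negative duration
]
print(len(cleanse(raw)))  # 1
```

Logging, rather than silently dropping, the rejected records supports the verification step of checking data against its source.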
-
Set achievable targets or benchmarks
Benchmarks are reference points for comparing your performance with other services, agencies and sectors. Use them to set realistic, achievable targets for each KPI based on user expectations and best practice.
Setting benchmarks and targets:
- Review the benchmarks and targets in similar services within your agency, government, or the private sector.
- Use the benchmarks to inform the targets for each KPI. For example, you could examine the average task time or drop-out rate for completing a similar task in different services — these targets should reflect user expectations, best practices and the strategic goals of your agency.
- Check your targets and benchmarks are specific, measurable, attainable, relevant and time bound (SMART). Targets and benchmarks should reflect the available resources of your service.
- Document the choices and rationale for setting these targets or benchmarks, and communicate them to relevant stakeholders, senior management or other agencies.
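The steps above can be sketched as deriving a target from comparable services' benchmarks. The 90% stretch factor and the sample figures are illustrative assumptions; agencies should choose factors that reflect their own resources and strategic goals.

```python
from statistics import mean

def set_target(benchmarks: list[float], stretch: float = 0.9) -> float:
    """Derive a task-time target from comparable services' benchmarks.

    Aiming for 90% of the benchmark average is an illustrative policy,
    not a prescribed rule.
    """
    return round(mean(benchmarks) * stretch, 1)

# Average task completion time (seconds) observed in three similar services.
print(set_target([300.0, 280.0, 320.0]))  # 270.0
```

Documenting the benchmark sources and the chosen stretch factor is what makes the resulting target specific, measurable and reviewable (the SMART check above).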