-
-
-
Identify the appropriate measure to monitor the satisfaction rates of the digital service
- Use a methodology that suits the digital service: Customer satisfaction is a widely implemented, industry-standard measure of digital service quality. There are many quantitative methods to measure user sentiment. When designing a methodology, choose mechanisms that suit the digital service and its users. For example, users can quickly and easily choose ‘thumbs up’ or ‘thumbs down’ options, which results in high response rates. A feedback form with an open text field requires more effort from the user, but provides more specific insight into why a customer has given that rating.
-
Give users the ability to rate their satisfaction or dissatisfaction
- Design convenient feedback mechanisms: Provide feedback mechanisms that are easy and accessible for users and encourage engagement. The higher the response rate, the closer the data will be to the true sentiment of users. Accessible and prominent feedback channels across every webpage and digital service will yield more valuable insights to enhance the user experience.
-
Continuously monitor customer satisfaction of the digital service and act to improve outcomes
- Listen to and understand user needs: By capturing and tracking customer satisfaction, agencies can learn how users feel about the quality of their digital services and which areas need improvement. This includes understanding user expectations and what they need from government digital services. Positive customer satisfaction indicates well-designed, accessible and inclusive digital services.
-
-
-
Guidance to measure if your digital service is meeting customer needs
-
Identify appropriate measures to monitor user satisfaction
Choose an appropriate method and tool to monitor levels of user satisfaction:
- Consider the outcomes and goals of your digital service, and the needs and expectations of your users. See also: Define clear objectives.
- When choosing a tool, consider how to minimise the burden on your users, along with the frequency, timing, and sample size of your data collection.
- The DTA recommends a simple thumbs up/down sentiment tool on each page of the digital service, followed by a free-text feedback dialogue; a minimal sketch of such a widget follows this list. See also: Choose monitoring tools and methods.
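The sketch below illustrates one way such a page-level sentiment tool could work in the browser: a low-effort thumbs up/down rating is captured first, and the user is then invited, but not required, to explain the rating in free text. The /api/feedback endpoint, element IDs and field names are illustrative assumptions, not part of any DTA-provided tool.

```typescript
// Minimal sketch of a page-level sentiment widget with an optional free-text
// follow-up. The /api/feedback endpoint, element IDs and field names are
// illustrative assumptions, not part of any DTA-provided tool.

type Sentiment = 'thumbs_up' | 'thumbs_down';

interface FeedbackEvent {
  page: string;        // page the rating applies to
  sentiment: Sentiment;
  comment?: string;    // optional free-text follow-up
  timestamp: string;   // ISO 8601
}

async function submitFeedback(event: FeedbackEvent): Promise<void> {
  // Send the rating to a hypothetical collection endpoint.
  await fetch('/api/feedback', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event),
  });
}

function initSentimentWidget(container: HTMLElement): void {
  const record = (sentiment: Sentiment) => {
    const event: FeedbackEvent = {
      page: window.location.pathname,
      sentiment,
      timestamp: new Date().toISOString(),
    };
    // Capture the low-effort rating immediately, so even users who leave
    // no comment still contribute to the satisfaction measure.
    void submitFeedback(event);
    // Then invite, but do not require, a short explanation of the rating.
    const comment = window.prompt('Would you like to tell us why? (optional)');
    if (comment) {
      void submitFeedback({ ...event, comment });
    }
  };

  container.querySelector('#thumbs-up')?.addEventListener('click', () => record('thumbs_up'));
  container.querySelector('#thumbs-down')?.addEventListener('click', () => record('thumbs_down'));
}
```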
-
Give users the ability to rate satisfaction or dissatisfaction
Configure your selected tool so users can easily provide feedback about the service:
- Craft precise and straightforward questions or statements to gauge user satisfaction.
- Validate questions and the tool’s functionality by testing with a representative sample of users to ensure reliability and accuracy.
- Make sure customer feedback tools are available on each page and at the end of a digital transaction; an illustrative prompt configuration follows this list.
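As a minimal sketch of the configuration described in the list above, the example below defines where satisfaction prompts appear (every page and at the end of a transaction) and applies a simple wording check before prompts are tested with a representative sample of users. The question wording, trigger names and wording rules are assumptions for illustration only, not prescribed values.

```typescript
// Illustrative configuration for where the feedback tool appears and what it
// asks. Question wording and trigger names are assumptions for the sketch.

type Trigger = 'every_page' | 'transaction_complete';

interface SatisfactionPrompt {
  trigger: Trigger;
  question: string;          // keep wording short, precise and neutral
  followUpQuestion?: string; // optional free-text follow-up
}

const prompts: SatisfactionPrompt[] = [
  {
    trigger: 'every_page',
    question: 'Was this page helpful?',
  },
  {
    trigger: 'transaction_complete',
    question: 'Overall, how satisfied are you with this service?',
    followUpQuestion: 'What is the main reason for your rating?',
  },
];

// A lightweight check that each prompt meets basic wording rules before it is
// put in front of a representative sample of users for testing.
function validatePrompt(prompt: SatisfactionPrompt): string[] {
  const issues: string[] = [];
  if (prompt.question.trim().length === 0) issues.push('Question is empty.');
  if (prompt.question.length > 80) issues.push('Question is too long to scan quickly.');
  if (!prompt.question.endsWith('?')) issues.push('Question should be phrased as a question.');
  return issues;
}

prompts.forEach((p) => console.log(p.trigger, validatePrompt(p)));
```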
-
Continuously monitor customer satisfaction and improve the service
Regularly assess customer satisfaction with your digital service and make improvements:
- Implement a tool to measure user satisfaction in your digital service and monitor the results regularly; a minimal aggregation sketch follows this list.
- Use data visualisation and reporting tools to communicate the findings and trends to your stakeholders.
- Evaluate the impact of your user satisfaction measurement on your digital service performance and use the insights to identify areas for continuous improvement or innovation. See also Develop processes for continuous improvement.
- Test and iterate your digital service based on user feedback and satisfaction data.
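The following sketch shows one way raw ratings could be turned into a satisfaction measure that can be monitored over time and fed into visualisation and reporting tools. Field names carry over from the earlier sketches and remain assumptions; the weekly grouping is an illustrative choice, not a prescribed reporting period.

```typescript
// Minimal sketch of turning raw ratings into a satisfaction measure that can
// be monitored over time. Field names are assumptions from earlier sketches.

interface RatingRecord {
  timestamp: string;                      // ISO 8601
  sentiment: 'thumbs_up' | 'thumbs_down';
}

interface WeeklySatisfaction {
  weekStarting: string;     // ISO date of the Monday that starts the week
  responses: number;
  satisfactionRate: number; // share of thumbs_up ratings, 0..1
}

// Normalise a timestamp to the Monday that starts its week (UTC).
function weekStart(iso: string): string {
  const d = new Date(iso);
  const daysSinceMonday = (d.getUTCDay() + 6) % 7;
  d.setUTCDate(d.getUTCDate() - daysSinceMonday);
  return d.toISOString().slice(0, 10);
}

function summariseByWeek(records: RatingRecord[]): WeeklySatisfaction[] {
  const buckets = new Map<string, { up: number; total: number }>();
  for (const r of records) {
    const key = weekStart(r.timestamp);
    const bucket = buckets.get(key) ?? { up: 0, total: 0 };
    bucket.total += 1;
    if (r.sentiment === 'thumbs_up') bucket.up += 1;
    buckets.set(key, bucket);
  }
  return [...buckets.entries()]
    .sort(([a], [b]) => a.localeCompare(b))
    .map(([weekStarting, { up, total }]) => ({
      weekStarting,
      responses: total,
      satisfactionRate: total === 0 ? 0 : up / total,
    }));
}
```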
-
-
-
Establish internal processes to support performance data analysis and reporting
- Collect and report meaningful data: Make sure the performance monitoring frameworks and data analytics tools are fit for purpose and provide meaningful reporting data. While there are numerous metrics, calculations and methods to collect data, your choice should prioritise ‘real-time’ user-centric approaches and align with the criteria in the Digital Performance Standard. The data gathered should reflect the true user experience to gain valuable insights. Agencies are required to report ongoing performance data for digital services delivered via IOF-tracked ICT investments once the service is implemented.
-
Report on progress during Investment Oversight Framework states and post-implementation performance
- Use data to identify the benefits: Use the data collected to identify service benefits. Benefits can include uncovering service inefficiencies by analysing digital service performance data, unearthing deeper insights into users’ experience, segmenting data by user group to better understand their needs, and working in partnership with users to develop user-based solutions. Qualitative metrics, complementing the quantitative, can add a rich layer of information on the underlying factors influencing the user experience.
-
Analyse your performance results and act on any improvements to the digital services
- Use data-driven insights to continuously improve: Look for ways to continuously improve the digital service and the quality of the data. Use automated reporting tools where possible to streamline processes and reduce manual effort. This allows agencies to dedicate more resources to analysing the data; a sketch of one such automated check follows.
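As an illustration of automating the routine part of reporting so effort can go into analysis, the sketch below flags weeks whose satisfaction results warrant closer review. It builds on the weekly summary shape from the previous sketch; the threshold values are assumptions for the sketch, not recommended targets.

```typescript
// Illustrative automated check that surfaces weeks needing analyst attention,
// so routine collation is automated and effort goes into analysis.
// Threshold values are assumptions for the sketch, not recommended targets.

interface WeeklySatisfaction {
  weekStarting: string;
  responses: number;
  satisfactionRate: number; // 0..1
}

function flagWeeksForReview(
  weeks: WeeklySatisfaction[],
  minRate = 0.8,        // assumed floor for acceptable satisfaction
  maxWeeklyDrop = 0.05, // assumed week-on-week drop worth investigating
): string[] {
  const flagged: string[] = [];
  weeks.forEach((week, i) => {
    if (week.satisfactionRate < minRate) {
      flagged.push(`${week.weekStarting}: satisfaction below ${minRate * 100}%`);
    }
    const previous = weeks[i - 1];
    if (previous && previous.satisfactionRate - week.satisfactionRate > maxWeeklyDrop) {
      flagged.push(
        `${week.weekStarting}: satisfaction fell more than ${maxWeeklyDrop * 100} percentage points week on week`,
      );
    }
  });
  return flagged;
}
```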
-
-
-
Establish processes to support performance data analysis and reporting
Establish internal processes to support performance data analysis and reporting:
- Set up a dedicated team responsible for collecting, analysing and interpreting your digital service data. The team should be equipped with tools and software to make sure data reporting is accurate and comprehensive.
- Select monitoring tools with built-in analysis and reporting features and that easily integrate with existing software applications. This will make analysis and reporting easier.
- Make sure regular analysis and reporting are part of your monitoring framework. See also: Document how you will implement the monitoring framework and Develop processes for continuous improvement.
-
Report progress during Investment Oversight Framework states and post-implementation performance data
Reporting through existing mechanisms will show that your proposal or project complies with the Performance Standard. This occurs as part of the DTA’s Digital and ICT Investment Oversight Framework process. Depending on your ICT investment, your agency will be asked to report information in the following states:
- Strategic Planning and Prioritisation: within your proposal or business case, report on how you intend to implement a monitoring framework for your digital service.
- Contestability: report that the Performance Standard has been, or will be, applied to the digital service, for example as part of the Digital Capability Assessment Process.
- Assurance: report how you have applied the Performance Standard to the digital service and made progress with delivery milestones.
- Operations: using the DTA’s existing data collection mechanisms such as the Approved Programs Collection, report on how your digital service continues to meet customer needs post-implementation.
-
Analyse performance results and make improvements to the digital service
- Apply the steps outlined in Develop processes for continuous improvement, which emphasises an agile approach to continuous improvement based on performance analysis.
-
Links
- Choosing digital analytics tools - Service Manual - GOV.UK (www.gov.uk)
- Whole-of-Government Application Analytics (WOGAA) - Improve Government Services with Data | Singapore Government Developer Portal (tech.gov.sg)
- Agile approach to service delivery | Digital NSW
- Understanding more from user feedback – Data in government (blog.gov.uk)
- DSS criterion 8. Innovate with purpose | digital.gov.au
- DSS criterion 10. Keep it relevant | digital.gov.au
-
-
-
-
Develop a business case for change
- Be outcomes focused: Consider what problems the service needs to solve and why they are important. Share early-stage assumptions, gather diverse perspectives from stakeholders and take advantage of pre-existing data and resources. Clearly state the risks of action and inaction, who might be impacted, potential barriers to success and any knowledge gaps.
- Frame the problem: Form a simple, clear problem statement from the evidence that’s already available. Use it as the basis of further research and validation, and to identify the users agencies need to engage with.
- Don’t jump to solutions: Don’t anticipate a technical or design solution before validating the problems identified. Evaluate the rest of the Digital Service Standard criteria to understand what else could drive the problem. Consider if a new solution is required or if an existing platform or service might achieve the best outcome.
- Align stakeholders to a vision: Engage key stakeholders to establish a shared vision for success. Set clear expectations for the project and make sure everyone knows why change is necessary.
-
Survey the policy and service landscape
- See the bigger picture: Assess how the problems identified play out in the broader policy and government service ecosystems. Use resources such as the Australian Government Architecture and Delivering Great Policy Toolkit to understand the landscape and the intentions of different policies.
- Align to government priorities: Have a clear understanding of how the service will contribute to government priorities, including the achievement of the Data and Digital Government Strategy 2030 vision.