• Develop processes for continuous improvement

    Develop processes for sharing insights and continuously improving the digital service:

    • Make sure the people responsible for performance monitoring work closely with those delivering the digital service. 
    • Frequently assess your service's performance data, metrics and insights against your benchmarks or targets, pinpointing areas where you excel and where improvements are needed. 
    • Use data-driven insights to inform decisions and actions for enhancing the quality, efficiency and effectiveness of your service.
    • Implement agile and iterative methods for testing, experimenting and deploying changes to your service. 
    • Use prototypes, minimum viable products, or beta versions to validate assumptions and hypotheses and measure the changes in your performance data.
    • Document findings, learnings and best practices from your service improvement processes. Share these with your team, senior management, other agencies or the public. 
    • Foster a culture of learning and innovation in your organisation and encourage collaboration and knowledge sharing across teams and agencies.
    • Seek feedback and input from your peers, mentors, experts, or external partners.
    • Participate in communities of practice, forums, or events related to your service domain.

    Combining user research efforts across the DX Policy and its standards can help to reduce duplication and the cost of research.

  • Identify the most appropriate measure to monitor availability
    • Fit for purpose: Assess whether any existing monitoring methods for digital service availability are fit for purpose before considering new tools.
    • Prioritise user-centric metrics: Align metrics with user expectations and preferences to create seamless digital experiences. Reflect on diverse user journeys and consider different entry points, navigation paths and transaction types.
  • Monitor the availability of the digital service based on the expected user outcomes
    • Measure from the end-user’s perspective: Monitor digital services from both internal and external perspectives. Implement tools that monitor uptime to make sure the system remains online, and consider tools that simulate real-world experiences from a user perspective to catch issues that internal checks might miss; a minimal probe is sketched below. Comprehensive monitoring will allow agencies to understand and improve the experience of the end-user.
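
    As an illustration of an outside-in check, the sketch below issues a single user-like request and records whether the service responded and how quickly. It is a minimal example only: the health-check URL is a hypothetical placeholder, and real deployments would run such probes on a schedule from locations outside your own network.

      # Minimal synthetic availability probe (illustrative sketch).
      import time
      import urllib.request
      import urllib.error

      SERVICE_URL = "https://example.gov.au/health"  # hypothetical endpoint

      def probe(url: str, timeout: float = 10.0) -> dict:
          """Issue one user-like request; record outcome and latency."""
          start = time.monotonic()
          try:
              with urllib.request.urlopen(url, timeout=timeout) as response:
                  up = 200 <= response.status < 400
          except (urllib.error.URLError, TimeoutError):
              up = False
          return {"up": up, "latency_s": time.monotonic() - start}

      print(probe(SERVICE_URL))
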
  • Act to improve user outcomes
    • Maintain a reliable service: Make sure your digital service is available, stable and consistent for users no matter their location. Schedule downtime and maintenance when it will cause the least disruption for users and notify users well ahead of time that digital services will be impacted or unavailable.
    • Create response plans: Make sure clear communication channels are included in response plans. This will allow your agency to proactively address issues and act quickly to maintain availability of the service.
    Off
  • Guidance to measure the availability of your digital service

  • Identify appropriate measures to monitor availability

    Service availability is the proportion of time your service is up and accessible to users. To monitor availability: 

    • Define what availability means for your service based on user expectations. For instance, this could mean a digital service that's always accessible, has functional links and works on mobile devices in areas with unreliable internet speeds. See also Define clear objectives.
    • Identify the key performance indicators (KPIs) that reflect the availability of your service. At a minimum you should monitor uptime; other KPIs might include error rate, load time, fully loaded time or the percentage of valid links. A sketch for calculating these KPIs follows this list. See also: Choose relevant metrics.
    • Choose tools and methods that enable you to collect, store and analyse the data related to your KPIs. These tools should be capable of providing real-time insights and generating comprehensive reports. See also: Choose monitoring tools and methods.
    • Make sure the tools integrate seamlessly with your existing systems to facilitate smooth data flow and accessibility for your team. 
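
    As a minimal sketch of turning raw monitoring samples into these KPIs, the example below computes uptime, error rate and average load time. The sample records and field names are hypothetical stand-ins for the output of whatever monitoring tool you adopt.

      # Availability KPIs from hypothetical probe samples (sketch only).
      from statistics import mean

      samples = [
          {"up": True,  "load_time_s": 1.2, "http_error": False},
          {"up": True,  "load_time_s": 0.9, "http_error": False},
          {"up": False, "load_time_s": None, "http_error": True},
          {"up": True,  "load_time_s": 1.6, "http_error": False},
      ]

      uptime_pct = 100 * sum(s["up"] for s in samples) / len(samples)
      error_pct = 100 * sum(s["http_error"] for s in samples) / len(samples)
      load_times = [s["load_time_s"] for s in samples if s["load_time_s"] is not None]

      print(f"Uptime: {uptime_pct:.1f}%")
      print(f"Error rate: {error_pct:.1f}%")
      print(f"Average load time: {mean(load_times):.2f}s")
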
  • Set availability benchmarks or targets

    Set benchmarks or targets to evaluate service availability performance: 

    • You might specify the minimum acceptable level of availability, the maximum tolerable level of planned and unplanned downtime, or the optimal range of load time for your service; a sketch for turning an availability target into a downtime budget follows this list. See also Define clear objectives and Choose monitoring tools and methods.
    • Establish a consistent schedule and procedure for tracking availability data and presenting the findings. 
    • Examine the findings to pinpoint gaps, trends, patterns, or anomalies that highlight the service's performance and its impact on user outcomes.
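
    As a minimal sketch, assuming an illustrative 99.9% availability target, the example below translates that target into a monthly downtime budget and checks a measured figure against it.

      # Downtime budget from an availability target (illustrative sketch).
      TARGET_AVAILABILITY = 99.9        # percent; substitute your own target
      MINUTES_PER_MONTH = 30 * 24 * 60  # a 30-day month, for simplicity

      allowed_downtime = MINUTES_PER_MONTH * (100 - TARGET_AVAILABILITY) / 100
      measured_downtime = 50.0          # hypothetical minutes observed this month

      print(f"Allowed downtime: {allowed_downtime:.1f} min/month")
      if measured_downtime > allowed_downtime:
          print("Target missed: prioritise availability improvements.")
      else:
          budget_left = allowed_downtime - measured_downtime
          print(f"Within target: {budget_left:.1f} min of budget remains.")
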
  • Act to improve user outcomes

    Act on the findings and implement improvements or changes to the service design, delivery, or maintenance as needed. 

    Actions may include:

    • fixing bugs
    • upgrading servers
    • enhancing features
    • communicating with users.

    See also Develop processes for continuous improvement.

  • Understand what success looks like for the digital service
    • Understanding cohorts: Overlaying demographic data, such as location or socio-economic data, may help agencies to understand the outcomes of different cohorts and their interactions with digital services. If a cohort has a low success rate in completing transactions online, it may signal the need for digital service improvements; a cohort-comparison sketch follows.
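
    A minimal cohort-comparison sketch follows, assuming hypothetical transaction records and cohort labels; in practice these would come from your analytics and demographic overlays, and the 60% threshold is illustrative only.

      # Completion rates by cohort (illustrative sketch).
      from collections import defaultdict

      transactions = [
          {"cohort": "metro",    "completed": True},
          {"cohort": "metro",    "completed": True},
          {"cohort": "regional", "completed": False},
          {"cohort": "regional", "completed": True},
          {"cohort": "regional", "completed": False},
      ]

      counts = defaultdict(lambda: {"done": 0, "all": 0})
      for t in transactions:
          counts[t["cohort"]]["all"] += 1
          counts[t["cohort"]]["done"] += t["completed"]

      for cohort, c in counts.items():
          rate = 100 * c["done"] / c["all"]
          flag = "  <- may signal a need for improvement" if rate < 60 else ""
          print(f"{cohort}: {rate:.0f}% completed{flag}")
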
  • Identify the most appropriate measure to monitor the success of the digital service
    • Compare performance to non-digital channels: While a user’s whole activity may involve multiple channels, including non-digital channels, in meeting Criterion 3, it is important that agencies measure the digital component(s) of the activity separately from the non-digital channels. For example, if an end-to-end service requires multiple digital transactions and an in-person assessment prior to the service being delivered, each digital component should be measured separately.
  • Regularly measure and monitor the effectiveness of the digital service and act to improve outcomes
    • Understand the touch points of a user’s digital journey: Map out the user flow and capture data from the start of the journey through to its end, marked by the successful completion of the transaction. This data will help you recognise potential hurdles and drop-off points so you can optimise how users complete their transactions online (a funnel sketch follows this list).
    • Enhancing the overall experience: Many government services have both digital and non-digital channels, designed to work together. Consider assessing the user’s journey across different channels to help understand user behaviour and identify ways to enhance the digital experience, aiming to make the whole service smoother.
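
    As a minimal funnel sketch, the example below computes the drop-off between consecutive steps of a journey. The step names and counts are hypothetical; in practice each count would come from your analytics events.

      # Drop-off between journey steps (illustrative sketch).
      journey = [
          ("Landed on start page", 1000),
          ("Logged in",             820),
          ("Completed form",        560),
          ("Submitted transaction", 530),
      ]

      for (prev_step, prev_n), (step, n) in zip(journey, journey[1:]):
          drop = 100 * (prev_n - n) / prev_n
          print(f"{prev_step} -> {step}: {drop:.0f}% drop-off")

      overall = 100 * journey[-1][1] / journey[0][1]
      print(f"Overall completion: {overall:.0f}%")
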
  • Guidance to measure the success of your digital service

  • Understand what success looks like

    Begin by understanding what success looks like for your transactional digital service:

    • Identify the user's objectives and goals through research, affirm their expectations, and make sure they match the agency's completion expectations. See also Define clear objectives.
    • Identify the key tasks and/or transactions that users need to complete to achieve their goals. Break down the user journey into individual, manageable components, focusing on each critical interaction point. For example, if the service involves updating personal information, key tasks might include logging in, navigating to the profile section, editing details and saving changes.
  • Determine measures of success for your digital service

    Determine the measures of success for transactions: 

    • Identify KPIs that indicate how well the service supports users to finish actions or tasks in the digital service. KPIs will depend on the nature of your service, and your user’s goals. For services involving applications or submissions, KPIs could include the number of applications started versus completed, the average time to submit an application, and the number of users who encounter errors and abandon the process. Refer also to Choose relevant metrics.
    • Where feasible, the DTA recommends measuring the number and percentage of digital tasks that are started and completed successfully versus abandoned (a completion-rate sketch follows this list).
    • Additional metrics to monitor service success include:
      • the time spent by users on each touchpoint, task and the overall transaction
      • the sources of traffic that lead users to the digital service
      • errors, bugs, or technical issues encountered by users
      • user behaviour patterns, such as clicks, scrolls, mouse movements, and keystrokes
      • user demographics, such as age, gender, location, and language spoken
      • a Customer Effort Score (CES), captured with a free-text feedback dialogue at points along the user journey. See also Criterion 4.
    • Once you have determined KPIs, choose tools to collect, store and analyse data. These tools should integrate with your systems and provide real-time insights and comprehensive reports. See also Choose monitoring tools and methods.
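
    As a minimal sketch of the started-versus-completed measure recommended above, the example below computes completion and abandonment rates and the average time to submit. The session records and field names are hypothetical stand-ins for data exported from your chosen analytics tool.

      # Started vs completed transactions (illustrative sketch).
      sessions = [
          {"completed": True,  "minutes_to_submit": 12.0},
          {"completed": False, "minutes_to_submit": None},
          {"completed": True,  "minutes_to_submit": 9.5},
          {"completed": False, "minutes_to_submit": None},
          {"completed": True,  "minutes_to_submit": 15.0},
      ]

      started = len(sessions)
      completed = sum(s["completed"] for s in sessions)
      times = [s["minutes_to_submit"] for s in sessions if s["completed"]]

      print(f"Started: {started}")
      print(f"Completed: {completed} ({100 * completed / started:.0f}%)")
      print(f"Abandoned: {started - completed} "
            f"({100 * (started - completed) / started:.0f}%)")
      print(f"Average time to submit: {sum(times) / len(times):.1f} min")
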
  • Monitor the effectiveness of the digital service and act to improve outcomes

    Once the digital service is operational and monitoring tools are implemented:

    • Regularly analyse data and report insights within your agency to build an organisational understanding of how helpful your digital service is and identify areas for improvement. 
    • Act on the findings and implement improvements or changes to the service design, delivery, or maintenance, as needed. 
    • Test the improvements and measure for impact by monitoring for changes in your data (a before-and-after sketch follows this list). 
    • Ensure that measures of success are embedded into an ongoing process of continuous improvement. See also Develop processes for continuous improvement.
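
    As a minimal before-and-after sketch, the example below compares a KPI across the weeks either side of a change. The weekly completion rates are illustrative assumptions; for noisy or low-volume data you would also want a significance test before drawing conclusions.

      # Before-and-after impact check on a KPI (illustrative sketch).
      from statistics import mean

      before = [61.0, 59.5, 63.2, 60.8]  # weekly completion rate (%) pre-change
      after = [66.1, 67.4, 65.0, 68.2]   # weekly completion rate (%) post-change

      delta = mean(after) - mean(before)
      print(f"Before: {mean(before):.1f}%  After: {mean(after):.1f}%  "
            f"Change: {delta:+.1f} points")
      if delta <= 0:
          print("No improvement observed: revisit the change or gather more data.")
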
