• This criterion considers whether project process and management systems are established and followed, with reference to those processes most significant in digital projects.

  • Schedule

    As with all projects, delivery confidence is affected by the adequacy of schedule management processes, including identification of variance against baseline, trend analysis for the major elements of the project, identification and management of the critical path, and project manager ownership of the schedule.

    For digital projects, pressure to start delivery quickly rather than develop a robust business case has been found to reduce confidence: the schedule may not represent the full complexity of the task or account for essential interdependencies, leading to unrealistic expectations of digital project pace.

    Due to the uncertainties involved in many digital projects, contingency should also be included in schedules to allow for learning during delivery.
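    As a rough illustration of sizing such a contingency buffer (the task names and the 20 per cent figure are assumptions for the example, not values prescribed by this guidance):

```python
# Illustrative only: sizing a schedule contingency buffer for a digital
# project whose estimates are uncertain. Task names and the buffer
# percentage are invented for the example.

def schedule_with_contingency(task_days, contingency_pct):
    """Return (base_days, buffer_days, total_days) for a task list."""
    base = sum(task_days.values())
    buffer = round(base * contingency_pct / 100)
    return base, buffer, base + buffer

tasks = {"discovery": 20, "alpha": 40, "beta": 60, "live": 15}
base, buffer, total = schedule_with_contingency(tasks, contingency_pct=20)
print(base, buffer, total)  # 135 27 162
```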

  • Stakeholder engagement 

    Government digital projects can involve a complex ecosystem of stakeholders with direct impact on transformation effectiveness. 

    • Successful delivery relies upon user, client and senior executive involvement in the formulation of project goals and scope and in project decision-making.
    • Cross-agency interdependencies can affect delivery confidence: when delivery and the business are separated across multiple agencies, responsibility and ownership become harder to understand.
    • Effective stakeholder engagement will reveal dependencies that are beyond project control but affect delivery confidence, such as licensing, regulations, policies, data sharing and interfaces with systems over jurisdiction boundaries. 

    Lack of engagement with suppliers on the feasibility of objectives prior to contracting can reduce confidence in high-ambition transformations involving unfamiliar technology.

  • DCA tolerances

  • Your responsibilities

    To successfully meet this criterion, you need to:

    • adopt transparent data handling
    • implement security measures
    • maintain a reliable service
    • be accountable for the service.
  • High

    Substantial early and sustained engagement with people who are influential, impacted or involved in the project, allowing for a nuanced understanding of needs and interdependencies.

  • Medium high

    Substantial early engagement and limited ongoing engagement, allowing for a good understanding of needs and interdependencies.

  • Medium

    Early engagement with some stakeholders but limited ongoing engagement.

  • Medium low

    Key stakeholder groups are not engaged. Assumptions not tested with the people themselves.

  • Low

    No engagement. Little basis for assumptions.


  • Relevant policy 

    The Digital Experience Policy (the policy) sets agreed benchmarks for the performance of digital services and supports agencies to design and deliver better experiences by considering the broader digital service ecosystem. 

    The policy supports a whole-of-government focus on improving the experience for people and business interacting digitally with government information and services, setting a benchmark for good digital services and integrating data based on real-world use. 

    Through a phased implementation, agencies will be required to meet four standards.

  • Delivery Confidence Assessment (DCA) tolerances

  • Adopt transparent data handling

     

    Consider privacy, consent, and control: Safeguard user data by adhering to the Australian Privacy Principles and the Privacy Act 1988. Always obtain explicit, informed consent before collecting a user’s data, and provide a means to update or delete it. Allow users to report inaccurate data and respond by explaining how it has been rectified. Notify users of their own responsibilities to protect their data, such as not sharing their passwords with others.
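    The consent and control obligations above can be sketched as a minimal data-handling record. The class and method names are hypothetical; a real service would need to satisfy the Australian Privacy Principles in full.

```python
# Minimal sketch of consent-first data handling: collect only after
# explicit consent, and support update and deletion on request.
# Class and method names are illustrative, not from any real framework.
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    user_id: str
    consented: bool = False
    data: dict = field(default_factory=dict)

    def collect(self, key, value):
        if not self.consented:
            raise PermissionError("explicit consent required before collection")
        self.data[key] = value

    def update(self, key, value):
        self.data[key] = value   # users can correct inaccurate data

    def delete_all(self):
        self.data.clear()        # users can have their data removed

record = UserRecord(user_id="u1")
record.consented = True          # explicit, informed consent recorded
record.collect("email", "user@example.gov.au")
record.delete_all()
print(len(record.data))  # 0
```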

    Eliminate ambiguity in your user interface: Provide validating feedback and progress tracking as users interact with your service. Design to eliminate the need for error messages in the first place; make them understandable and actionable where they remain. Tell users what information they need before they start a task and, where appropriate, allow them to pause and resume at their own pace.
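    The guidance on eliminating ambiguity might look like the following sketch of up-front validation with actionable messages; the field names and rules are assumptions for the example.

```python
# Sketch of actionable validation: tell the user exactly what is wrong
# and how to fix it, rather than returning a generic failure message.
# Field names and rules are illustrative only.
import re

RULES = {
    "email": (re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
              "Enter an email address in the form name@example.com."),
    "postcode": (re.compile(r"^\d{4}$"),
                 "Enter a 4-digit Australian postcode, for example 2600."),
}

def validate(form):
    """Return a dict of field -> actionable message for invalid fields."""
    errors = {}
    for name, (pattern, message) in RULES.items():
        if not pattern.match(form.get(name, "")):
            errors[name] = message
    return errors

print(validate({"email": "user@example.com", "postcode": "26"}))
# {'postcode': 'Enter a 4-digit Australian postcode, for example 2600.'}
```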

  • High

    The project schedule covers the entire scope for the solution, is used to inform management action and is actively updated. Progress assessed on estimate to complete. There is sufficient contingency for the risk of the project.

  • Medium high

    A schedule measurement baseline exists with a critical path. This provides the basis for management of change.

  • Medium

    The schedule appears accurate but is not actively updated. Measurement against baseline is not consistent or regular.

  • Medium low

    The schedule appears mostly complete but is not being used to inform management action.

  • Low

    The schedule does not cover the complete scope. The critical path is not being managed. The schedule is not being used to support management action. Progress assessed on time spent. No contingency.


  • Cost and finance

    Delivery confidence can be higher where there is evidence of regular monitoring of cost, value and revenue, with any variation attributed to specific causes and with appropriate delegations.

    The DCA should consider whether the baseline cost estimate is realistic, or whether there is evidence of optimism bias, or underestimation of budgets and overestimation of benefits to facilitate initiation.

    Funding continuity can also affect delivery confidence. Factors to consider include the budget allocation for development after go-live and for the training and organisational change management activities needed to realise benefits.

    Budgets also need to cater for recurrent costs post implementation, for example, accounting for ongoing operating expenditure (Op-Ex) to support cloud-based digital solutions.
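    As an illustration of the monitoring described above, variance can be attributed per cost category against a baseline, with post-implementation opex tracked alongside build costs. The categories and dollar figures are invented for the example.

```python
# Sketch of cost-variance monitoring: compare actuals to baseline per
# category so variation can be attributed to specific causes.
# Categories and dollar figures are invented for illustration.

def cost_variance(baseline, actual):
    """Return {category: actual - baseline}; positive means overspend."""
    return {cat: actual.get(cat, 0) - base for cat, base in baseline.items()}

baseline = {"build": 500_000, "training": 80_000, "cloud_opex": 120_000}
actual   = {"build": 560_000, "training": 80_000, "cloud_opex": 150_000}

for category, variance in cost_variance(baseline, actual).items():
    if variance:
        print(f"{category}: {variance:+,} vs baseline")
# build: +60,000 vs baseline
# cloud_opex: +30,000 vs baseline
```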

  • DCA tolerances
