Appendix 4: Mapping of dimensions to the Technical standard for government’s use of artificial intelligence
To help implementation teams resolve alignment questions, and to provide greater technical depth for each implementation dimension, the table below maps each dimension to the Technical standard for government’s use of artificial intelligence:
| Dimension | Relevant AI Technical standard statements | Practical alignment notes |
|---|---|---|
| Business & strategic alignment | Statement 9: Conduct pre-work (design), Statement 12: Define success criteria (design), Whole-of-lifecycle Statement 1: Operational model | Problem framing, define measurable outcomes, operational model governance. |
| Architecture & solution design | Whole-of-lifecycle Statement 2: Reference architecture, Statement 10: Human-centred design (design), Statement 11: Design safety (design), Whole-of-lifecycle Statements 7 & 8: Version control & watermarking | Align to the reference architecture, embed design safety, maintain traceability. |
| Data & integration | Statements 13–19: Data supply chain, orchestration, quality, fusion/integration, establishing context datasets; Whole-of-lifecycle Statement 6: Manage bias | Ensure data quality, governance, sovereignty, integration and bias controls. |
| Technology & tools | Statements 20–25: Training & modelling, Whole-of-lifecycle Statements 2, 4, 7, 8 (Reference architecture, auditing, version control, watermarking) | Ensure tooling supports traceability, auditing, secure model training. |
| People & skills | Whole-of-lifecycle Statement 3: Build people capability, Statement 10: Human-centred design (design), Whole-of-lifecycle Statement 1: Operational model (roles), Whole-of-lifecycle Statement 4: Auditing capability | Develop AI skills, define roles, embed co-design, build auditing capacity. |
| Governance & risk | Whole-of-lifecycle Statement 1: Operational model, Whole-of-lifecycle Statement 4: Auditing, Whole-of-lifecycle Statement 5: Explainability, Whole-of-lifecycle Statement 6: Manage bias, Statement 11: Design safety (design), Statement 9: Conduct pre-work (design) | Define governance structures, enable auditing, embed explainability and bias management. |
| Experimentation & validation | Statement 11: Design safety (design), Statement 12: Define success criteria (design), Statements 22–25: Evaluation & continuous improvement, Whole-of-lifecycle Statements 4 & 5: Auditing & explainability | Embed success thresholds, user validation, auditing, continuous learning. |
| Delivery & operations | Operate phase: Integrate, deploy, monitor; Whole-of-lifecycle Statements 4, 5, 7 (Auditing, explainability, version control) | Integrate securely, deploy responsibly, monitor performance and maintain auditability. |
| Scalability & transition to production | Operate phase: Deploy & monitor; Whole-of-lifecycle Statements 4, 7, 8 (Auditing, version control, watermarking) | Plan scalability, embed operational monitoring, maintain version control. |
| Sustainment & exit strategy | Retire phase: Decommission; Whole-of-lifecycle Statement 4: Auditing, Whole-of-lifecycle Statement 7: Version control | Plan for decommissioning, preserve audit trails, archive responsibly. |