Assurance research series: 01
To successfully meet this criterion, agencies will need to:
- apply Criterion 5 throughout Beta to protect users’ digital rights and ensure robust security measures are in place
- adhere to this criterion across the service design and delivery process as cyber threats become more prevalent and sophisticated.
This guidance is intended for all Australian Government personnel working with government information – including employees, contractors and consultants.
Your agency may have enterprise generative AI tools that are not public and may offer enhanced security, privacy, or tailored functionality. You should refer to your agency’s guidance on how to use these tools.
Generative AI refers to AI tools that generate content – such as text, images, software code, audio or video – based on patterns learned from large volumes of data.
ChatGPT, Gemini and Claude are some of the well-known public generative AI tools you can access using a browser or app. These tools allow you to enter a question or instruction and get AI-generated answers. Your input is called a prompt, and the AI’s reply is the output.
Increasingly, generative AI is being built into everyday services or software. It now appears across commonly used digital tools, including search engines, communication platforms, and productivity applications.
Because it’s often embedded in tools you already use, it’s not always obvious when generative AI is active. If a tool helps you write, summarise, design, or generate ideas based on your input, it’s likely using AI.
Public generative AI tools are different from non-public or enterprise tools that have been configured to meet agency data control and information security requirements. When you use public tools, your inputs and outputs may be shared with the tool provider.
Make sure you know whether you are using a public or non-public generative AI tool, as they may have a similar look and feel. For example, Microsoft 365 Copilot is an enterprise AI solution used by some agencies, while Microsoft Copilot is a web-based public tool aimed at individual users.
If you’re not sure, refer to your agency’s policies or ICT support for advice.
Generative AI tools can help you work more efficiently and explore new ideas. They’re useful for checking your thinking, making content easier to understand, and supporting everyday tasks.
The Australian Government is focused on unlocking the benefits of AI to improve how we work and help deliver better services. Allowing staff to use public generative AI tools for OFFICIAL level government information is a practical step towards this goal.
You should follow your agency’s policies and guidance on using public generative AI tools in the first instance.
Subject to your agency’s policies, you can use these tools with OFFICIAL level government information (see Protective Security Policy Framework (PSPF) Policy Advisory 001-2025).
You must not put information that is security classified OFFICIAL: Sensitive or above into public generative AI tools. Information security classifications are defined in the PSPF.
Public generative AI tools include well-known services like ChatGPT, Gemini and Claude, which users can access via web browsers, apps or embedded in other services. This guidance should be read alongside the Policy for the responsible use of AI in government and Protective Security Policy Framework Policy Advisory 001-2025 on OFFICIAL Information Use with Generative Artificial Intelligence. Agencies should adapt this advice to their specific risk profiles and operational requirements.
The Australian Government is focused on capturing the opportunity of AI, broadening our safe and responsible use of this technology while building public trust and confidence.
Generative AI capabilities are increasingly embedded across digital infrastructure including search engines, productivity applications and software platforms, often without explicit user notification. This ubiquitous integration creates both opportunities and governance challenges for government agencies.
Current operational realities include:
Agencies must address security requirements and workforce capability development to unlock operational effectiveness in an AI-enhanced environment.
For more information on the program cost and terms, please refer to the Registration Terms and Conditions.
The DTA is the Australian Government’s advisor for its digital transformation agenda. The DTA's mandate is to provide strategic advice, coordination and assurance across the Australian Government's portfolio of digital projects.
For further information and the latest versions of the DTA’s guidance documents and templates please see the Assurance Framework for Digital and ICT Investments or contact the DTA at:
The John Grill Institute for Project Leadership conducts breakthrough research into project leadership, delivers world-leading executive education and works with industry, government and communities to shape future projects and their outcomes.
It’s up to all of us to use generative AI safely, ethically and responsibly. That means understanding its benefits and limitations and applying good judgement.
Apply the 3 principles and practices below to responsibly use public generative AI in your work.