Using public generative AI tools safely and responsibly

Use this guidance to understand how to safely and responsibly engage with public generative artificial intelligence (AI) tools.

This guidance is intended for all Australian Government personnel working with government information – including employees, contractors and consultants.

Your agency may have enterprise generative AI tools that are not public and may offer enhanced security, privacy or tailored functionality. You should refer to your agency’s guidance on how to use these tools.

What generative AI is

Generative AI refers to AI tools that generate content – such as text, images, software code, audio or video – based on patterns learned from large volumes of data. 

ChatGPT, Gemini and Claude are some of the well-known public generative AI tools you can access using a browser or app. These tools allow you to enter a question or instruction and get AI-generated answers. Your input is called a prompt, and the AI’s reply is the output.

Increasingly, generative AI is being built into everyday services and software, including search engines, communication platforms and productivity applications.

Because it’s often embedded in tools you already use, it’s not always obvious when generative AI is active. If a tool helps you write, summarise, design or generate ideas based on your input, it’s likely using generative AI.

Differentiating public generative AI from enterprise tools

Public generative AI tools are different from non-public or enterprise tools that have been configured to meet agency data control and information security requirements. When you use public tools, your inputs and outputs may be shared with the tool provider. 

Make sure you know whether you are using a public or non-public generative AI tool, as the two can have a similar look and feel. For example, Microsoft 365 Copilot is an enterprise AI solution used by some agencies, while Microsoft Copilot is a web-based public tool aimed at individual users.

If you’re not sure, check your agency’s policies or contact your ICT support area for advice.

How generative AI can help you at work

Generative AI tools can help you work more efficiently and explore new ideas. They’re useful for checking your thinking, making content easier to understand and supporting everyday tasks.

The Australian Government is focused on unlocking the benefits of AI to improve how we work and help deliver better services. Allowing staff to use public generative AI tools for OFFICIAL level government information is a practical step towards this goal.

Using OFFICIAL level information in public generative AI

In the first instance, you should follow your agency’s policies and guidance on using public generative AI tools.

Subject to your agency’s policies, you can use these tools with OFFICIAL level government information (see Protective Security Policy Framework (PSPF) Policy Advisory 001-2025).

You must not put information that is security classified OFFICIAL: Sensitive or above into public generative AI tools. Information security classifications are defined in the PSPF.

Principles for using public generative AI tools responsibly 

It’s up to all of us to use generative AI safely, ethically and responsibly. That means understanding its benefits and limitations and applying good judgement. 

Apply the 3 principles and practices below to responsibly use public generative AI in your work.  

Protect privacy and safeguard government information
  • Don’t put security classified (OFFICIAL: Sensitive or above) or personal information into public generative AI tools.
  • Assume anything you enter into a public generative AI tool could be made public.
  • Don’t put third-party copyright-protected information into public generative AI tools.
Use judgement and critically assess generative AI outputs   
  • Check for fairness, accuracy and bias in generative AI outputs. Generative AI tools can reflect biases in the data they were trained on, which can lead to unfair or misleading outputs.
  • Be aware that generative AI can produce convincing but inaccurate content.
  • Undertake training to understand generative AI and how to critically assess its outputs.
Be able to explain, justify and take ownership of your advice and decisions
  • Remain responsible and accountable for content you create, share or use.
  • Generative AI must not make final decisions on government advice, services or outputs.
  • Ensure your use of generative AI supports public trust and upholds the standards and frameworks expected of government employees.

Agency guidance on public generative AI

Guidance is also available for Australian Government agencies on managing access to public generative AI tools for personnel working with government information – including employees, contractors and consultants.
