Field Notes
6 April 2026 · 4 min read

Australian Businesses Must Disclose AI Decisions by December 2026

From 10 December 2026, businesses using AI to make decisions about customers must say so in their privacy policy. Here’s what SMEs need to do now.

On 10 December 2026, new transparency obligations under the Privacy and Other Legislation Amendment Act 2024 come into effect. If your business uses personal information in automated or semi-automated decision-making — and the decision could significantly affect someone’s rights or interests — you must disclose that in your privacy policy. Not in vague terms. You need to state what personal information your software uses and what types of decisions it makes or assists with.

Most Australian SMEs have no idea this is coming. Data from the federal government’s AI Adoption Tracker, cited in an Indeed Hiring Lab analysis published this week, shows 60% of small businesses (5–19 employees) are now using AI tools. Medium businesses sit at 72%. That’s a lot of businesses that may need to comply and haven’t started preparing.

AI adoption by business size
Source: Federal Government AI Adoption Tracker, Dec 2025 quarter (via Indeed Hiring Lab)

Large (200–500 employees): 78%
Medium (20–199 employees): 72%
Small (5–19 employees): 60%
Micro (0–4 employees): 36%

The definition is broader than most people expect. It covers any decision made by a computer program, or any decision where a computer program does something “substantially and directly related” to making the decision. That includes AI tools that assist human decision-makers — which is how most SMEs use AI today.

For trades businesses, think about: job management software that auto-assigns technicians based on skills and location. CRM tools that score or prioritise customer leads. Quoting engines that calculate prices based on historical job data. Route optimisation that determines which customers get seen first.

For professional services firms, the list is longer: accounting software that auto-categorises transactions and flags anomalies. Legal document review tools. Client risk assessment scoring. HR platforms that screen job applicants. Any tool that uses personal data to recommend, filter, or rank a decision that a human then acts on.

The test isn’t whether the software makes the final call. It’s whether the software substantially shapes the decision — and whether that decision could “significantly affect the rights or interests of an individual.” Deprioritising a customer, screening out a job applicant, adjusting a quote based on postcode history — all of these qualify.

The Privacy Act applies to businesses with annual turnover above $3 million. For trades, that’s most established businesses with five or more staff. For accounting and law firms, the threshold is rarely an issue. Health service providers and businesses that trade in personal information are caught regardless of turnover.

Non-compliance can result in infringement notices of $62,600 per offence, per Macpherson Kelley’s analysis of the legislation. For serious or repeated interference with privacy, penalties escalate to the greater of $50 million, three times the benefit obtained, or 30% of adjusted turnover. Those headline numbers are designed for large corporations. But an infringement notice, a compliance notice, or even an investigation is enough to consume weeks of management time and thousands in legal fees — for a requirement you could have met by updating a document.

$62,600 — standard penalty per infringement notice, per offence
$50 million — maximum civil penalty, or three times the benefit obtained, or 30% of adjusted turnover, whichever is greatest
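The "greater of" rule for serious or repeated interference is easy to misread, so here is a minimal sketch of the calculation as described in Macpherson Kelley's summary. The function name and inputs are illustrative, not from the legislation, and this is not legal advice.

```python
# Sketch of the maximum civil penalty rule for serious or repeated
# interference with privacy: the greater of a flat $50M, three times
# the benefit obtained, or 30% of adjusted turnover (all AUD).
def max_civil_penalty(benefit_obtained: float, adjusted_turnover: float) -> float:
    """Return the theoretical maximum civil penalty in AUD."""
    return max(
        50_000_000,               # flat statutory maximum
        3 * benefit_obtained,     # three times the benefit obtained
        0.30 * adjusted_turnover, # 30% of adjusted turnover
    )

# A $5M-turnover SME with no quantifiable benefit still faces the $50M cap:
print(max_civil_penalty(benefit_obtained=0, adjusted_turnover=5_000_000))
```

The point of the `max` is that the $50 million figure is a floor on the maximum, not a ceiling: a large enough benefit or turnover pushes the cap higher, never lower.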

First, audit your software stack. List every tool that uses personal information to make or assist a decision. Your job management platform, your CRM, your accounting software, your HR tool. If it scores, ranks, categorises, assigns, or recommends based on customer or employee data, it’s likely in scope.
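A first-pass audit can be as simple as a spreadsheet, but the filter logic looks like this. The tool names, fields, and behaviour categories below are hypothetical examples, not a legal test — they just encode the "scores, ranks, categorises, assigns, or recommends on personal data" heuristic from the paragraph above.

```python
# Hypothetical first-pass audit: flag tools that use personal information
# to score, rank, categorise, assign, or recommend. Entries are examples.
tools = [
    {"name": "Job management platform", "uses_personal_info": True,  "behaviours": {"assigns"}},
    {"name": "CRM",                     "uses_personal_info": True,  "behaviours": {"scores", "ranks"}},
    {"name": "Accounting software",     "uses_personal_info": True,  "behaviours": {"categorises"}},
    {"name": "Website analytics",       "uses_personal_info": False, "behaviours": set()},
]

DECISION_BEHAVIOURS = {"scores", "ranks", "categorises", "assigns", "recommends"}

# A tool is likely in scope if it touches personal information AND
# exhibits at least one decision-shaping behaviour.
in_scope = [
    t["name"] for t in tools
    if t["uses_personal_info"] and t["behaviours"] & DECISION_BEHAVIOURS
]
print(in_scope)
```

Anything that lands on the in-scope list is a candidate for the vendor questions and privacy-policy wording in the next two steps; it is a starting point for legal review, not a substitute for it.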

Second, talk to your software vendors. Ask them: does this product use AI or automated processing to make decisions that affect individuals? What personal information does it use? Most vendors should be able to answer this by now. If they can’t, that’s a red flag worth noting.

Third, update your privacy policy. The law requires you to describe the kinds of personal information used and the kinds of decisions made. You don’t need to reveal proprietary algorithms — just be transparent about what data goes in and what decisions come out. As Jackson Walker Solicitors note in their analysis, the language should be “clear, succinct, and accessible to consumers.”

We wrote recently about the mandatory AI rules hitting government agencies in June — that covers how agencies must govern their own AI use. This December deadline is the private-sector companion: how every business explains its AI use to customers. Between the two, the regulatory landscape for AI in Australia has shifted materially in the past twelve months.

Key takeaways

From 10 December 2026, Australian businesses must disclose automated decision-making in their privacy policies under the Privacy and Other Legislation Amendment Act 2024.
The rules cover any AI tool that makes or substantially assists decisions affecting individuals — including CRM scoring, job scheduling, and client risk assessments.
Businesses with annual turnover above $3 million are caught, which includes most established trades and professional services firms.
Start now: audit your AI tools, ask vendors the right questions, and update your privacy policy before the deadline.

Sources

Macpherson Kelley — Automated Decision-Making: Current privacy obligations and what’s in the pipeline for 2026

Jackson Walker Solicitors — Practical implications of the new transparency requirements for automated decision making

Indeed Hiring Lab Australia — Nothing Artificial About Australian AI Adoption (April 2026)

Assumptions & methodology
  1. The $3 million turnover threshold is based on the existing Privacy Act exemption for small businesses. A five-person trades team at a loaded rate of $80/hr, 8 billable hours per day, 230 working days per year generates approximately $736,000 in labour revenue (5 × $80 × 8 × 230); materials, parts markup, and non-labour income typically lift established firms of this size towards or above the threshold. Actual revenue varies by utilisation rate and pricing structure.
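The labour component of assumption 1 can be checked directly from the stated inputs; the constants below are the article's assumptions, not independent data.

```python
# Labour revenue for a five-person trades team under the stated assumptions.
STAFF = 5
RATE_PER_HOUR = 80           # loaded charge-out rate, AUD
BILLABLE_HOURS_PER_DAY = 8
WORKING_DAYS_PER_YEAR = 230

labour_revenue = STAFF * RATE_PER_HOUR * BILLABLE_HOURS_PER_DAY * WORKING_DAYS_PER_YEAR
print(labour_revenue)  # labour only; materials and markup sit on top of this
```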
  2. AI adoption figures (78%, 72%, 60%, 36% by business size) are from the federal government’s AI Adoption Tracker for the December 2025 quarter, as cited by Indeed Hiring Lab Australia on 1 April 2026.
  3. Penalty figures ($62,600 per infringement notice; $50 million / 3× benefit / 30% turnover maximum) are from Macpherson Kelley’s legal analysis of the Privacy and Other Legislation Amendment Act 2024.


Field Notes are general commentary on AI trends for Australian businesses. They don’t constitute professional advice. Talk to your accountant, lawyer, or IT adviser before acting on anything specific to your situation — or talk to us if you want help working out where AI fits.

Not sure which of your tools are in scope?

A short call is enough to map your AI exposure and work out what needs to change before December.

Book a call →