Australia's Privacy Act Reforms 2026: What You Need to Know

7 min read · AI Security Brief

Australia's Privacy Act has undergone its most significant overhaul since 1988 — with new penalties reaching $50 million, mandatory disclosure of AI decision-making systems, and a new statutory tort for serious privacy invasions. Here's what changed and what it means for your organisation.

For the better part of a decade, Australian privacy law lagged demonstrably behind comparable jurisdictions. The Privacy Act 1988 — designed for a pre-internet era — had not kept pace with the data practices of modern enterprise, the scale of platform-driven data collection, or the emergence of AI systems making automated decisions about individuals' rights and interests. That changed in December 2024, and the consequences for organisations operating in Australia are substantial.

The Privacy and Other Legislation Amendment Act 2024 (Cth), which received Royal Assent on 10 December 2024, represents the first tranche of what will be a multi-stage reform process. Building on the record penalty increases enacted in 2022, it introduced a tiered penalty structure, a new statutory tort for serious privacy invasions, mandatory disclosure of automated decision-making systems, expanded OAIC powers, and the first AI-specific transparency provisions in Australian law. The 10 December 2026 compliance deadline for several key provisions is approaching — and regulators are already moving.

The New Penalty Regime: $50 Million Maximums

The old penalty cap for serious privacy interference — $2.2 million for corporations — was a rounding error for large technology companies and provided negligible deterrence. The 2022 amendments (the Privacy Legislation Amendment (Enforcement and Other Measures) Act 2022), enacted following the Optus and Medibank data breaches that collectively exposed the personal data of tens of millions of Australians, changed the calculus fundamentally.

For serious interferences with privacy, the maximum corporate penalty is now the greater of:

  • $50 million
  • Three times the value of any benefit obtained through the misuse of information
  • 30% of adjusted turnover in the relevant period
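The "greater of" test means the effective cap scales with the size of the contravener, not the statutory floor. A minimal sketch, using hypothetical figures in AUD, of how the three limbs interact:

```python
def max_penalty_cap(benefit_obtained: float, adjusted_turnover: float) -> float:
    """Corporate penalty cap for a serious interference with privacy:
    the greatest of $50M, three times the benefit obtained through the
    misuse, and 30% of adjusted turnover in the relevant period."""
    return max(
        50_000_000,                 # fixed statutory maximum
        3 * benefit_obtained,       # three times the benefit from misuse
        0.30 * adjusted_turnover,   # 30% of adjusted turnover
    )

# Hypothetical example: a company with $1B adjusted turnover faces a
# cap of $300M, not $50M; the fixed figure only binds smaller entities.
cap = max_penalty_cap(benefit_obtained=10_000_000,
                      adjusted_turnover=1_000_000_000)
```

For most large enterprises, the 30%-of-turnover limb will dominate, which is why the new regime is described as removing the "cost of doing business" calculus.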

These figures are no longer theoretical. On 29 September 2025, Australian Clinical Labs (ACL) agreed to pay a $5.8 million penalty following a 2022 data breach affecting 223,000 customers — the first civil penalty ever imposed under the Privacy Act. The OAIC's enforcement posture has shifted: it is now actively pursuing organisations for privacy failures, not merely issuing guidance.

The 2024 POLA Act further introduced a tiered penalty structure. Tier 2 penalties (interferences with privacy that fall short of "serious") reach $3.3 million for corporations. Tier 1 penalties — infringement notices for administrative failures such as a non-compliant privacy policy — allow the Information Commissioner to issue fines of up to $330,000 per contravention for corporations without court proceedings. The era of Australian privacy regulation as a compliance box-ticking exercise is over.

AI and Automated Decision-Making: APP 1.7–1.9

The most consequential AI-specific provision of the reforms is the new automated decision-making (ADM) transparency requirement, which takes effect on 10 December 2026. This is the provision that technology teams and legal departments most urgently need to prepare for.

Under the new APP 1.7–1.9, any APP entity that uses a computer program to make — or to substantially and directly support — a decision that could reasonably be expected to significantly affect the rights or interests of an individual must disclose this in its privacy policy. The required disclosures include:

  • The kinds of personal information used in the operation of the automated system
  • The kinds of decisions made solely by automated systems
  • The kinds of decisions for which automated systems perform a function substantially and directly related to making the decision

This provision is broadly drafted. It is intended to capture AI-enabled systems, rule-based tools, and automated assessment technologies — not just machine learning models. The OAIC has confirmed that the obligation applies whether the automated decision affects the individual adversely or beneficially.

The Bunnings facial recognition case, decided by the Administrative Review Tribunal in early 2026, illustrates the compliance risk: while Bunnings was found not to have breached the substantive Privacy Act provisions around biometric data use, the Tribunal affirmed the OAIC's finding that Bunnings breached APP 1.3 by failing to maintain a clearly expressed and up-to-date privacy policy. Under the post-2024 regime, that same failure could now attract an infringement notice of up to $330,000.

The Statutory Tort for Serious Privacy Invasions

One of the most structurally significant reforms is the introduction of a private right of action for serious invasions of privacy — a statutory tort that took effect in June 2025. This creates direct litigation exposure for organisations, not merely regulatory risk.

The tort applies when a defendant either intrudes upon the plaintiff's seclusion (physical or digital) or misuses information about the individual, with "misuse" defined broadly to include collecting, using, or disclosing that information. The conduct must be intentional or reckless, the plaintiff must have had a reasonable expectation of privacy in the circumstances, and the invasion must be serious.

For class action litigation specialists, the statutory tort is particularly significant: unlike individual data breach complaints, aggregated privacy claims covering large numbers of individuals create the basis for representative proceedings. The 2022 Optus breach class action demonstrated the appetite; the statutory tort gives plaintiffs a cleaner legal pathway.

Individual Rights Expansion and the GDPR Comparison

Australia's reforms bring the country closer to GDPR alignment — but the comparison reveals where material gaps remain.

| Provision | GDPR | Australia (post-2024 reform) |
| --- | --- | --- |
| Maximum penalty | 4% of global annual turnover or €20M | $50M, or 30% of Australian turnover |
| Automated decision-making disclosure | Art. 22 (right to explanation, opt-out) | APP 1.7 (disclosure only, no individual right to contest) |
| Right to erasure | Art. 17 | Not yet legislated (under consideration in Wave 2) |
| Data portability | Art. 20 | Not yet legislated (under consideration in Wave 2) |
| Statutory tort | Not in GDPR framework | New (June 2025) |
| Small business exemption | No | Currently retained (but under review) |

The divergence on automated decision-making rights is notable. GDPR Article 22 gives individuals the right not to be subject to decisions based solely on automated processing that produce significant effects, and the right to obtain human review and explanation. Australia's APP 1.7 requires disclosure of the use of such systems but does not create a corresponding individual right to contest or seek human review of automated decisions. This is expected to be addressed in the second wave of reforms.

The Children's Online Privacy Code — which the OAIC must develop and register by 10 December 2026 — will impose specialised requirements on digital platforms accessed by children, bringing Australia closer to the UK's Age Appropriate Design Code and the US Children's Online Privacy Protection Act.

What the OAIC's Enforcement Activity Signals

The OAIC's behaviour in early 2026 is the clearest signal available about enforcement priorities. The regulator's January 2026 desktop review of government agency ADM disclosure practices explicitly stated that transparency is a key regulatory focus. The privacy compliance sweep announced in early 2026 — targeting APP entities across multiple sectors — is occurring precisely as the December 2026 ADM disclosure deadline approaches.

MinterEllison's assessment is direct: "From 10 December 2026, organisations using ADM must clearly disclose how personal information is used in automated decisions. Privacy [Commissioner] is watching." Lander & Rogers notes that the new infringement notice powers for APP 1.4 non-compliance (non-compliant privacy policies) will apply to ADM disclosure failures once APP 1.7 takes effect — meaning enforcement can happen without litigation.

The $5.8 million ACL penalty is the proof of concept. The OAIC now has the tools, the precedent, and the stated mandate to pursue organisations that fail to meet their transparency obligations.

Compliance Priorities for Technology Teams

The 10 December 2026 deadline is not distant. Organisations using any automated system that makes or substantially supports decisions affecting individual rights must act now:

  1. Inventory your automated decision-making systems. This includes AI models, rule-based engines, scoring algorithms, automated assessment tools — any computer program that uses personal information in making decisions affecting individuals.

  2. Map the personal information inputs. For each ADM system, document what personal information is used in its operation.

  3. Update your privacy policy. Draft APP 1.7-compliant disclosures that identify the system categories, decision types made solely by automation, and decision types for which automation plays a substantial and direct role.

  4. Review permissions and data governance. The OAIC's focus on ADM transparency is paired with a broader enforcement agenda. Privacy policies, data collection practices, and breach response plans should all be reviewed against the post-2024 regime.

  5. Monitor Wave 2 reforms. The right to erasure, data portability, and enhanced individual rights for automated decision-making are expected in subsequent reform tranches. Organisations building AI systems now should architect them with these anticipated obligations in mind.
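For teams starting on steps 1 to 3, a single structured record per system keeps the inventory, the input mapping, and the policy disclosure aligned. A minimal sketch — the field names and categories are illustrative, not prescribed by the Act:

```python
from dataclasses import dataclass
from enum import Enum

class AutomationRole(Enum):
    """How automation figures in the decision, mirroring the two
    disclosure categories in the new APPs."""
    SOLELY_AUTOMATED = "decision made solely by the automated system"
    SUBSTANTIAL_SUPPORT = "automation substantially and directly supports the decision"

@dataclass
class ADMSystemRecord:
    """One entry in an ADM inventory (illustrative fields only)."""
    system_name: str
    decision_types: list[str]              # e.g. "pre-approval of a loan"
    personal_information_kinds: list[str]  # e.g. "income", "credit history"
    automation_role: AutomationRole
    affects_rights_or_interests: bool      # the APP 1.7 trigger question

def needs_disclosure(record: ADMSystemRecord) -> bool:
    """A system belongs in the privacy policy if its decisions could
    reasonably be expected to significantly affect an individual's
    rights or interests."""
    return record.affects_rights_or_interests

# Hypothetical entry for a loan pre-assessment tool.
scoring = ADMSystemRecord(
    system_name="loan-pre-assessment",
    decision_types=["pre-approval of personal loan applications"],
    personal_information_kinds=["income", "credit history"],
    automation_role=AutomationRole.SUBSTANTIAL_SUPPORT,
    affects_rights_or_interests=True,
)
```

An inventory in this shape can be mechanically rendered into the three disclosure lists APP 1.7 requires, and re-reviewed whenever a system's inputs or role changes.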


Key Takeaways

  • Australia's $50 million maximum penalty for serious privacy interference is now in effect; the OAIC issued the first civil Privacy Act penalty ($5.8M against ACL) in September 2025.
  • New APP 1.7–1.9 requires all APP entities using automated decision-making systems that affect individual rights to disclose this in their privacy policy by 10 December 2026.
  • The statutory tort for serious privacy invasions (in force since June 2025) creates direct litigation exposure for organisations beyond regulatory enforcement.
  • Australia's reforms are moving towards GDPR alignment but retain key gaps, including no individual right to contest automated decisions (expected in Wave 2 reforms).
  • The OAIC has signalled active enforcement in 2026 — the ADM transparency deadline is a compliance hard date, not a guideline.

References

  1. Lander & Rogers — Australian Privacy Law Update 2026: APP 1.7 automated decision-making transparency requirements, December 2026 deadline. https://www.landers.com.au/legal-insights-news/australian-privacy-law-update-what-app-entities-need-to-know-in-2026

  2. Clyde & Co — Cyber and Privacy Law Update: Accountability Gets Real (October 2025): $5.8M ACL penalty, tiered penalty structure, new OAIC enforcement powers. https://www.clydeco.com/en/insights/2025/10/cyber-and-privacy-law-update-accountability-gets-r

  3. Secure Privacy — What the Australia Privacy Act Reforms Mean for Your Business 2025: Full breakdown of POLA Act provisions, statutory tort timeline, GDPR comparison, small business exemption status. https://secureprivacy.ai/blog/what-australia-privacy-act-reforms-mean-for-your-business-2025

  4. MinterEllison — OAIC Ramps Up Privacy Enforcement: Are You Ready? (February 2026): ADM disclosure compliance sweep, Bunnings APP 1.3 case, December 2026 enforcement context. https://www.minterellison.com/articles/oaic-ramps-up-privacy-enforcement-are-you-ready



Filed under: Privacy · Australia Privacy Act reform 2026 · automated decision-making transparency