-
Please provide an overview of the legal and regulatory framework governing data protection, privacy and cybersecurity in your jurisdiction (e.g., a summary of the key laws; who is covered; what sectors, activities or data do they regulate; and who enforces the relevant laws).
In Australia, data protection and privacy are principally regulated by the federal Privacy Act 1988 (Cth) (Privacy Act). The Privacy Act regulates the collection, use, storage and disclosure of personal information by private sector organisations (with some exceptions) and federal government agencies (but not state agencies). In particular, the Privacy Act sets out 13 Australian Privacy Principles (APPs), which impose specific obligations in respect of personal information. The Privacy Act also contains credit reporting obligations which apply to the handling of credit information about individuals by credit reporting bodies, credit providers and some other entities.
In relation to cybersecurity specifically, we refer to the answers to items 39 to 48.
The Privacy Act applies to the handling of personal information by private sector organisations generally; however, organisations with aggregate group turnover of less than AUD3 million are not subject to the Privacy Act unless they are: (i) a private sector health services provider; (ii) a business that sells or purchases personal information; (iii) a credit reporting body; or (iv) a contracted service provider for a federal government agency.
There are a range of other laws in Australia, both at the federal and state/territory level, which impact data protection. These include:
- state and territory privacy legislation, applying to personal information held by state and territory government agencies and private sector contractors to such agencies (for example, the New South Wales statute, the Privacy and Personal Information Protection Act 1998 (NSW)). State and territory regulators administer such legislation;
- in New South Wales (NSW), Victoria (Vic) and the Australian Capital Territory (ACT), specific privacy legislation relating to health information and health records, applying to health information collected, used and disclosed by public sector agencies (based in NSW, Vic or the ACT) or by a private sector organisation that is a health service provider, or that otherwise collects, holds or uses health information;
- federal law requiring telecommunications carriers and carriage service providers to capture and retain certain information about communications carried over services provided by them;
- federal and state and territory laws governing telecommunications interception and access to stored communications, the use of surveillance devices, tracking devices and listening devices, video and audio-visual monitoring of public places and workplaces and computer and data surveillance of workplaces (including home working);
- federal and state/territory freedom of information legislation, intended to promote transparency of government, which applies to information held by government agencies;
- the Spam Act 2003 (Cth) (Spam Act), which deals with the sending of unsolicited commercial electronic messages, including emails and SMS;
- the Do Not Call Register Act 2006 (Cth) (DNCR Act), regulating unsolicited commercial calling to telephone numbers listed on the national Do Not Call Register (DNCR);
- the Security of Critical Infrastructure Act 2018 (Cth) (SOCI Act) which imposes obligations on organisations operating in 11 ‘critical infrastructure sectors’ to ensure the cyber resilience of their assets;
- federal and state criminal laws dealing with unauthorised access to computer systems, including databases; and
- developing judge-made law in the form of an equitable doctrine of misuse of confidential information.
The Privacy Act is administered by the Australian Privacy Commissioner (the Commissioner), whose office is integrated within the Office of the Australian Information Commissioner (OAIC).
The Australian Communications and Media Authority (ACMA) enforces provisions of the Spam Act and the DNCR Act. It also administers a number of privacy-affecting codes in the communications sector.
-
Are there any expected changes in the data protection, privacy or cybersecurity landscape in 2025 - 2026 (e.g., new laws or regulations coming into effect, enforcement of such laws and regulations, expected regulations or amendments)?
The Commonwealth Attorney-General conducted an extensive review of the Privacy Act, culminating in the Attorney-General’s ‘Privacy Act Review Report’, which was published in February 2023. The report contains 116 proposals for reform, many of which represent a substantial reshaping of the Australian privacy landscape. In September 2023, the Federal Government released its response to the Privacy Act Review Report, in which it either ‘agreed’ or ‘agreed in-principle’ to all but 10 proposals (which were ‘noted’).
In December 2024, the Federal Parliament enacted the Privacy and Other Legislation Amendment Act 2024 (Cth) (Privacy Amendment Act). The Privacy Amendment Act implemented a number of the proposals contained in the Privacy Act Review Report.
Some of the key proposals implemented under the Privacy Amendment Act include:
- the introduction of a statutory tort for serious invasions of privacy that are intentional or reckless. Importantly, the invasion of privacy need not cause actual damage and individuals may claim damages for emotional distress;
- while not effective until December 2026, a new requirement to disclose in privacy policies where an entity undertakes automated decision making in relation to individuals. This applies when an entity has arranged for a computer program to make, or do a thing that is substantially and directly related to making, a decision using personal information, which could reasonably be expected to significantly affect the rights or interests of an individual;
- a requirement for the OAIC to develop an APP code about online privacy for children (see more at item 10);
- various measures to strengthen enforcement of the Privacy Act. In particular, new civil penalty provisions create mid-tier penalties for general interferences with privacy, with a maximum fine of AUD3.3 million for corporations, as well as infringement notices for a variety of relatively minor prescribed contraventions, including non-compliant privacy policies (see more at item 37); and
- the introduction of a range of new powers for the OAIC in relation to investigations, public inquiries and determinations.
The remaining proposals are still expected to be adopted into law at some time in the future, and include:
- The requirement to act fairly and reasonably when collecting, using and disclosing personal information (Proposal 12). The Report stresses that this requirement will be judged on an objective standard and will apply regardless of any consent.
- A broader definition of personal information (Proposals 4.1 – 4.4). The Report proposes changing the word “about” in the definition of personal information to “relates to” (that is, “information or an opinion that relates to an identified individual…”). This change would allow the definition to capture a broader range of information and would bring the Privacy Act definition in line with the language used in the GDPR definition of ‘personal data’.
- Direct right of action to enforce privacy rights (Proposal 26) for individuals who have suffered loss or damage as a result of an interference with their privacy. This would allow individuals (and representative groups) to seek compensation in the Federal Court or the Federal Circuit and Family Court of Australia.
- Tighter timeframes for Notifiable Data Breaches (Proposal 28). The Report proposes that the deadline for reporting eligible data breaches to the OAIC will be reduced to (a GDPR-familiar) 72 hours from when the entity becomes aware that there are reasonable grounds to believe that there has been an eligible data breach.
- Additional obligations when handling employee records (Proposal 7) but, importantly, not a removal of the exemption for handling employee records.
- Introduction of the concepts of processors and controllers into Australian law, to make it more akin to other jurisdictions, most notably the GDPR (Proposal 22).
- The requirement to conduct Privacy Impact Assessments (Proposal 13) for any ‘high privacy risk activity’, which would encompass activities likely to have a significant impact on the privacy of individuals.
- Regulation of targeted advertising (Proposal 20) through a prohibition on the use of information related to an individual (including personal information, de-identified information, and unidentified information (such as internet tracking history)) for targeted advertising and content to children, and prohibitions on using sensitive information for targeted advertising and content to any individuals.
- A new right of erasure that would provide individuals with the ability to request the deletion of their personal information by APP entities. This right of erasure is essentially an extension of the obligation to delete personal information once it is no longer required, and individuals will be able to exercise this right in relation to any category of personal information.
-
Are there any identifiable trends or regulatory priorities in privacy, data protection and/or cybersecurity-related enforcement activity in your jurisdiction?
The Privacy Commissioner has become considerably more active in recent years. This has come in the wake of several high-profile cyber incidents which resulted in the exfiltration of large volumes of personal information.
A notable enforcement action was the proceedings that the OAIC conducted against Australian Clinical Labs (ACL) in Australian Information Commissioner v Australian Clinical Labs Limited (No 2) [2025] FCA 1224. ACL was subject to a cyber attack leading to the disclosure of personal information relating to approximately 223,000 individuals. In an agreed settlement, ACL admitted to breaching APP 11 and to failing to meet the timeframes required under the mandatory breach notification regime. As a result, ACL agreed to pay AUD5.8 million in civil penalties, the first instance of civil penalties being imposed under the Privacy Act. The penalties available under the Privacy Act have increased since the conduct to which these proceedings relate, so it is possible that higher penalties would be imposed if the same conduct occurred today.
Proceedings in the other high-profile cyber incidents involving Optus and Medibank are ongoing and involve considerably larger data sets.
The OAIC has also focused on facial recognition technologies, leading to high-profile investigations involving hardware supplier Bunnings and department store Kmart. Those entities were using facial recognition for similar purposes: Bunnings to identify individuals who had created safety concerns in the past, and Kmart to identify individuals involved in fraudulent returns. The OAIC made negative findings against both Bunnings in 2024 and Kmart in September 2025.
Bunnings successfully challenged the finding in the Administrative Review Tribunal. The Tribunal accepted Bunnings’ argument that it was entitled to collect sensitive information without consent because the collection fell within a general permitted situation, namely that Bunnings had reason to suspect engagement in unlawful activity of a serious nature. The Tribunal considered the proportionality of collecting the information and concluded that, on balance, the collection was permitted.
It is expected that the Kmart determination will also be considered by the Administrative Review Tribunal during the course of 2026 and will likely provide further analysis on the permissibility of facial recognition technologies in the context of a different fact scenario.
The Privacy Commissioner’s active enforcement illustrated in these matters contrasts with the Commissioner’s approach historically. Prior to 2014, the Privacy Commissioner seldom exercised its power to make determinations as to an alleged breach of privacy. Between 2014 and 2023, the Commissioner made 63 determinations (with 29 determinations issued since 2021). The OAIC issued 12 determinations in the 2023–24 period, indicating a continued trend towards increased enforcement. Additionally, the passage of the Privacy Amendment Act, which introduced a tiered penalty regime for non-serious and non-repeated breaches of the Privacy Act, also indicates a trend towards more regular enforcement.
Between 2014 and 2016, the Commissioner sought to conciliate complaints between the relevant parties, with an apology to the complainant the most common remedy achieved through conciliation, followed by compensation as the next most common. Since 2017, however, compensation has been the most common remedy applied. The amount of compensation paid to an individual complainant between 2014 and 2023 has varied between AUD2,000 and AUD20,000. For instance, in January 2021, the Commissioner made findings against the Commonwealth Department of Home Affairs, holding that it had interfered with the privacy of 9,258 individuals, breaching IPP 11, and had failed to protect personal information from loss, unauthorised access, use, modification or disclosure or other misuse, breaching IPP 4. The Commissioner’s determination awarded a class of 1,297 members compensation ranging from AUD0 to more than AUD20,000. The Commissioner held that the quantum of compensation was to be calculated based on the nature of the loss and damage experienced by the individual as a result of the breach.
As at April 2026, the Commissioner has accepted 15 enforceable undertakings. An enforceable undertaking may impose a significant administrative and operational load upon the party giving the undertaking. For example, following an information security breach involving Marriott International, in February 2023, the Commissioner accepted an enforceable undertaking from Marriott International requiring it to:
- monitor the effectiveness of the privacy and security risk management strategy set by Marriott’s privacy and information security leadership and policies;
- monitor the effectiveness of Marriott’s Global Information Security & Privacy Incident Response Plan (Incident Response Plan) on no less than an annual basis and evaluate and revise if necessary;
- engage independent third parties to assess Marriott’s information security controls and audit Marriott’s security compliance with the Payment Card Industry Data Security Standard for its reservations system; and
- monitor, through engagement of security firms, for evidence of public disclosure or unauthorised use of personal information of individuals covered by the Privacy Act which was disclosed as a result of the Marriott data breach, and notify the Commissioner and affected individuals if evidence of such disclosure or use is discovered.
With regard to cybersecurity, as the Cyber Security Act 2024 (Cth) (Cyber Security Act) has only recently been implemented, there have been no instances of enforcement yet.
In relation to the SOCI Act, there is little available evidence of identifiable trends in enforcement activity, and to date we are not aware of any enforcement. The Cyber and Infrastructure Security Centre (CISC) (which is part of the Department of Home Affairs) has published its regulatory principles and approach, which outline that the Federal Government will ensure that regulation is effective and efficient through:
- improving accountability and transparency of regulator performance;
- sharing best practice;
- building regulator capability; and
- driving a culture of regulator experience.
A recent independent review into the SOCI Act has signalled that the Federal Government will start to consider enforcement action in the near future.
-
Are there any registration or licensing requirements for entities covered by these data protection and cybersecurity laws, and if so what are the requirements? Are there any exemptions? What are the implications of failing to register / obtain a licence?
There are no registration or licensing requirements under the Privacy Act for general processing of personal information.
Under the SOCI Act, responsible entities of critical infrastructure assets are required to register those assets with, and provide certain operational and ownership information to, the Register of Critical Infrastructure Assets. This is maintained by the Federal Department of Home Affairs. Failure to provide the required information to the Register can result in fines.
-
What does “personal data,” “personal information” or other equivalent terms (hereafter “personal data”) mean under data protection laws in your jurisdiction? Does the definition broadly include information about all individuals? For example, would this include individuals acting in a personal or household capacity, as well as those acting in a business or commercial capacity (such as on behalf of a business or corporate entity or employer) or otherwise?
Personal Information
Under the Privacy Act, ‘personal information’ means information or an opinion about an identified individual, or an individual who is reasonably identifiable, whether the information or opinion is true or not and whether the information or opinion is recorded in a material form or not. State and territory privacy laws use a similar definition.
Whether an individual is ‘reasonably identifiable’ from particular information will depend on considerations that include:
- the nature and amount of information;
- the circumstances of its receipt;
- who will have access to the information;
- other information either held by or available to the entity that holds the information;
- whether it is possible for the individual or entity that holds the information to identify the individual, using available resources (including other information available to that individual or entity). Where it may be possible to identify an individual using available resources, the practicality, including the time and cost involved, will be relevant to deciding whether an individual is ‘reasonably identifiable’; and
- if the information were to be publicly released, whether a reasonable member of the public who accesses that information would be able to identify the individual.
APP Entity
This broadly includes information about all individuals, even those acting in a business or commercial capacity. However, the obligations under the Privacy Act only apply to APP Entities. An APP Entity is either a federal government agency (but not a state or territory agency) or any private sector organisation (which includes individuals, companies, partnerships or otherwise) that has an annual turnover greater than AUD3 million and that has an Australian link (see below). In addition, an organisation with annual turnover lower than AUD3 million will be an APP Entity if it: (a) provides health services and holds health information; (b) exchanges personal information for a benefit, service or advantage; (c) provides services to a federal government agency (either directly or as a subcontractor); or (d) is a credit reporting body.
-
Are certain types of personal data considered more sensitive or highly regulated under data protection laws in your jurisdiction? Please include the relevant defined terms for such data (e.g., “special categories of personal data,” “sensitive data” or “sensitive personal information”).
‘Sensitive information’ is defined as information or an opinion about an individual’s:
- racial or ethnic origin;
- political opinions;
- membership of a political association;
- religious beliefs or affiliations;
- philosophical beliefs;
- membership of a professional or trade association;
- membership of a trade union;
- sexual orientation or practices; or
- criminal record,
that is also personal information; or
- health information about an individual;
- genetic information about an individual that is not otherwise health information;
- biometric information that is to be used for the purpose of automated biometric verification or biometric identification; or
- biometric templates.
There are a small number of additional obligations imposed in relation to sensitive information. These are discussed at item 9.
-
What principles apply to the processing of personal data in your jurisdiction? For example: is it necessary to establish a “legal basis” for processing personal data?; are there specific transparency requirements?; must personal data only be kept for a certain period? Please provide details of such principles.
The following are key Australian Privacy Principles:
Transparency
Under APP 1, APP Entities are required to manage personal information in an open and transparent way and must take reasonable steps to implement practices, procedures and systems to comply with the Privacy Act. This includes an obligation to have a clearly expressed and up-to-date privacy policy available to the public free of charge and in an appropriate form. Practices and processes must also reflect the stated privacy policy: the Commissioner has interpreted APP 1 as requiring implementation of ‘privacy by design’ into an APP Entity’s business practices.
APP 5 requires an APP Entity that collects personal information about an individual to take reasonable steps, at or before the time of collection, or as soon as practicable afterwards, either to notify the individual of certain matters or to ensure the individual is aware of those matters. APP 5.2 lists the matters that must be notified to an individual or of which they must be made aware.
The requirement to notify or ensure awareness of the APP 5 matters applies to all personal information collected about an individual, either directly from the individual or from a third party.
Lawful basis for processing
The Privacy Act governs the collection, holding, use, disclosure, access and correction of personal information by APP Entities. The Privacy Act prohibits an organisation from collecting personal information (which is not sensitive information) unless the information is reasonably necessary for one or more of the organisation’s functions or activities. As noted in item 2, a general fairness and reasonableness requirement in respect of processing has been proposed, but not yet implemented.
Where an organisation is collecting sensitive information, as with ordinary categories of personal information, it must be reasonably necessary for one or more of the organisation’s functions or activities, but it must also obtain the relevant individual’s consent to the collection of their sensitive information (unless an exception applies).
The state and territory privacy legislation apply analogous concepts in relation to entities regulated by those Acts.
Purpose limitation
In accordance with APP 6, an APP Entity can only use or disclose personal information for the particular purpose for which it was collected (known as the ‘primary purpose’), or for a ‘secondary purpose’ if an exception applies.
Use or disclosure of personal information for a ‘secondary purpose’ is permitted under specific exceptions where that secondary use or disclosure is:
- consented to by the individual;
- one in respect of which the individual would reasonably expect the APP Entity to use or disclose their personal information, and that purpose is related to the primary purpose of collection, or, in the case of sensitive information, directly related to the primary purpose;
- required or authorised by or under an Australian law or a court or tribunal order;
- necessary to lessen or prevent a serious threat to any individual’s life, health or safety, or to public health or safety, and it is unreasonable or impracticable to obtain the consent of the individual;
- necessary in order for an organisation to take appropriate action in relation to a reasonable suspicion of unlawful activity, or misconduct of a serious nature, which relates to the APP Entity’s functions or activities. APP 6.2(e) also permits the use or disclosure of personal information for a secondary purpose to an enforcement body for one or more enforcement related activities;
- in the conduct of surveillance activities, intelligence gathering activities or monitoring activities, by a law enforcement agency;
- in the conduct of protective (for example, in relation to children) or custodial activities;
- to assist any APP Entity, body or person to locate a person who has been reported as missing (where the entity reasonably believes that this use or disclosure is reasonably necessary, and where that use or disclosure complies with rules made by the Commissioner);
- for the establishment, exercise or defence of a legal or equitable claim; or
- for the purposes of a confidential alternative dispute resolution process.
Data minimisation
As mentioned above, under APP 3, an organisation must not collect personal information unless the information is reasonably necessary for one or more of the entity’s functions or activities. In the case of sensitive information, it must also have the individual’s consent.
Integrity
Under APP 10, APP Entities are required to take reasonable steps to ensure that the personal information they use or disclose is accurate, up-to-date, complete and relevant.
Retention
In accordance with APP 11.2, where an APP Entity holds personal information about an individual which is no longer needed for any purpose for which the information may be used or disclosed, then the APP Entity must take such steps as are reasonable in the circumstances to destroy or de-identify the information.
APPs 4.3 and 11.2 require the destruction or de-identification of personal information in certain circumstances. Where the information is contained in a Commonwealth (federal) record (which is the property of the Commonwealth) or is required to be retained under Australian law or by a court or tribunal, the information must be retained. For example, financial records must be retained under the Corporations Act 2001 (Cth) for seven years.
Collection by lawful and fair means
An APP Entity must collect personal information only by “lawful and fair means” (APP 3.5). This requirement applies to all APP Entities. Examples of where a collection of personal information may be unfair (some may also be unlawful) include collecting from an electronic device which is lost or left unattended, collecting from an individual who is traumatised, in a state of shock or intoxicated, collecting in a way that disrespects cultural differences or after misrepresenting the purpose or effect of collection, or the consequences for the individual of not providing the requested information.
Collecting directly from the individual
APP 3.6 provides that an APP Entity “must collect personal information about an individual only from the individual”, unless one of the following exceptions applies:
- for all APP Entities, it is unreasonable or impracticable for the entity to collect personal information only from the individual;
- for federal government agencies, the individual consents to the personal information being collected from someone other than the individual; and
- for federal government agencies, the agency is required or authorised by or under an Australian law, or a court or tribunal order, to collect the information from someone other than the individual.
Cross-border disclosure of personal information
Before an APP Entity discloses personal information to an overseas recipient, the entity must take reasonable steps to ensure that the overseas recipient does not breach the APPs in relation to the information (APP 8.1). This is usually achieved by the APP Entity imposing contractual obligations on the overseas recipient to comply with the Privacy Act (or relevant aspects).
An APP Entity that discloses personal information to an overseas recipient is accountable for any acts or practices of the overseas recipient in relation to the information that would breach the APPs (section 16C).
There are exceptions to the requirement in APP 8.1 to take reasonable steps and to the accountability provision in section 16C. These include obtaining the consent of the relevant individual to the overseas disclosure (after an express statement informing the individual that APP 8 will not apply), or where the APP Entity reasonably believes that the recipient is subject to an equivalent regime in its local jurisdiction and that there are mechanisms that the individual can access to take action to enforce that regime.
Security of personal information
APP 11 requires an APP Entity to take active measures to ensure the security of personal information it holds, and to actively consider whether it is permitted to retain personal information. An APP Entity that holds personal information must take reasonable steps to protect the information from misuse, interference and loss, as well as unauthorised access, modification or disclosure (APP 11.1). Unauthorised access includes both access by an employee of the entity or independent contractor and unauthorised access by an external third party (such as by hacking).
Reasonable steps should include, where relevant, taking steps and implementing strategies in relation to governance, culture and training, internal practices, procedures and systems, ICT security, access security, third party providers (including cloud computing), data breaches, physical security, destruction and de-identification, and compliance with applicable standards.
-
Are there any circumstances for which consent is required or typically obtained in connection with the processing of personal data? What are the rules relating to the form, content and administration of such consent? For instance, can consent be implied, incorporated into a broader document (such as a terms of service) or bundled with other matters (such as consents for multiple processing operations)?
There is no general requirement to obtain consent for the collection of most types of personal information (excluding sensitive information), or for its processing. However, consent may operate as an exception to certain prohibitions under the Privacy Act or a qualification to certain obligations.
Under APP 3.3, APP Entities are prohibited from collecting sensitive information (defined above) unless the consent of the relevant individual has been obtained. Some narrow exceptions apply.
Under APP 3.6 there is a general expectation that personal information will be collected from the individual to which it relates. However, an exception applies which permits government agencies to collect personal information from another source if the individual has given consent.
APP 6 requires personal information to only be used or disclosed for the purpose for which it was collected. However, there are some exceptions to this, one being where the individual has consented to its use or disclosure for another secondary purpose.
Under APP 7, direct marketing is prohibited unless an exception applies, one such exception being where the organisation has obtained the individual’s consent. Further, consent is the only circumstance in which sensitive information can be used for the purpose of direct marketing. The Spam Act also prohibits commercial electronic communications without consent (which may be inferred) and requires entities to allow individuals to easily withdraw consent (or “unsubscribe”).
APP 8, which generally restricts the offshore disclosure of personal information, allows it to occur where there is consent of the relevant individual (but note this consent has separate requirements under the Privacy Act).
Where consent is applicable under the Privacy Act, it may be express or implied. Express consent can be provided orally or in writing, although best practice requires written consent (which can be electronic). Implied consent is consent which can reasonably be inferred from the circumstances and the individual’s conduct. The Privacy Commissioner’s guidance suggests that consent can only be implied in clear circumstances. For example, it will not be sufficient to merely establish that the collection, use or disclosure will be advantageous to the individual or that they did not object at the time of collection. The Privacy Commissioner also advises against the use of opt-out mechanisms.
The Privacy Commissioner has identified four elements of consent in its guidance:
- (informed) the individual must be adequately informed of the implications of providing or withholding consent.
- (voluntary) the individual must have a genuine opportunity to provide or withhold consent. This may require an assessment of the alternatives available to the individual if they do not consent and the seriousness of the consequences. Bundling consent may also be problematic for assessing whether consent is voluntary, as the broader consequences of a refusal need to be considered.
- (current and specific) consent for collection and proposed uses/disclosures should typically be obtained at the time personal information is collected. If such consent is being sought later, it should be sought at the time of the proposed use or disclosure requiring consent. Consent should be as specific as possible, and not a broad consent for various activities.
- (capacity) APP Entities need to consider whether the individual has capacity to give consent. Ordinarily, this can be presumed, however, the following factors may indicate that further inquiries are required: age, physical or mental disability, temporary incapacity or limited understanding of English.
One of the reforms currently being considered as part of a broader review of the Privacy Act (see item 2 for further details) is to incorporate these elements of consent into the legislation, however, at the time of publication, these have not yet been enacted into law.
-
What special requirements, if any, are required for processing particular categories of personal data (e.g., health data, children’s data, special category or sensitive personal data, etc.)? Are there any prohibitions on specific categories of personal data that may be collected, disclosed, or otherwise processed?
There are a number of instances where the APPs impose additional restrictions in respect of sensitive information. Item 6 describes the personal information which is considered ‘sensitive’ under the Privacy Act.
APP 3 provides that an individual’s consent is always required for the collection of sensitive information under the Privacy Act (unless an exception applies). This differs from other types of personal information, for which consent is not strictly required.
APP 6 addresses the purpose for which information may be used or disclosed. It provides that where consent has not been obtained, sensitive information can only be used for a secondary purpose if that secondary purpose is directly related to the primary purpose (that is, the purpose for which it was collected). This contrasts with other personal information which can be used for any secondary purpose which is related to the primary purpose. The term “directly related” is not expressly defined, however, it is likely to require a secondary purpose which is closely related to the primary purpose.
Under APP 7, the only circumstance in which sensitive information can be used for the purpose of direct marketing is where the individual has consented. This is narrower than the circumstances in which other types of personal information can be used in this way (for example, where the individual would reasonably expect the organisation to use the non-sensitive personal information for that purpose and the organisation provides a simple means to request that it cease marketing).
In addition, there are special rules which apply to recipients of tax file numbers.
There are no categories of personal information that are prohibited from being collected under the Privacy Act.
Some individual states also have specific laws relating to health data, for example the Health Records and Information Privacy Act 2002 (NSW) and the Health Records Act 2001 (Vic). These laws apply to health services providers (in the private sector as well as applicable state government entities) operating in the respective states and oblige such providers to comply with “health privacy principles” enumerated in the legislation. In each applicable state, these principles are similar to, although not the same as, the APPs.
-
Do the data protection laws in your jurisdiction have special or particular requirements, restrictions, or rules regarding the collection, use, disclosure or processing of personal information from or about children or minors? If so, what is the age threshold and key requirements/restrictions that go beyond those applicable, generally?
The Privacy Act itself does not contain additional or special obligations relating to the use or disclosure of children’s personal information.
That said, the provisions relating to consent are likely to require an assessment of the relevant capacity of the individual, and guardian consent to be obtained where necessary. The Privacy Commissioner has provided guidance that where an individual is under the age of 18, an assessment of their capacity is required. If it is not practical to assess the capacity of an individual, the Privacy Commissioner has said that, as a general rule, APP Entities should assume that an individual over the age of 15 has capacity, unless there is evidence to the contrary.
Further, as of April 2026, the Privacy Commissioner is conducting a public consultation regarding the introduction of a Children’s Online Privacy Code (Code). For this purpose, the OAIC has published an exposure draft of the Code. Key features of the draft Code are set out below:
- The Code applies to entities operating online services such as apps, games and websites which are likely to be accessed by children, or which are primarily concerned with the activities of children. Entities are required to reasonably verify the age of users.
- Entities must only collect personal information about a child which is strictly necessary for the provision of the online service.
- The collection of personal information about a child must be consistent with the best interests of the child. The entity must not use or disclose personal information about a child unless that use or disclosure both: (i) has been consented to by the child; and (ii) is in the best interests of the child.
- Consent can be given by a child over the age of 15 or by a person with parental responsibility for the child. The Code sets out information that must be provided by an entity seeking such consent. Consent must be completely voluntary and fully informed.
- The Code contains further requirements and standards of transparency that an entity must meet with respect to its privacy policy.
- Entities must respond to requests regarding access to or correction of personal information as soon as reasonably possible (and no later than 30 days). There may be circumstances where complex requests can be responded to within 60 days.
- The Code introduces a ‘right to be forgotten’, requiring entities to permanently delete information about a child if requested. This is subject to certain exceptions, including where the information relates to existing or expected legal proceedings.
- Entities must ensure their online services notify the child if the service provides another individual with information about the child’s location.
- The Code also contains requirements relating to the receipt and processing of inquiries and complaints.
- An entity must conduct a privacy impact assessment before providing a new service to which the Code applies or when it proposes to change an existing service in a way that will significantly impact the privacy of children.
Consultation on the Code is ongoing; it is not yet law.
-
Do the data protection laws in your jurisdiction include any derogations, exemptions, exclusions or limitations other than those already described? If so, please describe the relevant provisions.
Small Business Operators
As described above, an organisation will not be subject to the obligations under the Privacy Act if its annual turnover is less than AUD3 million. This exception does not apply where the organisation:
- provides health services and holds any health information;
- exchanges personal information for a benefit, service or advantage;
- is a contracted service provider to a federal government agency (whether directly or under a sub-contract); or
- is a credit reporting body.
Note that this exemption will likely be the subject of reform, as described in item 2.
Australian Link
The obligations under the Privacy Act apply only to entities that have an ‘Australian link’. The APPs have extra-territorial application and extend to acts done, or practices engaged in, outside Australia by an organisation, but only where that organisation has an ‘Australian link’ (s 5B(1A)).
An organisation has an Australian link if the organisation is:
- an Australian citizen or a person whose continued presence in Australia is not subject to a legal time limitation;
- a partnership formed, or a trust created, in Australia or its external Territories;
- a body corporate incorporated in Australia or its external Territories; or
- an unincorporated association that has its central management and control in Australia or an external Territory.
An organisation that does not fall within one of those categories will also have an Australian link where it carries on business in Australia or its external Territories.
For clarity, external territories of Australia are the islands and other areas controlled by the Australian government.
Employee Records
Acts or practices of a private sector employer which are directly related to: (i) a current or former employment relationship between the employer and an individual; and (ii) a record of personal information relating to the individual’s employment, are exempt from the obligations of the Privacy Act. This exemption does not apply to contractors or unsuccessful job applicants.
This exemption will likely be the subject of reform, as described in item 2.
Journalism
APP Entities engaged in journalism are exempt from the Privacy Act provided they observe alternative standards which address privacy and have been published by an organisation representing a class of media organisations.
Journalists will be exempt from liability for the new serious invasion of privacy tort to the extent that the invasion of privacy involves the collection, preparation for publication or publication of journalistic material.
State Governments and their service providers
State governments have adopted their own privacy legislation and are not subject to the Privacy Act. Acts undertaken by (sub)contractors to state governments pursuant to such service contracts are also exempt from the Privacy Act (although such acts will likely be subject to the relevant state legislation).
Political parties
Generally speaking, political parties are exempt from the obligations in the Privacy Act. They are also exempt from the application of the Spam Act.
Individuals in a non-business capacity
An individual who may constitute an APP Entity because of their business affairs (such as a sole trader) will not be subject to Privacy Act obligations in respect of their personal affairs.
-
Does your jurisdiction require or recommend privacy risk or impact assessments in connection with personal data processing activities and, if so, under what circumstances? How are these assessments typically carried out?
At present, only federal government agencies are subject to an express obligation under the Privacy (Australian Government Agencies – Governance) APP Code 2017 (Agency Code) to undertake privacy impact assessments. They must do this for all ‘high privacy-risk projects’. A project may be a high privacy-risk project if the agency reasonably considers that the project involves new or changed ways of handling personal information that are likely to have a significant impact on the privacy of individuals. Agencies are also required to carry out privacy impact assessments where directed to do so by the Privacy Commissioner under section 33D of the Privacy Act.
Private sector organisations do not currently have an express obligation to conduct a privacy impact assessment; however, many choose to do so in order to address the obligation under APP 1 to take reasonable steps to implement practices, procedures and systems to ensure compliance with the APPs. Completion of a privacy impact assessment can be used to establish that such reasonable steps have been taken, as well as to identify other measures to be implemented.
The Privacy Commissioner sets out a 10-step process for conducting a privacy impact assessment. These steps are:
- conduct a threshold assessment;
- plan the PIA;
- describe the project;
- identify and consult with stakeholders;
- map information flows;
- prepare a privacy impact analysis and compliance check;
- privacy management — addressing risks;
- consider and prepare recommendations;
- produce a report; and
- respond and review.
One of the reforms proposed by the Attorney-General’s Privacy Act Review Report in February 2023 (see item 2 above) was the introduction of a requirement for all APP entities to conduct privacy impact assessments for activities likely to have a significant impact on the privacy of individuals.
Further, there is a proposal in the exposure draft of the Children’s Online Privacy Code to make privacy impact assessments mandatory for APP entities offering online services which are likely to be used by children or which are primarily concerned with the activities of children.
-
Are there any specific codes of practice, or self-regulatory codes applicable in your jurisdiction regarding the processing of personal data (e.g., codes of practice for processing children’s data or health data)?
Under Part IIIB of the Privacy Act, the Information Commissioner can approve and register enforceable codes. The purpose of such codes is to provide individuals with transparency about how their information will be handled. A code cannot lessen the privacy rights of an individual provided for in the Privacy Act.
The following codes are in place:
- Privacy (Credit Reporting) Code 2024 – which establishes the obligations and procedures for credit reporting bodies and credit providers regarding the management, collection and disclosure of credit-related personal information;
- Privacy (Market and Social Research) Code 2021 – which was developed by the Association of Market and Social Research Organisations (AMSRO) to provide industry specific obligations for the research, data and insights sector; and
- Privacy (Australian Government Agencies – Governance) APP Code 2017 – which sets out specific requirements that agencies must comply with as part of their compliance with APP 1.2.
The Privacy Amendment Act introduced a mandate for the OAIC to develop a Children’s Online Privacy Code (Children’s Code) which will specify how online services accessed by children must comply with the APPs. The Children’s Code is intended to be in place by 10 December 2026. As of April 2026, the OAIC is conducting a public consultation on an exposure draft of the proposed Children’s Online Privacy Code. This is discussed in the answer to item 10, and the consultation is open for public comment until 5 June 2026.
-
Are organisations required to maintain any records of their data processing activities or establish internal processes or written documentation? If so, please describe how businesses typically meet such requirement(s).
APP 1 expressly requires entities to maintain an up-to-date privacy policy which documents the personal information they collect and how they use and disclose it. Beyond this, there are no express record keeping obligations. However, the broader requirements of APP 1 with respect to the implementation of practices, procedures and systems to ensure compliance with the APPs may necessitate internal compliance policies and processes, including to understand what data is collected, where it is stored, how long it is to be retained, who can access it and what risks it is exposed to. This is often managed by establishing internal privacy and data use and retention policies, requirements for privacy or data impact assessments and data mapping and classification exercises, including with assistance from cyber security teams.
-
Do the data protection laws in your jurisdiction specifically impose data retention limitations? If so, please describe such requirement(s).
There are no prescriptive requirements in this regard. Under APP 11.2, APP Entities which hold Personal Information which is no longer needed and is not required by law to be retained must take such steps as are reasonable to destroy it or ensure it is de-identified. There are no express timeframes specified for disposal.
The obligation under APP 11.2 is limited to the taking of reasonable steps. What is reasonable will depend on the circumstances, and the following factors may be relevant:
- the amount and sensitivity of the information;
- the nature of the organisation;
- the possible adverse consequences to individuals if the information is mis-handled;
- the organisation’s information handling practices (such as where handling is outsourced); and
- the time and cost involved in complying.
Certain types of information, such as Tax File Numbers and some health information, have further retention and deletion requirements. There are also industry specific requirements, for example in the telecommunications and banking sectors.
-
Under what circumstances is it required or recommended to consult with the applicable data protection regulator(s)?
The Privacy Commissioner typically acts in response to complaints from individuals or self-reporting from organisations, including under the mandatory data breach notification regime. There is no formal process for private entities to seek pre-screening or consultation with the Privacy Commissioner in the way there may be in other jurisdictions or with other regulators in Australia.
The situation is similar for federal agencies, although they may be required to provide their privacy impact assessments to the Privacy Commissioner.
-
Do the data protection laws in your jurisdiction require the appointment of a data protection officer, chief information security officer, or other person responsible for data protection? If so, what are their legal responsibilities?
Only federal government agencies are required to have a privacy officer in connection with their privacy obligations, pursuant to the Agency Code applicable to them. Larger organisations sometimes appoint a privacy officer despite not having a strict obligation to do so.
The Privacy Officer functions required under the Agency Code include:
- providing privacy advice internally, including on:
- the development of new initiatives that have a potential privacy impact;
- the general application of privacy law to the agency’s activities;
- what to consider when deciding whether or not to carry out a Privacy Impact Assessment; and
- what safeguards to apply to mitigate any risks to the privacy of individuals;
- liaising with the OAIC;
- co-ordinating the handling of internal and external privacy enquiries, privacy complaints, and requests for access to, and correction of, personal information;
- maintaining a record of the agency’s personal information holdings;
- assisting with the preparation of Privacy Impact Assessments; and
- measuring and documenting the agency’s performance against its privacy management plan.
-
Do the data protection laws in your jurisdiction require or recommend employee training related to data protection? If so, please describe such training requirement(s) or recommendation(s).
There are no express requirements, although many organisations choose to provide training in order to address the obligation under APP 1 to take reasonable steps to implement practices, procedures and systems to ensure compliance with the APPs. Employee training may also be a step that organisations take to protect Personal Information from misuse, interference and loss and thereby discharge their obligation under APP 11. The Privacy Commissioner provides a number of training resources on its website.
Further, many regulators consider training in respect of cyber security to form part of the general risk management obligations under various laws.
-
Do the data protection laws in your jurisdiction require controllers to provide notice to data subjects of their processing activities? If so, please describe such notice requirement(s) (e.g., posting an online privacy notice).
APP 1 requires organisations to have a Privacy Policy setting out the types of information the organisation collects, and how it is used or disclosed. Privacy policies are required to be clearly expressed and made available free of charge. Organisations typically publish their privacy policies on their public-facing website.
A privacy policy must contain:
- the kinds of information that the entity collects and holds;
- how it collects and holds information;
- the purposes for which the entity collects, holds, uses and discloses information;
- how an individual may access their information and seek correction;
- how an individual can make a complaint, and how the organisation will deal with the complaint; and
- whether the organisation will disclose the personal information to an overseas recipient, and if so the countries in which such recipients are likely to be located.
Further, APP 5 requires organisations to take reasonable steps to provide a collection notice to individuals at or before the time they are collecting personal information (or as soon as practicable after). This notice must include:
- the identity and contact details of the organisation;
- the fact that information is being collected (this is particularly important where the collection is from a third party, or the collection is not obvious);
- where the collection is required by law, that fact and the details of the law requiring collection;
- the purpose for which the information is being collected;
- the main consequences for the individual if the information is not collected;
- any other entities to which the information will be provided;
- references to the organisation’s privacy policy (including a hyperlink if possible); and
- whether the organisation will disclose the personal information to an overseas recipient, and if so the countries in which such recipients are likely to be located.
The obligation under APP 5 is one of reasonable steps only. What is reasonable will depend on the circumstances.
-
Do the data protection laws in your jurisdiction distinguish between the responsibilities of “controllers” and those of “processors” (or equivalent terms) of personal data? If so, how are such terms defined and what are the key distinctions between the obligations of controllers and processors (or equivalent terms)?
Terms such as ‘controller’, ‘owner’ and ‘processor’ are not used in the Privacy Act or the state and territory privacy acts. Organisations and federal government agencies that collect, use or disclose personal information are called ‘APP Entities’ and are regulated in relation to those activities; they must comply with the Privacy Act and the APPs contained in it.
In practice, an important and often difficult distinction is between APP Entities that collect, use or disclose personal information and organisations that, as service providers to those APP Entities, may handle personal information on their behalf: for example, operators of data warehouses or data centres and cloud as-a-service providers.
Where personal information is entrusted by an APP Entity that collects that personal information to another party for storage and processing, the Commissioner looks to whether the second party has ‘control’ of that information. If the second party can fully access and edit that information, the provision of that personal information to the second party is a ‘disclosure’ subject to relevant notice and consent requirements and the second party is an entity that ‘collects’ this information. However, the Commissioner has expressed the view that in limited circumstances, an APP Entity might retain such a degree of control over the information that the APP Entity is considered to be ‘using’ that information and not disclosing the information to the second party. For example, where an APP Entity provides personal information to a cloud service provider located overseas, this may be a ‘use’ if the information is provided for the limited purpose of performing the services of storing and ensuring the APP Entity may access the personal information, and a binding contract between the parties:
- requires the provider only to handle the personal information for these limited purposes;
- requires any subcontractors to agree to the same obligations; and
- gives the entity effective control of how the personal information is handled by the provider. Issues to consider include whether the entity retains the right or power to access, change or retrieve the personal information, who else will be able to access the personal information and for what purposes, what type of security measures will be used for the storage and management of the personal information and whether the personal information can be retrieved or permanently deleted by the entity when no longer required or at the end of the contract.
Whether or not other examples are considered a ‘use’ or a ‘disclosure’ will depend on the circumstances of each individual case, having regard to the degree of control held by the APP Entity.
-
Please describe any restrictions on monitoring, automated decision-making or profiling in your jurisdiction, including through the use of tracking technologies such as cookies. How are these or any similar terms defined?
Online monitoring and profiling are not explicitly addressed under Australian law. To the extent that the use of cookies involves the collection, use, disclosure or transfer of personal information, the APPs will apply. The concept of ‘collection’ of personal information applies broadly, and includes information associated with web browsing, such as personal information collected by cookies. Consequently, collection of personal information using cookies can occur provided that the notice and consent requirements are followed.
While not effective until 10 December 2026, the Privacy Amendment Act introduced new obligations in relation to automated decision-making. Under the reforms, a privacy policy must specify the kinds of Personal Information used and the kinds of decisions made if a computer program uses Personal Information to make a decision (or something substantially and directly related to making a decision) and the decision could reasonably be expected to significantly affect rights and interests of individuals.
-
Do the laws in your jurisdiction include specific rules, requirement or regulator guidance regarding the use of cookies, pixels, online tracking and/or targeted advertising? Please describe any restrictions on targeted advertising and/or cross context behavioral advertising. How are these terms or any similar terms defined?
APP 7 provides that an organisation must not use or disclose personal information it holds for the purpose of direct marketing unless an exception applies. Exceptions include where an individual would reasonably expect an organisation to use or disclose personal information for direct marketing, or where the individual has consented. Where such an exception applies, then direct marketing is permitted (and the practice is widespread in Australia).
Although not expressly defined in the Privacy Act currently (albeit the Privacy Act Review Report proposes to do so), direct marketing includes the use or disclosure of personal information to communicate directly with an individual to promote goods and services and may include targeted advertising depending on the specific context and how targeted the advertising is.
Where an organisation is permitted to use or disclose personal information for the purpose of direct marketing, it must always allow an individual to request not to receive direct marketing communications (also known as ‘opting out’) and comply with that request.
The Attorney-General’s Privacy Act Review Report in February 2023 (see item 2) has proposed reform in this area. As well as the specific proposals on targeted advertising in the Report, of note in this context is the proposed broadening of the definition of ‘personal information’ to specifically recognise that it includes technical and inferred information, such as IP addresses and device identifiers.
-
Do the data protection laws in your jurisdiction specifically restrict or regulate the “sale” of personal data and/or “data brokers”? How is “sale” and/or “data broker” or (similar/related terms) defined?
There are no provisions in the Privacy Act expressly controlling or prohibiting the sale or other trading of Personal Information. That said, any such sale must comply with the APPs and other obligations under the Privacy Act. For example, a sale would constitute a disclosure of Personal Information, so the requirements of APP 6 must be met.
Further, an organisation will be an APP entity (i.e., subject to the requirements of the Privacy Act) if it is in the business of selling personal information. This is so even if the organisation would otherwise be exempt as a small business.
-
Do the data protection laws in your jurisdiction specifically regulate or restrict marketing and electronic communications, including telemarketing/telephone solicitations and ‘robocalls’, email marketing, SMS/text messaging or other direct marketing? Please provide an overview.
Electronic marketing is partly regulated through subject matter-specific federal laws such as the Spam Act, which governs most forms of electronic marketing, and the DNCR Act, which regulates unsolicited telemarketing calls.
The Spam Act prohibits ‘unsolicited commercial electronic messages’ with an ‘Australian link’ from being sent or caused to be sent. Commercial electronic messages may only be sent with an individual’s consent (express or inferred in certain circumstances) and must contain accurate sender identification and a functional unsubscribe facility. The burden of proving consent lies with the sender of the message. The Spam Act is actively enforced by the Australian Communications and Media Authority. In April 2026, the personal finance provider Latitude Finance was fined AUD3.96 million for Spam Act breaches during 2024 and 2025, including a failure to provide accurate contact information or a functional unsubscribe mechanism.
Voice calls, including synthetic or recorded calls (such as robocalls), are separately regulated under a ‘do not call’ framework established by the DNCR Act and associated legislation and instruments, including the Telecommunications Act 1997 (Cth) (Telecommunications Act), under which individuals may complain about potential breaches of the Spam Act and the DNCR Act, and the Telecommunications (Do Not Call Register) (Telemarketing and Research Calls) Industry Standard 2007. Marketing faxes are also regulated. A telemarketing call or marketing fax is broadly defined as a voice call or fax made to a number to offer, supply, provide, advertise or solicit goods or services, land or an interest in land, a business or investment opportunity, or donations. Certain calls are not considered to be telemarketing or fax marketing, including calls relating to product recalls, fault verification, appointment rescheduling, appointment reminders and payments, and solicited calls/faxes about orders, requests or customer enquiries.
The DNCR Act provides an ‘opt-out’ option, allowing Australians who do not wish to receive telemarketing calls or marketing faxes to list their private-use fixed and mobile telephone numbers and fax numbers on the DNCR. As of June 2023, total DNCR registrations were around 12.5 million. Telemarketers and fax marketers submitted approximately 478 million numbers for checking (or ‘washing’) against the DNCR during the 2022-23 financial year.
Unsolicited telemarketing calls or faxes must not be made to an Australian number registered on the DNCR without the consent (implied or express) of the relevant account holder or their nominee.
-
Do the data protection laws in your jurisdiction regulate, restrict or impose specific obligations on the processing of biometric data, such as facial recognition? If so, how are the relevant terms defined? Are these obligations focused on the collection, use and processing of unique biometric ‘identifiers’ (rather than any sort of biometric measurements)?
The definition of ‘sensitive information’ includes: (i) biometric information that is to be used for the purpose of automated biometric verification or biometric identification; and (ii) biometric templates. The obligations applicable to sensitive information will equally apply to biometric information. These obligations are set out in the answer to item 10 above.
As noted in item 3, the collection and use of facial recognition technology has been a particular focus of the Privacy Commissioner in recent enforcement activities.
-
Are there any data protection laws in your jurisdiction that specifically address or apply to artificial intelligence or machine learning (“AI”)? If so, do these laws specifically apply to the processing of personal information related to AI, or more broadly?
The implementation and use of AI systems is not currently directly regulated in Australia. There have been a number of consultations held by government concerning proposals for the regulation of AI, although none have resulted in mandatory requirements.
The Privacy Commissioner has published two sets of non-binding guidance setting out the application of the APPs in the context of the development and use of AI. These publications are designed to provide practical guidance for APP entities on complying with (and the Commissioner’s interpretation of) existing privacy obligations when both developing AI models and when using commercially available AI-enabled tools, while also suggesting several matters of best privacy practice.
More generally, in October 2025, the Commonwealth Department of Industry, Science and Resources published the Guidance for AI Adoption, setting out six essential practices for responsible AI governance and adoption. The guidance is divided into two versions, one for organisations in their initial adoption of AI and a second for governance professionals and technical experts. Compliance with the guidance is voluntary.
Other regulators have also published guidance relevant to specific industries.
-
Are there any data localization requirements in your jurisdiction? In other words, are there any circumstances where some or all personal data is required to be stored locally, or prohibited from being transferred to or stored in certain jurisdictions?
There are no general requirements for government agencies or private sector entities to store data within Australia. Many organisations elect to do so for policy and risk management purposes, but provided a transfer to another jurisdiction complies with the requirements of the Privacy Act (discussed in item 29), local storage is not mandatory.
-
Is the transfer of personal data outside your jurisdiction restricted, under certain circumstances? If so, please describe these restrictions and how businesses typically comply with them (e.g., does a cross-border transfer of personal data require a specified mechanism or notification to or authorization from a regulator?)
Before an APP Entity discloses personal information to an overseas recipient (which may include a service provider that would ordinarily be considered a processor in other jurisdictions), the entity must take reasonable steps to ensure that the overseas recipient does not breach the APPs in relation to the information (APP 8.1). Reasonable steps typically require the APP Entity to enter into an enforceable contract with the overseas recipient which includes obligations consistent with the APPs. Alternatively, disclosure is also permissible where the APP Entity reasonably believes that the overseas entity is subject to laws or a binding scheme which is at least substantially similar to the APPs and the individual has mechanisms available to them to enforce that protection.
An APP Entity that discloses personal information to an overseas recipient is generally accountable for any acts or practices of the overseas recipient in relation to the information that would breach the APPs (section 16C).
However, there are exceptions to the requirement in APP 8.1 to take reasonable steps and to the accountability provision in section 16C. These include obtaining the consent of the relevant individual to the overseas disclosure (after an express statement informing the individual that APP 8 will not apply), or where the APP Entity reasonably believes that the recipient is subject to an equivalent privacy regime in its local jurisdiction and that there are mechanisms that the individual can access to take action to enforce that regime.
The Privacy Amendment Act introduced a new mechanism for a country or binding scheme to be officially recognised, by way of regulation, as providing substantially similar protection to the APPs (for the purposes of the exceptions in APP 8.2) – similar in effect to the countries considered to have an ‘adequate’ level of data protection under the GDPR regime. This is intended to remove the burden on APP Entities to undertake an assessment of a country’s privacy laws and will remove the difficulty in determining equivalence. As yet, there have been no countries or binding schemes prescribed in the regulations.
-
What personal data security obligations are imposed by the data protection laws in your jurisdiction?
APP 11 requires an APP Entity to take active measures to ensure the security of personal information it holds, and to actively consider whether it is permitted to retain personal information. An APP Entity that holds personal information must take reasonable steps to protect the information from misuse, interference and loss, as well as unauthorised access, modification or disclosure (APP 11.1). Unauthorised access includes access by an employee or independent contractor of the entity as well as access by an external third party (such as by hacking).
Reasonable steps should include, where relevant, taking steps and implementing strategies in relation to: governance, culture and training; internal practices, procedures and systems; ICT security; access security; third party providers (including cloud computing); data breaches; physical security; destruction and de-identification; and compliance with applicable standards.
The Commissioner not infrequently determines that internal or external data breaches are reasonably attributable to a failure by an APP Entity to take reasonable steps to protect information security or to take reasonable steps to destroy personal information or ensure it is de-identified if it no longer needs the information for any purpose for which it may be used or disclosed under the APPs.
In addition, certain types of information (such as tax file numbers) and certain sectors (such as the financial services sector) are subject to additional cyber security requirements, including under Prudential Standard CPS 234 and the Security of Critical Infrastructure Act 2018 (Cth) (SOCI Act), as referred to below, and also via general risk management obligations, including under section 912A of the Corporations Act 2001 (Cth).
-
Are there more specific security obligations for certain types of personal data (e.g., sensitive data or special categories of personal data)?
There are no separate or distinct security obligations that apply to sensitive information. However, the requirement of APP 11 is to take ‘reasonable steps’. In the Australian Clinical Labs decision, the Federal Court expressed the view that the circumstances relevant for determining the steps required to be taken include the sensitivity of the information being secured. Accordingly, the requirement for stricter controls in the case of sensitive information can be found in APP 11.
-
Do the data protection laws in your jurisdiction impose obligations in the context of security breaches which impact personal data? If so, how do such laws define a security breach (or similar term) and under what circumstances and within what timeframe must such a breach be reported to regulators, impacted individuals, law enforcement, or other persons or entities?
Part IIIC of the Privacy Act sets out a regime for the notification of an ‘Eligible Data Breach’.
An Eligible Data Breach occurs where:
-
- there is unauthorised access to, or unauthorised disclosure of, personal information, or personal information is lost in circumstances where unauthorised access to, or unauthorised disclosure of, the information is likely to occur; and
- a reasonable person would conclude that the access or disclosure would be likely to result in serious harm to any of the individuals to whom the information relates.
The potential harm contemplated in this definition includes physical, psychological, emotional, economic and financial harm, as well as harm to reputation. An assessment as to whether an individual is likely to suffer ‘serious harm’ as a result of an Eligible Data Breach depends on, among other relevant matters:
-
- the kind and sensitivity of the information subject to the breach;
- whether the information is protected and the likelihood of overcoming that protection;
- if a security technology or methodology is used in relation to the information to make it unintelligible or meaningless to persons not authorised to obtain it – the information or knowledge required to circumvent the security technology or methodology; and
- the persons, or the kinds of persons, who have obtained, or could obtain, the information; and the nature of the harm that may result from the data breach.
Where an APP Entity is aware that there are reasonable grounds to believe that there has been an Eligible Data Breach, whether following an assessment of a reasonable suspicion that an Eligible Data Breach may have occurred (which assessment must take no more than 30 days) or otherwise, the entity must as soon as practicable:
-
- prepare a statement that, at a minimum, contains:
- the entity’s contact details (if relevant, the identity and contact details of any entity that jointly or simultaneously holds the same information in respect of which the eligible data breach has occurred, for example due to outsourcing, joint venture or shared services arrangements, may also be provided; if this information is included in the statement, that other entity will not need to separately report the eligible data breach);
- a description of the data breach;
- the kinds of information concerned; and
- the steps it recommends individuals take to mitigate the harm that may arise from the breach (while the entity is expected to make reasonable efforts to identify and include recommendations, it is not expected to identify every possible recommendation that could be made following a breach);
- provide a copy of this statement to the OAIC; and
- take such steps as are reasonable in the circumstances to notify affected or at-risk individuals of the contents of the statement. Individuals may be notified by the mode of communication normally used by the entity, or if there is no normal mode of communication, by email, telephone or post. If direct notification is not practicable, the entity must publish the statement on its website and take reasonable steps to publicise its contents.
The OAIC provides a standard form on its website which may be used for notification.
What constitutes a ‘practicable’ timeframe will vary depending on the time, effort or cost required to comply with the above requirements. Proposed reforms (see item 2) may introduce a 72-hour notification requirement.
-
Do the data protection laws in your jurisdiction establish specific rights for individuals, such as the right to access and the right to deletion? If so, please provide a general description of such rights, how they are exercised, and any exceptions.
Access to data
An APP Entity that holds personal information about an individual must, on request, give that individual access to the information (APP 12.1).
APP 12 also sets out minimum access requirements, including the time period for responding to an access request, how access is to be given, and that a written notice, including the reasons for the refusal, must be given to the individual if access is refused. For example, an APP Entity must respond to a request for access to personal information within 30 days after the request is made if the entity is an agency, or within a reasonable period after the request is made if the entity is an organisation.
There are a number of exceptions to the obligation for organisations to provide an individual access to their personal information, including where the entity reasonably believes that:
-
- giving access would pose a serious threat to the life, health or safety of any individual, or to public health or public safety; or
- giving access would have an unreasonable impact on the privacy of other individuals.
Correction and deletion
APP 13.1 requires an APP Entity to take reasonable steps to correct personal information it holds, to ensure it is accurate, up-to-date, complete, relevant and not misleading, having regard to the purpose for which it is held. This obligation arises in two circumstances: on the entity’s own initiative, and at the request of the individual to whom the personal information relates.
Upon receiving a request, an entity must decide whether it is satisfied that the information is incorrect and, if so, take reasonable steps to correct it.
APP 13 does not stipulate formal requirements that an individual must follow to make a request, require that a request be made in writing, or require the individual to state that the request is an APP 13 request.
Objection to processing
There is no general right for an individual to object to collection, use or disclosure of personal information. The Privacy Act generally only requires notice of processing activities to be provided to individuals, and consent is only required in relation to particular activities, notably including collection, use or disclosure of sensitive information and use and disclosure of personal information for the purpose of direct marketing.
However, APP 2 provides that individuals must have the option of dealing anonymously or by pseudonym with an APP Entity. An APP Entity is not required to provide those options where:
- the entity is required or authorised by law or a court or tribunal order to deal with identified individuals; or
- it is impracticable for the entity to deal with individuals who have not identified themselves (which is often the case).
Anonymity means that an individual dealing with an APP Entity cannot be identified, and the entity does not collect personal information or identifiers.
A pseudonym is a name, term or descriptor that is different to an individual’s actual name.
Where applicable, an APP Entity must ensure that individuals are made aware of their opportunity to deal anonymously or by pseudonym with the entity.
Complaint to relevant data protection authority
An individual has the right to lodge a complaint with the Privacy Commissioner for alleged breaches of the Privacy Act. Generally, the complainant must first register a complaint with the APP Entity to which the complaint relates. If dissatisfied with the response, a complainant can complain to the Commissioner or to an external dispute resolution scheme of which the entity is a member (if applicable). In conducting its investigations, the Commissioner may require the production of documents and information and compel people to appear and answer questions.
The Attorney-General’s Privacy Act Review Report of February 2023 (see item 2) proposed reforms in this area, but these have not been actioned in the first tranche of reforms.
-
Do the data protection laws in your jurisdiction allow or provide for a private right of action for violations? If so, does your jurisdiction also allow “class action” litigation (i.e., on behalf of a class or (‘many’) claimants)? Please explain the circumstances in which a private right of action applies and/or a class action may be brought, and whether certain types of claims/violations present a higher risk of a private right of action or class action (e.g., are there statutory damages or presumed harm for certain violations)?
The Privacy Amendment Act introduced a statutory tort for serious invasion of privacy, which allows individuals to enforce their rights in respect of a serious physical intrusion into their private space or the misuse of their personal information. The elements for establishing the tort are:
-
- the defendant intruded on the plaintiff’s seclusion and/or misused their personal information;
- a person in the position of the plaintiff had a reasonable expectation of privacy;
- the invasion was intentional or reckless;
- the invasion was serious; and
- the public interest in protecting the plaintiff’s privacy outweighs any countervailing public interest.
The fault element of this tort is confined to intentional or reckless invasions of privacy, meaning a negligent act will not be sufficient to meet the test. The tort is actionable only where a person had a ‘reasonable expectation of privacy’ and, for a plaintiff to have a cause of action, the Court must be satisfied that the public interest in protecting the plaintiff’s privacy outweighs any countervailing public interest.
This new tort has been considered in a small number of cases. The first was Kurraba Group Pty Ltd & Anor v Williams [2025] NSWDC 396, where the tort was relied on as the basis for granting an interlocutory injunction preventing a party from continuing to publish private wedding photographs as part of a business dispute which had devolved into personal attacks between the individuals involved. In granting the injunction, the District Court considered the private nature of the images, that the images were never intended to be published and the misuse of the photographs by the defendant.
Separate to the statutory tort or a private right of action in the courts, the Privacy Act also allows complainants who may have suffered from the same privacy breach to bring a ‘representative complaint’ to the Privacy Commissioner. This is akin to a ‘class action’ that is managed through the investigations and complaints processes of the Privacy Act. A representative complaint may be lodged under the Privacy Act only if:
-
- the class members have complaints against the same entity;
- all the complaints are in respect of, or arise out of, the same, similar or related circumstances; and
- all the complaints give rise to a substantial common issue of law or fact.
The Privacy Act also contains a right to enforce a declaration by the Privacy Commissioner for compensation or to seek an injunction. The private right to seek injunctive relief has been used very infrequently.
We note that privacy breaches may arise in circumstances which also give rise to a claim for breach of confidence or in negligence. Where this is the case, individuals may have rights under a contract with the entity, or an equitable duty of confidentiality may also apply under common law.
-
Are individuals entitled to monetary damages or compensation if they are affected by breaches of data protection law? Does the law require actual and material damage to have been sustained, or is non-material injury to feelings, emotional distress or similar sufficient for such purposes?
The Privacy Commissioner has power to award compensation to individuals affected by breaches of the Privacy Act. This is available both for financial loss as well as for non-economic losses, such as emotional harm, humiliation or inconvenience.
The Privacy Commissioner has applied the following principles in awarding compensation:
-
- principles of damages applied in tort law will assist in measuring compensation;
- compensation should be assessed having regard to the complainant’s reaction (not a ‘reasonable person’ test);
- there must be a good reason not to award compensation once loss is established; and
- aggravated damages may be awarded in appropriate cases.
While aggravated damages are seldom awarded, it is open to the Privacy Commissioner to do so, particularly if:
-
- an entity’s conduct is considered to be ‘high-handed, malicious, insulting or oppressive’; or
- the entity has acted in a way that exacerbates the complainant’s injury or hurt feelings.
If a breach of the statutory tort is established, a court may award damages to the individual. Damages may be awarded for emotional distress and may be exemplary or punitive; however, non-economic loss and exemplary or punitive damages must not exceed AUD478,550 or the cap applicable to non-economic loss that may be awarded in defamation proceedings.
-
How are data protection laws in your jurisdiction typically enforced? What regulatory body(ies) have enforcement authority?
The Privacy Commissioner has a range of regulatory powers including powers to:
-
- conduct an assessment of whether an entity is maintaining and handling personal information in accordance with relevant provisions (such as the APPs);
- direct a government agency (but not private sector organisations) to give the Privacy Commissioner a privacy impact assessment;
- request entities to develop an APP code or impose one where appropriate;
- investigate an entity following a complaint;
- investigate an entity on its own initiative, that is, without someone making a complaint (Commissioner initiated investigation);
- accept an enforceable undertaking from an entity. An enforceable undertaking is a promise by an entity that it will take specified action or refrain from taking specified action in order to comply with relevant privacy provisions, or to ensure it does not do an act or engage in a practice that interferes with an individual’s privacy;
- make a determination on a privacy complaint. The Privacy Commissioner can also make a determination after conducting a Privacy Commissioner-initiated investigation; and
- apply to the courts for an injunction to restrain a person from engaging in conduct that would constitute a breach of relevant privacy provisions or for an order that an entity pay the civil penalty.
The Privacy Act provides several complaints paths for individuals where there has been (or is suspected to have been) a breach of an APP. The primary complaints process is through a complaint to the Privacy Commissioner, initiating an investigation by the Privacy Commissioner (sections 36 and 40). This process typically requires that the individual has first complained to the relevant APP Entity.
An investigation may result in a determination by the Privacy Commissioner, containing a declaration that:
-
- the respondent’s conduct constituted an interference with the privacy of an individual and must not be repeated or continued;
- the respondent must take specified steps within a specified period to ensure that such conduct is not repeated or continued;
- the respondent must perform any reasonable act or course of conduct to redress any loss or damage suffered by the complainant;
- the complainant is entitled to a specified amount by way of compensation for any loss or damage suffered by reason of the act or practice the subject of the complaint; or
- no further action is needed (section 52(1)).
A complainant may apply to the Federal Court of Australia or the Federal Circuit and Family Court of Australia to enforce a determination of the Commissioner (section 55A). An individual may also apply to the Federal Court or Federal Circuit and Family Court of Australia for an injunction where a person has, is, or is proposing to engage in conduct that was or would be a breach of the Privacy Act (section 98).
There is not a private right to claim damages, only a right to enforce a declaration by the Privacy Commissioner for compensation or to seek an injunction. The private right to seek injunctive relief has been used very infrequently.
Section 80W of the Privacy Act empowers the Privacy Commissioner to apply to the Federal Court or Federal Circuit and Family Court of Australia for an order that an entity, that is alleged to have contravened a civil penalty provision, pay a civil penalty. A civil penalty order financially penalises an entity, but does not compensate individuals adversely affected by the contravention (although, as noted above, the Privacy Commissioner also has these powers).
The ‘civil penalty provisions’ in the Privacy Act include:
-
- for serious interferences with privacy (section 13G) – with maximum penalties for companies of the greater of: (a) AUD50 million; (b) 3 times the benefit gained from the ‘interference’ or contravention (if that is able to be ascertained); or (c) 30% of the company’s adjusted turnover during the breach period (if the benefit gained is not able to be ascertained);
- mid-tier and low-tier (infringement notice) penalties, as described below in item 37; and
- various civil penalty provisions set out in Part IIIA – which are only applicable to credit reporting bodies and credit providers – with maximum penalties of AUD825,000, AUD1,650,000 or AUD3,300,000 depending on the offence (500, 1,000 or 2,000 penalty units respectively, multiplied by 5 in the case of companies).
It is important to note that while other enforcement actions (such as the making of determinations and the award of compensation) can be made by the Privacy Commissioner, liability for civil penalties only arises where it is ordered by the Federal Court. Where an APP Entity experiences an Eligible Data Breach, the occurrence of that data breach in and of itself is unlikely to result in the entity facing penalties. However, a failure to (amongst other things):
-
- if an entity has a reasonable suspicion that there may have been an eligible data breach, carry out a reasonable and expeditious assessment of whether there are reasonable grounds to believe that an eligible data breach occurred, and take all reasonable steps to ensure that the assessment is completed within 30 days after the entity forms the suspicion; and
- report an eligible data breach,
will be considered an “interference with the privacy of an individual” affected by the Eligible Data Breach (section 13(4A)).
The calculation of civil penalties is a matter for the Court’s discretion and there are no guidelines in the Privacy Act other than the maximum penalties. As noted in item 3, the decision in Australian Information Commissioner v Australian Clinical Labs Limited (No 2) [2025] FCA 1224 is the first instance of a civil penalty being imposed under the Privacy Act. The civil penalty, which had been agreed by the parties, comprised the following elements:
-
- AUD4.2 million for violating APP 11 by failing to take reasonable steps to protect personal information;
- AUD800,000, for failing to conduct a reasonable and expeditious assessment of the breach, as required for the purpose of the eligible breach notification regime; and
- AUD800,000, for failing to notify the OAIC promptly, again in breach of requirements under the eligible breach notification regime.
A notable feature of the agreed penalty, which was endorsed by the Federal Court, was the treatment of each individual’s disclosed records as a separate breach. As there were 223,000 such records and the maximum penalty at the time of the infringement was AUD2.2 million, the theoretical maximum penalty that could have been imposed was stated by the Federal Court to be AUD495 billion. Ultimately, the agreed penalty amounted to approximately AUD20 per affected individual. It is unlikely that this would form a reliable benchmark for future cases.
-
What is the range of sanctions (including fines and penalties) for violation of data protection laws in your jurisdiction? Are there any guidelines or rules for the calculation of such fines or the imposition of sanctions?
The most significant civil penalties may be imposed for ‘serious’ interferences with privacy. The maximum civil penalty that may be imposed is the greater of:
-
- AUD50 million;
- 3 times the benefit gained by the interference (if that number is able to be ascertained); or
- 30% of the adjusted turnover of the company during the breach period (if the benefit is not ascertainable).
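The ‘greater of’ calculation above can be sketched as follows. This is a purely illustrative, hypothetical sketch, not legal advice: the function name and example figures are assumptions, and the penalty actually imposed in any case is a matter for the Court.

```python
from typing import Optional

# Illustrative sketch of the 'greater of' formula for the maximum civil
# penalty for a serious interference with privacy: the greater of AUD50
# million, 3x the benefit gained (if ascertainable), or 30% of adjusted
# turnover during the breach period (if the benefit is not ascertainable).
# Function name and example figures are assumptions for illustration only.

def max_civil_penalty_aud(benefit: Optional[float], adjusted_turnover: float) -> float:
    base = 50_000_000.0
    if benefit is not None:
        # Benefit ascertainable: the turnover limb does not apply.
        return max(base, 3 * benefit)
    # Benefit not ascertainable: use 30% of adjusted turnover.
    return max(base, 0.30 * adjusted_turnover)

# E.g. benefit unknown, AUD400 million adjusted turnover:
print(max_civil_penalty_aud(None, 400_000_000))  # 120000000.0
```

The AUD50 million floor applies whenever both alternative limbs would produce a lower figure.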
The Privacy Amendment Act introduced a new tiered system into the Privacy Act’s civil penalty regime, allowing targeted and more proportionate responses from the OAIC. These reforms include:
-
- a new mid-tier civil penalty provision covering interferences with privacy which do not necessarily meet the ‘serious’ threshold (up to 2,000 penalty units, being AUD3,300,000, for corporates); and
- a new low-tier civil penalty and infringement notice provision for specific administrative breaches including, for example, having a non-compliant privacy policy or not complying with the requirements of an eligible data breach notice (up to 200 penalty units, being AUD330,000, for corporates).
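The penalty-unit arithmetic behind these tiers can be illustrated as follows. The AUD330 penalty unit value and the five-times corporate multiplier are assumptions consistent with the figures quoted above; the Commonwealth penalty unit is periodically indexed and will change over time.

```python
# Sketch of how the tiered corporate maximums derive from penalty units,
# assuming a penalty unit of AUD330 and the five-times multiplier for bodies
# corporate (both values are assumptions that change as the unit is indexed).

PENALTY_UNIT_AUD = 330
CORPORATE_MULTIPLIER = 5

def corporate_maximum_aud(penalty_units: int) -> int:
    """Maximum penalty for a body corporate given a number of penalty units."""
    return penalty_units * PENALTY_UNIT_AUD * CORPORATE_MULTIPLIER

print(corporate_maximum_aud(2000))  # mid-tier maximum: 3300000
print(corporate_maximum_aud(200))   # low-tier maximum: 330000
```

The same arithmetic reproduces the Part IIIA maximums noted earlier (500, 1,000 and 2,000 penalty units).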
Additionally, the Privacy Amendment Act introduced greater powers and flexibility for the courts to make orders in response to privacy interferences, including directions to pay damages to an individual by way of compensation for any loss or damage, enforceable undertakings, or directions to publish, or otherwise communicate, a statement about the contravention.
There are also a number of civil penalty provisions which are applicable to credit reporting bodies and credit providers who use or disclose information in contravention of the Privacy Act. The current maximum civil penalty that may be imposed is AUD3,300,000 (10,000 penalty units) for corporate entities.
-
Are enforcement decisions open to appeal in your jurisdiction? If so, please provide an overview of the appeal options.
Yes, as described in item 36 above, privacy enforcement in Australia occurs through the Federal Court and Federal Circuit and Family Court of Australia. Where the Privacy Commissioner is seeking injunctions or penalties, it must do so through a court process.
Similarly, if the Privacy Commissioner (or an individual) is seeking to enforce a determination of the Privacy Commissioner, or an entity (or federal government agency) is seeking to appeal a determination, this must be done through court proceedings under the Commonwealth administrative appeals process: either merits review through the Administrative Review Tribunal or judicial review by the Federal Court of Australia or the Federal Circuit and Family Court of Australia.
-
Do the cybersecurity laws in your jurisdiction require the implementation of specific cybersecurity risk management measures and/or require that organisations take specific actions relating to cybersecurity? If so, please provide an overview of these obligations and explain their scope/applicability. For example, are all organizations subject to the requirement or only to certain organizations (e.g., based on size, sector, critical infrastructure designation, public company)? Are there specific and/or additional regulations for different industries (e.g., finance, healthcare, government)?.
Apart from the high level, general security principle in the Privacy Act (APP 11) (which requires entities to take reasonable steps to protect personal information from misuse, interference and loss, and from unauthorised access, modification or disclosure), there is no general, sector-wide cybersecurity regulation requiring proactive management of cyber risks.
If an entity is regulated under the SOCI Act, it must develop and implement a written critical infrastructure risk management program. This risk management program must: (a) identify material hazards; (b) so far as is reasonably practicable, minimise or eliminate any material risk associated with those hazards; and (c) so far as is reasonably practicable, mitigate the impact of such hazards. An entity’s risk management program must also comply with applicable rules promulgated under the SOCI Act.
The Cyber and Infrastructure Security Centre (CISC) has published guidance regarding organisations’ risk management programs. While the obligation on entities is to identify all material hazards, CISC indicates a particular focus on the following:
-
- cyber and information security;
- personnel (i.e., trusted insider risks);
- supply chain disruptions; and
- physical security and natural hazards.
As noted, the risk management program is only required to minimise, eliminate or mitigate the impact of risks so far as is reasonably practicable. CISC’s guidance indicates that reasonableness is to be assessed in the individual circumstances of each entity. While the requirement may oblige an entity to act in a timely manner, it does not oblige entities to take measures that are disproportionate to the likelihood or consequences of a particular risk. Entities therefore have flexibility to determine the steps they will include in their risk management programs.
SOCI-regulated entities are required to review their risk management programs regularly and keep them up to date. Regulated entities must also report annually on the currency and effectiveness of their risk management program.
We separately note that financial services entities (including banks, insurers and superannuation funds) are subject to Australian Prudential Regulation Authority (APRA) requirements under Prudential Standard CPS 234, which require them to proactively manage the security of their ‘information assets’ to ensure their continued availability, confidentiality and integrity.
-
-
Do the cybersecurity laws in your jurisdiction impose formal cybersecurity audit or certification requirements? If so, please provide an overview.
No. International standards such as ISO 27001, SOC 2 or the NIST frameworks are not specifically required under Australian law. Some government agencies and large institutions may require them as a matter of policy, and their adoption and observance is often used to support arguments that an entity has taken ‘reasonable steps’ (for example, in response to an alleged breach of APP 11).
The Cyber Security Act requires manufacturers and suppliers of smart devices, connected devices and Internet-of-Things devices to meet standards prescribed in rules promulgated under the Act. To date, these rules have only mandated high-level principles rather than recognised standards, but there is scope for this to evolve in the future.
Although not legislatively enshrined, the Australian Cyber Security Centre (part of the Australian Signals Directorate) routinely references its ‘Essential Eight’ guidelines as a resource for organisations and government entities. We think it likely that regulators would view compliance with these guidelines favourably in determining whether other legal requirements with respect to cyber security have been breached.
-
Do the cybersecurity laws in your jurisdiction impose specific requirements regarding vendor and supply chain management? If so, please provide details of these requirements.
There are no economy-wide requirements in this regard, although industry-specific requirements may apply. For example, critical infrastructure asset holders may need to consider supplier and supply chain risk in developing their risk management programs, and APRA-regulated entities may need to consider the extent to which their information assets are managed by third parties in order to comply with various mandatory prudential standards.
-
Do the cybersecurity laws in your jurisdiction require the appointment of a chief information security officer, regulatory point of contact, or other person responsible for cybersecurity? If so, please provide an overview of the requirement, including whether there are any formalities that must be observed regarding such appointment (e.g., board-approval, reporting line structure, notification to regulatory body).
No, this is not mandated by legislation, but it is typical for large corporate entities to appoint a chief information security officer or designate a regulatory point of contact.
-
Do the cybersecurity laws in your jurisdiction impose specific reporting or notice obligations in the context of cybersecurity incidents? If so, how do such laws define a cybersecurity incident and what are the reporting and notification requirements (please also note whether these laws require reporting of certain cyber security incidents, regardless of whether there has been a ‘breach of personal data’)?
The Cyber Security Act commenced on 30 November 2024 and is a whole-of-economy legislative framework that introduces a number of new substantive obligations for businesses operating in Australia, as well as a range of administrative reforms and a new government body. While the Cyber Security Act applies generally to businesses, there is a turnover threshold for businesses to be caught by the requirements. At the time of publication, the revenue threshold had not yet been set.
The Cyber Security Act introduced a mandatory requirement for a ‘reporting business entity’ that pays a ransom to a cyber threat actor to notify the Department of Home Affairs and the Australian Signals Directorate (ASD) within 72 hours of making the payment. A ‘reporting business entity’ is any entity carrying on business in Australia, except for those that fall below the revenue threshold (this amount will be set out in the rules and so is not yet known). Regardless of revenue, the obligation also applies to any responsible entity for a critical infrastructure asset under the SOCI Act which pays a ransom or gives a benefit. The Cyber Security (Ransomware Payment Reporting) Rules 2025 (Ransomware Rules), which commence in May 2025, set out the mandatory reporting requirements for a ‘reporting business entity’. An entity may be liable for a civil penalty if it fails to make a ransomware payment report as required under these rules.
Under the SOCI Act, entities responsible for critical infrastructure assets are required to report cyber security incidents impacting the critical infrastructure asset to the ASD. A cyber security incident having a ‘significant impact’ on the availability of an asset must be reported to the ASD verbally within 12 hours, with a written report within a further 84 hours. All other cyber security incidents having a ‘relevant impact’ on an asset must be reported verbally within 72 hours, with a written report to the ASD within a further 48 hours.
The requirements above are in addition to the eligible data breach notification requirements discussed in item 32 and also mandatory cyber security incident notification requirements on certain financial services providers regulated by APRA, as described in item 41.
In addition, Part 4 of the Cyber Security Act provides a mechanism to allow entities impacted by actual or suspected significant cyber security incidents to provide information in relation to such incidents to the National Cyber Security Coordinator (NCSC). Such information may be provided at any time during the incident response period, and may be provided voluntarily by the impacted entity or in response to a request from the NCSC. A cyber security incident is a ‘significant cyber security incident’ if there is a material risk that the incident has seriously prejudiced or could reasonably be expected to prejudice Australia’s social or economic stability, defence, or national security, or the incident is, or could reasonably be expected to be, of serious concern to the Australian people.
Where reports or information are shared with the Department of Home Affairs, ASD and/or the NCSC under Part 3 (ransomware reporting) or Part 4 (voluntary information sharing) of the Cyber Security Act, the relevant Commonwealth body receiving that information may only use it in limited circumstances. Division 3 of each of Parts 3 and 4 outlines the permitted uses and disclosures of reports and information provided to the relevant Commonwealth body. In addition to the restrictions on the Commonwealth body originally receiving the reports or information, there are also limitations on use and disclosure by other entities and Commonwealth or State bodies to which such information has been provided in accordance with the Cyber Security Act.
The Department of Home Affairs and/or ASD may only make a record of, or use or disclose, information provided in a ransomware payment report for nine limited purposes, including (but not limited to):
-
- assisting the reporting business entity to respond to, mitigate or resolve the cyber security incident;
- supporting the performance of a Commonwealth or State body in responding to, mitigating or resolving a cyber security incident;
- supporting the performance of the functions of the NCSC under Part 4 (voluntary information sharing);
- informing and advising Commonwealth Ministers about a cyber security incident; and
- supporting the performance of the functions of an intelligence agency (such as the Australian Security Intelligence Organisation or the ASD).
In relation to significant cyber security incidents, the NCSC may only make a record of, or use or disclose, information voluntarily provided to it for the purpose of assisting the impacted entity to respond to, mitigate or resolve the cyber security incident, or for a ‘permitted cyber security purpose’ (defined in section 10 of the Cyber Security Act).
In relation to other incidents, the NCSC may only make a record of, or use or disclose the information voluntarily provided to it for the purpose of:
-
- directing the impacted entity to services that may assist it to respond to, mitigate or resolve the cyber security incident; and
- if the incident is a cyber security incident:
- coordinating the whole of Government response to the cyber security incident (where the NCSC considers such a response necessary); and
- informing and advising Commonwealth Ministers about the cyber security incident.
-
-
Can individuals bring a private right of action for cybersecurity incidents or other violations of cybersecurity laws? If so, does your jurisdiction also allow “class action” litigation (i.e., on behalf of a class or (‘many’) claimants)? Please explain under what circumstances in which a private right of action and/or a class action may be brought?
Any rights of an individual in respect of a cybersecurity incident are likely to arise in the context of a breach of the Privacy Act. Accordingly, item 34 sets out the relevant rights of individuals. As discussed there, class actions are possible under the statutory tort, and representative complaints can be made to the Privacy Commissioner.
-
How are cybersecurity laws in your jurisdiction typically enforced? What regulatory body(ies) have enforcement authority?
We are not aware of any enforcement action to date under either the SOCI Act or the Cyber Security Act. The Department of Home Affairs administers those Acts, as further described in item 46. Industry-specific regulation is enforced by the relevant industry regulator (such as APRA in respect of the banking sector).
-
What powers of oversight / inspection / audit do regulators have in your jurisdiction under cybersecurity laws?
Part 5 of the Cyber Security Act establishes the Cyber Incident Review Board (the Board or CIRB) and requires the Board to conduct reviews of cyber security incidents on written referral by the Minister, National Cyber Security Coordinator, an entity impacted by the incident(s), or a member of the Board.
The purpose of a review is to make recommendations to government and industry about actions that could be taken to prevent, detect, respond to, or minimise the impact of, cyber security incidents of a similar nature in the future. Importantly, if the Board reasonably believes an entity has information or documents relevant to a review, the Chair may first request, and subsequently require by notice, that the entity give the specified documents (although there are limitations on the use of such documents).
The Cyber Security (Cyber Incident Review Board) Rules 2025 (which commenced on 30 May 2025) provide further details on the Board’s requirement to consider referrals, matters the Board must consider when prioritising referrals and reviews, the terms of reference (including number of standing Board members, Expert Panel members, required security clearance and other eligibility requirements), and the timing and notification of reviews.
The SOCI Act has a range of oversight powers vested in the government including the following:
-
- Under Part 3, the Minister for Home Affairs has the power to direct owners or operators of critical infrastructure assets to take specific actions to mitigate risks that are deemed to be prejudicial to security, or address serious deficiencies in their risk management programs;
- Under Part 3A, the Government can provide support to industry in responding to cyber security incidents, particularly those that threaten Australia’s national interests or economy; and
- Under Part 4, the Secretary of the Department of Home Affairs can direct entities to provide information necessary for crisis response or to assess the nature and severity of an incident.
In respect of the financial services businesses it regulates, APRA also monitors and enforces compliance with the cybersecurity aspects of those entities’ obligations.
-
-
What is the range of sanctions (including fines and penalties) for violations of cybersecurity laws in your jurisdiction? What is the range of sanctions (including fines and penalties) for violation of data protection laws in your jurisdiction? Are there any guidelines or rules for the calculation of such fines or the imposition of sanctions?
Contravention of the Cyber Security Act may result in a reporting entity being liable to civil penalties of up to AUD19,800 (60 penalty units) for each violation of a civil penalty provision. This includes, among other things, failure to report a ransomware payment, failure to comply with a notice to produce documents, and unauthorised use or disclosure of information in ransomware payment reports.
The SOCI Act contains both civil penalty provisions which are dealt with by way of fines, and two criminal offences, which are punishable with fines or potentially imprisonment. The civil penalty provisions range up to AUD412,500 (1,250 penalty units) for corporations. In March 2026, the Minister for Home Affairs opened consultation on a number of amendments to the SOCI Act, one of which was a proposal to increase the maximum penalties to AUD3.3 million (10,000 penalty units).
The criminal offences under the SOCI Act are:
-
- failure to comply with an action direction given by the Home Affairs Secretary under section 35AQ – this only applies to the responsible entity for a critical infrastructure asset; and
- the use or disclosure of protected information (which is information pertaining to an entity’s SOCI Act obligations) except in permitted circumstances – this applies to all individuals and organisations.
The offences are punishable by fines of up to AUD198,000 (600 penalty units) for corporations or 2 years imprisonment for individuals.
-
-
Are enforcement decisions open to appeal in your jurisdiction? If so, please provide an overview of the appeal options.
Enforcement action under the Cyber Security Act must be brought in the Federal Court or the Federal Circuit and Family Court of Australia, which means decisions are subject to the usual rights of appeal through the courts.
Decisions under the SOCI Act are administrative in nature, meaning that review of a decision is sought through the Administrative Review Tribunal rather than the traditional court process. However, given the administrative nature of such review, the available grounds are relatively confined (they include, but are not restricted to, error of law, denial of procedural fairness, unreasonableness and bias).
Australia: Data Protection & Cybersecurity
This country-specific Q&A provides an overview of Data Protection laws and regulations applicable in Australia.