
Op-ed: To fix the Privacy Act, we need one extra sentence

Why individuation or 'singling out' is critical to improving Australia's Privacy Act and protecting consumers.

Last updated: 04 May 2023

Anna Johnston is a leading privacy expert, former Deputy Privacy Commissioner of NSW and founder and Principal of Salinger Privacy. 

Need to know

  • A proposal to change the definition of personal information is the key to privacy reform 
  • Regulators and advocates are united in support of changing the definition to capture unfair and deceptive business practices, including profiling and tracking
  • Big business and the AdTech industry are lobbying against the reforms despite clear evidence consumers want better protections

Indirect identification, individuation, disambiguation, distinguishing from all others, or singling out… call it what you want, but the statutory definition of 'personal information' needs to clearly state that it includes cases when individuals can be singled out and acted upon, even if their identity is not known.

Yet as they now stand, the proposals to amend the Privacy Act do not include this critical reform.

I wrote previously about some of the themes arising from the Final Report into the review of the Privacy Act. 

One of the surprises, but not of the happy-surprise-birthday-party kind, was the way in which "personal information" has been treated.

Can someone be 'identifiable' if a business doesn't know their name?

The definition of personal information is a critical threshold definition because the privacy principles only apply to personal information. If a business can successfully argue that some data is not personal information, they can collect, use, disclose and trade the data with impunity.

Right now, the definition of personal information includes if someone is "reasonably identifiable". But that phrase is foggy, to the detriment of businesses and consumers alike, who may ask: Can someone be 'identifiable' if a business doesn't know their name?

The OAIC says yes

In guidance dating back to 2017, and in a string of case law determinations, the OAIC has maintained that 'identifiability' in law does not necessarily require that a person's name or legal identity can be established from the information. Instead, it implies uniqueness in a dataset: "(g)enerally speaking, an individual is 'identified' when, within a group of persons, he or she is 'distinguished' from all other members of a group… This may not necessarily involve identifying the individual by name".
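The OAIC's point — that uniqueness within a group can identify someone without a name — can be shown with a minimal sketch. All data here is invented for illustration: the idea is that a handful of nameless attributes can still distinguish a record from every other record in a dataset.

```python
from collections import Counter

# Illustrative only: each record is (postcode, birth_year, device_model).
# No names or email addresses appear anywhere in the data.
records = [
    ("2000", 1990, "Pixel 7"),
    ("2000", 1990, "iPhone 14"),
    ("2041", 1985, "Pixel 7"),
    ("2000", 1990, "Pixel 7"),
]

counts = Counter(records)

# A combination of attributes that appears exactly once is 'distinguished
# from all other members of the group' -- singled out, though unnamed.
singled_out = [record for record, n in counts.items() if n == 1]
print(singled_out)
# [('2000', 1990, 'iPhone 14'), ('2041', 1985, 'Pixel 7')]
```

Two of the four records are unique on just three attributes — which is exactly why regulators treat "distinguishable" data as identifying, even when no legal identity is attached.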

The Attorney-General's Department says yes

The Final Report quotes, without challenge, the OAIC position, and states that "The test does not require that an individual's legal identity be known provided the information could be linked back to the specific person that it relates to."

From Europe to California, our global trading partners say yes

Each has either explicitly expanded on the meaning of identifiability, or has introduced alternatives to identifiability as a threshold element of their definitions. The GDPR calls it "singling out". 

The California law (CCPA) includes, within its definition of personal information, data which is "capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household", without first needing to pass an identifiability test. The 2019 international standard in Privacy Information Management, ISO 27701, is similar.


AdTech businesses are tracking consumers' buying behaviour and data-matching across brands and businesses.

Australians want the answer to be yes

In a 2018 Roy Morgan Report on Consumer Views and Behaviours on Digital Platforms, 79% of digital platform users considered telephone or device information, and 67% considered browsing history, to be information that could reasonably be used to identify them when doing things online. 

The OAIC's 2020 Community Attitudes to Privacy Survey showed that only around a quarter (24%) of Australians feel the privacy of their personal information is well-protected, and that the vast majority (83%) of Australians would like the government to do more to protect the privacy of their data. Public support for strengthening the Privacy Act surged in 2022, in the wake of the Optus and Medibank data breaches.

The vast majority (83%) of Australians would like the government to do more to protect their privacy

And the most recent research, released last month by the Consumer Policy Research Centre, found that the majority of Australians regard things like their IP address, device IDs, location data and online search history to be their 'personal information'. In fact, respondents were even more likely to consider this data personal information than categories of data like sexuality and disability. 

This research showed that the majority of Australians were also uncomfortable with that type of data being used by companies to create a personal profile, or with it being collected from, or shared with other companies.

So who is saying no?

Right now, some industries say no, some are lobbying to make the answer no, and others add to the fog by obfuscating the terminology when dealing with consumers and pretending the answer is no.

Some industries are exploiting the fog around the definition to match up customer records from different devices and apps, and to share user attributes between companies, by arguing that the data they use is not 'reasonably identifiable', and that the privacy rules (which prohibit unrelated companies from sharing their customers' personal information without consent) therefore do not apply.

For example, we have seen industry arguments that no-one can be "reasonably identified" from facial detection, or non-cookie based targeted advertising. And law academic Katharine Kemp has highlighted the disingenuous claims about "anonymous" data made to consumers by media and AdTech companies, especially when compared with what they privately tell brands about their cross-brand data-matching, online tracking, profiling and "addressable" targeting capabilities.

An industry player admitted that the reforms 'will force us to stop doing some things that we probably shouldn't have been doing anyway'

In an article about the law reform proposals, one industry player was quoted as admitting that reforms will "force us to stop doing some things that we probably shouldn't have been doing anyway". Another said, of the use of hashed emails: "It's very easy to link those two data sets together and then re-identify the personal information". 

So if the law is clarified to state that pseudonyms like hashed emails (which facilitate data-matching at the individuated level) constitute "personal information", the result will be "a big impact on existing industry practices", because "there are thousands of AdTech companies and publishers using hashed emails" (to match up data about customers, profile and target them without consent).
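To see why hashed emails enable matching "at the individuated level", consider this sketch. The company roles and email address are invented for illustration; the technique — hashing a stable identifier so two datasets can be joined without exchanging plaintext emails — is the one the industry quotes above describe.

```python
import hashlib

def hashed(email: str) -> str:
    """Deterministic hash: the same email always yields the same token."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Two unrelated companies each hash their own customer lists.
retailer = {hashed("jane@example.com"): {"purchases": ["running shoes"]}}
publisher = {hashed("jane@example.com"): {"articles_read": 42}}

# Neither dataset contains a name or a plaintext email address, yet the
# records join perfectly: the identical hash singles out the same
# individual in both datasets, enabling a combined cross-brand profile.
shared_tokens = retailer.keys() & publisher.keys()
for token in shared_tokens:
    profile = {**retailer[token], **publisher[token]}
    print(profile)
# {'purchases': ['running shoes'], 'articles_read': 42}
```

Because the hash is deterministic, it functions as a pseudonymous identifier: no name is ever revealed, but the individual is distinguished from everyone else — which is precisely the gap a "singling out" definition would close.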

As Victorian Information Commissioner Sven Bluemmel puts it: "I can exploit you if I know your fears, your likely political leanings, your cohort. I don't need to know exactly who you are; I just need to know that you have a group of attributes that is particularly receptive to whatever I'm selling or whatever outrage I want to foment amongst people. 

"I don't need to know your name. And therefore, arguably depending on how you interpret it, I don't need 'personal information'. I just need a series of attributes that allows me to exploit you."

The media publishing and AdTech industry players know that this is true, but they will do what they can to maintain fog around the phrase 'reasonably identifiable', so that their practices can stay in the shadows.

Industry is working to water down privacy reforms

So long as the wording in the statutory definition of 'personal information' is not clear and precise, the fog will not dissipate. The much-touted reforms will fail to stop these widespread, covert data-sharing practices.

Fear of the fog clearing is the reason that industry is pushing to water down the proposed reforms – or kill them off entirely. In what has been described as a "privacy counterstrike", industry lobby groups representing digital platforms, media giants and advertising companies are "planning a cross-industry counteroffensive in a bid to wind back key proposals" in the Final Report.

One of the digital and AdTech industry's objectives is to lobby for "a more reasonable definition [of personal information] … [the proposed definition] doesn't seem workable".

The proposals in the [Privacy Act Review] Final Report don't deliver the clarity or strength that we need

And what is this radical and unworkable proposal in the Final Report that is deserving of such a focused counteroffensive? To change the word "about" to "that relates to". That's it. Three words. THREE WORDS. As per the Final Report, that's the only actual change proposed to the statutory definition of "personal information". 

The rest would be put in guidance, or in a list of things which might or might not be personal information, or in a list of things that organisations should "have regard to", when "doing their own assessment" about what the definition might mean.

Which means that industry lobbying has already been successful, because this is a watering down of what was proposed by the Department in 2021, in their Discussion Paper on the review. Then, the proposal included adding a whole extra sentence! 

Perhaps the review team was persuaded that the sky would fall in if they dared to add the following words into the Act, as they originally proposed in 2021 – and hold onto your hats here, because this is scary radical stuff:

"An individual is 'reasonably identifiable' if they are capable of being identified, directly or indirectly."


Consumer research shows that Australians are uncomfortable with companies using their personal information to profile them.

The 2021 Discussion Paper stated that such a definition "would cover circumstances in which an individual is distinguished from others or has a profile associated with a pseudonym or identifier, despite not being named".

I know, vive la revolution it ain't, but nonetheless this modest proposal from 2021 doesn't even appear in the Final Report.

According to the Final Report, only one submission argued against the proposition that the definition should be amended to expressly include when someone can be distinguished from all others in a group (even if not named) in order to be profiled, targeted, or acted upon in some way.

And yet … the proposals in the Final Report don't deliver the clarity or strength that we need.

What's currently being proposed

Instead of proposing an amended definition of personal information that would clearly encompass the types of individuating identifiers that allow online behavioural advertising and other practices to go unchecked, the Final Report proposes a whole separate regime for regulating certain use cases, like direct marketing, online targeting and trading. 

(And they do this by not touching the definition of "personal information", but by saying that for these special regulations of these special use cases, sometimes de-identified or even unidentified data will be within scope as well.)

But when you read the details of those new rules in Chapter 20 of the Final Report, the only substantial right is to opt out of being shown targeted advertising. Those proposals will not affect the collection of information, the building or sharing of profiles, or the use of our information to create 'lookalike audiences'; all the stuff behind the scenes gets a free pass.

The current proposal leaves consumers open to harm

It's not like privacy harms only come from direct marketing, online targeting or trading in personal information. The ability to distinguish one individual from others, in order to track, profile, locate, contact or influence them, is also the starting point for stalking, surveillance and abuse. 

Privacy harms can come from personal digital assistants, chatbots or generative AI giving erroneous advice, or apps which monitor health then leak information to third parties. They can come from anti-abortion activists targeting women using geo-fencing.

Individuation online is the diesel that fuels the algorithmic engines, amplifying the voices of influencers, and powering the trains of online hate, misinformation and extremism

Individuation online is the diesel that fuels the algorithmic engines, amplifying the voices of influencers, and powering the trains of online hate, misinformation and extremism which lead to everything from the explosion in mood disorders to pro-anorexia content to Holocaust denial and genocide.

Why the proposed approach won't work

As we've said in our submission to the Privacy Act review, playing legislative whack-a-mole by trying to regulate specific use cases is guaranteed to make the Act out of date the day it is amended. Regulating specific use cases will also just shift the battleground, such that arguments will become about what business practices are in or out of those defined use cases. 

(And that's before we even get to the proposed extra rules for 'unidentified' and 'de-identified' data as well, which would not be needed if the definition of personal information was fixed instead.)

We also already know that shifting definitional matters to OAIC guidance just doesn't work. The practices I've described here take place now, despite the existing OAIC guidance that the definition of personal information (and therefore all the privacy rules) apply to data which enable an individual to be "distinguished" from all other members of a group, without needing to know their names.

What needs to happen to protect consumers

As the Final Report itself states, codifying OAIC guidance makes propositions "more readily enforceable".

That's why it is essential to amend the statutory definition of 'personal information' in the Privacy Act, which applies across all industries, and all use cases, and cannot be ignored.

So dear Attorney-General, please take this historic opportunity to strengthen, simplify and clarify the law. Just one extra sentence will do it:

"An individual is 'reasonably identifiable' if they are capable of being distinguished from all others, even if their identity is not known."

That one extra sentence would clear the fog, protect Australians the way they expect, simplify compliance, stop the disingenuous claims by industry, and bring Australian law closer to alignment with that of our trading partners, by building into the wording of the Act itself what is already in determinations and guidance from the OAIC.

It's only one extra sentence, but it will make all the difference.

This article has been republished from the Salinger Privacy Blog.
