Professors Mark Andrejevic, Daniel Angus and Jean Burgess are researchers in the ARC Centre of Excellence for Automated Decision-Making and Society and lead the Australian Ad Observatory project.
The online economy, dominated by platforms such as Google, Facebook and Instagram, largely runs on revenues generated from users' personal data. The ads that follow us around the internet are targeted at us, based on detailed information about our identities, interests, relationships and online activity.
Targeted advertising can be useful and convenient at times, like when you've been searching for the perfect bedside lamp and Facebook shows you ads from lighting companies with similar lamps. But it can also be irrelevant and annoying. It can have grave consequences, too: the same systems that target you as a potential buyer of a new bedside lamp could also persuade you to buy a scam health product or vote for a particular political party.
Serious harms can result – for example, online platforms might make it possible to target job ads by race or gender, violating anti-discrimination laws. In 2019, Facebook paid more than $6 million to settle a US lawsuit concerning precisely this kind of discrimination – a tiny slap on the wrist for a company whose yearly revenues were more than $100 billion at the time.
How well does Facebook regulate advertising on its platform?
Like all large platforms, Facebook hosts so much content that it needs to rely on automated systems to make sure ads comply with its rules. But these measures don't appear to be reliable in preventing harmful and unethical advertising practices – as the world learned in 2016 during the US presidential election and the UK referendum on leaving the European Union (Brexit), when advertisers used personal data to craft ads targeted at specific groups and individuals in an effort to manipulate them and influence the result.
Researchers at the internet accountability group Reset Australia recently tested Facebook's anti-misinformation measures by submitting 'dummy' political ads, set up to run ahead of the upcoming federal election. The ads included official-looking notifications falsely stating that people who weren't vaccinated wouldn't be allowed to vote, and that the federal election had been cancelled because of the COVID-19 pandemic. Facebook approved them all. Reset pulled the ads before they ran; had they been published, they would likely have been reported and withdrawn eventually – but only after they'd potentially deceived or confused a large number of people.
Platforms keep us in the dark
When ads appear in newspapers, on billboards or on TV, people can see what kind of messages are being circulated and who else may be receiving them. This makes public accountability for advertising practices relatively straightforward.
But online the situation is dramatically different: ads can be targeted at individuals on their personal devices for reasons they may not fully understand. Someone who receives a job or real-estate ad, for example, has very few ways of knowing who else is seeing the same ad, and therefore whether they're being singled out based on their gender, age, ethnicity or other personal characteristics.
Targeted online ads are equally opaque to the groups we've historically relied on to hold advertisers to account, such as journalists, advocacy groups and regulators. So a major barrier to accountability in online advertising is 'observability': as a society, we need to find new ways to see and detect potentially harmful patterns in how consumers and citizens are being profiled, sorted and targeted by platforms and advertisers.
In response to public pressure and legislation, platforms such as Facebook have begun creating their own 'ad transparency dashboards'. But from a public oversight perspective, these dashboards are hard to interpret and of very limited use. They aggregate or abstract key information, and remove most historical data. They also obscure much of the detailed data necessary to identify patterns in reach and targeting that might indicate discrimination or predatory advertising. Nor is there any independent verification of the accuracy of the data.
How can we achieve better public oversight of online advertising?
People have a right to know how their data is being used by platforms and advertisers to target them. As a society, we need to make sure this data isn't being used to enable unfair discrimination or predatory business practices. So far, platforms and advertisers haven't gone far enough in terms of self-regulation and have aggressively resisted any push for accountability.
In response, the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) has created the Australian Ad Observatory. Modelled on a similar program in the US, the Ad Observatory invites Australians to share the ads they encounter on Facebook with researchers. Anyone can take part by installing a browser extension that automatically collects the ads they receive in their Facebook news feeds – and nothing else. The extension only works on laptop and desktop computers, and can be removed or disabled at any time. If you do install it, you'll be able to see all the ads you receive during the time the extension is running.
We know it will be challenging for an independent project such as this to create a complete picture of how people are being targeted by online advertising. But we hope it will help build public support for greater transparency and accountability, including legal requirements for companies to give detailed information about the ads they run and how they're targeted. As a society, we need these measures to make sure targeted advertising isn't being used in predatory or discriminatory ways.
If you'd like to find out more about the Australian Ad Observatory, or would like to join it, please visit our website.
Disclaimer: Professor Jean Burgess has previously engaged with Facebook as an external academic advisor on policy matters.