NGO submits complaints on allegedly discriminatory algorithms for job ads

Published on 25 November 2021

The question

How can online platforms ensure their ad targeting algorithms are non-discriminatory?

The key takeaway

An NGO, Global Witness, has submitted complaints about Facebook's ad targeting to the Equality and Human Rights Commission (EHRC) and the Information Commissioner's Office (ICO). Global Witness has stated that the algorithms Facebook uses in its ad targeting are discriminatory. The algorithms in question resulted in certain job adverts being targeted at audiences that were predominantly one gender (eg nursery nurse jobs were targeted at an audience that was 95% female).

The background

Methodologies for ad targeting have become increasingly refined in recent years. Paid search has become more prominent, while organic posts garner fewer and fewer impressions. The targeting of ads is run by algorithms of increasing sophistication, and platforms are able to use the data they collect on users to target their ads more effectively. This is all in an attempt to boost CTR (click-through rate) and, more importantly, conversions (or sales).
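By way of illustration, both metrics are simple ratios. The sketch below computes them from hypothetical campaign figures (all numbers are invented for illustration; they do not come from the complaint).

```python
# Illustrative only: CTR and conversion rate from invented campaign figures.
impressions = 100_000   # times the ad was shown (hypothetical)
clicks = 2_400          # clicks on the ad (hypothetical)
sales = 120             # purchases attributed to the ad (hypothetical)

ctr = clicks / impressions          # click-through rate
conversion_rate = sales / clicks    # share of clicks that convert

print(f"CTR: {ctr:.2%}")                          # -> CTR: 2.40%
print(f"Conversion rate: {conversion_rate:.2%}")  # -> Conversion rate: 5.00%
```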

These algorithms are complex and can sometimes lead to results that are biased. In March 2019, Facebook was sued in the USA following allegations that its algorithms were discriminatory. In response, Facebook moved to prevent advertisers from targeting housing, employment and credit adverts to people by age and gender. 

Earlier this year, another case against Facebook was brought in the USA. Samantha Liapes alleged that the targeting algorithm had discriminated against her by not showing her ads for insurance, claiming that such adverts were targeted disproportionately at male and younger users.

The development

Global Witness is an NGO based in London. In order to test Facebook's ad targeting, the group ran a series of job ads for particular roles to adults in the UK. Advertisers on Facebook are able to use a range of targeting methods, eg by age group, gender and interests. Global Witness instead used "Optimisation for Ad Delivery", which leaves the targeting up to the platform's algorithm; in this case the optimisation goal was link clicks. This meant that Facebook would show the ads to the audience(s) it determined were most likely to click on them.
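Facebook's actual delivery models are not public, but the toy sketch below illustrates, under assumed data, how a pure "most likely to click" objective can concentrate delivery on one group: if a model trained on historical clicks assigns one gender higher predicted click probabilities, ranking by that score skews who sees the ad.

```python
# Toy illustration only: all users and probabilities are invented.
# A click-optimised objective shows the ad to whoever scores highest,
# so group-level differences in predicted click rates skew delivery.
users = [
    {"id": 1, "gender": "man",   "predicted_click_prob": 0.031},
    {"id": 2, "gender": "woman", "predicted_click_prob": 0.009},
    {"id": 3, "gender": "man",   "predicted_click_prob": 0.027},
    {"id": 4, "gender": "woman", "predicted_click_prob": 0.011},
]

# Deliver the ad to the half of the audience most likely to click.
ranked = sorted(users, key=lambda u: u["predicted_click_prob"], reverse=True)
shown = ranked[: len(ranked) // 2]
print([u["gender"] for u in shown])  # -> ['man', 'man']
```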

Global Witness ran several ads for genuine job openings. Of the people shown the ads, 96% of those shown mechanic jobs were men, 95% of those shown nursery nurse jobs were women, 75% of those shown pilot jobs were men and 77% of those shown psychologist jobs were women. Global Witness has called on the EHRC to investigate these results and whether the algorithm breaches equality and data laws. Global Witness has also consulted the ICO under Article 36(1) of the UK GDPR on the basis that the data processing used by Facebook's advertising tool presents a high risk of discrimination contrary to the fairness principle in Article 5(1)(a) of the UK GDPR.
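As a rough illustration of how such figures are derived, the sketch below computes the gender split of delivery from per-ad impression counts. The data structure and numbers are invented (chosen to mirror the reported percentages); Facebook's actual reporting tools are not shown.

```python
# Hypothetical delivery figures per job ad, broken down by gender.
# Numbers are invented to mirror the skews Global Witness reported.
delivery = {
    "mechanic":     {"men": 9_600, "women": 400},
    "nursery":      {"men": 500,   "women": 9_500},
    "pilot":        {"men": 7_500, "women": 2_500},
    "psychologist": {"men": 2_300, "women": 7_700},
}

for job, counts in delivery.items():
    total = counts["men"] + counts["women"]
    skew = max(counts.values()) / total          # share seen by the dominant gender
    dominant = max(counts, key=counts.get)
    print(f"{job}: {skew:.0%} {dominant}")       # eg "mechanic: 96% men"
```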

Global Witness is also calling on the Government to make it mandatory for technology companies to make their targeting criteria transparent and to carry out risk assessments on potentially discriminatory algorithms in order to identify and mitigate potential issues.

Why is this important?

All online platforms with a presence in the UK need to be aware of the laws on personal rights, data and much more besides. This complaint hints at the difficulties ahead in managing automation and algorithms, and in building human oversight into them.

Any practical tips?

Platforms need to be mindful that an emphasis on optimisation can lead to undesired results. All organisations should strike a balance between leveraging technology for impact and incorporating human checks and balances along the way; one simple form of such a check is sketched below. Being able to show that an algorithm uses a range of metrics to target ads effectively will also help demonstrate that it is not discriminatory.
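A minimal sketch of such a human check, assuming delivery figures broken down by gender: flag any ad whose audience skews past a chosen threshold so a person can review it before the campaign continues. The threshold and data shape are assumptions for illustration, not a legal standard.

```python
# Minimal sketch of a human-in-the-loop skew check. The 70% threshold is
# illustrative only; real audits would consider more characteristics.
SKEW_THRESHOLD = 0.70  # flag if one gender receives over 70% of impressions

def flag_for_review(ad_name: str, men: int, women: int) -> bool:
    """Return True if the ad's delivery should be escalated to a human reviewer."""
    total = men + women
    if total == 0:
        return False
    skew = max(men, women) / total
    if skew > SKEW_THRESHOLD:
        print(f"Review {ad_name}: {skew:.0%} of impressions went to one gender")
        return True
    return False

flag_for_review("mechanic ad", men=9_600, women=400)   # flagged (96%)
flag_for_review("pilot ad", men=7_500, women=2_500)    # flagged (75%)
flag_for_review("generic ad", men=5_100, women=4_900)  # not flagged (51%)
```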