The Dark Sides of Digital Marketing

Algorithm-Based Advertising: Unintended Effects and the Tricky Business of Mitigating Adverse Outcomes

Anja Lambrecht and Catherine Tucker


Algorithms, Ad auctions, Discrimination, Gender, Bias, STEM


Algorithms are everywhere
In the digital age, algorithms are often praised as powerful tools that help people and organisations make better decisions and accomplish their objectives more effectively. It is typically assumed that they operate purely on facts and produce unbiased, objective outcomes. However, there is growing evidence that algorithms can lead to outcomes that resemble the discriminatory tendencies of humans. For example, Amazon had to abandon plans for an AI-driven automated recruiting tool because the system turned out to favour male over female applicants. And in 2019, the algorithm behind Apple's newly launched credit card sparked a regulatory enquiry after it offered men much higher credit limits than women, even married couples who shared all their bank accounts.

Biases in automated advertising
Biased algorithms can also be observed in advertising. In an eye-opening study, computer science professor Latanya Sweeney investigated the role of race in Google ads. She searched for first names common among African-Americans and recorded the ads that appeared with the results; she then repeated the searches with names more common among whites. Searches for black-sounding names were more likely to generate ads offering to investigate possible arrest records. Beyond racial discrimination, other findings document gender biases. In our own study of online advertising, we investigated such effects in the context of STEM (Science, Technology, Engineering, Mathematics) careers. We sought to understand how internet and social media algorithms determine whether advertising content gets seen more by men or by women, and why. Our results suggest that advertising algorithms are not gender-biased as such, but that the economic forces governing them can lead to unintended uneven outcomes.


Examining possible explanations
In our field test, the ad promoting STEM careers was displayed to far fewer women than men. This was surprising, as no characteristic of the campaign had specified such an imbalance. We therefore investigated possible explanations.

The first question was whether the algorithm might have learned its behavior from women simply clicking on ads less often than men. If that were the case, the advertising algorithm may have concluded that it was more economical to show ads to men. However, it turned out that women tended to click more often than men, so this could not explain the uneven display of ads.
Second, we asked whether the algorithm might have faced some sort of capacity constraint, with not enough female eyeballs available to show ads to. However, women are just as active on social media as men.
Third, we examined whether the algorithm was reflecting underlying patterns of discrimination against women in specific countries. However, data from the World Bank revealed no relationship between women's educational and labor market opportunities in a country and whether STEM ads were displayed to them in our study.
Finally, we explored whether underlying economic mechanisms might be causing the imbalance in the display of STEM ads across genders, and found an explanation in the way advertising auctions on Facebook and other platforms work (see Box 2).


Economic mechanisms: The actions of other advertisers interfere
The higher price for female views results from the fact that women, especially those aged 25 to 34, are more likely to convert a view of an advertisement into an actual purchase, so competing advertisers bid more for their attention. The implication is that when advertising indiscriminately across genders, as was the case in the campaign for STEM careers, advertisers are more likely to get their ad in front of male eyeballs than female ones. The algorithm does not intend to discriminate, but spillover effects across different industries mean that ads are more likely to reach one segment of the population than another. For an advertiser with a gender-neutral strategy, reaching women is therefore more difficult and more expensive. Economic forces might unintentionally favor men.
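The auction dynamic described above can be sketched in a few lines of code. The advertiser names, bid values and audience composition below are purely illustrative assumptions, not figures from our study, and the single highest-bid-wins rule is a simplification of how real ad platforms allocate impressions:

```python
# Illustrative sketch of the ad-auction spillover effect.
# All numbers and names are assumed for illustration only.

def auction_winner(bids):
    """Return the name of the highest bidder (simplified rule:
    real platforms use more complex auction mechanisms)."""
    return max(bids, key=bids.get)

# A gender-neutral STEM advertiser bids the same for every impression.
STEM_BID = 1.00

# A retail advertiser values female viewers more, because women are
# assumed more likely to convert a view into a purchase.
RETAIL_BIDS = {"female": 1.50, "male": 0.80}

def simulate(impressions):
    """Count how many impressions of each gender the STEM ad wins."""
    wins = {"female": 0, "male": 0}
    for gender in impressions:
        bids = {"stem": STEM_BID, "retail": RETAIL_BIDS[gender]}
        if auction_winner(bids) == "stem":
            wins[gender] += 1
    return wins

# An audience that is half women, half men.
audience = ["female"] * 50 + ["male"] * 50
result = simulate(audience)
```

Even though the STEM advertiser bids identically for everyone, it is outbid for every female impression and wins only male ones, reproducing the uneven outcome without any intent to discriminate.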

Mitigating insidious algorithms is tricky
Finding solutions to this kind of problem is challenging: first, because the issue is caused by the unintended interaction of independent economic actors, each pursuing its own advertising strategy; second, because employment laws in most countries do not yet adequately stipulate how targeted advertising fits within existing discrimination frameworks. Some seemingly simple solutions might not work properly.

  • Separate campaigns?
    At first sight, one solution could be for advertisers to run separate campaigns for men and for women to make sure both demographic groups are targeted equally. We set up a campaign that would do exactly this. However, Facebook prevented us from even running it: in the US, federal law prohibits companies from targeting employment ads at only one gender. Ironically, a law designed to prevent discrimination ruled out a fairly simple way to correct the bias, making it harder for advertisers to fix unintentionally uneven outcomes.

Algorithmic transparency and gender neutrality will not suffice in addressing unequal gender outcomes.

  • Transparency?
    Another popular prescription against apparent discrimination has been algorithmic transparency, whereby algorithmic code is made public. Transparency can help counteract discrimination that is hard-coded into an algorithm. In the particular context of our STEM campaign, however, algorithmic transparency would not have helped regulators foresee uneven outcomes. It would likely have revealed an algorithm focused on minimizing ad costs for advertisers, which is in itself reasonable. Without knowledge of the economic context and of how such cost minimization affects the distribution of advertising, such “transparency” would not have been particularly helpful.
  • Equal advertising distribution across groups?
    This tension illustrates the need for further policy guidance in this area. One potential solution is for platforms to offer advertisers the option of distributing a specific campaign’s ads equally across specified demographic groups.

Policy makers should be watchful
These results should concern policy makers and platforms, as disseminating information is important for ensuring equal access to opportunities. The key mechanism that dictates the distribution of information does not reflect the desirability of disseminating it; instead, it reflects the return on investment on advertising across all industry sectors. Advertising decisions by a retailer selling household products may affect communication opportunities and costs for an employer advertising job opportunities. The very groups that policy makers worry may not receive the same information – in our study, women compared to men – can be more costly to reach.


Anja Lambrecht, Professor of Marketing, London Business School, England, alambrecht@london.edu

Catherine Tucker, Professor of Marketing, Massachusetts Institute of Technology, Cambridge, MA, USA, cetucker@mit.edu


Further Reading

Lambrecht, A., & Tucker, C. (2019): “Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads”, Management Science, Vol. 65(7), 2966-2981. https://doi.org/10.1287/mnsc.2018.3093

Sweeney, L. (2013): “Discrimination in online ad delivery”, ACM Queue, Vol. 11(3), 10:10–10:29.