Research

30 October, 2025

Algorithmic Discrimination: A New Perspective on Cost-Driven Information Inequality

Paper:
Title: Algorithmic Bias? An Empirical Study into Apparent Gender-Based Discrimination in the Display of STEM Career Ads
Authors: Anja Lambrecht and Catherine Tucker
Publication: Management Science (doi:10.1287/mnsc.2018.3093)

Why Automated Algorithmic Decisions Produce Seemingly Discriminatory Outcomes

The increasing adoption of algorithms in automating decision-making has generated deep concern that these automated choices may produce discriminatory results. Traditional explanations for algorithmic discrimination often cite intentional or unconscious bias by programmers or bias learned from behavioral data.

However, the empirical study by Lambrecht and Tucker (2018) provides a nuanced explanation by examining the display of ads promoting STEM career opportunities. This ad campaign was explicitly intended to be gender-neutral in its delivery.

Key Finding: Economic Crowding Out as the Core Driver of Disparity

The central finding was that despite the gender-neutral intent, the ad was shown to over 20% more men than women across the 191 countries studied, with the gap most pronounced in the 25–54 age range. The research ruled out several alternative explanations:

  • Not user behavior: Women's click-through rate (CTR) of 0.167% was actually higher than men's 0.131%, ruling out the hypothesis that the algorithm reduced women's exposure because they were less likely to engage.
  • Not cultural prejudice: The imbalance did not track country-level factors such as female education levels, labor-market participation, or general gender-equality indices.

Core Cause, Competitive Spillovers and Pricing Pressure: The unequal allocation reflected the economics of ad delivery in competitive online markets.

  • Women as a Prized Demographic: Evidence suggests that younger women are a "prized demographic" because they often control household purchases and have higher conversion rates (more likely to purchase after clicking) for other sectors like retail.
  • Cost-Driven Disparate Impact: Consequently, female "eyeballs" are more expensive to advertise to. An algorithm optimizing solely for cost-effectiveness will favor the cheaper audience (men) when displaying a gender-neutral ad, leading to an apparent disparate impact due to crowding out.

This suggests that economic forces can distort algorithmic decision-making, disadvantaging one group relative to another, even when human bias is removed.
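The crowding-out mechanism can be sketched with a toy allocation model. In this sketch (all prices, budgets, and inventory figures are hypothetical, not taken from the paper), a cost-minimizing buyer spends a fixed budget on the cheapest impressions first; because competing advertisers have bid up the price of reaching women, a "gender-neutral" campaign ends up delivered mostly to men:

```python
def allocate(budget, inventory):
    """Buy impressions cheapest-first until the budget runs out.

    inventory: list of (audience, cpm, available_impressions), where
    cpm is the cost in dollars per 1,000 impressions (hypothetical).
    Returns a dict mapping audience -> impressions bought.
    """
    bought = {}
    for audience, cpm, available in sorted(inventory, key=lambda x: x[1]):
        max_cost = cpm * available / 1000       # cost of buying all inventory
        spend = min(budget, max_cost)
        bought[audience] = int(spend / cpm * 1000)
        budget -= spend
    return bought

# Hypothetical market: retail advertisers bid up the price of female
# impressions, so the neutral cost-minimizer buys male inventory first.
inventory = [
    ("men",   2.0, 600_000),   # $2.00 CPM
    ("women", 3.5, 600_000),   # $3.50 CPM: the "prized demographic" premium
]
result = allocate(1500.0, inventory)
# Men's inventory is exhausted before any budget reaches the pricier
# female audience, producing a skewed delivery with no biased intent.
```

The point of the sketch is that the skew emerges purely from the objective function and the prices, with no demographic variable anywhere in the optimization.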

Policy Implications and Legal Tension

The findings highlight significant policy challenges in regulating algorithmic bias:

  • Limits of Algorithmic Transparency: Requiring mere "algorithmic transparency" may be insufficient. Public scrutiny of the code would likely only reveal an algorithm aiming for the reasonable goal of cost minimization, failing to reveal how that minimization interacts with competitive economic environments to create bias.
  • Legal Conflict: U.S. employment discrimination law distinguishes between "disparate treatment" (intentional discrimination) and "disparate impact" (neutral practices leading to negative outcomes). The legal status of targeted advertising within this framework is unclear.
  • The Paradox of Correction: When advertisers try to correct the observed imbalance by running a campaign targeted only at women to ensure equal reach, the platform often rejects the ad, because federal law prohibits using targeting to exclude a protected group in employment advertising (disparate treatment). This restriction prevents firms from using digital targeting to rectify imbalances the algorithm itself created.

Policymakers should consider new guidance, potentially requiring platforms to offer advertisers the option of automatically equalizing ad distribution across specified demographic groups for certain campaigns.
