
The Dark Sides of Digital Marketing

Marketing Automation: Marketing Utopia or Marketing Dystopia?

Klaus Wertenbroch

Keywords

AI, Algorithms, Free Choice, Marketing Automation, Personalization


Marketing Utopia – Individual real-time access to consumers for convenient and relevant offers
Marketing has undergone revolutionary changes in the last decade. Virtually all marketing processes can now be automated, from segmentation and targeting to service provision, advertising, distribution, retailing, and pricing. The ability to track individuals’ behavior online and to merge multiple data sources into “big data” sets increasingly allows marketers to target consumers individually. Machine learning-based algorithms can tailor product offers, advertisements, and prices to individuals in real time: Utopia has become real for marketers. Such personalization boosts companies’ profitability through more accurate price discrimination, and consumers enjoy convenience and offers tailored to their needs. However, automating and personalizing interactions may also have less positive economic and psychological consequences for consumers, among them higher individual prices and threats to their perceived autonomy.

Higher individual prices for consumers
Companies maximize profits when every customer pays a price close to his or her willingness to pay (WTP). In the past, individual WTP was impossible to determine, so consumers could often buy for less than they would have been ready to pay. Today, machine learning-based prediction algorithms can approximate individuals’ preferences and WTP with ever greater precision and can create personalized offers that reflect this knowledge. In one experiment, the recruiting company ziprecruiter.com found that it could increase profits by more than 80% by switching from its historical uniform pricing to algorithm-based individualized pricing built on more than a hundred input variables characterizing each of its customers. Uber’s route-based pricing reportedly uses machine learning to set route- and time-of-day-specific prices that take varying demand conditions into account. Uber could easily use customers’ ride histories and other personal data, along with information that machine learning can extract from linking different riders’ data, to derive even more personalized prices. While these possibilities help companies advance their profit and shareholder-value objectives, they are alarming for customers. Personalized price discrimination may benefit consumers with a lower WTP who might otherwise be priced out of the market, but overall, consumers likely end up paying prices closer to their WTP and keeping less surplus, especially those with a higher WTP.
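
How such individualized pricing works is easy to sketch. The following Python snippet is a minimal illustration, not ziprecruiter.com’s or Uber’s actual system: all data, feature names, and the pricing rule are hypothetical. A model is trained to predict individual WTP from observed customer attributes, and the firm quotes each customer a price just below the prediction.

```python
# Minimal sketch of algorithmic price personalization. All data and
# features are synthetic and hypothetical; no company's actual system.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for the many input variables a firm might observe
# per customer (e.g., past spend, visit frequency, location, device).
n_customers = 5000
X = rng.normal(size=(n_customers, 10))

# Unobserved "true" WTP: some function of the features plus noise.
true_wtp = (50 + 8 * X[:, 0] + 5 * X[:, 1]
            + 10 * np.maximum(X[:, 2], 0)
            + rng.normal(0, 4, n_customers))

# The firm fits a prediction model on historical transaction data.
model = GradientBoostingRegressor().fit(X, true_wtp)

# Personalized price: quote just below the predicted WTP, so most
# customers still buy while surrendering most of their surplus.
new_customer = rng.normal(size=(1, 10))
predicted_wtp = model.predict(new_customer)[0]
price = 0.95 * predicted_wtp
print(f"Predicted WTP: {predicted_wtp:.2f}, quoted price: {price:.2f}")
```

The better the model predicts, the smaller the gap between the quoted price and true WTP, and the less surplus each consumer keeps; this is the mechanism behind both the profit gains and the consumer concerns described above.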

Automating and personalizing interactions may have less positive economic and psychological consequences for consumers, among them higher individual prices and threats to their perceived autonomy.

Low compensation for personal data
Typically, consumers freely reveal all the information necessary to infer their preferences and WTP. Wouldn’t charging for such data allow consumers to be compensated for the downsides of personalization? Companies argue that they amply compensate consumers with better offers and free services like YouTube videos, social networking, etc., whereas critics argue that the compensation falls short. In several laboratory experiments, applying strict criteria of rational choice theory, we found that consumers systematically underprice their private data when they barter it away for goods or services as opposed to selling it for money (see Box 1 and Figure 1). Consider consumers using Google or Facebook: they pay for these services with private data, which the companies collect and use to generate profits as advertising platforms. Consumers appear to undervalue their private data in such non-monetary exchanges because they do not view their data as a marketable resource, even though they are handing the data over to for-profit companies. This allows companies to extract extraordinary profits and gain market power at consumers’ expense. The unprecedented valuations of the dominant technology companies, to which consumers turn over their private data, perhaps reflect this uneven exchange. In short, markets for personal data may not work efficiently, and consumers bear the cost.

Box 1: Consumers underprice their private data

What’s adequate compensation for consumers’ private data? Geoff Tomaino, Dan Walters, and I conducted several experiments to investigate the price consumers demand for their private information. In a series of studies, we compared how much several thousand participants on Amazon’s MTurk and Prolific demanded for the same private data in exchange for money or for goods and services. Consumers with rational preferences for privacy should demand equal compensation in both conditions. However, across all experiments, consumers systematically valued their private data less when asked to trade it for goods (as measured by how much money they wanted for those goods) than when asked to sell it for money. Of course, e-commerce companies usually collect consumers’ private data in return for services, not in return for money.
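
The rationality benchmark behind these findings is simple to state: a consumer who demands a given amount of money to sell a piece of data should, in a barter, demand goods he or she values at least as highly for the same data. The sketch below illustrates that benchmark with purely hypothetical numbers, not our experimental data.

```python
# Illustration of the rational-choice benchmark for pricing private data.
# The numbers are hypothetical, chosen only to show the comparison logic.
money_demanded = 10.00  # $ a participant asks to sell the data for cash
goods_demanded = 4.00   # participant's own $ valuation of the goods
                        # he or she asks for in the barter condition

# Rational preferences for privacy imply goods_demanded == money_demanded.
implied_discount = 1 - goods_demanded / money_demanded
print(f"Data underpriced by {implied_discount:.0%} in the barter condition")
```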

Loss of autonomy
For consumers, there is another discomforting aspect of giving up their privacy: less autonomy. As human beings and consumers, we value being autonomous in our choices, free from external influence imposed by other agents and expressing our own free will. But autonomy requires privacy. Without privacy, we become predictable, which, of course, is the very purpose of prediction algorithms, used to predict anything from individuals’ credit defaults or insurance claims to their responses to advertisements and purchase probabilities. In further experiments, Rom Schrift, Yonat Zwebner, and I found that consumers act as if they experience a threat to their autonomy when they learn that algorithms can predict their choices. Participants who were told that an algorithm could predict their choices, rather than merely calculate how consistent their choices were with their preferences, ended up choosing less preferred options to re-establish their sense of autonomy. Consumer acceptance of prediction algorithms may thus depend on whether marketers frame them in ways that preserve users’ perceived autonomy in their choices.
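
To make concrete what such a prediction algorithm does, the sketch below trains a simple purchase-probability model on synthetic data. It is purely illustrative: the features and data are invented, and real systems use far richer inputs, but the logic, turning a behavioral trace into a predicted choice, is the same.

```python
# Minimal sketch of a choice-prediction algorithm (synthetic data;
# purely illustrative). Past behavior is used to predict the
# probability that a consumer chooses, e.g., to buy an offered item.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

n = 2000
X = rng.normal(size=(n, 5))        # e.g., past purchases, browsing time
logits = 2.0 * X[:, 0] - X[:, 1]   # unobserved true decision process
y = rng.random(n) < 1 / (1 + np.exp(-logits))  # observed past choices

model = LogisticRegression().fit(X, y)

consumer = rng.normal(size=(1, 5))
p_buy = model.predict_proba(consumer)[0, 1]
print(f"Predicted probability that this consumer buys: {p_buy:.2f}")
```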

Surrendering to a black box
Another concern with decision-making algorithms is their “black-box” nature. Often, the mechanisms behind algorithms are too complex to be “explainable” or cannot be made transparent for competitive reasons. Not knowing how and why an algorithm blocks a desired financial transaction or sets a credit card limit worries regulators and antagonizes many consumers. GDPR Articles 13 through 15 require companies to provide customers with “meaningful information about the logic involved” in such automated decisions. In another set of experiments, we found that goal-oriented (“teleological”) explanations, which tell customers why an algorithmic decision process was put in place, can make up for the lack of a mechanistic explanation. In an actual marketplace setting, we showed that explaining the goals of an algorithm can satisfy customers more than merely informing them of a negative outcome, because explaining goals signals that customers are being treated fairly.
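
As a design pattern, the two explanation styles are easy to contrast in code. The sketch below is hypothetical, with invented message wording not drawn from our marketplace study; it simply shows how a customer-facing system might attach a goal-oriented explanation to an automated decision when the mechanism cannot be disclosed.

```python
# Sketch of pairing an automated decision with a teleological
# (goal-oriented) explanation. Templates are hypothetical wording.

def explain_decision(decision: str, style: str) -> str:
    """Return a customer-facing explanation for an algorithmic decision."""
    if style == "mechanistic":
        # Often infeasible: the model may be too complex or proprietary.
        return (f"Your transaction was {decision} because the model's "
                "risk score exceeded an internal threshold.")
    if style == "teleological":
        # Explains why the algorithm exists; per the findings above,
        # this can substitute for an unavailable mechanistic account.
        return (f"Your transaction was {decision} by an automated check "
                "whose goal is to protect all customers from fraud and "
                "to apply the same standard to every account.")
    raise ValueError(f"unknown explanation style: {style}")

print(explain_decision("declined", "teleological"))
```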

The complex challenge of mitigating marketing dystopia
Preventing dystopian outcomes is typically a task for regulators, but finding solutions can be difficult. Companies need to address consumer concerns in their policies as well. Figure 2 and the following points summarize possible measures.


  • Regulation to support competition
    To protect customers and prevent companies from using their market power to charge higher prices or to collect personal data without adequate compensation, regulators may attempt both to protect consumer privacy and to encourage competition. Ironically, competition to provide consumers with better, more personalized offers at competitive, less discriminating prices requires sharing consumers’ personal data across companies. Privacy thus poses a policy conundrum: policy makers must protect consumer privacy to limit companies’ opportunities to monopolize markets by extracting value from personal data, yet regulation such as the European Union’s GDPR may stifle the very competition that depends on sharing those data. Paradoxically, we may not be able to have both: protecting privacy undermines competition, and protecting competition undermines privacy.
     
  • Transparency by companies
    Given the difficulties regulators face, companies themselves should take data privacy seriously. Instead of opposing attempts by consumers and regulators to protect privacy and to rein in the unlimited collection and use of private data, they should adopt policies that give consumers authority over their own data. Being transparent about how personal data are collected and used, and giving consumers a better understanding of and control over their data, can help restore faith in automated marketing. This may limit price discrimination opportunities, but it will protect brands and profits in the long term.
  • Frame algorithms in positive ways
    Even if many consumers view algorithms with suspicion, algorithms can be more efficient and accurate than humans and can improve our lives. To realize this potential, companies need to address consumers’ concerns and design algorithms in ways that help consumers (re)establish trust and prevent reactance. Rather than emphasizing that algorithms predict individual behavior, marketers should present them as tools that enable consumers to act consistently with their own preferences. Making algorithms transparent can further reduce skepticism; where that is not possible, explaining the goals of an algorithm can also reduce fears associated with AI-driven decisions.

Considering all the effects of marketing automation, avoiding marketing dystopia is in the best interest of all market participants, at least from a longer-term perspective. To avoid dystopia, companies need to take consumer psychology into account and resist the temptation to maximize short-term profits at consumers’ expense.

Authors

Klaus Wertenbroch, Novartis Chaired Professor of Management and the Environment & Professor of Marketing, INSEAD, Singapore, klaus.wertenbroch@insead.edu
https://www.insead.edu/faculty-research/faculty/klaus-wertenbroch

Further Reading

André, Q.; Carmon, Z.; Wertenbroch, K.; et al. (2018): “Consumer Choice and Autonomy in the Age of Artificial Intelligence and Big Data,” Consumer Needs and Solutions, Vol. 5 (1-2), 28-37.

Dubé, J.-P.; & Misra, S. (2017): “Scalable Price Targeting,” NBER Working Paper 23775, http://www.nber.org/papers/w23775.

Carmon, Z.; Schrift, R.; Wertenbroch, K.; & Yang, H. (2019): “Designing AI Systems That Customers Won’t Hate,” MIT Sloan Management Review, https://mitsmr.com/2qY8i35.

Tomaino, G.; Abdulhalim, H.; Kireyev, P.; & Wertenbroch, K. (2020): “Denied by an (Unexplainable) Algorithm: Teleological Explanations for Algorithmic Decisions Enhance Customer Satisfaction,” INSEAD Working Paper No. 2020/39/MKT, http://dx.doi.org/10.2139/ssrn.3683754.