Laws have some catching up to do with websites that use manipulative software design to, for instance, charge your credit card more.
Have you ever used your credit card to make an online purchase and ended up buying more than you intended, or paying a higher price than you expected? You might be a victim of so-called “dark patterns.” Marketers have always tried to manipulate consumers into buying their goods, and the digital age has given them a new canvas that enables a far wider range of tricks.
Harry Brignull, a user experience designer credited with coining the term “dark patterns” and founder of darkpatterns.org, describes them as manipulative software design “that gets users to complete an action they would not otherwise have done if they had understood it or had a choice at the time.”
Manipulation through dark patterns reduces individual welfare, since consumers can’t act on their real preferences. It also diminishes social welfare: it helps dominant businesses collect data and stifle competition, thereby maintaining their market share.
With the growth of online commerce, especially following the pandemic, credit card users need more protection from this sort of manipulation. Legislation has been introduced to deal with this pattern of marketing, and the Federal Trade Commission (FTC) is also aiming to tackle such tricks more actively.
Examples of dark patterns
Speaking at a Federal Trade Commission workshop on dark patterns, Brignull compared the practice to running a business that could garner a 21% rise in sales by getting consumers to press a certain button. Doing so would be a “no-brainer” for the business, and that incentive is what drives dark patterns.
A topical example of dark patterns is how airlines reacted during the pandemic when flights were canceled worldwide, and they were met with demands for refunds from travelers. Their websites were not designed to encourage consumers to ask for refunds. Instead, the button readily available for consumers to click on directed them to get a voucher or rebook their ticket for a later date.
Finn Lützow-Holm Myrstad, director of digital policy at the Norwegian Consumer Council, said at the FTC workshop, “And both of those options are actually giving the airline an interest-free loan with no security. If you wanted to get your money back, a refund, a lot of the airlines would have an impossible-to-see button in very small writing, far down below in the website, and you almost have no ability to click it.”
In another example, the food delivery company Instacart has been sued for adding a 10% charge to consumer bills, presenting it to them as a tip for delivery people. There was an obscure link on its website that allowed consumers to opt out of this charge, which went to Instacart rather than the delivery people.
There are a variety of ways to manipulate online shoppers. These include:
- Making consumers jump through hoops to opt out of something
- Online forms that trick consumers into giving information they didn’t intend to provide
- Online sites that sneak items into your shopping basket, often via an opt-out option on an earlier page that you missed
- Making it difficult for you to compare prices of items so that you can’t make a good decision
- Charging for items such as taxes or delivery fees, or tacking on other unexpected charges, at the final point of checkout (hidden fees are particularly an issue with hotel and travel websites)
- Charging your credit card without notice after your free subscription term runs out
- Getting your email or social media permissions under the guise of finding your friends, and then spamming them with messages
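The “sneak into basket” trick above can be sketched in a few lines of hypothetical checkout logic (the function name and fee amount are invented for illustration): an optional add-on defaults to opted-in, so only shoppers who notice and use the opt-out avoid the charge.

```python
# Hypothetical sketch of a "sneak into basket" dark pattern.
# The add-on fee is charged by default; a shopper must actively
# find the opt-out (often buried on an earlier page) to avoid it.

def cart_total(item_prices, addon_price=4.99, addon_opted_out=False):
    """Return the checkout total; the optional add-on is charged
    unless the shopper explicitly opted out."""
    total = sum(item_prices)
    if not addon_opted_out:  # default path: fee is silently added
        total += addon_price
    return round(total, 2)

# A shopper who never saw the opt-out pays more than expected:
print(cart_total([10.00, 5.50]))                        # 20.49, not 15.50
# Only a shopper who noticed the checkbox pays the true price:
print(cart_total([10.00, 5.50], addon_opted_out=True))  # 15.5
```

A neutral design would make the same add-on opt-in, so the default total matches what the shopper expects to pay.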
Legislation tackling dark patterns
Regulators are catching up to such manipulation, and legislative efforts are underway. For one, Senators Mark Warner and Deb Fischer introduced the “Deceptive Experiences to Online Users Reduction Act,” or DETOUR Act, in 2019. The bill’s broad aim is “to prohibit the usage of exploitative and deceptive practices by large online operators and to promote consumer welfare in the use of behavioral research by such providers.”
The bill would prohibit designing or tinkering with a user interface in a way that interferes with consumer autonomy in order to obtain consent or data from consumers.
It would also bar segmenting consumers online into groups for behavioral studies without their permission.
And it would outlaw tampering with a user interface so as to engage children under 13 in compulsive usage.
In emailed comments, Sean O’Brien, founder of the Yale Privacy Lab (and a lecturer in cybersecurity at Yale Law School), noted, “The DETOUR Act is a positive step forward, especially where informed consent is concerned, and attempts to tackle user interfaces that result in compulsive usage for children under the age of 13. Given the complexity of proving, for example, that A/B testing is occurring in a shopping cart environment and that such manipulation has not been consented to, I worry the law will be difficult to apply to real-world cases.”
FTC empowered to pursue unfair and deceptive dark patterns
The DETOUR Act would empower the Federal Trade Commission to pursue violations as unfair and deceptive practices. The FTC can already charge online companies that trick consumers into recurring charges under the Restore Online Shoppers’ Confidence Act. And the CAN-SPAM Act requires marketers to provide an easy way for consumers to opt out of email messages.
In a statement condemning the practice of online obfuscation (while pursuing a case against Age of Learning), Rohit Chopra, FTC commissioner at the time, said, “Digital deception should not be a viable American business model. If the FTC aspires to be a credible watchdog of digital markets, the agency must deploy these tools to go after large firms that make millions, or even billions, through tricking and trapping users through dark patterns. We cannot replicate the whack-a-mole strategy that we have pursued on pressing issues like fake reviews, digital disinformation, and data protection.”
California has been in the vanguard on online consumer privacy: its California Consumer Privacy Act took effect in 2020. The state also amended that law’s regulations in 2021 to provide protections against dark patterns. Going a step further, the state has passed the California Privacy Rights Act (CPRA), which takes effect in 2023, further protects against dark patterns, and establishes an enforcement agency, the California Privacy Protection Agency. Among other things, the CPRA gives consumers the right to opt out of the sharing of their personal information.
Laws against dark patterns are difficult to enforce
While it’s encouraging that a variety of regulatory avenues seek to limit dark patterns, enforcers face challenges in cracking down on them. For one, it’s difficult to test interfaces to determine whether they are manipulative. And this sort of deception is unlikely to attract the attention of internal whistleblowers.
Jennifer Rimm, an assistant attorney general for the District of Columbia, noted at the FTC workshop, “The proliferation and sheer variety of dark patterns on its own makes it more difficult for government enforcers to effectively identify the most severe offenders and to address these offenders through targeted enforcement actions. Added to this, a feature of dark patterns is that they are covert or deceptive, meaning that consumers may not realize they are being manipulated, or they may be inured to these dark patterns due to their prevalence.”
Dark patterns also seem to be more prevalent in smartphone apps than on desktop websites, partly because they are harder to detect on a smaller screen. That, too, is a hurdle to enforcement. At the FTC workshop, Johanna Gunawan, a doctoral student at Northeastern University, shared, “If an audit only examined the desktop site, they might think that service is compliant and provides meaningful methods for users to delete their accounts when the reality for mobile users would be quite different.”
Another deterrent to going after dark patterns is the concern that businesses could invoke commercial free speech protections under the First Amendment in their defense, since some of these manipulations fall into a grey area, Lior Strahilevitz, a professor at the University of Chicago Law School, observed at the FTC workshop.
Yale’s O’Brien believes that’s not always the case. He said, “I don’t find it likely that a First Amendment claim would protect a business that is engaging in price-fixing, discriminatory pricing, and behavioral or psychological manipulation. However, I see the primary challenge as proving tangible harm to the consumer – as with privacy laws in the U.S., this bar can be difficult to meet.”
For consumers who feel they have been misled by dark patterns, one recourse, until more robust laws are in place, is to file a complaint with the FTC or, for matters relating to financial products or services, with the Consumer Financial Protection Bureau.
The digital era has provided marketers with a variety of ways to trick consumers into giving up their data and money with the help of manipulative software design. Lawmakers are catching on to such practices and developing consumer protections.
It’s not clear how such laws will work in practice, though. In the meantime, you can always file a complaint with the FTC or the CFPB if you believe you have been at the receiving end of an unfair or deceptive e-commerce practice.