Digital platforms have become central to everyday life, whether for shopping, learning, or entertainment. While we believe we are making informed choices, much of what we see and select is shaped by design choices engineered to influence behaviour. These manipulative and deceptive designs, now known as “dark patterns”, are being adopted by e-commerce entities to guide users toward outcomes they may not have chosen freely.

From one-click subscriptions with obscure cancellations to hidden fees at checkout and misleading urgency cues, these practices are not accidental. They are conscious tactics that exploit user attention, distort competition, and require scrutiny under both consumer protection and competition law.

Recognition of Dark Patterns under the Indian Legal Framework

The Advertising Standards Council of India (ASCI) first officially acknowledged ‘dark patterns’ in 2022, defining them as misleading or deceptive practices that violate consumer rights under the Consumer Protection Act, 2019 (CPA). Its discussion paper [1] reported dark patterns in 52 of the 53 Indian apps studied, and noted that almost 29% of advertisements in 2021-22 involved disguised advertising, a typical dark pattern. [2]

Guidelines for Online Deceptive Design Patterns in Advertising, 2023: [3] Issued by ASCI, these Guidelines identify major manipulative tactics commonly used in digital advertising, such as drip pricing (where additional charges are disclosed only at the final stage of purchase), bait-and-switch tactics (where the advertised offer is substituted with another), false urgency (which pressures buyers with artificial time constraints), and disguised advertising (where promotional content is presented without proper disclosure).

Consumer Protection Act, 2019 (CPA): While the said Act addresses unfair trade practices and misleading advertisements,[4] it lacks express language dealing with deceptive user interface or user experience design strategies.

Digital Personal Data Protection Act, 2023 (DPDPA, 2023): This Act provides a comprehensive framework for data privacy in India. In the context of dark patterns, it addresses “forced action”, i.e., compelling users to purchase more goods or provide more personal data than intended, often by making privacy settings difficult to change. The Act’s provisions are triggered whenever a data fiduciary engages in such forced actions. Under Section 2(u) of the said Act, forcing users to share personal data without consent is deemed a personal data breach, punishable by fines of up to ₹250 crore. However, the said Act deals only with the data privacy aspects of dark patterns, thereby highlighting the need for more comprehensive legislation covering all types of dark patterns.

Guidelines for Prevention and Regulation of Dark Patterns, 2023 (Guidelines, 2023): Issued by the Central Consumer Protection Authority (CCPA),[5] these guidelines marked India’s first regulatory effort specifically targeted at identifying and curbing deceptive digital design practices by platforms, sellers, and advertisers.

Section 2(e) of the Guidelines, 2023 defines “dark patterns” as “any practices or deceptive design pattern using user interface or user experience interactions on any platform that is designed to mislead or trick users into doing something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision making or choice, amounting to misleading advertisement or unfair trade practice or violation of consumer rights”.

In addition to the definition, the Guidelines, 2023 recognized an illustrative list of thirteen dark patterns, namely, (i) false urgency; (ii) basket sneaking; (iii) confirm shaming; (iv) forced action; (v) subscription trap; (vi) interface interference; (vii) bait and switch; (viii) drip pricing; (ix) disguised advertisement; (x) nagging; (xi) trick questions; (xii) SaaS billing; and (xiii) rogue malware. [6]

Though the Guidelines, 2023 are an important step towards formally acknowledging dark patterns in Indian law, they face significant enforcement limitations. Although Guideline 4 [7] bars all parties from engaging in dark pattern practices, the lack of penal provisions renders these bans toothless, leaving no concrete mechanism for the Competent Authority to enforce them.

In addition, the scope of protection remains ambiguous. Although the Guidelines, 2023 broadly refer to “users,” their linkage to “consumer rights” under the CPA raises major concerns. This is because the CPA defines a “consumer” as someone who buys goods or services for consideration, which would exclude the applicability of the Guidelines, 2023 to users who only interact with a platform without making a purchase, thus creating a significant gap in protection.

Interface Manipulation as Abuse of Dominance under the Competition Act, 2002 (Act, 2002)

From Consumer Harm to Competitive Harm

Dark patterns, when used by market-dominant enterprises,[8] can go beyond consumer issues and pose serious competition law concerns. Where interface design is manipulated not simply to persuade, but to hinder users’ freedom of choice, distort choice architecture, and deny competitors access to markets, it can be categorized as non-price exclusionary behaviour and attract Section 4 of the Act, 2002.

Section 4(1) of the Act, 2002 prohibits any enterprise or group from abusing its dominant position in the market. More specifically, Section 4(2) delineates instances of such abuse, including the imposition of unfair conditions on consumers,[9] denial of market access to competitors,[10] and practices that limit market development to the prejudice of consumers.[11]

Interface Design as a Tool of Exclusion

A significant example of dark patterns is the widespread use of subscription traps by dominant streaming and retail platforms, such as Amazon and Netflix. Such platforms deliberately offer easy “single-click” subscription, but make cancellation disproportionately difficult through multi-step processes, hidden settings, or coercive prompts like “Are you sure you want to lose your benefits?”, a tactic known as confirm shaming. These techniques impose psychological and structural barriers to exit, effectively locking users into services and denying potential competitors market access, thereby invoking Section 4(2)(c) of the Act, 2002.

Other interface practices, such as basket sneaking and drip pricing, commonly seen in dominant online marketplaces, undermine price transparency and exploit consumer inertia. Ancillary costs, for instance, in the case of drip pricing, are revealed only at the point of payment, while basket sneaking entails pre-ticking add-ons such as warranty extensions or contributions that users must actively opt out of.

These design mechanisms capitalize on information asymmetry and stealthily manipulate user behaviour to the platform’s benefit. Dominant companies engineer user consent and thereby gain a competitive advantage not through innovation but through the exploitation of interface power, a practice arguably covered by Section 4(2)(a)(i) of the Act, 2002, as the unfair imposition of terms.

Emerging Regulatory Attention

The evolving global and domestic regulatory responses underscore the growing recognition of interface manipulation as a locus of abuse. In the United States, Amazon has faced multiple actions by the Federal Trade Commission (FTC), which accused it of employing deceptive interface designs to enrol users into auto-renewing Prime memberships without informed consent.[12] In India, the Central Consumer Protection Authority (CCPA) issued a notice to Amazon India for engaging in similar practices involving Prime subscriptions.[13]

In another high-profile case, the Competition Commission of India (CCI) imposed a substantial penalty of approximately INR 1,000 crore against Google for abuse of dominance in the market for Android mobile operating systems. The CCI directed Google to stop mandating the pre-installation of its suite of applications, required that access to its Play Services APIs be made non-discriminatory, and prohibited exclusive arrangements with OEMs. Importantly, the CCI ordered Google to permit users to uninstall pre-installed apps and to offer them a choice of default search engine during device setup.[14] While the ruling did not explicitly identify dark patterns, it addressed interface-level restrictions and control mechanisms as aspects of dominance.

Similarly, the U.S. government took action against Adobe for “subscription trapping,” alleging that it concealed early termination fees and made the cancellation process unreasonably difficult, another instance of manipulative interface design contributing to market power and consumer lock-in.[15]

While these instances reflect increasing regulatory awareness, the characterization of dark patterns as a separate form of exclusionary abuse under competition law remains limited. With digital platforms increasingly exercising interface design as a competitive tool, a more dynamic and purposive interpretation of Section 4 of the Act, 2002 is necessary to ensure that both user autonomy and market fairness are preserved.

Conclusion and the Way Forward

Dark patterns must be viewed not only as consumer deception but also as market abuse. A coordinated regulatory response involving the CCPA, CCI, and the Data Protection Board is imperative. While the CCPA’s Guidelines, 2023 lack sufficient penal force, the CCI can intervene under Section 4 of the Act, 2002 to address interface-based exclusionary practices. The Data Protection Board, in turn, must proactively safeguard user privacy from such manipulative designs. Ultimately, as the user interface becomes a competitive tool, the law must evolve to match. A dynamic interpretation of Section 4 of the Act, 2002 by the CCI can play a critical role in curbing these practices and protecting user autonomy.

Authored by Mr. Rohit Jolly (Associate Partner), Mrs. Sonali Khanna (Senior Associate), Ms. Sukriti Verma (Associate) and Ms. Vanshika Gupta (Associate).

Endnotes:

[1] ASCI, Dark Patterns – The New Threat to Consumer Protection, Discussion Paper, November 2022, available at: https://www.ascionline.in/wp-content/uploads/2022/11/dark-patterns.pdf

[2] Ibid.

[3] ASCI’s Guidelines for Online Deceptive Design Patterns in Advertising, 2023, available at: https://www.ascionline.in/wp-content/uploads/2023/05/Guidelines-for-Online-Deceptive-Design-Patterns-in-Advertising.pdf

[4] Section 2(28) of the Consumer Protection Act, 2019.

[5] Guidelines for Prevention and Regulation of Dark Patterns, 2023, November 30, 2023, available at: https://consumeraffairs.nic.in/sites/default/files/file-uploads/latestnews/Draft%20Guidelines%20for%20Prevention%20and%20Regulation%20of%20Dark%20Patterns%202023.pdf

[6] Ibid.

[7] Ibid.

[8] Section 4(1)(a) of the Competition Act, 2002.

[9] Section 4(2)(a)(i) of the Competition Act, 2002.

[10] Section 4(2)(c) of the Competition Act, 2002.

[11] Section 4(2)(d) of the Competition Act, 2002.

[12] Jay Mayfield, FTC Takes Action Against Amazon for Enrolling Consumers in Amazon Prime Without Consent and Sabotaging Their Attempts to Cancel, June 21, 2023, available at: https://www.ftc.gov/news-events/news/press-releases/2023/06/ftc-takes-action-against-amazon-enrolling-consumers-amazon-prime-without-consent-sabotaging-their.

[13] Our Bureau, CCPA sends notice to Amazon for tricking customers into purchasing Prime memberships, December 15, 2023, available at: https://www.telegraphindia.com/business/central-consumer-protection-authority-sends-notice-to-amazon-for-tricking-customers-into-purchasing-prime-memberships/cid/1987021#goog_rewarded.

[14] Mr. Umar Javeed and Ors. v. Google LLC and Anr., CCI, Case No. 39 of 2018, available at: https://cci.gov.in/antitrust/orders/details/1070/0.

[15] Supra note 13.

