Designing Consent Under India's DPDP Act: Why UX Is Now A Legal Compliance Issue

Yugal Bhatt

20 Jan 2026 3:00 PM IST

    The digital economy in India can no longer rely on a 'checkbox' philosophy to obtain user consent. Corporations deliberately minimized friction to extract the maximum information from users. As users, we are accustomed to pop-ups dismissed before a page fully loads, pre-ticked boxes left unexamined to save time, and links to expertly drafted privacy policies full of Latin maxims. That era is meant to end with India's Digital Personal Data Protection Act (DPDP Act) of 2023 and the subsequent DPDP Rules of 2025. This article explains the Act at the intersection of law, product design, engineering, and regulatory enforcement. The DPDP Act transforms the legal checkbox into a product-level compliance system where user experience decisions directly determine legal validity.

    The constitutional foundation of the Act was established by the Supreme Court in Puttaswamy vs UOI: the “Data Principal” must know about, and retain meaningful control over, their digital footprint. The Act identifies the entity that determines the purpose and means of processing personal data as a “Data Fiduciary,” a term signifying trust and accountability.

    1. Consent Beyond the Click

    Section 5 of the DPDP Act mandates a notice for every consent: the request for consent must be accompanied or preceded by a notice. The notice must specifically contain three things: the personal data to be collected and the purpose for which it will be processed; the manner in which the Data Principal may withdraw consent or raise a grievance; and the manner in which a complaint may be made to the Data Protection Board.

    This means that platforms must now inform users at the point when data is collected rather than relying on a single privacy policy at sign-up. For instance, an application that requires microphone access cannot rely on a single generic “terms of service”.

    An app that collects and processes audio for a voice-to-text feature must present a purpose-specific notice stating that the audio will be used for converting speech into text, whether the audio will be stored or transmitted, and how consent can be withdrawn. Audio processing is lawful only if the user agrees to that specific request.
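
    To make this concrete, the following is a minimal sketch of how such a purpose-specific notice might be represented as a structured payload. The Act prescribes what a notice must contain, not its data format, so every field name and value here is an illustrative assumption:

        # Illustrative only: the DPDP Act prescribes a notice's contents,
        # not its data format. All field names and values are assumptions.
        MIC_NOTICE = {
            "notice_version": "1.0",      # hypothetical version identifier
            "purpose": "speech_to_text",  # one specific purpose per notice
            "data_collected": ["microphone_audio"],
            "storage": "transient; audio is discarded after transcription",
            "withdrawal": "Settings > Privacy > Voice Input > Revoke",
            "grievance_channel": "grievance@example.com",
        }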

    While Section 5 ensures that users are informed at the point of data collection, Section 6 goes a step further and asks whether the user's agreement was genuinely free, specific, informed, unconditional, and unambiguous.

    2. The End of Bundled Consent

    This rules out the previous practice of bundling the “essential” consent required to use the application with “optional” consent for marketing or tracking. Engineering teams can no longer build a single “yes or no” Boolean consent flag; systems must carry purpose-level consent flags. Practically, different uses of data must be kept separate. As a result, a user who refuses consent to marketing cannot have their data used for promotional purposes, even as the core service continues to operate.
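
    One way to realise purpose-level flags in code is a per-purpose state map rather than a single Boolean. The sketch below is a minimal Python illustration; the class and purpose names are hypothetical, not drawn from the Act or any particular product:

        from dataclasses import dataclass, field
        from datetime import datetime, timezone

        @dataclass
        class PurposeConsent:
            granted: bool
            recorded_at: datetime

        @dataclass
        class UserConsent:
            user_id: str
            purposes: dict[str, PurposeConsent] = field(default_factory=dict)

            def set(self, purpose: str, granted: bool) -> None:
                # One flag per purpose, each with its own timestamp.
                self.purposes[purpose] = PurposeConsent(
                    granted, datetime.now(timezone.utc)
                )

            def allows(self, purpose: str) -> bool:
                state = self.purposes.get(purpose)
                return state is not None and state.granted

        # Refusing marketing leaves the core service untouched:
        c = UserConsent("user-42")
        c.set("core_service", True)
        c.set("marketing", False)
        assert c.allows("core_service") and not c.allows("marketing")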

    “Free” consent also requires interfaces to avoid deceptive nudges and coercive UI design. Consider a consent banner with a large “Accept All” button as the primary call to action, while the “Reject” option is hidden behind a secondary link that opens multiple additional screens. This creates an asymmetric interaction cost: acceptance requires a single click while refusal demands several steps. Consent obtained through such an interface cannot be regarded as voluntary or valid.

    3. How Interface Design Became a Legal Issue

    The problems described above are not new; the European Union confronted them when it implemented the GDPR in 2018. In Planet49, the Court of Justice of the European Union examined a consent flow in which the interface pre-selected the option allowing tracking for advertising, so that refusal required additional effort. Users technically had the option to opt out, but the interface made rejection comparatively harder, and the court held that the consent was not freely given. The opt-out existed in theory, but the system design pushed users toward acceptance. If one path completes in a single click and the other requires multiple screens, the system is no longer merely recording user intent but actively shaping it.

    4. When Consent No Longer Matches Data Use

    Another systems failure is notice version mismatch, where consent is obtained under one version of a notice but processing follows a different reality. Suppose a user consents under notice version 2.2 for limited purposes. The product then evolves: a new analytics feature is added, or data is shared with a new category of processors provided for under notice version 3.0. If the consent log does not bind the user's consent to a specific notice version, purpose set, or notice hash, the system cannot reconstruct what the user actually agreed to at that point in time. From a legal standpoint, this defeats the “informed” element of consent: without versioned evidence, the Data Fiduciary cannot prove that the user consented to the expanded processing. Section 6(10) of the DPDP Act captures this failure; it places the burden on the Data Fiduciary to show that consent was validly obtained, and where the Fiduciary fails to prove the consent context, the law treats the consent as never having existed.
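
    A minimal way to bind consent to its notice context, as described above, is to store a hash of the exact notice the user saw alongside the consented purpose set, and to gate every new purpose on that record. The schema and helper below are assumptions for illustration, not anything prescribed by the Act:

        import hashlib
        import json
        from datetime import datetime, timezone

        def notice_hash(notice: dict) -> str:
            # Hash a canonical serialisation so any change to the notice
            # (new purpose, new processor category) yields a new hash.
            canonical = json.dumps(notice, sort_keys=True).encode("utf-8")
            return hashlib.sha256(canonical).hexdigest()

        NOTICE_V2_2 = {"version": "2.2", "purposes": ["speech_to_text"]}

        consent_record = {
            "user_id": "user-42",
            "notice_version": NOTICE_V2_2["version"],
            "notice_hash": notice_hash(NOTICE_V2_2),
            "purposes": set(NOTICE_V2_2["purposes"]),
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

        def consent_covers(record: dict, purpose: str) -> bool:
            return purpose in record["purposes"]

        # A v3.0 "analytics" purpose fails the check and needs fresh consent:
        assert consent_covers(consent_record, "speech_to_text")
        assert not consent_covers(consent_record, "analytics")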

    5. When Screen Design Becomes a Compliance Failure

    The decision of the CNIL, the French data protection regulator, against Google throws light on consent architecture. Google's consent flow did provide critical information such as the purposes of processing, the categories of data collected, and retention periods, but that information was dispersed across multiple screens and links, requiring users to navigate several layers to understand how their data would be used. Although consent was captured through a single action, material information could be reached only after multiple clicks, which effectively decoupled disclosure from the consent event. The CNIL treated this as a systems failure: the interface design delayed and fragmented the information. The regulator thus focused not merely on the content of Google's privacy policy but on the interaction flow, the screen sequencing, and the timing of disclosure.

    The CNIL decision thus shows that consent now involves not only what information is provided but how and when it is presented to the user.

    Compliance now requires that consent be implemented as a system, not a one-time event. The key points are listed below for reference, followed by a short sketch of how the last of them might be enforced in code.

    • Notice Version: Any change in purpose, data category, or sharing policy should generate a new version ID.
    • User-Notice Mapping: The database must record which version ID was displayed to which user, along with a timestamp (and language of delivery, where applicable).
    • Purpose Tagging: Every data field in the backend must be tagged with a purpose ID that corresponds to a specific item in the versioned notice.
    • Lifecycle Enforcement: When a user withdraws consent or a notice is no longer valid, the system must ensure that the data is no longer used or retained for that purpose.
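
    The following is a minimal sketch of such a lifecycle gate: every processing call passes through a purpose-level consent check, and a withdrawn or absent consent blocks the operation. All names here are hypothetical illustrations:

        class ConsentWithdrawn(Exception):
            pass

        def process_for_purpose(store: dict, user_id: str, purpose: str, handler):
            record = store.get((user_id, purpose))
            if record is None or not record["granted"]:
                # Lifecycle enforcement: no valid consent, no processing.
                raise ConsentWithdrawn(f"{purpose!r} not permitted for {user_id!r}")
            return handler()

        store = {("user-42", "marketing"): {"granted": False, "notice_version": "3.0"}}
        try:
            process_for_purpose(store, "user-42", "marketing", lambda: "send promo")
        except ConsentWithdrawn as exc:
            print("blocked:", exc)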

    6. When Consent Records Become Legal Evidence

    Product teams have traditionally treated user interaction logs as disposable information, used for analytics, debugging, or testing and discarded after a short retention period. The new Act challenges this: consent logs are now audit objects. Under Section 6(10), whenever the validity of consent is questioned, the burden of proving that it was validly obtained lies on the Data Fiduciary. Deleting consent logs is therefore no longer an internal housekeeping matter but the destruction of legal evidence.

    A defensible consent record must capture the full interaction: which notice version was shown, what purposes were disclosed, the language of the notice, and the user's action (click, toggle, checkbox). Standard operational logs might be disposed of after 30 or 90 days, but consent logs cannot follow the same cycle. Section 6(10) implicitly requires that consent records be retained for as long as the data is processed for the purposes shown in the notice. If personal data was collected in 2024 and is still being processed in 2028, the Fiduciary must be able to produce the 2024 consent logs as evidence.
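
    A sketch of the resulting retention rule follows, with deletion decided per log class rather than by a single expiry window; the 90-day figure echoes the text above, and everything else is an assumed illustration:

        from datetime import date

        OPERATIONAL_RETENTION_DAYS = 90  # illustrative window from the text

        def may_delete(log: dict, today: date, processing_active: bool) -> bool:
            age_days = (today - log["created"]).days
            if log["kind"] == "operational":
                return age_days > OPERATIONAL_RETENTION_DAYS
            if log["kind"] == "consent":
                # Section 6(10): the Fiduciary must still be able to prove
                # consent, so the record survives while processing continues.
                return not processing_active
            return False

        consent_log = {"kind": "consent", "created": date(2024, 3, 1)}
        print(may_delete(consent_log, date(2028, 6, 1), processing_active=True))   # False
        print(may_delete(consent_log, date(2028, 6, 1), processing_active=False))  # True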

    Design as the New Site of Compliance

    The Digital Personal Data Protection Act represents a fundamental redesign of the digital contract. In this new regime, the validity of a legal agreement is no longer hidden in the fine print but displayed through the clarity of the interface and the transparency of the design. Pre-ticked boxes, for example, are no longer valid; consent options must be off by default. Compliance in this framework is achieved by building systems and user interfaces that respect user choices in practice, not by drafting long privacy policies. Engineers must now treat system and interface design as a regulatory function, keeping information clear and consistent throughout the user's interaction.

    The Author Is An LL.M. Student At Hidayatullah National Law University (HNLU)

    Views Are Personal
