Sense and Sensitivity: ‘Sensitive’ Information Under India’s New Data Regime



Earlier this month, India’s Union Cabinet approved a revised version (the “2023 Draft”) of the Digital Personal Data Protection Bill (“DPDP”), which seeks to replace the country’s existing legal framework on data protection.

A previous draft (the “2022 Draft”) of DPDP was released by the Ministry of Electronics and Information Technology (“MeitY”) late last year for public comments. Following extensive feedback received on the 2022 Draft, the 2023 Draft of DPDP – a version of which was approved by a parliamentary standing committee in March – is now ready. However, unlike its predecessor, the 2023 Draft has not yet been made publicly available.

Nevertheless, it appears that the draft law is poised to be introduced, considered and/or passed during the Lok Sabha’s monsoon session – which is currently underway and remains scheduled to continue until August 11. Today’s media reports about Lok Sabha proceedings suggest that a parliamentary standing committee for communication and information technology adopted a draft report on DPDP just yesterday through a majority vote under Rule 261 of the Lok Sabha’s rules of procedure (despite strong dissent from the minority). Based on a provisional calendar released by the Lok Sabha Secretariat with respect to sittings and allotment of days, MeitY may answer questions about the 2023 Draft in Parliament on August 2 or 9 – if the government manages to proceed to that stage in the midst of other pressing matters.

Section 11 of DPDP

Under Section 11 of the currently available draft of DPDP, the central government (“CG”) is empowered to notify any ‘data fiduciary’ (or a class of data fiduciaries) as a ‘Significant Data Fiduciary’ (“SDF”). Such CG notification can be issued on the basis of the government’s own assessment of prescribed factors – which include the volume and sensitivity of the personal data processed, as well as the risk of harm to a ‘data principal’ (along with reasons of national/public interest, as well as additional factors considered necessary by the CG).

Pursuant to governmental classification under Section 11 of DPDP, special obligations may be imposed on such notified SDFs, over and above the general obligations that all data fiduciaries need to comply with under Section 9 of DPDP, as discussed in our previous note.

Key terms

A ‘data principal’ under DPDP is the individual to whom the personal data in question relates. The EU’s General Data Protection Regulation (“GDPR”) uses the term ‘data subject’ to convey the same idea.

On the other hand, a ‘data fiduciary’ under DPDP means any person who – either alone or in conjunction with other persons – determines the purpose and means of processing personal data. Individuals, companies, firms, associations of persons or bodies of individuals (whether or not incorporated), the state itself, and any artificial juristic person may be considered a data fiduciary under DPDP.

S&R Data+

The previous note of S&R Data+, a multipart series on digital data protection, had reviewed DPDP’s provisions related to SDFs, including with reference to existing law and past legislative proposals.

This note and the next will discuss the main factors prescribed under DPDP’s Section 11 (other than national/public interest) – viz., sensitivity, volume and harm – based on an assessment of which the CG may classify a data fiduciary as an SDF.

While this note focuses on sensitivity alone, the next will deal with volume and harm.



In general, while personal information obviously includes an individual’s personal details, such details need not always be considered ‘sensitive’.

Although informational ‘sensitivity’ is one of the prescribed evaluative parameters in respect of SDF notifications under DPDP, the only allusion to sensitivity that DPDP’s present-day draft makes is under Section 11 itself (which, in turn, spells out additional obligations of SDFs). Moreover, DPDP does not explicitly define or refer to sensitive personal data or information (“SPDI”) either. Nevertheless, the CG can notify any data fiduciary as an SDF on the basis of the former’s own assessment of sensitivity with respect to personal data processed by the latter.

In the absence of an express clarification within DPDP about what the term ‘sensitive’ connotes (or includes), the meaning of SPDI itself may be interpreted with reference to corresponding definitions under the existing Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (the “SPDI Rules”) and/or GDPR – or even by referring to past iterations of DPDP, such as the Personal Data Protection Bills of 2018 (“PDP 18”) and 2019 (“PDP 19”), respectively, as well as the Data Protection Bill of 2021 (“DP 21,” and together with PDP 18 and PDP 19, “Prior DPDP Iterations”).

SPDI Rules and Prior DPDP Iterations

Existing regime in India

IT Act

In general, India regulates the use of data (including SPDI) under the Information Technology Act, 2000, as amended by the Information Technology (Amendment) Act, 2008 (“the 2008 Amendment,” and collectively, the “IT Act”), along with the SPDI Rules and other rules framed under the IT Act.

Pursuant to the 2008 Amendment, Section 43A of the IT Act requires companies, firms, sole proprietorships, and other associations of individuals engaged in commercial or professional activities to maintain ‘reasonable security practices and procedures’ (“RSPPs”) if they possess, deal with or otherwise handle SPDI in a self-owned/controlled/operated computer resource. Explanations to Section 43A of the IT Act spell out what RSPPs and SPDI, respectively, entail.

SPDI Rules

In India, SPDI is expressly defined and dealt with under the SPDI Rules. In particular, such rules require entities that hold user-related SPDI to maintain certain security standards. These rules also prescribe the specific protocols necessary for storing personal information electronically, including in respect of SPDI.

According to Rule 3 of the SPDI Rules, for instance, SPDI comprises information relating to items such as passwords, financial information (e.g., details related to a bank account, credit/debit card, or other payment instrument), medical and health records, biometric data, etc.

Nevertheless, any information that is freely available or which can be accessed in the public domain, or is furnished under the Right to Information Act, 2005 (or pursuant to any other law in India), will not be considered SPDI – at least as far as the SPDI Rules are concerned.

Prior DPDP Iterations

Prior DPDP Iterations had defined SPDI along similar lines (as in the SPDI Rules) – albeit with minor modifications, such as by adding ‘genetic data’ to the list of informational categories (similar to what GDPR did – see below).

Further, somewhat like present-day DPDP’s Section 11, Prior DPDP Iterations had provided for the possibility that a data protection authority could later specify any other data category as SPDI. However, while Prior DPDP Iterations had empowered a Data Protection Authority of India (“DPA”) to notify SDFs, DPDP directly authorizes the CG to make this classification.

While DPDP has done away with the DPA altogether – replacing that authority with a ‘Data Protection Board of India’ (“DPBI”) instead – the CG may adopt a more subjective and discretion-based approach when specifying other kinds of information as SPDI than the DPBI (or the erstwhile DPA) might have been limited to.

For instance, compared to Prior DPDP Iterations, certain additional factors have been introduced in DPDP for making SDF evaluations, such as: (i) the potential impact on the sovereignty and integrity of India; (ii) the risk to electoral democracy; (iii) security of the state; and (iv) public order. Further, DPDP empowers the CG to consider “such other factors as it may consider necessary.” Accordingly, such broad parameters may also be invoked by the CG for the purpose of specifying large swathes of data categories as SPDI.

With respect to SPDI itself, Prior DPDP Iterations had sought to clarify the contours of the term. For instance, DP 21, which comprised an amalgam of revisions made to PDP 19 by a parliamentary joint committee, had defined SPDI as personal data that reveals, is related to, or constitutes: financial, health, biometric or genetic data – along with information relating to official identifiers, sex life and sexual orientation, transgender and intersex status, caste and tribe, religious and political belief/affiliation, as well as any other informational category which may be notified. Further, Clauses 3(35) and 3(36) of PDP 18 and PDP 19, respectively, had defined SPDI using the same formulation as in DP 21.

Importantly, Clause 15 of PDP 19 (similar to Clause 22 of PDP 18) had empowered the CG, in consultation with appropriate authorities and/or regulatory bodies, to notify certain categories of personal data as SPDI based on: (a) the risk of ‘significant’ harm that may be caused to a data principal by the processing of such data; (b) the expectation of confidentiality attached to that data category; (c) whether a significantly discernible class of data principals may suffer ‘significant harm’ from the processing of such data; and (d) the adequacy of protection afforded by ordinary provisions on personal data.

Financial data

Like GDPR, DPDP does not define ‘financial data’ – although each of the Prior DPDP Iterations did. Expanding upon the scope of Rule 3(ii) of the SPDI Rules, Clauses 3(19), 3(18) and 3(21) of PDP 18, PDP 19 and DP 21, respectively, had defined ‘financial data’ as any number or other personal information used to identify an account opened, or card/payment instrument issued, by a financial institution with respect to an individual; or any personal data regarding the relationship between a financial institution and an individual, including in respect of financial status and credit history. While other types of financial information – such as account statements, data relating to financial products, and investment information – were excluded from that definition, DPDP omits any reference to, or explanation of, the term altogether.

Genetic, biometric and health data

Each of genetic, biometric and health data was defined in Prior DPDP Iterations – such as in Clauses 3(20), 3(8), and 3(22), respectively, of PDP 18; and in Clauses 3(19), 3(7), and 3(21), respectively, of PDP 19. However, only biometric and health data have been referred to under present-day DPDP – and that too with respect to provisions on ‘deemed consent’ alone (see Section 8 of DPDP).

Genetic data

DPDP neither defines nor refers to ‘genetic data’. However, while defining personal data in Article 4, GDPR clarifies, among other things, that an individual may be considered directly or indirectly identified/identifiable with reference to one or more specific factors – which can include the genetic identity of a person. Although the GDPR’s definition of personal data stems from its predecessor – Article 2(a) of Directive 95/46/EC (the “Directive”) – GDPR has added the word ‘genetic’ to the Directive’s list of identifiers. In addition, GDPR separately defines ‘genetic data’ under Article 4(13) as information relating to the inherited or acquired genetic characteristics of a natural person which provides unique data about the physiology or health of that person – particularly when such information results from an analysis of their biological sample. Significantly, the Prior DPDP Iterations – including PDP 18 and PDP 19 – had reproduced the GDPR definition in its entirety.

Meanwhile, despite not explicitly referring to genetic data while defining SPDI, the SPDI Rules do include information related to the physical, physiological and mental condition of a person – as well as their medical records and history – within the scope of such definition.

Health data

As per Paragraph 2.2 of the July 2020 Strategy Overview of the National Digital Health Mission, health data can be classified into two categories: (i) personal health data – including the personally identifiable information (“PII”) of stakeholders such as healthcare professionals; and (ii) non-personal health data – including data that has been aggregated and anonymized, such as when all PII has been removed.

Further, under Section 8 of DPDP, a data principal will be ‘deemed’ to have consented to the processing of their personal data if such processing is necessary for responding to a medical emergency involving a threat to their (or someone else’s) life or health, or for taking measures to provide medical treatment or health services to an individual during a threat to public health. Such provisions appear to resemble Recitals 53 and 54 of the EU’s GDPR – which, in turn, deal with the processing of sensitive data in the public health and social sectors, respectively.

Recital 54 of GDPR indicates that the processing of data concerning health for reasons of public interest should not result in processing for other purposes by third parties – such as employers (or insurance and banking companies). However, under DPDP, if an individual shares their biometric data with an employer – say, for the purpose of marking attendance at the workplace – they will be deemed to have consented to the processing of such data, albeit for the purpose of attendance verification only (see Clause 8(7) of DPDP and its illustration).

In the past, Clauses 3(22) and 3(21) of PDP 18 and PDP 19, respectively, had defined health data as information related to the state of physical or mental health of a data principal, including (i) records in respect of past, present or future health states, (ii) data collected in the course of registering a person for health services, and (iii) data associating a person to some specific health service.

Biometric data

In general, biometric data may include biological properties, physiological characteristics, living traits, or repeatable actions where such features and/or actions are both unique and measurable, even if the patterns used to ‘measure’ them involve a certain degree of probability. Typical examples of biometric data are fingerprints and retinal patterns; less obvious examples include digitized versions of a handwritten signature or keystrokes.

A special feature of biometric data stems from the fact that it may serve as an identifier. Thus, on account of its unique link to a specific individual, biometric data may be used to specifically identify that person. Although the term ‘biometric data’ has been left undefined in DPDP, if such data is subsequently digitized pursuant to extraction and/or manipulation, the provisions of DPDP may come into play, including with respect to notice and consent requirements.

Further, Rule 2(1)(b) of the SPDI Rules, while including such term within its SPDI definition, defines ‘biometrics’ separately, as: the technologies that measure and analyze human body characteristics, such as fingerprints, eye retinas and irises, voice and facial patterns, hand measurements and DNA for authentication purposes.

Somewhat similarly, Article 4(14) of GDPR defines ‘biometric data’ as personal information resulting from specific technical processing related to the physical, physiological or behavioral characteristics of a natural person, which, in turn, allows or confirms the unique identification of that natural person, such as facial images or dactyloscopic data (i.e., fingerprint images, images of fingerprint latents, palm prints, palm print latents, and templates of such images (coded minutiae) when they are stored and dealt with in an automated database). Meanwhile, in the past, Clauses 3(8) and 3(7) of PDP 18 and PDP 19 had closely resembled the GDPR definition.

Passwords

DPDP, like PDP 19, does not define or refer to ‘passwords’ as a separate informational category. However, PDP 18, like the existing SPDI Rules, had included passwords as an SPDI category. Meanwhile, Rule 2(1)(h) of the SPDI Rules defines a ‘password’ to mean a secret word, phrase or code; or a passphrase or secret key; or encryption or decryption keys that one uses to gain admittance or access to information.


GDPR

Recital 51 of GDPR, for instance, explains that certain kinds of personal data, by their very nature, are particularly sensitive in relation to fundamental rights and freedoms. Accordingly, such categories of information merit specific protection, since the context of their processing could create significant risks to those rights and freedoms.

Accordingly, each of Recitals 51 to 54 of GDPR deals with the idea of SPDI. These provisions appear to stem from an understanding that certain types of data, on account of their heightened sensitivity, should be subject to a higher level of protection. Thus, GDPR’s Article 9 – which deals with the processing of special categories of personal data – prohibits such processing when it reveals certain types of information, similar to the categories later reflected in Prior DPDP Iterations.

In India, although “sensitivity of personal data processed” is one of the main evaluative parameters prescribed by DPDP for making SDF notifications under Section 11, the only allusion to ‘sensitivity’ that DPDP makes is under such Section 11 itself.

Accordingly, given DPDP’s silence in this regard, definitions and provisions related to SPDI – including in the SPDI Rules and Prior DPDP Iterations, respectively – may provide useful interpretive guidance. In addition, the EU’s GDPR – several provisions of which had influenced India’s Prior DPDP Iterations – may continue to guide data fiduciaries in India and elsewhere for the purpose of complying with DPDP’s provisions, especially with respect to unpacking the implications of ‘sensitivity’.

This insight has been authored by Deborshi Barat (Counsel); he can be reached at dbarat@snrlaw.in for any questions. This insight is intended only as a general discussion of issues and is not intended as a solicitation of work. It should not be regarded as legal advice, and no legal or business decision should be based on its content.
© 2023 S&R Associates