The 'Parental Consent' Problem With the Draft Rules of the DPDP Act

'The draft rules adopt a one-size-fits-all approach, not taking into account the hierarchy of risks across digital platforms' | Photo Credit: Getty Images

Personal data is central to digital economies, enabling access to a range of services while also creating vulnerabilities. Data protection laws have been seen as safeguards against misuse. Recent regulations, however, increasingly address the safety of children in digital spaces. India's Digital Personal Data Protection Act, 2023 (DPDP Act) acknowledges that children under the age of 18 require enhanced protection for their personal data, owing to their limited capacity to navigate digital risks. To this end, the law mandates platforms to obtain verifiable parental consent before collecting a minor's personal information. Though well-intentioned, this mandate raises concerns about unforeseen consequences, potentially impacting negatively the rights, privacy, and security of both minors and adults.

Approach to obtaining parental consent

Unlike global practices, which allow flexibility, Rule 10 of the draft rules under the DPDP Act outlines only two methods for obtaining parental consent on digital platforms. It states that if parents are existing users, platforms can use their previously collected information for verifiable parental consent. If not, consent must be obtained through a Digital Locker service or another government-authorised entity. In addition, platforms must exercise 'due diligence' to confirm that anyone claiming to be a parent is an identifiable adult, for legal compliance. However, the rules do not offer clear guidance on how this is to be done. The draft rules imply that service providers could be held liable if minors access their platforms without parental consent. This is reinforced by requirements to prevent children from accessing harmful content. One interpretation suggests these requirements may compel platforms to verify the age of all users.

If digital platforms struggle to implement parental consent with certainty, they may resort to aggressive data collection to avoid liability. For instance, if a child's preferred platform is different from the one the parent uses, the platform may require the parent to register first in order to comply with the regulations. This creates a situation where a parent may be forced to join platforms they do not even wish to use, leading to unnecessary data collection. Such measures conflict with the principles of data minimisation and purpose limitation. If parents choose not to register, they may have to provide consent through mechanisms such as DigiLocker. This could exclude users who are unable or unwilling to provide such identification due to privacy concerns. Despite good intentions, the prescribed approach may negatively impact access to digital services such as communication, entertainment, art, and gaming for the very individuals it aims to protect, and put parents' sensitive data at risk of unnecessary disclosure.

The draft rules adopt a one-size-fits-all approach, not taking into account the hierarchy of risks across digital platforms. For example, high-risk platforms may allow anonymous interaction that may expose minors to threats such as harassment, while engagement-driven services can push minors towards excessive screen time. In contrast, others may already incorporate safeguards. However, the draft rules require all service providers to obtain parental consent through the prescribed modes, which involve collecting sensitive data or relying on government-authorised identification. This broad requirement complicates compliance and raises costs, including application programming interface (API) integration, staff training, data collection, storage, and verification. Subsequently, these costs would shift to consumers, making services less accessible to minors.

A better way out

A more helpful approach would be to avoid specifying particular methods, data sources, or technical solutions, so as to ensure efficiency and privacy protection. This would enable innovation and allow service providers to implement parental consent, such as by using third-party services, in a way that suits their context. This should be guided by three key factors, namely, the nature of the use case; the level of associated risks; and implementation challenges such as cost, scalability, convenience, and privacy. For lower-risk activities, self-declaration may be appropriate. In contrast, high-risk activities, including access to age-restricted content such as alcohol, online dating, and financial transactions, necessitate more robust verification, such as government-issued documents.

Have an independent assessment body

Regulators should also use complementary strategies such as privacy- and safety-by-design principles, which can make the design of technologies safer for minors. This includes developing age-appropriate standards, disabling intrusive or risky design features such as location tracking and private messaging, and ensuring that high privacy settings are the default. Service providers should be mandated to protect the youngest users within their stated age range by maintaining appropriate safety standards. To centre children's best interests, an independent assessment body should be established to assess and mitigate risks associated with the data collection and processing of minors within a specified time frame. Clear rules on data use, transparent oversight, and strong enforcement are essential for accountability.

We need a parental consent framework that balances a platform's needs, transparent mechanisms for trust, and incentives for innovation in child-friendly digital spaces.

Asheef Iqubbal is a Technology Policy Researcher at CUTS International
