It's clear that we need to do a lot more to protect children online, and the only way that will happen is through legislation; the recent congressional hearing made that abundantly clear. Diving deep into the nuances of the DPDP Act, we welcome its intent, but we see some real red flags when it comes to definitions and implementation.
Our top three points to revisit and dive deeper into are:
- “Detrimental Effect” - It’s a term that’s as broad as the horizon, yet as undefined as the edge of the universe. The Act, in its current form, leaves this term open to interpretation, creating a fog of uncertainty that could stifle innovation or allow non-compliance to slip through the cracks.
Data fiduciaries, in the absence of clear guidance, are left to navigate this fog on their own. They could end up being overly cautious, restricting data processing to the detriment of progress. Or, they could exploit the ambiguity to sidestep compliance.
To clear this fog, we need our policymakers to step up. They need to provide a compass: a set of determining factors that define what “detrimental effect” means, and boundaries that prevent data fiduciaries from arbitrarily classifying activities. This isn’t just about the here and now; it’s about paving the way for the future. After all, we’re in the business of innovation, not stagnation. Let’s make sure our policies reflect that.
- Age and Consent Verification - It’s a puzzle that the Act, in its current form, doesn’t quite solve. Websites, whether they’re delivering news or entertainment, are tasked with this responsibility even if they’re not directly collecting data. But in the digital world, data collection isn’t always direct; it’s often done subtly, through methods like cookies.
Then there’s the issue of age restrictions. Many platforms, especially social media, set an age limit, usually between 13 and 16. But let’s face it, kids often find a way around these restrictions, raising questions about the effectiveness of age verification.
The digital realm is a complex one, and verifying age and consent isn’t a straightforward task. It’s even more challenging for platforms that cater to a wide age range. Sure, identity verification tools like Aadhaar could be a solution, but they come with their own set of problems. They could lead to over-collection of data, including sensitive information like birthdate and address, violating the principle of data minimization. This could put children at greater risk instead of protecting them.
We need to rethink this. We need to innovate, to find a solution that balances the need for verification with the principle of data minimization. After all, we’re not just creating technology for today; we’re shaping the digital landscape of tomorrow. Let’s make sure it’s a safe and inclusive one.
- Children’s Agency and Right to Privacy - It’s a delicate balance. On one hand, we have regulations designed to protect children, which is undoubtedly important. On the other hand, those same regulations could end up compromising children’s privacy and excluding them from decision-making processes.
We need to remember that children aren’t all the same. They have different levels of maturity and technological competency. A child nearing eighteen, for instance, often has a strong sense of self and a desire to protect their own privacy. They want to exercise their freedom of speech, to have their voice heard.
By standardizing consent requirements across all age groups, we risk infringing on this freedom and creating an environment that excludes rather than includes, a concern that empirical studies support.
While we understand the complexities of the law, we at Social & Media Matters believe that in the digital age we cannot separate the internet from children. The minimum age to join most social media platforms is 13, while the bill sets the threshold at 18; children between these ages cannot be put under one bracket. Thus, rather than blanket parental consent, we propose due diligence: more action-oriented programs, such as open conversations between parents and children, and more engagement initiatives to bridge the digital divide. We need to innovate, to find a solution that respects the individuality of each child. After all, we’re not just creating technology for today; we’re shaping the digital landscape of tomorrow. Let’s make sure it’s a landscape where every voice can be heard, one that respects the right to privacy, one that empowers rather than excludes. Because at the end of the day, that’s what innovation is all about.