Today, Arindam Goswami writes on the recent CCI order against WhatsApp-Meta's data sharing practices. Lokendra Sharma follows in this week's curated section with a piece on the rise of Bitcoin and the broader cryptocurrency landscape.
Technology has become important not just in our everyday lives; it has also become an arena for contestation among major powers, including India. The Takshashila Institution has designed the 'Technopolitik: A Technology Geopolitics Survey' to understand and assess what people think about how India should navigate high-tech geopolitics. We are sure you are going to love the questions! Please take this 5-minute survey at the following link: https://bit.ly/technopolitik_survey
Technopolitik: The End of Unchecked Data Collection?
— Arindam Goswami
The recent Competition Commission of India (CCI) order barring WhatsApp from sharing user data with Meta represents a watershed moment: not just a local regulatory action, but a significant statement about user rights, digital privacy and technological monopolies in the digital ecosystem.
The core question is: how much real control do users have over their personal data when using seemingly "free" digital platforms? The WhatsApp-Meta case sheds light on the no-longer-hidden economic model in which user data is the primary commodity. By preventing automatic data sharing, the order challenges the tech giant's assumption that user consent can be presumed through complex, often unreadable terms of service.
Indian jurisprudence, particularly the landmark K.S. Puttaswamy judgment of 2017, established privacy as a fundamental right under the Constitution. International precedents such as the European Union's General Data Protection Regulation (GDPR) and similar rulings in the United States have consistently emphasised informed, explicit consent. The monopolistic practices of Big Tech, however, extend beyond mere consent and data collection to the wider problem of ecosystem lock-ins, where users are effectively compelled to accept invasive data practices to maintain social connectivity. Market share capture and the effective monopoly of Big Tech create a coercive environment in which genuine consent becomes a legal fiction. The economic model underlying these practices ruthlessly prioritises data aggregation and targeted advertising. The CCI order disrupts this seamless data flow and has forced a reconsideration of how user information is treated.
Crucially, for policymakers and tech companies, the message is clear: The era of unconstrained data collection is ending. Principles of transparency, user agency, and genuine consent are paramount. However, what frameworks could assist policymakers and judicial officers in thinking about these problems?
Principles to guide action
There is a clear "consent architecture breakdown": policies such as WhatsApp's create a "digital adhesion contract", a binary choice between accepting the entire terms or losing access to the platform. This cannot be permitted, especially when the platform is big enough to have become, by virtue of its monopoly, essentially a digital public square. Legal mechanisms should therefore assess whether multi-tier consent and granular opt-in/opt-out mechanisms exist, and whether users are given real-time data usage tracking, ownership and management controls.
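To make the idea of granular, multi-tier consent a little more concrete, here is a minimal sketch in Python of a per-purpose consent record with a default-deny rule. The purpose names and fields are purely illustrative assumptions, not a description of how WhatsApp or any regulator actually structures consent.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical purposes a platform might request consent for; these names
# are illustrative, not drawn from any platform's actual terms of service.
PURPOSES = ("service_delivery", "advertising", "cross_platform_sharing", "analytics")

@dataclass
class ConsentRecord:
    """One user's granular, per-purpose consent state."""
    user_id: str
    grants: dict = field(default_factory=dict)  # purpose -> (granted?, timestamp)

    def set(self, purpose: str, granted: bool) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.grants[purpose] = (granted, datetime.now(timezone.utc))

    def allows(self, purpose: str) -> bool:
        # Default-deny: absent an explicit opt-in, the purpose is not allowed.
        granted, _ = self.grants.get(purpose, (False, None))
        return granted

# Usage: a user opts in to service delivery but not to cross-platform sharing.
record = ConsentRecord(user_id="user-123")
record.set("service_delivery", True)
record.set("cross_platform_sharing", False)
assert record.allows("service_delivery")
assert not record.allows("cross_platform_sharing")
assert not record.allows("advertising")  # never asked, so denied by default
```

The design choice worth noting is the default-deny rule: a purpose the user was never asked about is treated as not consented to, which is the opposite of the bundled, all-or-nothing contracts described above.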
Cross-platform data aggregation can also become an anti-competitive practice in effect, because it creates an asymmetric information advantage for Big Tech; it can amount to an abuse of a dominant market position under Section 4 of the Competition Act, 2002. Moreover, data collection must be proportional: data collected cannot exceed legitimate business requirements. The proportionality principle enunciated in the Puttaswamy judgement makes this clear.
Mechanisms for action
Next, even if there is user consent, what happens after the data is collected by these platforms? Do users' rights end there? Can we have cross-platform data integration metrics that quantitatively assess how companies like Meta aggregate user data across multiple platforms? Standardised methods need to be used to evaluate the economic and privacy implications of data consolidation, and regulators need a robust tool for intervention. The algorithms used have to be explainable and verifiable by both regulators and users, and mandatory privacy impact assessments need to be enforced by law. Once we are able to assess the direct economic value of data, we can look at compensation frameworks. This reimagines the economic relationship between digital platforms and users.
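As a rough sketch of what such a cross-platform integration metric might look like, the snippet below measures how much of a merged user profile could not have been assembled by any single platform on its own. The attribute names and the formula are illustrative assumptions, not an established regulatory metric.

```python
# A hypothetical cross-platform data integration score: the share of a user's
# combined profile that exists only because data from multiple platforms is
# merged. Attribute names and formula are illustrative, not prescriptive.

def integration_score(platform_attributes: dict[str, set[str]]) -> float:
    """Fraction of the merged profile that no single platform could assemble
    alone (0.0 = no gain from merging; values near 1.0 = most of the profile
    exists only through aggregation)."""
    combined = set().union(*platform_attributes.values())
    if not combined:
        return 0.0
    largest_single = max(len(attrs) for attrs in platform_attributes.values())
    return 1.0 - largest_single / len(combined)

# Usage with made-up attribute sets for two platforms:
profile = {
    "messaging_app": {"phone_number", "contacts_graph", "usage_times"},
    "social_network": {"phone_number", "interests", "ad_interactions", "location_history"},
}
print(f"integration score: {integration_score(profile):.2f}")  # 0.33 here
```

A regulator using anything like this would of course need audited access to what each platform actually collects; the point of the sketch is only that aggregation can, in principle, be expressed as a number rather than a narrative.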
From these points emerges a need for technological governance mechanisms that emphasise algorithmic explainability and the creation of a comprehensive digital rights infrastructure. Standardised protocols for data portability, user consent and user control, created in a participative and consultative manner, could help develop a systemic approach to digital privacy protection.
Trade-offs and frameworks
Having said this, there is indeed a need to balance the right to privacy against the need for space to innovate and against the government's capacity to enforce the law. There have to be frameworks to assess this before framing laws and while adjudicating on cases. The courts, for example, need to evaluate cases along the dimensions of user harm (economic harm, privacy disruption, and so on), the scope of data integration, the consent mechanisms available, and anti-competitive potential.
Lawmakers, apart from the above points, need to consider the trade-offs between regulatory complexity and user impact. There are nuances to be dealt with here.

For low-complexity, low-impact use cases, regulators might implement light-touch monitoring or minimal interventions. Non-sensitive metadata collection, or standard user tracking that does not involve personally identifiable information, would fall here.

Low-complexity, high-impact scenarios are often hidden, systemic issues. An example might be subtle cross-platform data linking that creates comprehensive user profiles without explicit user knowledge. Interventions here require rapid, targeted regulatory responses that address potential large-scale privacy breaches.

High-complexity, low-impact scenarios are technically intricate privacy challenges with limited immediate user consequences. Advanced data anonymisation techniques, or complex data routing mechanisms that do not directly compromise user privacy but require careful regulatory understanding, would fall under this category. Regulation here should enforce explainability and traceability mandates, in case these practices do end up causing some user harm.

The most critical case is the high-complexity, high-impact scenario: sophisticated, large-scale data practices posing substantial risks to user privacy and market fairness. The WhatsApp-Meta data sharing case is the best example of this. Interventions here require comprehensive, multi-dimensional approaches involving legal, technological and economic strategies. The law in such cases can become convoluted and esoteric, and loopholes could emerge, which is why this quadrant demands the most attention.
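As a toy illustration of the complexity-impact triage described above, the snippet below maps each of the four quadrants to the regulatory posture suggested in the text. The labels and responses are simplifications of the discussion, not a prescriptive rulebook.

```python
# Map (complexity, impact) to a suggested regulatory posture. The "high"/"low"
# labels and the wording of the responses paraphrase the discussion above and
# are illustrative only.

RESPONSES = {
    ("low", "low"):   "light-touch monitoring, minimal intervention",
    ("low", "high"):  "rapid, targeted response to systemic privacy risk",
    ("high", "low"):  "explainability and traceability mandates",
    ("high", "high"): "comprehensive legal, technological and economic intervention",
}

def triage(complexity: str, impact: str) -> str:
    """Return the suggested posture for a case's (complexity, impact) pair."""
    return RESPONSES[(complexity, impact)]

# The WhatsApp-Meta data sharing case, as argued above, sits in the
# high-complexity, high-impact quadrant:
print(triage("high", "high"))
```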
There are other axes too along which governments need to analyse data-privacy cases. One is data monetisation versus consent transparency: evaluating the economic exploitation of user data against the transparency of consent mechanisms creates a nuanced framework for understanding digital privacy dynamics. Cases with high transparency are not worrisome; what is problematic is low transparency.

Within the low-transparency scenario, there are low-monetisation use cases. These typically represent smaller platforms or services with limited data economic value, often emerging technologies or niche services that lack sophisticated data management capabilities. They may not pose significant privacy risks because they collect little data, but because they fail to provide meaningful user control, they could spiral into big problems. In such scenarios, a time-based, gradual regulatory enforcement approach could be employed.
The most problematic scenario, exemplified by platforms like Meta-WhatsApp, is the high-monetisation, low-transparency case: high-value data ecosystems with opaque, complex consent mechanisms that effectively obscure true data usage. Presenting users with all-or-nothing consent options, creating an illusion of choice while facilitating extensive data monetisation, makes such cases fit for aggressive regulation. Self-regulation or gradual regulatory escalation cannot be relied on here; the onus of proactive compliance lies with the companies. Even though such regulation adds to the operational expenses of the entities involved, there is no way around it, given the stakes.
This is, of course, not an exhaustive list of frameworks. There is a need to develop more such frameworks and models, both qualitative and quantitative, to help policymakers. There is no point in flying blind without guiding mechanisms; the costs of ineffective policymaking in this arena could be enormous.
The Many Lives of Bitcoin
— Lokendra Sharma
On 4 December 2024, Bitcoin crossed USD 100,000. The cryptocurrency has come a long way in the nearly 16 years of its existence: the Bitcoin whitepaper was released in 2008 under the name Satoshi Nakamoto, a pseudonym whose real identity remains unknown to this day, and the currency itself was launched in 2009.
Bitcoin was valued at a fraction of a dollar for its first two years. Fast forward nearly 16 years after launch, and the digital currency is breaking all records on a bullish run. From June to October this year, Bitcoin hovered below USD 70,000. But when Donald Trump clinched the presidential election in the US in early November, Bitcoin prices started climbing rapidly, reaching the historic high of USD 100,000 within roughly a month. This is not the first time Bitcoin has given investors extraordinary returns in the short term. From a low of around USD 10,000 in mid-2020, Bitcoin reached a high of around USD 60,000 in March 2021, only to rapidly decline by half to USD 30,000 by mid-2021, and this pattern of extreme volatility continued in the following years. Since January 2023, when the price was below USD 20,000, there has, however, been a general upward trend.
The success and promise of Bitcoin have inspired the launch of a number of cryptocurrencies, especially over the last decade. Examples of widely known ones include Ethereum, Tether, USD Coin, Binance Coin and Ripple. But what are cryptocurrencies? How do they function on the blockchain? Why do they display such extreme volatility? The following 2020 paper by Hashemi Joo et al. provides answers to these questions.
Hashemi Joo, M., Nishikawa, Y. and Dandapani, K. (2020), "Cryptocurrency, a successful application of blockchain technology", Managerial Finance, Vol. 46 No. 6, pp. 715-733. https://doi.org/10.1108/MF-09-2018-0451
For Hashemi Joo et al., a cryptocurrency (such as Bitcoin) ‘is a worldwide digital payment system that performs its functions online.’ But it is not just another digital currency. It is a ‘decentralized peer-to-peer digital asset backed by strong computer cryptography that can be used as a medium of exchange.’ Cryptocurrencies emerged as an alternative to currencies issued and managed by a central authority (usually a state’s central bank). In that sense, cryptocurrencies are decentralised and work in a peer-to-peer fashion using blockchain technology.
Cryptocurrencies are different from Central Bank Digital Currencies (CBDCs). The latter are simply digital versions of fiat currency issued by a central bank, running on a private blockchain rather than the public blockchain that cryptocurrencies use. CBDCs are also different from the money we transact through commercial banks, as CBDCs are directly guaranteed by a state’s central bank.
Stablecoins are a kind of cryptocurrency that bring some convergence between decentralised currencies and centralised ones. The reason the ‘cryptocurrency market has become more volatile than the stock market or any other commodity markets’ is that cryptocurrencies like Bitcoin are not backed or regulated by a central authority or government. There is no fundamental economic factor dictating the price of a cryptocurrency; it is just demand, supply and speculation. Stablecoins attempt to address this to some extent by pegging the cryptocurrency to reserve assets like the US dollar. But as Hashemi Joo et al. flag, this does not mean that stablecoins do not experience volatility.
Blockchain is the underlying technology that provides mathematical trust in a cryptocurrency in the absence of a central authority guaranteeing that trust. A blockchain is essentially a distributed public ledger that records transactions in blocks connected in chronological fashion (hence ‘blockchain’), with a focus on verifiability, transparency and immutability.
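To make the idea of chained blocks concrete, here is a toy Python sketch in which each block stores the SHA-256 hash of its predecessor, so tampering with any past block breaks verification. It illustrates only the hash-linking that underpins immutability and verifiability, leaving out the mining, consensus and peer-to-peer networking that real blockchains such as Bitcoin's rely on.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    transactions: list      # simplified: a list of plain strings
    previous_hash: str

    def hash(self) -> str:
        # Hash the block's contents; changing any past block changes this
        # value and breaks the link from every later block.
        payload = json.dumps(
            {"index": self.index, "transactions": self.transactions,
             "previous_hash": self.previous_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(batches: list[list[str]]) -> list[Block]:
    chain = [Block(0, ["genesis"], "0" * 64)]
    for batch in batches:
        chain.append(Block(len(chain), batch, chain[-1].hash()))
    return chain

def verify(chain: list[Block]) -> bool:
    # Valid only if every block still points at the current hash of its predecessor.
    return all(chain[i].previous_hash == chain[i - 1].hash()
               for i in range(1, len(chain)))

chain = build_chain([["alice pays bob 1"], ["bob pays carol 2"]])
print(verify(chain))                                # True
chain[1].transactions[0] = "alice pays bob 100"     # tamper with history
print(verify(chain))                                # False: tampering detected
```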
What We're Reading (or Listening to)
[Podcast] X Marks the Spot: Analysing Deepfake Regulation in the Musk Era, By Rohan Pai and Leah Govias
[Opinion] Deepening India’s steps as a key space-faring nation, By Ashwin Prasad
[Opinion] AI for All - Can India Bridge the LLM Gap? By Adya Madhavan