Impact on Data Privacy in India through the Monetization of Attention on Tech Platforms

About the author: Nandini is a passionate young woman who finds an interesting balance in academia between Mathematics, Economics, Literature and Public Policy. Currently, she is enrolled in the 11th grade at DPS International Edge Gurgaon. In the future, she hopes to pursue Economics, specializing in microeconomics and finance.

Abstract

This research paper explores the intricate relationship between data privacy and the monetization of user attention by tech platforms in India. With the rapid proliferation of digital technologies and social media among Indians, tech companies have increasingly monetized user engagement through targeted advertising, raising significant concerns about data privacy. This study investigates the regulatory frameworks, industry practices and user awareness in India to assess the implications of these monetization strategies for individual privacy rights. It explores potential policy interventions to mitigate privacy risks while balancing the economic benefits of digital platforms, contributing to the ongoing discourse on safeguarding personal data in an era where attention has become a valuable commodity.

Introduction 

For most of mankind’s economic history, the greatest scarcity consumers faced was productive, arable land: land to farm, land for real estate and land for the production of goods. It was the most coveted asset for all consumers. However, since the industrial revolution (and even more so the digital and AI revolutions), consumers face a different scarcity: knowledge (Baecker et al., 2020). Now that standards of living have risen and people have grown richer, an important question arises: what should one do with monetary wealth?

This is where marketing and the attention economy come into play. Departing from its original purpose of helping consumers allocate their resources, advertising has become a way of exaggerating needs and overselling average products with false promises. Amid so much product differentiation and online marketing, consumers struggle against information asymmetry to find the best toothpaste, laptop, pen or house on the market (Croxson et al., 2022). Experts have recently criticized advertising for nurturing a culture of perpetual distraction and a world of passive consumers who mindlessly scroll on their phones for hours on end. It is how many digital platforms make their billions.

For almost every social media platform, the main way to hold onto consumers and increase the marginal utility of every scroll and swipe is personalization. Viewing history is tracked to serve tailored content, and the technology has been engineered to make individuals addicted to that content: not to serve the best interests of individuals, or even to keep society entertained, but to increase the inventory of advertising space a company can sell. Whether it is the ads on Instagram or the “For You” page on TikTok, it all boils down to the same thing: an economy now thrives on capturing individuals’ attention based on what the platforms know they already like (Wiener, Saunders, & Marabelli, 2020). In the attention economy, where information is abundant and easily accessible, capturing and retaining users’ attention has become a valuable commodity for firms. As digital media consumption has grown, attention spans have shortened. Attention itself has thus become a scarce resource, prompting ever more aggressive forms of internet marketing as firms compete to hold and monetize it.

This research article aims to discuss the intricacies of how these platforms gather, store, and leverage user data for purposes such as targeted advertising and content personalization in tandem with user awareness – or lack thereof – and consent mechanisms. It is important to gauge the level of awareness among users as well as the security measures implemented by digital platforms and the policies implemented by different governments to attain privacy and anonymity in the digital world.

The Problem

Advertising is the primary source of revenue for most tech platforms. Advertisers pay for the opportunity to capture users’ attention and promote their products or services. The more attention a platform can attract, the more valuable it becomes to advertisers, leading to increased advertising revenue. Some platforms also adopt subscription-based models where users pay for premium content or services. But the most valuable asset firms can acquire is attention data: user preferences, behaviors and engagement patterns used to deliver targeted advertising (Baragde et al., 2017). There have been many scandals at reputed companies like Facebook, Amazon and Google, which have been sued on multiple occasions for tracking what people view and where they browse online.

Initially, traditional models relied on transactional exchanges, emphasizing the sale of goods or services. However, contemporary digital platforms have transitioned to attention-centric models, where user engagement is the primary currency (Hanafizadeh et al., 2021). Transactional models derived revenue from direct sales at brick-and-mortar stores or local monopolies that divided society according to economic roles. This division of economic roles has since blurred, and consumer engagement, measured in likes and views, now matters more than direct payment in cash (Jabłoński & Jabłoński, 2020).

The attention-centric monetization model used by tech platforms raises several significant policy problems, especially around data privacy, ethics and social impact. Alongside the spread of sensational news, the drive to make social media personalized and addictive has led to illegal sharing of data that breaches consumer privacy. In May 2023, the Irish Data Protection Commission fined Meta Platforms Ireland Limited (Meta IE) 1.2 billion euros following the European Data Protection Board’s (EDPB) binding dispute resolution decision of 13 April 2023 (Najjar & Kettinger, 2013). The fine, the largest in GDPR history, stems from an inquiry into Meta’s Facebook service and its systematic, repetitive and continuous transfers of massive volumes of personal data to the U.S. using standard contractual clauses since 16 July 2020. There are innumerable such cases, all driven by the same goal: increasing the time individuals spend on these apps (Ahlemeyer-Stubbe et al., 2018).

As with any business model, a successful business needs a good or service that people keep purchasing, and social media is exactly that. Technology conglomerates are among the most financially potent entities globally, and their resources have granted them unprecedented influence over the operations of both national and global economies. Endowed with extensive datasets and artificial intelligence, these corporations can identify and neutralize emerging competitive challenges, solidifying their dominance and impeding innovation and individual welfare (Lange, Drews, & Höft, 2021). They also wield substantial authority over civic involvement and political dialogue, as exemplified by Elon Musk’s recent acquisition of Twitter. By making people slaves to their phones, increased screen time on social media platforms can trigger impulse-control problems, as constant alerts and notifications impair concentration and focus, disturb sleep and more.

The lopsided negotiating leverage of digital platforms often results in inadequate compensation for content developers, necessitating governmental intervention. This was underscored by the Australia-Facebook dispute over equitable payment for content creators and influencers (Naimi & Westreich, 2014). The erosion of individuals’ control over their personal data has profound implications for the human psyche, contributing to a pervasive sense of information overload. Current market valuations of data create a competitive race to capture individuals’ attention at the lowest conceivable cost. This approach compromises user experience and prioritizes prolonged platform usage over individual well-being. In pursuit of profit maximization, algorithms are engineered to enhance engagement by amplifying the virality of content, often promoting highly provocative, contentious or polarizing material to drive user interactions (Parvinen et al., 2020). This heightened exposure to misleading or false information poses a significant threat to informed decision-making, fostering addiction and desensitization toward marginalized communities.

Problems also arise from the algorithmic amplification of sensationalized or polarizing content, which skews user attention toward certain themes and impairs the discoverability of diverse and valuable information. This in turn links back to inefficiency in firms, with economic ramifications for consumer choices and market trends (Vila Seoane, 2021). The algorithms that drive attention-centric monetization are often proprietary and opaque, making it difficult for users, regulators, and even the platforms themselves to fully understand how content is prioritized or how ads are targeted. This lack of transparency complicates efforts to regulate these practices effectively.

Privacy infringements compromise individuals’ sensitive information and erode trust in online platforms. Users increasingly find themselves unknowingly becoming targets of data exploitation as companies prioritize maximizing time spent on their apps over safeguarding user privacy (Shukla, Bisht, Tiwari & Bashir, 2023). The alarming frequency of such cases underscores the urgent need for robust regulations and vigilant oversight to protect individuals from unwarranted data exploitation and preserve the fundamental right to privacy in the digital age.

One of the primary policy issues is the adequacy of user consent mechanisms. Many users are unaware of the extent to which their data is collected, processed and used to target them with advertisements. Terms of service and privacy policies are often lengthy, complex and difficult to understand, raising questions about whether the consent of individuals signing onto these agreements can truly be considered informed (Van’t Spijker, 2014). Tech platforms also engage in extensive data harvesting to build detailed user profiles. The collection of vast amounts of personal data, often without explicit consent, raises concerns about privacy violations and the potential misuse of data.

Policy Recommendations

The consistent use of digital platforms can harm individual wellbeing. Policy interventions are crucial to mitigate this potential harm, ensuring a balance between technological innovation and the safeguarding of mental health. Striking this balance requires a thoughtful approach that addresses the impact of digital platforms on individuals’ well-being within broader societal frameworks (Ofulue & Benyoucef, 2024).

India’s policy framework around data privacy and attention-centric advertising on tech platforms is evolving rapidly, driven by the increasing digitalization of its economy and growing concerns over user privacy. One class of solutions to data privacy threats is known as Privacy-Enhancing Technologies (PETs): tools and techniques, such as encryption, anonymization and access controls, that limit access to personal data and control its use. At present, PETs are applied mainly to sensitive data, for example to protect patient records in healthcare in the Netherlands (overseen by the Dutch Data Protection Authority, AP), or through user consent requirements before third-party tools may track user behavior for advertising purposes in Germany and France (the French Data Protection Authority, CNIL). Newer PETs, such as AI-generated synthetic data, offer a versatile privacy solution: they enable anonymization by creating statistically similar but artificial datasets, supporting processes like data augmentation and representative imputation while preventing re-identification (which makes them unsuitable only for cases that require such identification). These techniques could equally be adopted by digital platforms to ensure digital privacy.
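The basic PET techniques mentioned above can be sketched in a few lines. The snippet below is a minimal illustration, not a production design: it uses a salted hash to pseudonymize a direct identifier and coarsens an age into a range (a simple generalization of a quasi-identifier). The salt value, field names and sample record are all hypothetical assumptions for the example.

```python
import hashlib

# Hypothetical salt for the example; a real deployment would keep this secret.
SALT = "example-salt"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a salted hash (pseudonymization)."""
    return hashlib.sha256((SALT + user_id).encode("utf-8")).hexdigest()[:16]

def generalize_age(age: int) -> str:
    """Coarsen an exact age into a decade range to hinder re-identification."""
    lo = (age // 10) * 10
    return f"{lo}-{lo + 9}"

# Illustrative record with a direct identifier and a quasi-identifier.
record = {"user_id": "user123", "age": 34, "city": "Gurgaon"}
anonymized = {
    "user_id": pseudonymize(record["user_id"]),
    "age": generalize_age(record["age"]),
    "city": record["city"],
}
print(anonymized)  # the exact identity is no longer directly recoverable
```

Note that salted hashing alone is reversible by whoever holds the salt, which is why regulators treat pseudonymized data as still personal; synthetic data and stronger anonymization go further by severing the link to the original record entirely.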

The EU offers another example of public policy designed to ensure data privacy: the Digital Services Act (DSA) and the Digital Markets Act (DMA). The DMA establishes rules for large online platforms and aims to create a level playing field, ensuring fair market practices and ushering in new standards for digital commerce (Jabłoński & Jabłoński, 2020). The DSA, a key component of the Digital Services Package, focuses on transparency, user safety and accountability for online platforms, setting standards that ensure a safer and fairer digital space across the European Union.

The Personal Data Protection Bill (PDPB), 2019, which has since evolved into the Digital Personal Data Protection Act, 2023, represents India’s most comprehensive attempt to regulate data privacy. The Act outlines principles for data processing, including the lawful, fair and transparent handling of personal data (Shukla, George, Tiwari & Kureethara, 2022). It mandates that data collection be purpose-specific and that data subjects have the right to access, correct or erase their data.

The Act also introduces the concept of data fiduciaries, requiring companies that collect and process data to be accountable for protecting user privacy, and it imposes significant penalties for data breaches and non-compliance. Under the Digital Personal Data Protection (DPDP) Act, businesses in India acting as data fiduciaries may face penalties of up to INR 250 crore for each occurrence of a data breach; for major breaches, the Act stipulates a maximum penalty of INR 500 crore (Najjar & Kettinger, 2013). This incentivizes businesses to ensure that they are not acquiring data through illegal means.

Additionally, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 impose further responsibilities on social media platforms and other intermediaries to ensure compliance with Indian laws. They include provisions for grievance redressal mechanisms, content takedown requests and increased transparency in content moderation (Paunksnis, 2023). The rules require intermediaries to enable identification of the first originator of information on their platforms, which has raised concerns about user anonymity and privacy.

Practical deployment of these policy frameworks often faces hurdles in India, such as compatibility issues, integration complexities, and the need for standardized protocols. Industries relying heavily on data monetization may also resist the adoption of stringent privacy measures due to potential revenue implications. Understanding these dynamics is essential for developing strategies that align business interests with privacy goals. Resistance could manifest in lobbying efforts, legal challenges, or circumvention strategies (Naimi & Westreich, 2014). Examining these challenges is crucial for the effective adoption of privacy-focused solutions. Striking this equilibrium involves overcoming technical obstacles in implementing security protocols without compromising the seamless and user-friendly aspects of digital interactions.

Conclusion

Since the attention economy is new compared to other, more established sectors of society, such as agriculture or manufacturing, it is imperative that we understand how to gain traction ethically without producing large-scale mental health ailments, distraction and overall economic inefficiency in society. Technology must be used as a boon to society, and it is up to governments to ensure that the balance between the economic gains of firms and consumers’ mental wellbeing rests at a point of equilibrium.

In tandem with governments and policymakers, other industry stakeholders such as business firms must follow a set of rules on how they may advertise and exploit behavioral economics and the human psyche to improve profits. At the end of the day, humans are irrational consumers who make rash decisions without fully understanding their consequences. They can either be educated so as not to be susceptible to the endless scroll of the attention economy, or they can spend hours listlessly on the Internet. Implementing policies grounded in ethical behavior is a task not just for governments and firms, but for consumers as well.

References

  • Ahlemeyer-Stubbe, A., & Coleman, S. (2018). Monetizing Data: How to Uplift Your Business. John Wiley & Sons.
  • Baecker, J., Engert, M., Pfaff, M., & Krcmar, H. (2020, March). Business Strategies for Data Monetization: Deriving Insights from Practice. In Wirtschaftsinformatik (Zentrale Tracks) (pp. 972-987).
  • Baragde, D., & Baporikar, N. (2017). Business innovation in Indian software industries. Journal of Science and Technology Policy Management, 8(1), 62-75.
  • Croxson, K., Frost, J., Gambacorta, L., & Valletti, T. M. (2022). Platform-based business models and financial inclusion. Bank for International Settlements, Monetary and Economic Department.
  • Hanafizadeh, P., Barkhordari Firouzabadi, M., & Vu, K. M. (2021). Insight monetization intermediary platform using recommender systems. Electronic Markets, 31(2), 269-293.
  • Jabłoński, A., & Jabłoński, M. (2020). Digital business models: Perspectives on monetisation. Routledge.
  • Jain, S., & Gabor, D. (2020). The rise of digital financialisation: The case of India. New Political Economy, 25(5), 813-828.
  • Lange, H. E., Drews, P., & Höft, M. (2021). Realization of Data-Driven Business Models in Incumbent Companies: An Exploratory Study Based on the Resource-Based View. In ICIS.
  • Naimi, A. I., & Westreich, D. J. (2014). Big data: a revolution that will transform how we live, work, and think. American Journal of Epidemiology, 179(9), 1143-1144.
  • Najjar, M. S., & Kettinger, W. J. (2013). Data Monetization: Lessons from a Retailer’s Journey. MIS Quarterly Executive, 12(4).
  • Ofulue, J., & Benyoucef, M. (2024). Data monetization: insights from a technology-enabled literature review and research agenda. Management Review Quarterly, 74(2), 521-565.
  • Parvinen, P., et al. (2020). Advancing data monetization and the creation of data-based business models. Communications of the Association for Information Systems, 47, 25-49. https://doi.org/10.17705/1cais.04702
  • Paunksnis, Š. (2023). India digitalized: surveillance, platformization, and digital labour in India. Inter-Asia Cultural Studies, 24(2), 297–310. https://doi.org/10.1080/14649373.2023.2182942
  • Shukla S, George JP, Tiwari K, Kureethara JV. (2022). Data privacy. In: Data ethics and challenges. SpringerBriefs in Applied Sciences and Technology. Springer, Singapore. https://doi.org/10.1007/978-981-19-0752-4_2
  • Shukla, S., Bisht, K., Tiwari, K., & Bashir, S. (2023). Data Monetization. In: Data Economy in the Digital Age. Data-Intensive Research. Springer, Singapore. https://doi.org/10.1007/978-981-99-7677-5_3
  • Van’t Spijker, A. (2014). The new oil: using innovative business models to turn data into profit. Technics Publications, LLC.
  • Vila Seoane, M. F. (2021). Data securitisation: the challenges of data sovereignty in India. Third World Quarterly, 42(8), 1733-1750.
  • Wiener, M., Saunders, C., & Marabelli, M. (2020). Big-data business models: A critical literature review and multiperspective research framework. Journal of Information Technology, 35(1), 66-91.