A theoretical and policy approach to platform regulation
Digital platforms used to be spaces where consumers interacted with advertisers within a complex web of data relations (Gillespie, 2018). But they have since moved beyond the purely economic domain and emerged as de facto infrastructures for social life: a trajectory that runs from ‘Platform Capitalism’ (Srnicek, 2017) to the ‘Platform Society’ (Van Dijck et al., 2018), a concept capturing how deeply platforms are embedded in social, economic, cultural, and legal institutions and practices (Van Dijck et al., 2018, p. 2).
If the digital platform is understood as a ‘social figuration’ (Couldry & Hepp, 2017; Elias, 1982), this paper discerns two core processes within it that have immense ordering power: datafication and personalization (see Figure 1).
Datafication consists of massive user surveillance (Zuboff, 2019) and the categorization (Gandy, 1993; Kitchin, 2014) of the collected data; it is oriented more toward knowledge. Personalization comprises ever-modulating predictions based on that data and a hyper-fragmentation of the users, consumers, or citizens to whom those predictions are applied; it is inclined more toward practice.
Predictive analytics is now widely used in areas such as risk assessment (insurance, policing, health, banking, credit, and loans), recommender systems (e-commerce and entertainment), and hypernudging (advertising and political campaigning). The more data are fed into predictive engines, the more accurate their predictions become; platforms are therefore the best-placed hosts for these engines.
Fragmentation is the automated process of segmenting audiences into ever-smaller chunks so that they can be served tailored suggestions, messages, services, and products.
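To make the two sub-processes concrete, the following minimal Python sketch walks through datafication (collection and categorization), prediction, and fragmentation. Every name and data point in it is hypothetical, and the modal-topic ‘model’ is a deliberately crude stand-in for the far richer proprietary systems platforms actually run.

```python
# Illustrative sketch of the datafication -> personalization pipeline.
# All names and values are hypothetical; no real platform API is shown.
from collections import defaultdict

# Datafication: behavioural events are collected per user ...
events = [
    ("user_a", "clicked_ad", "sports"),
    ("user_a", "watched", "sports"),
    ("user_b", "watched", "cooking"),
    ("user_b", "clicked_ad", "cooking"),
    ("user_c", "watched", "sports"),
]

# ... and categorized into a profile of interest counts.
profiles = defaultdict(lambda: defaultdict(int))
for user, action, topic in events:
    profiles[user][topic] += 1

# Prediction: the modal topic stands in for a predictive model scoring
# what each user is likely to engage with next.
predictions = {u: max(topics, key=topics.get) for u, topics in profiles.items()}

# Fragmentation: users are segmented into the smallest useful chunks,
# here simply grouped by predicted interest, so that tailored messages,
# services, or products can be targeted at each segment.
segments = defaultdict(list)
for user, topic in predictions.items():
    segments[topic].append(user)

for topic, users in segments.items():
    print(f"segment '{topic}': {users} -> serve tailored {topic} content")
```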
The potential implications of mass personalization and its two sub-processes raise serious ethical concerns, from the erosion of autonomy and solidarity to threats to democracy and social justice (Yeung, 2018). When social action can be accurately predicted at the individual level, various actors may be able to nudge (Thaler & Sunstein, 2008) or influence users more effectively toward certain actions with commercial or political aims (Mills, 2022; Sunstein, 2012, 2013). This may undermine citizen autonomy, the foundation of participatory democracy (Leggett, 2014; Yeung, 2017, 2018).
Such concerns have incited global debates about the regulation of platforms and the artificial intelligence models embedded in them. Building on the conceptual model above, this paper proposes a regulatory approach.
Platform neutrality is the term proposed here for a legal and technical unbundling of a platform’s core code, its algorithms (or AI models), and the user data, which would enable users to run third-party algorithms on any platform. In other words, platforms would be made neutral with respect to the algorithms and AI models they use. This would instigate a free market for algorithms, in which concerns about monopoly, transparency, accountability, and privacy could be addressed through competition.
Figure 2. Platform layers
A similar approach was devised in the 1990s, when the unbundling of computer hardware, operating systems, and software transformed personal computing around the world. In the platform era, the core code is the equivalent of the hardware, the algorithms of the operating system, and the user data of the software.
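The layering in Figure 2 can be expressed as a minimal interface sketch. The Python below assumes a hypothetical plugin API (`FeedAlgorithm`, `Platform`); no existing platform exposes anything like it, and all names are purely illustrative.

```python
# A minimal sketch of the proposed unbundling, under the assumption of a
# hypothetical plugin interface; not any platform's actual architecture.
from abc import ABC, abstractmethod

class FeedAlgorithm(ABC):
    """Algorithm layer: swappable and third-party, like an OS on hardware."""
    @abstractmethod
    def rank(self, items: list[dict], user_data: dict) -> list[dict]:
        ...

class ChronologicalFeed(FeedAlgorithm):
    """One interchangeable ranking policy among many on an open market."""
    def rank(self, items, user_data):
        return sorted(items, key=lambda i: i["timestamp"], reverse=True)

class Platform:
    """Core-code layer: hosts content and user data, but stays neutral
    about which ranking algorithm is plugged in."""
    def __init__(self, algorithm: FeedAlgorithm):
        self.algorithm = algorithm  # user-chosen, third-party supplied

    def build_feed(self, items, user_data):
        return self.algorithm.rank(items, user_data)

# The user-data layer remains separate and portable across algorithms.
feed = Platform(ChronologicalFeed()).build_feed(
    [{"id": 1, "timestamp": 2}, {"id": 2, "timestamp": 5}],
    user_data={"user_id": "user_a"},
)
print([item["id"] for item in feed])  # -> [2, 1]
```

The design point is that the core-code layer never needs to know which ranking policy is installed, just as hardware is indifferent to the operating system it boots.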
Some examples may be useful. A private company could sell alternative navigation algorithms, to be used as plugins in Google or Apple Maps, that suggest routes with lower emissions, routes that channel custom to local shops, or routes that are safer at night (or that would become safer if more traffic were directed along them).
Another company could provide newsfeed algorithms for Facebook or Instagram that not only explain what kinds of content they prioritize but also allow users to customize their own, or their children’s, newsfeeds; a sketch of such a plugin follows.
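Continuing the hypothetical interface sketched above, a vendor’s alternative newsfeed algorithm might look like the following; the class name, field names, and scoring weights are all invented for illustration.

```python
# A hypothetical third-party newsfeed plugin with a disclosed policy:
# demote flagged posts, boost educational ones. With the interface
# sketched earlier it would subclass FeedAlgorithm and be installed via
# Platform(FamilySafeFeed()); it is shown standalone here.
class FamilySafeFeed:
    def rank(self, items, user_data):
        def score(item):
            s = item.get("quality", 0)
            if item.get("educational"):
                s += 10    # disclosed boost for educational content
            if item.get("flagged"):
                s -= 100   # disclosed demotion of flagged content
            return s
        return sorted(items, key=score, reverse=True)

posts = [
    {"id": 1, "quality": 5, "flagged": True},
    {"id": 2, "quality": 3, "educational": True},
    {"id": 3, "quality": 4},
]
print([p["id"] for p in FamilySafeFeed().rank(posts, user_data={})])  # -> [2, 3, 1]
```

Because the policy is declared in inspectable code rather than buried in a proprietary model, competing vendors could be compared, audited, and switched at will, which is where the market pressure on transparency and accountability would come from.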
Platform neutrality is a creative way of turning neoliberal market dynamics against themselves for the public good.
References
Cheney-Lippold, J. (2018). We are data: Algorithms and the making of our digital selves. NYU Press.
Couldry, N., & Hepp, A. (2017). The mediated construction of reality: Society, culture, mediatization. Polity.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
Elias, N. (1982). The civilizing process. Pantheon Books.
Gandy, O. H. (1993). The panoptic sort: A political economy of personal information. Westview.
Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. SAGE.
Leggett, W. (2014). The politics of behaviour change: Nudge, neoliberalism and the state. Policy & Politics, 42(1), 3–19.
Mills, S. (2022). Personalized nudging. Behavioural Public Policy, 6(1), 150–159.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Plantin, J. C., & Punathambekar, A. (2019). Digital media infrastructures: Pipes, platforms, and politics. Media, Culture and Society, 41(2), 163–174.
Srnicek, N. (2017). Platform capitalism. Polity.
Sunstein, C. R. (2012). The Storrs Lectures: Behavioral economics and paternalism. Yale Law Journal, 122, 1826.
Sunstein, C. R. (2013). Impersonal default rules vs. active choices vs. personalized default rules: A triptych (May 19, 2013).
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Nudge: Improving Decisions about Health, Wealth, and Happiness., x, 293–x, 293.
Turow, J. (2012). The daily you: How the new advertising industry is defining your identity and your worth. Yale University Press.
Turow, J. (2017). The aisles have eyes: How retailers track your shopping, strip your privacy, and define your power. Yale University Press.
Van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford University Press.
Van Dijck, J., Poell, T., & De Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.
Yeung, K. (2017). ‘Hypernudge’: Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136.
Yeung, K. (2018). Five fears about mass predictive personalisation in an age of surveillance capitalism. International Data Privacy Law, Forthcoming.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power (First edition). PublicAffairs.