



How Digital Platforms Organize Immaturity:
A Sociosymbolic Framework of Platform
Power
Martín Harracá
University of Surrey, UK

Itziar Castelló
City University of London, UK

Annabelle Gawer
University of Surrey, UK

The power of digital platforms and the increasing scope of their control over
individuals and institutions have begun to generate societal concern. However, the
ways in which digital platforms exercise power and organize immaturity—defined
as the erosion of the individual’s capacity for public use of reason—have not yet
been theorized sufficiently. Drawing on Bourdieu’s concepts of field, capitals, and
habitus, we take a sociosymbolic perspective on platforms’ power dynamics,
characterizing the digital habitus and identifying specific forms of platform power
and counterpower accumulation. We make two main contributions. First, we
expand the concept of organized immaturity by adopting a sociological perspective,
from which we develop a novel sociosymbolic view of platforms’ power dynamics.
Our framework explains fundamental aspects of immaturity, such as self-infliction
and emergence. Second, we contribute to the platform literature by developing a
three-phase model of platform power dynamics over time.

Key Words: organized immaturity, autonomy erosion, digital platforms, power, surveillance

Organized immaturity, defined as the erosion of the individual’s capacity for the
public use of reason (Scherer and Neesham 2020), differs from other forms of
control in that it is a self-inflicted and emergent (as opposed to orchestrated) collec-
tive phenomenon in which autonomy-eroding mechanisms mutually reinforce each
other (Scherer and Neesham 2020, 9).
The phenomenon of autonomy erosion and increasing user control has been
discussed in the context of the dark side of digitalization (Flyverbom, Deibert,
and Matten 2019; Trittin-Ulbrich et al. 2021). Scholars have looked at how the
automation of interactions through algorithms can lead to an emergent manipulation
of choice and autonomy erosion (Alaimo and Kallinikos 2017; Beer 2017; Just and
Latzer 2017; Orlikowski and Scott 2015), but there is still little exploration of the
organizing role of platforms in this process.

Business Ethics Quarterly (2023), pp. 1–33. DOI: 10.1017/beq.2022.40
Published by Cambridge University Press on behalf of the Society for Business Ethics.
© The Author(s), 2023.
Digital platforms have been described as organizational forms that orchestrate
activities between independent users through the use of digital interfaces (Gawer
2014, 2021; Constantinides et al. 2018; Cusumano et al. 2019; McIntyre et al. 2021).
Increasingly, scholars denounce the negative effects of power accumulation by
digital platforms and platform owners. For example, studies of the structural con-
stitution of markets criticize gatekeeping positions that impose discriminatory
clauses or limit content access and creation, with consequences for users’ choices
(Crémer et al. 2019; Jacobides 2021; Khan 2018). Other researchers, such as Kelkar
(2018), Stark and Pais (2020), and Flyverbom et al. (2019), discuss sociomaterial
perspectives on platforms and show how platform owners design the interfaces,
prescribing what is accessible to users and what choices they may enjoy in the digital
platform; this, again, restricts choice and creates negative psychological effects on
users (Seymour 2019; Wu et al. 2019). Lanier (2018) and Zuboff (2019) present
systems of surveillance promoted by the power of digital platforms that explain how
the datafication of human experience leads to increasing forms of domination.
These studies provide valuable explanations of how the increasing power of
platforms hinders freedom of choice and individual autonomy. However, their
explanations are partial, focusing either on the market mechanisms that limit con-
sumer choice or on the specific role of digital objects, such as algorithms, that
constrain the platform users’ autonomy. However, the fundamental aspects of the
organizing of immaturity, such as the tension between organizing and emergence
and the relationship between self-infliction and the power accumulation strategies
of key agents such as platform owners, remain unexplored. These tensions are
essential to explaining how organized immaturity is created and reproduced. We claim
that there is a need to explain the power accumulation of the different agents of the
platforms and its relation to the mechanisms that lead to the delegation of autonomous
decision-making. Therefore, in this article, we ask, How do digital platforms organize
immaturity?
To tackle this issue, we build a sociosymbolic perspective of power accumulation
in digital platforms inspired by Bourdieu’s writings (Bourdieu and Wacquant 2007;
Bourdieu 1977, 1979, 1984, 1987, 1989, 1990, 1991, 2005, 2011, 2014). A socio-
symbolic perspective supports building a dynamic conceptualization of power
accumulation based on agents’ practices, positions, and strategies. The concepts
of field evolution and habitus allow further explanation of the emergence of imma-
turity and the mechanisms of self-infliction. By situating the concepts of fields,
capitals, and habitus in the context of digital platforms, we describe digital platforms
as organizations mediated by a digital infrastructure and a digital habitus in which
agents accumulate capitals by operating in a field. We explain the role of the digital
habitus in organizing immaturity, complementing prior literature on materiality and
affordances. We propose a framework of power accumulation in which the dynam-
ics of platform owner power accumulation and counterpower accumulation coexist.
The platform owner accumulates power in five forms: constitutional, juridical,
discursive, distinction, and crowd. There are two forms of counterpower: crowd
and hacking. We also explain the evolution over time of the power dynamics and
propose a three-phase model in which the forms of power operate. These phases are
platform formation, platform domination within the original field, and platform
cross-field expansion.
This framework makes two significant contributions. First, we build a theoretical
apparatus that explains the organizing dynamics of immaturity by explaining the
relations between the structure, the digital objects, and the platform owner’s power
accumulation strategies. From these, we can explain the tension of emergence and
self-infliction. With this framework, we draw on sociological perspectives to expand
the understanding of organized immaturity in digital spaces by focusing on describ-
ing the practices that constitute the webs of relations that configure the digital habitus
and the processes of power accumulation. Second, we contribute to the platform
literature by developing a three-phase model of platform power dynamics over time.
This model expands current views on platform power, providing a more holistic
scheme in which power is both accumulated and contested and highlighting how
agents other than the platform owner play a role in producing and exercising forms of
power. This article concludes by providing policy recommendations on how to
understand and tackle organized immaturity and highlighting potential avenues
for further research.

ORGANIZED IMMATURITY
Organized immaturity has been defined as a collective, albeit not necessarily orches-
trated, phenomenon where independent reasoning is delegated to another’s guidance
(Scherer and Neesham 2020). It is inspired by the Kantian principle that humans
should have intellectual maturity involving autonomy of judgment, choice, and
decision-making without the guidance of an external authority. It also relates to
the ability to use experience to reason and reflect critically and ethically on complex
or problematic situations and to challenge norms and institutions (Scherer and
Neesham 2020). The concept of organized immaturity differs from other forms of
control in two ways. First, it is a “self-inflicted” (Kant, as cited in Scherer and
Neesham 2020, 8) process, referring to harm done by humans to themselves, often in
a nonconscious manner. From this perspective, “immaturity” is therefore a condition
of the human being that arises when an individual defers or delegates their
own autonomous reasoning to external authorities (Dewey 1939). The second
way in which organized immaturity differs from other forms of control is that it is
an emergent (as opposed to orchestrated) collective phenomenon in which
autonomy-eroding mechanisms mutually reinforce each other (Scherer and Nee-
sham 2020, 9).
According to Scherer and Neesham (2020), the study of immaturity also relates to
its organizing elements. The perpetuation of modern forms of immaturity has been
associated with organizations and institutions that create the conditions for self-
inflicted immaturity. Organized forms of immaturity have been addressed in the
critical analysis of bureaucratic organizations, where the individual is subject to
various forms of domination and control (Clegg 1989; Hilferding 2005).
The Fourth Industrial Revolution (Schwab 2017; Philbeck and Davis 2018) has
ushered in a consolidation of the globalized information and communication tech-
nologies that are driving the organization of economic life. However, the infrastruc-
tures and mechanisms behind these sociotechnological systems curb individual
liberties and impact people’s autonomy (O’Connor and Weatherall 2019; McCoy,
Rahman, and Somer 2018).
The term organized immaturity is not explicitly used in most of the literature
studying forms of control related to digitalization (with the exception of Scherer and
Neesham [2020] and Scherer et al. [2020]), but scholars are increasingly analyzing
the “dark side of digitalization” (Flyverbom et al. 2019; Trittin-Ulbrich et al. 2021).
In particular, attention has been directed to the use of big data and systems based on
artificial intelligence and to how the automation of interactions through algorithms
can lead to an emergent manipulation of choice. Even the basic algorithmic function
of search and match creates power asymmetries, since the inspection or control of its
guiding principles presents technical challenges for both users and regulators (Beer
2017; Just and Latzer 2017). Biases might be found in the criteria for how results
are limited, displayed, and sorted (Faraj, Pachidi, and Sayegh 2018) and may even
amplify properties of the data used as input, as has been observed in the context of
racial biases (Noble 2018). Researchers are increasingly pointing at the importance
of unpacking the consequences of algorithms in conjunction with a socially struc-
tured analysis of the device (e.g., Beer 2017; Introna 2016; Orlikowski and Scott
2015). Through this, they show how the “black box of algorithmic culture”
(Orlikowski and Scott 2015; Pasquale 2015; Striphas 2010) creates a world of
secrecy that eschews questioning and abrogates responsibility (Introna 2016), erod-
ing autonomous decision-making.
However, this emphasis on the artificial intelligence tools, algorithms, and coding
processes that hinder autonomy in decision-making must be complemented by
research into the organizing structures of immaturity, that is, the key organizing
agents. Studying digital platforms can improve understanding about how organized
immaturity happens, as these platforms organize social interactions and transform
the power relations of the different agents who participate in the digital exchanges.

PLATFORMS AND THE ACCUMULATION OF POWER

Platforms as Organizing Agents


In the platform literature, digital platforms have been described as new organiza-
tional forms that orchestrate activities between independent users through the use of
digital interfaces (Gawer 2014; Kretschmer et al. 2022; McIntyre et al. 2021).
Platforms can be considered a “particular kind of technology of organizing”
(Gulati, Puranam, and Tushman 2012, 573) or “hybrid structures between organi-
zations and markets” (Kretschmer et al. 2022, 4), as they use a mixture of market and
hierarchical incentives to coordinate autonomous agents. Platform organizations
are distinct from hierarchies, markets, and networks (Gawer 2014) because, as
Kornberger et al. (2017, 81) argued, “platform organizations question not only
extant organization designs but also, quite fundamentally, the [Coasian] idea of
the firm … and … of value creation processes.”
Two fundamental characteristics define the digital platform as an organizing
agent: how its digital architecture is structured and how it coordinates interactions.
From an organizational perspective, platforms can be described by the common set
of design rules that define their technological architecture. This system is charac-
terized by a “core” or center component with low variety and a complementary set of
“peripheral” components with high variety (Tiwana, Konsynski, and Bush 2010).
The rules governing interactions among the parts are the interfaces (Baldwin and
Woodard 2009). Interfaces help reduce a system’s complexity by greatly
simplifying the scope of information required to develop each component (Gawer
2014). Together, the center, the periphery, and the interfaces define a platform’s
architecture (Baldwin and Woodard 2009). The center–periphery structure therefore
defines an asymmetric framework in which the participants collaborate and compete
(Adner and Kapoor 2010), under conditions set by the platform owners on two
elements: openness and governance rules (Gawer and Henderson 2007; Boudreau
2010).
Platforms coordinate transactions by creating “multisided markets,” in which
their owners act as intermediaries to bring together (match) and facilitate exchanges
between different groups of users by aligning market incentives (Rochet and Tirole
2003). Interactions occur in a networked structure, implying that the value derived
from platform usage increases exponentially with each additional user (Katz and
Shapiro 1985). As the value for participants grows with the size of the platform, it is
optimal for them to converge on the same platform, leading to the prediction that
platforms will tend to create concentrated markets organized by increasingly pow-
erful owners (Caillaud and Jullien 2003; Evans 2003).
Platforms’ Accumulation of Power and the Consequences for Individuals’
Autonomy Erosion
The characteristics of platforms described in the preceding section have facilitated
the accumulation of power by platform owners, leading to “new forms of domination
and competition” (Fuchs 2007, 7) that are increasingly eroding people’s capacity to
make independent decisions. The consequences of the platforms’ power accumu-
lation for manipulation of choice and autonomy delegation have been analyzed from
two perspectives: first, in relation to the structural constitution of markets and how
this structure can lead to manipulation of users’ choices, and second, from a socio-
material perspective that looks at the interaction of digital objects (e.g., algorithms)
and the platform users.
From the perspective of the structural constitution of markets, the accumulation of
power and manipulation of choice are associated with the growing centrality of large
platforms in the economy. Consumers and business partners can have their choices
manipulated because of the specific intermediary role that platforms play. Once the
market has been tipped, this role provides the platform owner with a position from
which they can charge supramonopoly prices and define the rules of the market,
including who can access it and how the transactions occur (Busch et al. 2021; Khan
2018; Jacobides 2021). In this way, platforms are increasingly operating as gate-
keepers, imposing discriminatory clauses or limiting content access and creation
(Stigler Committee on Digital Platforms [SCDP] 2019; Furman 2019). Choice
can also be limited by market concentration driven by platforms, in that
a platform enhances its owner’s opportunities to leverage its assets (Khan 2018).
Thus the owner can entrench their (platform’s) position in a market and enter an
adjacent one by creating economies of scale and scope (Khan 2018; Jacobides
2021). This brings the possibility of creating a dominant position in apparently
unrelated markets through practices like vertical integration, killer acquisitions, predatory
pricing, and self-preferencing (Crémer et al. 2019; Furman 2019). In addition, the
capture and control of transactional data may be used to improve platform services,
while also enabling the creation of entry barriers that fend off competition (Khan
2018).
Market-based analyses provide a view of power accumulation based on asset
control and market position. However, they have been criticized for overlooking the
impact of other noneconomic dimensions and for portraying power as relatively
unidirectional (Margetts et al. 2021; Lynskey 2017, 2019). Such critiques recognize
that the deep social impact of platform power cannot be tackled from a market
perspective alone (Margetts et al. 2021; Lianos and Carballa-Smichowski 2022).
Sociomaterial perspectives place affordances and materiality of the digital objects
at the center of the platform interactions (Fayard and Weeks 2014; Kornberger 2017;
Curchod et al. 2019). In this perspective, digital objects, such as code, interfaces, and
algorithms, are described as central objects that can hinder autonomy. For example,
when platform owners design the interfaces, they define the category of user,
prescribing what is accessible to users and what choices they enjoy in the digital
platform (Kelkar 2018). Encoding, which comprises the rules for how offline objects
and actions are translated into a digital language (Alaimo and Kallinikos 2017), is
also defined by platform owners. Once codified, actions must be performed in
accordance with the rules established by the platform. Thus the affordances of
technology shape and mold the interactions of the users with the platforms
(Alaimo and Kallinikos 2017). Furthermore, algorithms and codes have been
denounced for their opacity (Etter and Albu 2021). The inspection and control of
a platform’s guiding principles present technical challenges for both users and
regulators (Beer 2017), which enables manipulation. For example, Seymour
(2019) and Wu et al. (2019) describe how the manipulation design techniques
employed by platform firms like Facebook and Twitter are worrying not only
because they affect an individual’s freedom of choice but also because they can
cause users to experience harmful psychological effects, such as addiction.
Yet the aforementioned studies of affordances and materiality offer a limited
understanding of how the emergence and self-infliction of organized immaturity are
patterned by the strategic choices of platform owners and other agents. To further
understand the organized immaturity of digital platforms, it is important to look at
how practices are shaped and organized by the relations between the technological
objects, the different users’ strategies, and the structural elements that shape the
platform’s accumulation of power.
Some scholars have begun to offer holistic models that explain the accumulation
of power by platform firms and its consequences for the autonomy erosion of different
agents. Lanier (2018) and Zuboff (2019) describe digital platforms’ datafication of
human experience, which leads to increasing forms of domination in what they term
“surveillance capitalism.” Surveillance is enabled by the asymmetric positions of
platform owners and users, defined by technological architecture, and executed
through monetization strategies based on user data. Zuboff (2019) argues that
despite the explicit narrative of platforms as both positive and objectively inevitable,
their strategies and business models—based on voluntary data sharing—are funda-
mentally connected to the extraction of economic rents. Surveillance reduces human
experience to free raw material for translation into behavioral data and prediction
products (Zuboff 2019), eroding individual autonomy and disrupting intellectual
privacy (Richards 2012). Surveillance has become a naturalized practice that we all
—willingly or not—perform (Lyon 2018). Surveillance theories therefore contrib-
ute to this debate by offering an understanding of the instrumental connection
between the business model and technological objects that constitute the platform
and the self-infliction aspects of immaturity processes.
Yet, we argue that further work is needed to understand not only the expansion of
immaturity through a system of economic surveillance but also how the everyday
practices of leading and participating in the platform relate to immaturity emergence.
Moreover, we argue that these views should be enriched with a theory of how agency
is constituted and transformed by platform power dynamics, how these dynamics
have an organizing role in producing and reproducing the delegation of autonomous
decision-making, and how the emergence of immaturity and the strategic power
accumulation by platform owners are connected.

A SOCIOSYMBOLIC PERSPECTIVE OF DIGITAL PLATFORMS


To further explain how platforms organize immaturity, we draw on Bourdieu’s
sociosymbolic theory and the concepts of field, capitals, and habitus. A sociosym-
bolic perspective situates the agents in a field and explores the power accumulation
dynamics of each agent. It takes materiality into consideration, but, through
the concept of habitus, it is able to explain how interactions are also mediated by
previous history and the networks of relations in a way that complements the notion
of affordances and its connotations for the perception of physical artifacts and
technology (Fayard and Weeks 2014). Furthermore, a sociosymbolic approach
allows us to build an integrative conceptualization of power accumulation and its
dynamics based on agents’ practices, positions, and strategies. It shows how multiple
types of powers can coexist and accounts for how the relative positions of agents
shape their motivations and actions, explaining the practices of immaturity and its
relation to self-infliction. We explain this further, first by providing an overview
of how a sociosymbolic perspective generally explains power and its dynamics
through the concepts of field, capital, and habitus; we thus show how digital
platforms can be understood through these lenses. Second, we describe the dynamics
that lead to specific forms of power accumulation and explain how they can evolve
over time.
Fields, Capitals, and Habitus in Digital Platforms
Bourdieu’s sociosymbolic theory was developed to explain social stratification and
dynamics in (offline) societies by focusing on how agents (people, groups, or
institutions) produce, reproduce, and transform social structures through practice
(i.e., what they do in everyday life). Through practice, agents produce particular
social spaces with specific boundaries demarcated by shared interests and power
relations; these social spaces are termed fields of practice (Bourdieu and Wacquant
2007).
Fields
A field (champ) is a key spatial metaphor in Bourdieu’s work. It represents “a
network, or a configuration, of objective relations between positions” (Bourdieu
and Wacquant 2007, 97). These positions are objectively defined “to field occu-
pants, agents or institutions … by their present and potential position (situs) in the
structure of the distribution of species of power (or capital)” (Bourdieu and Wac-
quant 2007, 97). Individuals, groups, or organizations can be agents in a given field,
and one individual may have different agencies (or “roles”), depending on their
situation in the field.
The concept of “field” can be related to digital platforms in the sense that the
organization and production of practices situates the platform in relation to an
existing field. This may be the field of cultural production (e.g., Facebook) or the
field of goods exchange (e.g., Amazon). The fields have specific logics and struc-
tures that define them. Different agents can have multiple roles; for example, an
Instagram user may be both a contributor and a consumer of content. The relational
aspects of the fields are also very compatible with network-based perspectives
(Portes 1998) because the field in which the platform is embedded functions on
the basis of relations created during the practice of exchanges that constitute the
field. The technological infrastructure creates a center–periphery structure, which
provides the foundation on which the practices occur, both enabling and regulating
them. This approach to platforms highlights the practice of the agent and its position
but also simultaneously shows how the platform’s constitutive elements are deeply
interconnected. Taking Twitter as an example, the extent to which a specific content
generated by a user is reproduced depends on the user’s social position in the
network but also on the priorities defined by the platform’s algorithms, which create
the structure in which the content is shared.
Multiple nested and overlapping fields can be found on any platform, just as they
are in any (offline) social context. For example, YouTube constitutes a huge field of
people broadly interested in sharing and viewing online video content. However,
YouTube also hosts a variety of other, more focused subfields, for instance, a field
centered on cryptocurrency videos. At the same time, platforms do not necessarily
constitute a field in its entirety, for while some online fields exist mostly in a single
platform, like the field of video content sharing on YouTube, competing platforms
have entered some subfields, such as gaming videos on Twitch. Still other online
fields are embedded in larger fields of practice. For example, job seekers
would look at job opportunities on LinkedIn while engaging offline with the
job-offering companies.
Yet the creation of a digital platform can also be conceptualized as an attempt to
“enclose” part of a field: an agent (the platform creator) designs a value creation
model for users (the specific practices to be performed by them within the field) and
develops the digital infrastructure that makes interactions possible. Digital platforms
enclose the field because they attempt to create “exclusive control rights” (Boyle
2003) over dimensions of practices that were previously in the public domain.
Consider Google’s Street View, launched in 2007, which permits users to view
the fronts of buildings from a pedestrian’s viewpoint. The service utilizes photo-
graphs taken by Google of objects that are not covered by intellectual property
rights, even though the photographs were taken without the authorization or agreement
of the communities, and their use is monetized (Zuboff 2019). In this case, Google
Street View becomes not only a new service for users but also a new way of
exploiting value through dispossession of public goods and private data (Zuboff
2019).
A field enclosure by a platform also includes encoding social interactions defined
by more or less variable practices (e.g., hailing a taxi on the street) into a precisely
defined process in a controlled space (using a ride-hailing app). This appropriation is
produced through the codification of social interactions, control over the digital
space, and the data generated by these interactions. Moreover, by enclosing a field,
digital platforms modify both the practices and the agents’ relative positions. For
example, drivers and passengers are inscribed into a database owned by the platform
owner and are organized into groups from which they are picked and matched.
Furthermore, the creation of the platform can transform the scope of the field.
Digitalized practices often involve connecting with deeply intimate aspects of users’
lives (Lupton 2016), such as private data exemplified in photos, comments, or
information about consumption habits. While typically regarded as private, the
encoding of these portions of experience puts them into the potential reach of a field
and exposes them to its specific field logic. Furthermore, because of the new ways of
performing certain practices, platforms collide with the established scopes of the
field, changing the agents and institutions involved in it. This is the so-called
disruptive nature (SCDP 2019) of the platform. Examples can be found in conflicts
around regulatory frameworks triggered by the introduction of platforms to some
industries, such as Uber’s entry into the field of transportation and Airbnb’s into
hospitality.
Capitals
Fields are dynamic spaces defined by the relations of power between players that
constitute the structure of the field (Bourdieu and Wacquant 2007). These relations
result from “the possession and activation of resources that are both materially and
symbolically produced and perceived” (Bourdieu 1989, 16). These resources are the
capitals.
The accumulation of capitals gives access to “the specific profits that are at stake in
the field, as well as by their objective relation to other positions (domination,
subordination, homology, etc.)” (Bourdieu and Wacquant 2007, 97). In each of
the specific fields, the spaces of objective relations are the sites of a logic specific to
those who regulate the fields. This logic does not need to follow purely economic
rationalities to be described (Sandberg and Alvesson 2011). For example, TikTok
users who copy their nearest higher-status digital neighbors in a particular contest or
“dance” might not be guided by economic rationality, but they do follow the logic of
the platform.
Capitals are therefore the resources—scarce and socially valued stocks of inter-
nalized abilities and externalized resources—that each agent has. Bourdieu defines
three fundamental forms of capital through which power is accumulated: economic
capital (money and other assets), cultural capital (knowledge and familiarity with
accepted norms), and social capital (reflected in the actor’s creation of connections
and social networks) (Bourdieu 2011). To these, Bourdieu adds symbolic capital,
“which is the form that one or another of these species takes when it is grasped
through categories of perception that recognize its specific logic, … [that] misrecognize the arbitrariness of its possession and accumulation” (Bourdieu and Wacquant 2007, 118), that is, the reflection in the relations of the field of accumulated
prestige, consecration, or honor (Bourdieu 1993). For Bourdieu, power struggles are
mainly symbolic, and agents who are willing to increase their power will ultimately
exercise the symbolic capital that will help them to be “perceived and recognized as
legitimate” (Bourdieu 1989, 17) in what Bourdieu (1984) also calls “distinction.”
Social dynamics in fields are centered on the generation of distinction(s) by
agents, who “constantly work to differentiate themselves from their closest rivals”
(Bourdieu and Wacquant 2007, 100), although the actors’ participation in these
games are typically no more than “unconscious or semi-conscious strategies”
(Bourdieu 1969, 118). Distinction operates through the accumulation of capital that
matters to the field. Thus fields are spaces of conflict and competition in which the
hierarchy is continually contested. However, agents can attempt to convert one form
of capital into another or transfer it to a different space, depending on the specific
logic of the field (Levina and Arriaga 2014).
The concept of distinction can be assimilated to the concept of “status” as it is used
to explain the means of interaction on digital platforms (Levina and Arriaga 2014).
For example, on digital platforms like YouTube, a user’s social network position and
cultural skills (e.g., their offline knowledge about a particular topic) combine with
their taste and the time and money they invest into the field. Together, these shape
which content gets noticed and which is ignored (Levina and Arriaga 2014) and
therefore which agents become “influencers” or agents with high status in the
network.


Habitus
Besides the description of how agents, through their collective actions, shape
emergent field structures and the understanding of which capital matters and how,
Bourdieu also looks at how structure shapes agency. Bourdieu uses the notion of
habitus to describe the socially learned schemata of perception and inclinations to
action (Bourdieu and Wacquant 2007). Habitus is the internalization of the logic of
the field. It is a set of historical relations incorporated within individual bodies in the
form of mental and corporeal schemata (Ignatow and Robinson 2017). These
relations, or the “system of schemes of perception and appreciation of practices,
cognitive and evaluative structures,” are “acquired through the lasting experience of
a social position” (Bourdieu 1989, 19); that is, they are acquired through interaction
with other social agents. The habitus includes related comportment (posture and
gait), aesthetic likes and dislikes, habitual linguistic practices, and ways of evaluating oneself and others via categories. It forges not only actions but also desires and
aspirations (Ignatow and Robinson 2017). While cognitively embedded, it is also
embodied in gestures, postures, movements, and accents (Ignatow and Robinson
2017). Its reproduction depends mainly on institutions like family and school.
Mastery of the habitus tends to guarantee distinction and constancy of practice over
time (Bourdieu 1990).
Crucially, the constitution of the habitus is recursive: while agents can reshape
social distance and the ways it may be perceived, their own perception is likewise
framed by their own position in the social structure. This recursive cycle is the
process of constitution of the sociosymbolic space, where changes in position can be
understood as the outcome of symbolic struggle. Habitus is therefore a way of
conceptualizing how social structures influence practice without reifying those
structures (Costa 2006).
In his studies of class, taste, and lifestyles, Bourdieu (1984) illustrates how habitus
shapes taste in ways that make a virtue out of necessity. For example, working-class
people develop a taste for sensible, plain food, furnishings, and clothes, and they
shun fancy extravagances (Bourdieu 1984). Hence habitus leads to the “choice of the
necessary,” and in so doing, it tends to generate practices that ultimately reproduce
the original objective conditions, through which it functions as structure (Costa
2006). Thus, given a set of conditions, “habitus affords an actor some thoughts and
behaviors and not others, making those thoughts and behaviors seem more appropriate, attractive, and authentic than others” (Fayard and Weeks 2014, 245). Ultimately, however, it is the actor who decides what to do. Often the decision occupies
no conscious thought, but, as Bourdieu (1990, 53) argues, it is “never ruled out that
the responses of the habitus may be accompanied by strategic calculation tending to
perform in a conscious mode.”
The concept of digital habitus has been used in the analysis of digital spaces (e.g.,
Levina and Arriaga 2014; Julien 2015; Ignatow and Robinson 2017; Romele and
Rodighiero 2020) to explain the ways of acting, namely, the social and technologically ingrained habits, skills, and dispositions that define the practices in the digital
field. Ignatow and Robinson (2017) argue that digital machines are not only the


crystallized parts of habitus but also habitus producers and reproducers. This is
because practices performed in digital platforms have technological and symbolic
mediations: they are digitized—coded—and they are performed through a constant
interaction with algorithms and the data that feed the learning of the algorithms. For
algorithms to constitute the habitus, they need the platform to be able to extract
increasingly large amounts of data and transform them into capital. In this context,
the data work as the culture that informs the knowledge about the social space. The
norms of the platform are constantly shaped by the interaction between the data, the
algorithm, and the agents. The capital created by this interaction can be appropriated
by certain agents who know how to use these results to their advantage.
The mechanism of the digital habitus has two consequences. First, as socialization increasingly takes place through digital platforms, the algorithmic logic becomes a norm that everyone needs to learn to play by or with (Beer 2017), and thus it becomes part of the habitus. It becomes the representation of the current taste of a social class or group, so that their decisions resemble each other. However, unlike the offline habitus, it derives from code as well as from action; thus it is partly defined behind closed doors by the platform owners. Second, as Ignatow and Robinson
(2017) argued, the digital habitus becomes a (re)generator of the social group
because it is mediated by the property of the algorithmic practice that relates to
aggregation for prediction. The singularities of social agents are reduced to aggregates of decisions, actions, desires, and tastes. This phenomenon has been called “personalization without personality” (Ignatow and Robinson 2017, 100), personality being the principle that gives unique style to each human process of individualization.
Having set the theoretical apparatus to explain how digital platforms can be
understood from a sociosymbolic perspective, we turn now to defining how digital
platforms accumulate power and how power accumulation increases the problem of
organized immaturity.
A Sociosymbolic Perspective of Power Accumulation and Its Consequences for
Organized Immaturity
Building on Bourdieu’s later writings on the State and its forms of power (Bourdieu
1989, 2014) and in light of the latest developments of digital platforms and their
accumulation of power, we direct our analytic attention to the platform owner and its
relations with the other platform agents and sociodigital objects. Thus, we go beyond
the extant analysis of distinction in digital platforms done by scholars of digital
sociology (e.g., Julien 2015; Ignatow and Robinson 2017), which focuses on users, to
capture the mechanisms of field transformation led by platform owners in their
relationship with the other platform agents. We follow Bourdieu (2014) in terming
these mechanisms “forms of power” and showing how these contribute to explaining
organized immaturity.
Drawing on Bourdieu’s writings (Bourdieu 1984, 1989, 1991), we define the
forms of power, distinguishing between two general dynamics. We first define five
forms of power (constitutional, juridical, discursive, distinction, and crowd) that
drive the accumulation of power within the platform. Second, inspired by recent


literature on platforms (Ziccardi 2012; Eaton et al. 2015; Krona 2015; Bucher et al.
2021), we show how counterpower can also be performed by end users and other
peripheral agents through crowd and hacking power. Crowd and hacking power are
not concepts derived directly from Bourdieu’s theory, but they provide a more comprehensive view of power accumulation dynamics.
We then articulate the platform power dynamics through three phases of platform
evolution, which are derived from an interpretation of platform innovation research
(Cutolo and Kenney 2020; Kolagar, Parida, and Sjödin 2022; Rodon, Modol, and
Eaton 2021; Teece 2017): formation, where the platform is launched and starts to be
used by agents; domination, where the platform has been widely adopted and
operates under a relatively stable design within the original field; and cross-field
expansion, where the platform expands to other fields, leveraging its accumulation
of power. Although we describe for each stage the dominant forms of power and
counterpower accumulation that enable the transformation of the field, we acknowl-
edge that several forms of power coexist in these phases, that the evolution of
platforms is often nonlinear, and that not all platforms will become dominant.
Forms of Platform Power
Constitutional Power
Constitutional power is the ability to “transform the objective principles of union and
separation, … the power to conserve or to transform current classifications”
(Bourdieu 1989, 23). Within the platform, this power comprises both the architectural design (platform layers and modularity, design of user interfaces and experiences) and the capacity to define the rules, norms, categories, and languages that
make up the digital interactions. Constitutional power shapes the digital medium for
interactions and defines what may and may not be accessed by each type of agent
within the platform.
Constitutional power is exercised mainly by the platform owner. As the provider
of the digital infrastructure upon which other agents collaborate, the owner defines
the symbolic space through code. Code symbolically creates the objects that constitute the relations: it is a neat, unified, and unambiguous language with no openings for interpretation (Lessig 2009). In the digital realm, the actor who manages the code can increase its symbolic imposition and therefore its legitimization.
As the legitimation process is unified, creation and transformation are delegated.
This legitimation is “world-making” (Bourdieu 1989), as it explicitly prescribes the
possible realities and actions. The platform owner is therefore able to hold a monopoly
over legitimate symbolic violence (Bourdieu 1989), having a differential capacity to
influence and settle symbolic struggle. The possibility of obtaining and activating this
symbolic capital is associated with complex technological competences, which are
scarce and highly concentrated (Srnicek 2016; Zuboff 2019).
The coherent body of code adopted by the symbolic space through constitutional
power is not a neutral technical medium (Beer 2017; Gillespie 2010), and it can
erode users’ autonomy. Code is created and transformed in accordance with the
objectives of the platform owner and correspondingly managed toward these goals.


For example, Kitchens et al. (2020) show how the differences in platform design for
Facebook, Twitter, and Reddit create a differentiated impact on the diversity of news
and the type of content their users consume. Calo and Rosenblat (2017) and Walker
et al. (2021) find that the algorithmic design in Uber reduces drivers’ insights about
their working conditions and the competition they face, hindering their autonomy.
Even without assuming strategic manipulation, users’ limited and repetitive symbolic action implies a delegation of their own independent reasoning and the emergent coordination of their actions by the platform.
Juridical Power
Along with the definition of its architecture, a second feature critical to a platform’s success is its governance. While constitutional power has to do with the
design of governance, juridical power is the capacity to sanction via the created rules
and the authority to arbitrate in disputes (Bourdieu 1987, 2005). Typically, it can
take a variety of forms, such as sanctioning rule infringement, reporting abuses, or
managing access to the platform (Adner and Kapoor 2010).
Digital technologies can enable increased participation and distribution of roles
among agents, which is why studies of governance in these contexts have favored the
idea that digitalization processes are highly democratizing (von Hippel 2006; Zittrain 2009). However, the hierarchical structure of digital platforms facilitates the
creation of governance layers, meaning that the importance of those decisions can be
easily packaged, resulting in a limited distribution of power in the field. For example,
transaction-oriented platforms like Amazon, eBay, and Uber rely on user-based
rating systems to ensure good quality and sanction inadequate behavior; however,
the platform owner designs the rankings and retains control of other actions, such as
account activation and suspension (Gawer and Srnicek 2021).
This role division effectively creates and redistributes power and therefore
restricts the capacity of some agents to interact without the intervention of the digital
platform owner. Hence the definition and distribution of roles will interact with (and
eventually transform) the authority structure and the conflict management mechanisms that preexist in the field, including regulation. For example, Valdez (2023)
explores how Uber uses what she calls “infrastructural” power to deploy a strategy of
“contentious compliance,” both adapting to and challenging existing regulation.
This strategy allows the company to exploit differences in regulation and regulatory
scrutiny to reduce users’ access to information and acquired rights.
Discursive Power
A third distinctive form of power that characterizes agents’ strategic interplay is
discursive power. Discursive power is the power exercised in linguistic exchanges,
which are embodied and learned but also generative of the habitus (Bourdieu 1991).
The way agents talk about platforms and the words they use to explain them—these
discourses configure the collective narrative of what is possible on and valuable in a
platform.
Platforms are narrated as part of a broader, already-institutionalized rational-
technological narrative in which customer-centrism, effectiveness, and rationality


of the exchanges are dominant values (Gillespie 2010; Garud et al. 2022). Technological determinism discourses promoted by platform owners reinforce the idea that
platforms’ algorithms are inscrutable and of a complexity unfathomable to the public
or the regulator (Martin 2022; Pasquale 2015). These discourses have led to a
broader narrative of a “Manifest Destiny” (Maddox and Malson 2020) of digital
platforms, where the user is explicitly asked to delegate their own reasoning to the
platform. This, alongside user dispersion, is a fundamental element that enables
prescribing actions. Critical to maintaining user dispersion is the narrative that users
are directly connected through the platform, which is presented as an agora of
exchanges. In actuality, platforms mediate that interaction, formatting it, regulating
it, or even suspending it.
Distinction Power
Distinction power is the creation of categories and the mechanisms of categorization
that drive choice in the platform. It builds on the concept of distinction proposed by
Bourdieu (1984). It defines the rules and practices that inhabit the habitus and
designates which of them are legitimated and considered by society to be natural.
The purpose of this type of power is to produce a behavioral response that serves
some agents’ specific accumulation of capital. The platform owner can influence
user behavior by modifying the interfaces, the encoding, and the algorithms, thereby
manipulating the user’s decision-making. At the same time, users can access and
activate this power through their digital habitus, allowing them to influence and
drive other users’ choices.
On platforms, distinction power is often exercised through what Kornberger,
Pflueger, and Mouritsen (2017) call evaluative infrastructures. Evaluative infrastructures are the different interactive devices, such as rankings, ratings, or reviews, that establish an order of worth among the users of the platform, driving the “attention” (Goldhaber 1997) of other users. They relate agents and their contributions with each other, but they are also instruments of power. They define not only
how agents are perceived and ranked in the community but also how the hierarchy is
monetized by the platform’s owners (Kornberger et al. 2017). Status markers are
examples of how distinction power is exercised. As they define how user activity and
loyalty to the platform are rewarded, they become a fundamental element in guiding
agents’ accumulation strategies. For example, YouTube and Wikipedia changed
their strategy for recognizing content to stimulate newcomers (Kornberger et al.
2017). Ignatow and Robinson (2017) refer to this process as the “übercapital.”
Übercapital emphasizes the position and trajectory of users according to the scoring,
gradings, and rankings and is mobilized as an index of superiority that can have
strong reactive or performative effects on behavior (Ignatow and Robinson 2017).
A key feature of distinction power is that it is exercised heterogeneously over
different users through differences created by constitutional and juridical power.
Different types of users are granted different forms of agency, not only by the
platform designers but also by their own intervention on the platform (Levina and
Arriaga 2014). For instance, passive users may be granted agency through technological features. For example, YouTube gives agency to passive users by displaying


the number of views. Merely by viewing a piece of content, individuals cast a vote on
its value, which has significant consequences for the content producers. Other users
become judges or “raters” and producers of information at the same time. For
example, retweeting on Twitter is both a contribution to the platform and an act
of evaluation. As well as users who are raters, there are often users who act also as
“expert evaluators” (users who have accumulated significant cultural capital). One
such example is the “superdonor” on crowdfunding platforms like Kickstarter,
whose expert evaluations influence which projects are funded. Expert evaluators
tend to form a tight-knit group within a field (Vaast, Davidson, and Mattson 2013;
Aral and Walker 2012). Other users might have what Bourdieu called “institutionalized consecration” (Levina and Arriaga 2014), which is the formal authority to
evaluate content given by the platform designers. These are typically site moderators
and community managers, who have more power than others to judge contributions
(Levina and Arriaga 2014). In sum, these different types of agencies are designed by
the platform owners to orient users’ actions and to promote and demote content
(Ghosh and Hummel 2014). They are typically linked to how the platform owner
designs revenue models (Zuboff 2019).
The forms of power presented so far tend to reinforce the power position of the
platform owner, but there are other forms of power that create opposing tensions, that is, counterpower accumulation. These are crowd and hacking power.
Crowd Power
In the accumulation process, users are in a unique position in that they are the agents
who produce the platform’s activity. Crowd power results from the influence that
users can exert on the platform by the sheer mass of their actions, which may or may
not be coordinated (Bennett, Segerberg, and Walker 2014; Culpepper and Thelen
2020). These practices are, in essence, the exercise of the digital habitus. The
exercise of the habitus can have a long-lasting effect on the platform’s structure.
Practices can both inspire new functionalities and generate unexpected transformations to the value proposition, which the platform owner can recapture through redesigning the code. For example, this has been observed in the sharing and creator economies, in which, because the provider side of the platform is the main value creator—for example, graphic designers, programmers—the platform owner periodically changes the design to facilitate delivery of that value (Bucher et al. 2018;
Bhargava 2022).
As Bourdieu (1990) argued, the agents ultimately decide what they do, and the
digital habitus may be accompanied by strategic calculation, even if most of the
practices are bound by parameters defined by the platform owners and managed
automatically by algorithms. This creates the opportunity for practices not aligned
with the value proposition to go viral, eventually posing challenges to the balance
envisioned in the platform design. For example, Krona (2015) uses the notion of
“sousveillance”—an inverted surveillance “from the bottom” or “from many to a
few”—to describe the novel use of an audiovisual sharing platform by social
movements during the Arab Spring uprising. This emergent use emphasizes the
emancipatory potential of users to create collective capabilities and decision-making


(Ziccardi 2012), which we designate as crowd platform power–challenging forms of power.
Yet, platform owners can attempt to use crowd power in their favor, in what we
call the crowd platform power–enhancing forms of power, through constitutional
power (architecture design, limiting the possibility of contact between users), juridical
power (policing and sanctioning users), and distinction power (by shaping the eval-
uative infrastructure). For example, Thelen (2018) shows how Uber “weaponized”
the volume of its users in a regulatory dispute by introducing a button on its
interface that would send a templated complaint email to local government on the
user’s behalf.
Hacking Power
Hacking power is the ability to identify the features and categories of digital spaces,
such as overlooked programming errors and ungoverned areas, that may be used for
a different purpose than the one originally intended (Jordan 2009; Hunsinger and
Schrock 2016). There are numerous examples in the literature of expressions of this
type of power in digital platforms. Eaton et al. (2015) have described the continuous
cycles of resistance and accommodation performed by groups of hackers and Apple
that surround the jailbreaking of each new release of iOS. Bucher et al. (2021)
and Calo and Rosenblat (2017) have shown how workers learn to anticipate patterns
in algorithms that control their work processes and use this knowledge to defend
themselves from abuses.
Hacking power is the antithesis of individual immaturity, as it requires not only
the exercise of independent reasoning but also a degree of understanding of the
specific system in which the power is exercised. It is deliberate and purposeful,
unlike crowd power, which is independent of users’ understanding because it stems
from the combined volume of their actions. At the same time, hacking power
necessarily operates in the margins or interstices of the platform. Furthermore,
hacking power can be thought of as opposed to the constitutional and juridical
powers; as such, it will be dispersed, under the radar, and is often considered illegal
(Castells 2011). This makes it difficult to create and accumulate this power in the field and consequently to use it to challenge other forms of power. Table 1
summarizes the different forms of power and provides further examples.
Platform Power Dynamics
By discussing platforms in the context of fields, we have shown how the relations
between the different key agents can be understood through dynamics of power
accumulation. On one hand, users activate their capitals through the production of the practices that configure the digital habitus, which enhances their understanding of the ways of participating on the platform. On the other hand, it is mainly the platform owner who captures most of the value creation process through constitutional, juridical, discursive, and distinction power. This uneven distribution facilitates the creation of a leveled field upon which the relative positions can be consolidated while, at the same time, enlarging the distance between agents and thereby reducing their capacity to decide in an autonomous way.

Table 1: Forms of Platform Power and the Organization of Immaturity

Constitutional power
Definition: Design and control of the platform’s architecture (modules, interfaces, layers, and algorithms) and the capacity to define the rules, norms, categories, and languages that make up the digital interactions.
Organization of immaturity: Explicitly prescribes realities and actions, granting the platform owner a monopoly over legitimate symbolic violence, leading to the delegation of users’ own independent reasoning and the emergent coordination of their actions by the platform.
Examples of the use of platform power:
• Definition of user requirements to join (e.g., require proof of identity to join Airbnb)
• Definition of possible interactions (e.g., the introduction of the “Like” button on Facebook)
• Definition of allowed and forbidden user actions (e.g., the impossibility of editing a tweet on Twitter)

Juridical power
Definition: Capacity to sanction via the created rules and the authority to arbitrate in disputes.
Organization of immaturity: Disciplines users’ voice and participation and aligns them to the platform’s interests and values.
Examples of the use of platform power:
• Sanction rule infringement (e.g., suspension of a Lyft driver’s account for using an alternative route)
• Report abuses (e.g., users flagging inappropriate content on Instagram)
• Management of access to the platform (e.g., restrict blacklisted users from using Tinder)

Discursive power
Definition: Power exercised in linguistic exchanges, which are embodied and learned but are also generative of the habitus.
Organization of immaturity: Shapes collective discourse; asks users to delegate their own reasoning to the platform.
Examples of the use of platform power:
• Discourses of efficiency, technological determinism, or complexity promoted by the platform owners (e.g., accuracy and neutrality of Google Search results)
• Narratives created within users’ communities (e.g., the superiority of PC/Windows gamers over Mac users)

Distinction power
Definition: Creation of categories and the mechanisms of categorization that drive a user’s choice in the platform.
Organization of immaturity: Enacts the platform owner’s capability to shape behavior and the digital habitus.
Examples of the use of platform power:
• Definition of users’ performance, status, or visibility metrics (e.g., definition of a property’s valuation metrics in Booking)
• Creation of differentiated tools to define hierarchies among users (e.g., ability to view profiles while remaining anonymous for LinkedIn premium users)

Crowd power
Definition: Users’ influence on platforms by the shared mass of their actions, coordinated or not; can become manipulated by the platform owner.
Organization of immaturity: Implicitly contests immaturity when used against platform power accumulation.
Examples of the use of platform power:
• User viralization of a message or practice (e.g., coordination of a protest through a Telegram channel)
• Force a change in the platform’s functionalities (e.g., introduction of feedback options for service providers in UpWork)
• Manipulation of users by covertly coordinating their actions (e.g., mobilization of Uber’s users to settle a regulatory dispute)

Hacking power
Definition: Exploitation of a platform’s features and categories for a different purpose than the one originally intended.
Organization of immaturity: Explicitly contests individual immaturity, as it requires the exercise of independent reasoning and the understanding of the platform’s rules.
Examples of the use of platform power:
• Evade the platform’s restrictions on functionality (e.g., jailbreaking of Apple’s iOS)
• Performance of forbidden practices (e.g., get away with selling counterfeit products on Amazon)
• Abuse the logic of the algorithm in users’ favor (e.g., disguise a user’s IP address to access additional content on Netflix)
20 B E Q

Figure 1: Platform Power Dynamics over Time

We articulate these power dynamics through the phases of platform evolution to explain how platforms transform the agents’ relative positions over time and how this transformation affects the organization of immaturity. Figure 1 depicts the evolution in three phases.
Phase 1: Platform Formation and Field Enclosure
In platform formation, the primary objective for the platform owner is to get users to
adopt the platform and regularly perform their practices on it. Constitutional, juridical, and discursive power are three of the forms of power through which platform owners attempt to enclose the field, organizing the emergence of immaturity, for example, by designing a value creation model to create “exclusive control rights” (Boyle 2003) over dimensions of practices that were previously in the public domain. At the same time, these forms of power organize immaturity. First, constitutional power (in the form of rules, norms, categories, and languages) defines how
and under what conditions interactions are performed and how the different agents
can express their preferences. Through juridical power, platform owners can define the sanctions that promote or restrict an agent’s capacity to operate on the platform: for example, who can and cannot exercise their voice on the platform and what sanctions will be applied to misbehavior. Finally,
discursive power creates a common narrative about the value of the platform,
restricting the capacity of agents to think beyond discourses that are presented as
truths.
Phase 2: Platform Domination within Original Field
Platform adoption and sustained use create the conditions for the platform to
increasingly occupy the field. The growing participation of agents on the platform can change
the predominant accumulation logics of the different agents in the field, shaping the
digital habitus. The process of capital accumulation of different agents leveraged by
distinction power defines further how immaturity is organized by promoting the
processes of self-infliction of immaturity. Capital accumulation on platforms is
expressed as more data and the levying of fees, and the influx of users is repurposed
as capital to develop the platform. In turn, users invest different combinations of
capitals (data about their practices, social networks, money, and other assets) with
logics of both consumption (purchasing, sharing digital content) and profit and
accumulation (influencer, merchant, driver). To thrive in the accumulation dynam-
ics, agents must increasingly invest themselves in the platform, adapting their
strategies so they align with those that are relevant to the platform and embedded
in the digital habitus. Users adapt their practices to learn the specific logics. This
brings user practices closer to their archetypical category, and because they are better
rewarded, it further legitimizes the practices of the digital habitus. When users grasp
the critical elements of the digital habitus that correspond to their type, their practices
enjoy the viral thrust that characterizes the platform logic (e.g., they
may become social media superusers or influencers). This success in increasing the
capital leveraged by mechanisms of distinction power, such as ratings on the
platform, calls for higher investment, increasing the users’ dependence on the
platform and thus contributing to the self-inflictive process of immaturity.
At the same time, the processes that reinforce platform power accumulation
coexist with other processes that create tensions that call for change and adjustment.
Misalignments between users’ practices and their expected behavior can quickly
accumulate, destabilizing the platform’s operation or posing challenges for its
governance. In addition, platforms with massive user bases and innumerable inter-
actions can become problematic for the platform owner to police, creating the space
for agents to exercise their hacking power. These counterpower accumulation forces
can therefore create an emergent enlightenment—as opposed to immaturity—for the
agents.
Phase 3: Platform Cross-Field Expansion
In a third phase, the platform’s domination over the field leads to the possibility of
integrating new fields, further contributing to the accumulation of power and the
organizing of immaturity. Once a platform has become the dominant agent, a
position in which the structure itself acts on the owner’s behalf, it can expand the
scope of the field to new geographies and users and even enter and integrate
previously separated fields. For example, Uber’s launch of Uber Eats was deployed
using the platform’s extant base of users (drivers and passengers, now also viewed
as diners).
From the domination position, the owner can operate in the various fields with
great freedom, changing the exchange rate of the capitals at play and accumulating
power. Highly dominant expressions of constitutional power include interoperabil-
ity lock-ins, the use of dark patterns and biased information that impede sovereignty
of choice, and digital workplace design and control. Juridical power can be com-
manded from a position of gatekeeping, permitting arbitrary suspension of users’
accounts, biased arbitration in a dispute, the imposition of discriminatory clauses,
restriction of access to the platform, or limits on freedom of speech. Abuses of power
are typically supported by the discursive power that enacts the discourse of Manifest
Destiny and uses opaque arguments to justify the increased accumulation of power
and the need to enforce the juridical power measures of the platform owner. Also in
this phase, a full deployment of distinction power relates to the platform owner’s
ability to monopolize the capture and processing of data through control of the
technological architecture. This can be used to drive user choice in multiple ways,
such as through information asymmetries about the activities of a market or its
participants, or through political influence on social media platforms.
The activation of powers in the cross-field expansion phase depicts the dynamics
within a field in a given moment, but it does not mean that the dominion of the
platform owner is absolute or that the platform becomes a “total institution”
(Bourdieu and Wacquant 2007). What we highlight is how the field’s structural
homology with the platform facilitates a fast concentration of power and creates
considerable obstacles to modifying this situation, whether from within (due to users’
habituation) or from outside the platform (because of network and lock-in effects,
barriers to entry, and technical complexity). In addition, the form of dominion
that the platform’s specific logic enables is very effective because, by creating
multisided businesses, it renders invisible the specific dynamics of accumulation
and struggle behind the core practices users perform. For example, Amazon is “a place
to buy and sell online,” and the fact that the company accumulates capital from the
capture of user data and the use of its sellers’ capitals is not evident to the platform’s
users. Thus the platform’s “rules of the game” may appear to be somewhat objective
and relatively neutral, but they are in fact part of the organization of immaturity.

DISCUSSION
In this article, we present a sociosymbolic approach to power dynamics in digital
platforms and how they relate to organizing immaturity. A sociosymbolic approach
explains the structural and agentic dynamics of power accumulation leading to
organized immaturity. We contrast the power asymmetries between the platform
owner, as the central coordinating agent, and the rest of the agents directly
participating in the platform to present four main forms of power enacted by the
platform owner: constitutional, juridical, discursive, and distinction. We also present
two forms of power that explain how users counteract the platform owner’s power
accumulation: crowd and hacking. We explain how these forms of power are
fundamental for understanding the different ways in which immaturity is organized.
We show that constitutional power limits the symbolic world of the users and
therefore their capacity to influence new rules and vocabularies that orchestrate
participation. We explain how through juridical power, the platform owners have the
capacity to define the sanctions that restrict the voice and participation of users. We
show how through the digital habitus, the logic of the field is constituted, explaining
the emergence of immaturity and its self-infliction. We also show how distinction
power enacts the platform owner’s capability to shape behavior by creating
evaluative infrastructures that mediate the emergence of immaturity. Furthermore,
we argue that the construction of a narrative of omniscience, through
discursive power, explicitly asks users to delegate their own reasoning to the
platform. We also highlight the existence of forms of power (hacking and crowd)
that help users to accumulate power and resist the central authority of the platform
owners.
Finally, we describe power dynamics and their relation to organized immaturity
through three phases: first, platform formation, where forms of power—mainly
constitutional, juridical, and discursive—operate to promote the field enclosure
and set the basis for immaturity to occur; second, platform domination in the field,
where distinction power promotes the field reproduction and processes of self-
infliction of immaturity, while hacking and crowd power create resistance to the
central authority; and third, platform cross-field expansion, in which power accu-
mulation dynamics lead to the integration of new fields and increasing dynamics of
immaturity. In defining the power accumulation dynamics, we explain the emergent
character of immaturity and its relation to agents’ strategies.
By focusing on the digital platform and its power dynamics, we contribute in two
ways to the current literature. First, we build a framework that explains the orga-
nizing dynamics of immaturity, based on the relations between the platform struc-
ture, the digital objects, and the agents’ strategies. Through this, we expand the
understanding of organized immaturity in the light of sociological perspectives. Our
framework analyzes how immaturity is constituted in practice, explaining and
nuancing its emergence and self-infliction dimensions. Second, we provide a
dynamic framework of platform power accumulation
contributing to the platform literature. Finally, we also provide policy recommen-
dations on how to tackle immaturity, and we highlight potential avenues for further
research.
Rethinking Organized Immaturity from a Sociosymbolic Perspective
A sociosymbolic perspective on digital platforms and their power dynamics can push
the boundaries of current concepts of organized immaturity toward a post-Kantian
and more sociologically grounded view (Scherer and Neesham 2020). This
contributes to the understanding of organized immaturity in three ways. First, it
explains the different components of the emergence of immaturity through power
struggles. We show how struggles are the result of agents’ different strategies,
heterogeneously shaped by their positions on the platform and their practices, but
also by their discourses and the history of experiences of each individual that shape
the digital habitus. By showing the dynamics in these struggles, we contribute to
explaining the process through which immaturity emerges as a nonorchestrated
phenomenon.
Second, we explain self-infliction by moving away from the more political
understandings of autonomy erosion. Political perspectives of immaturity look at
the individual and its “(in)capacity for public use of reason” (Scherer and Neesham
2020, 1) and consider the “delegation of decision making to impersonal authorities
they cannot comprehend or control” (Scherer and Neesham 2020, 4) as a condition
of the individual. We, however, adopt a sociological view that focuses on the
generation of practices and places the individual in a space of sociosymbolic power
struggles. We complement previous literature exploring the symbolic aspects of
technology and its impacts on society and, more concretely, on autonomy erosion
(Zuboff 2019; Stark and Pais 2020; Fayard and Weeks 2014) by providing a set of
forms of power that articulate how self-infliction is embedded in the digital habitus
and thus how immaturity is organized. Our sociosymbolic perspective explains how
the conditions of agency are shaped by the specific structure of the platform and its
power dynamics.
Last, looking at fields through the power dynamics between the different agents
can shed explanatory light on the formation process of organized immaturity. The
relationship between habitus and field operates in two ways: while the field struc-
tures the habitus as the embodiment of the immanent necessity of a field, the habitus
makes the field a meaningful space in which the agents may invest their capitals and
themselves (Bourdieu and Wacquant 2007). By defining the stages through which
this relationship unfolds, we contribute to showing the emergent, dynamic, and
accumulative nature of organized immaturity.
Contribution to the Understanding of Platform Power Accumulation
We have approached organized immaturity by analyzing platforms as spaces of
coordination and production of practices, shaped by relations engrained into a digital
habitus and the logic of the field. By better understanding the forms of power and the
role they play in field transformation, we have identified more clearly the different
forms of power accumulation through which digital platforms can become vehicles
for organized immaturity and its dynamics. This contributes to the platform
literature in the following ways. First, our description of the structural process of
power accumulation on the platform expands market and network approaches
(Jacobides 2021; Khan 2018; Eaton et al. 2015) by showing the importance of the
social, cultural, and symbolic dimensions of capital. This lays the foundations for
fundamentally reconceptualizing platform power and further explaining how power
is exercised by the platform owner (Cutolo and Kenney 2020; Kenney, Bearson, and
Zysman 2019).
Second, we enrich structural approaches to platforms by presenting how fields can
be transformed through dynamics of power accumulation that extend beyond the
consequences of an asymmetric structure (Curchod et al. 2019; Hurni, Huber, and
Dibbern 2022). Furthermore, our framework shows how platforms can be reshaped
by the interaction of agents’ strategies and the reconfiguration of the fields. By
introducing a field view, we provide a more holistic scheme in which power is both
accumulated and contested. We also highlight how agents other than the platform
owner play a role in producing and exercising forms of power. This nuances our
understanding of field dynamics and agent interaction in the context of platform
power dynamics.
Third, our model complements sociomaterial studies on platform power (e.g.,
Beer 2017; Kornberger 2017; Stark and Pais 2020) with the notion of the digital
habitus and its relation to organized immaturity. Other authors have analyzed
technological affordances as social designations of a space and the social and
cultural factors that signify a space and set a generative principle of governance
(Jung and Lyytinen 2014). Although these authors do not talk explicitly about
habitus or social capital, they reflect on the generative reproduction of norms by
individuals in contact with their social spaces; this is very similar to Bourdieu’s
definition of habitus in social spaces. We complement the sociomaterial view of
platforms by showing how the digital habitus works and by emphasizing the role of
the platform as an organizing agent with a privileged capacity of capital accumula-
tion. We present the platform as a space of symbolically mediated power relation-
ships in which the digital objects and structural elements interplay to conform the
logic of the field. We provide an understanding of the multifaceted nature of power
as a process resulting from agents’ practices and strategies, the habitus, and capital
accumulation in a field. We argue that this conceptualization defines power in
platforms not only as an “instrument” (Zuboff 2019) at the service of the platform
owners but as a web of relations utilized by agents who can better exploit the
different forms of capital. We also contribute to the debate about the coordinating
role of platforms and how they create generative forms of distributed control while
power remains centralized, in an interplay between hierarchical and heterarchical
power relations (Kornberger et al. 2017).
Bourdieu’s (1977, 1990) concepts of capitals, habitus, and distinction have been
used before in the study of the social consequences of digitalization and platforms’
increase of power (e.g., Levina and Arriaga 2014; Fayard and Weeks 2014; Romele
and Rodighiero 2020). We complement that research with a view of platforms’
accumulation of power and its role in the organizing of immaturity. We go beyond
the explanation of distinction power to define constitutional, juridical, discursive,
crowd, and hacking forms of power, thereby offering a more complete view of how
platforms accumulate power and organize immaturity.
Contributions to Practice and Avenues for Future Research
Our article provides a conceptual framework to practitioners that can enable
platform owners, users, and policy makers to fundamentally rethink how they
might address the platforms’ negative consequences for society. First, it highlights
immaturity as a relevant concept to address social issues in platforms. Our detailed
understanding of the mechanisms leading to immaturity and the manipulation of
individuals’ decisions can help policy makers to identify and set limits on these types
of powers, especially in the light of platform domination. By explaining the orga-
nizing dynamics of immaturity, we direct attention to the more holistic assessments
of the social consequences of platforms. Concretely, we emphasize how these are not
just concerned with the concentration in specific industries (such as retailing or
advertising) but also involve constraints on human rights (such as freedom of
speech). Furthermore, we show how the consequences of organizing our practices
through platforms are embedded in social structures and expressed in the transfor-
mation of fields. We believe that this line of thought is fundamental if we are to
collectively rethink the social role of platforms.
Our article also has limitations that open up avenues for further research. We have
identified not only forms of platform power accumulation but also forms of platform
counterpower accumulation. As our focus in this article has been on how platforms
organize immaturity, we have devoted more attention to the forms of power
accumulation. However, future work is needed to deepen our understanding of how
platforms lose power. For example, in recent years, we have witnessed an increasing
backlash against big tech platforms, fueled by reputational scandals and vigorous
societal complaints (Joyce and Stuart 2021; Gawer and Srnicek 2021). We have also
observed a new wave of regulatory initiatives that intend to curb platforms’ power by
forcing interoperability and limiting self-preferencing and acquisitions (Cusumano
et al. 2021; Jacobides, Bruncko, and Langen 2020), even when the effectiveness of
these policies is being debated (Rikap and Lundvall 2021). For example, in Europe,
the new Digital Markets Act (European Commission 2022a) and the Digital
Services Act (European Commission 2022b) are intended to create more
contestability in digital platform markets. In the United States, there has
been intense debate around the possible revocation of Section 230, which has so far
provided a shield for platforms’ activities in social networks (SCDP 2019), leading
to abuses of power and increasing immaturity. In parallel to regulatory or external
counterpower mechanisms, research into power dynamics could also analyze the
flows of affects and affective intensification (Just 2019) that happen with the abuse
of the digital habitus. Incipient research (e.g., Just 2019; Castelló and Lopez-Berzosa
2023) has shown how these flows of affects not only shape collective meanings but
can also lead to increasing forms of hate speech and the renaissance of populist
politics. More should be researched about what forms of counterpower may emerge
in society to reduce populism and hate speech. We believe that our framework sets
grounds for studying the more concrete practices of immaturity in platforms but also
new forms of resistance.

CONCLUSION
Building on the concepts of fields, capitals, and habitus, we propose a sociosymbolic
framework to explain organized immaturity in digital platforms. We articulate six
forms of power that characterize the different ways in which platforms organize
immaturity. We suggest that a more precise understanding of the digital
platforms’ role in driving organized immaturity can become the basis for funda-
mentally rethinking the role of the digital platform in society. Can the processes that
lead to organized immaturity be reoriented toward organized enlightenment? We
argue that a first step in this direction is to better understand how power is performed
in digital platforms, which is what our framework contributes to explaining.

Acknowledgments

We express our gratitude to this special issue’s editorial team, and particularly to Dennis
Schoeneborn, for their extraordinary work in helping us develop this article into its current
form. We also express our appreciation to the three anonymous reviewers who provided
valuable guidance during the editorial process. We are thankful to the Research Council of
Norway, project “Algorithmic Accountability: Designing Governance for Responsible Dig-
ital Transformations” (grant 299178) and the British Academy, project “Fighting Fake News
in Italy, France and Ireland: COVID-19” (grant COVG7210059) for supporting this
research.


REFERENCES
Adner, Ron, and Rahul Kapoor. 2010. “Value Creation in Innovation Ecosystems: How the
Structure of Technological Interdependence Affects Firm Performance in New
Technology Generations.” Strategic Management Journal 31 (3): 306–33.
Alaimo, Cristina, and Jannis Kallinikos. 2017. “Computing the Everyday: Social Media as
Data Platforms.” Information Society 33 (4): 175–91.
Aral, Sinan, and Dylan Walker. 2012. “Identifying Influential and Susceptible Members of
Social Networks.” Science 337 (6092): 337–41.
Baldwin, Carliss Y., and C. Jason Woodard. 2009. “The Architecture of Platforms: A Unified
View.” In Platforms, Markets and Innovation, edited by Annabelle Gawer, 32.
Cheltenham, UK: Edward Elgar.
Beer, David. 2017. “The Social Power of Algorithms.” Information, Communication, and
Society 20 (1): 1–13.
Bennett, W. Lance, Alexandra Segerberg, and Shawn Walker. 2014. “Organization in the
Crowd: Peer Production in Large-Scale Networked Protests.” Information, Commu-
nication, and Society 17 (2): 232–60.
Bhargava, Hemant K. 2022. “The Creator Economy: Managing Ecosystem Supply,
Revenue-Sharing, and Platform Design.” Management Science 68 (7): 5233–51.
Boudreau, Kevin. 2010. “Open Platform Strategies and Innovation: Granting Access
vs. Devolving Control.” Management Science 56 (10): 1849–72.
Bourdieu, Pierre. 1969. “Intellectual Field and Creative Project.” Social Science Information
8 (2): 89–119.
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Cambridge: Cambridge University
Press.
Bourdieu, Pierre. 1979. “Symbolic Power.” Critique of Anthropology 4 (13–14): 77–85.
Bourdieu, Pierre. 1984. Distinction: A Social Critique of the Judgement of Taste. Cambridge,
MA: Harvard University Press.
Bourdieu, Pierre. 1987. “The Force of Law: Toward a Sociology of the Juridical Field.”
Hastings Law Journal 38 (5): 814–53.
Bourdieu, Pierre. 1989. “Social Space and Symbolic Power.” Sociological Theory 7 (1):
14–25.
Bourdieu, Pierre. 1990. The Logic of Practice. Redwood City, CA: Stanford University
Press.
Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, MA: Harvard Univer-
sity Press.
Bourdieu, Pierre. 1993. “Génesis y estructura del campo burocrático.” Actes de la Recherche
en Sciences Sociales 96–97: 49–62.
Bourdieu, Pierre. 2005. “Principles of an Economic Anthropology.” In The Handbook of
Economic Sociology, 2nd ed., 75–89. Princeton, NJ: Princeton University Press.
Bourdieu, Pierre. 2011. “The Forms of Capital.” In The Sociology of Economic Life, 3rd ed.,
edited by Mark Granovetter and Richard Swedberg, 241–58. New York: Routledge.
Bourdieu, Pierre. 2014. On the State: Lectures at the Collège de France, 1989–1992. Edited
by Patrick Champagne. Cambridge: Polity Press.
Bourdieu, Pierre, and Loïc Wacquant, eds. 2007. An Invitation to Reflexive Sociology.
Malden, MA: Polity Press.
Boyle, James. 2003. “The Second Enclosure Movement and the Construction of the Public
Domain.” Law and Contemporary Problems 66 (1): 42.


Bucher, Eliane, Christian Fieseler, Matthes Fleck, and Christoph Lutz. 2018. “Authenticity
and the Sharing Economy.” Academy of Management Discoveries 4 (3): 294–313.
Bucher, Eliane Léontine, Peter Kalum Schou, and Matthias Waldkirch. 2021. “Pacifying the
Algorithm: Anticipatory Compliance in the Face of Algorithmic Management in the
Gig Economy.” Organization 28 (1): 44–67.
Busch, Christoph, Inge Graef, Jeanette Hofmann, and Annabelle Gawer. 2021. Uncovering
Blindspots in the Policy Debate on Platform Power: Final Report. Luxembourg:
Publications Office of the European Union. https://platformobservatory.eu/app/
uploads/2021/03/05Platformpower.pdf.
Caillaud, Bernard, and Bruno Jullien. 2003. “Chicken and Egg: Competition among Inter-
mediation Service Providers.” RAND Journal of Economics 34 (2): 309–28.
Calo, Ryan, and Alex Rosenblat. 2017. “The Taking Economy: Uber, Information, and
Power.” SSRN Electronic Journal. https://doi.org/10/gfvmg3.
Castelló, Itziar, and David Lopez-Berzosa. 2023. “Affects in Online Stakeholder Engage-
ment: A Dissensus Perspective.” Business Ethics Quarterly 33 (1): 180–215.
Castells, Manuel. 2011. “Network Theory: A Network Theory of Power.” International
Journal of Communication 5: 773–87.
Clegg, Stewart R. 1989. Organization Theory and Class Analysis: New Approaches and
New Issues. New York: De Gruyter.
Constantinides, Panos, Ola Henfridsson, and Geoffrey G. Parker. 2018. “Introduction:
Platforms and Infrastructures in the Digital Age.” Information Systems Research
29 (2): 381–400.
Costa, Ricardo L. 2006. “The Logic of Practices in Pierre Bourdieu.” Current Sociology
54 (6): 873–95.
Crémer, Jacques, Yves-Alexandre de Montjoye, and Heike Schweitzer. 2019. Competition
Policy for the Digital Era. Luxembourg: Publications Office of the European Union.
https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf.
Culpepper, Pepper D., and Kathleen Thelen. 2020. “Are We All Amazon Primed? Con-
sumers and the Politics of Platform Power.” Comparative Political Studies 53 (2):
288–318.
Curchod, Corentin, Gerardo Patriotta, Laurie Cohen, and Nicolas Neysen. 2019. “Working
for an Algorithm: Power Asymmetries and Agency in Online Work Settings.”
Administrative Science Quarterly 65 (3): 644–76.
Cusumano, Michael A., Annabelle Gawer, and David B. Yoffie. 2019. The Business of
Platforms: Strategy in the Age of Digital Competition, Innovation, and Power.
New York: Harper Business.
Cusumano, Michael, Annabelle Gawer, and David Yoffie. 2021. “Can Self-Regulation Save
Digital Platforms?” Industrial and Corporate Change 30 (5): 1259–85.
Cutolo, Donato, and Martin Kenney. 2020. “Platform-Dependent Entrepreneurs: Power
Asymmetries, Risks, and Strategies in the Platform Economy.” Academy of Man-
agement Perspectives 35 (4): 584–685.
Dewey, John. 1939. Freedom and Culture. New York: Putnam.
Eaton, Ben, Silvia Elaluf-Calderwood, Carsten Sørensen, and Youngjin Yoo. 2015. “Dis-
tributed Tuning of Boundary Resources: The Case of Apple’s IOS Service System.”
MIS Quarterly 39 (1): 217–43.
Etter, Michael, and Oana Brindusa Albu. 2021. “Activists in the Dark: Social Media
Algorithms and Collective Action in Two Social Movement Organizations.” Orga-
nization 28 (1): 68–91.


European Commission. 2022a. “Deal on Digital Markets Act.” March 24. https://www.
europarl.europa.eu/news/es/press-room/20220315IPR25504/deal-on-digital-mar
kets-act-ensuring-fair-competition-and-more-choice-for-users.
European Commission. 2022b. “The Digital Services Act Package.” https://digital-strategy.
ec.europa.eu/en/policies/digital-services-act-package.
Evans, David. 2003. “Some Empirical Aspects of Multi-sided Platform Industries.” Review
of Network Economics 2 (3): 191–209.
Faraj, Samer, Stella Pachidi, and Karla Sayegh. 2018. “Working and Organizing in the Age
of the Learning Algorithm.” Information and Organization 28 (1): 62–70.
Fayard, Anne-Laure, and John Weeks. 2014. “Affordances for Practice.” Information and
Organization 24 (4): 236–49.
Flyverbom, Mikkel, Ronald Deibert, and Dirk Matten. 2019. “The Governance of Digital
Technology, Big Data, and the Internet: New Roles and Responsibilities for
Business.” Business and Society 58 (1): 3–19.
Fuchs, Christian. 2007. Internet and Society: Social Theory in the Information Age.
New York: Routledge.
Furman, Jason. 2019. Unlocking Digital Competition: Report of the Digital Competition
Expert Panel. https://assets.publishing.service.gov.uk/government/uploads/system/
uploads/attachment_data/file/785547/unlocking_digital_competition_furman_
review_web.pdf.
Garud, Raghu, Arun Kumaraswamy, Anna Roberts, and Le Xu. 2022. “Liminal Movement
by Digital Platform-Based Sharing Economy Ventures: The Case of Uber
Technologies.” Strategic Management Journal 43 (3): 447–75.
Gawer, Annabelle. 2014. “Bridging Differing Perspectives on Technological Platforms:
Toward an Integrative Framework.” Research Policy 43 (7): 1239–49.
Gawer, Annabelle. 2021. “Digital Platforms’ Boundaries: The Interplay of Firm Scope,
Platform Sides, and Digital Interfaces.” Long Range Planning 54 (5): 102045.
Gawer, Annabelle, and Rebecca Henderson. 2007. “Platform Owner Entry and Innovation in
Complementary Markets: Evidence from Intel.” Journal of Economics and Man-
agement Strategy 16 (1): 1–34.
Gawer, Annabelle, and Nick Srnicek. 2021. Online Platforms: Economic and Societal Effects.
Brussels: Panel for the Future of Science and Technology, European Parliament.
https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2021)656336.
Ghosh, Arpita, and Patrick Hummel. 2014. “A Game-Theoretic Analysis of Rank-Order
Mechanisms for User-Generated Content.” Journal of Economic Theory 154
(November): 349–74.
Gillespie, Tarleton. 2010. “The Politics of ‘Platforms.’” New Media and Society 12 (3):
347–64.
Goldhaber, Michael H. 1997. “The Attention Economy and the Net.” First Monday 2 (4).
Gulati, Ranjay, Phanish Puranam, and Michael Tushman. 2012. “Meta-Organization
Design: Rethinking Design in Interorganizational and Community Contexts.” Stra-
tegic Management Journal 33 (6): 571–86.
Hilferding, Rudolph. 2005. Finance Capital: A Study of the Latest Phase of Capitalist
Development. London: Taylor and Francis.
Hunsinger, Jeremy, and Andrew Schrock. 2016. “The Democratization of Hacking and
Making.” New Media and Society 18 (4): 535–38.
Hurni, Thomas, Thomas L. Huber, and Jens Dibbern. 2022. “Power Dynamics in Software
Platform Ecosystems.” Information Systems Journal 32 (2): 310–43.


Ignatow, Gabe, and Laura Robinson. 2017. “Pierre Bourdieu: Theorizing the Digital.”
Information, Communication, and Society 20 (7): 950–66.
Introna, Lucas D. 2016. “Algorithms, Governance, and Governmentality: On Governing
Academic Writing.” Science, Technology, and Human Values 41 (1): 17–49.
Jacobides, Michael G. 2021. “What Drives and Defines Digital Platform Power?” White
paper, Evolution Ltd., April 18. https://www.evolutionltd.net/post/what-drives-and-defines-digital-platform-power.
Jacobides, Michael G., Martin Bruncko, and Rene Langen. 2020. “Regulating Big Tech in
Europe: Why, So What, and How Understanding Their Business Models and Eco-
systems Can Make a Difference.” White paper, Evolution Ltd., December 20. https://
www.evolutionltd.net/post/regulating-big-tech-in-europe.
Jordan, Tim. 2009. “Hacking and Power: Social and Technological Determinism in the
Digital Age.” First Monday 14 (7).
Joyce, Simon, and Mark Stuart. 2021. “Trade Union Responses to Platform Work: An
Evolving Tension between Mainstream and Grassroots Approaches.” In A Modern
Guide to Labour and the Platform Economy, edited by Jan Drahokoupil and Kurt
Vandaele, 177–92. Cheltenham, UK: Edward Elgar.
Julien, Chris. 2015. “Bourdieu, Social Capital and Online Interaction.” Sociology 49 (2):
356–73.
Jung, Yusun, and Kalle Lyytinen. 2014. “Towards an Ecological Account of Media Choice:
A Case Study on Pluralistic Reasoning While Choosing Email.” Information Systems
Journal 24 (3): 271–93.
Just, Natascha, and Michael Latzer. 2017. “Governance by Algorithms: Reality Construction
by Algorithmic Selection on the Internet.” Media, Culture, and Society 39 (2):
238–58.
Just, Sine N. 2019. “An Assemblage of Avatars: Digital Organization as Affective Intensi-
fication in the GamerGate Controversy.” Organization 26 (5): 716–38.
Katz, Michael L., and Carl Shapiro. 1985. “Network Externalities, Competition, and
Compatibility.” American Economic Review 75 (3): 424–40.
Kelkar, Shreeharsh. 2018. “Engineering a Platform: The Construction of Interfaces, Users,
Organizational Roles, and the Division of Labor.” New Media and Society 20 (7):
2629–46.
Kenney, Martin, Dafna Bearson, and John Zysman. 2019. “The Platform Economy Matures:
Pervasive Power, Private Regulation, and Dependent Entrepreneurs.” SSRN Elec-
tronic Journal. DOI: 10.2139/ssrn.3497974.
Khan, Lina M. 2018. “Sources of Tech Platform Power.” Georgetown Law Technology
Review 2 (2): 325–34.
Kitchens, Brent, Steve L. Johnson, and Peter Gray. 2020. “Understanding Echo Chambers
and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts
in News Consumption.” MIS Quarterly 44 (4): 1619–49.
Kolagar, Milad, Vinit Parida, and David Sjödin. 2022. “Ecosystem Transformation for
Digital Servitization: A Systematic Review, Integrative Framework, and Future
Research Agenda.” Journal of Business Research 146 (July): 176–200.
Kornberger, Martin. 2017. “The Visible Hand and the Crowd: Analyzing Organization
Design in Distributed Innovation Systems.” Strategic Organization 15 (2): 174–93.
Kornberger, Martin, Dane Pflueger, and Jan Mouritsen. 2017. “Evaluative Infrastructures:
Accounting for Platform Organization.” Accounting, Organizations, and Society 60
(July): 79–95.


Kretschmer, Tobias, Aija Leiponen, Melissa Schilling, and Gurneeta Vasudeva. 2022.
“Platform Ecosystems as Metaorganizations: Implications for Platform Strategies.”
Strategic Management Journal 43 (3): 405–24.
Krona, Michael. 2015. “Contravigilancia y videoactivismo desde la plaza Tahrir: Sobre las
paradojas de la sociedad contravigilante” [Countersurveillance and video activism
from Tahrir Square: On the paradoxes of the countersurveillance society]. In
Videoactivismo y movimientos sociales [Video activism and social movements],
edited by David Montero and Francisco Sierra, 17. Barcelona: Gedisa.
Lanier, Jaron. 2018. Ten Arguments for Deleting Your Social Media Accounts Right Now.
New York: Random House.
Lessig, Lawrence. 2009. El Código 2.0 [Code: Version 2.0]. Madrid: Traficantes de Sueños.
Levina, Natalia, and Manuel Arriaga. 2014. “Distinction and Status Production on User-
Generated Content Platforms: Using Bourdieu’s Theory of Cultural Production to
Understand Social Dynamics in Online Fields.” Information Systems Research
25 (3): 468–88.
Lianos, Ioannis, and Bruno Carballa-Smichowski. 2022. “A Coat of Many Colours: New
Concepts and Metrics of Economic Power in Competition Law and Economics.”
Journal of Competition Law and Economics 18 (4): 795–831.
Lupton, Deborah. 2016. The Quantified Self. Hoboken, NJ: John Wiley.
Lynskey, Orla. 2017. “Regulating ‘Platform Power.’” LSE Legal Studies Working Paper
1/2017, London School of Economics.
Lynskey, Orla. 2019. “Grappling with ‘Data Power’: Normative Nudges from Data Protec-
tion and Privacy.” Theoretical Inquiries in Law 20 (1): 189–220.
Lyon, David. 2018. The Culture of Surveillance: Watching as a Way of Life. 1st ed. Medford,
MA: Polity Press.
Maddox, Jessica, and Jennifer Malson. 2020. “Guidelines without Lines, Communities
without Borders: The Marketplace of Ideas and Digital Manifest Destiny in Social
Media Platform Policies.” Social Media + Society 6 (2).
Margetts, Helen, Vili Lehdonvirta, Sandra González-Bailón, Jonathon Hutchinson, Jonathan
Bright, Vicki Nash, and David Sutcliffe. 2021. “The Internet and Public Policy:
Future Directions.” Policy and Internet. DOI: 10.1002/poi3.263.
Martin, Kirsten E. 2022. “Algorithmic Bias and Corporate Responsibility: How Companies
Hide behind the False Veil of the Technological Imperative.” In The Ethics of Data
and Analytics: Concepts and Cases, 36–50. New York: Auerbach.
McCoy, Jennifer, Tahmina Rahman, and Murat Somer. 2018. “Polarization and the Global
Crisis of Democracy: Common Patterns, Dynamics, and Pernicious Consequences
for Democratic Polities.” American Behavioral Scientist 62 (1): 16–42.
McIntyre, David, Arati Srinivasan, Allan Afuah, Annabelle Gawer, and Tobias Kretschmer.
2021. “Multi-sided Platforms as New Organizational Forms.” Academy of Manage-
ment Perspectives 35 (4): 566–83.
Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce
Racism. Illustrated ed. New York: NYU Press.
O’Connor, Cailin, and James Owen Weatherall. 2019. The Misinformation Age: How False
Beliefs Spread. New Haven, CT: Yale University Press.
Orlikowski, Wanda J., and Susan V. Scott. 2015. “The Algorithm and the Crowd: Consid-
ering the Materiality of Service Innovation.” MIS Quarterly 39 (1): 201–16.
Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms That Control Money
and Information. Cambridge, MA: Harvard University Press.
Philbeck, Thomas, and Nicholas Davis. 2018. “The Fourth Industrial Revolution: Shaping a
New Era.” Journal of International Affairs 72 (1): 17–22.


Portes, Alejandro. 1998. “Social Capital: Its Origins and Applications in Modern
Sociology.” Annual Review of Sociology 24 (1): 1–24.
Richards, Neil M. 2012. “The Dangers of Surveillance Symposium: Privacy and
Technology.” Harvard Law Review 126 (7): 1934–65.
Rikap, Cecilia, and Bengt-Åke Lundvall. 2021. The Digital Innovation Race: Conceptual-
izing the Emerging New World Order. Cham, Switzerland: Springer.
Rochet, Jean-Charles, and Jean Tirole. 2003. “Platform Competition in Two-Sided
Markets.” Journal of the European Economic Association 1 (4): 990–1029.
Rodon Modol, Joan, and Ben Eaton. 2021. “Digital Infrastructure Evolution as Generative
Entrenchment: The Formation of a Core–Periphery Structure.” Journal of Informa-
tion Technology 36 (4): 342–64.
Romele, Alberto, and Dario Rodighiero. 2020. “Digital Habitus or Personalization without
Personality.” HUMANA.MENTE Journal of Philosophical Studies 13 (37): 98–126.
Sandberg, Jörgen, and Mats Alvesson. 2011. “Ways of Constructing Research Questions:
Gap-Spotting or Problematization?” Organization 18 (1): 23–44.
Scherer, Andreas Georg, and Cristina Neesham. 2020. “New Challenges to Enlightenment:
Why Socio-technological Conditions Lead to Organized Immaturity and What to Do
about It.” SSRN Electronic Journal. DOI: 10/gj8mhq.
Scherer, Andreas Georg, Cristina Neesham, Dennis Schoeneborn, and Markus Scholz. 2020.
“Call for Submissions Business Ethics Quarterly Special Issue on: Socio-
technological Conditions of Organized Immaturity in the Twenty-First Century.”
Business Ethics Quarterly 30 (3): 440–44.
Schwab, Klaus. 2017. The Fourth Industrial Revolution. New York: Crown Business.
Seymour, Richard. 2019. “The Machine Always Wins: What Drives Our Addiction to Social
Media.” Guardian, August 23, sec. Technology.
Srnicek, Nick. 2016. Platform Capitalism. Malden, MA: Polity Press.
Stark, David, and Ivana Pais. 2020. “Algorithmic Management in the Platform Economy.”
Sociologica 14 (3): 47–72.
Stigler Committee on Digital Platforms. 2019. “Final Report.” Stigler Center for the Study of
the Economy and the State. https://www.chicagobooth.edu/-/media/research/stigler/
pdfs/digital-platforms---committee-report---stigler-center.pdf.
Striphas, Ted. 2010. “How to Have Culture in an Algorithmic Age.” https://www.thelateageofprint.org/2010/06/14/how-to-have-culture-in-an-algorithmic-age/.
Teece, David J. 2017. “Dynamic Capabilities and (Digital) Platform Lifecycles.” Advances
in Strategic Management 37: 211–25.
Thelen, Kathleen. 2018. “Regulating Uber: The Politics of the Platform Economy in Europe
and the United States.” Perspectives on Politics 16 (4): 938–53.
Tiwana, Amrit, Benn Konsynski, and Ashley A. Bush. 2010. “Platform Evolution: Coevo-
lution of Platform Architecture, Governance, and Environmental Dynamics.” Infor-
mation Systems Research 21 (4): 675–87.
Trittin-Ulbrich, Hannah, Andreas Georg Scherer, Iain Munro, and Glen Whelan. 2021.
“Exploring the Dark and Unexpected Sides of Digitalization: Toward a Critical
Agenda.” Organization 28 (1): 8–25.
Vaast, Emmanuelle, Elizabeth J. Davidson, and Thomas Mattson. 2013. “Talking about
Technology: The Emergence of a New Actor Category through New Media.” MIS
Quarterly 37 (4): 1069–92.
Valdez, Jimena. 2023. “The Politics of Uber: Infrastructural Power in the United States and
Europe.” Regulation and Governance 17 (1): 177–94.


von Hippel, Eric. 2006. Democratizing Innovation. Cambridge, MA: MIT Press.
Walker, Michael, Peter Fleming, and Marco Berti. 2021. “‘You Can’t Pick Up a Phone and
Talk to Someone’: How Algorithms Function as Biopower in the Gig Economy.”
Organization 28 (1): 26–43.
Wu, Liang, Fred Morstatter, Kathleen M. Carley, and Huan Liu. 2019. “Misinformation in
Social Media: Definition, Manipulation, and Detection.” ACM SIGKDD Explora-
tions Newsletter 21 (2): 80–90.
Ziccardi, Giovanni. 2012. Resistance, Liberation Technology and Human Rights in the
Digital Age. Dordrecht, Netherlands: Springer Science and Business Media.
Zittrain, Jonathan. 2009. “Law and Technology: The End of the Generative Internet.”
Communications of the ACM 52 (1): 18–20.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future
at the New Frontier of Power. London: Profile Books.

...

Mí Há ([email protected], corresponding author) is a postgraduate rese-


archer and PhD candidate at Surrey Business School, University of Surrey. He holds a MA
(analyse politique et économique; Hons) from Paris 13–Sorbonne Paris Cité and a Licentiate
degree (Hons) in economics from Universidad de Buenos Aires. He is interested in society’s
transformation through digitalization, with a focus on strategy and competition in digital
platforms.

I Có is a reader at Bayes Business School (formerly Cass) at City University of
London. She holds an executive MBA and a PhD from ESADE, Ramon Llull University and
a MSc from the College of Europe in Belgium. She is interested in social change in digital
contexts. Her research uses corporate social responsibility, deliberation, and social move-
ment theories to understand social and environmental challenges like climate change, plastic
pollution, and social polarization.

A G is chaired professor in digital economy and director of the Centre of
Digital Economy at the University of Surrey and a visiting professor of strategy and
innovation at Oxford University Saïd Business School. A pioneering scholar of digital
platforms and innovation ecosystems, she is a highly cited author or coauthor of more than
forty articles and four books, including The Business of Platforms: Strategy in the Age of
Digital Competition, Innovation, and Power (2019). Gawer is a digital expert for the UK
Competition and Markets Authorities, and she has advised the European Parliament and the
European Commission on digital platforms regulation as an expert in the EU Observatory of
the Online Platform Economy.

