Introduction: Gender Narratives in the Platform Economy

The increasingly significant role of algorithmic management has a direct impact on the epistemic rights of workers in platformised work arrangements. We recognise today’s algorithmic ecosystems as platforms that are transforming global value chains and restructuring labour markets. As artefacts of social power, platforms not only restructure work but also refashion the space of life. They challenge and chip away at the political consensus over social norms—displacing old rules and establishing new ones. This is the platform economy that knows more, knows better, and knows deeper, on an ever-expanding time-space configuration built on data and digital intelligence. The dominant platform economy, rooted in a neoliberal and neocolonial logic, reproduces ideologies of social power, optimising knowledge to maximise profit. Knowledge based on other premises is not admissible. The epistemic standpoint of workers and their ways of knowing and being are thus invalidated and misrecognised, with little opportunity for participatory and collective knowledge modalities.

Women’s contributions to and experiences in the world of work have historically been minimised. In the developing world, up to 95% of women work in the informal sector, beyond the pale of legislative guarantees (UN Women, 2015). The advent of digital labour platforms and e-commerce supply chains has been seen in policy discussions as presenting an unprecedented opportunity for women in the informal sector, with platformisation often held up as a new pathway for decent work that ‘formalises the informal’ by circumventing traditional feudal patriarchies in the economic sphere (OECD, 2018).

Yet, there is little evidence that platformised work destabilises the status quo. On the contrary, platformisation has meant precarious employment sans any labour protection. Poor women in developing countries are among the most vulnerable, as gender pay gaps and segregation persist in platform work and platform algorithms continue to reproduce or even exacerbate existing structural biases (Rani et al., 2022).

Digital labour platforms perform the crucial task of matching or connecting workers to customers, relying on several data points. This optimisation exercise is a highly layered algorithmic process. This is also true for e-commerce platforms that match consumers to sellers. The algorithm creates the veneer of neutrality through which platforms sustain their image as intermediaries who simply connect prospective service providers to clients or consumers. However, in reality, the algorithmic ecosystem intervenes at several stages of a worker’s journey through the platform, starting from registering on the platform to how transactions materialise, and rewards and punishments are experienced (Waldkirch et al., 2021). In the following sub-sections, I share four stories highlighting how gender impacts digital labourers in diverse conditions.

Damini and the Urban Company

Damini is a migrant from the rural northeast of the country, living in Bengaluru, the Silicon Valley of India. Damini works for the location-based platform—Urban Company, which provides home-based services ranging from cleaning, home installations, repairs, and carpentry to salon services and massages. Urban Company describes the workers providing these services as ‘service professionals’. Its application works by linking these workers to customer leads. Workers need a minimum level of credits to respond to customer leads, and conversely, responding to leads and successfully rendering the service earns workers credit points. A percentage of credits is taken as commission. Damini provides beauty and wellness services through Urban Company. During the COVID-19 lockdown, she returned to her village, but economic necessity compelled her to move back to Bengaluru and resume work.

This is what Damini says about her work:

We do not know how the price is fixed for each customer. This information stays in the app.

We also do not know how the company decides job allocation for different workers and why someone gets more tasks. We just accept what the app sends.

We cannot see customer reviews other workers have given. If we could, we won’t provide services to customers with poor reviews.

So how does Urban Company’s algorithmified workplace function? Workers register by sharing personal details on a web form and wait for a response from Urban Company. There does not seem to be a guarantee of a reply. The online profile of the worker is created by the platform’s management team members and not necessarily by the workers themselves.

Customer leads are allocated to workers by the algorithm, with no transparency. The criteria determining the number of gigs per worker are not known. Workers are only told the service they are to perform and the customer’s address. The commission is operationalised through a system of credit balances: the platform sets the commission rate at a particular number of credits for each gig request, and the worker can accept a client request only by paying the required credits upfront. The underlying calculations are not known to workers.
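The credit mechanics that workers describe can be summarised in a toy model. The sketch below is purely illustrative—the class, function names, and credit amounts are assumptions drawn from workers’ accounts, not from any disclosed specification of Urban Company’s system.

```python
# Illustrative model of a credit-based commission flow, as described by
# workers: commission credits are deducted upfront when a lead is accepted,
# and successfully rendering the service earns credits back. All numbers
# and rules here are hypothetical.

class Worker:
    def __init__(self, credits):
        self.credits = credits

def can_accept(worker, lead_cost):
    """A lead can be responded to only if the worker holds enough credits."""
    return worker.credits >= lead_cost

def accept_lead(worker, lead_cost):
    if not can_accept(worker, lead_cost):
        raise ValueError("insufficient credits to respond to lead")
    worker.credits -= lead_cost  # commission deducted upfront

def complete_service(worker, credits_earned):
    worker.credits += credits_earned  # successful service restores credits

w = Worker(credits=100)
accept_lead(w, lead_cost=30)       # balance falls to 70
complete_service(w, credits_earned=40)  # balance rises to 110
```

The point the model makes visible is that a worker with a depleted balance is locked out of leads altogether, whatever the reason for the depletion.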

Urban Company offers discounts to customers during festivals, and the reduction in prices of services at this time is squeezed from the worker’s earnings. While market capture through discounts enables the platform to expand its footprint, this happens at the expense of workers.

Workers and customers rate each other, but only workers’ ratings are visible to customers and not the other way around. No online rating or other reporting mechanism exists for workers to create a record that their peers can also view. Lower ratings automatically trigger a re-training process irrespective of whether such ratings are justified. A post-facto complaints mechanism for reporting harassment at the workplace does exist, and the platform blocks customers after investigation.

Sushila, Yamuna, and Uber

Yamuna and Sushila both work for Uber. Sushila was running her own business—a clothing boutique—but after the rise of online e-commerce decimated it, she decided to change careers. Now, Sushila works for Uber and Ola—an Indian ride-hailing platform that connects drivers and passengers, similar to Uber. In India, while some drivers may drive exclusively for one platform, others like Sushila drive for several simultaneously, taking advantage of the purported flexibility offered by these platforms. The excerpts below are from interviews with Sushila and Yamuna:

We take up considerable risk to do our duty. Women drivers need safety. There should be CCTV in cars and an emergency call centre that women drivers can reach when they drive at night.

Sometimes, for 13 km, instead of 150 INR, the app shows our earnings as only 100 INR. We raise a complaint and ask why we have been credited less money. They say this may be because of a ‘network problem’, and since the route was not fully recorded on the GIS, the company received less from the passenger. Most of the time, they give us arbitrary reasons and don’t pay us the (balance) money.

So how does Uber’s algorithmified workplace function? Workers go to the Uber office with their identification, driving licence, and vehicle documents. After verification, the company onboards them. There is no clarity on how workers’ personal data will be used. The algorithm automatically matches drivers to customers, but there is no transparency regarding its workings. Drivers speculate that the factors may include ratings or the acceptance/rejection rate.

Drivers cannot choose their destination, which remains unknown until they accept the ride, nor their route, which is determined by the algorithm and GPS. The commission rate is fixed and not sensitive to real economic changes. Over time, as the company cornered a decent market share, driver incentive programs were also reduced or rolled back.

Customers can rate drivers based on several factors, which are not disclosed to the driver. The drivers can see their ratings but are not told how their ratings impact other decisions in the algorithm. Drivers can also rate customers but cannot view a customer’s rating before deciding to accept/reject a ride. Such opacity can become a problem in the event of a dispute. Drivers’ accounts can be suspended or deactivated by the algorithm.

Jayashree and Amazon Mechanical Turk

Amazon Mechanical Turk (AMT) is a crowdsourcing website that lets businesses hire remotely located ‘crowdworkers’ to perform discrete on-demand tasks that computers are currently unable to do. It is operated under Amazon Web Services and is owned by Amazon. Requesters can post tasks for crowdworkers, known as ‘Turkers’, to complete for a small fee. These tasks typically take the form of surveys, image labelling, question answering, and other work that requires no specialised skills from humans but is difficult or impossible for computers. Workers have to register on AMT.

Jayashree is a computer science graduate who has been working on AMT for three years. Despite multiple attempts, she has not been able to procure an account of her own. She uses her cousin’s account and transfers half her monthly earnings to the cousin as ‘rent’. Constrained by the need to care for her elderly mother, she can only work from home.

Jayashree narrates a situation of extreme economic distress since the COVID-19 lockdown, observing that the higher-paying tasks have entirely dried up, severely affecting her income. In 2019, she was making around 20,000 INR per month. Since the pandemic, however, she has earned less than 10,000 INR per month. Jayashree lives in the hope that she can get her own account on AMT. However, she has received no information on why her applications have been repeatedly rejected or on what criteria an account may be approved:

A lot of people have told me that they give jobs to US workers first and only then come to us. Indians get less work; only 20% of the available work is for Indians.

With AMT’s Masters Qualification, you get better work, I was told. But I don’t know on what basis AMT awards this.

This has happened to a lot of people—their work will be rejected, there will be no response, their account will be suspended, and there will be no response to emails.

For AMT, workers register with their name, address, gender, age, and nationality and wait for account approval. Reasons for acceptance/rejection are not clear. There is a grey market in worker accounts, with registered workers renting their accounts for a share in earnings or a hefty one-time price.

The algorithm automatically displays a certain list of tasks to workers. Far from being an open marketplace, the kinds of tasks available for workers to choose from are already predetermined by the AI. Workers have little agency over the performance of a task and must strictly follow instructions from requesters, as even a slight error can lead to the work being rejected, with consequent non-payment.

AMT charges clients a commission estimated at about 20%; workers are charged transaction fees of about 3–4% per transaction. At one point, Amazon workers in India were paid in the form of gift cards instead of direct bank transfers.

Worker nationality affects the chances of landing work; Indian workers cannot bid for all tasks. Ratings and reviews are also crucial. Workers are given an ‘approval rate’ based on the percentage of their submitted tasks that requesters accept. The reasons behind rejections can be found only by contacting requesters, who may or may not respond. The eligibility criteria for the ‘Masters Qualification’ (a rank/score from AMT that enhances workers’ chances of landing a gig) are unclear to workers. Workers live under a constant threat of suspension, with little option to challenge the algorithm’s decisions.
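The ‘approval rate’ workers describe reduces to a simple ratio that then gates access to tasks. The sketch below is a hypothetical illustration—the function names, the clean-slate default, and the threshold values are assumptions for exposition, not AMT’s documented logic.

```python
# Hypothetical illustration of an approval-rate gate, as workers describe
# it: the share of submitted tasks that requesters accept determines which
# tasks a worker may even attempt. All names and thresholds are assumed.

def approval_rate(submitted, approved):
    """Percentage of submitted tasks accepted by requesters."""
    if submitted == 0:
        return 100.0  # assumption: no history is treated as a clean slate
    return 100.0 * approved / submitted

def eligible_for_task(rate, required_rate):
    # Requesters commonly attach a minimum approval rate to their tasks.
    return rate >= required_rate

rate = approval_rate(submitted=200, approved=190)  # 95.0
eligible_for_task(rate, required_rate=95)   # True
eligible_for_task(rate, required_rate=99)   # False
```

Even in this caricature, the asymmetry is clear: a handful of unexplained rejections can push a worker below common thresholds, and the worker has no channel through which to contest the denominator.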

Sakshi, Diya, and the Self-Employed Women’s Association

The Self-Employed Women’s Association (SEWA) is a women’s trade union with a membership of 1.5 million self-employed women workers engaged in India’s informal economy. SEWA’s health team formed a health cooperative—The Lok Swasthya Mandli—that produces and markets herbal (Ayurvedic) medicines. SEWA was approached by Amazon to onboard its cooperatives onto a nationwide program that Amazon runs, called Saheli. This initiative aims to promote locally made products by women entrepreneurs in India. It advertises itself as an enabler of women entrepreneurs, helping them become successful sellers on Amazon. Purported benefits, as per the website, include reduced referral fee, personalised training, account management support, imaging and cataloguing support, increased customer visibility, and marketing support.

The experiences of SEWA with Saheli, as narrated by Sakshi and Diya, demonstrate how little of this promise has materialised:

Products listed on Saheli are not searchable on the Amazon main page. This is a big disadvantage because customers will never go to the Saheli platform to search; they will only search directly on Amazon.

The Amazon team had told us that data analytics such as SEO support and data for marketing would be available as part of the ‘package of services’. But this has not happened.

There is a premium charge for showcasing your products, whereby more people will be able to look at them. Those who avail of highly-priced service packages get priority in product search. We did not. So, if you search, our product appears on the bottom-most line of the shopfront page.

The figures that Diya and Sakshi reported about sales on Amazon are startling: the cooperative sold no product in the last six months and merely one in the preceding year. The Saheli web platform does not give online sellers control over how their business is presented on the Saheli storefront.

Decoupling product searches on Amazon Saheli from the main Amazon platform means that microenterprises are not competing on an equal footing. The algorithm automatically determines how and where (in what priority) stores are displayed on the marketplace. More visibility requires an extra charge—so the algorithm can be purchased to work in your favour. Still, there is no guarantee that it will increase visibility, nor any demonstrated link between visibility and sales.
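The pay-for-priority ranking the sellers describe can be caricatured in a few lines. Everything here is assumed for illustration—the tier values, relevance scores, and the sort rule are sketched from sellers’ accounts, not drawn from Amazon’s actual ranking system.

```python
# Toy model of a paid-priority search ranking, as sellers describe it:
# listings with a higher paid tier surface first, and unpaid listings sink
# to the bottom regardless of relevance. All data here is invented.

def rank_listings(listings):
    """Sort by paid tier first (descending), then by relevance."""
    return sorted(listings, key=lambda l: (-l["paid_tier"], -l["relevance"]))

listings = [
    {"name": "cooperative herbal oil", "paid_tier": 0, "relevance": 0.9},
    {"name": "premium seller A",       "paid_tier": 2, "relevance": 0.4},
    {"name": "premium seller B",       "paid_tier": 1, "relevance": 0.7},
]
ranked = rank_listings(listings)
# The unpaid listing ends up last even though it is the most relevant.
```

Under such a rule, a microenterprise that cannot afford the premium package is structurally invisible, exactly as the sellers report.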

Discussion: The Social Power of Platforms—A Feminist Analysis

Algorithmic management can be described as the deployment of ‘a diverse set of technological tools and techniques that structure the conditions of work, enabling the remote management of workforces’ (Mateescu & Nguyen, 2019). In machine learning environments, algorithmic management is neither bounded nor predictable. The platform economy optimises value extraction not only through endogenous processes that structure the conditions of work in the platform ecosystem but also through exogenous ones in which post-platform relationalities with prevailing economic, political, and cultural domains play a vital role. Algorithms thus derive from and reshape the realms of life tied to work. They wield agency to remake society. Using modalities of signification (meaning-making) and legitimation (rule-setting), platforms deploy algorithms to gain near-totalising social power.

The above narratives of women platform workers demonstrate the material conditions of work produced by an opaque and exploitative algorithmic apparatus. Platforms normalise such opacity through the myths they build about flexibility and independence. To find work on the platform is to trade off the right to know. On none of the platforms surveyed were workers able to access information about the customer, their ratings, or past reviews, or get full information about the nature of the task. While worker data flows freely to the platform and is made available in limited ways to the customer, nothing is accessible to the worker.

Platforms argue that their workers are free to log in and log out as they please, accept or reject tasks that come their way, and have the freedom to go on leave whenever they so desire. While the formal flexibility offered by platforms is a significant motivator for many platform workers, they typically end up with little real choice of when and where to work (Wood et al., 2019). Workers fear cancellations or rejections of tasks, as the algorithm may penalise them with a lower rating. The opacity of algorithmic workings precludes the ability of workers to seek redressal and prevents them from challenging algorithmic decisions. To protect their job and income security, workers must simply resign themselves to the algorithm’s omnipotence, implying a punishing cost to agency.

The exacting control over workers in the endogenous operations of the algorithm thus erases the ability of workers to navigate relationships on the platform, entrenching a punitive regime that leaves them perpetually guessing about potential actions that can undercut their economic bottom lines. In a brutal paradox, the algorithm uses the worker’s own ‘labouring data’ as the very means of disciplinarity, squeezing labour surplus to gain market share. While labouring data fuels the platform’s intelligence rent, work itself mutates into an extractive social paradigm that individualises and disciplines labour power, accumulating invaluable knowledge about differential social locations that can then be exploited differentially.

Algorithmic work life also needs to be situated in relation to the exogenous structures of choice, autonomy, and power that connect worker experiences of the algorithm to society’s wider social and political aspects. Many Indian states have signed partnership agreements with Amazon to feature the products of women producer organisations under the Saheli banner. Amplifying statist discourses of women’s empowerment and corporate propaganda on social responsibility, such collaborations of convenience instrumentalise women workers and their economic initiatives. They divert the conversation from the necessary public investments for women’s economic participation and the much-needed governance of mainstream platform marketplaces. Camouflaging the damning invisibility of women’s businesses in the gamified environments of efficiency-optimising platforms, they instead co-opt society’s economic peripheries into the infinite offerings of corporate data services. The narrative of ‘e-commerce for women’ is Amazon’s easy ticket to future profits. Cloud majors like Amazon Web Services are desperately wooing Indian government officials to corner India’s public sector cloud market (Nishant, 2022).

Women microworkers who work for AMT face a Hobson’s choice. Eager to alleviate the household’s economic insecurity, these women dutifully enter the virtualised workspace of AMT each night—after a long day of household work. Hidden in the grey zone of non-regulation, they navigate an exploitative algorithmic regime, glad for the ‘opportunity’ to be able to ‘respectably’ support their family, and almost entirely unaware of the oppressive geo-economic and geo-political factors that legitimise such exploitation. For these women, the idea of self and autonomy starts from a relative position of subordination in the patriarchal household, one that makes AMT an attractive work-from-home proposition. In the case of the female drivers of Uber, we see households in economic distress mobilise women’s labour through a tryst with modernity that is held up by statist and corporatist epistemic frames celebrating them as empowered entrepreneurs who have broken into the all-male bastion of ride hailing.

Whether it be feminised microwork encouraging women’s seclusion or participation in the man’s world of driving that endorses their mobility, the algorithmic ecosystem derives legitimacy from the many exogenous shades of patriarchal social organisation. It enables platform companies to appropriate the labour of migrant women in on-demand work or educated home-bound women in cross-border labour chains. It allows states to spawn unregulated economies on the backs of undervalued women.

Perhaps, not so surprisingly, gender-based occupational segregation is rife in the platform economy; care work, beauty services, and domestic work are overwhelmingly performed by women, while other, more ‘masculine’ tasks such as driving are dominated by men. Sushila observed that out of 50,000 Uber drivers in Bengaluru, only 100 are women. It is possible to argue that the algorithms only reflect reality. But it is equally true that the algorithm becomes the new structure in which is imprinted the gender division of work that pushes women to undervalued and low-paying tasks.

In the dominant platform economy, algorithmic pursuit propels a global labour arbitrage founded on a collusion of patriarchies characterising nation-states, societies, and businesses—that find legitimacy through global policy. This bears similarities to how women’s cheap labour in South and East Asia spurred the growth of global value chains in apparel and electronics, lifting economically beleaguered nations out of the woods—all the while reinforcing global inequality and corporate impunity. Despite historical similarities, the current trajectories of our algorithmic world are also different, given the changing nature of work itself and the frightening prospect of a new polarised global job market. The digital context will not only leave most women out of the high-skilled, high-end, elite job segments, but also usher in an unprecedented precarity.

This thesis—it needs to be clarified—does not at all assume passivity on the part of women, who are indeed leading social change. Rather, the intention here is to point to how gendered locations place extreme constraints on women’s choices and, therefore, to the fact that both the exogenous conditions of algorithmic life and the endogenous ones of platform-mediated work must be dismantled and rebuilt towards a paradigm of equality founded on meaningful choices.

Conclusion

Across jurisdictions, platforms have pursued a strategy of ‘regulatory entrepreneurship’ (Pollman & Barry, 2017) to lobby for maintaining the status quo that has permitted their exponential growth. A key tactic in this regard is the misclassification of workers, denying them an employment relationship that can form the basis of formal social and labour protection. Perpetuating the myth of the self-employed platform worker has allowed platform companies to proliferate while accepting no responsibility for the welfare of workers. The stories of women we discussed in this chapter point to a near-unbridgeable skew of power between the platform and its workers. They also point to how the platform economy emboldens social structures of oppression that reproduce racialised gender hierarchies by exploiting women, especially from the Global South.

Platforms not only subject women to exploitative conditions of work; by normalising the flexibilisation and individuation of work, the platform economy also appropriates women’s labour in its entirety, for its perpetuation. Female wage labour is central to its logic of accumulation, but so is the unpaid care work that women perform. The precarisation that marks platform labour, built entirely through algorithmified gaming, raises a vital question—how can we move beyond platform models premised on extractivist algorithmic optimisation?

A new algorithmic radicalism needs to inform the principles, norms, policies, and practices of the platform economy, one that is just and inclusive. Firstly, opening up algorithms for political scrutiny is crucial. This can reconfigure the endogenous operations of the algorithm, keeping the platform marketplace trained on values and norms that are continuously monitored for real-world impacts. Two crucial dimensions to realign the platform ecosystem comprise workers’ data rights, including the right to algorithmic accountability and the right to explanation, and gender-responsive algorithmic design to ensure affirmative action in platform workplaces.

Data rights as epistemic rights extend from data collection, use, storage, and sharing processes to workers’ ability to port or transfer their work experience to other job providers, to check the veracity of their data, or even to collectively set up alternative businesses. Design can be transformative. As the women drivers we spoke to reflected, Uber could easily enable a positive gender bias in algorithmic matching. This could be particularly useful when women drivers are out at work late at night. A positive measure reported to us was how clients found to have harassed or abused workers are deplatformed by Urban Company.

Secondly, given that algorithms are intertwined with social structures, endogenous restructuring will not go far unless the digital paradigm is extricated from its current colonising trajectory. Algorithms must be directed towards new social relationalities, thus nurturing the exogenous conditions for radical change. From the provisioning of public marketplaces and compulsory quotas for women producers on e-commerce platforms to legislation for social security nets and workplace health and safety guarantees in platform workplaces, socio-political nudges towards a resignification of algorithmic spaces can transform ideas of labour and value.

But a transformative agenda will require contemplating platform labour along alternative economic logics, where efficiency at scale may need to be sacrificed for other gains. New platform marketplace architectures that redistribute value—such as women-owned and -managed models that are locally embedded and socially, environmentally, and economically sustainable—will need to be explored and legitimated. Public investments to nurture such business models and appropriate governance to rein in Big Tech-based market capture are vital. Policy interventions are also needed to socialise care. Algorithmic serendipity can contribute to the realisation of post-market, collectively organised, platform-based care arrangements on a societal scale, if policy encouragement is provided to such models.

What we are talking about is a shift from the current paradigm, with its regressive institutional and technological structures, to a new one. Supported by institutional transformation and new algorithmic practices and regimes, such a shift would lead to an alternative epistemic paradigm—one that expands women’s agency and choice in the economy and society.

Transforming the algorithmic paradigm needs institutional transformation. The pivotal role of ‘intelligent algorithms’ necessitates the recognition of precarity and impoverishment in the digital economy through political structures of law and policy. Countries are showing a willingness to recognise this. Many governments and courts have taken steps to address platform workers’ rights, including epistemic rights (Aloisi, 2022; European Commission, 2019; ILO, 2021; Gurumurthy et al., 2018; Wood, 2021). There is increasing consensus that algorithmic transparency in ratings and other mechanisms is non-negotiable, as is the access that workers must have to their data on the platform (Wood et al., 2019; Rani & Singh, 2019; Singh & Vipra, 2019; Rani & Furrer, 2021). The ILO is applying itself urgently to new ideas of universal labour rights (ILO, 2022).

A global churn is, however, necessary to redirect the purpose of value creation so that it is tied to a meaningful life for all—something that the scramble for a data-rich planet may not bestow. The stories about the human condition in algorithmified platforms narrated by women tell us that we need to strive towards a new subjecthood for the most marginalised, reconstituting computational pursuit as a political activity guided by a new institutional ethics.