1 Introduction

How do we approach issues of privacy and identity with regard to fashion brands’ frequent use of recommender engines and social media analytics? There is considerable academic interest in analysing the challenges to privacy posed by recommender engines in e-commerce (Milano et al. 2020). Personalised recommendation systems pose risks to privacy and to user control of personal data, ranging from the exposure of individual user data to the steering of user interests (Wang et al. 2018). Recent headlines, such as ‘YouTube makes money by keeping users on the site and showing them targeted ads’ (Tufekci 2019) or ‘Instagram algorithm systematically boosts semi-nude pictures’ (Hamilton 2020), fuel the debate on incorporating guidelines and standards to protect user privacy in the design and deployment of consumer profiling (Paraschakis 2018: 35–36). Algorithms are becoming ever more persuasive, adaptive, and seamless in relation to an individual’s preferences, taking advantage of the user’s conscious and unconscious attention (Eyal 2014: 7). This article enumerates some problems we need to consider when discussing the commercial use of predictive analytics by fashion brands, focusing on issues of individual autonomy and identity.

The main contribution of this article is to assess the role of identity and autonomy in the big data age by considering how ‘fashion’ and ‘identity’ are influenced by recommender engines and social media analytics in the fashion domain. Current literature deals with questions of individual autonomy and identity within the algorithmic information structure. The individual is constituted by information based on algorithmic classification, including semblances of individual preferences (Floridi 2011). The connection between personal identity and informational privacy is shown in Agre’s (1997: 7) definition of privacy, which incorporates ‘the freedom from unreasonable constraints on the construction of one’s own identity’ and holds that ‘control over personal information is control over an aspect of the identity one projects to the world’. Both conceptions recognise the ambivalence between the individual’s control over revealing aspects of their identity and shaping their identity on their own terms (Clarke 1994: 78; Agre 1997: 7).Footnote 1 Data protection laws, such as the General Data Protection Regulation, establish a rule-based framework to strengthen individual autonomy and informational self-determination by addressing the information asymmetries caused by big data analytics.Footnote 2 Privacy, on the other hand, is a right that has developed into a positive freedom to protect notions of personal autonomy and development under Article 8 of the European Convention on Human Rights.Footnote 3 Both are concerned with how the individual is situated within a social context and with the external constraints on the expression and development of aspects of identity.

This article takes a different approach to examining questions relating to privacy, identity, and individual autonomy in the big data age, based on the notions of individual perception and self-relationality and their connection to the meaning of fashion identity. Hence, this different view of privacy incorporates the conscious and unconscious associations with the self that are affected by algorithmic decision-making and consumer profiling, focusing on the process of identity construction connected to the practice of ‘dress’ as explored by fashion studies and psychology.Footnote 4 This position is significant as it establishes both the relational nature of privacy to a social environment and the individual’s positionality with regard to controlling aspects that pertain to identity construction. This dynamic construction of privacy thus advances a nuanced perspective on the impact of algorithmic personalisation systems in fashion on individual autonomy and identity by focusing on the unconscious associations of the self.

‘Fashion’ is an inherently emotional field of study, investigating the social and personal aspects of the average consumer’s clothing. There is no universal definition of ‘fashion’ which stipulates a truthful depiction of ‘the wearer’, an exhaustive definition of ‘style’, or a fixed reading of a ‘trend’ (Svendsen and Irons 2006: 21). We can only gauge what ‘fashion’ means based on the social, cultural, and personal relevance of dress for the individual and the perceiver, the fragility of taste, and the seasonality of trends (Landia 2018). We therefore need to identify how algorithmic personalisation services in fashion set out to model human behaviour, and the implications that algorithmic decision-making has for an individual’s process of association in fashion.

The focus here is therefore on algorithmic personalisation in recommender systems and social media analytics in the fashion domain. Fashion recommender systems constitute a hybrid approach combining collaborative and content-based filtering algorithms with deep learning methods to recognise semantic attributes in clothing (Hou et al. 2019). Social media analytics is an area of predictive analytics whereby advancements in natural language processing enable the reading of semantics in language to identify fashion trends (Luce 2019: 29). With advances in machine learning and deep learning for analysing structured and unstructured data, computational models are increasingly equipped to handle larger sets of attributes in data, learn styles more efficiently, and adapt to a consumer’s perception of ‘fashion’ (Halan 2018).
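To make this hybrid approach concrete, the following minimal Python sketch blends a content-based score (cosine similarity between garment attribute vectors, a stand-in for features a deep network would learn) with a collaborative score derived from user-item interaction histories. The items, attribute values, interaction matrix, and blending weight are all hypothetical illustrations, not any production system’s implementation.

```python
import numpy as np

# Toy garment attribute vectors (stand-ins for colour/cut/fabric features
# that a deep network would normally learn from images or text).
items = {
    "red_dress":   np.array([1.0, 0.2, 0.7]),
    "black_dress": np.array([0.1, 0.3, 0.8]),
    "blue_jeans":  np.array([0.0, 0.9, 0.1]),
}

# Toy user-item interaction matrix (rows: users, columns: items above).
interactions = np.array([
    [1, 1, 0],   # user 0 browsed both dresses
    [1, 0, 1],   # user 1 (the target) browsed the red dress and jeans
    [0, 1, 0],   # user 2 browsed only the black dress
])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def hybrid_score(user_idx, item_name, anchor="red_dress", alpha=0.5):
    """Blend content similarity to an anchor item the user interacted
    with and a collaborative score from users with similar histories."""
    content = cosine(items[item_name], items[anchor])
    item_idx = list(items).index(item_name)
    # Collaborative part: weight other users' interactions with this item
    # by how similar their histories are to the target user's history.
    sims, ratings = [], []
    for u in range(len(interactions)):
        if u != user_idx:
            sims.append(cosine(interactions[user_idx], interactions[u]))
            ratings.append(interactions[u][item_idx])
    collab = sum(s * r for s, r in zip(sims, ratings)) / (sum(sims) + 1e-9)
    return alpha * content + (1 - alpha) * collab

print(round(hybrid_score(1, "black_dress"), 3))  # blended relevance score
```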

This paper investigates the capacities of algorithmic personalisation systems in fashion to offer common representations of individual behaviour, persuade individual users, and introduce subjective neutrality into decision-making. First, algorithmic personalisation systems set the parameters for expressing identity in the Infosphere, influencing the contours of self-representation and the communicative function of fashion. Second, fashion recommender systems shape the conditions for the individual’s expression of identity and free choice, which requires a deeper understanding of how algorithmic personalisation systems affect individuals’ unconscious association with fashion. Finally, we need to acknowledge that algorithmic personalisation systems, being based on the computational classification of individual attributes, introduce a new area of subjectivity that influences self-relationality.

2 A theoretical outlook on privacy with regard to the impact of algorithmic personalisation in fashion on autonomy and identity

‘Uniqueness, individuality, constant change and materialistic values are at the centre of our society, and they deeply affect the consumer’s concept of self and his/her own identity formation’

(Niinimäki 2010: 154).

Imagine a straightforward situation where you arrive at a party wearing the same red dress as your friend. Several thoughts may come to your head, such as ‘who looks better in the dress?’, and you might identify similarities and differences regarding the symbolic meaning of clothing, such as the meaning of the dress in relation to the occasion, your friend’s occupation or personality, as well as comparing your appearance with their body shape. This thought process, as a weighing-up of the process of self-representation (i.e., wearing a particular dress to a party) and perception (i.e., how you perceive your friend wearing the same dress in order to evaluate your own appearance), is effectively the dialectic tendency that constructs your ‘fashion identity’. It defines your own sense of identity—your social self in the material world—and your personal self in your inner traits concerning your self-relationality.

Algorithmic personalisation systems in fashion infer so-called ‘perceiver variables’ from the data, which capture the social aspects of fashion (such as providing style recommendations for a particular occasion) and the personal aspects of clothing (such as preferences for certain colours, or the cut and shape that suits your personality). Advancements in natural language processing allow for the large-scale analysis of user sentiment in social media data to identify the social aspects of fashion, such as the ‘red dress’ being a popular choice for a party, while deep learning approaches for analysing user-item interactions in recommender engines identify your preference for a ‘red dress’ that suits your body shape. Algorithms thus pre-empt the social and personal aspects of your ‘fashion identity’: algorithmic personalisation systems in the fashion domain are defining the dialectic tendencies of self-representation and perception.
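By way of illustration, a lexicon-based sketch of the trend-detection side might look as follows: a candidate phrase such as ‘red dress’ is scored by mention frequency and net sentiment across posts. Real systems rely on learned language models rather than word counts; the posts and the sentiment lexicon below are invented for illustration.

```python
# Hypothetical social media posts mentioning a candidate trend phrase.
posts = [
    "love this red dress, perfect for a party",
    "the red dress trend is everywhere and I adore it",
    "not a fan of the red dress look this season",
]

POSITIVE = {"love", "adore", "perfect", "great"}   # toy sentiment lexicon
NEGATIVE = {"hate", "ugly", "not"}

def trend_sentiment(posts, phrase):
    """Return (mention count, net sentiment) for a candidate trend phrase."""
    mentions, score = 0, 0
    for post in posts:
        if phrase in post:
            mentions += 1
            words = set(post.split())
            score += len(words & POSITIVE) - len(words & NEGATIVE)
    return mentions, score

print(trend_sentiment(posts, "red dress"))  # -> (3, 2): frequent, net-positive
```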

Against this background, we need to assess how algorithmic constructions of ‘fashion identity’ affect one’s sense of self, focusing on individual autonomy and privacy. Several recent academic discussions highlight how algorithmic personalisation causes a refined ‘informational choice architecture’ including asymmetries in knowledge between the processed and processor, the creation of ‘filter bubbles’ and ‘echo chambers’ impacting an individual’s autonomy, as well as the control of personal information (Yeung 2017; Flaxman et al. 2016; Bozdag 2013; Mitchell and Bagrow 2020). How do I re-establish my sense of identity within the infrastructure using the tools of privacy to maintain my autonomy in disclosing aspects pertaining to the self? This is the classic question pervading current (human rights) discourse on the right to privacy in the big data age. For instance, we could ask ourselves whether a consent model for the processing of personal data can counter the continuous algorithmic tracking and processing of personal information defining user preferences.Footnote 5

We need to ask ourselves about the extent to which algorithmic personalisation systems in fashion relate to an individual’s perception and individuality regarding identity construction. Three observations, which I will elaborate on in Sects. 2.1, 2.2, and 2.3, highlight that algorithmic personalisation systems are an imperfect semblance of individual behaviour.Footnote 6 The first focuses on predictive and social media analytics, which create individual profiles based on the matching of common preferences and general sentiment.Footnote 7 The second, suggesting that algorithms exhibit common readings of individual behaviour, concerns fashion recommender systems, which discern the relevance of products from user-item interactions and tailor and rank content based on individual attributes. The third observation is that algorithmic personalisation systems are a ‘human construct’, subject to biases reflected in the input data and the output of decisions (Jones-Rooy 2019).

Considering the notion of individual autonomy and identity with regard to algorithmic personalisation in the fashion domain allows us to move away from an understanding of privacy based on the control of personal data and consider the inherent constraints of algorithmic personalisation on identity construction. It allows us to delve into questions of how to maintain an individual’s uniqueness and individuality mirrored in the process and conditions of identity-building. Therefore, considering the algorithmic ‘abstractions of fashion identity’ enables a fundamental re-thinking of privacy that protects an individual’s autonomy to shape algorithmic approximations of the self.

2.1 Algorithmic personalisation in fashion is about editing common preferences and representations of the self

Social media analytics define the parameters of how the social aspects of ‘fashion identity’ are identified. We need to investigate the function of algorithms in shaping the individual’s process of self-presentation, including the communicative function of ‘fashion’, focusing on how social media analytics guide fashion brands’ reading of instincts and trends. The fact that an individual’s social media activity is observed by methods of predictive analytics to inform a brand’s trend forecasting, marketing, and advertising strategies raises concerns regarding individual control of personal data and information, as well as exposure to content (Mitchell and Bagrow 2020). It is the process of content filtering and personalisation for targeted advertising that sets the parameters and conditions for an individual’s expression of ‘fashion identity’, implicating their autonomy and privacy.

Margaret Boden, who writes on the capabilities of artificial intelligence (AI) more generally, highlights the ‘non-objectivity of AI programs’, which enforce rather than deny user subjectivity (Boden 1987: 655). She argues that ‘the point about subjectivity in human beings is that each of us has a mind which gives us an idiosyncratic view of the world’ (Boden 1987: 655). Thus, the purpose of an AI program is not to produce an objective representation or truthful depiction of the world but rather, to adapt to individual intentions, beliefs, and values, making a verifiable judgement (Boden 1987).

The issue with current applications of AI, such as social media analytics, is that algorithms engage with value-laden judgements. Given the inherent limitations of natural language processing models in understanding subjective attributes in (unstructured) data, algorithms set out to identify shared narratives of preferences in style and trends, as well as the individual’s ambivalences towards the social self of fashion identity (i.e., their desire for conformity and differentiation). In this respect, predictive analytics, drawing on the user’s participation on social media and their negotiation of these ambivalences (i.e., developing targeted advertising based on users’ ‘liking’ or ‘following’ trends and on individual profiles of preferences), directly act upon an individual’s subjectivity in expressing aspects of fashion identity.

This issue, setting the parameters of the communicative function of ‘fashion’ and implying a model centred on user subjectivity, either affords or takes away an individual’s privacy to exercise an informed choice in expressing and developing aspects of the social self of fashion identity. Individuals living in so-called ‘echo chambers’ tend to engage with like-minded people or follow individuals who reflect their desires (i.e., those with similar opinions, values, or preferences).Footnote 8 What accelerates calls to protect an individual’s authenticity in the digital world is that our values and beliefs become a source of alienation (Lijster and Celikates 2019: 64–65). Algorithmic filtering can induce so-called ‘filter bubbles’ shaping the negotiation of shared narratives on norms and/or preferences based on the user’s relative exposure to content (Flaxman et al. 2016). Hence, it could be argued that the algorithms’ ubiquitous manifestation of ‘fashion narratives’ could affect an individual’s perception of the social self of fashion identity. In other words, exposure to content summarising values of conformity can shape an individual’s perception in forming their own values, beliefs, and attitudes that define their authenticity. Take the example of a fashion brand that wants to use big data analytics to investigate how people perceive its new jeans collection. What are the boundaries or parameters of the right to privacy regarding the use of individual perception to target a user with ads for a new jeans collection aimed at a politically conservative audience? The current understanding of privacy is well-suited to protect the expressive notion of ‘fashion’, such as regulating the user’s disclosure of personal data based on their informed choice, but less so to regulate algorithmic ‘harms’ to the individual’s ongoing negotiation of the social self of fashion identity in the Infosphere.

Hence, we need to grasp the implications of predictive and social media analytics in fashion for individual privacy, including the conditions for identity-building. There is considerable research on the impact of ‘filter bubbles’ on individual agency and choice, but we need to go further than asserting an individual’s control of personal information or contours of appearance in the digital age (Susser et al. 2019). Predictive analytics in the fashion domain shape not only the deliberative perception of ‘facts’ regarding diverse fashion content but also the means through which we engage in reflective choice for individual sense-making. For instance, how does my constant exposure to jeans shape my relationality and unconscious associations with my own characteristics, such as my body image, my political views, or my desires? Defining the right to privacy according to the conditions for identity-building addresses the frictions that social media analytics in fashion can produce in notions of individuality. Accordingly, it is important to investigate the extent to which emerging communication infrastructures in fashion undermine an individual’s autonomy to make the diverse associations necessary for the inference of knowledge of self regarding their fashion identity.

2.2 Algorithmic personalisation in fashion is about persuasion

Another aspect of algorithmic personalisation systems in fashion is the relationship between user and product attributes in fashion recommender systems. Two aspects of fashion recommender systems allow us to elaborate on the impact of algorithmic decision-making on notions of ‘individuality’: the use of computer vision and a Convolutional Neural Network (CNN) methodology to classify images and other unstructured information, and the interpretation of user-item interactions using a matrix factorisation technique.Footnote 9 In this respect, fashion recommender systems shape the notion of self-relationality through the algorithms’ potential to ‘nudge’ or persuade the user.

The algorithms’ quantitative characterisation of product attributes in fashion recommender engines seeks to personalise the user’s shopping experience within the contours of a brand’s image (Daolio 2018). A CNN methodology enables both the extraction of visual features in product attributes as well as the coordination of fashion items/outfits (Lin et al. 2019; Goncalves and Brochado 2020). It is this process of associating attributes like colour, shape, texture, and style that forms the basis of establishing the link between product and individual attributes, such as occasion, preferences in style, or mood (Guan et al. 2016). Recommender systems can thus shape the contours of algorithmic decision-making to establish a connection between visual appearance and emotional attributes in clothing.
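The following numpy sketch illustrates this pipeline in a highly simplified form: convolutional feature extraction from a garment image, followed by associating the resulting visual features with the nearest ‘style’ prototype. The filters, prototypes, and the tiny ‘image’ are random toy values rather than a trained CNN, and the style names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((6, 6))        # stand-in for a garment photo
filters = rng.random((4, 3, 3))   # four untrained 3x3 convolution kernels

def conv_features(img, kernels):
    """Valid 2D convolution per kernel followed by global max pooling,
    yielding one visual feature per kernel (a crude CNN analogue)."""
    h, w = img.shape
    feats = []
    for k in kernels:
        responses = [(img[i:i + 3, j:j + 3] * k).sum()
                     for i in range(h - 2) for j in range(w - 2)]
        feats.append(max(responses))  # global max pooling
    return np.array(feats)

# Hypothetical style prototypes in the same 4-d visual feature space.
prototypes = {"fit-and-flare": rng.random(4), "minimalist": rng.random(4)}

features = conv_features(image, filters)
best = max(prototypes, key=lambda s: float(features @ prototypes[s]))
print(f"closest style prototype: {best}")
```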

Fashion recommender systems, exploring product attributes within non-linear relationships, apply these findings to match items with individual characteristics. They thus delve into ‘fashion narratives’, such as rules on style, cut, and shape in product attributes, defining the relationship between an individual’s perception and the process of inference of knowledge of self in ‘fashion identity’. Take the example of a dress with floral patterns, which connotes a ‘fit-and-flare style’ suitable for ‘girly girl [customers]’ (Cardoso et al. 2018: 82). An individual interacting with products with these characteristics will conduct the process of inference of self regarding their fashion identity in light of the algorithms’ interpretation of ‘perceiver variables’ (i.e., interpretations of gender or age). How do we determine whether an individual is being ‘nudged’ to buy a certain fashion item, or when the algorithm is being deceptive? The answer depends on whether the right to privacy can secure the conditions for identity-building, providing the space to reflect on the social and personal aspects of fashion with reference to the self.

The second point, reflecting on the recommender systems’ exploration of pre-existing fashion narratives, concerns the algorithms’ interpretation of user-item interactions and its impact on an individual’s unconscious associations within the personal self of ‘fashion identity’. The methodology for analysing user-item interactions can certainly identify correlations within the data, though it cannot causally explain the reliance on certain criteria (Beckwith 2019). Take the example of the Style Check application in Amazon’s discontinued Echo Look, which could prefer ‘all-black’ over grey looks without explaining why black items look better on the user (Chayka 2018). Focusing on the matrix factorisation technique in recommender systems, we can assume that the computational model represents products and users in a shared latent factor space inferred from rating patterns (Koren et al. 2009). The method allows preferences to be inferred from implicit feedback, such as browsing behaviour (Koren et al. 2009). These so-called ‘data trails’ (Mittelstadt 2017: 476) can either enhance or disturb an individual’s autonomous judgements. In other words, algorithms can either personalise the user’s shopping experience, giving them the tools to manage their appearance according to their preferences, or undermine their capacity to make a verifiable judgement regarding their ‘fashion identity’.
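A minimal sketch of the matrix factorisation idea (Koren et al. 2009) may clarify the point: users and items are embedded in a shared latent factor space, learned here from implicit feedback in the form of a binary browsing matrix. The matrix, latent dimensionality, learning rate, and regularisation are illustrative values only.

```python
import numpy as np

rng = np.random.default_rng(1)
# 1 = user browsed the item, 0 = no observed interaction (implicit feedback).
browsed = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 0, 1, 1],
], dtype=float)

n_users, n_items = browsed.shape
k = 2                                            # latent dimensionality
U = rng.normal(scale=0.1, size=(n_users, k))     # user latent factors
V = rng.normal(scale=0.1, size=(n_items, k))     # item latent factors

for _ in range(500):                             # plain gradient descent
    err = browsed - U @ V.T                      # reconstruction error
    U += 0.05 * (err @ V - 0.01 * U)             # step with L2 penalty
    V += 0.05 * (err.T @ U - 0.01 * V)

# Predicted affinities, including for items the user never browsed --
# the inferences drawn from 'data trails' discussed above.
print(np.round(U @ V.T, 2))
```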

Indeed, commentators are often concerned about the impact of inferential analytics on an individual’s control of their data, underlining the individual’s passivity in their exposure to the non-transparent readings of algorithms (Wachter and Mittelstadt 2019). I would like to take this argument further and suggest that not only does the lack of control over the (non-transparent) process of inference raise privacy (and data protection) issues, but the algorithms’ lack of causality also influences the process of unconscious thought. Take, for example, a fashion recommender system that infers from the individual’s browsing and typing behaviour that they have always wanted a particular body shape or an ‘hour-glass’ figure. This is not solely an issue pertaining to the legal use of personal data; it invites us to think deeply about the role of privacy in the formation of new values, which requires space to make the associations that contribute to our own well-being, self-scrutiny, and personal development. We need to attend to this aspect of self-relationality that allows us to think freely. In this respect, we need to ask what role the right to privacy plays in securing our own values, considering the algorithms’ scrutiny of the personal self of fashion identity.

It follows that algorithmic personalisation systems are about persuasion, which entails identifying the inter-relationship between ‘fashion’ and ‘identity’ based on the algorithms’ modelling of user responsiveness to fashion products. Fashion recommender systems can have a significant impact on how user perceptions are formed, based on the presentation of information and the re-structuring of options according to the user’s preference structure. For example, a recent paper by Karl Hajjar, Julia Lasserre, Alex Zhao et al. develops a deep learning predictive sizing model, which is argued to prevent a negative body experience by recommending products that suit the customer’s size and shape (Hajjar et al. 2021: 77; Corona 2020). Nevertheless, fashion recommender systems constantly adjust to changes in user behaviour based on a set of properties and factors that influence an individual’s daily clothing decisions. These properties, or ‘fashion narratives’ on ‘clothing’, rest on the algorithms’ interventions in the user’s conscious associations with ‘fashion’. In this respect, an important aspect of investigating the impact of fashion recommenders on the right to privacy is to elaborate on the nuances of persuasion in an individual’s impression formation, considering the suggestions on the nature of privacy noted above.
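As a deliberately simplified stand-in for such a sizing model (Hajjar et al. use a deep learning architecture, which this sketch does not reproduce), consider a least-squares regression from body measurements to the size a customer previously kept. All measurements and size indices below are invented.

```python
import numpy as np

# (height in cm, weight in kg) -> size index the customer kept (0 = S, ...).
measurements = np.array([
    [158, 52], [165, 60], [172, 68], [180, 80], [185, 90],
], dtype=float)
kept_size = np.array([0, 1, 1, 2, 3], dtype=float)

# Fit size ~ a*height + b*weight + c by ordinary least squares.
A = np.column_stack([measurements, np.ones(len(measurements))])
coef, *_ = np.linalg.lstsq(A, kept_size, rcond=None)

def recommend_size(height, weight):
    """Round the regression estimate to the nearest stocked size index."""
    est = np.array([height, weight, 1.0]) @ coef
    return int(np.clip(round(est), 0, 3))

print(recommend_size(168, 63))  # hypothetical new customer, likely size 1
```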

2.3 Algorithmic personalisation in fashion is about limited options and subjective neutrality and bias

The final aspect of algorithmic personalisation in fashion pertaining to an individual’s perception and self-relationality is the boundary between inevitable and unacceptable algorithmic bias. Algorithmic bias is a consequence of the programmer’s subjectivity and/or the outcome of algorithmic modelling, which can be reflected in the target variables, the training data, and/or the feature selection of proxies (Barocas 2016: 680–691). In addition, we witness the incorporation of algorithmic decision-making justified by efficiency and statistical objectivity (Rieder 2016). This subjective neutrality in algorithmic systems risks reducing the individual’s presence and sense-making of ‘fashion identity’ to contours resembling their attributes. In this respect, we need to look deeper into the meaning of privacy for securing one’s reflective choice against the risks of differentiation from people with a semblance of similar attributes.Footnote 10

Algorithmic personalisation operates according to patterns and correlations in data, creating unstated assumptions that are based on the statistical probability of someone purchasing a certain fashion product. Accordingly, the very purpose of an algorithmic system is to differentiate between individuals, interpreting user profiles containing a number of features, which are compared to many other parameters from other users (Amoore and Woznicki 2018). The logic of differentiating between entities is clear: it enables more targeted decision-making. A fashion recommender system will suggest fashion items based on the individual’s profile, such as their current geographical location or financial status. The task of differentiating between entities is an important aspect of algorithmic personalisation and predictive analytics, allowing fashion brands to tailor recommendations relevant to the consumer. Take the example of a predictive sizing application that needs to reflect an individual’s unique attributes and preferences of fit (i.e., height, body shape, weight, size) for accurate decision-making.

Whilst these individual attributes may not directly correlate with any protected characteristics under discrimination law, such as race, gender, or age, an algorithm may infer information that is sensitiveFootnote 11 or which reinforces a particular prejudice against individuals with specific characteristics.Footnote 12 The main issue is not only that recommender engines incorporate human-made biases but also that their data is only an approximation of real-life events (Jones-Rooy 2019). Once we acknowledge this operational substance of algorithms, it becomes clear that we cannot deal with algorithmic bias exclusively as a matter of ‘fairness metrics’ but need a better grasp of the underlying role of the right to privacy in regulating emerging trends in ‘subjective neutrality’ within algorithmic decision-making.

In this respect, fashion recommender systems could raise several issues regarding an individual’s perception and self-relationality, as they are based on factual readings of an individual’s attributes, and they need to be scrutinised in terms of the right to privacy in identity construction. Take the example of a subscription-based service processing the user request ‘I need something to wear to a casual, outdoor, wedding’. Suppose each clothing style has several attributes (i.e., style, season, wearing occasion) which will be matched with the target client to infer their preference (i.e., what they will most likely end up buying). Nevertheless, a subscription-based service is more than the mere categorisation and matching of attributes with the individual; it is a process that allows the user to ‘make up’ identities, such as by consciously giving feedback on size and fit or providing instructions regarding the wearing occasion (Webber 2019). Our own involvement allows us to receive more ‘accurate classifications’ that recommend an outfit we will most likely keep in our wardrobes. The key, however, is that the more user involvement there is in the recommendation process, the more the algorithm has to deal with latent and unstated features, which need to be inferred from other structured or unstructured data (i.e., interpreting text or visual data). Fashion recommender systems, dealing with multi-dimensional features of clothing and of the perception of clothing (e.g., a medium size could correspond to a larger or smaller fit depending on the user’s body shape and personal preferences), place an individual’s conscious choices within the categories one seeks to identify with. It is this association of attributes to clarify latent features that defines the parameters of social exclusion and inclusion.
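The attribute-matching step in this example can be sketched as follows: each garment carries categorical tags (style, season, occasion), the request is parsed into the same tag space, and explicit feedback nudges a learned per-style preference bonus, standing in for the latent, unstated features discussed above. The catalogue, tags, and feedback rule are hypothetical.

```python
# Hypothetical catalogue with simple categorical tags per garment.
catalogue = {
    "linen_midi_dress": {"style": "casual", "season": "summer",
                         "occasion": "wedding"},
    "sequin_gown":      {"style": "formal", "season": "winter",
                         "occasion": "wedding"},
    "denim_jacket":     {"style": "casual", "season": "spring",
                         "occasion": "everyday"},
}

def match(request_tags, learned_prefs=None):
    """Score items by tag overlap with the request, plus a learned bonus
    for styles the user kept before (latent preferences never stated)."""
    learned_prefs = learned_prefs or {}
    scores = {}
    for item, tags in catalogue.items():
        overlap = sum(tags.get(k) == v for k, v in request_tags.items())
        scores[item] = overlap + learned_prefs.get(tags["style"], 0.0)
    return max(scores, key=scores.get)

# 'I need something to wear to a casual, outdoor, wedding' parsed to tags:
request = {"style": "casual", "occasion": "wedding"}
print(match(request))                   # -> linen_midi_dress
print(match(request, {"formal": 2.0}))  # prior feedback shifts the ranking
```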

Thus, we need to identify the extent to which algorithmic categorisation shapes individual perception, including the way we experience identity. As Katja de Vries accurately states, algorithms shape our sense of self within our own assigned social categories (i.e., my perception of lifestyle, health, well-being, and location as an ‘illusion’ regarding the algorithm’s dynamic categorisation of my social status) (de Vries 2010: 51; Milano et al. 2020: 962). But it is not only the algorithms’ categorisation of individual behaviour into social categories that encroaches on individual agency and choice but also the de-contextualisation of an individual’s attributes from their everyday experience of identity. For instance, a subscription-based service may infer my clothing preferences in light of my behavioural profiles on style, physical features, and budget, based on the correlation of attributes and group similarities rather than on my interpretation of the ‘perceiver variables’ of the social self of fashion identity. Thus, algorithms direct me towards the limited options to which I have assigned myself consciously (i.e., explicit feedback) and sub-consciously (i.e., implicit feedback that is detached from my subjective experience of self). In this respect, privacy, as an enabler of social interaction, requires us to strike a delicate balance between an individual’s perspective on identity regarding aspects of identification (i.e., the accurate description of my subjective sense of self) and the structural properties within the system of perception of identity (i.e., the ‘perceiver variables’ defining my interpretation of identity). What is the role of the right to privacy in setting the parameters regarding the impact of algorithms on social exclusion and inclusion?Footnote 13 This is an important question, requiring the implementation of safeguards (and values) in the design of algorithmic personalisation systems before their deployment, to mitigate risks of unfair treatment.

In light of these considerations, we need to acknowledge that algorithmic categorisations introduce a new area of subjectivity. The problem with algorithmic categorisations and bias is that their operations result in a complex configuration of multi-dimensional and substantive relationships between attributes. Algorithms are designed to engage in a process of ‘task-centric abstraction’, which entails the classification of a problem within one social setting (Selbst et al. 2019). Let us suppose that a fashion recommender system, using a neural network to read visual data, establishes relationships for recommendations targeted at ‘Muslim women’. The algorithms’ implied normativity in detecting the social and cultural aspects of ‘clothing’ might lead to some accurate suggestions (i.e., identifying an individual’s demographics and race) but it will not capture the variety of ‘identity’ within social-cultural contexts (i.e., an individual’s identification with ‘Muslim culture’ or their perception of ‘gender’, ‘age’, or ‘aesthetics’ in their social-cultural context).Footnote 14 How do algorithmic categorisations define my self-relationality to my own attributes, and how does privacy secure the conditions for the exercise of these attributes (e.g., religion, traits of behaviour)?

2.4 Fashion identity and an abstraction of self? A conceptual perspective on the right to privacy affected by algorithmic personalisation systems in fashion

The discussion so far has established the bedrock for investigating the challenges to privacy posed by algorithmic personalisation systems in fashion, focusing on the individual’s perception and self-relationality in fashion identity. From social media analytics to fashion recommender systems, algorithmic personalisation systems delve into the process of communicating and developing aspects of identity. These considerations are significant, suggesting that privacy cannot be associated only with control over aspects of identity but needs to go further, towards a different conception of securing autonomy and the conscious and unconscious associations of the self.

In other words, it is important to note that the limitations of AI techniques in analysing user sentiment and individuals’ explicit and implicit preferences illustrate the conceptual boundaries leading to an abstraction of the self in relation to one’s fashion identity. Algorithms in the fashion domain entail a form of knowledge that resembles aspects of identity but does not encompass the (subjective) experience of identity, such as one’s own relative perception of appearance applied to one’s own style and/or body shape. What happens is that we expand our knowledge of self (including the conscious and unconscious expression of perception and self-relationality) based on the algorithms’ process of associating personal attributes with fashion narratives. This process undermines an individual’s autonomy to define abstract entities, including fashion narratives, and to determine how these ubiquitous manifestations shape one’s view of one’s own qualities of the self.

Where do these considerations leave us regarding the role of the right to privacy in securing the contours of identity-building? To suggest a different conception of autonomy, we need to unpack a very important limitation of privacy. It is the narrow understanding of personal identity as a form of knowledge reproduction in algorithmic systems that calls into question the conception of privacy as a form of control over aspects of the self. The current theoretical conception of the right to privacy, as well as academic discourse on Agre’s definition of privacy (1997: 7), supports a direct propositional formula to secure the individual’s autonomy and identity in a social environment and against the readings of algorithms (Edwards and Veale 2017: 73; Eskens 2019: 172; Hildebrandt 2015: 102–103). However, identity is not always representational of social interaction but retains an essence beyond the observed individual state, that is, individual perception and self-relationality. Thus, we need an understanding of privacy that protects against the inherent reduction of fashion identity to literal attributes (such as fashion narratives on ‘gender’ or ‘casual style’) and considers an individual’s autonomy to shape the algorithmic approximations of the self. This analysis suggests that, whatever our expectations of algorithmic personalisation to predict individual preferences, we should not make the error of reducing the discourse on privacy and autonomy to the algorithms’ inherent reductions of fashion identity. Thus, the discourse on challenges to privacy regarding algorithmic personalisation systems needs to address the bigger picture: the meaning of individual autonomy in maintaining perception and self-relationality within a constrained spectrum of possibilities.

3 Conclusion

Algorithmic personalisation in fashion does not entail the assessment of an individual’s fashion identity in terms of what it is, but rather in terms of which personal qualities provide relevant data for the algorithms’ knowledge construction. This paper has focused on the limitations of some advancements in AI techniques in the fashion domain in order to delineate the privacy-related challenges posed by social media analytics and recommender engines to autonomy and identity. The main suggestion is a theoretical understanding of privacy which considers an individual’s perception and self-relationality, and which goes beyond the individual’s control over aspects of personal identity in the Infosphere.

I have investigated this claim by taking three perspectives on the implications of algorithmic personalisation systems in fashion. Section 2.1 focused on the way algorithmic personalisation systems shape communication structures and could affect an individual’s autonomy to make the diverse associations necessary for the inference of knowledge of self regarding their fashion identity. A second perspective was investigated in Sect. 2.2, which argues that we need to elaborate on the nuances of persuasion in an individual’s impression formation considering the right to privacy. Finally, Sect. 2.3 illustrated the need to investigate the extent to which algorithmic categorisation shapes individual perception, including the way we experience identity, considering the complex configuration of substantive relationships of personal attributes in fashion recommender systems that undermine individual autonomy. The aim of this paper is not to offer a comprehensive account of the challenges to privacy posed by predictive analytics, nor a holistic solution to the meaning of identity in the big data age. Rather, it envisages a theoretical outlook on how to address the problems surrounding the individual interacting with algorithmic personalisation systems in fashion.

To conclude, we need to focus on the algorithms’ process of abstraction of the self to establish the contours of individual autonomy in the big data age. This suggests that, contrary to the assumption that an individual needs a ‘right how to be read’ (Edwards and Veale 2017: 73; Eskens 2019: 172; see also Hildebrandt 2015: 102–103), we need an understanding of autonomy that allows for a ‘right to not be reduced’ to algorithmic abstractions that are not comprehensible in terms of an individual’s fashion identity. This conception of privacy allows us to think about autonomy and identity as a form of protecting the individual’s process of inferring knowledge of the self, rather than the individual’s narrow control of the algorithms’ knowledge production. Indeed, further research is needed, and this investigation, which aims to establish a more nuanced conception of autonomy, is therefore ongoing.