War today is unchanged. It is the physical, bloody use of kinetic energy to conquer space and time: the space and time—the here and now—of the lived body. The aim of this ‘storm of steel’1 is to conquer the atmosphere, to make it fundamentally unliveable for the human being, to eviscerate the body with metal, to burn the body’s nervous system with chemicals or to asphyxiate it with gas. The 2022– Russian War against Ukraine is a war of the killing and the maiming of a mass of humans on a muddy battlefield lined with trenches and strewn with mines. To count bodies as the defining resource in this war is to acknowledge that Russia will win.

But war today is also changed. Beyond the located storm of steel, there is a globally trans-spatial storm of bits: the digital maelstrom of images, videos, messages, comments, propaganda, truths, ideas and claims that constitute today’s global, imploded, informational war. How to begin to quantify or fight or protect the human against that? What Marshall McLuhan (1969) called the swirling ‘world-pool of electronic information movement’ (playing off whirlpool) captures us today through messaging platforms and apps, drowning us now in our own, self-produced and self-publicised content.

This is a war, however, that is split, splintered, fractured, streamed, personalised, shattered and sharded. War’s experience was once shared through the media of the day, from the lithographic presses and metropolitan show business of the 1853–6 Crimean War (Keller 2001, ix) to the compulsive satellite immediacy of the 1991 Gulf War. However, the limited, simulacral, edited and propagandised war reporting in or for a nation that defined the last century was experienced within the mainstream bubble of public and commercial, regulated, broadcast mass-media news. We more or less shared the same consensual reality and its vision of war. All that has changed.

War today exists first in its locale, as the barely knowable reality of the conflict and, simultaneously, as its immediation, in the photographs, videos and snippets that are immediately taken and posted, then commented upon and re-posted, and which come immediately and virally to overlay the space of the conflict. These primary mediations remain linked to the life world, to the space and time of the war and its experience by those in the region. The war is first understood and fought over by those whom it most concerns, and it is fought in the language and culture of the region. The War over Ukraine and the Israel–Hamas War today are first mediated and understood by Ukrainians and Russians, and by Israelis and Arabs. The hybrid messenger service/social network Telegram, for instance, is a new war front, an emergent, personalised, and largely unmoderated and uncensored form of digital war, used by many Russians and Ukrainians. Oversight, such as it is, is sharded to the preferences of the individual who creates a given channel. This creates a challenge for new European regulation (the EU Digital Services Act (DSA) 2023/2024)2 (as well as for Telegram), which demands that platforms combat unlawful content and hate speech through moderation. The challenge lies in the sharded nature of Telegram’s users, who produce individualised, splintered realities, rather than in anything primarily platform- or algorithmically directed. In this way, Telegram bears out Merrin’s (2018, 60) argument that the decentralisation of the organisation of war is not only a military transformation but ‘more fundamentally a global, societal and personal revolution in information technology and information’.

In the 1990s, the USA theorised a new form of ‘network-centric warfare’ (Arquilla & Ronfeldt 1993), a mode of connected, coordinated, real-time military operations able to act fast and rapidly close out larger (less mobile and connected) forces. Today the public have taken up and reworked this concept, transforming network-centric war into an informational war fought over the networked platforms and through their own personal networks. Here, as in real-life war, users move rapidly to check and close out oppositional masses and their competing, threatening claims, narratives and realities.

Yet Telegram is not some disconnected entity. Localised experiences fed through apps such as Signal and Telegram are remediated through other, broader apps and platforms, usually in forms more explicable to global audiences. They are repurposed and re-presented in the English language, to help fight a global info-campaign. To focus, as the EU 2023/2024 Digital Services Act does, primarily on digital content and on platforms’ capacity to regulate and moderate what is seen or not, and what is removable or not, misses the bigger picture. It avoids reflection on the reality of the individuals and States who are at war, and on the fact that the platforms being regulated are inextricably part of what war is, to repeat, part of the global, societal and personal revolution in information technology and information.

The EU’s relegation of platforms such as Telegram in its hierarchy of attention, by virtue of monthly active users, misses the point entirely. It is precisely the emergent and easily exploitable smaller platforms such as Telegram that are the weakest points in connected media ecologies, offering a way in for the content that the EU is attempting to block. The same can be said of platforms such as 4chan. Founded in 2003 and a centre of online memes and trolling, 4chan is part of an anything-goes, libertarian culture, a re-emergence of Web 1.0, yet one into which warfare accelerates. This is insurgent media, a feeder for, as well as a threat to, the more established order that EU legislation is focused on.

What we have called the ‘Military-Social-Media-Complex’ (Merrin & Hoskins 2020) employs global satellites, global cables and the clouds of war, replacing the ‘fog’ of war of combatants with a similar epistemological effect for civilian informational militias. Telegram, as both the wounding and the witnessing app of choice for civilians and combatants in Ukraine, Gaza and Israel, is a media insurgency in plain sight in its granular spectacularity, yes, but at a global scale it prohibits intelligibility and ready (over)sight. Instead, a more human-focused vision is required to offer protection that recognises individuals as participants in sharded war, including a new human right to deliver protection from physical or psychological threat from above (Grief et al. 2018). We see this as a matter of a recalibration of human rights that goes some way to recognising the complexity and scale of the threats arising from the connections between the shards of individual participation in, and exposure on, emergent apps and platforms, and the opaque clouds of war.

To offer some semblance of the scale we write of here, by 25 February 2022, TikTok, well known as the short-video platform of choice for dancing schoolgirls and teenage ‘challenges’, had become one of the key informational battlefields of the 2022– Russia–Ukraine War. Within two weeks, videos tagged with #Ukraine had racked up 18.2 billion views. In November 2021, the BBC congratulated itself for reaching its highest ever global audience. According to the Global Audience Measure (GAM), ‘In 2020/2021, the BBC achieved record figures with an average audience of 489 million adults every week, with a chance of reaching close to 500 million people by 2022’ (BBC 2021). Half a billion. A week after hitting 18 billion, #Ukraine had reached over 24 billion views. Forget half a billion: that one hashtag on TikTok had racked up three times as many views as there are people on the planet.

Many of the videos were recorded on soldiers’ own phones, but many more were from people’s streets and windows, documenting the invasion and shelling. This was the new smartphone, cell-phone-enabled, military and civilian-combatant experience of cell-shock: the traumatised, stupefied, amazed, stunned and wowed need to film what is happening to oneself, to record it and show the world. When you cannot believe your eyes, you turn to camera lenses so you can convince yourself and others. This is the open-source OMG sent out to the world. Its message is not only look at this, look at what is happening, but I cannot believe what I’m seeing myself. When it bursts through into real life, into your own life, the previously mediated experience of war becomes simultaneously unreal enough to film and real enough to kill you.

Ukrainian videos vastly outnumbered Russian ones in the English-language TikTok feed. There was an ease of creation, an ease of sharing, an ease and pleasure of argument in the comments. War becomes an endless feed.3 But TikTok relies on a hyper-aggressive algorithm: the apex predator of algorithms. It feeds you what you seem to want, which is not necessarily what you want. It notes and analyses every nano-interaction with the videos it shows you: how quickly you swipe one away, how long you hesitate and look, and whether you like, favourite, follow or comment. Social media, messaging and video apps are united today in being engagement engines. Their only aim is to provide stickiness that keeps you on them, using them, adding data to them. Hence the attraction of outrageous and extreme posts and content, for the platform does not care whether you are very happy or very angry: both produce engagement. All that matters is that you are very there. So swiping trains the algorithm. If you want more war, you get more war. This is swipe war: with every interaction, your eyes and fingers vote yes or no to war. Where previous conflicts such as Syria found their media home on the long-form, curated, search-and-click, semi-permanent4 YouTube, today the new home of war is the immediated, short-form, impermanent and black-box TikTok.
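How such an engagement engine turns nano-interactions into ‘more war’ can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the signal names, weights and scoring below are our assumptions for exposition, not TikTok’s actual (black-box) system.

```python
# Hypothetical sketch of an engagement-driven feed. The signals and weights
# are illustrative assumptions, not TikTok's actual recommender.
from dataclasses import dataclass, field

@dataclass
class Interaction:
    topic: str              # e.g. "war" or "dance"
    watch_seconds: float    # how long you hesitated before swiping away
    liked: bool = False
    commented: bool = False
    shared: bool = False

@dataclass
class FeedModel:
    # Per-topic affinity accumulated from every nano-interaction.
    affinity: dict = field(default_factory=dict)

    def observe(self, event: Interaction) -> None:
        # Hesitation, likes, comments and shares all count as engagement,
        # whether the feeling behind them is delight or outrage.
        score = (event.watch_seconds
                 + 5.0 * event.liked
                 + 8.0 * event.commented
                 + 10.0 * event.shared)
        self.affinity[event.topic] = self.affinity.get(event.topic, 0.0) + score

    def next_topic(self) -> str:
        # Serve more of whatever has accumulated the most engagement.
        return max(self.affinity, key=self.affinity.get)

feed = FeedModel()
feed.observe(Interaction(topic="war", watch_seconds=42.0, commented=True))
feed.observe(Interaction(topic="dance", watch_seconds=1.5))
print(feed.next_topic())  # -> "war": every hesitation is a vote for more war
```

The point of the sketch is simply that hesitation and outrage count identically as engagement: the model has no concept of whether you wanted the war content, only that you lingered on it.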

There were many memorable videos: the Ukrainian Snake Island defenders announcing ‘Russian warship, go fuck yourself’5; civilians standing in front of invading Russian tanks; the elderly Ukrainian woman telling Russian troops to put sunflower seeds in their pockets so when they die the flowers, a symbol of peace and new life, would grow in the ground; the old man arguing with Russian troops to go home; the Ukrainian farmer asking a broken-down tank if it needed towing back to Russia; and the videos of Zelensky alive in the capital.

But this is unverified and uninspectable war. How to verify your feed, when the platform is seeing a flood of disinformation and misinformation, the digital unsettling of the history of war, mixing into the present images from other wars, training, air shows and movies, and, more than ever before, convincingly relevant and seemingly real images taken from video games? The engines of video game production become engines of social media disinformation. Others take advantage of the chance to add audio to videos, filming scenes to add gunfire to them (always the same gunfire). This may be state disinformation, but it could also be virally motivated: for likes, comments, followers and the promise of TikTok creator fame. What kind of reality of war do these short-form, snapshot videos capture, and which of them are real? What do they mean? Securing provenance is a twentieth-century ideal that fact checkers dream of. Yet it is impossible in this war. The bullet clips here are eclipsed by the film clips of TikTokers wanting to kill their stats.

Except. Except TikTok was not meant as a platform for major events. It is not certain that it does them well. It remains a creator space where the subject’s performative success is the central battle, and its battlefield is small, brief, personalised and creator-centric. Its own content policies, however badly implemented, do ban violent content, so its ability to show us the reality of war is limited. And then there is the fact that it is Chinese owned (by ByteDance, founded by Zhang Yiming in 2012). We know it has taken down sensitive material before, such as the Hong Kong protests and comments about Tiananmen Square and Xinjiang. This is important here because, within a few weeks, the Ukraine War videos declined and then disappeared from our feeds, despite our continued interest and interaction. We do not know whether the videos dried up, whether the algorithm naturally moved on to more apparently engaging material for us, or whether the app or algorithm was made to remove these videos from sight. The images of an invaded nation fighting back successfully against a local superpower may, indeed, have given China pause for thought regarding its own assumed plans for Taiwan.

So this is where we are: a world of filter bubbles, a world of media and personalised realities, a world of algorithmic decision-making, of multi-actor and even AI-produced and faked disinformation, of deluded, incessant misinformation, of exploded realities, and of foam worlds of personally formed and encompassed information. Hence, this is war as split, fractured, streamed, personalised, shattered and sharded. No one now experiences the same war. Most can swipe it away as soon as they swipe to see it. We all experience splinters and shards of war. The idea of splinters may be important here. In Camera Lucida (1993), Roland Barthes, trying to find the secret of photography, claimed images contained a ‘punctum’, a detail which pierces the viewer. The punctum is personal, remaining different for each viewer. Today our military videos serve as splinters, puncturing the vision and experience of the viewer whilst also representing the splintering of the whole. They are shards of war.

But the word shard also has another significant meaning. In online, multi-player video games such as World of Warcraft, ‘sharding’ is a game-design tool used to prevent gamer overcrowding in outdoor areas and improve server performance. At a certain density, the game creates a new ‘shard’, a new copy of that area, to allow users to enjoy it without crowding or lag. This new world appears identical to the old and is seamlessly integrated into their experience. Hence, gamers can ‘live’ the same world, the same game processes, without realising they exist in separate worlds with no possibility of interaction.
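To make the mechanism concrete, here is a minimal sketch of sharding; the zone name, capacity figure and structure are illustrative assumptions, not any game’s actual implementation.

```python
# Conceptual sketch of 'sharding' as a game-design tool: when a zone grows
# too crowded, players are silently placed into a fresh copy of it.
from collections import defaultdict

ZONE_CAPACITY = 100  # assumed maximum number of players per copy of a zone

# zone name -> list of shards; each shard is the list of players inside it
shards: dict[str, list[list[str]]] = defaultdict(list)

def enter_zone(zone: str, player: str) -> int:
    """Place a player in the first shard with room, or spawn a new copy.

    To the player, every shard looks like the same, seamless world."""
    for index, shard in enumerate(shards[zone]):
        if len(shard) < ZONE_CAPACITY:
            shard.append(player)
            return index
    shards[zone].append([player])      # overcrowded: create a new copy
    return len(shards[zone]) - 1

# 201 players who believe they share one forest...
for n in range(201):
    enter_zone("elwynn_forest", f"player_{n}")

# ...in fact inhabit three mutually invisible copies of it.
print(len(shards["elwynn_forest"]))  # -> 3
```

Each player is placed in the first copy with room; once a copy fills, a new one is silently spawned, so players who believe they share one world in fact inhabit separate, mutually invisible ones.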

So, we have sharded war. We exist in our own sharded, filtered, algorithmically and personally fed realities and our own experiences of war. This sharding is radically individualised and personalised, yet we experience the war as if it is the same war experienced by everyone else. But today, even mainstream war reporting has become sharded from the battlefield (the BBC’s Ukraine reports came initially live from Kiev, not live from the towns where Russians were killing civilians or live from the main front lines, just as western journalists’ reports from the Israel–Hamas War came mostly from outside of Gaza, only seeing explosions from afar), just as our own experiences have become sharded from the war’s reality. Even bottom-up, citizen open-source reporting has now stumbled. Its main platform of dissemination, ‘X’ (formerly Twitter), has become an unregulated free-for-all and is increasingly less trusted as an informational source, whilst open-source accounts have proliferated, often produced by biased actors or possibly states, all serving to further muddy—to shard—the realities of conflicts.

Just as in massively multi-player online video games, reality is digitally sharded and gamed and open to our gamification. Citizen-combatant informational war is our new game. Just look at the 2023 Israel–Hamas War on TikTok—there is an explosion of biased, propagandist content, all competing to lie, to dissimulate, to beat other narratives, to win with filters, live streams, memes, disinfo and creator opinions. The Hobbesian concept of ‘the war of all against all’ was, for him, a description of the pure liberty preceding the establishment of society. Today, this is our digital reality. We are engaged in an always-on informational battle against everyone with a different opinion, against not a collective, but a new digital multitude. The most visible war in history is the least shared.

Notes

  1. This is the title of Ernst Jünger’s memoir of the savagery and horror of trench warfare in World War 1.

  2. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en.

  3. See also Hoskins & Shchelin (2022) on the ‘war feed’ as shared on Telegram.

  4. The NGO Syrian Archive challenged the Google-owned YouTube over its deletion of millions of videos and thus the erasure of vital evidence of human rights violations in Syria, https://www.wired.co.uk/article/chemical-weapons-in-syria-youtube-algorithm-delete-video.

  5. https://twitter.com/aletweetsnews/status/1497008826201124870.