1 The Urban Internet of Things

As Cellary (2013) notes, there is no common consensus about what “smart” really means in the context of information and communications technology (ICT). Although the term has become fashionable, it is also broadly used as a synonym for almost anything considered to be modern and intelligent (Anthopoulos 2017). In an urban context, Batty and others note that the term smart cities corresponds with the rapid spread of computation into the kinds of public and open environments that others, from Hardin (1968) to McCullough (2013), have called the commons, meaning the spaces in the city that are notionally set aside for collective use and exploitation by the community. While the term smart has many competing definitions and public perceptions associated with it, we consider a focus on sensing and computation in public spaces to be its defining characteristic. In this way, we relate the aspirations behind smart technologies to self-monitoring, analysis, and reporting technology (SMART), a term adapted from computer hard disks, where it describes a way for drives to monitor their own health and performance internally. In disk drives, SMART allows users to run self-tests and to monitor a number of performance and reliability attributes, and it seems a usefully close analogy. The ability to self-monitor, analyze, and report performance and reliability measures is, we argue, a closer definition of the smart city, especially when focusing on aspects of sensing the environment, communication, modeling, and analysis based on data feeds from the urban context.

Covering urban areas in general and at multiple scales, from the city as a whole down to the microscale of footfall at a given point in space and time, the potential for urban data collection is almost infinite and certainly satisfies accepted criteria for data to become big. In 2013, Ebbers, Abdel-Gayed, Budhi et al. stated that there are four main aspects of big data: data generated at a fast rate (velocity), very large and potentially unknown data quantities (volume), accuracy of the data (veracity), and different forms of data, such as text and structured data (variety). Tennant et al. (2017) build upon this, noting that other aspects of big data have been added over the years, for example, volatility, referring to the length of validity of the data, which is particularly relevant when referring to real-time data streams; and value, referring to the potential insights that can be derived by analyzing the data. Velocity, variety, volume, and veracity of data, interlinked with volatility and value, are central to the use of data within an urban context. This cuts across a broad spectrum of applications, but especially applications consuming, analyzing, and visualizing data in an urban context from Internet of Things devices—or an Urban Internet of Things. Coulton et al. (2019) state that the term Internet of Things (IoT) was coined by Kevin Ashton in the late 1990s. Ashton explained how, by using sensors to gather data that could be shared across a company’s computer network, a business could streamline its supply chain. He called these data-enabled parts of the supply chain the Internet of Things, and the phrase caught on.
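As a concrete, if simplified, illustration of the six Vs, the sketch below records them as metadata alongside a hypothetical urban data stream. All names and values are invented for demonstration rather than drawn from any real deployment.

```python
from dataclasses import dataclass

@dataclass
class UrbanStreamProfile:
    """Illustrative six-V profile for an urban IoT data stream.

    Field names and example values are hypothetical; a real profile would be
    derived from stream metadata and observed behavior.
    """
    name: str
    velocity_hz: float        # typical reading rate (velocity)
    volume_mb_per_day: float  # expected daily data volume (volume)
    veracity: float           # estimated fraction of readings passing QC (veracity)
    variety: str              # form of the data, e.g. "numeric", "text", "image" (variety)
    volatility_s: float       # how long a reading stays valid, in seconds (volatility)
    value: str                # the insight the stream is meant to support (value)

# Example: a hypothetical footfall counter at a park entrance
footfall = UrbanStreamProfile(
    name="footfall_gate_3",
    velocity_hz=1.0,             # one count per second
    volume_mb_per_day=0.7,
    veracity=0.95,
    variety="numeric",
    volatility_s=60.0,           # counts older than a minute are stale for live use
    value="crowd management and event planning",
)
print(footfall)
```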

The potential of the concept is immense, as it is linked to the automation of data collection en masse. As Ashton (2009) notes, if we had computers that knew everything there was to know about things—using data they gathered without any help from us—we would be able to track and count everything, and greatly reduce waste, loss, and cost. We would know when things needed replacing, repairing, or recalling, and whether they were fresh or past their best. Linking this to cities, Batty and Hudson-Smith (2007), in their often-cited paper, called this the computable city, stating that by the year 2050 everything around us would be some form of computer. In essence, they were predicting an Urban Internet of Things.

Building on this, the Mayor of London published the “Smarter London Together” roadmap in 2018. The roadmap, which is a non-statutory document, builds on the first Smart London Plan, published by the Greater London Authority (GLA) in 2013. It provides a new approach based on collaborative missions and calls for the city’s 33 local authorities and various public services to work and collaborate better with the aid of data and digital technologies (GLA 2019). As part of this work, the city has developed a number of test beds, allowing the exploration of research-led deployments. One such location is the Queen Elizabeth Olympic Park (QEOP), to the east of the City of London and an area we will focus on to explore actual examples of Urban IoT. As the GLA note, the park’s development is managed by the London Legacy Development Corporation (LLDC). Its ambition is to use the park as a test bed for new international standards in smart data, sustainability, and community building, sharing its successes across the city and beyond. This initiative has allowed the authors of this chapter to deploy a number of IoT-led initiatives within the park. Over the following sections, we explore these deployments while also considering the wider picture and the current realities of Urban IoT, both in the context of our definition of smart—self-monitoring, analysis, and reporting technologies—and through what we define as the essential six Vs of Urban IoT: velocity, volume, veracity, variety, volatility, and value.

The Internet of Things is central to the collection of potentially all the types of data that are required to understand and manage an urban system. Link this to knowing the location of each device and you have the potential for a real-time view of a city, or a representation of the city in software, also known as a digital twin. As such, the development of a digital twin was chosen as one of the deployments for examination in QEOP.

2 The Digital Twin

Originally developed in the context of industrial design and manufacture during the early 2000s, the term digital twin was proposed as a means of monitoring the performance of industrial products with the aid of digital replicas. The digital twin would be connected to its physical counterpart, an aircraft engine for example, in such a way that any relevant changes in the state of the latter would be automatically sensed and registered (Grieves and Vickers 2017). In this way, the performance of complex and dynamic objects like aircraft engines, or even entire aircraft, could be modeled, monitored, and optimized throughout the entire industrial lifecycle, from design, through daily operation, and on to their eventual decommissioning and disposal. Each component could have its own digital twin, effectively giving us a nested hierarchy of digital twins all the way down to the most fundamental components.

New applications for digital twins are now being sought in other fields. At the urban scale, the digital twin is finding more immediate application in the convergence of IoT and building information modeling (BIM) (Deutsch 2017). A BIM model is a digital model of a building in which the 3D geometric properties of the structure have been enhanced with quantitative values and semantic descriptions of the particular building components being represented (see Chap. 34). In principle, all of its components can be modeled, down to the smallest nut or bolt, in the same way as the original aircraft concept, to include information about their manufacture, appearance, physical properties, date of purchase or installation, and cost. The last two facilitate the additional time (4D) and cost (5D) dimensions used for scheduling BIM-based construction. Using open standards like the Industry Foundation Classes (IFC), BIM models can be federated to enable multiple stakeholders to collaborate by reviewing and updating a BIM during the building’s design and construction. At the same time, BIM is perhaps not an “obligatory point of passage” for a digital twin, as some in the BIM industry might wish to suggest (cf. Law and Callon 1994).

While BIM provides an efficient means of constructing the 3D representations required for a digital twin of new builds, the models can quickly become static and outdated once they have been handed over to building owners. However, with the addition of embedded sensors and Internet-based connectivity, it is possible to continue monitoring aspects of the building’s physical and environmental conditions in real time. In this way, IoT provides the potential for sensing, connectivity, and feedback through actuation, which serve to animate the building’s digital twin and bring it to life by establishing its link to the physical counterpart. Figure 38.1 illustrates Here East (a building in QEOP), which was modeled in three dimensions and equipped with environmental IoT sensors to create a simple twin model. The model updates in real time, providing the twin aspect linked to the three-dimensional representation of the built form.

Fig. 38.1

A digital twin with IoT sensors of the Here East building at the Queen Elizabeth Olympic Park, London
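As a minimal illustration of the twin pattern described above, the sketch below keeps the latest reading from each sensor as the live state of a building model. It is deliberately simplified: the sensor names and values are invented, and in a real deployment the readings would arrive over a protocol such as MQTT and drive the 3D visualization.

```python
import time
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class SensorReading:
    sensor_id: str
    quantity: str      # e.g. "temperature_c", "co2_ppm"
    value: float
    timestamp: float

@dataclass
class BuildingTwin:
    """Minimal in-memory twin: the latest reading per sensor, keyed by ID."""
    name: str
    state: Dict[str, SensorReading] = field(default_factory=dict)

    def update(self, reading: SensorReading) -> None:
        # Overwrite with the most recent value for that sensor.
        self.state[reading.sensor_id] = reading

    def snapshot(self) -> Dict[str, float]:
        # Flatten the live state for display alongside the 3D model.
        return {f"{r.quantity}@{sid}": r.value for sid, r in self.state.items()}

twin = BuildingTwin(name="illustrative building twin")
twin.update(SensorReading("env-01", "temperature_c", 21.4, time.time()))
twin.update(SensorReading("env-02", "co2_ppm", 540.0, time.time()))
print(twin.snapshot())
```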

Even social aspects of the building’s everyday life can be incorporated for a more holistic, responsive, and participatory approach to building management and operation (Dawkins et al. 2018). It is this broad spectrum of connectivity across multiple aspects of IoT, from environmental sensor data through to occupancy information and across to social network information, that provides the real key to a digital twin.

Here, we find ourselves in the realm of connected environments. As Hudson-Smith et al. (2019) define them, a connected environment is any place—a home, a building, a street, a park—where sensors have been deployed and connected via the Internet. Collecting data through these sensors allows the data to be analyzed, checked for quality control, joined up with other data sets, and used to enhance the area, be it for management, social, environmental, or economic reasons. It is through the capture, processing, and analysis of longitudinal real-time operational data, increasingly performed in the Cloud, that the further possibilities for simulation and for more exploratory and predictive use of a digital twin can be achieved. In this way, digital twins bestow on their users some of the powers of more enchanted objects like the crystal ball, insofar as they provide a digital means to see distant places and to look into the past and future (Rose 2014). More prosaically, by representing the digital twin as a 3D model, and moving away from the use of abstract plots and graphs, the digital twin becomes more accessible to the public and more relatable to a specific place. The digital twin is a new kind of enchanted object: a digital representation of the physical world that, with the addition of data collected from anything from building systems through to social and environmental feeds, gives each individual a kind of omniscience that can help one understand and act on one’s environment.

Just as digital twins are the sum of their components, we can also aggregate them to create connected assemblages at coarser scales. The digital twin at the urban scale is still an emerging concept. Some imagine an urban digital twin as a swarm of connected systems collaborating autonomously to intelligently manage energy, traffic, utilities, roads, and communication networks (Datta 2016). The digital twin can be viewed as a mirror held up to this world, one that reflects not only the environment as we ordinarily see it, but also the unseen or invisible patterns of phenomena that find themselves encoded in flows of sensor data. With mirror worlds, as conceived by computer scientist David Gelernter in the early nineties, “the whole city shows up on your screen, in a single dense, live, pulsing, swarming, moving, changing picture.” This vision is currently being realized through the development of interactive virtual city models like Cityzenith, VU.CITY, Virtual Singapore, and CASA’s own Virtual London (ViLo).

Commonly viewed on a computer screen, tablet, or mobile phone, these tools and the data they orchestrate are also being opened up to new forms of interaction by increasingly immersive virtual-, augmented-, and mixed-reality devices. While virtual-reality systems enable us to visit other places and times and immerse ourselves within those environments, augmented and mixed realities can bring that informational content to us by overlaying it on the everyday environment (from room to building to street, neighborhood, and city). At different scales, data and reality can be mixed, viewed, and shared. Such mirror worlds often engage new contexts and audiences while also providing new opportunities for learning and for the exercise of personal and collective agency in the urban environment (Dawkins 2017). Digital twins can be used to view a variety of information in a multitude of ways. The ViLo model (Figure 38.2) allows viewing via a traditional computer desktop as well as via virtual reality, augmented reality, and mixed reality, all with real-time, geo-located data. Given the pace of technology, the creation of digital twins is inevitable, allowing the digitalization of our world and thus opening up the opportunity for new insights into the physical world. Indeed, in the recent report “Data for the public good,” the UK’s National Infrastructure Commission (NIC 2018) proposes the creation of a digital twin to unify the management of data concerning transport, rail, power, water, and communications infrastructures alongside meteorology and demographics across the whole of the UK.

Fig. 38.2

QEOP in the “ViLo” model, providing real-time IoT data within a 3D environment

3 Potential Versus Reality

The potential of the Urban Internet of Things is such that it could be viewed as a new data revolution, advancing our understanding of the logistics of cities. There are already an estimated 26.6 billion IoT devices in existence, with a predicted 75 billion connected devices by 2025 (Statista 2018).

Such numbers do not necessarily mean, however, that there are 26.6 billion operational devices. We would estimate that fewer than a tenth of these devices are currently live and transmitting data; that a tenth of those probably have quality control on their data feeds; and that a tenth of those again, indeed probably even fewer, have a known location. The potential is of course there, and all technological developments take time to become embedded into methodologies and systems; they are often developed on a wave of hype, expectation, and disillusionment before finally entering production. The Gartner Hype Cycle is a useful way to understand such adoption of technology; the most recent (Gartner 2018) has digital twins approaching the peak of inflated expectations.

The first realizations of cities inside a computer in iconic, rather than more abstracted mathematical, form were mooted in the 1960s, with the Skidmore, Owings, and Merrill wireframe model of Chicago an early exhibit of these possibilities (Batty and Hudson-Smith 2007). The intervening years have seen the development of 3D models beyond the wireframe and into photorealism on a global scale. Indeed, as Goodchild (2018) notes, the technical ability to create and visualize 3D renderings of the Earth was unavailable in the mid-1960s at the birth of GIS, but it was achieved in the early 1990s and led directly to Google Earth and its many competitors.

The technology continues to develop, and the more recent introduction of the Google Earth Engine essentially provides public access to a multi-petabyte curated collection of widely used geospatial datasets (Gorelick et al. 2017). Beyond this level of detail lies the current domain of systems such as ViLo, which link in building information systems, with geographical information systems (GIS) providing the connection between buildings, data, and geography. However, these merely provide the skeleton of the twin and can arguably be compared to the wireframe model of Chicago from the 1960s in terms of where we are in creating a true digital twin.

If the model is the skeleton of the city, then the Internet of Things can be compared to the neurons in the brain, communicating via wireless protocols rather than neurotransmitters. At the moment, however, the city does not have a brain: the devices communicate with diverse systems, sometimes joined up, as is the case with public transport networks and their deployed sensors, but often as part of local initiatives using devices deployed by hobbyists, or as part of small research trials. The data are, however, starting to flow, and developments in networking and computing technology are enabling small, low-power devices to be deployed in the field and to communicate over long distances. This is the revolution on the horizon, and it is just starting to become a reality, allowing the number of data-collection devices gathering data about the city to grow from a handful to something that might one day be compared to the number of neurons in the brain.

Data created en masse at a hyper-local level open up the prospect of a data-driven view of the city that was unimaginable when the first computer models were created. It is the ability to sense and collect data at a range of time scales, now increasingly dependent on need rather than technical ability, that opens up the potential of IoT within an urban environment. IoT data cover a wide range of themes, from transport flows and the density of crowds, through environmental data about air pollution and temperature, to economic transactions, footfall, and data relating to buildings. They cover all scales, from the hyper-local presence sensor under a desk that infers occupation, through sensors of room temperature and energy use, up to city-wide transport data and urban heat islands, with the integration of GIS and smart-cities systems.

The use of such devices for input into a smart-city system can be broken down into a number of aspects, as highlighted in Figure 38.3.

Fig. 38.3

Intel IoT reference architecture (Intel 2018).

Although the diagram in Figure 38.3 appears complex, it can be broken down into components that allow data to be collected, processed, analyzed, and finally visualized. Sensing and actuating are ubiquitous in our modern cities, buildings, and consumer products. Sensors refer to the technology that “converts a physical measure into a signal that is read by an observer or by an instrument” (McGrath and Ní Scanaill 2014). The thermostat, first patented in 1883 (US Patent No. 281884), is considered by some to be the first modern sensor and is still commonplace in most monitoring systems. The 1990s witnessed the large-scale use of microelectromechanical systems (MEMS) sensors in automotive systems such as airbags and antilock braking, which introduced cheaper and more reliable sensing. The first consumer MEMS device, the Nintendo Wii controller of 2006, introduced a three-axis accelerometer that determined the motion and position of the controller. Economies of scale mean that similar technologies are now embedded in many consumer devices, from phones to watches. From analog to digital, low cost to high, sensors cover a broad spectrum of operational parameters; for example, not all temperature sensors are equal, and careful consideration needs to be given to the type of sensor to be used (contact, non-contact, etc.).
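To illustrate the idea of a sensor converting a physical measure into a signal, the sketch below converts a raw analog-to-digital reading from a hypothetical NTC thermistor into a temperature using the simplified Beta equation. The component values are common hobbyist defaults, not those of any particular device mentioned in this chapter.

```python
import math

def thermistor_celsius(adc_raw: int,
                       adc_max: int = 1023,
                       r_fixed_ohm: float = 10_000.0,
                       r0_ohm: float = 10_000.0,
                       beta: float = 3950.0,
                       t0_kelvin: float = 298.15) -> float:
    """Convert a raw ADC reading from an NTC thermistor into degrees Celsius.

    Assumes a voltage divider with the thermistor on the supply side and a
    fixed resistor to ground; all component values are illustrative.
    """
    # Recover the thermistor resistance from the divider ratio.
    r_ntc = r_fixed_ohm * (adc_max / adc_raw - 1.0)
    # Beta equation: 1/T = 1/T0 + (1/B) * ln(R/R0)
    inv_t = 1.0 / t0_kelvin + math.log(r_ntc / r0_ohm) / beta
    return 1.0 / inv_t - 273.15

print(round(thermistor_celsius(512), 1))   # mid-scale reading -> roughly 25 degrees C
```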

Actuators, on the other hand, are the components of a machine that move or control some mechanism by converting energy into motion: an actuator is the means by which a control system acts upon an environment. From the brute-force applications of hydraulics and pneumatics on the construction site to the highly automated and controlled environment of the factory floor, all such applications have an ongoing operational cost—they are not fit-and-forget devices. The physical Internet has different maintenance requirements from those of the digital Internet.
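The thermostat mentioned above is also the simplest example of closing the loop between sensing and actuation. The sketch below shows a bang-bang controller with hysteresis, the kind of logic that decides when an actuator should act on the environment; setpoints and readings are illustrative only.

```python
def thermostat_step(temp_c: float, heater_on: bool,
                    setpoint_c: float = 20.0, hysteresis_c: float = 0.5) -> bool:
    """One step of a bang-bang (on/off) controller with hysteresis.

    Returns the new heater state. The hysteresis band stops the actuator
    switching rapidly around the setpoint, which is what wears relays and
    valves out in practice. All values are illustrative.
    """
    if temp_c < setpoint_c - hysteresis_c:
        return True            # too cold: switch the heater (actuator) on
    if temp_c > setpoint_c + hysteresis_c:
        return False           # too warm: switch it off
    return heater_on           # inside the band: leave the actuator alone

state = False
for reading in (18.9, 19.4, 20.2, 20.8, 20.3, 19.6):
    state = thermostat_step(reading, state)
    print(f"{reading} C -> heater {'on' if state else 'off'}")
```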

Data generated by sensors or pushed to actuators are processed through gateways. These computational nodes can sit on the same functional device (e.g. a mobile phone) or on a separate compute module that gathers data from multiple sensing and actuating nodes (e.g. wireless sensor networks). The purpose of these data-collection devices is to capture, filter, and process data efficiently and to connect, using wired or wireless communication technologies, to legacy or Cloud infrastructure. This aggregation layer is often used to provide security, management, and data-preprocessing functions.
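A gateway's filtering and aggregation role can be sketched as follows: raw samples from several nodes are sanity-checked and reduced to per-node summaries before anything is sent upstream. The payload format and validity range are assumptions made for the example.

```python
from statistics import mean
from typing import Dict, Iterable, List

def gateway_batch(readings: Iterable[Dict],
                  min_valid: float = -40.0,
                  max_valid: float = 60.0) -> List[Dict]:
    """Filter and aggregate raw node readings before uplink.

    Drops out-of-range values and reduces each node's burst of samples to a
    single average, the kind of preprocessing a gateway performs before
    forwarding data to Cloud or legacy infrastructure.
    """
    by_node: Dict[str, List[float]] = {}
    for r in readings:
        if min_valid <= r["value"] <= max_valid:      # crude sanity filter
            by_node.setdefault(r["node"], []).append(r["value"])
    return [{"node": n, "mean": round(mean(v), 2), "n": len(v)}
            for n, v in by_node.items()]

raw = [{"node": "a", "value": 18.2}, {"node": "a", "value": 18.4},
       {"node": "b", "value": 19.1}, {"node": "b", "value": 999.0}]  # 999 = fault
print(gateway_batch(raw))   # the faulty sample never leaves the gateway
```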

Data from gateways (or things) can be processed through any number of Cloud services, such as processing streams of data, implementing policies to make data available to different end consumers, or sending for storage. Data are typically stored for real-time analysis and presentation or archived to support offline analysis.

Smart technology can also use Cloud or edge architectures, which essentially describe where computing, storage, and analysis take place in the network. At the Cloud scale, data are typically sent to a centralized location where they are hosted on high-performance computing infrastructure and enjoy the benefit of compute power for complex analytic tasks. For example, the weather stations in the meteorological network maintained by the Meteorological Office around the UK all upload sensor data to servers, where supercomputing facilities can be used to analyze them and update rolling weather forecasts.

At the other end of the spectrum, there are many applications where it may be too expensive to send data via a data network to the Cloud, or where the latency in doing so means that useful analysis cannot be delivered in a timely manner. For example, autonomous vehicles need to operate at very low latency so that they can respond immediately to their surroundings; hence, many tasks are run locally in the vehicle, with non-time-critical information being sent to and from roadside infrastructure.

The final building block of IoT systems is the business intelligence layer, which both presents interfaces into the information being generated and provides the means to manage the system. IoT platforms provide the support software that facilitates communication, data flow, device management, and the functionality of applications. Outputs are typically screen-based and are increasingly accessed through virtual-, augmented-, or mixed-reality interfaces. As IoT systems mature, platforms are continually evolving to support the monitoring and management of connected devices at scale, since much of the value in the IoT supply chain is made or lost in the operational cost of those systems.

4 Putting It into Practice: Bats and Creatures

Despite the ability to visualize in three dimensions and to collect data at the edge or in the Cloud via sensors and actuators linked into digital twins, the natural environment is often overlooked in Urban IoT, especially by those focusing on city systems. Arguably, too many IoT test beds concentrate on smart transport systems, city logistics, or more traditional sensor-based devices. The opportunity of the Urban Internet of Things is the ability to look beyond the current normal and explore new possibilities. In terms of the health of an environment, bats are considered to be a good indicator species; a healthy bat population suggests a healthy biodiversity in the local area. As part of the QEOP test bed, Intel, in association with both University College London and Imperial College London, designed and deployed a “Shazam for Bats” project. Shazam is known for its ability to identify music from short audio clips; the aim here, analogously, was to track and identify bats via IoT audio recording. A network of 15 smart bat monitors was developed and installed across the park in different habitats, creating a connected environment for monitoring wildlife.

The monitors (pictured in Fig. 38.4) recorded the urban soundscape via an ultrasonic microphone, with the sound converted into image files for analysis. Each device processed the information locally using edge computing. As Premsankar et al. (2018) note, in an edge architecture computing resources are made available at the edge of the network, close to (or even co-located with) end devices. Placing computing resources in close proximity to the devices generating the data reduces communication. Processing the data on the device has multiple benefits, firstly through reduced energy consumption and secondly through a dramatic decrease in the amount of data that has to be transmitted and processed on researchers’ computers. During the first year of the trial (which is ongoing), the implementation of edge computing allowed a data reduction from 180 GB per day down to 2.2 MB per day, a factor of roughly 80,000. Without the ability to process the data locally, and relying instead on WiFi or similar local infrastructure, neither the data collection nor the analysis would have been possible.

Fig. 38.4
figure 4

Echo box installed in the QEOP (https://naturesmartcities.com)
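The value of processing at the edge can be seen in simplified form below. The real monitors convert ultrasonic audio into spectrogram images and classify them on the device; this stand-in merely counts high-energy frames and reports a tiny summary, but it illustrates why only megabytes rather than gigabytes need to leave each device. All parameters and the synthetic audio clip are illustrative.

```python
import random

def edge_summary(audio_samples: list, sample_rate_hz: int = 384_000,
                 frame_len: int = 1024, energy_threshold: float = 0.1) -> dict:
    """Reduce a raw ultrasonic audio buffer to a tiny detection summary.

    A stand-in for the on-device pipeline: it counts frames whose mean
    absolute amplitude exceeds a threshold, so only a few bytes need to be
    transmitted instead of the raw recording.
    """
    frames = [audio_samples[i:i + frame_len]
              for i in range(0, len(audio_samples), frame_len)]
    active = sum(1 for f in frames
                 if f and sum(abs(s) for s in f) / len(f) > energy_threshold)
    return {"frames": len(frames), "active_frames": active,
            "duration_s": round(len(audio_samples) / sample_rate_hz, 3)}

random.seed(0)
clip = [random.uniform(-0.05, 0.05) for _ in range(4096)]   # mostly quiet noise
clip[2048:2048 + 512] = [0.5] * 512                         # one loud burst
print(edge_summary(clip))                                   # a few bytes, not a recording

# Rough scale of the reduction reported for the trial:
raw_per_day_mb = 180 * 1024        # ~180 GB of raw audio a day
sent_per_day_mb = 2.2              # ~2.2 MB of processed output a day
print(round(raw_per_day_mb / sent_per_day_mb))   # on the order of 80,000x
```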

The use of the Internet of Things for longitudinal monitoring was carried out alongside more traditional survey techniques. The continuous data collection and analysis did, however, free up researcher time to focus on other aspects of the data and to note other shifts in bat activity. The use of IoT is notable as it provides an ongoing data stream without going into the field, allowing a background level of activity to be established and thus a series of interventions, such as street-lighting strategies, to be implemented, with data accessible and therefore available for expert analysis on a daily basis. The trial is of interest in terms of the six Vs of Urban IoT: the velocity and volume led to the implementation of edge computing, while the veracity was tested, as the identification of bat species was uncertain at the start of the trial. The data remained volatile, with hardware and power-supply issues limiting the devices to approximately 70% uptime during the first year of testing. The sense of value is still emerging, but the ability to monitor remotely, with data arriving in a preprocessed form, creates intellectual, logistical, and economic value in terms of access to new data and analysis methodologies, the ability to carry out ongoing trials in the park, and the saving of researchers’ time.

Soft artificial intelligence (AI) is defined as non-sentient AI designed to perform at close to a human level in one specific domain. Soft AI is a reality now in the new generation of smart Internet of Things devices like Amazon’s Alexa, Apple’s Siri, or Microsoft’s Cortana (Milton et al. 2018). With over 100 million Alexa devices sold worldwide (The Verge 2019), the public at large are becoming used to talking to devices in their own homes. As another part of the QEOP Urban IoT deployment, a series of 15 devices was placed in the park to allow the public to talk to them about the environment. The deployment was part of a project known as “Tales of The Park,” looking at the wider issues of cybersecurity, trust, and risk within the Internet of Things. Using technology embedded into a series of 3D-printed creatures (from bees through to otters and even garden gnomes), these geo-located devices used Bluetooth Low Energy beacons to broadcast a URL to nearby users. A chatbot system then allowed users to converse with the devices via text-based messages using natural language. The devices, displayed on plinths at eye level and spread across the park during the summer of 2018, were aimed at communicating information about the local environment and the area’s flora and fauna to the public at large. We illustrate one such installation in Figure 38.5.

Fig. 38.5

One of the installations in the QEOP, in this case a gnome with embedded IoT technologies on a plinth
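Broadcasting a URL from a beacon is commonly done with Google's Eddystone-URL format, the basis of the Physical Web. Whether the QEOP creatures used exactly this format is an assumption, and the URL below is a placeholder, but the sketch shows how such an advertisement frame can be packed into the handful of bytes a beacon transmits.

```python
# Minimal sketch of packing a URL into an Eddystone-URL advertisement frame.
SCHEMES = {"http://www.": 0x00, "https://www.": 0x01,
           "http://": 0x02, "https://": 0x03}
SUFFIXES = {".com/": 0x00, ".org/": 0x01, ".edu/": 0x02, ".net/": 0x03,
            ".info/": 0x04, ".biz/": 0x05, ".gov/": 0x06,
            ".com": 0x07, ".org": 0x08, ".edu": 0x09, ".net": 0x0A,
            ".info": 0x0B, ".biz": 0x0C, ".gov": 0x0D}

def eddystone_url_frame(url: str, tx_power_dbm: int = -20) -> bytes:
    """Compress a URL into the service-data payload of an Eddystone-URL frame."""
    for prefix in sorted(SCHEMES, key=len, reverse=True):
        if url.startswith(prefix):
            body, scheme = url[len(prefix):], SCHEMES[prefix]
            break
    else:
        raise ValueError("URL must start with a recognized scheme")
    encoded = bytearray()
    while body:
        for suffix in sorted(SUFFIXES, key=len, reverse=True):
            if body.startswith(suffix):
                encoded.append(SUFFIXES[suffix])   # common suffixes become one byte
                body = body[len(suffix):]
                break
        else:
            encoded.append(ord(body[0]))
            body = body[1:]
    if len(encoded) > 17:
        raise ValueError("Encoded URL too long for a single frame")
    # Frame type 0x10, TX power, scheme byte, compressed URL.
    return bytes([0x10, tx_power_dbm & 0xFF, scheme]) + bytes(encoded)

print(eddystone_url_frame("https://example.org/otter").hex())  # placeholder URL
```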

The majority of Urban IoT devices are small computers, often unseen, taking samples and communicating data out of sight. The aim of this part of the QEOP deployment was to make IoT visible, and to move beyond both the small hidden devices and the anonymous boxes often found attached to lamp posts (more on lamp posts later in the chapter).

The creatures formed their own network of awareness, retaining information about the user as each device acted as a waypoint in the park. They raised awareness of IoT devices deployed with local environmental information, while also developing a sense of awareness of the user, learning more about them at every interaction. In this sense, they open up the possibility of Urban IoT being more than invisible data-collecting devices: devices that chat and converse with users, allowing data to be both collected and communicated. Of course, this opens up a whole set of issues around security and trust: how do you know which devices in the city to talk to? In the future, it may be necessary to address the possibility of a rogue Urban IoT, where devices are deployed to obtain information from users without their knowledge. It is, however, an intriguing prospect to see Urban IoT devices not only as collectors but as providers of information, and to have those devices situated within objects already in the environment, from trees to park benches and bus stops. All have the potential to be data collectors, and conversely, what could be more natural than talking to your bus stop for data on bus times, weather, or air pollution, in the way you currently ask Alexa for information at home?

5 The Humble Lamp Post

The lighting of streets by electricity has brought a sense of security and wellbeing to our cities, towns, and villages for over 125 years. The first electric streetlights in Britain were brought into operation in the 1870s at Holborn Viaduct and along the Thames Embankment in London, and today there are over 7.5 million streetlights in the UK (HTMA 2019). Lamp posts are part of the city; they are ubiquitous and almost unseen. As such, they make almost the perfect platform for widespread, dense, and geo-located IoT sensors in the city. The process of transforming the lamp post into an IoT network is still at a conceptual stage, but test beds are in place at various locations around the world.

One such example is a trial to deploy customized multi-purpose lamp posts (MPLPs) in Kowloon East, Hong Kong’s smart-city pilot area. The MPLPs will be interconnected with a telecommunication network to form an IoT backbone. Leveraging IoT sensors fixed to the lamp posts, the MPLPs aim to enable real-time collection of city data, such as weather, air quality, temperature, and flows of people and vehicles, for city management and the support of various applications of smart-city initiatives (SCW 2019). Another example is the Humble Lamp Post, a cross-European initiative to upgrade and standardize the 90 million street lights across Europe with IoT services. Such envisioned services include: offering a (potentially free) public WiFi network; providing the powered foundations for a mesh network of (IoT) sensors across the city; helping drivers find a parking place; improving public safety; and supporting environmental monitoring (air quality, waste, flooding). Figure 38.6 illustrates the range of sensors and services envisaged. Lamp posts can host electronic street signage, public information, and advertising (revenue); house sensors that help direct visually impaired people; provide a powered web of electric-vehicle (car, bike) charging points; or even carry pedestrian-flow monitors that can help keep the high street a vibrant place (BSI 2017).

Fig. 38.6

Sensors on the humble lamp post, UrbanDNA (2018).

A cross-technology and arts project known as Hello Lamp Post is an early example of using the lamp post as a social network. Using mobile-phone technology, the project started as an experimental urban-design intervention that operated in Bristol from July to September 2013. It used pre-existing identifier codes on street infrastructure to enable people to send text messages to objects such as lamp posts, post boxes, bins, telegraph poles, and so on. As Nansen et al. (2014) note, the project aimed to challenge ideas of efficiency tied up with the smart city by thinking about the city as a platform for social play. It allowed users to communicate with street furniture using SMS messages. Users’ exchanges with the objects were stored and reused in exchanges with other people (Nijholt 2015), allowing a conversation to build, although the system was not directly automated (in contrast to the chatbot creatures in QEOP). The project has been adapted for use in 12 cities around the world (Hello Lamp Post 2019) and was installed in the Queen Elizabeth Olympic Park during the summer of 2018 as part of the ongoing test bed for Smart London. Hello Lamp Post and the creatures in QEOP show that urban design and street furniture in cities can be conduits not only for more traditional digital data (data in binary form) but also for social data collected from Urban IoT devices.

6 Urban Modeling

It is a little beyond the scope of this chapter to delve deeply into urban modeling, but it is worth noting that the first generation of urban models was designed and implemented in North America mainly during the years 1959–68, years which coincided with the launching of large-scale land-use transportation studies in major metropolitan areas (Batty 1979). In the intervening years, urban models and a variety of modeling techniques have been used to predict and forecast everything from the first transport models to population growth, housing supply and demand, air pollution, the behavior of crowds, retailing, urban economics, and everything in between.

A number of techniques, such as agent-based modeling, are expanded upon within this book. All of them, however, rely on data and are arguably only as good as the data input to the model, and only as good as the methodology behind them. So while an increase in data may be seen as positive in terms of allowing a wider understanding of our cities, attention needs to be paid to the veracity of those data. In terms of urban modeling, even small changes to an input’s veracity can lead to a biased data set. As Harris et al. (2017) note, simulations that are based on biased data have the potential to increase biases by presenting results that are then used to influence policy. That said, the input of Urban IoT devices into urban modeling opens a new era in simulating and predicting our environment, but it requires standards and a joined-up approach to data analysis.
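A toy example makes the veracity point concrete: if a modest share of sensors systematically undercounts, the aggregate fed into a model is already biased before any simulation is run. All numbers below are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical ground truth: hourly footfall at 100 locations.
truth = [random.gauss(500, 80) for _ in range(100)]

def observe(values, biased_fraction=0.3, undercount=0.8):
    """Simulate a network in which some counters systematically undercount."""
    n_biased = int(len(values) * biased_fraction)
    return [v * undercount for v in values[:n_biased]] + values[n_biased:]

observed = observe(truth)
bias_pct = (sum(observed) / sum(truth) - 1) * 100
print(f"Model input is biased by {bias_pct:.1f}% before any simulation runs")
```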

7 Talking to the Neighbors

As Summerson (2019) notes, the rapid rise of IoT devices within an urban context presents its own challenges. Summerson, leader of a UK government-funded organization known as the Future Cities Catapult (renamed the Connected Places Catapult as of April 2019), notes that one problem is that much of IoT is still held in silos and separate systems that cannot communicate with each other. At the other end of the spectrum, however, irresponsible information usage raises serious—and arguably even dangerous—privacy and security concerns. Perera et al. (2018) highlight these issues by stating that IoT solutions often act as independent systems; the data collected by each solution are used internally and stored in access-controlled silos. After primary usage, data are either thrown away or locked down in independent data silos.

A significant amount of knowledge and insight that could be used to improve our lives is hidden in these data silos; such data include our behaviors, habits, preferences, life patterns, and resource consumption. In short, at the current time, IoT devices often do not talk to each other; the data may be of high velocity and high volume and have a high level of veracity, but they are often isolated within a closed system. The system is often closed not only because of varying standards for sensing, communicating, and sharing data but also at a socio-technical level, since IoT data are often private. As such, the vision of a self-monitoring, analysis, and reporting technology (SMART) city is complex: although often in close physical proximity, IoT devices are predominantly not aware of, or communicating with, their neighbors, making data collection and analysis within the IoT context an emerging challenge. As Summerson (2019) concludes, while IoT interoperability might be the key to accelerating improvements in traffic management, air quality and health, city planning, housing, and much more, the need to define and ensure the use of common languages and mechanisms—agreed IoT standards—has never been more urgent.

8 Conclusion

Digital twins are, according to Gartner (2018), at the peak of inflated expectations, which arguably means the trough of disillusionment looms before the arrival of wider use and a plateau of productivity. Their widespread use, and with it data collection, analysis, and use via Urban IoT devices, is on the horizon. To revisit the six Vs (velocity, volume, veracity, variety, volatility, and value): without question, volume and velocity are critical aspects of data in relation to Urban IoT devices. We are on the cusp of a change in the availability, use, and communication of data relating to cities. The majority of the estimated 75 billion IoT devices by 2025 will be in urban areas, with most of them able to provide data readings at sub-minute, and increasingly sub-second, frequency. Similarly, the variety of data is increasing, from the ability to track footfall in real time, to pollutants at a hyper-local level, or levels of noise, through to the location of people and transport.

Advances in sensor technologies and networking are increasing the variety of information we are able to collect. Urban data, via the Internet of Things, are still in an early speculative phase, and the veracity of the data is questionable. This is due not only to the quality of sensors but also to human factors. The volume of data can of course help with this: if enough devices are deployed, it is possible to identify rogue readings and delete them from any input or analysis. The value in terms of inputs into urban policy or urban modeling is long term, whereas the data collection is increasingly short term and high volume, raising issues around storage and, indeed, around whether data are simply used in the moment and then discarded because of their excessive volume.
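One simple way in which volume compensates for limited veracity is to cross-check co-located devices. The sketch below flags readings that sit far from the group median using the median absolute deviation; the sensor names and threshold are illustrative.

```python
from statistics import median

def flag_rogue(readings: dict, max_dev: float = 3.0) -> list:
    """Flag sensors whose reading deviates strongly from their neighbors.

    Uses a robust rule of thumb: anything further than `max_dev` median
    absolute deviations from the group median is treated as a rogue reading.
    """
    values = list(readings.values())
    med = median(values)
    mad = median(abs(v - med) for v in values) or 1e-9
    return [sid for sid, v in readings.items() if abs(v - med) / mad > max_dev]

co_located = {"pm25-01": 12.1, "pm25-02": 11.8, "pm25-03": 12.4,
              "pm25-04": 11.9, "pm25-05": 87.3}   # 05 is likely faulty
print(flag_rogue(co_located))   # -> ['pm25-05']
```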

The opportunities for mass data collection via Urban IoT devices are immense, as are the potential inputs into urban modeling and policy. There are challenges, as we have noted, perhaps most notably in the veracity and volatility of the data; but the value, volume, velocity, and variety of data collected from devices make the opportunities for Urban IoT almost limitless.