1 Introduction

Building Information Modeling (BIM) is not a new concept, but one that plays an increasingly large role in the architecture, engineering and construction (AEC) industry. From design to construction, the concept of BIM has been a feature of many industries for nearly 30 years [18]. It remains important because it allows designers to go beyond representing the physical space of a new or retrofitted building and to capture the intrinsic properties of the structure as well. BIM is not just about the design of new buildings; it also plans for years of use, since designing, scheduling, constructing and evaluating a building are done in the BIM model long before any construction actually takes place. Although the future of the construction industry is digital and BIM-facilitating tools and standards (e.g., IFC) will foster long-term facility management, there are still technological and managerial challenges ahead [5]. The nature of these challenges depends on the building lifecycle, which is generally defined as a three-phase process [8]: (i) Beginning-of-Life (BoL), including the design, manufacture and construction of the building; (ii) Middle-of-Life (MoL), including its use and maintenance; and (iii) End-of-Life (EoL), including its disposal and recycling. Our research puts special emphasis on post-construction challenges.

One of the major challenges after the delivery of the building (i.e., at the start of MoL) lies in the difficulty of closing the information loop between all phases of the building lifecycle. For example, due to the lack of system integration and other factors such as the immaturity of the IoT (Internet of Things), it is not easy to collect, capitalize on, and share the information/knowledge acquired in MoL (e.g., during use and maintenance activities) with other building lifecycle stakeholders, and vice-versa [12]. This is all the more important since such information could result in enhanced decision-making in BoL (e.g., to improve the next generation of buildings and boost the innovation process by capturing new business and user needs) or in EoL (e.g., to guide decision-making about the reuse of components based on information about the building's use conditions). The establishment of such a closed-loop information/collaboration structure throughout the asset lifecycle is a challenge faced not only by the AEC industry but also by other sectors, e.g., manufacturing, where concepts such as Closed-loop PLM (Footnote 1) [13, 14] have emerged over the last decade.

Given the above, the contribution of our work is twofold: (i) to design and develop an open and interoperable building management system that integrates efforts, outcomes (technologies, standards…) and directives of both the AEC industry and adjacent sectors such as Closed-loop PLM and IoT; and (ii) to set up an effective and evolutive requirement engineering framework for ensuring successful system component development. Sections 2 and 3 deal respectively with these two contributions, Sect. 4 presents proofs-of-concept of our system, and Sect. 5 concludes.

2 When BIM Meets Closed-Loop PLM

The differences between BIM and PLM chiefly concern their capacity for technical and organizational integration. However, they share a number of similarities in their approach to data sharing and project management activities [14]. Although there are only a few documented efforts to implement PLM in AEC companies, the challenges that follow from these shared characteristics may provide fertile ground for sharing lessons learned. Section 2.1 focuses on BIM throughout the building lifecycle, while giving insights into current Closed-loop PLM research and practices. Against this background, Sect. 2.2 discusses the importance of having accurate requirements to transfer those practices to the AEC/BIM industry, and how this can be achieved using an evolutive requirement engineering framework (Fig. 1).

Fig. 1. Building lifecycle management system combining BIM & adjacent sectors’ efforts

2.1 Whole Lifecycle Approach

Managing vast amounts of disparate information throughout an asset's lifecycle (car, airplane, building…) is an enormous challenge for organizations, particularly in terms of enforcement and compliance. Information governance enforces desirable behavior in the creation, use, archiving, and deletion of corporate information. By closing the loop, rules and policies are defined, managed and enforced, authorized records are accessed when and as needed, and metrics are available to audit the current rules and policies. All this provides a way to continuously assess and update the process for optimum results.

While this vision has been widely explored in PLM, it is only in the last 5 to 7 years that an increasing focus on the application of BIM throughout the whole building lifecycle has emerged and that the significance of business process integration has been acknowledged [14]. BIM servers are now being developed to provide a large integrated data- and knowledge-base that can be leveraged not only in design and engineering, but also in construction operations (BoL), facilities maintenance (MoL), and disposal activities (EoL) [8]. This building lifecycle vision is depicted in Fig. 1, where research efforts increasingly focus on “closing the loop” to foster collaborative processes, shared resources and decision-making [2]. Although some challenges remain to be addressed in BoL, the major challenge in the context of ‘closed-loop’ information starts from the delivery of the building, when BIM and other BoL models fall into oblivion. This means that the knowledge generated in BoL is not, or at least cannot easily be, re-used in MoL and EoL, even though some reports highlight the substantial profits that could accrue from such information loops [13]; for example, Barlish and Sullivan point out that 85% of the lifecycle cost of a facility occurs after construction is completed [4] and that, in this respect, using BIM-related information in downstream processes could help to save money.

Fig. 2. Evolutive requirement engineering framework for system development

As mentioned above, the concept of Closed-loop PLM (or CL2M) has produced theories and tools for closing the information loop between multiple lifecycle phases [11, 15]. This concept emerged from the PROMISE EU FP6 project, where real-life industrial applications required the collection and management of product instance-level information in many domains involving heavy and personal vehicles, household equipment, etc. Information such as sensor readings, alarms, assembly, disassembly and shipping events, and other information related to the entire product lifecycle needed to be exchanged between products and systems of different organizations. Based on the needs of those applications, requirements for data exchange were identified and, as no existing standard could be identified that would fulfill those requirements without extensive modification, new messaging interfaces were proposed (see e.g. [11]). Those specifications have since been further developed by the IoT WG of The Open Group and implemented by several EU project consortia (e.g., LinkedDesign FP7, bIoTope H2020 (Footnote 2)). Recently, The Open Group published those specifications as two distinct – but complementary – standards, namely the Open Messaging Interface (O-MI) and Open Data Format (O-DF) standards [11].
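
To make the data-exchange layer more concrete, the following minimal sketch shows what an O-MI read request carrying an O-DF payload could look like when issued from Python. It is illustrative only: the endpoint URL, the object (“Cafeteria”) and the InfoItem are placeholders, and the exact element names and namespace URIs should be checked against the O-MI/O-DF 1.0 specifications and the deployed O-MI node.

```python
# Illustrative sketch: sending an O-MI "read" request whose payload is an
# O-DF hierarchy (Objects/Object/InfoItem). Endpoint URL, object id and
# namespace URIs are assumptions; verify them against the O-MI/O-DF 1.0
# specifications and the target O-MI node.
import requests

OMI_ENDPOINT = "https://example-campus.org/omi/node/"  # hypothetical O-MI node

omi_read_request = """<?xml version="1.0" encoding="UTF-8"?>
<omiEnvelope xmlns="http://www.opengroup.org/xsd/omi/1.0/" version="1.0" ttl="0">
  <read msgformat="odf">
    <msg>
      <Objects xmlns="http://www.opengroup.org/xsd/odf/1.0/">
        <Object>
          <id>Cafeteria</id>
          <InfoItem name="Temperature"/>
        </Object>
      </Objects>
    </msg>
  </read>
</omiEnvelope>"""

response = requests.post(OMI_ENDPOINT,
                         data=omi_read_request.encode("utf-8"),
                         headers={"Content-Type": "text/xml"})
print(response.status_code)
print(response.text)  # O-MI response envelope carrying the requested O-DF values
```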

Our research work and contribution originate from this state of the art, with the specific aim of developing a user-friendly building lifecycle management system relying on open and interoperable standards such as O-MI/O-DF. To this end, it is of the utmost importance to select and/or set up an effective and evolutive requirement engineering framework for the development of successful system components, especially in new and cross-domain contexts, as is the case in our study (a combination of standards/directives from the AEC/BIM, Closed-loop PLM and IoT sectors). The next section briefly discusses such a requirement engineering framework.

2.2 Requirement Engineering Framework

Accurate requirements provide the foundation for successful product development. Three main steps can be identified in requirements engineering [16]:

  1. Requirements inception: start the process (business need, market opportunity…);

  2. Requirements elicitation & specification: identify stakeholder needs and turn them into documented requirements;

  3. Requirements management: capture new needs/contexts over time.

In our context, customer needs must be translated into product and process requirements without necessarily developing all possible technical characteristics, but only those that fulfil the needs for efficient closed-loop information and collaboration in a building lifecycle context. Note also that the production activity is expected to be traceable back, at least indirectly, to customer requirements.

Given this, our research work develops a hybrid framework based on the synthesis of well-established techniques from software engineering and from management theory and tools [9]. An overview of this hybrid engineering framework is provided in Fig. 2; it combines (i) a functional analysis (using the Octopus diagram); (ii) a requirement prioritization technique (AHP – Analytic Hierarchy Process) [18]; (iii) a method that transforms the prioritized requirements into quantitative parameters/specifications (QFD – Quality Function Deployment) [7]; and (iv) a spectral algorithm for clustering the specification conflicts identified through the QFD matrices [3]. These phases are followed by the software development phase, as well as the definition of KPIs (key performance indicators) to assess whether the system is free of defects, meets user needs that may evolve over time, and so on. In the context of smart buildings, similar approaches have been followed for the development of smart home components: Durrett et al. [10] used QFD to effectively satisfy customers’ needs as part of an integrated smart home environment, and Popescu et al. [17] combined QFD with AHP to identify a set of lighting, heating, security, furniture and other functions that, if given smart capabilities, could make a critical contribution to the independent living capacity of people with special needs. However, none of these frameworks and analyses consider needs related to the whole building lifecycle, nor the imperative to enable closed-loop information and collaboration among distinct lifecycle stakeholders.

As a consequence, this study defines and proposes a hybrid engineering framework that integrates such needs together with appropriate technology enablers from the AEC and PLM industries, as discussed in Sect. 3.

3 A Hybrid Engineering Framework for Development of Building Lifecycle Management System Components

As previously stated, and depicted in Fig. 2, our framework starts with a functional analysis using the Octopus diagram, which identifies the Primary Functions (denoted PF) as well as the Constraint Functions (CF) between the system to be developed and its environment (e.g., actors, directives, services, etc.). Figure 3 provides insight into the Octopus diagram related to our building lifecycle management system; the different PF and CF functions are further described in Table 1. Based on these PF and CF functions, high-level requirements are formulated in the form of a hierarchy representing distinct categories (e.g., CF2, CF3 and CF5 fall within the scope of “Building lifecycle”-related requirements, etc.). Due to space limitations, the final requirement hierarchy is not presented in this paper, but a first insight into the categories is given through the Octopus diagram’s color code in Fig. 3.

Fig. 3. Illustration of a part of the requirement elicitation & specification steps

There are a number of software requirement prioritization techniques but, according to a recent survey [1], AHP is the most widely used. Although we do not present the AHP process in this paper, note that AHP provides – as output – the list of requirements ranked in order of priority (Table 1).
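
For readers unfamiliar with AHP, the minimal sketch below illustrates the underlying principle: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, together with a consistency check. The comparison values and requirement categories are invented for illustration; they are not the ones used in our study.

```python
# Sketch of AHP prioritization (Saaty's eigenvector method). The pairwise
# comparison values below are illustrative placeholders.
import numpy as np

# Pairwise comparisons between three hypothetical requirement categories.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority weights

# Consistency ratio (CR < 0.1 is usually considered acceptable).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random consistency indices
print("priority weights:", weights.round(3), "CR:", round(CI / RI, 3))
```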

Table 1. Primary & Constraint Functions (PF-CF) formalized through the octopus diagram

Such a requirement ranking, associated with the priority weights, is used as input to the QFD. QFD is both a requirement definition and a conceptual design tool that systematically documents customer needs, benchmarks, competitors, and other aspects, and then transforms the list of prioritized requirements into design specifications. The QFD methodology involves four basic phases that occur over the course of the product development process. During each phase, one or more matrices are generated (cf. the block in Fig. 2 entitled “Turning initial requirements into specifications using QFD”), where the specifications (with their respective weights) resulting from the QFD matrix of phase n feed the matrix of phase n + 1. Figure 3 provides insight into the specifications resulting from the first QFD matrix, each of which brings a first technological or scientific enabler to fulfil one or more requirements. For example, Fig. 3 shows that the two most important enablers with respect to our initial requirements are (i) data & service discovery, enabling any building stakeholder to discover and access, when and as needed, information sources and associated knowledge; and (ii) dynamic service composition, enabling building end-users to create their own services using a portfolio of “processing blocks” (including diagnosis, maintenance prediction, event detection, storage…).
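
The sketch below illustrates the mechanics of a single QFD phase under simplified assumptions: prioritized requirement weights (e.g., the AHP output) are propagated through a requirement-to-specification relationship matrix to obtain specification weights, which would then feed the matrix of phase n + 1. All numbers are illustrative placeholders.

```python
# Sketch of one QFD phase: requirement weights (rows) are related to candidate
# specifications (columns) through a relationship matrix using the usual QFD
# scale (0 = none, 1 = weak, 3 = medium, 9 = strong). Values are illustrative.
import numpy as np

req_weights = np.array([0.45, 0.30, 0.25])   # e.g., AHP output (illustrative)

# Relationship matrix: 3 requirements x 4 hypothetical specifications
# (e.g., data & service discovery, dynamic service composition, ...).
R = np.array([
    [9, 3, 1, 0],
    [3, 9, 0, 1],
    [1, 3, 9, 3],
])

spec_weights = req_weights @ R                # absolute importance of each specification
spec_weights = spec_weights / spec_weights.sum()

ranking = np.argsort(spec_weights)[::-1]      # specifications ranked by priority
print("normalized specification weights:", spec_weights.round(3))
print("priority order (column indices):", ranking)
```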

As highlighted in our requirement engineering flow (see Fig. 2), a spectral method for clustering the conflicts that arise from the QFD matrices’ roof (Footnote 3) is then applied, followed respectively by a validation phase of the specifications and by the development of the software components. To guarantee that our system remains competitive over the short and long term, specific Key Performance Indicators (KPIs) are defined to assess, in a quantitative manner, aspects such as software bugs, user satisfaction and new needs. Although not presented in this paper, it is important to note that these KPI metrics continuously feed the QFD matrices (see Fig. 2), thus helping to produce new releases of the system/software with added or rectified features.
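
As an illustration of this clustering step (the exact algorithm of [3] is not reproduced here), the sketch below treats the negative correlations of a hypothetical QFD roof as the edges of a conflict graph and partitions that graph with an off-the-shelf spectral clustering implementation; the correlation values are invented for the example.

```python
# Sketch: grouping specification conflicts recorded in the QFD "roof" (the
# triangular correlation matrix between specifications). Negative correlations
# are treated as edges of a conflict graph, which is partitioned with spectral
# clustering. The roof values below are illustrative placeholders.
import numpy as np
from sklearn.cluster import SpectralClustering

# Symmetric correlation matrix between 5 specifications
# (+1 = synergy, 0 = no interaction, -1 = conflict).
roof = np.array([
    [ 0, -1,  0,  1,  0],
    [-1,  0, -1,  0,  0],
    [ 0, -1,  0,  0, -1],
    [ 1,  0,  0,  0, -1],
    [ 0,  0, -1, -1,  0],
])

conflicts = (roof < 0).astype(float)          # affinity matrix of the conflict graph

labels = SpectralClustering(n_clusters=2,
                            affinity="precomputed",
                            random_state=0).fit_predict(conflicts)
print("conflict cluster of each specification:", labels)
```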

4 Proof-of-Concept – Building Lifecycle Management System

The first releases of our building lifecycle management software have been supplied (Footnote 4), enabling any developer to deploy and instantiate it in his/her own environment/buildings. Two proofs of concept of such an instantiation are currently available online, namely at Aalto Smart Campus (Footnote 5) and Sophia Antipolis Smart Campus (Footnote 6).

Figure 4 provides screenshots of the web-based dashboard that any building end-user/stakeholder can access and use as they see fit. First, stakeholders are able to discover (in a city or region) all the buildings that are compliant with our system, or more specifically compliant with the IoT standards used for data exchange (i.e., O-MI/O-DF in our case). The discovery can be achieved either in a visual manner (Google Maps, as shown in the dashboard view denoted by 1 in Fig. 4) or in an automated manner (using the RESTful discovery mechanism supported by O-MI, see e.g. [12]). From this stage, the stakeholder can access both the Floor/2D model (see arrow denoted by 2) and the 3D model of the building (see arrow denoted by 3). These views have been generated directly from the integrated BIM/IFC file, which can now be enriched with live sensor data; e.g., the room “Cafeteria” collects CO2, Humidity, Occupancy and Temperature sensor data. Along with these 2D/3D views, another view – a 360° picture of the room (see arrow denoted by 4) – is supported, which brings a twofold benefit: (i) a building manager can notify the system that a new smart connected object (e.g., a new smart coffee machine) has been added and where in the room it has been added, and can above all link the virtual sensor with the real-life information source (i.e., the physical sensor in this example); (ii) any end-user can see where the information source is located, which is key contextual information that can be used when developing additional services that rely on this information (e.g., if a sensor is located above an oven, the developer can identify it, integrate this knowledge, and handle it as needed). Finally, our system provides initial analytics services (see arrow denoted by 5), such as basic plotting of sensor data over a certain period of time for one room or a group of rooms, but also more complex/cognitive services such as the prediction of specific events (energy, failures, etc.).
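
To illustrate how the BIM/IFC model and the live sensor layer could be tied together, the sketch below enumerates the spaces of an IFC file with the open-source ifcopenshell toolkit and associates each room with hypothetical O-DF paths. The file name, the paths and the choice of toolkit are assumptions made for illustration; they do not describe the exact implementation of our dashboard.

```python
# Sketch: listing the rooms (IfcSpace entities) of a BIM/IFC model and mapping
# each of them to the O-DF sensor paths expected to live in that room. The IFC
# file name and the O-DF paths are hypothetical placeholders.
import ifcopenshell

model = ifcopenshell.open("campus_building.ifc")   # hypothetical IFC export of the building

room_sensors = {}
for space in model.by_type("IfcSpace"):
    room_name = space.LongName or space.Name       # e.g., "Cafeteria"
    # Hypothetical O-DF paths served by the building's O-MI node.
    room_sensors[room_name] = [
        f"Objects/{room_name}/Temperature",
        f"Objects/{room_name}/CO2",
        f"Objects/{room_name}/Humidity",
        f"Objects/{room_name}/Occupancy",
    ]

for room, paths in room_sensors.items():
    print(room, "->", paths)
```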

Fig. 4. Screenshots of the current building lifecycle management system (Aalto Smart Campus)

As a final step, Table 2 sums up the set of features – based upon the specifications resulting from the Level 1 QFD matrix (see Fig. 3) – that have been fulfilled in the first release of our building lifecycle management software (see the column “Today’s System features” in Table 2), as well as those that will be addressed through the H2020 bIoTope project and integrated in later releases. It can be observed that this first software/system release addresses the specifications that received the highest priorities with respect to the initial requirements (note that the specifications in Table 2 are listed in order of priority).

Table 2. Current vs. Future building lifecycle management system features

5 Conclusion

It is still challenging to close the information loop between the multiple phases of a building’s lifecycle (e.g., by capturing information and knowledge from MoL and re-using it in BoL and/or EoL processes), even though doing so opens up opportunities for enhanced decision-making and cost savings (e.g., in facilities management). Our research addresses this lack of a closed-loop system by developing an open, interoperable and integrated Web-based building lifecycle management system that integrates efforts, directives and technological enablers from both the AEC and the PLM/IoT sectors. This paper gives insight into the requirement engineering framework that has been set up for the development of the system components, as well as into the first proofs-of-concept of the system implementation (running on two distinct campuses).