More than half of the world’s population currently lives in urban areas [1]. Cities have a multifaceted role in societies, and their design is important in enabling vibrant and sustainable environments for their users [2]. Our world generates data at an unparalleled pace, with data coming from diverse sources and devices, from anywhere and at any time [3]. As more data become available, opportunities for evidence-based decision-making, and frameworks to guide it, emerge [4].

Nevertheless, interpretation of data and related knowledge extraction is a key challenge for decision-makers in the design of buildings and places [5, 6]. The overwhelming growth and continuous production of unstructured data significantly affect the way people understand and communicate new knowledge from data [7]. The increased amount and availability of generated information, often referred to as Big Data (BD), and the improvement of new technologies and analytical techniques help businesses to develop strategies, gain insights, and improve their decision-making processes (DMPs) in various sectors, such as healthcare, energy, infrastructure, and construction, to name but a few [3, 8]. For example, data from transportation and mobility can be analysed by means of tracking technologies that denote the spatial positions of moving entities, supporting the DMPs of traffic managers and urban designers [9].

Harris [10, p. 1, para. 2] provides two definitions of decision-making:

“Decision-making is the study of identifying and choosing alternatives based on the values and preferences of the decision maker”; and

“Decision-making is the process of sufficiently reducing uncertainty and doubt about alternatives to allow a reasonable choice to be made among them”.

Decision-making associated with urban design processes is challenging, not least due to data processing and the identification of metrics. Although the experience of a city is a subjective notion, the characteristics shaping its quality are mainly objective, such as its physical aspects: air pollution levels, transportation and mobility, green spaces, and others [11]. However, urban designers and architects still rely on their expertise and precedent information when making design decisions. The complexity of the decision-making process introduces differing levels of expertise and worldviews, which can be inherently subjective (biased). All these challenges combined make “quality decision-making” difficult.

This study aims to investigate, quantify, and rank the relative importance of the decision-making factors contributing to the design of buildings and urban spaces. The key influences affecting DMPs are gathered from a literature review, and two research hypotheses are tested via analysis of the collected survey data: 1) the quality of decision-making varies across different roles; 2) the earlier the collection of information, alternatives, values, and preferences happens, the better and more informed the resulting decisions will be.

The findings of this research are beneficial to design and construction organisations, practitioners, researchers, and stakeholders in understanding the factors affecting DMPs. This study provides a new means to evaluate the performance of decision-making processes, when they are undertaken, by developing and applying a quantitative data-driven, evidence-based methodological framework. The analysis builds an applicable framework used to evaluate the importance of 32 variables in total, quantifying how designers make decisions and capturing the social aspects introduced by technological advances and increasing data availability, so that their use in the design process can be evaluated and optimised.

Decision-making in building and urban design

The design of a building or its surroundings is based upon a synthesis of ideas constrained by multiple parameters [12, 13]. A “design team”, comprised of various specialists such as architects, civil and structural engineers, traffic managers and others, and often referred to as a “multi-disciplinary” team, is assigned at the start of every project [14]. These specialists develop a process of communication among the individuals and within the diverse groups. Each group is subject to several influences which determine the distribution of the individual communication patterns. The patterns of communication, and hence the final decisions, are influenced by four variables: the type of decisions (decision criteria); the individual who makes the decisions (the decision-maker); the time and circumstances (the decision environment); and the time and decision support required while making the decisions.

Since the 1980s, there has been a growing emphasis on decision-making, with early efforts to understand human information processing in the task of designing [15, 16]. These first studies revealed that design processes were characterised either by comparing alternatives to criteria, such as requirements or constraints, or by evaluation, which shaped the next steps. In parallel with these experiments, efforts began in designing computer-based tools to process design information and better understand designers’ intent, such as the rationale behind their decisions [17, 18]. Eventually, due to the multiple factors revealed and the associated complexity, these experiments were terminated. Based on this experience, Ullman and Paasch [18] defined design as “the evolution of information punctuated by decision-making”.

In that same period, the concept of the “Knowledge Society” also began to emerge, in recognition that information alone would not bring about important change [19,20,21]. Rather, the key to effective decision-making is how people transform information into knowledge and subsequently manage that knowledge [22]. Therefore, it can be argued that the potential value of information is revealed when it is leveraged to drive DMPs. Increased amounts of information and advanced BD analytics can reveal insights, enabling improved decision-making in critical development areas, such as the design of a city.

Decision makers

The seminal work of Pettigrew [23] posed the question as to whether the decision problem itself shapes the process to a greater extent than the organisational context through which the process progresses. Rajagopalan et al. [24] suggested prioritising examination of the degree to which variations in DMPs are driven by variations in organisational, environmental, and managerial factors. Adding to this complexity, BD now introduce new challenges concerning methodologies, technologies, and tools, with the decision-maker present at each stage of BD analysis [25]. More recent studies present improved environment-related decision-making as an approach incorporating intentional processes by which specialists (e.g., ecologists), stakeholders, and decision-makers (DMs) work collectively [26]. Researchers indicate that planning and design decisions are by nature complex and subject to conflict due to the increased number of stakeholders involved [27, 28]: stakeholders with differing preferences and value judgements, operating at several levels.

Individuals affect the outcome of decision-making. Contributing factors include cognitive biases, previous experiences, level of commitment, values, and beliefs [29,30,31]. Cognitive biases represent thinking patterns, based on observations, that lead to memory errors or inaccurate judgements [32]. When cognitive biases influence decision-making, information perceived as uncertain may be dismissed on the basis of observations and prior knowledge. Although this may consequently lead to poor decisions, in many cases it may also enable individuals to make efficient decisions with the assistance of heuristics [33].

Nevertheless, there are internal analytical processes in DMPs that can lead to differing interpretation and assimilation, as well as differing speeds of decision-making, even when these are based on the same data [34,35,36]. Confidence in making choices, and the level of risk a decision-maker is willing to take, vary. In some cases, the age of the decision-maker can prove significant, as described by Finucane et al. [37], who note the decline in cognitive functions as age progresses and the associated potential overconfidence in the ability to make good decisions [38]. All these factors relating to the individual, such as beliefs and values, bring additional perspectives to a decision-making process [39].

The decision environment

An additional influential parameter is the decision environment. Every decision is made within an environment, defined as “the collection of information, alternatives, values, and preferences available at the time of the decision” [10, para. 12]. Qiu et al. [40] established that diverse stakeholders may have similar preferences as to what constitutes good-quality urban green space, with differences in certain elements within these spaces. For example, a study of ecology students found that they had a higher tolerance of the view of natural processes, such as decay, compared to designers [40]. Another study revealed that providers and researchers involved with landscape planning appreciated the provision of rare species, which proved of less value to local users, such as people who live in the area or were regular users of the spaces [41]. An earlier study also established that although local users are more likely to visit formal or well-designed “artificial” spaces, the providers involved with landscape planning emphasised the presence of natural green spaces [42]. Further to these parameters, social pressures, such as the approval or disapproval of people in one’s close environment, such as friends and colleagues, may affect decision-making and the ranking of factors in a given situation [43].

Although these differences are present, the reasons for their existence are still not clarified, given that an individual’s preferences are shaped by a wide range of subjective factors [44, 45]. The decision environment itself can be of great complexity. Changes in the decision environment impact DMPs, introducing risk and uncertainty. “Risk” in decision-making implies that the possible outcomes of a decision are known, or can be worked out, based on the probability of each outcome. As the decision environment continues to expand with the integration of new information and alternatives, decision-making occurs as close to its deadlines as possible [10]. This presents benefits, such as the emergence of more information and alternatives; however, it also presents several risks, such as the decision-maker feeling overwhelmed by the breadth of available information.

Therefore, subjectivity forms an important component of decision-making, and hence there is the risk of the decision-makers’ preferences changing throughout the process, leading either to more informed decisions or equally to poorer ones [46,47,48,49,50]. However, there is still limited research into how decision-makers can make more informed decisions without the associated risks.

Decision criteria

The criteria affecting decision-making are established and used to evaluate alternative courses of action in DMPs, and they in turn affect the outcome of a decision. Different criteria suit different situations. In group decision-making, decision-makers will explore the use of the criteria, for example by applying different weightings, while the specific criteria chosen can help identify areas of exploration.

To minimise the risks involved, the creation and prioritisation of metrics for the decision-making process itself is needed [51]. Hansen and Ahmed [52] reviewed the existing literature on design decision-making, concluding that the design process constitutes a series of decisions taken repeatedly. They presented a conceptual model in which all decision episodes are recorded within a node, based on four distinct phases: Evaluation, Validation, Navigation & Unification, according to the evaluation and decision-making activities. Later work identified decision-making as a non-linear process, in which most decisions are made by revisiting the choice of criteria and the alternatives multiple times throughout the process, and are formed in the context of other decisions rather than in isolation [10]. Janssen et al. [53] identified 11 factors influencing decision-making based on BD, via interviews and a literature review; the key factors identified included process transformation and integration, development of skills, data quality, flexibility of systems, collaboration, knowledge exchange, and decision-maker quality.


Decisions must be made within a specified time period and a set of circumstances. Decision-making has been characterised by Harris [10] as a non-linear process, in which most decisions are made by revisiting the choice of criteria and the possible alternatives multiple times. Heuristics are a common decision-making strategy permitting decision-makers, working with little information, to arrive at a correct and viable decision. Shah and Oppenheimer [54] argue that heuristics reduce the work of decision-making: as mental short-cuts, they diminish the work of retrieving information and streamline the process by reducing the amount of integrated information to that necessary for making a choice.

Decision-makers rely on heuristic approaches for convenience and speed. An important example is the recognition heuristic: if only one of two options is recognisable, people tend to choose the recognised one, arriving at a decision with the least amount of effort and information [55, 56]. BD enrich the content and scope of any DMP by presenting a plethora of new information, in many cases acquired in real time. However, solutions need to be implemented to handle and extract knowledge from BD, and decision-makers need to be able to gain valuable insights from rapidly changing information [57]. Hence, BD approaches add to the complexity, effort, and time required to reach a decision, further complicating the role of the decision-maker.

Decision support – theories, tools, and techniques

As settings vary, supporting activities may influence stakeholder perspectives both positively and negatively; it is therefore necessary to have mechanisms in place to support decision-making effectively within the design process. A wide variety of risk frameworks have been produced to date to guide decision-makers through various DMPs in a clear, transparent, and replicable way [58,59,60]. Whilst not fully removing subjectivity and biases, they can help, on the assumption that the decision made is fully or boundedly rational [25, 57]. Quantification and standardisation of the quality of decision-making can assist in improving the processes and reducing the existing quality gaps. Additionally, specific skills are required to judge the urgency of DMPs at each stage of the process using the available resources, such as prioritising and recognising the potential benefits or costs of a choice via data analysis [61].

Materials and methods

Methodological framework

For the purposes of this study, the authors developed and applied a quantitative data-driven, evidence-based methodological framework utilising qualitative information (Fig. 1). A questionnaire survey was conducted to explore the stakeholders’ perception in relation to decision-making within the design process and to identify the factors influencing DMPs. The data extracted from the questionnaire survey was analysed using Exploratory Factor Analysis (EFA), Average Relative Importance Index (ARII) and Spearman Rank Correlation Coefficient Test (rs) methods [62,63,64,65]. These methods are further explained in the following sections.

Fig. 1
figure 1

Overview of the proposed methodological framework

Empirical data collection and questionnaire design

The chosen data collection method serves the purpose of gathering data from a large sample on a global scale [50]. Furthermore, questionnaire surveys offer internal/external validity and, when properly constructed, ethical advantages [51, 66]. In addition, questionnaire surveys are proven to provide trustworthy results when it comes to evaluating variables [52, 53].

In an exploratory survey of 136 participants, stratified by their involvement in the design DMP, decision-making quality was measured using an online questionnaire. A pilot questionnaire was issued, completed, and tested by 13 stakeholders prior to the commencement of the main survey. The questionnaire was designed for all levels of designers and project managers within the construction industry, and its structure was based on typical decision pathways and the rational model [10]. Thirty-eight questions formed four distinct categories: Part 1, questions concerning the respondents’ role in the organisation and in the team; Part 2, evaluative questions concerning decision-making quality; Part 3, questions concerning the recency of tools to improve quality; and Part 4, potential for dynamic operation. The variables of Parts 2, 3, and 4 are presented below in Table 1 and Table 2.

Table 1 Review of variables (individual questions asked)
Table 2 Additional questions included in the online survey

Participants, representing different levels of hierarchy, roles, and experience, responded to each category using a five-point Likert-type scale (1–5), chosen because several researchers indicate that it eases interpretation of the results [54,55,56]. In addition, the 5-point scale enhances the positive emotional limit for respondents, who are asked to evaluate the significance of pre-defined statements; this approach was therefore selected to encourage respondent participation [67]. Due to the selection of a specific domain (the building and places design industry), this research did not follow population-based sampling (a random group of participants) but used convenience sampling (participants targeted on the basis of professional relevance and willingness to participate) [68]. This method was selected to avoid the complications of dealing with a randomised sample and to aid in obtaining basic data and trends relevant to DMPs within the design process.

Three questions were asked at the end of the questionnaire to gather additional insights into the application of Data Driven Innovation (DDI) processes (Sect. 4.3) (Table 2). Participants were asked questions targeting their understanding of where the application of these processes would be useful and their perception of when and how this could be applied. Participants had the opportunity to select more than one answer, and no ranking was required. Finally, participants were given the opportunity to add any additional comments at the end of the questionnaire; these have been used via direct quotation to provide detailed understanding of key themes [15].

Finally, Q22 was initially included in the questionnaire survey to gather insights into the types of tools decision-makers use in the design process for the built environment and its surroundings. Q22 was a multiple-choice question; however, it was excluded from the analysis due to the lack of valuable responses and insights.

Participant profiles were identified through LinkedIn and consultancy organisations, following a direct approach. In total, 1,500 invitations were sent to the selected experts (stakeholders), inviting them to complete the online questionnaire. The survey was conducted from December 2019 to March 2020.

Respondents’ profile

The questionnaire survey was issued to the selected stakeholders (respondents): experts consisting of academics and industrial practitioners with previous experience in the fields of project management, construction, and design. Usable questionnaires were returned by 136 respondents, comprising 14 stakeholder categories. The stakeholder categories, along with the respondents’ professional backgrounds, are summarised in Fig. 2. The identified stakeholders were located worldwide, to obtain a diverse sample and allow representative conclusions, based on the following criteria: frequency of decision-making; membership of the stakeholder groups; and level of experience in the use of DDI.

Fig. 2
figure 2

Respondent’s professional background reflecting their organisational role and their project roles

Missing data

For the purposes of this study, missing values are assumed to be Missing Completely At Random (MCAR) [69]. The assessment of the structure of missingness considers two factors: (1) the percentage of missingness and (2) the missing-data imputation method. As described by Chen et al. [70] and Dong and Peng [71], the choice of imputation method has no significant impact when the percentage of missingness is low. In this study, missing values constitute less than 5% of the total sample; therefore, listwise deletion (fully deleting incomplete cases) was performed.
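Listwise deletion, as described above, simply drops every respondent with at least one unanswered question. A minimal sketch, using hypothetical responses rather than the study's data:

```python
# Each row is one respondent's answers; None marks a skipped question.
rows = [
    [3, 4, 5],
    [4, None, 4],
    [None, 3, 4],
    [2, 4, 5],
    [5, 3, 4],
]

complete = [r for r in rows if None not in r]  # listwise deletion
missing_pct = 100 * (len(rows) - len(complete)) / len(rows)  # share of dropped cases
```

Under MCAR, the retained complete cases remain a representative subsample, which is why the low missingness percentage matters.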

Data analysis and validation

The data extracted from the questionnaire survey were analysed with the statistical software Statistical Package for the Social Sciences (SPSS) from IBM [72]. An understanding of the underlying characteristics of multiple variables was required; therefore, Factor Analysis (FA) was chosen as the statistical method, due to the benefits this technique offers when dealing with multiple variables (32 in total) [73]. FA helps categorise these variables into distinct groups, which makes the analysis easier [74]. The chosen type of factor analysis is Exploratory Factor Analysis (EFA). This method is used to reveal the fundamental concepts among a large set of factors, thus helping to understand the underlying structure of complex data [62]. The key objective of EFA is to determine the minimum number of factors needed to reproduce the correlations among the observed variables. The application of EFA is relevant in this study as the majority of the variables involved cannot be quantified directly. The method therefore proves particularly useful here, as the qualitative approach is a fitting technique for collecting the data, while quantitative analysis supports improved reporting [75]. Variables such as best-practice strategies in DMPs need to be measured as observed variables.

Two tests were performed to identify whether EFA is a suitable type of analysis and whether the data sample is appropriate: the Kaiser–Meyer–Olkin (KMO) test, to determine sample sufficiency, and Bartlett’s sphericity test, to examine the variables’ relationships, adequacy, and sphericity [76]. The correlations of the variables result from the use of orthogonal (Varimax) and oblique rotations, which allow the analysed variables to correlate and finally decide the factor space [77]. The total number of factors was concluded through analysis of Kaiser’s criterion and the eigenvalues [78]. As indicated in Field’s [77] work, Kaiser’s criterion is more accurate when the sample size exceeds 250. Due to the lower number of responses in this study, Kaiser’s criterion and the scree plot were both considered.
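Bartlett's test of sphericity checks whether the correlation matrix differs significantly from the identity matrix (i.e., whether the variables are correlated enough for EFA to be worthwhile). A minimal sketch of the standard test statistic, applied to an illustrative two-variable correlation matrix rather than the study's data:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(corr, n):
    """Bartlett's test of sphericity: H0 = correlation matrix is the identity.

    corr : p x p sample correlation matrix; n : number of respondents.
    Returns the chi-square statistic and its p-value.
    """
    p = corr.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

# Illustrative correlation matrix (not the study's data), with n = 136 respondents.
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])
stat, pval = bartlett_sphericity(R, n=136)  # small p-value -> proceed with EFA
```

A p-value below 0.001, as reported in the study, means the null hypothesis of an identity correlation matrix is rejected.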

The reliability value (Cronbach’s alpha) and the KMO test were computed for all factors, following the approach of Karekla and Michaelides [79] and Almeida et al. [80], in which the unifactorial structure of factors is used to verify construct validity. However, when there is a small number of items in a scale (fewer than 10), Cronbach’s coefficient alpha values can be quite small. For the purposes of this study, both Cronbach alpha values and the mean inter-item correlation for the items were calculated.
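Both reliability measures mentioned above are simple functions of the item scores. A minimal sketch, using a hypothetical respondents-by-items matrix of Likert scores rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

def mean_inter_item_corr(items):
    """Mean of the off-diagonal entries of the item correlation matrix."""
    r = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    p = r.shape[0]
    return (r.sum() - p) / (p * (p - 1))

# Hypothetical 5 respondents x 3 items.
scores = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
alpha = cronbach_alpha(scores)
miic = mean_inter_item_corr(scores)
```

Reporting the mean inter-item correlation alongside alpha compensates for the downward bias of alpha on short scales.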

The participants expressed their opinions on the level of agreement with each variable, transformed into numerical scores with values from 1 to 5. For this type of analysis, the mean and standard deviation of the variables were not considered suitable for determining the overall ranking [81]. Therefore, a weighted average for each factor, divided by the upper limit of the measurement scale, was adopted to provide an importance index [33, 74]. ARII values were calculated for the defined factors, for reliable consideration of the collected data sample, using Eq. 1, initially including all ten factors and the average value for each factor (ARII) [63, 64, 82]. Following this, ARII values were extracted for the diverse stakeholder groups, divided by role within a team, role within discipline, and experience in DDI.

$$\mathrm{ARII}=\frac{{\sum }_{i=1}^{5}{w}_{i}{n}_{i}}{A*N}=\frac{{5n}_{5}+{4n}_{4}+{3n}_{3}+{2n}_{2}+{1n}_{1}}{5*N}$$
  • wi is the weighting of each response (1 = not important up to 5 = extremely important)

  • ni is the frequency of responses with weighting i (i = 1 to 5, based on the Likert scale)

  • A is the highest weight on the scale (here, 5)

  • N is the total number of responses.

The ARII value ranges from 0 to 1, with 0 not inclusive. The higher the value of ARII, the more important the criterion. The ARII value is mapped to a corresponding importance level using the transformation matrix proposed by Chen et al. [83], shown below.

  • High (H): 0.8 < ARII < 1.0

  • High-Medium (H-M): 0.6 < ARII < 0.8

  • Medium (M): 0.4 < ARII < 0.6

  • Medium-Low (M-L): 0.2 < ARII < 0.4

  • Low (L): 0 < ARII < 0.2
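Eq. 1 and the band mapping above can be sketched in a few lines. The response frequencies below are hypothetical, not taken from the study:

```python
def arii(freqs, a=5):
    """Average Relative Importance Index (Eq. 1).

    freqs[i-1] = number of respondents choosing Likert score i (i = 1..a);
    a = highest weight on the scale.
    """
    n_total = sum(freqs)
    return sum(w * n for w, n in zip(range(1, a + 1), freqs)) / (a * n_total)

def importance_band(value):
    """Map an ARII value to the importance levels of Chen et al. [83]."""
    bands = [(0.8, "High"), (0.6, "High-Medium"), (0.4, "Medium"),
             (0.2, "Medium-Low"), (0.0, "Low")]
    for threshold, label in bands:
        if value > threshold:
            return label
    return "Low"

# Hypothetical factor: 1 response of 3, 2 of 4, 2 of 5 -> ARII = 21/25 = 0.84
value = arii([0, 0, 1, 2, 2])
band = importance_band(value)  # "High"
```

The division by A·N guarantees the index lies in (0, 1], making factors with different response counts directly comparable.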

Additional analysis then followed with the Spearman Rank Correlation Coefficient Test (rs), to identify whether there is agreement or disagreement among the diverse groups in ranking the factors. This test is generally used to assess agreement on the relative importance of the identified factors [64]. Spearman Rank Correlation Coefficient (rs) values range between -1 and +1, where rs = -1 is a perfect negative correlation and rs = +1 a perfect positive correlation; values close to 0 indicate low or no correlation [84].
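Comparing two groups' factor rankings with Spearman's rs is a one-line call in SciPy. The rankings below are hypothetical, used only to illustrate the test:

```python
from scipy.stats import spearmanr

# Hypothetical rankings of 10 factors by two stakeholder groups (1 = most important).
group_a = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
group_b = [2, 1, 3, 5, 4, 6, 8, 7, 9, 10]

rs, p = spearmanr(group_a, group_b)  # rs close to +1 -> strong agreement
```

An rs near +1 indicates the two groups rank the factors in nearly the same order; a value near 0 indicates no systematic agreement.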

Preliminary analysis and factor extraction

From the 1,500 invitations, 136 responses were obtained, a response rate reflecting a confidence level of 95% with an 8% margin of error. In addition, Vishwakarma [85] and Field [77] indicate that a minimum sample of 100 responses is required for extracting useful results; the sample therefore meets the minimum size threshold for further analysis. Initial analysis showed a KMO value of 0.599, and Bartlett’s sphericity test was significant (p < 0.001), which reveals that although the sample size is not adequate, the multicollinearity among the variables is at an acceptable level. To better understand why the KMO value was less than 0.60, the anti-image correlations for the pairs of variables (the individual questions asked) in the anti-image matrix were considered, revealing a few variables with values lower than 0.5, the satisfactory limit [77]. The variables with values lower than 0.4 were removed and the tests were performed again with 30 variables in total. The variables removed were Q18 and Q30 (Table 1). The resulting KMO value is 0.627 (KMO > 0.6 is acceptable) and Bartlett’s sphericity test was significant (p < 0.001) (Table 3). Therefore, the EFA technique can be employed.
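The reported 8% margin of error can be checked with the standard formula for a proportion at 95% confidence, with a finite-population correction for the pool of 1,500 invitees and the conservative assumption p = 0.5 (the study does not state its exact calculation, so this is a sketch):

```python
import math

# Margin of error at 95% confidence with finite-population correction,
# assuming the worst-case proportion p = 0.5.
N, n, z, p = 1500, 136, 1.96, 0.5
fpc = math.sqrt((N - n) / (N - 1))                 # finite-population correction
margin = z * math.sqrt(p * (1 - p) / n) * fpc      # ~0.08, i.e. 8%
```

The result of roughly 0.08 is consistent with the 8% margin of error reported for the 136-response sample.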

Table 3 Results for KMO and Bartlett’s Test indicating that EFA technique can be employed

The results of the analysis showed that the majority of the variables were of high importance, with 7 of lower importance, indicating the valuable contribution of all 30 variables to the DMP. In this study, the communalities of all variables (h2), which represent the proportion of a variable’s variance that can be accounted for by all extracted factors, are between 0.507 and 0.813 [86].

Although Kaiser’s criterion and the scree plot (Fig. 3) support the choice of 8 factors as the most appropriate, the cumulative variance for 8 factors does not meet the 60% threshold [78] (Table 4). Therefore, the authors used 10 factors, for which the cumulative variance is 63.160%, which is acceptable. The naming of the 10 factors was formulated by the authors to reflect the strategic perspective of the DMPs while encapsulating the underlying concepts of the variables included in each factor.

Fig. 3
figure 3

Scree plot indicating the choice of eight factors (component number) as being the most appropriate

Table 4 Summary results of EFA: variables of decision-quality in DMP processes and their internal consistency

Factor loadings show the correlation of the factors (factor dimensions) with the variables [87]. The higher the factor loadings, the higher their significance for DMP quality [74]. For the sample size of 136, the minimum factor loading retained is 0.423. Re-running the tests confirmed a final KMO value of 0.638, and Bartlett’s sphericity test remained significant (p < 0.001).

Following the oblique rotation, only a trivial correlation among the extracted factors was noticed, which led to the performance of the orthogonal (Varimax) rotation, which transforms the initial variables into new ones that can be interpreted (Table 4). The overall reliability analysis was also performed; Cronbach’s coefficient alpha has a value of 0.717, regarded as acceptable [88].

Questionnaire content validity

The content validity of the questionnaire survey was based on the variables identified in Table 1, which is further assessed [89, 90]. Cronbach alpha values and the mean inter-item correlation for the items were calculated and reported in Table 5 [91]. According to Cohen [92], an inter-item correlation between 0.10 and 0.29 indicates a weak correlation, between 0.30 and 0.49 a medium correlation, and between 0.50 and 1.00 a strong correlation. However, according to Cristobal et al. [93], items with a corrected item-total correlation value as low as 0.20 are acceptable for an exploratory study.

Table 5 Factors of decision-making quality in DMP processes’ validity

Average relative importance index ranking from factor analysis

The average relative importance index (ARII) was calculated for each factor, and the findings are presented in Table 6 [74]. ARII values indicated that the Potential for dynamic operation factor has the highest ranking, of High importance, followed by Recency of Tools, with scores of 0.82463 and 0.82451, respectively. Nevertheless, due to the diverse influencing factors described earlier, a breakdown of the ARII values was further considered, based on previous experience, discipline, and team role.

Table 6 Average relative importance index (ARII) for each factor and overall ranking


The influence of previous experiences, disciplines, and team roles on decision-making processes

A comparison among the levels of experience, disciplines, and team roles has been conducted to better identify the influence of these parameters on DMPs. The comparison was performed by calculating the ARII values for each of these groups.

Previous experience

A comparison of the ARII values between levels of experience has been conducted, and the findings are shown in Table 7. The results indicate that Instinctiveness and Social Resistance ranked relatively high for the non-experienced users, in contrast to the experienced users, for whom Instinctiveness ranked close to 0 (~0.013). The experienced users in DDI considered Control the most important factor, with an ARII value of ~0.79, while the non-experienced users ranked Recency of Tools highest, with a score of ~0.82.

Table 7 Summary results of ARII values and the factors’ ranking among stakeholders with different levels of experience in DDI implementation

The questions and results of the relevant variables regarding the Control factor are shown in Fig. 4. The ranking is a five-point Likert scale asking individuals to rate their responses, from very infrequently or never (1) to very frequently or always (5). Due to the negative wording of the Q12 and Q16 variables, and based on the results of the factor analysis, the Likert scale for these items has been inverted to very infrequently or never (5) and very frequently or always (1). Results indicated that, owing to their previous experience, participants are confident in planning ahead and enjoy the DMPs. Control is one of the factors to be considered when dealing with non-experienced decision-makers.
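The reverse-coding applied to the negatively worded items (Q12 and Q16) is the standard transformation for a p-point Likert scale, score ↦ p + 1 − score; a one-line sketch:

```python
def invert_likert(score: int, points: int = 5) -> int:
    """Reverse-code a negatively worded Likert item: on a 5-point scale,
    1 <-> 5, 2 <-> 4, and the midpoint 3 stays 3."""
    return points + 1 - score
```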

Fig. 4

Variables of Control factor for experienced users only (inclusive of replies with ranking of 4 and 5). (1 = Very infrequently or never and 5 = Very frequently or always)

Another interesting aspect for experienced decision-makers in the use of DDI was that the Hesitancy factor ranked high, as of high-medium importance, in third position with a value of ~ 0.7782. In more detail, a high number of participants rated their answers to questions Q11 and Q14 (Table 1) as sometimes (3), with percentages of 39.1% and 65.2% respectively. In addition, equal proportions of respondents (26.1% each) rated the Q11 variable as infrequently (2) and frequently (4), while 8.7% replied very frequently or always (5), indicating that experienced decision-makers in the use of DDI are more willing to follow a riskless scenario.

For both types of decision-makers, the Potential for dynamic operation factor ranked high; it was classed as of high-medium importance for the experienced ones, with an ARII value of ~ 0.7935, and as of high importance for the non-experienced ones, with an ARII value of ~ 0.819. Results including all types of decision-makers were similar, with more than 79.5% agreeing with the beneficial use of data and new technologies within the DMPs of design. More specifically, Q32 (Table 1) had the highest percentage of agreement, with 84.6% agreeing and only 1.47% strongly disagreeing that the use of such types of data and technologies would improve DMPs in the design industry.

Non-experienced users ranked Recency of Tools as the most important factor, and details of the included variables are shown in Fig. 5. The ranking is a five-point Likert scale asking individuals to rate their reply from very infrequently or never (1) to very frequently or always (5) for variable Q23, and from strongly disagree (1) to strongly agree (5) for Q24 and Q25. Despite their lack of experience, the online questionnaire results show that participants feel confident using the digital tools provided to them, and 72.6% agree that the design process is digitalised to optimise design, and therefore DMPs, in the design of buildings and their surroundings. Interestingly, the views of the non-experienced decision-makers in DDI are positive, and they feel that a digitalised design process assists DMPs in terms of best outcome or effort and time spent. In addition, since Potential for dynamic operation is of high importance for this specific group of participants, it indicates that decision-makers involved with the design process have experienced several improvements.

Fig. 5

Variables of Recency of tools factor for non-experienced users of DDI only (inclusive of replies with ranking of 1,2 and 3). (For Q23: 1 = Very infrequently or never and 5 = Very frequently or always, For Q24 & Q25: 1 = Strongly disagree and 5 = Strongly agree)

Even though there is a positive overall feeling towards DDI and its potential to improve DMPs, several participants raised some of the most controversial issues when dealing with real-life situations and the nature of decision-makers as individuals. Below are some of the comments extracted from the data, from an experienced DDI user and a non-experienced one respectively.

“Completely digital design is very possible in the future. Will it lead to better decision-making regarding real-life implementation? Most probably, it will hardly affect it at all” – Experienced DDI participant.

“Tools may be useful to answer some questions. However, I am not sure about it being strategic. Decision making is deeply personal and bias[ed] in my experience. At the end of the day, data can be misrepresented to support what we would like to see and hence unreliable. Can we trust data presented?” Non-experienced DDI participant.


Discipline

Similarities in terms of ARII values arise among people belonging to disciplines that are heavily involved with the design process and its planning. Results from the different disciplines within building design and its surroundings are displayed in Table 8 below for the total respondents among architects, engineers, and project managers.

Table 8 Summary results of ARII values and the factors’ ranking among stakeholders with different roles within the company structure (discipline)

The results indicate that the five highest-ranking factors for the groups of architects and engineers are the same, although with different overall rankings, while for the project managers four of the five categories remain the same. These are: Potential for dynamic operation, Control, Recency of tools, Thoroughness and Social resistance. The Social resistance factor, although it remains of high importance for the other two groups, is replaced in the PM category by Instinctiveness. Social resistance ranks eighth and is considered of high-medium importance for project managers, in contrast to the other disciplines, where it ranks among the five most important factors (Table 8).

The variables considered in the Social resistance factor indicate the need to belong to a wider team or for close collaboration (Fig. 6). The overall low score for this factor may result from differences in decision-making processes compared with the other disciplines; for example, when only one project manager, rather than a team of people, is assigned to monitor progress on a project, or when the sustainability of the project requires practical solutions.

Fig. 6

Variables of Social Resistance factor against Architecture, Engineering and Project Manager disciplines. (1 = Very infrequently or never and 5 = Very frequently or always)

Role within a team

The relative influence of design team members varies over the course of the design process, and depending on their specific roles, the factors that influence their decisions also vary. For the purposes of this study, three key categories of design-team members have been identified: Members of the team (MoT), Lead designer roles (LD) and Project manager roles (PM). The project manager role is distinguished from the member of the team because of the different responsibilities associated with it, the team structure, and/or because it is not as heavily involved with the design as the rest of the disciplines.

ARII values have been calculated focusing on the key design-team categories and are displayed in Table 9. As expected, MoT and LD showed similarities for most of the factors. For MoT, Recency of Tools scored higher than the rest of the factors, while for LD the Potential for dynamic operation factor holds first place. For both groups the score is 0.84 (high importance).

Table 9 Summary results of ARII values and the factors’ ranking among stakeholders with different roles within a team

Both MoT and LD are responsible for the deliverables of the project, the former for producing the material and the latter for the quality of what is delivered. Recency of Tools includes three variables, most of which are closely related to the decision support methods concerning the use of tools and techniques for achieving the end-goal: the delivery. The Potential for dynamic operation factor is relevant to the requirements and insights for increased quality of the deliverables and the future of delivery. Decision-makers in leadership positions are responsible for client-facing communication, while for MoT roles this is not a given.

Results for PM revealed that the factors of Control, Potential for dynamic operation and Recency of Tools are equally important and considered of high importance, with an ARII value of 0.81 for all three. Significant differences can be observed in the individual questions posed to the participants for the Control factor, revealing the key differences in the DMPs for PM roles (Fig. 7). Planning ahead, quality of delivery and future potential are considered the driving parameters for this specific role. Results imply that these differences are due to the nature of the PM role, which is to ensure submission of the deliverables and their quality while at the same time managing client interactions and future collaborations.

Fig. 7

Variables of Control factor against MoT and PM roles. (1 = Very infrequently or never and 5 = Very frequently or always)

Level of agreement among the stakeholders’ perceptions

The Spearman’s Rank Correlation Coefficient was calculated for the different views of stakeholders and the overall ranking (Table 10). The stakeholder categories tested were: Project Manager roles (PM), Lead Designers (LD), Member of team (MoT), Project management discipline (PMD), Engineering discipline (ENG), Architecture and Master planning discipline (A&MP), Experienced in DDI (EXP) and Non-experienced in DDI (Non-EXP). The test returned high coefficient values, ranging between 0.818 and 0.988, implying a strong positive correlation among the rankings of the categories and a high level of agreement among the groups, supporting the consistency, validity, and reliability of these findings. The only low correlation value was observed between the Non-EXP and EXP stakeholder groups, where rs = 0.176, indicating that they have different perceptions of DMPs in design, as also observed in the ARII analysis. Nevertheless, it should be noted that only 17% of the respondents had experience in DDI processes; therefore, further investigation may be required for that specific group.
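For rankings without ties, the coefficient reported in Table 10 can be computed with the standard formula rs = 1 − 6Σd² / (n(n² − 1)), where d is the difference between the two ranks assigned to each factor; a minimal sketch:

```python
def spearman_rs(ranks_a, ranks_b):
    """Spearman's rank correlation for two untied rankings:
    rs = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    n = len(ranks_a)
    d_sq = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))
```

With tied ranks (as can occur when ARII values coincide), the Pearson correlation of the rank vectors should be used instead, e.g. via `scipy.stats.spearmanr`, which handles ties automatically.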

Table 10 Spearman’s Rank Correlation Coefficients among Stakeholders within a team, discipline, and experience levels.

Data Driven Innovation processes potential for implementation

Due to the wide range of sectors and their differences in DMPs, a better understanding of the key stakeholder groups’ perceptions of the application of DDI processes is required. The key questions included in the online questionnaire are listed in Table 2.

Participants had the opportunity to select more than one possible answer and no ranking was required. The highest frequencies for Q29 were recorded for the design sectors of (a) Residential, (b) Master-planning and (c) Commercial, with scores varying from 52 to 56 recordings, while the lowest were recorded for (d) Aviation and (e) Landscape design. The rest of the sectors in the given list (Urban planning, Rail and Education) also had relatively high frequencies, ranging between 43 and 47 responses. Recommendations for additional sectors to be considered, such as Healthcare, Hospitality, Automotive, and Institutional, were made by 13 respondents overall.

Following that, two additional questions formed part of this study. Q35 captures participants’ perception of using only digital design processes for DMPs, while Q36 investigates stakeholders’ feelings about combining the digital design process with data analytics approaches. Frequencies for Q35 and Q36 are shown in Fig. 8 and Fig. 9 respectively.

Fig. 8

Frequencies of Q35 capturing participant responses for a completely digital design process

Fig. 9

Frequencies of Q36 capturing participant responses for a completely digital design process combined with data analytics

Regarding Q35, the majority of participants felt that digital and conventional methods should both be employed, with a large number of participants expressing a positive feeling about the complete application of digital design processes for the built environment and its surroundings. A lower percentage of participants felt that the application of digital design processes might limit creative thinking, while several felt that relying on digital ways of designing limits design skills.

Similar results were recorded for Q36, with the majority of respondents having a positive feeling overall. A significant number of participants felt that this could not be applied immediately but could be possible in the future, while others felt that this approach requires more sense of scale. For both variables, some participants’ perceptions were negative, or they stated that they were not interested in being part of these applications; they also felt that this may result in the complete loss of specific professions involved with the design process. Another important parameter raised relates to the cost of these applications and the need for the processes to be standardised and linked to the design, avoiding unnecessary complications.

Some of the participants noted the need for DDI processes and their tools to work in collaboration with designers, rather than digitising the design processes completely. More specifically, one of the respondents noted:

“Digital design is ultimately part of a larger transition in terms of how we undertake projects using such tools alongside our foundation of knowledge, culture and values. Data driven processes will become more embedded in our work in the near future and will be a matter for the designer or architect to decide how to utilise this information and which tools to use...” - Participant Non-experienced in DDI

In addition, some of the participants felt that digital tools and their processes are simply a way to move towards data-driven and evidence-based approaches, rather than a means of letting the tools make the decisions for the designers. Finally, one contributing comment in this direction came from a participant non-experienced in DDI:

“I believe that good designers will instinctively make good design decisions and that digital design tools and data analysis can be used to evidence these ideas and refine them to produce better outcomes”. – Participant Non-experienced in DDI


Discussion

Qualitative information obtained from the stakeholders through the online questionnaire survey assisted in understanding their beliefs and attitudes when a decision is to be made in the design of buildings and places. Evidently, the factors identified are strongly correlated, suggesting that the variables are significantly associated with each other.

The ARII values and factor rankings calculated for the three categories of decision-makers are displayed in Tables 7, 8 and 9. Results indicate that for all categories four factors are always ranked as highly important: Potential for Dynamic Operation, Thoroughness, Recency of Tools and Control. The lowest score among these factors was recorded for Control for the engineering disciplines, while the highest was recorded for Potential for Dynamic Operation for the same group. The overall ARII ranking places Potential for Dynamic Operation as the most important factor, a position shared with Recency of Tools. Recency of Tools is also considered of high importance for all categories, ranking within the top five factors, except for the group experienced in DDI processes. The two factors ranking last were Hesitancy and Experience. The Social resistance factor is among the five most important factors for five of the nine overall categories; its ARII values illustrate that this factor is important for some groups but not for others. These findings imply that there are some principles common to all types of stakeholders within the design of buildings and places. However, the overall ranking of the identified factors varies significantly amongst roles, disciplines, and experience levels, implying that there is a constant need to better understand the DMPs in each project.

An interesting implication is that, of a sample of 136 respondents, only 17% had previous experience with DDI, based on weighting values of 4 (Piloting DDI) and 5 (Effectively using DDI). This finding indicates the lack of specialised skills within the industry and the need for training to enhance understanding of DDI processes, rather than using new technologies blindly. More specifically, one of the respondents who had rated their DDI experience with a weighting of 2 (Considering DDI) noted:

“Lack of uptake in DDI across the industry limits its benefit. Lack of understanding of software, process and resources limits current application and integration in design process.”

Another participant who ranked DDI experience with a weighting of 1 (No experience) also added a similar comment (partial extract to reflect the relevant observation):

“Digital Tools and engineering methodology are the future for construction design processes, but staff training is extremely important to ensure this vision is fully realised.”

The findings imply that DMPs are influenced less by a decision-maker’s role within a team than by the discipline they belong to. Results revealed that although DMPs present some differences between the A&MP and Engineering disciplines, the greatest difference is observed between the Project Management and A&MP disciplines. This generally reflects the difference in end-goals between the two disciplines and their perceptions of what the successful design of a project entails, which can heavily influence the choices made along the design process, i.e., delivery on time and on budget versus aesthetics and layout efficiency.

Comparison of the ARII values between experienced (EXP) and non-experienced (Non-EXP) DDI users illustrates the differences in stakeholders’ views. One of the key observations concerns Instinctiveness: although it is generally considered of high-medium importance, for experienced DDI users this factor has a score of ARII = 0.013. This finding indicates the reliance of experienced users on their analysis outcomes and evidence-based methodologies, rather than on instinct, to make a decision. In addition, although the impact of the Hesitancy and Experience factors is generally considered low within the rest of the stakeholder categories, for EXP these factors ranked third and fourth respectively. Similarly, although the Social Resistance and Principled factors are generally valued by the stakeholders, for EXP their ARII values are low. These findings illustrate the differences in the ways of thinking of EXP and Non-EXP, revealing that experienced DDI users rely heavily on information to make a design decision, while non-experienced users value others’ opinions and prefer working in a collaborative environment, where decisions can arise from workshops and discussions. Nevertheless, the results regarding the Hesitancy and Experience factors imply that EXP are conservative in their decision-making process, while Non-EXP are open to exploring and altering their initial favourable option.


Conclusions

The aim of this research was to investigate, quantify, and rank the relative importance of the decision-making factors contributing to the design of building and urban projects. To achieve this aim, a survey was conducted to gain insight into stakeholders’ perceptions of the factors influencing decision-making processes in the design of buildings and places. Using Exploratory Factor Analysis, the Average Relative Importance Index and Spearman’s Rank Correlation Coefficient, the research identified the key factors influencing DMPs in design and their relative importance. This study provides a new means of evaluating the performance of decision-making processes, as they are undertaken, by developing and applying a quantitative, data-driven, evidence-based methodological framework. The analysis built an applicable framework that was used to evaluate the importance of 32 variables in total, quantifying the way designers make decisions and capturing the social aspects introduced by technological advancements and the increasing availability and use of data in the design process, so that the process can be evaluated and optimised.

Four highly important and distinct factors were generated: Potential for Dynamic Operation, Thoroughness, Recency of Tools and Control. This study revealed that DMPs vary significantly among decision-makers and are heavily influenced by their individual characteristics. Hence, there is an urgent need to identify how this process is undertaken each time, leading to optimised decisions. In addition, it has been revealed that although DDI processes are generally received by individuals as a positive addition to DMPs, their full adoption is not yet achievable due to the key practical and cultural barriers identified. The current analysis can help practitioners to assign the most fitting roles to stakeholders within the design process. Additionally, these findings can be used to encourage organisations in the building design industry to improve their approaches and final decision outcomes.

Implications and limitations of this study include the fact that the EFA conclusions are based on post hoc analysis and are therefore subject to possible errors. In addition, respondents possessed limited experience with DDI; hence, some variables may not have been properly evaluated. Another limitation is that the sample includes respondents worldwide, whose DMPs in urban and building design may differ due to prevailing regulations or a lack of available resources (i.e., data and tools).

Future research could seek to validate these results using a case-study-based approach and the collection of in-depth qualitative data, to better understand aspects of DMPs that were not included in this research. In addition, a focused, country-based sample selection could reveal further insights.