
Introducing value driven design in engineering education: teaching the use of value models in preliminary design

  • Alessandro Bertoni
Open Access
Article

Abstract

Methods and approaches for teaching engineering disciplines are evolving to adapt to the needs of companies and society. Engineering Design is one of the areas most influenced by such changes, constantly striving to develop more effective and efficient strategies to prepare soon-to-be engineers to face the challenges of a real working environment. This paper presents an approach used to teach the use of value models for concept trade-offs in the preliminary design phase, in line with the industrial challenges highlighted by the literature on Value Driven Design. The approach is based on realistic design sessions assigning a value assessment challenge to master’s students. The paper describes the rationale of the approach, its set-up, and a validation activity run to compare the performances of the students with those of industrial practitioners. The results of the study show that the students taking part in the design sessions can produce results that do not statistically differ from those of industrial practitioners. Moreover, the students’ self-reflection on the achievement of the intended learning outcomes shows high satisfaction toward the achievement of the educational objectives.

Keywords

Realistic design session · Value models · Value-driven design · Engineering education · CDIO

Introduction

The engineering education curricula are slowly but constantly changing to address the deficiencies long discussed in the engineering literature (Felder et al. 2000) and to adapt to the needs of companies and society. The need to renovate engineering education by spending more effort on teaching design methods that are closer to industrial practices has been highlighted in academic and industrial discussions since the 1990s (see for instance Bordogna et al. 1993; National Research Council 1995), based on the idea that the students’ capabilities to solve problems, synthesize and analyze knowledge, and propose creative solutions are the relevant skills to develop during an engineering education (Atman and Bursic 1996). Particular attention has been paid to education in the engineering design area, whose importance has often been underestimated in traditional engineering curricula (Dym 1999). According to Dym (1999), this happened in the past because of a combination of factors, including the inexperience of academics in being directly involved in design issues, and a general perception that design education had no real “scientific content”, since it could not be reduced to meaningful mathematical terms. Nowadays design is increasingly seen as a series of cognitive activities and practices that can be modeled, and thus as a legitimate area of education. In recent years, researchers in engineering design have introduced a plethora of frameworks and models to be applied in different design phases, which have contributed to enhancing the perception of design as a scientific and engineering-oriented discipline.

In detailed design phases, models have been developed to support the quantification of design parameters obtained as the results of mathematical and logical processes (e.g. Finite Element Analysis). In earlier phases, where parameters are not yet defined and mathematical correlations are unknown, models have been developed to support decision making mostly through qualitative assessment, based on expert judgment and personal experience (Eppinger and Ulrich 1995). In this area, a number of customized methods and tools to enhance decision making have been proposed and tested in industry (e.g. Hazelrigg 1998; Ullman 2001; Collopy and Hollingsworth 2011). Among those, the concept of Value Driven Design (VDD) (Collopy and Hollingsworth 2011) has lately emerged in the scientific literature and has been used as a framework for the development of methods and tools enabling the value assessment of products and services from a multidisciplinary perspective. While initially applied mainly in the aerospace industry, VDD-inspired methods have recently been deployed in other industrial contexts (Bertoni et al. 2016; Panarotto et al. 2017), reflecting a shared industrial need to use models (named “value models”) to quantify, already in preliminary design, the long-run effect of an early design decision (i.e. understanding the impact of a decision from a multi-disciplinary and multi-stakeholder perspective).

However, while the teaching of detailed design methods, such as numerical simulations or finite element analysis, is nowadays well established in academic courses, the same cannot be said for methods to model the “value” of design alternatives from a preliminary design stage. A challenge when teaching VDD is given by the fact that VDD methods are often developed by researchers and practitioners jointly collaborating on a specific company-related design problem. Therefore, generalizability and comprehensibility for applications outside the scope of the research are not prioritized, making it difficult to understand the potential value of new methods for those not involved in the specific context in which the methods were developed (i.e. those lacking background knowledge and understanding of the problem and of the underlying logic of the solution).

Researchers in engineering design have discussed the barrier between the development of new methods in research and their generalized application in industrial settings (Wallace 2011; Gericke and Eckert 2015), highlighting that it is essential not only that those methods provide valuable results, but also that their role is clearly understood, for them to be successfully integrated into everyday design activities. It is necessary that both experienced and novice engineers share the same perception of how to use a method, of its relevance and of its potential. Looking at engineering design education, it is possible to affirm that, in the case of methods for detailed design, the self-evidence of the results (e.g. fractures or deformations) enables the students to clearly understand the benefits of a specific approach. When teaching how to use VDD methods, instead, the teacher finds him/herself in the challenging situation of communicating the potential of the different approaches to an audience that lacks the contextual experience related to the complexity and uncertainty of a multidisciplinary early design analysis. This makes it difficult for the students to understand the value of a method with respect to its ability to manage and synthesize a variety of cross-disciplinary information. It is the role of education to fill this endogenous experience and knowledge gap between practitioners and new engineers, and teaching activities shall be designed to provide occasions for the students to learn the rationale and logic of the new methods, and to develop the individual skills and reflective capabilities to best fit the industrial needs.

This paper addresses the challenge of teaching VDD methods, with a specific focus on the use of value models for design concept trade-offs. The study presents the experience of teaching value models using realistic design sessions, in which the students practiced the use of VDD methods and reflected upon their relevance. The use of realistic design sessions had a double objective. First, it aimed to provide the students with an effective way to develop their skills in evaluating and performing value analysis of different design concepts. Second, it aimed to prepare the students to effectively become “new engineers” by practicing the use of value models in a way that replicates a real industrial setting, allowing them to mimic the behaviors of practitioners when taking part in an early design evaluation. With the intent to verify the fulfillment of these two objectives, the paper describes the findings obtained from the analysis of the data collected from realistic design sessions run with both students and industrial practitioners on a common design case, i.e. a wheel for a racing bicycle.

The paper first defines the concept of value in the VDD domain. This is complemented by a description of the importance of value modeling in preliminary design, in relation to the educational challenges and approaches in use. The third section presents the course set-up and frames the realistic design sessions within the CDIO syllabus. The fourth section presents the research design, and the “Analysis of the data of the design sessions” section describes and analyzes the results. The “Discussion and conclusion” section discusses the relevance of the findings, describing limitations and further improvements of the educational activities concerning Value Driven Design.

Defining value and value driven design

In the engineering and industrial context, methods and tools for value analysis, along with different interpretations of the concept of value, have been proposed since the 1960s. In the scientific literature, despite the centrality of the value concept, different interpretations exist about what the characteristics of value are and how stakeholders determine it (Day 2000). Miles (1962) first introduced the value analysis concept, defining the value of a product simply as the ratio of its performance over its cost. Later, authors from the fields of marketing and business management have either conceptualized value as the maximum amount a customer should be willing to pay, assuming full information about the product and competitive offerings (Shapiro and Jackson 1978; Forbis and Mehta 1981), or as a subjective perception linked to experience and extrinsic attributes such as brand name, logos, charm, social status and perceived quality (Zeithaml 1988). In the early ‘90s, in accordance with a more subjective interpretation of value, Normann and Ramirez (1993) stressed the notion of value-in-use, which highlighted the importance of the relationship between the company and the customer, whose perception of value is not limited to the product performance, but can be considered as an emotional bond established between a customer and a producer (1996). Researchers in marketing and service logic have stressed the subjectivity of this value perception, stating that “Value is uniquely experientially and contextually perceived and determined by the customer” (Grönroos and Voima 2013, p. 146). Research in the domain of engineering design and systems engineering has instead highlighted the need to consider value as a numerical, quantifiable encoding of preferences, reflecting the relative desire of someone to have one thing as opposed to other things (Collopy and Hollingsworth 2011).
In such context, Value Driven Design (Collopy and Hollingsworth 2011) has become a popular umbrella term that collects several methodologies (O’Neill et al. 2010), ranging from previous work on Tradespace Exploration (Ross et al. 2004), to Value Centric Design (Brown and Eremenko 2008) and Value Driven Optimization (Castagne et al. 2009). It indicates an “improved design process that uses requirements flexibility, formal optimization, and a mathematical value model to balance performance, cost, schedule, and other measures important to the stakeholders to produce the best possible outcome” (AIAA 2015). In the VDD context, value has been defined as the level to which a product, or a technical solution, fulfills a set of needs from different stakeholders along its lifecycle (Bertoni et al. 2013). In the presence of complex engineering systems this includes more than performance and cost assessment, encompassing aspects such as: epistemic, emotional and image value (Steiner and Harmon 2009), relationship-based values (Kowalkowski and Kindström 2009), and value robustness to unexpected changes in the system (McManus et al. 2007).
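The two quantitative notions of value introduced above can be contrasted in compact form. The formulation below is an illustrative sketch based on the definitions in this section, not a formula quoted from the cited works:

```latex
% Miles (1962): value as the ratio of performance over cost
V_{\text{ratio}} = \frac{\text{performance}}{\text{cost}}

% VDD view: value as a scalar objective over a design alternative x,
% aggregating the measures important to the stakeholders
V(x) = f\left(\text{performance}(x),\ \text{cost}(x),\ \text{schedule}(x),\ \ldots\right),
\qquad x^{*} = \arg\max_{x \in X} V(x)
```

In the VDD view the value model f numerically encodes stakeholder preferences, so alternative concepts can be ranked by a single figure of merit during concept selection, rather than checked against a list of pass/fail requirements.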

The importance of value modeling in preliminary design

Decisions made in a preliminary design stage radically impact the specifications and configurations of a future product, playing a key role in defining lifecycle services, after-sales policies, suppliers and partners (Krishnan and Ulrich 2001). Nevertheless, at this stage, designers are in the situation of having limited knowledge about the consequences of a design decision, since the concepts under consideration are often immature. Later in the process, when a more established and developed knowledge base is available, major decisions have already been made and capital has already been committed, making it more costly and time-consuming to make changes (Ullman 1992). Studies in systems engineering have highlighted that up to 70% of the total cost of a development project is already committed in preliminary design, and that the cost to remove defects increases radically afterward (INCOSE 2015).

Understanding the right design direction is therefore essential in preliminary design, and the use of models to help engineers deal with a wide and heterogeneous set of information is widely discussed in the literature. In current practice, the use of requirements, expressing the engineering performances of a future product in a measurable and verifiable way, is considered the main reference approach. However, a requirements-centered process is often unable to assure optimal decisions for an engineering system at an overarching level (Chen et al. 2013) and to add value to the solution space (Cheung et al. 2008; Collopy and Hollingsworth 2011). Researchers (e.g. Soban et al. 2011) claim that a qualitative assessment of the ‘goodness’ of a design is preferable to a quantitative encoding of preferences when qualitative data, assumptions and forecasts prevail. For this reason, methods are used in preliminary design to enhance engineers’ awareness of the value of a future design, going beyond the mere calculation of product technical performances to encompass the measurement of customer satisfaction and of product lifecycle performances. Quality Function Deployment (QFD) (Hauser and Clausing 1988) is one of the most widely used methods in this respect, mainly because of its transparency in mapping engineering parameters to customer needs (Al-Ashaab et al. 2013). QFD has some limitations, in particular being unable to realistically model non-linear phenomena (Erginel 2010; Zhang et al. 2015), such as the relationship between quality and customer satisfaction (Anderson and Sullivan 1993) or the relationship between qualitative customer requirements and measurable technical requirements (Guenov 2008). Building on such considerations, the VDD literature has proposed different models for the preliminary assessment of the value of a design concept.
Novice engineers and students approaching the learning of such models need not only to understand the logic upon which they function, but also to understand the reasons why, in practice, one value modeling technique is more effective than another in terms of ease of use, reliability of the results and uncertainty of the evaluation.
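The purely linear mapping at the core of QFD, which the limitations above refer to, can be sketched in a few lines. All names and numbers below are hypothetical, chosen only to illustrate how customer-need weights propagate to engineering characteristics through a correlation matrix:

```python
# Minimal sketch of a QFD-style prioritization (hypothetical numbers, not
# taken from the paper): customer-need weights are propagated to engineering
# characteristics through a correlation matrix. Note that the mapping is
# purely linear, which is the limitation discussed in the text above.

customer_needs = ["comfort", "road performance", "operating cost"]
need_weights = [0.5, 0.3, 0.2]  # relative importance, summing to 1

eng_characteristics = ["stiffness", "weight", "reparability"]

# correlations[i][j]: strength of the link between need i and characteristic
# j, using the conventional QFD scale 0 / 0.1 / 0.3 / 0.9
correlations = [
    [0.9, 0.3, 0.0],   # comfort
    [0.3, 0.9, 0.1],   # road performance
    [0.0, 0.1, 0.9],   # operating cost
]

def qfd_priorities(weights, corr):
    """Linear weighted sum: priority_j = sum_i weights[i] * corr[i][j]."""
    n_cols = len(corr[0])
    return [sum(w * row[j] for w, row in zip(weights, corr))
            for j in range(n_cols)]

priorities = qfd_priorities(need_weights, correlations)
for name, p in zip(eng_characteristics, priorities):
    print(f"{name}: {p:.2f}")
```

Because each priority is a fixed linear combination of the weights, no choice of correlation coefficients can express, for example, diminishing returns of stiffness on comfort; this is exactly the non-linearity gap that the VDD models discussed below try to close.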

The challenge of teaching value modeling in preliminary design

As pointed out by Mclaren (2009), design is a very wide subject that can potentially cover the range of all disciplines within engineering, and it is recognized to be one of the core activities that an engineer undertakes. There are many approaches to teaching design in engineering education, ranging from more “traditional” approaches focusing on specific modules treating detailed technical features of machine design, to more innovative approaches requiring cross-disciplinary skills in group work, logistics or management. When teaching models for VDD and systems engineering, a challenge resides in making the students aware of the “broadness of the system” (Muller and Bonnema 2013), to make them understand the benefits and drawbacks of such approaches (Bertoni 2019). As highlighted by Muller and Bonnema (2013), systems engineering students might easily perceive the models as “open doors”, missing a frame of reference and a background to understand what such models or techniques offer, thus categorizing them more or less as the application of “common sense”.

A second challenge resides in the intrinsically multidisciplinary and multidimensional nature of a value assessment activity, i.e. encompassing both a wide set of engineering skills and a perspective over the system lifecycle, often outside the technical horizon of the typical engineering education. In this respect, the topic of interdisciplinary engineering design education has received much attention in recent years from many higher education institutions worldwide. The academic literature shares the understanding that modern engineering designers need to possess multidisciplinary skills, and that those need to be part of engineering education and training, as complementary to technical and subject-specific skills (Luft et al. 2013). For instance, Bronet et al. (2003) have highlighted the need to reroute traditional mechanical engineering programs toward higher integration of social and aesthetic aspects, while Ollis (2004) has identified a set of basic elements for the creation of multidisciplinary design courses. Jablokow (2008) has expanded the analysis of multidisciplinary skills for engineers by focusing on problem-solving and leadership capabilities, and Benedetto et al. (2010) have provided an example of cross-academic collaboration in creating multidisciplinary education programs toward innovation in engineering, architecture, and design. Additional contributions have come from Pop-Iliev and Nokleby (2011), proposing an approach for teaching concurrent design engineering, and from Vignoli and Onghia (2015), taking a more holistic standpoint on engineering education and proposing an approach inspired by the Reggio Emilia method.

As a partial answer to such challenges, the use of design sessions, in which students are given realistic design challenges in a controlled environment, has been recognized by Cash et al. (2012) as beneficial for design research. The use of design sessions to compare the behaviors of students and practitioners has often served as a testing environment for new design methods and tools, before moving to further development or to industrial implementation (e.g. Ellis and Dix 2006; Mitsui et al. 2009; Kleinsmann et al. 2012; Bertoni 2013). How to run small-scale design studies has been rigorously analyzed (Cash et al. 2012), emphasizing the benefits linked to the identification of trends and research directions. The use of design sessions should, however, not only be regarded as a source of data for design research; rather, it can recreate a learning environment that replicates the challenges and the context of a real industrial setting (Cobb et al. 2003). In such artificial contexts, researchers have highlighted that, when working on a design task, the major differences between students and engineering practitioners lie in the problem scoping and information gathering phases (Atman et al. 2007). The problem resides in the fact that most of the time the students do not realize the complexity of a problem and underestimate the trade-offs between design variables, thus increasing the risk of making bad decisions. According to Napp (2004), decision-making methods and theories for complex design environments should therefore be a major focus for improving design education. The set-up of realistic design sessions capable of recreating a situation as similar as possible to a real industrial scenario can be regarded as an educational strategy to cope with these challenges.

Course set-up

The research presented is based on the data collected from realistic design sessions with students and industrial practitioners, run respectively during the course of Value Innovation at Blekinge Institute of Technology and during industrial training sessions and executive courses.

The term “realistic design session” is used to refer to a setting in which students are asked to deal with a design problem that mimics a real industrial challenge, but which is run in a controlled environment (i.e. a university course), with the participants aware that the design session results will not have any real impact on the company’s performance.

The course Value Innovation involved students at master level from the programs of Mechanical Engineering, Industrial Economics, and Sustainable Product-Service Systems Innovation. All students taking part in the course held a bachelor’s degree in an engineering-related field, thus they possessed the basic knowledge to address engineering problems. According to the intended learning outcomes (ILO) of the course, the students shall, in the end, be able to:
  1. Run a need analysis and a market analysis.
  2. Develop and evaluate the value contribution of different design concepts.
  3. Create the prototype of value-adding solutions.
  4. Work effectively in project teams.
  5. Run value analysis.
The course accounted for 7.5 ECTS points in the European Credit Transfer System, based on learning achievements and students’ workload. It was structured as a combination of frontal lectures and autonomous group work based on a course project defined with a partner company. The course lasted for 2 months, and the design sessions were run 6 weeks after the beginning of the course. The first 6 weeks of the course were used to teach innovation theories, stakeholders’ analysis, customer needs analysis, creativity methods, and prototyping.

The design sessions were repeated in the frame of the course in three consecutive academic years from 2013 to 2015; thus, three different classes of students took part in them.

The use of design sessions was planned to contribute to a major extent to ILO 5, and to a complementary extent to ILO 2 and ILO 4. This is because the students in the sessions were not asked to develop new solutions; rather, they worked in teams to run a value analysis on design concepts already defined.

Framing the realistic design sessions in the CDIO syllabus

Literature on engineering design and systems engineering education highlights that the redesign of engineering training and education has to be approached from both a student and a teacher perspective. Nadelson et al. (2015) stressed the necessity to enhance teachers’ capacity for teaching engineering design, proposing support to significantly influence teachers’ knowledge about the engineering design process. Addressing both the students’ and the teachers’ perspective, an advancement in the field of engineering design education has been driven by the CDIO initiative (Crawley et al. 2014), promoting the context for an undergraduate engineering education to be a combination of four actions, namely Conceive-Design-Implement-Operate. The teaching and learning activities in the CDIO framework concern the direct participation of students in concrete design tasks, active learning, participation in workshops and project-oriented group tasks, and the collection of feedback from teachers. After its formulation, more than 150 academic institutions have joined the CDIO initiative (CDIO 2018) with the vision that engineering graduates should be able to Conceive-Design-Implement-Operate complex value-added engineering systems in a modern team-based engineering environment to create systems and products (Crawley et al. 2014). The CDIO initiative does not prescribe any specific approach; rather, it shall be seen as a guiding framework to reform engineering education. To this concern, the CDIO initiative encompasses twelve standards to describe a CDIO program. Those are meant to support the transition of an educational program toward a CDIO framework. Among those, standard number 8 recognizes the relevant role of teaching and learning based on active experiential learning methods. In this line, research has recognized since the ‘90s the role of active learning in improving students’ attitudes and writing and thinking abilities (Bonwell and Eison 1991), and in improving the capability to remember the topic of a lecture (Prince 2004).

The reform of an engineering education toward the CDIO principles implies an overall revision of an educational program from both a structural and a content perspective. This shall be approached from a much higher level than the single educational activity described in this paper. Nevertheless, the realistic design sessions were designed to contribute to achieving a few goals highlighted by the CDIO syllabus 2.0 (Crawley et al. 2011), in line with the university objective to promote the transition toward CDIO. In this context, the realistic design sessions are an example of the implementation of active learning. The activities were meant to contribute to addressing the part of the CDIO goals dealing with analytical reasoning and problem solving, system thinking, interpersonal skills, and systems engineering and management in the conceiving phase. Table 1 summarizes the goals expressed in the CDIO syllabus that were addressed when designing the realistic design sessions. It has to be remembered that such sessions are only one of the activities run in the frame of the Value Innovation course. The description of other course activities, addressing the examination of CDIO-related objectives in the frame of an industrial project, is outside the scope of this paper.
Table 1 List of the goals of the CDIO syllabus that the realistic design sessions contributed to address (CDIO syllabus goal: activities contributing to the fulfillment of the goal)

2.1.2 Modeling: Use of qualitative modeling in conceptual design decision making
2.1.3 Estimation and qualitative analysis: Comparison of design alternatives through the EVOKE approach
2.1.4 Analysis with uncertainty (incomplete and ambiguous information): Assessment of the level of knowledge maturity when populating the models
2.3.2 Emergence and interactions in systems: Identification of relationships between two variables and possible quantification of them
2.3.4 Trade-offs, judgment and balance in resolution: Trade-off of performances and fulfillment of needs while lacking a unique optimal solution
3.2.8 Negotiation, compromise and conflict: Request to find an agreement in compiling the assessment matrices
4.3.1 Understanding needs and setting goals: Analysis of how customers’ needs cascade down to component needs and requirements
4.3.3 System engineering, modeling and interfaces: Modeling the value of alternative designs and interactively analyzing the effect of a change of customer priorities

Research design

The research at the basis of this paper had the overall objective to investigate how to effectively teach value-driven design to university students. The proposed realistic design sessions were designed as a learning activity in which the students could develop their skills in evaluating and performing value analysis. They focused on preparing the students to become “new engineers” by practicing the use of value models in a way that replicated a real industrial setting. The research objective was articulated in the following research question:

RQ1

How can value models be taught in academia overcoming the endogenous experience and knowledge gap between industrial practitioners and engineering students?

The research hypothesis was stated as follows:

Hp1

The realistic design sessions effectively teach how to use value models by enabling the students to obtain results that do not differ from those obtained by industrial practitioners.

The hypothesis was tested by running the same design sessions with both teams of students and teams of practitioners and analyzing the differences and similarities in the results. The details of the research design are described in the following sub-sections.
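The excerpt does not detail which statistical test was used to compare the student and practitioner results, so the sketch below should be read only as one self-contained way such a comparison of Hp1 could be run: a two-sided permutation test on the correlation values that two groups of teams assigned to the same matrix cell. All data are hypothetical.

```python
import random

def permutation_test(group_a, group_b, n_permutations=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Returns an approximate p-value: the fraction of random relabelings
    whose absolute mean difference is at least as large as the observed one.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical correlation values (QFD scale) set by student teams and
# practitioner teams on one and the same matrix cell; illustrative only.
students = [0.3, 0.9, 0.3, 0.3, 0.1, 0.3]
practitioners = [0.3, 0.3, 0.9, 0.3, 0.3, 0.1]

p = permutation_test(students, practitioners)
# the two groups have identical means here, so the test returns p = 1.0
print(f"p-value: {p:.3f}")
```

A large p-value gives no ground to reject the hypothesis that the two groups produce equivalent assessments; repeating the test cell by cell (with a multiple-comparison correction) would cover the full matrices.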

Participants

Two groups of participants took part in the design sessions: university students and industrial practitioners. The first group consisted of a total of 61 students attending the course of Value Innovation in the years 2013, 2014, and 2015. The students were divided into design teams of either 3 or 4 people, rendering a total of 17 design teams over the 3 years. The size of the teams was set to 3 or 4 people in order to balance the possible presence of strong or weak participants, to make the discussion easy to follow, and to avoid parallel discussions (Cash et al. 2012). This is because smaller teams increase the time spent on silent “thinking”, while larger teams increase the possibility of concurrent verbalization and parallel conversation (Cash et al. 2012). The students were randomly assigned to the teams. The realistic design sessions were organized as activities in the frame of the course, in place of a regular classroom lecture. Participation was not mandatory but recommended by the teaching team. The students did not receive any incentive to take part in the design session, and the activity was not part of the grading; nevertheless, the students were aware that the ability to use the method practiced during the design session could be part of the final examination of the course. One teacher took the role of “experimental controller”, as suggested by Cash et al. (2012), to facilitate the design session. His role was passive: he did not support or suggest answers to the design teams, though he could be asked to clarify the logic of the method used. To minimize possible biases given by the relation between students and teacher, the experimental controller was neither the course coordinator nor the examiner of the course.

In parallel, the same design sessions were run with industrial practitioners, involving 78 people divided into 23 design teams with different industrial affiliations. Thirteen teams of practitioners were employed by an aerospace component manufacturer and took part in the design sessions in 2013, voluntarily registering for the activity in the context of a company training day. Seven teams of practitioners worked for a telecommunications company and took part in the activity in 2014, in the frame of an executive course run in collaboration with the university. In this case it was not possible to verify the participants’ voluntary willingness to take part in the activity. Three teams of practitioners consisted of engineers having direct experience in companies while at the same time holding an academic position. Those teams were built from practitioners working in the aerospace industry, in the construction machinery industry, in product-service systems development, and in sustainable development. In those cases too, the same teacher took the role of experimental controller, in the same fashion as with the teams of students.

Procedures

The design sessions were organized as a 75-min activity. A design episode was created concerning the value assessment of different design configurations of a wheel for a racing bicycle. The choice of the racing bicycle as the reference product was driven by the necessity to limit as much as possible the knowledge barrier toward the evaluation of the new designs. The use of a bicycle was believed to be easily connected to the participants’ background knowledge and direct user experience.

All teams received the same task to be delivered in the same timeframe, and the same setting was followed by both students and practitioners.

Among the possible approaches for value modeling available in the literature, a specific value modeling approach named EVOKE (Bertoni et al. 2018) was used in the sessions. EVOKE was developed and tested in systems engineering and product-service systems design, and was selected because it recalls a number of mechanisms that are common to many approaches for preliminary design assessment. First, EVOKE uses numerical correlations (i.e. 0, 0.1, 0.3, 0.9) typical of the QFD method (Hauser and Clausing 1988); second, it uses a weighting system to define the relative importance of the “value drivers”, in a very similar fashion to the weights of the customer needs expressed in QFD; and third, it uses non-linear merit functions to set the type of correlations (i.e. the preferred behavior in terms of maximization, minimization and optimization) between the variables on the rows (i.e. the “Value Drivers”) and the engineering characteristics on the columns (i.e. the “Quantified Objectives”), similarly to what is used in the Customer Oriented Design Analysis approach (Wooley et al. 2001). For such reasons, EVOKE was considered a good reference model for the students to practice with, in order to understand the value and the limitations of using value models. The detailed description of the value modeling approach using EVOKE is outside the scope of this paper and can be found in Bertoni et al. (2018).
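The three mechanisms just listed (QFD-like correlations, value-driver weights, and non-linear merit functions) can be combined into a toy scoring routine. The sketch below is only an illustration of that combination under invented driver names, objectives, targets, and numbers; the actual EVOKE formulation in Bertoni et al. (2018) differs in detail:

```python
import math

# Hypothetical value drivers (rows) with relative weights, and quantified
# objectives (columns) for two wheel concepts; numbers are illustrative only.
value_drivers = {"low vibrations": 0.4, "low weight": 0.35, "reparability": 0.25}

# QFD-like correlation scale (0 / 0.1 / 0.3 / 0.9) between each value driver
# and each quantified objective.
correlations = {
    ("low vibrations", "stiffness_Nmm"): 0.9,
    ("low vibrations", "mass_g"):        0.3,
    ("low weight",     "stiffness_Nmm"): 0.1,
    ("low weight",     "mass_g"):        0.9,
    ("reparability",   "stiffness_Nmm"): 0.0,
    ("reparability",   "mass_g"):        0.1,
}

def merit_minimize(x, target, scale):
    """Non-linear merit: close to 1 below the target, decaying above it."""
    return 1.0 / (1.0 + math.exp((x - target) / scale))

def merit_maximize(x, target, scale):
    return 1.0 - merit_minimize(x, target, scale)

# Preferred behavior per objective: maximize stiffness, minimize mass.
merits = {
    "stiffness_Nmm": lambda x: merit_maximize(x, target=50.0, scale=10.0),
    "mass_g":        lambda x: merit_minimize(x, target=900.0, scale=100.0),
}

def evoke_score(concept):
    """Weighted, correlation-scaled sum of merit values; higher is better."""
    return sum(
        weight * correlations[(driver, obj)] * merits[obj](x)
        for driver, weight in value_drivers.items()
        for obj, x in concept.items()
    )

carbon = {"stiffness_Nmm": 65.0, "mass_g": 780.0}
alloy = {"stiffness_Nmm": 48.0, "mass_g": 1050.0}
print(f"carbon: {evoke_score(carbon):.3f}  alloy: {evoke_score(alloy):.3f}")
```

The non-linear merit functions are what distinguishes this from a plain QFD sum: a concept well past its mass target is penalized progressively rather than proportionally, which is the behavior the participants had to encode in the second matrix of the sessions.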

The task concerned the choice of the best design concept of a bicycle wheel. The participants were asked to populate two QFD-like matrixes as proposed by the EVOKE approach. The first matrix required the setting of only correlation coefficients (0–0.1–0.3–0.9) and the second matrix required both correlation coefficients and non-linear functions. The set-up of the design sessions is summarized in Fig. 1.
Fig. 1

Set-up of the design sessions

The sessions featured an initial 10-min introduction. This phase was important to align the teams and establish a common base for the experiment. At this stage, the teacher introduced the objectives of the design session, explained the specific tasks related to populating the EVOKE matrixes, provided contextual information about the component to be analyzed, and explained the meaning of the needs and requirements featured in the provided EVOKE template (in paper-based A3 format). The teacher presented the bicycle users' expectations together with a list of derived system-level needs. Additionally, he presented the component-level needs from the standpoint of the wheel manufacturer. The 10-min introduction followed a standardized scheme supported by a PowerPoint presentation, so that the information presented was the same for all the teams.

In the second step, the teams were given a 20-min time slot and assigned the task of populating each intersection between system-level needs (i.e., related to the bike) and component-level needs (i.e., related to the bike wheel). The system-level needs, seven in the sessions, were conceived as criteria mirroring how customers experience the use of the bike along its lifecycle. These criteria were: Vibrations and Noise, derived from the overall customer expectation related to 'comfort'; Top Speed, Grip, and All-terrains, related to expectations about 'road performances'; and Maintainability and Robustness, mirroring the 'operating cost' of the bicycle. The component-level needs were considered a first step toward decomposing customer needs into requirements for the wheel. These were defined as Stiffness, Friction, Weight, Manufacturability, and Reparability (Fig. 2). They mirrored how customers experience the component and did not propose any specification for the solution; hence they did not feature a unit of measurement. The participants were asked to set four types of correlations in the matrix: 0 (no correlation), 0.1 (weak correlation), 0.3 (medium correlation), 0.9 (strong correlation). The students were already familiar with this conventional notation, as it had previously been taught on another occasion (i.e., in the Systems Engineering course) during the study program. Once the matrixes were filled, the participants were asked to take additional time to self-reflect on the confidence they had in the correlations. This was done by asking the teams to set a degree of confidence for each of the component-level needs considered in the matrix. The assessment of the level of confidence mimicked the knowledge maturity scale proposed by Johansson et al. (2011), corresponding to the following definitions:
Fig. 2

Decomposition of expectations, needs, and requirements in the session

  • 5 = Members are very confident and recognize exact correlations.

  • 4 = Members are confident and agree on the correlations after a short discussion.

  • 3 = Members are fairly confident and agree on the correlations after discussion.

  • 2 = Members are poorly confident and have different opinions on the correlations.

  • 1 = Members are not confident with any value and recognize opposite correlation.

No right or wrong answer existed for the knowledge maturity assessment; rather, the activity was undertaken to promote the participants' reflections on the assessment. In this way, participants were asked to rethink what they had just done, so as to make them more aware of the uncertainty and risks embedded in the preliminary design evaluation.

After the first phase, a 10-min wrap-up session allowed the teacher to verify alignment between the teams and to clarify remaining technical issues about the use of the provided material. Furthermore, this interruption allowed the teacher to introduce the second stage of the experiment, to explain the meaning of 'component-level requirements', and to specify the concept of non-linear merit functions.

In the second phase, the teams were asked to establish correlation coefficients (0–0.1–0.3–0.9) and non-linear functions (i.e. Null, Maximization, Optimization, Minimization) at the intersection of component-level needs and requirements. Four requirements were selected at this stage: Tire diameter (mm), Tire width (mm), Spoke thickness (mm), and Use of composite (% in weight). Participants were asked to set 20 correlations and 20 functions, for a total of 40 items, in 25 min, rendering a pace of one item every 37.5 s. This is similar to the pace featured in the first iteration (approximately one item every 35 s). There were two reasons for relying on such an intensive exercise. First, even if a longer session would have granted the participants more reflective time, previous research (Tsenn et al. 2014) has shown benefits in constraining experimental activities to design episodes shorter than 1 h. Second, it was important to verify whether students intuitively understood the logic of the value assessment and whether they were able to quickly come to a decision. As in the first stage, the participants were asked to self-reflect on the confidence they had in setting the correlations and the non-linear merit functions, and to express such confidence on a scale from 1 to 5.
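The four non-linear behaviors named above (Null, Maximization, Minimization, Optimization) can be sketched as simple normalized merit functions. The linear and triangular shapes below, and the numeric ranges, are illustrative assumptions, not the curves prescribed by EVOKE:

```python
# Illustrative normalized merit functions for the four behaviors
# (assumed shapes; the actual curves used in the sessions may differ).

def null_merit(x: float, lo: float, hi: float) -> float:
    """Null: the requirement does not contribute to this need."""
    return 0.0

def maximize(x: float, lo: float, hi: float) -> float:
    """Maximization: higher is better; 0 at `lo`, 1 at `hi`, clamped."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def minimize(x: float, lo: float, hi: float) -> float:
    """Minimization: lower is better; mirror of maximization."""
    return 1.0 - maximize(x, lo, hi)

def optimize(x: float, lo: float, hi: float, target: float) -> float:
    """Optimization: best at an intermediate target, decreasing on both sides."""
    half = max(target - lo, hi - target)
    return max(0.0, 1.0 - abs(x - target) / half)

# e.g. a tire width of 25 mm against a hypothetical optimum of 28 mm
# within a 20-40 mm range
merit = optimize(25, 20, 40, 28)
```

Selecting one of these behaviors per cell is what the teams did when setting the "type" of each need-requirement correlation.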

The two-phase process allowed the participants to familiarize themselves with the QFD matrix and with the logic of linking features at different levels of granularity. During the second stage, the participants could focus their attention on populating the matrix and setting relationships, rather than discussing the technicalities of the exercise and the mechanisms of EVOKE.

Data collection and analysis

The data from the design sessions run by the students were collected on two occasions. Firstly, at the end of the sessions, the templates were collected from the teams and the results were manually imported into a software environment to be analyzed later. Secondly, a self-assessment questionnaire was distributed to the students at the end of the course to collect feedback concerning the design sessions and to verify the achievement of the intended learning outcomes. Responses to the self-assessment questionnaire were given on a voluntary basis. In the course questionnaire, two questions were considered particularly relevant to understanding whether the use of design sessions reached its intended learning outcomes: whether the students believed themselves capable of developing and evaluating concepts based on their value contribution, and whether they believed themselves capable of performing value analysis. These questions were answered on a scale from 1 to 4, corresponding to the following definitions: not at all (1), to a minor extent (2), to a good extent (3), to a large extent (4). The complete set of questions of the self-assessment questionnaire is attached in "Appendix A".

The data collection concerning the activities with practitioners was limited to the transcription of the results from the EVOKE matrixes in a software environment. No questionnaire was distributed to the practitioners.

The test of the hypothesis was designed by defining the type of team (i.e. students or practitioners) as the independent variable, and each correspondence between component-level needs and component-level requirements as the dependent variables (including both the correlation coefficient and the selected function). The presence of differences between the two independent groups (i.e. students vs practitioners) was tested with the Mann–Whitney U test, as described in detail in the "Hypothesis verification" section.
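As a sketch of this comparison, the U statistic underlying the Mann–Whitney test can be computed for a single need-requirement cell as follows. The group data are invented for illustration (the study had 17 student teams and 23 practitioner teams); in practice a statistics package would also return the exact two-tailed significance:

```python
# Pure-Python sketch of the Mann-Whitney U statistic for one cell of the
# value model, comparing the coefficients chosen by the two groups.
# Illustrative data, not the study's data.

def mann_whitney_u(a, b):
    """Rank-sum computation of U for group `a` vs group `b`,
    assigning midranks to tied values."""
    pooled = sorted((v, g) for g, grp in enumerate((a, b)) for v in grp)
    values = [v for v, _ in pooled]
    n = len(values)
    rank_sum_a = 0.0
    i = 0
    while i < n:
        j = i
        while j < n and values[j] == values[i]:
            j += 1
        midrank = (i + j + 1) / 2  # average of 1-based ranks i+1..j
        for k in range(i, j):
            if pooled[k][1] == 0:  # observation belongs to group `a`
                rank_sum_a += midrank
        i = j
    n1, n2 = len(a), len(b)
    u1 = rank_sum_a - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)  # conventional (smaller) U

students      = [0.1, 0.3, 0.3, 0.9, 0.3]        # hypothetical cell values
practitioners = [0.3, 0.3, 0.1, 0.9, 0.9, 0.3]
u = mann_whitney_u(students, practitioners)
```

The same computation is repeated per cell (20 coefficients and 20 functions), with the team type as the grouping variable.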

Ethical issues

The research did not entail the handling of sensitive personal information or physical intervention on human beings. Moreover, the research was conducted with methods that do not affect or put at risk the research subjects physically or mentally. Based on these characteristics, the research did not require the approval of a research ethics committee, in accordance with the Ethical Review of Research Involving Humans approved by the Swedish Ministry of Education and Cultural Affairs (SFS 2003:460).

Analysis of the data of the design sessions

The analysis of the data collected from the questionnaires is provided in “Analysis of the data from the questionnaires” section, while “Hypothesis verification” section describes the analysis of the data collected from the templates.

Analysis of the data from the questionnaires

The questionnaires gave the students the possibility both to self-assess their progress on a numerical scale and to add further written reflections if desired. Two specific statements addressed the measurement of the achievement of the intended learning outcomes of the course related to the use of the realistic design sessions. The statements were:
  • After the course completion, I feel confident I can develop concepts and evaluate their value contribution.

  • After the course completion, I feel confident I can run a value analysis.

Out of the 61 students who took part in the design sessions, 28 answered the voluntary questionnaire. The results are summarized in Fig. 3, showing that the large majority of students believed they were able to develop and evaluate design concepts and to perform value analysis to a good or large extent. The students' answers to the self-assessment reflected positive feedback toward the achievement of the intended learning outcomes, thus providing positive indications toward considering the use of realistic design sessions as an approach to teach the use of value models in preliminary design.
Fig. 3

Distribution of students' answers to the self-assessment questionnaire. A numerical scale was used, with 4 corresponding to "to a large extent" and 1 corresponding to "not at all"

Hypothesis verification

With respect to Hp1 (the realistic design sessions effectively teach how to use value models by enabling the students to obtain results that do not differ from those obtained by industrial practitioners), the correlations set by the students in the templates were compared to those of the practitioners. To test the hypothesis, the correlations between needs and requirements set by all the teams were summarized and divided into two groups: the answers given by the groups of students and the answers given by the groups of practitioners. The comparison consisted of verifying whether the correlation coefficients and the non-linear merit functions were set irrespective of the type of group. For this comparison, a non-parametric Mann–Whitney U test was applied to each correlation between needs and requirements and to each selected function. The non-parametric Mann–Whitney U test was the most suitable statistic since it allows comparing two independent groups, accepting both ordinal and continuous variables as dependent variables. In order to apply the Mann–Whitney U test, four assumptions needed to be verified, namely: (1) the dependent variable should be measured at the ordinal or continuous level; (2) the independent variable should consist of two categorical independent groups; (3) there should be independence of observations; and (4) when the two variables are not normally distributed, it should be determined whether the two distributions have the same shape. The data collected satisfied the first, second, and third assumptions, while the assumption of equal distributions between the two groups needed to be tested. To do so, a test of homogeneity of variance was applied to each correlation and to each function set by the two groups.
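The "based on median" variant of the homogeneity of variance test (the Brown–Forsythe form of Levene's test) can be sketched in a few lines. The data below are illustrative; a statistics package would additionally supply the significance level and the adjusted degrees of freedom of the kind reported in Tables 2 and 3:

```python
# Sketch of the Brown-Forsythe statistic: Levene's test computed on
# absolute deviations from each group's median. Illustrative data only.
from statistics import median, mean

def brown_forsythe(*groups):
    """Levene statistic with group medians as centers."""
    k = len(groups)
    # absolute deviations from each group's median
    z = [[abs(x - median(g)) for x in g] for g in groups]
    zbar_i = [mean(zi) for zi in z]              # per-group mean deviation
    grand = mean([v for zi in z for v in zi])    # grand mean deviation
    n_total = sum(len(g) for g in groups)
    between = sum(len(zi) * (m - grand) ** 2 for zi, m in zip(z, zbar_i))
    within = sum((v - m) ** 2 for zi, m in zip(z, zbar_i) for v in zi)
    return ((n_total - k) / (k - 1)) * between / within

w = brown_forsythe([0.1, 0.3, 0.3, 0.9], [0.3, 0.1, 0.9, 0.9, 0.3])
```

A small statistic (and a significance above 0.05) means there is no evidence against the equal-shape assumption, which is what allowed the Mann–Whitney p values to be interpreted directly.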
Table 2 shows the results of the homogeneity of variance test, based on median and with adjusted degrees of freedom, concerning the correlation coefficients set between the component-level needs and the component-level requirements. Table 3 shows the results of the same test concerning the type of non-linear merit function set between the component-level needs and the component-level requirements.
Table 2

Results from the homogeneity of variance test (based on median and with adjusted df) concerning the correlation coefficients set between the component-level needs and the component-level requirements

| Correlation | Levene statistic | df1 | df2 | Sig. |
| --- | --- | --- | --- | --- |
| Stiffness-diameter | .801 | 1 | 37.563 | .377 |
| Friction-diameter | 3.139 | 1 | 37.961 | .084 |
| Weight-diameter | .912 | 1 | 37.941 | .346 |
| Manufacturability-diameter | .003 | 1 | 37.871 | .956 |
| Reparability-diameter | 1.139 | 1 | 37.559 | .293 |
| Stiffness-tire width | .709 | 1 | 37.890 | .405 |
| Friction-tire width | 2.132 | 1 | 33.192 | .154 |
| Weight-tire width | .012 | 1 | 37.865 | .914 |
| Manufacturability-tire width | .185 | 1 | 37.265 | .669 |
| Reparability-tire width | .083 | 1 | 37.894 | .775 |
| Stiffness-spoke thickness | .064 | 1 | 37.991 | .801 |
| Friction-spoke thickness | 3.159 | 1 | 36.062 | .084 |
| Weight-spoke thickness | 1.540 | 1 | 35.630 | .223 |
| Manufacturability-spoke thickness | 2.067 | 1 | 33.073 | .160 |
| Reparability-spoke thickness | .390 | 1 | 32.882 | .537 |
| Stiffness-composite material | .003 | 1 | 37.948 | .954 |
| Friction-composite material | .947 | 1 | 33.553 | .337 |
| Weight-composite material | 1.504 | 1 | 22.000 | .233 |
| Manufacturability-composite material | .001 | 1 | 37.860 | .977 |
| Reparability-composite material | .265 | 1 | 36.097 | .610 |

Table 3

Results from the homogeneity of variance test (based on median and with adjusted df) concerning the non-linear merit functions set between the component-level needs and the component-level requirements

| Correlation | Levene statistic | df1 | df2 | Sig. |
| --- | --- | --- | --- | --- |
| Stiffness-diameter | .801 | 1 | 37.563 | .377 |
| Friction-diameter | 3.139 | 1 | 37.961 | .084 |
| Weight-diameter | .912 | 1 | 37.941 | .346 |
| Manufacturability-diameter | .003 | 1 | 37.871 | .956 |
| Reparability-diameter | 1.139 | 1 | 37.559 | .293 |
| Stiffness-tire width | .709 | 1 | 37.890 | .405 |
| Friction-tire width | 2.132 | 1 | 33.192 | .154 |
| Weight-tire width | .012 | 1 | 37.865 | .914 |
| Manufacturability-tire width | .185 | 1 | 37.265 | .669 |
| Reparability-tire width | .083 | 1 | 37.894 | .775 |
| Stiffness-spoke thickness | .064 | 1 | 37.991 | .801 |
| Friction-spoke thickness | 3.159 | 1 | 36.062 | .084 |
| Weight-spoke thickness | 1.540 | 1 | 35.630 | .223 |
| Manufacturability-spoke thickness | 2.067 | 1 | 33.073 | .160 |
| Reparability-spoke thickness | .390 | 1 | 32.882 | .537 |
| Stiffness-composite material | .003 | 1 | 37.948 | .954 |
| Friction-composite material | .947 | 1 | 33.553 | .337 |
| Weight-composite material | 1.504 | 1 | 22.000 | .233 |
| Manufacturability-composite material | .001 | 1 | 37.860 | .977 |
| Reparability-composite material | .265 | 1 | 36.097 | .610 |

The level of significance shown in the last column on the right was larger than 0.05 for each correlation. This indicated that the null hypothesis of the non-parametric version of the homogeneity of variance test was retained, meaning that the assumption of equal distributions was also satisfied. It was therefore correct to interpret the p value of the results of a Mann–Whitney U statistic.

After the verification of the fourth assumption, the statistic was applied to compare the two groups of respondents. Given the limited number of observations (40 observations for each correlation between needs and requirements), the analysis relied on the exact significance of the test rather than on an asymptotic approach, and the two-tailed version was selected as the more appropriate. Table 4 shows the exact significance level of the Mann–Whitney U test obtained for both correlation coefficients (second column) and non-linear merit functions (third column), considering the type of team (students or practitioners) as the independent variable.
Table 4

Results of the Mann–Whitney U test for the choice of correlation coefficients (second column) and the choice of non-linear merit functions (third column)

| Correlation | Exact Sig. (2-tailed) correlations | Exact Sig. (2-tailed) functions |
| --- | --- | --- |
| Stiffness-diameter | .362 | .576 |
| Friction-diameter | .072 | .409 |
| Weight-diameter | .232 | 1.000 |
| Manufacturability-diameter | .669 | .888 |
| Reparability-diameter | .573 | .197 |
| Stiffness-tire width | .606 | .557 |
| Friction-tire width | .229 | .896 |
| Weight-tire width | .390 | 1.000 |
| Manufacturability-tire width | .220 | .547 |
| Reparability-tire width | .665 | .611 |
| Stiffness-spoke thickness | 1.000 | .211 |
| Friction-spoke thickness | .080 | .297 |
| Weight-spoke thickness | .108 | .838 |
| Manufacturability-spoke thickness | .582 | .695 |
| Reparability-spoke thickness | .991 | .765 |
| Stiffness-composite material | 1.000 | .836 |
| Friction-composite material | .736 | .133 |
| Weight-composite material | .499 | .149 |
| Manufacturability-composite material | 1.000 | .563 |
| Reparability-composite material | .181 | .465 |

The results in the table show that none of the correlations featured a significance level smaller than 5% (i.e. 0.05); thus, the null hypothesis was retained in all cases. These results suggest that being part of the group of students or of the group of practitioners did not influence how participants populated the value models.

Discussion and conclusion

Engineering education is evolving, driven by the constantly changing needs of society and industry. Reforming the way engineering education is organized is a challenge that needs to be approached from different directions, considering the wide spectrum of engineering disciplines. This paper has focused on a specific aspect of engineering activity, i.e. the use of VDD methods in preliminary design, and on how this can be taught to university students.

Value Driven Design is a rather novel topic in the engineering design and systems engineering literature, and no formalized methods applicable to education have yet been presented. The use of value models in VDD requires individual thinking, judgment, and interdisciplinary cognitive capabilities going beyond the mere application of a method. For this reason, teaching VDD cannot be limited to the description of the logic of a method; rather, it requires an approach stimulating students to think, judge, and use their interdisciplinary knowledge to learn and reflect on the positive and negative qualities of the methods. The realistic design sessions presented in this paper recreated a design decision environment that mimicked a real industrial situation, in line with what was proposed by Muller and Bonnema (2013). The adoption of the new teaching activity in the course was framed within the university's effort to move toward the transformation objectives highlighted by the CDIO initiative.

Realistic design sessions engage the students in a scenario in which they directly experience complex issues and deal with them in a restricted timeframe. In the specific case of this paper, practicing the use of value models through group discussion and shared decisions, rather than through the presentation of theory, was planned as an enabler for the students to reflect more effectively upon the trade-off between the benefit of a value model and the effort spent on it, ultimately nurturing students' discussion about the benefits and drawbacks of VDD methods. This capability is particularly relevant in those areas of design in which there are no right or wrong answers; rather, the best solution depends upon a mix of variables that are often difficult to know beforehand.

The analysis of the data collected through the course questionnaire showed that, after the activity, the students felt more self-confident in being able to perform analysis and value assessment in early design. Such results could be seen as an achievement of the "learning by doing" process granted by the design sessions. However, there is a gap between being confident in using a method and actually using it in the right way. This gap was addressed by the testing of Hp1, assuming as positive a situation in which students and practitioners behave in the same way when doing value analysis. The results presented in Table 4 show that the 17 teams of students obtained results with no statistically significant difference from those obtained by the 23 teams of practitioners, strengthening the hypothesis that the use of the realistic design sessions allowed the students to understand and apply the new method correctly, despite their lack of previous experience.

Overall, the results of the data analysis indicate that the use of realistic design sessions contributed to the achievement of the intended learning outcomes of the course (i.e. ILO_2, develop and evaluate the value contribution of different design concepts, and ILO_5, run value analysis); nevertheless, some limitations and possible improvements emerged and are worth discussing.

Firstly, although the analysis indicates that the sessions were beneficial in the frame of the course, the students faced a design episode that referred to a knowledge domain relatively close to their experience (i.e. the use of a bicycle). The involvement of the students in a larger-scale activity going beyond the 75-min timeframe would considerably expand the data collection possibilities. This would, at the same time, increase the possibility of introducing biases in measurement and analysis. In this respect, framing the experiments in a student-driven project encompassing the development of a new product or system, from needs identification to requirements definition, is perceived as a relevant area for the further development of the educational approach. This could give the students enough freedom to test and select the type of value model most suitable for their specific situation, aiming to further foster team discussion and individual reflection.

Limitations are present in relation to the generalizability of the findings and to the research approach. The use of realistic design sessions was decided in relation to the renewal of the structure and intended learning outcomes of the Value Innovation course, and it has been applied in three consecutive years. No similar activity previously took place in the frame of the same course, which renders it impossible to compare the results, in terms of achievement of intended learning outcomes, with those of a control group subjected to a different teaching activity (i.e. one not using realistic design sessions). Furthermore, different value modeling methods exist in the VDD literature, used in different industrial settings; thus it is not possible to claim the generalizability of the results for the teaching of any type of value model. In the experiment, the EVOKE approach was selected as representative of the main common characteristics of VDD methods because it embeds the most popular features of conceptual design assessment methods, i.e. the use of QFD-like correlations with rank-weighted needs and the use of non-linear correlation functions.

In conclusion, in line with the university's strategy to move toward a CDIO educational architecture, more research effort has to be devoted to the introduction of broader initiatives for teaching VDD, going beyond the introduction of a single activity in the frame of a course. "Learning by doing" is a remarkable development direction; however, embracing such a principle does not equate to providing quality education that really meets the intended learning outcomes. The risk is that of being overwhelmed by a plethora of teaching activities and methods claiming to provide improvement just because they partially adhere to the CDIO architecture or apply the "learning by doing" principle. A key challenge for education research in VDD is, therefore, to be able to discriminate between different methods and initiatives, so as to bring forward those that provide scientific evidence of their effectiveness toward achieving higher learning outcomes for the students. The method proposed in this paper, and the analysis of its effectiveness, is an expression of such a belief and encourages a more detailed development of metrics to evaluate the effectiveness of educational approaches in engineering design.

Notes

Acknowledgements

This study was funded by the Swedish Knowledge and Competence Development Foundation through the Model Driven Development and Decision Support research profile at Blekinge Institute of Technology.

References

  1. AIAA VDD Committee. (2015). Value-driven design. Available at http://vddi.org/vdd-home.htm. Accessed February 22, 2019.
  2. Al-Ashaab, A., Golob, M., Attia, U. M., Khan, M., Parsons, J., Andino, A., et al. (2013). The transformation of product development process into lean environment using set-based concurrent engineering: A case study from an aerospace industry. Concurrent Engineering: Research and Applications, 21(4), 268–285.
  3. Anderson, E. W., & Sullivan, M. W. (1993). The antecedents and consequences of customer satisfaction for firms. Marketing Science, 12(2), 125–143.
  4. Atman, C. J., Adams, R. S., Cardella, M. E., Turns, J., Mosborg, S., & Saleem, J. (2007). Engineering design processes: A comparison of students and expert practitioners. Journal of Engineering Education, 96(4), 359.
  5. Atman, C. J., & Bursic, K. M. (1996). Teaching engineering design: Can reading a textbook make a difference? Research in Engineering Design, 8(4), 240–250.
  6. Benedetto, S., Bernelli Zazzera, F., Bertola, P., Cantamessa, M., Ceri, S., Ranci, C., et al. (2010). Alta Scuola Politecnica: An ongoing experiment in the multidisciplinary education of top students towards innovation in engineering, architecture and design. European Journal of Engineering Education, 35(6), 627–643.
  7. Bertoni, A. (2013). Analyzing product-service systems conceptual design: The effect of color-coded 3D representation. Design Studies, 34(6), 763–793.
  8. Bertoni, A. (2019). A reverse engineering role-play to teach systems engineering methods. Education Sciences, 9(1), 30.
  9. Bertoni, A., Bertoni, M., & Isaksson, O. (2013). Value visualization in product service systems preliminary design. Journal of Cleaner Production, 53, 103–117.
  10. Bertoni, M., Bertoni, A., & Isaksson, O. (2018). Evoke: A value-driven concept selection method for early system design. Journal of Systems Science and Systems Engineering, 27(1), 46–77.
  11. Bertoni, A., Bertoni, M., Panarotto, M., Johansson, C., & Larsson, T. C. (2016). Value-driven product service systems development: Methods and industrial applications. CIRP Journal of Manufacturing Science and Technology, 15, 42–55.
  12. Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. 1991 ASHE-ERIC higher education reports. ERIC Clearinghouse on Higher Education, The George Washington University, One Dupont Circle, Suite 630, Washington, DC 20036-1183.
  13. Bordogna, J., Fromm, E., & Ernst, E. W. (1993). Engineering education: Innovation through integration. Journal of Engineering Education, 82(1), 3–8.
  14. Bronet, F., Eglash, R., Gabriele, G., Hess, D., & Kagan, L. (2003). Product design and innovation: Evolution of an interdisciplinary design curriculum. International Journal of Engineering Education, 19(1), 183–191.
  15. Brown, O., & Eremenko, P. (2008). Application of value-centric design to space architectures: The case of fractionated spacecraft. In AIAA SPACE 2008 conference & exposition, San Diego, California, AIAA paper (p. 7869).
  16. Cash, P., Elias, E., Dekoninck, E., & Culley, S. (2012). Methodological insights from a rigorous small scale design experiment. Design Studies, 33(2), 208–235.
  17. Castagne, S., Curran, R., & Collopy, P. (2009). Implementation of value-driven optimisation for the design of aircraft fuselage panels. International Journal of Production Economics, 117(2), 381–388.
  18. CDIO. (2018). Available at http://www.cdio.org/cdio-collaborators/school-profiles. Accessed April 13, 2018.
  19. Chen, W., Hoyle, C., & Wassenaar, H. J. (2013). Decision-based design: Integrating consumer preferences into engineering design. London: Springer.
  20. Cheung, J., Scanlan, J., & Wiseall, S. (2008). Value driven design—an initial study applied to novel aerospace components in Rolls-Royce plc. In R. Curran, S. Y. Chou, & A. Trappey (Eds.), Collaborative product and service life cycle management for a sustainable world (pp. 241–248). London: Springer.
  21. Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  22. Collopy, P. D., & Hollingsworth, P. M. (2011). Value-driven design. Journal of Aircraft, 48(3), 749–759.
  23. Crawley, E. F., Malmqvist, J., Lucas, W. A., & Brodeur, D. R. (2011). The CDIO syllabus v2.0. An updated statement of goals for engineering education. In Proceedings of 7th international CDIO conference, Copenhagen, Denmark.
  24. Crawley, E. F., Malmqvist, J., Östlund, S., Brodeur, D. R., & Edström, K. (2014). Rethinking engineering education: The CDIO approach (pp. 11–45). Cham: Springer.
  25. Day, D. V. (2000). Leadership development: A review in context. The Leadership Quarterly, 11(4), 581–613.
  26. Dym, C. L. (1999). Learning engineering: Design, languages, and experiences. Journal of Engineering Education, 88(2), 145.
  27. Ellis, G., & Dix, A. (2006). An explorative analysis of user evaluation studies in information visualization. In Proceedings of the 2006 AVI workshop on beyond time and errors: Novel evaluation methods for information visualization, Venice, Italy (pp. 1–7).
  28. Eppinger, S. D., & Ulrich, K. T. (1995). Product design and development. New York: McGraw-Hill Higher Education.
  29. Erginel, N. (2010). Construction of a fuzzy QFD failure matrix using a fuzzy multiple-objective decision model. Journal of Engineering Design, 21(6), 677–692.
  30. Felder, R. M., Woods, D. R., Stice, J. E., & Rugarcia, A. (2000). The future of engineering education II. Teaching methods that work. Chemical Engineering Education, 34(1), 26–39.
  31. Forbis, J. L., & Mehta, N. T. (1981). Value-based strategies for industrial products. Business Horizons, 24(3), 32–42.
  32. Gericke, K., & Eckert, C. (2015). The long road to improvement in modelling and managing engineering design processes. In Proceedings of the international conference on engineering design 2015. Milan: The Design Society.
  33. Grönroos, C., & Voima, P. (2013). Critical service logic: Making sense of value creation and co-creation. Journal of the Academy of Marketing Science, 41(2), 133–150.
  34. Guenov, M. (2008). Covariance structural models of the relationship between the design and customer domains. Journal of Engineering Design, 19(1), 75–95.
  35. Hauser, J. R., & Clausing, D. (1988). The house of quality. Harvard Business Review, 66(3), 63–73.
  36. Hazelrigg, G. A. (1998). A framework for decision-based engineering design. Journal of Mechanical Design, 120(4), 653–658.
  37. INCOSE. (2015). Systems engineering handbook: A guide for system life cycle processes and activities, version 4.0. Hoboken, NJ: Wiley. ISBN 978-1-118-99940-0.
  38. Jablokow, K. W. (2008). Developing problem solving leadership: A cognitive approach. International Journal of Engineering Education, 24(5), 936–954.
  39. Johansson, C., Hicks, B., Larsson, A. C., & Bertoni, M. (2011). Knowledge maturity as a means to support decision making during product-service systems development projects in the aerospace sector. Project Management Journal, 42(2), 32–50.
  40. Kleinsmann, M., Deken, F., Dong, A., & Lauche, K. (2012). Development of design collaboration skills. Journal of Engineering Design, 23(7), 485–506.
  41. Kowalkowski, C., & Kindström, D. (2009). Value visualization strategies for PSS development. In T. Sakao & M. Lindahl (Eds.), Introduction to product/service-system (pp. 159–181). London: Springer.
  42. Krishnan, V., & Ulrich, K. T. (2001). Product development decisions: A review of the literature. Management Science, 47(1), 1–21.
  43. Luft, T., Schleich, B., & Wartzack, S. (2013). Concept development for innovative products—A challenge for engineering design education. In DS 75-8: Proceedings of the 19th international conference on engineering design (ICED13), design for harmonies, Vol. 8: Design education, Seoul, Korea, 19–22 August 2013.Google Scholar
  44. Mclaren, A. (2009). Approaches to the teaching of design: An engineering subject centre guide. Higher Education Academy, Loughborough University, Leicester. ISBN 978-1-904804-802.Google Scholar
  45. McManus, H. M., Richards, M. G., Ross, A. M., & Hastings, D. E. (2007). A framework for incorporating “ilities” in tradespace studies. In AIAA space 2007, Long Beach, CA.Google Scholar
  46. Miles, L. D. (1962). Techniques of value analysis and engineering. New York: McGraw-Hill.Google Scholar
  47. Mitsui, H., Kambe, H., & Koizumi, H. (2009). Use of student experiments for teaching embedded software development including HW/SW co-design. IEEE Transactions on Education, 52(3), 436–443.Google Scholar
  48. Muller, G., & Bonnema, G. M. (2013). Teaching systems engineering to undergraduates; experiences and considerations. In INCOSE international symposium, Philadelphia, USA (Vol. 23, No. 1, pp. 98–111).Google Scholar
  49. Nadelson, L. S., Pfiester, J., Callahan, J., & Pyke, P. (2015). Who is doing the engineering, the student or the teacher? The development and use of a rubric to categorize level of design for the elementary classroom. Journal of Technology Education, 26(2), 22–45.Google Scholar
  50. Napp, J. B. (2004). Survey of library services at engineering news record’s top 500 design firms: Implications for engineering education. Journal of Engineering Education, 3, 247–252.Google Scholar
  51. National Research Council. (1995). Engineering education: Designing an adaptive system (p. 1995). Washington, DC: National Academy Press.Google Scholar
  52. Normann, R., & Ramirez, R. (1993). From value chain to value constellation: Designing interactive strategy. Harvard Business Review, 71(4), 65–77.Google Scholar
  53. O’Neill, M. G., Yue, H., Nag, S., Grogan, P., & de Weck, O. (2010). Comparing and optimizing the DARPA system F6 program value-centric design methodologies. In Proceedings of the AIAA SPACE 2010 conference & exposition, Anaheim, California (p. 8828).Google Scholar
  54. Ollis, D. F. (2004). Basic elements of multidisciplinary design courses and projects. International Journal of Engineering Education, 20(3), 391–397.Google Scholar
  55. Panarotto, M., Wall, J., Bertoni, M., Larsson, T., & Jonsson, P. (2017). Value-driven simulation: Thinking together through simulation in early engineering design. In 21st international conference on engineering design (ICED). Vancouver: The Design Society.Google Scholar
  56. Pop-Iliev, R., & Nokleby, S. B. (2011). Concurrent approach to teaching concurrent design engineering. In Proceedings of the Canadian design engineering network (CDEN) conference, Kaninaskis, Alberta.Google Scholar
  57. Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.Google Scholar
  58. Ross, A. M., Hastings, D. E., Warmkessel, J. M., & Diller, N. P. (2004). Multi-attribute tradespace exploration as front end for effective space system design. Journal of Spacecraft and Rockets, 41(1), 20–28.Google Scholar
  59. Shapiro, B. P., & Jackson, B. B. (1978). Industrial pricing to meet customer needs. Harvard Business Review, 56(6), 119–127.Google Scholar
  60. Soban, D., Hollingsworth, P., & Price, M., (2011). Defining a research agenda in value driven design: Questions that need to be asked. In Proceedings of the 2nd international air transport and operations symposium, TU Delft, Netherlands.Google Scholar
  61. Steiner, F., & Harmon, R. (2009). The impact of intangible value on the design and marketing of new products and services: An exploratory approach. In Proceedings of PICMET 2009, Portland, Oregon USA.Google Scholar
  62. Swedish ministry of education and cultural affair (2003). Act concerning Ethical review of Research involving Humans. SFS 2003: 460 [Swedish Code of Statutes].Google Scholar
  63. Tsenn, J., Atilola, O., McAdams, D. A., & Linsey, J. S. (2014). The effects of time and incubation on design concept generation. Design Studies, 35(5), 500–526.Google Scholar
  64. Ullman, D. G. (1992). The mechanical design process. New York: McGraw-Hill.Google Scholar
  65. Ullman, D. G. (2001). Robust decision-making for engineering design. Journal of Engineering Design, 12(1), 3–13.Google Scholar
  66. Vignoli, M., & Onghia, F. (2015). Reggio Emilia engineering education. In DS 80-8 proceedings of the 20th international conference on engineering design (ICED 15), Vol. 8: Innovation and creativity, Milan, Italy, 27–30 July 2015.Google Scholar
  67. Wallace, K. (2011). Transferring design methods into practice. In H. Birkhofer (Ed.), The future of design methodology (pp. 239–248). London: Springer.Google Scholar
  68. Wooley, M., Scalan, J., & Eveson, W., (2001). Optimising the development of a medical device using formal engineering design techniques and the CODA system. In Proceedings of the 27th international conference on concurrent enterprising (pp. 367–376).Google Scholar
  69. Zeithaml, V. A. (1988). Consumer perceptions of price, quality, and value: A means-end model and synthesis of evidence. Journal of Marketing, 52(3), 2–22.Google Scholar
  70. Zhang, X., Tong, S., Eres, H., Wang, K., & Kossmann, M. (2015). Towards avoiding the hidden traps in QFD during requirements establishment. Journal of Systems Science and Systems Engineering, 24(3), 316–336.Google Scholar

Copyright information

© The Author(s) 2019

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Department of Mechanical Engineering, Blekinge Institute of Technology, Karlskrona, Sweden