1 Introduction

The concept of hybridization has been applied for many years across multiple manufacturing industries as a way to produce parts more efficiently and productively by combining two or more technologies [1]. In recent years, the term has gained interest and found applications in workflows involving additive manufacturing (AM). Hybrid AM processes can be defined as the use of AM with one or more secondary processes or energy sources that are fully coupled and synergistically affect part quality, functionality and/or process performance [2].

Hybrid manufacturing promises capabilities beyond those of any stand-alone technology, and hybrid AM is no exception, as the hardware supporting such functionalities is advancing rapidly. Recent advances in hybrid AM have brought to the market several commercial hybrid machines integrating additive manufacturing with one or more secondary processes or energy sources [3,4,5,6]. The main reasons for using hybrid AM systems in low-volume production with high part variability include the repair/re-manufacturing of high-value parts, the addition of complex features to existing parts, the deposition of special layers to produce multi-metal parts, and the improvement of the surface finish and precision of the deposited parts [7].

Avoiding process deviations is a strong requirement for any kind of production, and it is particularly stringent for AM. These technologies are often not capable of respecting the nominal design, and the dimensional and geometrical accuracy of the resulting product remains a bottleneck for AM with regard to quality assurance and control [8]. For instance, in laser metal deposition (LMD) the process parameters must be carefully selected to deposit the correct amount of material and match the target layer thickness. Any inaccuracy risks causing overbuilt or underbuilt geometries which propagate throughout the deposition [9]. The distance between the nozzle and the substrate was identified as one of the key parameters for layer height control, and various methods, monitoring solutions and algorithms have been developed and tested to support the real-time adaptation of process parameters for specific metal AM applications [10,11,12,13]. Besides deviations of the working distance, the escalation of process instabilities can be promoted by uncontrolled heat buildup during process execution. Monitoring the instantaneous temperature and behaviour of the melt pool, and correlating them with the setpoint values of predefined process parameters, can provide valuable information for the development of closed-loop control algorithms for LMD processes [14, 15]. These approaches mainly focus on the control of either laser power or scanning speed, which can bring undesired consequences such as the modification of the microstructural morphology of the deposited layer [16]. More generally, control algorithms addressing exclusively intra-layer dynamics cannot solve for optimal conditions while considering the full geometry of a part. To better capture how the process evolves across the entire build, Sammons et al. introduced a repetitive process enabling layer-to-layer control by using a spatial dynamic model and process stability criteria for laser metal deposition [17]. Furthermore, in order to handle uncertainties in the model parameters and process setpoints, more robust process controllers can be developed by integrating non-invasive monitoring solutions. These can be seen as key enablers for process defect identification and containment, generating knowledge that supports subsequent interventions based on the type and size of the defect [18, 19]. Prediction and control models enabling the in-process adaptation of parameters have already been proposed by fusing real-time sensor data with part quality data from offline inspection devices [20, 21].

Despite various implementation efforts, full quality control over part geometries generated with pure LMD systems is still hard to achieve due to the many far-from-equilibrium and highly dynamic phenomena caused by extreme heating and cooling cycles. The hybridization of additive and subtractive processes offers a compelling proposition to overcome this limitation by leveraging the synergistic effect of both technologies while considering their cross-process interfaces and minimizing their individual limitations. Such integration is not without challenges, though, and a number of studies have focused on the interactions between processes and their effects on the planning of workflows for hybrid process chains. Automated process planning solutions have already been tackled by considering the full 3D laser-aided deposition of a part followed by five-axis surface finish machining with collision detection [22], a fixed sequence of operations to create a multi-axis setup for the realization of a part [23], the concept of variety management and production volume for the manufacturing of a family of parts [24], or the manufacturability, accessibility and additive-subtractive process capabilities to find adapted hybrid process plans with the support of computational frameworks [25].

Beyond the exclusive integration of additive and subtractive technologies, part inspection has recently been incorporated, to a certain extent, into several methods, tools and models used to determine and adapt hybrid process planning strategies [26,27,28,29,30].

While the integration of LMD with CNC machining is the hybrid AM strategy preferred by most commercial AM equipment manufacturers, the accompanying software capabilities vary considerably in terms of availability, complexity and breadth [31]. They mainly accommodate the definition of static and unidirectional manufacturing strategies, without consideration of the actual manufacturing conditions encountered while the process runs on the machine. Online adaptivity is generally limited to process parameters which can be adjusted during the manufacturing runs without interrupting the ongoing process [32]. In contrast, adaptive CAx systems allow the automated handling of manufacturing process chains despite the influence of uncertain process and product conditions [33]. In this respect, a flexible CAPP system for fused deposition modelling, milling and probing was introduced in [34], considering feature extraction and recognition, process interactions, cost formulations and a multi-agent system to define the sequence of operations for the manufacturing of prismatic parts. Nevertheless, the literature on dynamic hybrid AM process planning is rather scarce, and the need to develop unified software solutions with decision-making and regenerative process planning capabilities for hybrid manufacturing remains an important and open research topic [35,36,37].

Today’s manufacturing industry is undergoing a digital transformation characterized by building intelligence into processes so that they become more event- or insight-driven. Hybrid manufacturing hardware with AM capabilities is progressing fast, but the programming of the CAx chains associated with these systems is still largely done manually, and digital platforms supporting an automated workflow that exploits their potential are lagging behind. This work is motivated by the need to bridge this gap and to capture the logic behind an adaptive hybrid process chain, with the aim of achieving enhanced product quality and improved operational efficiency in hybrid AM workflows. Furthermore, treating inspection as an integrated process of the hybrid chain provides an excellent opportunity to build additional knowledge about LMD deviation scenarios and to support dynamic process planning and informed decisions about the course of the subsequent manufacturing steps, by considering sequential updates of the start geometry for both additive and subtractive processes.

2 Methodology

Generally, the technological integration and programming of the additive and subtractive processes demand significant human intervention, involve the execution of predetermined process sequences and do not accommodate adaptive logics. The focus of the proposed methodology is on regenerative CAD/CAM/CAI activities while aiming to provide the complete logic enabling the full automation of a hybrid AM process chain. The automation of such workflow would not be possible without a comprehensive analysis of the process chain deployed for product creation and a proper integration between the hybrid manufacturing hardware, software and controller layers.

2.1 Hybrid machine

The manufacturing equipment considered is a hybrid 5-axis machine with a working envelope of 800 × 800 × 1200 mm which enables the execution of multi-axis tool paths for both types of manufacturing operations available, additive and subtractive respectively.

The material processing capabilities are enabled by two laser sources, a continuous wave (CW) and a pulsed wave (PW) laser, which can be easily interchanged depending on the selected process. The CW laser provides the heat input for the LMD process, one of the most promising additive technologies, capable of producing near-net-shape components through layer-by-layer construction directly from a 3D CAD model. This technology is particularly interesting for the deposition of green-field parts, the realization of localized coatings and restoration applications. However, the produced parts typically deviate from the targeted nominal shape, and a certain amount of post-processing is required to guarantee consistency with the CAD data. Undesired irregularities add up over multiple layers and lead to error propagation, which influences not only the parts’ shape and surface quality but also their mechanical properties. One solution to overcome these drawbacks and enhance the performance of the additive process is the adoption of a hybrid approach, which in our case combines the AM process with laser ablation. The ablation process relies upon the capability of the PW laser to improve the geometric and dimensional accuracy along the build direction, either by selectively removing irregularities or by reducing the layer thickness. Furthermore, this secondary energy source can reduce the surface roughness and alter the surface composition and properties of materials, while delivering digital flexibility similar to that of the layer-wise additive production and easy integration opportunities for hybrid workflow automation.

Apart from the laser sources, the optical chain of the machine includes a beam combiner, a beam switch and a laser scanner. The beam combiner is one of the key components of the hybrid additive manufacturing process chain of the machine, as it allows a seamless swap between the CW laser source for the deposition process and the PW laser source for the ablation. To enhance the versatility of the hybrid process, the CW optical chain was designed to accommodate a beam switch, a three-port device connecting two different process fibres to the CW laser source. In particular, a 100-µm core diameter fibre, giving a 170-µm spot diameter, and a 600-µm core diameter fibre, giving a 1020-µm spot diameter, have been implemented. The smaller laser spot is selected for the realization of geometric features requiring higher accuracy and resolution, while the larger spot is preferred to deposit near-net-shape parts in a shorter time. The scanning head is the last component of the optical chain. The motion of the ablation laser spot in the x–y working area is provided by a 2D laser scanner consisting of two main parts: (i) a 45°-rotated mirror, used to redirect the laser beam coming from the PW source from vertical to horizontal, and (ii) two mirror galvanometers (X-axis and Y-axis), used to position the laser spot on the working field. To provide uniform irradiance and scanning rate across the focal plane, an F-theta lens was implemented.

The hybrid machine is also equipped with a pyrometer, a melt-pool camera and a sensing system embedded into the gantry structure called the Vision Box (VB) (Fig. 1). The VB contains hardware components to monitor and control the temperature distribution and the actual part geometry. Real-time monitoring of the melt pool geometry on the working surface during deposition allows optimizing the process: through the melt-pool camera, it is possible to measure the melt pool length, width and height. Moreover, the process camera can also be used to control the laser focus positioning, as the deposition and ablation processes cause variations of the working surface z-level, and thus closed-loop feedback is necessary for proper adjustments.

Fig. 1
figure 1

(a) Vision Box; (b) 4D hybrid 5-axis machine; (c) Optical chain, monitoring and inspection components

The actual part geometry is captured with the support of a 3D vision system, also mounted inside the VB, capable of stereoscopic image acquisition and 3D object reconstruction based on a structured-light technique. The VB is connected to an external PC that manages scan triggering and processing, as well as the communication with the machine PLC for synchronizing the machine poses with the part geometry acquisition.

2.2 Software platform infrastructure

The dynamic regeneration of the manufacturing strategy demands a broader role for the planning and control software driving the manufacturing process. By introducing a closed control loop, process disturbances are addressed with the support of feedback data from various sensing systems (e.g. melt pool imaging and pyrometry) to re-establish the values of monitored variables within specified ranges. The proposed approach complements such measures with part-level geometric information provided by 3D scanning activities. The heterogeneity of the measured data ensures extended monitoring of the complex physical phenomena occurring during part manufacturing, increasing the effectiveness of the control scheme.

A dedicated software platform, consisting of different modules, is proposed to support the regeneration of the manufacturing strategy based on the in-process analysis of the current run (Fig. 2). Within the platform, diverse CAD/CAM software tools are collected under a single interface named the Geometric Processing Tools (GPT), which exposes the individual features of the platform in a common manner. In particular, the GPT (custom development) contains the master executable for running the geometric processing tools, consisting of the commercial CADfix® geometry modelling engine, the ModuleWorks® CAM libraries and the HPCD—Hybrid Process Chain Decision—libraries (custom development) built on MATLAB® script functions. The ModuleWorks® SDC Framework (commercial) calculates adaptive tool paths for laser metal deposition and non-contact inspection operations and converts them into NC files with the support of a custom post-processor. The HPCD consists of a series of algorithms and tools to perform the scan-to-CAD operations, the computation of difference volumes between the nominal and as-built part, the redesign of the volumes, the selection of the manufacturing strategy to be subsequently deployed, as well as the full planning of the ablation operation.

Fig. 2
figure 2

Software infrastructure for automated manufacturing strategy regeneration

The 4D Hybrid Process Dispatcher (custom development) collects the user input and orchestrates the communication and data exchange between all modules of the platform (GPT, VB, CNC/PLC).

2.3 Macro-phases

The automation of the regenerative strategy is implemented across a predefined set of activities structured in four macro-phases, as pictured in the workflow diagram in Fig. 3. The first macro-phase consists of an offline collection of the user input, converted into specific rules, dependencies and requirements to be used by the downstream modules. Starting from this input, a preliminary hybrid process chain is defined by splitting the geometry of the part into a number of sections. For every section, an adapted process plan is defined sequentially based on the previously completed manufacturing steps. The part sections are delimited by checkpoints, whose height levels are issued after running the part geometric complexity assessment (Sect. 3.2). The manufacturing strategy for the first section is established based on the user input, whereas all subsequent sections are subject to an in-process regeneration of the initial strategy based on the feedback provided by an inspection operation.
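The decision taken at each checkpoint reduces, in its simplest form, to a tolerance test. The following is a minimal Python sketch of that logic; the function and parameter names are assumptions, and the actual decision also weighs the type and location of the deviation:

```python
def checkpoint_decisions(deviations, tolerance):
    """Return the decision taken after each checkpoint inspection: keep the
    current manufacturing conditions while the measured deviation stays within
    the user-defined tolerance, regenerate the strategy otherwise."""
    return ["regenerate" if abs(d) > tolerance else "keep" for d in deviations]
```

For example, with a 0.2-mm tolerance, measured deviations of 0.05, -0.4 and 0.1 mm would keep, regenerate and keep the strategy, respectively.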

Fig. 3
figure 3

Main macro-phases of the regenerative workflow

The hybrid process chain approach introduced in this work relies on the concept of regenerating a manufacturing strategy starting from an initial set of conditions distributed across default manufacturing operation templates. An inspection operation is deployed sequentially at every predefined checkpoint. This triggers iterative activities within the inspection data analytics and strategy regeneration macro-phases, at the end of which it is decided whether the initial manufacturing conditions are still valid or a regeneration of the manufacturing conditions is needed. Both the inspection data analytics and the strategy regeneration macro-phases (Sect. 4) are considered the core of the decisional path leading to the most adapted strategy. Regardless of the outcome of the decision-making process, a new part program is issued for execution after the completion of the activities performed at every predefined checkpoint.

2.4 Regenerative part program

Generally, a nominal CAD geometry is transformed into a processed part by means of a manufacturing process chain. Very often, the final geometry of a part can be reached by using alternative manufacturing processes associated with specific CAx processing steps. A decisional logic is implemented in order to define which CAx entities and operations need to be instantiated for a manufacturing step of the automated hybrid process chain, as shown in Fig. 4.

Fig. 4
figure 4

Decisional logic and CAx activities to be deployed at each checkpoint

From a structural point of view, the implementation of the hybrid process chain considers both linear and non-linear approaches. The linear dimension is linked with the preliminary concept of the manufacturing steps needed to process a part, materialized into an initial hybrid process chain at the end of the job preparation macro-phase. Practically, this translates into a number of subsequent operations whose sequence (e.g. Deposition_check_1, Inspection_check_1, Operation_check_2, Inspection_check_2, …, Operation_check_n, Inspection_check_n) is defined according to the number of checkpoints established during the part geometric complexity assessment, as shown in Fig. 3. Consequently, the NC commands for the execution of the very first deposition and of all inspection operations are issued, whereas no action is undertaken for the operations labelled Operation_check_xx. The underlying logic is that the strategy to manufacture the entire part is unknown prior to the execution of the part program corresponding to the first checkpoint geometry. The manufacturing of the subsequent sections of the part is subject to an in-process adaptation based on the output of the inspection and the recovery strategy instructions issued by the HPCD in the strategy regeneration macro-phase. After every inspection operation, various CAx activities are performed with the aim of generating data to be used in the planning of the following manufacturing steps. Three alternatives are possible (i.e. A1, A2 or A3), as shown in Figs. 3 and 4. Hence, the non-linearity is associated with the possibility of having alternative process chains between every two consecutive inspection operations.
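The linear dimension of the chain can be sketched as a short routine that derives the preliminary operation sequence from the number of checkpoints; only the first deposition and the inspections are concrete, the remaining operations stay placeholders (the function name is illustrative):

```python
def preliminary_sequence(n_checkpoints):
    """Build the linear hybrid process chain: a concrete first deposition,
    an inspection at every checkpoint and placeholder operations in between."""
    ops = ["Deposition_check_1", "Inspection_check_1"]
    for i in range(2, n_checkpoints + 1):
        ops += [f"Operation_check_{i}", f"Inspection_check_{i}"]
    return ops
```

For a part with three checkpoints, this yields the six-operation sequence described in the text, with the operations at checkpoints 2 and 3 left undefined until inspection feedback arrives.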

The main output of the hybrid process chain workflow is a part program whose manufacturing sections are regenerated dynamically by mapping the identified deviation scenarios to possible regeneration strategies after each checkpoint inspection activity. The manufacturing strategy is regenerated with the support of operation-specific templates, initialization and parametrization files. The templates, stored in a static folder, contain a complete list of characteristic parameters for each considered operation (i.e. deposition, inspection and ablation), set to default values. Essentially, the static folder specifies the conditions under which a specific application will launch, the environment it should launch into and what should happen when it terminates (Fig. 5). In addition to the templates, the static folder contains the executables and libraries of the GPT and a list of default values for the various parameters which do not change between manufacturing runs.
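The parametrization step can be pictured as merging the static template defaults with the regenerated values. The parameter names below are purely illustrative, not the platform's actual template schema:

```python
def parametrize_operation(template_defaults, regenerated):
    """Start from the operation template's default parameters (static folder)
    and overwrite only the values issued by the strategy regeneration."""
    params = dict(template_defaults)   # copy, leaving the stored template untouched
    params.update(regenerated)
    return params
```

For instance, `parametrize_operation({"laser_power": 2000, "layer_height": 0.5}, {"layer_height": 0.4})` keeps the default laser power while applying the corrected layer height.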

Fig. 5
figure 5

File management system for regenerative part program structure

The Process Dispatcher (PD) employs the policy set by the part master folder architecture to dynamically name and archive the manufacturing projects jointly with the inherent folders, subfolders and files supporting the automated workflow. The level 1 (L1) subfolders are CheckpointCAD, InspectionCoP and a number of Operation_check_xx and Inspection_check_xx folders defining a preliminary hybrid process chain for the manufacturing run; they are created according to the number of checkpoints defined during the part geometric complexity assessment phase (Sect. 3.2). By default, every Operation_check_xx folder includes a dedicated subfolder for each operation available in the hybrid process chain. The input and output files subsequently stored in the level 2 (L2) and level 3 (L3) subfolders are geometry files, tool path and NC files, and 3D scan data. The HPCD decides which templates and geometries need to be updated for each operation in order to output adapted NC files for the subsequent sections of the part.
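The L1/L2 folder creation performed by the PD could look as follows. This is a sketch under the naming conventions described above; the exact set of per-checkpoint operation subfolders is an assumption:

```python
from pathlib import Path

def create_project_folders(root, n_checkpoints,
                           operations=("Deposition", "Inspection", "Ablation")):
    """Create the part master folder architecture: the fixed L1 folders plus
    one Operation_check_xx / Inspection_check_xx pair per checkpoint, with a
    subfolder per available operation inside each operation folder (L2)."""
    root = Path(root)
    (root / "CheckpointCAD").mkdir(parents=True, exist_ok=True)
    for sub in ("Inputs", "Outputs"):
        (root / "InspectionCoP" / sub).mkdir(parents=True, exist_ok=True)
    for i in range(1, n_checkpoints + 1):
        for op in operations:
            (root / f"Operation_check_{i}" / op).mkdir(parents=True, exist_ok=True)
        (root / f"Inspection_check_{i}").mkdir(parents=True, exist_ok=True)
    return root
```

Calling `create_project_folders("project_A", 2)` would lay out the tree for a two-checkpoint run before any NC file is generated.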

2.5 Data flow, event management and control

In the proposed hybrid process chain, various types of information need to be passed from one entity to another. Any automation involving the dynamic adaptation of the manufacturing strategy requires a close integration of various types of data, such as CAD and CAM data, process technology, measurement and inspection system data, machine architecture and control. In this respect, the PD was implemented to streamline the communication between the various modules and to synchronize the regeneration strategies with the machine’s control during execution. The PD’s interface (Fig. 6) has groups of input parameters for the project and operation setup and for CAD file import, as well as tools to visualize the connection status with the CNC and the laser sources.

Fig. 6
figure 6

4D Hybrid Process Dispatcher GUI

Apart from collecting the user’s input, the PD controls the execution of the ongoing process chain while orchestrating the data exchange between the various tools and algorithms of the platform. This pertains to the generation of the project folder architecture associated with the imported CAD parts, the linkage of the input parameters to the geometry elements, the transfer of the CAD geometries and redesigned volumes between the modules, the generation of the batch and initialization files, and the control of the launch, execution and termination of the various modules of the platform.

Within the proposed software infrastructure, the GPT interfaces with the PD through a simple scripting interface similar to the INI files commonly used to configure software. The GPT is a software tool that takes a set of input arguments (e.g. a CAD model, layer height, laser power) and outputs either an NC file that can be executed on the hybrid machine or one or several modified CAD models. The GPT script is broken into a hierarchical list of actions, where the higher levels (primary actions) are described in terms of the actions performed at the lower levels (i.e. secondary and atomic actions). In the current version of the software, three main primary actions are undertaken by the GPT, namely bulk deposition, difference volume and checkpoint generation.
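An INI-like scripting interface of this kind can be emulated with Python's standard configparser. The section and parameter names below are assumptions for illustration, not the GPT's actual schema:

```python
import configparser

# Hypothetical GPT job script in the INI-like style described above.
SCRIPT = """
[bulk_deposition]
cad_model    = CheckpointCAD/checkpoint_1.step
layer_height = 0.5
laser_power  = 2000
output       = Operation_check_1/Deposition/deposition_1.nc
"""

def parse_gpt_script(text):
    """Parse a GPT-style script into a dict mapping each primary action to
    its parameter/value pairs."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return {section: dict(cfg[section]) for section in cfg.sections()}
```

Each section would correspond to one primary action, with its parameters expanded into secondary and atomic actions by the GPT itself.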

The CNC of the hybrid machine controls the interpolation of the machine’s axes to realize complex trajectories of the head (for both the deposition and ablation processes) and allows, through PLC function blocks, the control of all sub-systems of the machine (i.e. optical chain and auxiliaries). The control solution adopted is the OPEN-XLi, a product of the Osai OPENcontrol family, which includes a powerful soft PLC fully integrated with the system and sharing its hardware resources. The proposed software platform takes advantage of the flexibility of the integrated PLC to automate the import and execution of the regenerated part programs. The OPENcontrol uses a WebService library based on the SOAP protocol over HTTP to communicate with the PD. PLC-oriented functions and shared-variable management are employed to enable the communication and data exchange between the dispatcher and the CNC-PLC.

Three modes are available for the execution of part programs, one manual and two automated, with a direct impact on the PLC behaviour. The manual mode requires the intervention of the user to load and launch the execution of a new part program. The file mode relies upon a list of part program names whose sequence is written in a text file available at a specific location on the disk and overwritten iteratively after the inspection activities. In the dispatcher mode, the NC kernel gains access to the generated/regenerated part program files based on the file path indicated by the PD and follows sequential instructions for their activation and execution.
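In file mode, the selection logic reduces to reading the sequence file and picking the first program not yet executed. The following is a hypothetical sketch; the actual file location and format are machine-specific:

```python
from pathlib import Path

def next_part_program(sequence_file, executed):
    """Read the part program sequence file (one name per line, overwritten
    after each inspection) and return the next program to run, or None when
    the sequence is exhausted."""
    for name in Path(sequence_file).read_text().split():
        if name not in executed:
            return name
    return None
```

Because the file is rewritten after each inspection, re-reading it at every step naturally picks up any regenerated sequence.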

3 Job preparation

Prior to the activation of the automated routine to regenerate a manufacturing strategy, two main activities related to the first macro-phase should be addressed, namely the collection of the user input and the part geometric complexity evaluation.

3.1 User input

The PD interface supports the collection of the user input required by the various software modules of the platform. Initially, the nominal CAD file of the part and a number of parameters for the deposition and the inspection must be provided. The list of parameters was kept to a minimum, and some values are intended to be shared by different activities of the workflow. For instance, the layer height parameter supports not only the definition of the first deposition strategy but also sets the resolution at which the geometric complexity assessment is performed. Another relevant input for the deposition operation is the deviation tolerance, defining the threshold value above which the regeneration of the initial strategy becomes necessary.

As far as the inspection is concerned, the user needs to provide three parameters for each rotary axis, namely the start angle, the increment value and the number of increments. These values, jointly with the order parameter, are used to automatically generate a number of viewpoints for the 3D scanner to reconstruct the geometry of the part.
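One plausible reading of these three parameters is a simple arithmetic sequence of scanner poses per rotary axis. The sketch below illustrates that reading for a single axis; combining several axes and applying the order parameter is omitted:

```python
def rotary_viewpoints(start_angle, increment, n_increments):
    """Angles (in degrees) at which the 3D scanner acquires a view of the
    part, for a single rotary axis."""
    return [start_angle + i * increment for i in range(n_increments)]
```

For example, a start angle of 0°, an increment of 90° and four increments give views at 0°, 90°, 180° and 270°, enough to reconstruct the part from all sides.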

Due to its supporting role, the ablation operation does not require an explicit input from the user. If the selected regeneration scenario involves an ablation, the outputs from previous activities are used for the setup of the operation (e.g. the cloud of points from the inspection, the checkpoint CAD from the geometric complexity assessment and the redesigned volume for ablation from the HPCD).

3.2 Part geometric complexity

During the execution of the laser metal deposition process, errors are expected to accumulate as the material deposition proceeds. This is mainly due to variations in the actual height of each deposited layer compared to the height predicted during slicing. Nevertheless, it is impractical to check for deposition errors at every layer, because those errors must be measured against the cold final form of the model rather than the hot in-process shape, and the cooling time can be substantial for these parts. Thus, it is desirable that measurements be taken at infrequent intervals throughout the deposition. The solution pursued here is to partition the nominal CAD model into sections delimited by checkpoints, in advance of manufacturing and based on an assessment of the part geometric complexity. The final manufactured part then consists of several separate volumes deposited on top of each other, with inspection and recovery phases taking place after each volume has been deposited.

Ultimately, there is one criterion that determines the number of inspection operations: part geometry sectioning must take place before the accumulated errors become unrecoverable.

Based on previous knowledge related to the deposition of a specific material with pre-defined process parameters and tool path strategies, we can argue that when depositing a fixed slice profile, deposition errors are expected to accumulate at a somewhat predictable rate. Similar defects are expected to be present at similar positions for similar slice perimeters and so, where a stack of slices is similar in profile, these defect hot-spots are where errors will accumulate most rapidly. Thus, a volume with a fixed slice profile can simply be sectioned at a regular interval to yield reasonable volumes for deposition.

When a slice profile is substantially different compared to the previous layer, the defect hot-spots for the new layer can align in an unpredictable way with the hot-spots from earlier layers. This results in a more unpredictable rate of error accumulation as more layers are deposited. A similar issue can arise when slice profiles change slowly over a number of layers, amounting to a substantial change in profile over time.

With all that in mind, the sectioning process can now be described broadly with two rules:

  • A section should not be higher than some maximum, user-defined height based on a conservative estimate for the rate of error accumulation.

  • A section should be created when a layer changes substantially, either suddenly, or over the course of a number of layers.

    Detecting either a sudden or gradual change in layer profile is not trivial and is currently accomplished by relying on the following heuristics:

  • A change in the number of loops defining the perimeter of a given slice should always trigger a checkpoint. This will typically correlate well with the end of some large, extruded section and the beginning of some smaller features deposited on top of that section. It is also the feature we expect to see when two previously separate features merge into one (e.g. an arch).

  • A change in the length of the perimeter of the current slice, expressed as a percentage of the length of the previous slice, exceeding a given threshold. This rule attempts to capture the sudden appearance of a slot or other hole in the side of the deposited region.

  • An accumulated shift in the boundary of each slice. For this heuristic, a rasterization of each slice is generated and the number of pixels that change from one slice to the next is added to a running total. A checkpoint is inserted when that total estimate of the surface area overlap (SAO) exceeds some fraction of the total number of pixels in the rasterization of a given slice (e.g. 50%). This rule aims to capture cases where the slice boundary only changes gradually. Subtracting the rasterization of a slice from that of the previous slice yields an estimate of the surface area in the previous slice not covered by the current slice. Conversely, subtracting the rasterization of the previous slice from the current slice reveals the material on the current layer overhanging the previous layer. Estimates of draft angles can also be calculated with a similar process by identifying the pixel in each rasterization that is furthest from a pixel in the other: that is, the maximum draft angle can be determined by finding the red pixel that is furthest from any black pixel (Fig. 7). To introduce some degree of scale-independence, we measure the change in SAO as a percentage of each layer, with the threshold specified as an accumulated percentage. Counting pixel changes also has the convenient property that a slice profile that simply shifts sideways (e.g. where the volume is a cylindrical rod at an angle to the deposition vector) is identified as a feature requiring extra inspection, even though each slice has the same area and perimeter.
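The accumulated-SAO rule can be sketched in a few lines of NumPy. This is a simplified version in which slices are boolean rasterizations, the per-layer change fraction is accumulated and a checkpoint resets the running total; the loop-count and perimeter rules above are not included:

```python
import numpy as np

def checkpoints_from_sao(slices, threshold=0.5):
    """Insert a checkpoint whenever the accumulated fraction of pixels that
    change between consecutive slice rasterizations exceeds `threshold`.
    `slices` is a list of equally sized boolean arrays, bottom to top."""
    checkpoints, total = [], 0.0
    for i in range(1, len(slices)):
        changed = np.logical_xor(slices[i - 1], slices[i]).sum()
        total += changed / slices[i].sum()   # change as a fraction of the slice
        if total >= threshold:
            checkpoints.append(i)            # checkpoint before layer i
            total = 0.0                      # restart accumulation
    return checkpoints
```

A stack of identical slices accumulates no change and yields no extra checkpoints, while an abrupt profile change, such as a large extrusion giving way to a small feature, triggers one immediately.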

Fig. 7

Rasterization and superimposition of two subsequent layers

The metrics gathered from these heuristics can be used to classify the general complexity of the part. Inspection of the slice area overlap measurements and loop counts allows us to identify changes to part topology and the presence and magnitude of wall angles.
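As an illustration, the accumulated-SAO heuristic above can be sketched on binary slice rasterizations. The function names, the reset-after-checkpoint behaviour and the threshold handling below are illustrative assumptions, not the described implementation:

```python
import numpy as np

def sao_change(prev_mask: np.ndarray, curr_mask: np.ndarray) -> float:
    """Fraction of pixels that differ between two rasterized slices,
    measured relative to the current slice area (scale independence)."""
    changed = int(np.logical_xor(prev_mask, curr_mask).sum())
    area = max(int(curr_mask.sum()), 1)  # guard against empty slices
    return changed / area

def checkpoint_indices(masks, threshold=0.5):
    """Insert a checkpoint whenever the accumulated SAO change exceeds
    `threshold` (e.g. 50%); the accumulator is reset after each
    checkpoint (an assumption of this sketch)."""
    checkpoints, accumulated = [], 0.0
    for i in range(1, len(masks)):
        accumulated += sao_change(masks[i - 1], masks[i])
        if accumulated > threshold:
            checkpoints.append(i)
            accumulated = 0.0
    return checkpoints
```

A slice profile that merely translates sideways (the angled-rod case mentioned above) produces a non-zero `sao_change` at every layer even though area and perimeter are constant, so the accumulator still triggers a checkpoint.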

4 Regeneration strategy planning

The inspection data analytics and strategy regeneration macro-phases deploy a sequence of activities aimed at making informed decisions about the in-process regeneration of the manufacturing strategy (Fig. 4).

The systematic capture of digital geometric content related to the part being manufactured relies on a non-destructive optical testing technique, referred to hereafter simply as inspection. The final point cloud obtained after each inspection operation describes the geometry of the current state of the build as an n×3 matrix of point coordinates. This file is used to reconstruct the volume of the build and to register it against the nominal checkpoint. Its main purpose is to provide the geometric and dimensional information needed to characterize possible deviation scenarios, which in turn drive the planning of the regeneration strategy within the inspection data analytics and strategy regeneration macro-phases. The underlying goal of the inspection data analytics process is to support a better understanding of the interplay of material, process and geometry in the production of AM parts, eventually leading to optimum conditions for these variables. The final macro-phase of the workflow addresses the configuration of the operation templates and the generation of the corresponding tool paths and associated NC files based on the user input and all decisions previously taken.

4.1 Registration

At each inspection stage, a point cloud of the current state of the build is compared against a CAD model representing the expected part shape at the current checkpoint. The cloud stored in the Inputs subfolder of the InspectionCoP folder (Fig. 5) is registered by the GPT to the CAD model using the Iterative Closest Point (ICP) algorithm. Several filtering steps are applied before and after registration to obtain a point cloud suitable for downstream processing, using the Point Cloud Library processing tools. Before registration, statistical outlier removal clears away noise in the incoming cloud. The baseplate is identified in the point cloud by searching for the largest planar area of point samples, and the CAD model is then registered to the point cloud using ICP. The transformation matrix calculated by ICP and the filtered point cloud, saved in .PLY format, are stored in the Outputs subfolder and used downstream by the HPDC for the extraction of the difference volumes.
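The described pipeline is built on the Point Cloud Library; purely as a self-contained illustration of the ICP registration step, the following is a minimal point-to-point ICP in Python (brute-force nearest neighbours and Kabsch alignment), not the actual PCL-based implementation:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst,
    given point-to-point correspondences."""
    c_src, c_dst = src.mean(0), dst.mean(0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, iterations=50, tol=1e-8):
    """Minimal point-to-point ICP: returns a 4x4 homogeneous transform
    registering `source` (n×3) onto `target` (m×3)."""
    src = source.copy()
    T = np.eye(4)
    prev_err = np.inf
    for _ in range(iterations):
        # brute-force nearest neighbours (fine for small clouds only)
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        nn = target[d.argmin(axis=1)]
        R, t = best_rigid_transform(src, nn)
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                     # accumulate the incremental motion
        err = d.min(axis=1).mean()
        if abs(prev_err - err) < tol:    # converged
            break
        prev_err = err
    return T
```

In the actual workflow, statistical outlier removal and baseplate plane segmentation would precede this step; they are omitted here for brevity.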

4.2 Analysis and volume redesign

The transformation matrix and the filtered point cloud are used as input by the HPDC for data point interpolation in order to recreate the digital surface representation of the deposited object. Both the reconstructed and the reference model of the object are expressed in the STL format for comparison.

The comparison of the 3D volumes leads to the extraction of the differences between them, labelled as difference volumes. Depending on the detected deviations, a maximum of two difference volumes can be extracted at every checkpoint. These volumes, denoted hereafter as the over-sized deposition and under-sized deposition volume respectively, represent deviations of the actual from the nominal part dimensions, with positive values indicating the deposition of excess material and negative values indicating a lack of deposited material with respect to the current checkpoint CAD.
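The extraction itself operates on the reconstructed STL volumes; as a simplified stand-in, the same over-/under-sized split can be illustrated on voxel occupancy grids (the voxel representation and function name are assumptions for illustration):

```python
import numpy as np

def difference_volumes(actual: np.ndarray, nominal: np.ndarray,
                       voxel_vol: float = 1.0):
    """Voxel-grid sketch of difference-volume extraction: given boolean
    occupancy grids of the actual build and the nominal checkpoint
    geometry, return the over-/under-sized grids and their volumes."""
    over = actual & ~nominal    # material deposited beyond the nominal shape
    under = nominal & ~actual   # nominal material not yet deposited
    return over, under, over.sum() * voxel_vol, under.sum() * voxel_vol
```

When both returned volumes are non-empty, the mixed-deviation situation discussed later in the decision step arises.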

The deviations’ tolerances, defining the threshold beyond which a specific type of deviation is not acceptable, are taken into account during the comparison process between the actual and nominal volume. An example of an over-sized deposition volume is shown in Fig. 10a.

The resulting volumes act as primary geometric data enabling the definition of the deviation scenarios and the decisional path leading to strategy regeneration, described in the subsequent Sects. 4.3 and 4.4.

The analysis of the extracted volumes is sequential and starts with the under-sized deposition volume since this one, when detected, also influences the interpretation of the over-sized volume. The readiness of each difference volume to be processed with the available technologies is analysed considering the laser spot diameter and the standoff distance as representative technological parameters. If under-deposition is present, an initial discretization with a one-micron layer height is performed. After the volume discretization step, the shape complexity of the difference volumes is assessed using quantitative data extracted in a layer-wise fashion and labelled as layer statistics (i.e. number of contours within each layer, area of each contour, minimum distance between the contours, Z-level for each layer, and total surface area overlaps between consecutive layers). A custom computation algorithm detects, within both the over-sized and under-sized deposition volumes, the presence of features which might hinder the deployment of a specific operation. For instance, the presence of tiny valleys and/or gaps could preclude deploying deposition alone as the subsequent operation to compensate for the deviation. In that case the deposition operation can only be instantiated after the removal of the critical features through ablation, which in practice triggers the preparation of a full recovery strategy.
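The layer-statistics extraction can be sketched on rasterized layers of a discretized difference volume. This is a minimal stand-in for the custom algorithm described above, with contour detection reduced to 4-connected component labelling and the minimum-distance metric omitted:

```python
import numpy as np

def label_components(mask):
    """4-connected component labelling of a boolean raster (flood fill)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    rows, cols = mask.shape
    for i in range(rows):
        for j in range(cols):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:
                    r, c = stack.pop()
                    if (0 <= r < rows and 0 <= c < cols
                            and mask[r, c] and labels[r, c] == 0):
                        labels[r, c] = current
                        stack += [(r + 1, c), (r - 1, c),
                                  (r, c + 1), (r, c - 1)]
    return labels, current

def layer_statistics(layers, z_levels, pixel_area=1.0):
    """Per-layer statistics for the shape-complexity assessment:
    contour count, contour areas, Z-level and surface-area overlap
    with the previous layer."""
    stats, prev = [], None
    for mask, z in zip(layers, z_levels):
        labels, n = label_components(mask)
        areas = [float((labels == k).sum() * pixel_area)
                 for k in range(1, n + 1)]
        overlap = (float((mask & prev).sum() * pixel_area)
                   if prev is not None else None)
        stats.append({"z": z, "n_contours": n,
                      "areas": areas, "overlap": overlap})
        prev = mask
    return stats
```

Features such as tiny valleys or closely spaced peaks would then be flagged by thresholding these per-layer metrics against the laser spot diameter.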

The volumes are then inspected to detect the presence of peaks, in a similar fashion to the valley condition. The detection of peaks with small areas lying very close to each other again invalidates the deposition operation for filling the voids around them, and ablation is called to remove them first. Hence, both initial difference volumes also need to be updated. The next two conditions address some of the physical constraints of the manufacturing equipment in terms of the standoff distances to be respected for each process.

The recommendations resulting from the aforementioned reasoning steps enable the redesign of the initial difference volumes. The goal of the redesign step is to adapt the volumes to a particular deviation scenario to be compensated during the manufacturing of the next section of the part, in strict correlation with the operations to be carried out. The redesign takes the difference volumes as start geometries subject to Boolean operations such as union, intersection or subtraction. An example of volume redesign steps for an ablation operation is depicted in Fig. 8.

Fig. 8

In-process difference volume redesign for an ablation operation

4.3 Decision

In the decision step, the need for strategy regeneration is evaluated with respect to the type and sequence of operations as well as the process and tool path parameters defined for each one in the initial strategy. Furthermore, the custom algorithm developed for the redesign of the difference volumes also supports the mapping between the deviation and regeneration scenarios, eventually leading to the selection of the regeneration strategy best adapted to a specific situation (Fig. 9).

Fig. 9

Mapping between the deviation and regeneration scenarios

The analysis of the difference volumes issued in the previous step enables the identification of four deviation scenarios: (i) in-tolerance deposition (ITD), (ii) over-sized deposition (OSD), (iii) under-sized deposition (USD) and (iv) mixed-sized deposition (MSD).

The ITD scenario corresponds to the situation in which the outer surface of the build matches the nominal shape of the part within a pre-defined tolerance. This tolerance might take different values depending on the analysed surfaces of the part (i.e. top surface and lateral faces). The OSD scenario implies the presence of excess-material areas corresponding to a generally positive gap with respect to the nominal geometry, while the exclusive presence of deficient-material areas, corresponding to a generally negative gap, is categorized as the under-sized deposition scenario (USD). The MSD case describes a much more complex situation in which any mix of the former three scenarios can be present; for instance, over-, under- and in-tolerance deposition can be detected simultaneously on the current top surface of the build. The ratio between deviation types as well as the distribution level assigned to each deviation support the downstream selection of the most adapted strategy (Fig. 10).

Fig. 10

Alternative full recovery strategies (sample alternatives for OSD scenario)

The next decisional step is to map an adapted regeneration strategy to the identified deviation scenario. Three regeneration strategies were defined: (i) no recovery, (ii) partial recovery and (iii) full recovery, with a gradually increasing level of complexity from the first to the last. The “No Recovery” strategy is the most straightforward, triggered by an ITD scenario, with no further action required beyond initializing the deposition template with the previously employed parameters for the subsequent checkpoint of the part. The “Partial Recovery” strategy is triggered by the generation of HPDC-defined geometries, redesigned after every inspection operation as either a positive or a negative volume. This strategy is generally paired with either the OSD or the USD scenario. In addition to the alteration of the input geometries, the “Partial Recovery” strategy accommodates the change of the process parameters and tool path strategies to manufacture the specified volume by deploying either a deposition or an ablation operation. The co-existence of redesigned volumes for both deposition and ablation leads invariably to a “Full Recovery” strategy, which is mainly linked with MSD scenarios. In this case, an adapted hybrid process chain is triggered (i.e. ablation-inspection-deposition) to complete the following manufacturing step. Every single operation of the process chain requires adapted input data and in return provides the associated manufacturing instructions to be executed by the machine.
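The mapping from deviation scenario to regeneration strategy can be condensed into a small decision function. This is a sketch: reducing the scenarios to two scalar volume measures with a single tolerance is a simplification of the ratio- and distribution-based analysis described above:

```python
def regeneration_strategy(over_volume: float, under_volume: float,
                          tol: float = 0.0):
    """Map the extracted difference volumes to a deviation scenario
    (ITD/OSD/USD/MSD) and its associated regeneration strategy."""
    over, under = over_volume > tol, under_volume > tol
    if over and under:
        return "MSD", "Full Recovery"      # ablation-inspection-deposition
    if over:
        return "OSD", "Partial Recovery"   # ablation of excess material
    if under:
        return "USD", "Partial Recovery"   # deposition of missing material
    return "ITD", "No Recovery"            # build within tolerance
```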

Since deposition is the leading operation for part manufacturing, the deployment of a hybrid strategy is mainly justified by the need to additionally perform an ablation. The deposition laser spot, due to its larger diameter, cannot process thin areas or gaps without violating quality and functional requirements. If such areas are detected, a redesign of the initial volumes is mandatory. Ablation has a higher impact on the volume redesign decision for both types of manufacturing operations, owing to the higher spatial resolution afforded by the smaller ablation laser spot.

Currently, the assignment of process parameters for the manufacturing operations of the hybrid process chain is supported by a knowledge base linking a limited number of process recipe solutions to the deviation scenarios depicted in Fig. 9.

4.4 Strategy regeneration

The deployment of a specific recovery scenario suitable for the current manufacturing situation imposes distinct requirements on the data needed for the regeneration of each operation. Mainly the geometries, process recipes and tool path parameters need to be instantiated in the default template files available in each operation folder. Within this step, the HPDC has a two-fold role. On one hand, it makes available the pre-defined process recipes as well as the redesigned volumes and surfaces for all operations required by a regeneration scenario. On the other hand, it generates in full autonomy the tool paths and NC codes for the ablation operations. The regeneration of the other two operations of the hybrid workflow (i.e. deposition and inspection) is handled by the GPT.

Deposition is the default operation available across all regeneration scenarios. Generally, the LMD process exhibits technological limitations related to the manufacturing of sharp corners and thin geometries and to the accurate management of the build height. If out-of-tolerance deviations are detected with the support of the inspection operation, the HPDC script generates a redesigned volume which completely replaces the predefined checkpoint geometry used for the deposition of the next section of the part. This volume is instantiated as a new Containment Body in the Inputs folder of the deposition operation. The base surface (i.e. MachSurf.stl) and the build direction (i.e. DriveMesh.stl) are successively updated in the very same folder. Once all geometric data and process recipes are updated, the SDC framework is called by the GPT for tool path calculation. Simultaneously, the NC file generation API (i.e. PPFramework) can also be called by defining the PPF parameters and selecting the target post-processor. Upon completion of the layer strategies and the corresponding NC part program, the GPT awaits the instructions for the calculation of the subsequent operations. The PD transfers the part program to the machine’s controller for execution.

The ablation NC file is prepared by the same general HPDC script with the support of a custom function. Depending on the chosen strategy, the HPDC can output either the instructions to exclusively steer the laser beam of the scanner or an adapted sequence of commands able to synchronise the motion of the optical axes of the scanner with the mechanical axes of the machine. The script employs a lean procedure for the generation of both motion strategies, starting with the discretization of the redesigned volume for ablation. Contours are detected within all layers issued during the discretization step and converted into a sequence of polygons. Then, for each polygon, the associated tool path is generated based on the parallel-line polygon-filling methods available for the scanning system.
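The parallel-line filling of an extracted contour polygon can be illustrated with a minimal scanline hatch. This sketch assumes horizontal hatch lines and even-odd clipping; the filling method actually offered by the scanning system is not detailed here:

```python
def hatch_polygon(poly, spacing):
    """Parallel-line fill of a closed polygon (list of (x, y) vertices):
    horizontal hatch lines at `spacing`, clipped to the polygon interior
    by even-odd edge crossings. Returns a list of line segments."""
    ys = [p[1] for p in poly]
    y = min(ys) + spacing / 2          # offset to avoid vertex grazing
    segments = []
    n = len(poly)
    while y < max(ys):
        xs = []
        for i in range(n):
            (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
            if (y1 <= y < y2) or (y2 <= y < y1):   # edge crosses scanline
                xs.append(x1 + (y - y1) * (x2 - x1) / (y2 - y1))
        xs.sort()
        # pair up crossings: interior spans are between even/odd indices
        segments += [((xs[i], y), (xs[i + 1], y))
                     for i in range(0, len(xs) - 1, 2)]
        y += spacing
    return segments
```

Each returned segment would correspond to one scanner stroke; real tool-path generators add sorting, link moves and contour offsets on top of this.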

The redesign of the volumes for both deposition and ablation operations in response to a specific deviation scenario justifies the need to plan a hybrid manufacturing strategy. According to the concept of the regenerative part program, only one sequence of processes is available for the regeneration of the strategy between two consecutive checkpoints, namely ablation-inspection-deposition. This sequence, mostly associated with MSD scenarios, is followed for the initialisation of the templates, tool path calculation, and part program generation and execution for each constituent operation. That is, the operation subfolders included within an Operation_check folder (Fig. 5) are instantiated on demand, and the manufacturing files prepared for each one are transferred by the dispatcher to the machine’s controller and executed respecting this same sequence.

5 Case study

A fully automated CAx hybrid workflow introduces a high level of complexity in terms of the data flow and event management between the various software modules of the platform as well as with respect to the sequencing and execution of the manufacturing operations. In this respect, a simplified case will be discussed in this section.

The repair and addition of features is one of the key capabilities of an LMD process, making it possible to locally reinforce, repair or enhance the functionality of existing parts previously manufactured with other manufacturing processes. First, experiments were conducted with the goal of controlling the shape deviations generated by an LMD operation, relying upon the capabilities of the hybrid workflow to adapt the manufacturing strategy with the support of in-process inspection and laser ablation operations.

In order to capture the logic of the hybrid workflow, a sample part consisting of two distinct checkpoint geometries was considered. The first checkpoint geometry consists of a 4-mm-high disc-shaped part on top of a flange substrate (Fig. 11a). The CAD topology of the first checkpoint features areas of fully dense material as well as an array of holes with diameter d = 3 mm. The second checkpoint geometry considers the deposition of 4 bosses with a height of 2.5 mm on top of the disc (Fig. 11c). Each boss features an internal hole with a starting diameter of 3 mm and a draft angle of − 5°, whose axis is aligned with the axis of the corresponding hole defined for the checkpoint 1 geometry.

Fig. 11

(a) First checkpoint geometry; (b) Tool path strategies for the deposition of the first checkpoint geometry; (c) Second checkpoint geometry

The material selected for the experimental verification is Inconel 718, a nickel alloy well known for maintaining its mechanical properties at high temperature and for generating elevated cutting forces and strong abrasive tool wear during conventional machining [38]. In particular, its poor machinability makes it a good candidate for laser processing techniques. For the LMD tests, a spherical gas-atomized powder (MetcoAdd 718F, supplier Oerlikon AM®) with a grain size ranging between 45 and 106 μm was employed.

A distinct tool path pattern was selected for each half of the disc (i.e. contour offset for sector S1 and raster for sector S2 respectively) while considering identical process parameters. The tool path strategies (Fig. 11b) together with the design of the first checkpoint geometry specimen were defined with the intent of simulating the generation of various types of shape deviations, which will subsequently trigger in-process decisions related to the following manufacturing steps. The parameters considered for the laser metal deposition operations are depicted in Table 1.

Table 1 Laser metal deposition parameters

In agreement with the folder architecture depicted in Fig. 5, the PD dynamically defines for the sample part a structure consisting of the following operation folders: OP1.Deposition_check1, OP2.Inspection_check1, OP3.Operation_check2, OP4.Inspection_check2 and OP5.Operation_check_f. Apart from OP3 and OP5, all operations are fully defined from the start, considering the initialization of the templates, geometries and process recipes based on the initial user input. The objective of the OP1 deposition is to build the disc shape of checkpoint 1 (Fig. 12a), while the subsequent OP2 inspection aims to digitalize the current state of the AM build. The inspection operation is planned based on the user input and consists of 7 consecutive poses (considering the following positions of the rotary axes of the machine: A0 C0, A-35 C0, A-35 C60, A-35 C120, A-35 C150, A-35 C180, A-35 C270). The point clouds corresponding to the different viewpoints are merged to represent the entire part geometry (Fig. 12b). Thanks to an accurate calibration procedure, the coordinates of the obtained point cloud refer to the working coordinate system used for OP1, so any deviation correction through a recovery strategy can be applied at the precise location of the deviation.

Fig. 12

(a) First checkpoint geometry as-deposited; (b) Point cloud generated by the first inspection operation; (c) Over-sized deposition deviation; (d) Under-sized deposition deviation.

The identified deviations can be assigned to the OSD scenario (Fig. 12c) and are particularly noticeable around the contour of the holes (H1 to H6 in Fig. 11b), at the interface area between the two tool paths (W7 and W8 in Fig. 11b), as well as around the inner and outer contour of the disc. Furthermore, the registration between the first checkpoint CAD geometry and the point cloud also reveals a USD deviation case on the fully dense material areas of both the S1 and S2 disc sectors (Fig. 12d).

For the sake of conciseness, the next steps will be discussed exclusively considering the deviations related to the S1 sector. The simultaneous presence of over- and under-sized deposition is associated with the MSD scenario, which calls for a Full Recovery strategy. Since the top of the checkpoint 1 geometry represents the start surface for the laser metal deposition of the bosses (i.e. checkpoint 2), an adapted recovery strategy must be deployed for its preparation. Otherwise, it is very likely that the deviations will accumulate across the subsequent deposition, which would make it hardly feasible, if not impossible, to plan and execute a complete recovery after the final inspection to meet the nominal specifications.

The initial folder architecture generated by the PD also contains the OP3.Operation_check2 and OP5.Operation_check_f folders. Both contain 3 subfolders corresponding to the operations available on the machine. The Inputs and Outputs subfolders for each manufacturing operation considered under the OP3 and OP5 folders are initially empty, since the decision on how to regenerate the strategy cannot be taken before completing the OP2 and OP4 inspection data analytics steps. The HPDC prepares the redesigned volumes, the layer strategies and the process parameters needed for each operation of the regeneration scenario, and the PD updates the aforementioned subfolders accordingly. The volumes are constructed starting from the scanning data of the current build, considering the technological constraints of the additive and subtractive operations as well as the nominal checkpoint geometries. For instance, the hybrid process workflow to be deployed for OP3 as a result of the detected deviations requires the execution of the following sequence of operations: (i) OP3.1 — ablation of the over-deposited contours around the holes, (ii) OP3.2 — inspection of the ablated surfaces and (iii) OP3.3 — deposition of additional layers on both sectors of the disc to compensate for the missing material, followed by the deposition of the bosses.

The outcome of OP3.3 is characterized by the OP4 inspection, following the same approach as OP2 did for the OP1 deposition. OP5 is intended to launch a final regeneration of the strategy in case the OP4 inspection data analytics step notifies the presence of out-of-tolerance deviations.

6 Discussion

The registration of the checkpoint 1 CAD geometry with the point cloud and the extraction of the difference volumes set the premises for the planning of the regeneration scenario. The main objective of the OP3.1 ablation operation is to remove build errors in situ, before they can propagate into larger deviations, and to prepare the base surface for the subsequent deposition steps. To highlight the differences, OP3.1 deployed the ablation only around holes H2-H5, while no material was removed from holes H1 and H6. The parameters considered for the laser ablation are listed in Table 2.

Table 2 Laser ablation parameters

An example of the outcome of the ablation operation is shown in Fig. 14a with reference to the feature H4 while the most representative activities of the ablation workflow are listed in Fig. 13a and b.

Fig. 13

(a) Point cloud and associated mesh for the H4 deviation; (b) .stl file and volume slicing for the H4 deviation

The OP3.2 inspection operation is simply a re-iteration of the OP2 inspection and generates a new point cloud characterizing the status of the checkpoint 1 geometry after the ablation. A new registration with the nominal CAD is performed to quantify the outcome of the ablation operation. The HPDC generated the ablation strategy for the H2-H5 features and the associated part program considering the individual topology of each deviation. Due to the different shape and height of each deviation, the ablation strategy consisted of 89, 82, 123 and 124 layers with variable thickness for the H2, H3, H4 and H5 features respectively. As shown in Fig. 14a for the H4 hole, the employed strategy significantly reduced the initial deviation and improved the surface quality of the resulting bottom surface. The difference between the maximum peaks remaining after the ablation of both wall-shaped deviations with respect to the bottom reference was estimated at 0.017 mm, which was within the pre-defined tolerance. Hence, the result of the ablation was considered valid and the PD proceeded further by initializing the OP3.3 deposition operation and triggering the GPT to generate its associated deposition strategy and part program. Inevitably, these are influenced by the outcomes of OP1. The manufacturing strategy was established in this case based on preliminary tests with the selected material, following a systematic approach for the deposition of single tracks, single layers and 3D bulk geometries. However, any alteration of the part geometry and tool path strategy can lead to significant differences between the nominal and the real layer thickness values when the same process recipe is applied. The under-sized deposition quantified with the support of the in-process inspection requires a redesign of the checkpoint 2 geometry to compensate for the under-sized deposition regions.
The volume corresponding to the missing material, which displayed an average height difference of 0.83 mm for the S1 sector, was added to the initial checkpoint 2 geometry (Fig. 14b).

Fig. 14

(a) Top surface of H4 hole as-deposited and after ablation; (b) Second checkpoint redesigned with H5 and H6 bosses; (c) Second checkpoint with H5 and H6 bosses as-deposited and tolerance evaluations after registration with the redesigned checkpoint.

The lack of material can be explained by an unadapted layer height value, which controls the nozzle standoff distance and influences the powder catchment efficiency. Due to a diverging stream effect, the catchment efficiency drops above and below the focus. In this respect, before proceeding further with the deposition of the bosses, 5 additional layers were deposited to reach the nominal height of the disc, using the same tool path and process parameters as for OP1 but with a reduced layer height value of 0.16 mm. Figure 14c shows the bosses deposited on top of H5 and H6 as well as the outcome of the registration process, which displays an in-tolerance base surface for the bosses. The registration also shows the presence of geometric inaccuracies for both the H5 and H6 bosses, associated with the multiple laser start/stop phases, tool path discontinuities, sharp corners and overlaps, as well as with the fixed set of process parameters. For this type of feature, a composite planning and control method with an adapted correction strategy during the deposition of the sharp corners and the circular contour of the hole would be in a better position to improve the shape accuracy. To the same end, the prior ablation of the over-sized deposition deviation around hole H5 also proves beneficial, reducing the top surface deviation around the hole contour by approximately 20% when compared with the H6 boss feature.
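The compensation arithmetic is consistent with the reported values: distributing the measured 0.83 mm height deficit over an integer number of layers gives 5 layers of about 0.166 mm each, in line with the reduced 0.16 mm layer height used. The 0.2 mm nominal layer height assumed below is purely illustrative, as the OP1 value is not stated here:

```python
import math

def correction_layers(height_deficit_mm: float, nominal_layer_height_mm: float):
    """Number of compensation layers and the adjusted layer height needed
    to recover a measured height deficit. The nominal layer height passed
    in is an assumed, illustrative value, not taken from the case study."""
    n = math.ceil(height_deficit_mm / nominal_layer_height_mm)
    return n, height_deficit_mm / n
```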

Considering the rather low removal rate of the ablation process (i.e. 0.76 mm3/min), any additional shape deviation resulting from the deposition operation will invariably extend the duration of the subsequent ablation, which consequently might have a non-negligible impact on the productivity of the entire workflow.

The example discussed shows one of the many possible deviation scenarios requiring a dynamic change of the process planning strategy in order to avoid the accumulation of errors and to preserve the build shape accuracy. It demonstrates the capability of the proposed approach to infer dynamic decisions with respect to unforeseen events which can hardly be predicted in advance.

Despite all recent and ongoing technological advancements in the LMD process, trial-and-error still plays an oversized role in achieving the desired characteristics and tolerances of a part, especially when a part is manufactured for the very first time. This is mainly due to the high number of variables to be accounted for during the part deposition process. Although the developed software platform goes beyond the automation of repetitive tasks and makes a significant contribution to the digital transformation of hybrid process chains, additional tools with enhanced intelligence are required to support the decisional process. Amassing data related to previous experience and implementing the right tools to analyse, interpret and derive useful information is key to setting up knowledge-based manufacturing enabling an automated parametrisation of the hybrid process chains. Process simulation engines coupled with AI learning techniques, capable of learning complex correlations between process inputs and outputs and inferring new knowledge from previous experience, are still needed. They would promote a faster identification of the major sources of variability in the process and the automated selection of optimal process parameters for particular out-of-tolerance scenarios. Moreover, they would perfectly complement the already implemented in-process capabilities of self-redesign of geometries, self-generation of adapted process chains, automatic elaboration of tool path strategies, and output and deployment of associated NC codes.

7 Conclusions

This paper introduced a novel software infrastructure and the underlying methodology for the automated planning of a hybrid AM workflow, capable of in-process strategy and part program regeneration as a function of detected AM build deviations. The quantification of the deviations is enabled by a prior assessment of the part’s geometric complexity, providing a number of checkpoints, and by a recurrent comparison between the nominal geometry and the current state of the build. The workflow is demonstrated on a hybrid machine integrating laser-based additive and subtractive manufacturing technologies as well as inspection. First tests proved the capability of the platform to create, secure and operate adaptive workflows while inferring dynamic decisions and scaling up hybrid process automation with respect to unforeseen events which can hardly be predicted in advance. The major contributions of this work are as follows:

  • Implementation of a decisional process underlying the regeneration mechanism to manage repetitive patterns, to compensate for deviations and to take actions related to the subsequent manufacturing steps to complete a part.

  • Streamlining of a dynamic data exchange between the constituent CAx entities of the proposed software platform to support the automated generation of adapted hybrid process chains

  • Step-by-step update of the subsequent part section geometries considering the nominal CAD, the current status of the AM build and the technological constraints of both additive and subtractive processes of the hybrid chain

  • In-envelope evaluation of the outcomes of intermediate manufacturing steps, mapping of the identified geometry deviation scenario to possible alternative strategies, automated tool path generation, output and deployment of associated NC codes to support in-process part program regeneration

  • Efficient use of laser ablation for the removal of limited irregular shape deviations of AM parts

Despite the very promising results displayed during the selective removal of scattered deviation volumes, ablation can be quite impractical when a large quantity of material must be removed. Conventional machining processes such as milling can offer better productivity and, in this respect, the proposed software platform can accommodate the requirements of process chains such as deposition-inspection-milling, specific to other hybrid systems, with minimal development and implementation effort. In such a case, modifications should mainly be performed on the operation templates, initialization and parametrization files contained in the static folder. In addition, the specifications and constraints of the additive and subtractive tools and processes as well as the feasible ranges of process parameters need to be accounted for during the redesign of the volumes and the strategy regeneration respectively.

The methodology and implemented software infrastructure have practical potential which can benefit various applications aiming to improve part quality and performance across a broad range of industries such as subtractive and additive manufacturing, part inspection and measurement automation. Future research will concentrate on the elaboration of a comprehensive knowledge base with case-specific (e.g. green field parts, repairing applications) action rules, operation-specific guidance and predictive models to provide extended decisional support for the automated parametrisation of the hybrid process chain.