Data Processing in Industrie 4.0
The pressure on companies to increase flexibility and efficiency in manufacturing is constantly growing. Factory managers therefore need to obtain information in real time across physical production systems for better decision making. Transparency at the production and strategic levels, for example, offers the advantage of responding more quickly to volatile demand (time-to-market) and helps reduce lead and down times. This can lead to significant production gains and competitive advantage. Current approaches struggle to bring results from the IoT world to decision makers in an appropriate manner. We introduce data models that serve as mediators to create a better understanding between factory owners and data analysts. Particular challenges lie in the orchestration of complex process steps, the vertical transparency of information, and mutually contradictory optimization goals (e.g., cost, speed, quality, sustainability). Better communication between factory managers, data analysts and line-side workers allows the aforementioned configurations to be implemented more transparently and consequently more efficiently.
Keywords: Smart data management · Complex event processing · Data analytics · Industrial Internet of Things (IIoT)
Decision makers in industry are nowadays confronted with a flood of information that needs to be collected, monitored and correlated before it becomes useful. Production processes are often complex, involving thousands of components from a plethora of suppliers. Interruptions related to unforeseen events in manufacturing (e.g., machine breakdown, unscheduled maintenance or software problems), transportation (e.g., vehicle breakdown, traffic delay, wrong delivery), or even at higher tiers in the supply chain (e.g., component quality, availability of materials) may lead to losses of hundreds of thousands of euros per hour. In addition, there may be high variation in received customer orders: orders may change just a few days before the delivery date, or quantities may deviate from the originally requested numbers. All of these circumstances require a flexible production environment. Existing work on agility and flexibility in production and manufacturing can be found in [1, 2].
Following the Industrie 4.0 movement, such situations can be mastered and handled efficiently if up-to-date information is collected, brought into the right business context and presented to decision makers in an appropriate way. Otherwise, the sheer volume of useful information remains in data silos, and tangible business value stays hidden and cannot be leveraged. By combining information sources from production and bringing them into an appropriate context with respect to end users and business aspects, this information may reveal potential production issues, indicate critical situations and ultimately be exploited to increase business value. In this work, we sketch a solution that will be developed and further investigated within a joint research project called DISRUPT. Information is collected from sensor systems at the production level and subsequently aggregated and correlated at the data level using complex event processing (CEP). The ability to give predictions, based on models previously trained through machine learning, enhances the deterministic CEP capabilities with actual forecasts. The resulting data-driven Key Performance Indicators (KPIs) can be enriched with available production knowledge (so-called semantic lifting) and presented to the responsible decision makers in a suitable manner by means of management dashboards. Deviations from reference values can therefore not only be detected close to real time and combined with predictions; the proposed approach also allows production to be influenced effectively via the modelling perspective, i.e., the computation of aggregates and compound figures. Hence, management dashboards can be personalized more easily for decision makers, in collaboration with data analysts.
Outputs from the sensor management platform serve as input for real-time streaming analytics (i.e., CEP) that computes data-driven KPIs. These are then interlinked (so-called semantic lifting) via applied knowledge management to enable dashboarding that reveals the business context and allows strategic production monitoring (cf. Fig. 1).
Source events (e.g., rejection ratio, downtime) are collected from the sensors within the IoT and Sensor Management layer. On the layer above (Streaming Analytics & Prediction layer), base events are aggregated into compound figures using streaming analysis based on EPL (Event Processing Language) rules executed on a CEP engine. Certain events, e.g., the downtime of machinery, may concurrently trigger the PMML (Predictive Model Markup Language) Execution Engine to predict possible outcomes based on historic behaviour. All of the aforementioned results form compound figures that are passed to the upper level, again as compounds of aggregated events. The Knowledge Management layer aligns compound events with the business layer using a semantic database and semantic lifting. Eventually, the enriched figures are displayed at the Business Context and Strategic Production Monitoring level using dashboards and monitoring applications.
2 Data Collection and Event Aggregation Framework
2.1 Real-time Streaming Analytics
Streaming analytics is based on a commercial solution using an in-memory architecture that enables real-time processing of extremely fast, large data volumes, orders of magnitude larger than traditional database-based IT solutions can handle. The CEP rules are represented in EPL and need not be specified and defined at design time. Such rules could, e.g., calculate spikes to identify whether individual sensor values are within normal operating bounds, or the gradient of recent values to tell how much a sensor's value is changing. Figure 2 illustrates an example of a CEP rule written in EPL.
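The spike and gradient checks described above can be sketched as a sliding window over recent sensor readings. The following is a minimal Python illustration of the idea, not the commercial EPL engine; class and parameter names are ours:

```python
from collections import deque

class SensorWindow:
    """Sliding window over recent sensor readings, mimicking a CEP rule
    that flags spikes and reports the gradient of recent values."""
    def __init__(self, size, lower, upper):
        self.values = deque(maxlen=size)
        self.lower, self.upper = lower, upper

    def push(self, value):
        self.values.append(value)

    def spike(self):
        # True if the newest reading left the normal operating bounds
        return bool(self.values) and not (self.lower <= self.values[-1] <= self.upper)

    def gradient(self):
        # Average change per reading across the window
        if len(self.values) < 2:
            return 0.0
        return (self.values[-1] - self.values[0]) / (len(self.values) - 1)

w = SensorWindow(size=5, lower=20.0, upper=80.0)
for v in [50, 55, 61, 70, 95]:
    w.push(v)
print(w.spike())     # the last reading, 95, exceeds the upper bound
print(w.gradient())
```

A real CEP engine would evaluate such conditions continuously over event streams rather than on explicit method calls.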
Following prior work, we allow dynamic adaptation of event patterns within the CEP using a model-based approach: CEP rules are modelled at the service layer (e.g., within the dashboard displaying business context and process overviews) and transformed into executable EPL rules that can be injected into the CEP engine at run-time. This approach is feasible because all meta-model information from the sensor devices is available through the attached sensor management system.
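The model-to-rule transformation can be thought of as template expansion: a pattern modelled at the service layer is rendered into an EPL statement string for injection. A hedged Python sketch, where the pattern fields and the EPL fragment are illustrative rather than the actual DISRUPT meta-model or rule syntax:

```python
# Hypothetical model-level event pattern, expressed as a plain dict.
pattern = {
    "event_type": "TemperatureReading",
    "field": "value",
    "operator": ">",
    "threshold": 90,
    "window_sec": 60,
}

def to_epl(p):
    """Render the modelled pattern as an EPL-style statement string
    that could be injected into a running CEP engine."""
    return (
        f"select avg({p['field']}) from {p['event_type']}"
        f".win:time({p['window_sec']} sec) "
        f"having avg({p['field']}) {p['operator']} {p['threshold']}"
    )

print(to_epl(pattern))
```

Because the sensor meta-model supplies the event types and field names, such a generator can validate the modelled pattern before deployment.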
2.2 Predictive Analytics
In parallel to the streaming analytics described before, events are distributed via the Message Bus and stored in an Event Store within the batch layer (cf. Fig. 3). The Event Store can be an SQL database (e.g., PostgreSQL), a non-relational database (e.g., Cassandra, Apache CouchDB) or can be implemented using Big Data technology (e.g., Apache Hadoop). The sole purpose of the Event Store is to persist data from the IoT event sources and make it available for further processing in the data analytics phase. Here, the events are used for predictive modelling via machine learning. Applications such as KNIME, WEKA, R and other statistical analysis tools are used. With the help of these programs and procedures (e.g., Bayesian networks, clustering, Gaussian processes, neural networks, support vector machines), models are trained on the collected events to obtain a PMML model. This model can subsequently be executed on the predictive execution engine to hypothesize outcomes (based on the trained models) for triggering events.
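The essential point is the split between offline training on persisted events and online scoring on the execution engine. The toy Python sketch below illustrates that split with a single learned threshold serialized as JSON, standing in for a PMML document exported from KNIME/WEKA/R; the data and feature name are invented:

```python
import json

# Historic events from the Event Store: (vibration_level, failed_within_week)
history = [(0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1), (0.4, 0), (0.7, 1)]

def train(events):
    """Learn a single decision threshold -- a toy stand-in for the
    machine-learning step that produces a PMML model."""
    failures = [x for x, y in events if y == 1]
    normals = [x for x, y in events if y == 0]
    threshold = (min(failures) + max(normals)) / 2
    return {"feature": "vibration_level", "threshold": threshold}

model_doc = json.dumps(train(history))  # persisted, like a PMML document

def execute(model_json, event_value):
    """The 'execution engine': score an incoming event against the stored model."""
    model = json.loads(model_json)
    return event_value >= model["threshold"]

print(execute(model_doc, 0.85))  # high vibration: predicted failure
```

Real PMML models encode full pipelines (transformations plus model parameters), but the train-once, score-many pattern is the same.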
2.3 Event Message Bus
Reliable transport is enabled via a message bus which offers a publish-subscribe mechanism for fast and efficient distribution of relevant information within the platform. The message bus not only handles events coming from the IoT devices and raw sources; it also reliably transports the outcomes of the streaming-analytics and predictive-analytics components.
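The publish-subscribe mechanism means that the streaming-analytics component and the batch-layer Event Store can both receive the same event without the publisher knowing about either. A minimal in-process Python sketch of the mechanism (the production system uses a commercial messaging product; topic names here are invented):

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-process publish-subscribe bus, illustrating only the
    decoupling mechanism, not reliability or network transport."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Every subscriber of the topic receives the same event
        for handler in self.subscribers[topic]:
            handler(event)

bus = MessageBus()
received = []
bus.subscribe("machine/downtime", received.append)   # streaming analytics
bus.subscribe("machine/downtime", received.append)   # event store (batch layer)
bus.publish("machine/downtime", {"machine": "M1", "seconds": 42})
print(len(received))
```

Swapping the bus for another product only requires the components to keep the same topics and message formats, which is exactly the modularity argument made in Sect. 4.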
2.4 Effective Sensor and IoT Device Management
Effective sensor and IoT device management, the event message bus, real-time streaming analytics and predictive analytics are the proposed instruments to collect, process and partly manipulate the data. An important add-on is the semantic enrichment of this data. Semantics are either given by the sensors (via the sensor management layer), by the data schema or by the way the data have been collected (e.g., operational semantics); an explicit semantic enrichment, however, would enable not only machine-based interpretation of the data but also cross-domain and cross-sensor processing. In the following, we propose a model-driven approach to add these semantics to the data in order to achieve smart data management.
3 Smart Data Management
Data, information, knowledge and their interpretation are a key challenge in information technology in general and in data management in particular. The aim is not only to collect data, but also to provide the corresponding context to support human and/or machine interpretation of that data. We therefore see the dashboard not merely as a collection, abstraction and visualisation of data, but as a decision support tool that behaves according to the knowledge layer; this layer either supports decision makers or, in well-defined cases, also enables automated machine-based interventions.
The logical architecture in Fig. 1 introduces the so-called knowledge management layer as an additional layer on top of traditional data management dashboards. We propose to use conceptual models for the realisation of such a knowledge-based layer, as conceptual models can support the full range of knowledge representation, from semi-formal models that support human interpretation up to strictly formal models that enable machine interpretation. Semi-formal models represent knowledge using graphical models in combination with textual descriptions, whereas strictly formal models partly use the graphical representation but focus more on formal semantics via ontologies and corresponding inference-rule definitions. The model-driven, knowledge-based approach that we propose in this text acts as a mediator between domain-specific goals and the monitored data. The mapping can run either from top to bottom or from bottom to top.
Top-to-bottom mediation applies when human decision makers design their intentions and then configure the corresponding sensors and data formats that fit the purpose of those intentions. A typical top-down mapping would be to define parts produced per hour and then configure sensors, e.g., a scanner that counts the produced parts at the end of the line, providing the appropriate figure for the intended purpose. In the architecture depicted in Fig. 3, this is realised, for example, through the model-based definition of event patterns and their translation into executable EPL-based CEP rules (cf. Sect. 2.1). A bottom-to-top mapping, on the other hand, starts from findings observed in the data and then identifies the corresponding business impact to which the data can make a meaningful contribution. An example is anomaly detection in the pattern of machine condition monitoring: after analysing the potential impact of the anomaly, the findings may be allocated to a domain-specific intention such as predictive maintenance.
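The parts-per-hour example can be sketched directly: the decision maker's intention fixes the KPI formula, and the configured scanner supplies the raw counting events. A small Python illustration with invented event data:

```python
# Hypothetical end-of-line scanner events: (timestamp_seconds, part_id)
events = [(10, "p1"), (700, "p2"), (1500, "p3"), (2900, "p4"), (3500, "p5")]

def parts_per_hour(scans, window_start, window_end):
    """Top-down KPI: count scanned parts within a time window
    and scale the count to parts per hour."""
    count = sum(1 for t, _ in scans if window_start <= t < window_end)
    hours = (window_end - window_start) / 3600.0
    return count / hours

print(parts_per_hour(events, 0, 3600))  # all five scans fall in the first hour
```

The intention ("efficient production, measured in parts per hour") exists before any sensor is configured; the sensor is then chosen so that exactly this figure can be computed.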
3.1 Model-Driven Data Management
In this sub-section, we illustrate how intentions can be mapped to sensor data and vice versa. Model-driven support for “smart data management” consists of (a) model-based approaches to support data management and (b) model-based approaches to introduce smartness. Introducing smart monitoring in production, applying business intelligence and applying data analytics to predict future behaviour are well-researched topics, including, with respect to (a) smart monitoring, multi-agent-based systems [23, 24], smart sensors [25, 26], edge computing [27, 28, 29] and large-scale sensor architectures.
Model-driven management has been realised with ADOSCORE, a realisation of a scorecard that has been applied not only to financial strategy management, as originally intended, but also to knowledge and performance monitoring. This knowledge and performance monitoring has been iteratively improved to fit the purpose of real-time and complex data analysis.
Smart data management has been realised with PROMOTE [16, 17], a realisation of process-oriented knowledge management that can be extended with machine-interpretable formalisms such as workflows, semantics, rules, cases or agent configurations.
3.2 Model-Based Data Management
– Business Intention with Critical Success Factor –
The intention of the dashboard is captured by collecting critical success factors and grouping them into goals that need to be achieved. This is described in Fig. 4, where the goal “efficient production” is described with corresponding success factors dealing with non-production time. This step is often performed in workshops in order to collect the available knowledge of decision makers and establish the critical success factor model.
– Cause and Effect Model –
The cause and effect model transforms the original business intention into a measurable dashboard structure, where each critical success factor becomes a so-called KPI and each group of critical success factors becomes a goal or sub-goal. Such a model is depicted in Fig. 5, where the goal is specified and the success factors are measured with corresponding KPIs. We propose domain-specific KPIs, which differ from the commonly used data KPIs: each KPI is specified with domain knowledge about the specific part of the goal, the type of measure, the definition of ambitious and realistic target values, as well as the time interval. Our proposal is thus to define KPIs as domain-specific rather than data-specific.
– Data Access with “\(\alpha\)”-Indicator Model –
Each domain-specific KPI is mapped to a so-called data indicator with a corresponding algorithm. In order to simplify the model, this indicator combines (a) the data collection using sensors, (b) the data storage using snapshot databases and (c) the data access algorithm. The sample provided in Fig. 6 shows the list of all data sensors, where each data sensor carries the technical access information needed to send a query or invoke an API.
It is thus a unique combination for accessing relevant data, hence the name \(\alpha\)-indicator: just as the alphabet provides the atomic characters from which words are constructed, the \(\alpha\)-indicators are the atomic data providers from which data management dashboards are constructed. Traditionally, a straightforward top-down modelling is applied by analysing the business intentions and critical success factors in collaborative workshops. The realisation of an appropriate data sensor is then a stepwise transformation into a strict data access format.
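The three facets bundled in an \(\alpha\)-indicator (sensor access, snapshot storage, access algorithm) can be sketched as a small data structure. Field names, the endpoint URL and the KPI name below are illustrative, not the actual DISRUPT schema:

```python
from dataclasses import dataclass

@dataclass
class AlphaIndicator:
    """Atomic data provider bundling (a) the sensor source,
    (b) the snapshot store and (c) the data access algorithm."""
    sensor_endpoint: str   # technical access information (query or API)
    snapshot_table: str    # where collected values are persisted
    aggregation: str       # access algorithm, e.g. "avg" or "sum"

# Domain-specific KPI mapped to its alpha-indicator
kpi_to_indicator = {
    "non_production_time_per_shift": AlphaIndicator(
        sensor_endpoint="http://line1/api/downtime",  # hypothetical URL
        snapshot_table="downtime_snapshots",
        aggregation="sum",
    )
}
print(kpi_to_indicator["non_production_time_per_shift"].aggregation)
```

A dashboard then composes such atomic providers into the compound figures behind each KPI, without embedding technical access details in the cause and effect model itself.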
The benefit of this approach is to quickly obtain a dashboard that is streamlined to the well-specified business intentions through an extensive analysis of the critical success factors, where each success factor is monitored with a corresponding KPI.
3.3 Smart Data Management
Qualitative assessments of KPIs are currently often harvested in the form of questionnaires or ratings from working groups or responsible persons. Although this is an excellent way to harvest the opinions, heuristics and experience of knowledge workers within the organisation, current findings in data analysis, anomaly detection and visual analytics have proven to at least massively support such assessments, and show the opportunity to even replace them in particular cases.
Current digitization trends, through new technologies such as edge computing, IIoT, smart sensing and cloud computing, enable a massive increase in sensor data and hence more complete status reports. The management of the different sensor data thereby becomes a management challenge in its own right, which can be supported by knowledge-based techniques.
ADONIS® is a business process management tool based on (a) the so-called meta-model approach, which allows quick individualisation by configuring the meta-model, and (b) a data repository that stores not only model information but also semantics and technical information. In this context, ADONIS® is proposed because it provides powerful business process management features to describe, analyse, simulate and document the production process, and because it can integrate the semantic information necessary for describing the data. ADOSCORE was originally developed as a Balanced Scorecard tool but has evolved into a more generic dashboarding tool. The strength of ADOSCORE is that it supports, as one of very few tools, not only the data visualisation part but also the design of knowledge, and hence the knowledge abstraction, when creating the cause and effect models.
4 Integration and Realization of Data Collection and Knowledge Management
We proposed a decision support system involving continuous automated data analysis and prediction. The message bus provides standardized, reliable and efficient communication between all components. It acts as a central component that receives information from the sensors and IoT devices on the shop-floor and also ensures communication between all other components of the DISRUPT framework. The information exchange between the dashboard and the data providers is supported through this bus following a publish-subscribe paradigm, which enables real-time streaming analytics, event collection and the sourcing of the predictive execution engine while ensuring the delivery of data in real time.
This approach particularly fits the modularity requirement of the whole system: when required, every component can be unplugged from the bus and substituted with a technologically equivalent solution offering the same features. Users are thus no longer constrained to a specific product or company but can choose their preferred one and revise that decision over time, thanks to the absence of vendor lock-in.
The whole environment of the smart data management dashboard, consisting of the data management design environment, the data sensors and the data dashboard, is commercially available as a tool set consisting of ADONIS and ADOSCORE, and is also available as research prototypes resulting from European projects in the world-wide open innovation community ADOxx.org.
The realisation of this environment uses and improves the open-source micro-service framework OLIVE provided by OMiLAB, a world-wide community on model-based approaches that provides support for model-based realisations. OLIVE serves as the main container for the dashboard, which guarantees the modularity of the environment needed for its continuation over time and, to enrich its features, the integration with other micro-service components resulting from other projects.
Data analytics, data processing and data anomaly detection are introduced as part of the IoT and Sensor Management and the Streaming Analytics & Prediction layers in Fig. 1. The current data harvesting interface of ADOSCORE, where a person has to manually enter qualitative data, e.g., a rating between 1 (excellent) and 5 (insufficient), is now supported, as this person receives the results of a data analysis configured for that particular purpose. The remaining manual data harvesting is addressed, first, by minimising the need for manually entered qualitative data in favour of quantitative data that can be imported automatically and, second, by adapting the data analytics results to ease the transfer, in the ideal case an automatic import, from the data analytics systems to ADOSCORE.
The intention is to demonstrate that a technical integration of all data sensor devices is not necessary as long as a semantic integration can be achieved. This semantic integration can be achieved in several ways, from simple tagging to more advanced techniques. The tagging or, in more advanced cases, the semantic lifting is performed by a designer, in our case called the Data Service Designer. The data needs to be stored in an accessible way, often in the form of a time-series snapshot database. The component that provides data in the form of such a snapshot database is called the Data Service Engine, indicating the need to store the data streams, enable access to them and perform calculations on them.
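Semantic integration by tagging can be illustrated with a few lines of Python: data services carry meta-data tags assigned by the Data Service Designer, and cross-sensor queries run against the tags instead of against device-specific interfaces. Service and tag names below are invented for illustration:

```python
# Each data service carries semantic tags instead of requiring
# technical integration of the underlying sensor device.
data_services = {
    "line1_temp_stream":  {"tags": {"domain": "painting", "unit": "celsius"}},
    "line2_count_stream": {"tags": {"domain": "assembly", "unit": "parts"}},
    "line1_vibration":    {"tags": {"domain": "painting", "unit": "mm_s"}},
}

def find_services(tag_key, tag_value):
    """Cross-domain, cross-sensor lookup via shared semantics."""
    return sorted(
        name for name, svc in data_services.items()
        if svc["tags"].get(tag_key) == tag_value
    )

print(find_services("domain", "painting"))
```

Replacing the flat tag dictionary with a taxonomy or ontology would additionally allow queries over tag hierarchies (e.g., all services of a super-domain), which is the "more advanced" semantic lifting mentioned above.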
Figure 8 introduces the SEOR (Systematic Energy Operational Rating), a result of the European research project ORBEET, where data services are used for access to energy monitoring data. We take this environment and transfer it to the field of Industrie 4.0 for continuous monitoring of the data streams produced by the data management environment of Software AG. Second, the meaning of the data is described using a semantic description that is added to the data services. A terminology, taxonomy or ontology can be used to semantically enrich the data services, and hence the meta-data of the data streams. In Fig. 8, this semantic enrichment is provided by the business process management tool ADONIS: the data streams are collected by the data services and semantically enriched with the business process context in which they are relevant. This semantic description can be further enriched to better manage the mass of available sensor data and hence tackle the second aforementioned challenge of smart data management.
5 Application Scenarios in Industrie 4.0
There is a vast number of application scenarios that call for the proposed framework. However, when looking into different industry domains in more detail, different industry requirements emerge.
5.1 Automobile Industry
The automotive industry is facing a saturated, component-driven market. Digitization, however, is a key factor for further growth, allowing car manufacturers to transform into a software- and solution-focused industry. In this transition, digitization enables key factors such as connected supply chains and Industrie 4.0. Connected supply chains allow traceability of components and products to support inventory management. Following the proposed approach, the acquisition of real-time information from IoT devices and sensors delivers the required data, on the basis of which real-time analytics and knowledge management achieve real-time visibility at the supply chain level and sustain collaboration, business continuity and dynamic responsiveness to unforeseen situations. Investments in this area are reported to yield as much as $1bn per OEM by improving efficiency, reducing costs and increasing overall collaboration and innovation speed.
In addition, the automotive industry faces increasing complexity of connected products, a growing number of models and variants, mass customisation and reduced lead times, while business-critical KPIs must still be met (e.g., maximising the use of resources, minimising environmental impact). The proposed framework provides an Industrie 4.0-compliant solution to compete with companies in emerging economies that hold a competitive advantage in terms of production costs, labour rates, tax regulations, etc. IIoT technologies in particular will support automotive OEMs in tackling the challenges of increasing the competitiveness of interconnected supply chains and existing production.
5.2 Home Appliance Industry (“White-Goods”)
The rise of transport costs, the need for higher efficiency and productivity, the customer demand for greener products, the higher instability of raw material and energy prices and the shortening of production lead times will push for a more critical assessment of delocalisation strategies towards low-cost countries. In addition, megatrends are shifting European manufacturing towards practices that increase responsiveness to massively customised demand without abandoning cost- and resource-efficient production. Key drivers include:
- the need to optimize the consumption of resources through energy- and material-efficient processes and machinery;
- the fact that the increasing processing power of ICT and more sophisticated analytical software enable real-time performance analysis;
- the need for a significant boost in efficiency, safety and resource sustainability in production and logistics;
- the need to reduce design errors and time to market; and
- the optimisation of production processes through digital factory modelling.
Following the proposed DISRUPT solution, enterprises are enabled to constantly monitor continuous changes in manufacturing processes, supplier networks and other challenges across the value chain, as new opportunities and threats appear constantly in an increasingly interconnected world.
5.3 A Generic Solution to Industrial Challenges
Taking the above-mentioned problems into consideration, we conclude that DISRUPT’s vision addresses these challenges and thus offers strong research and innovation potential for manufacturers in this sector. In both application scenarios above (cf. Sects. 5.1 and 5.2), production faces the challenge of optimizing the production of different products in an inter-factory supply chain. Our observation of a production factory that produces several products for subsequent production and assembly is that the resilience against in-house disruptions and disruptions from outside needs to be improved.
In-house disruptions are mainly caused by machine breakdowns and require re-scheduling that considers the overall production plan and the production lanes that are still available. The skills and experience of the currently available operators need to be considered in the re-scheduling. Functional tests are the bottleneck; hence, production needs to be adapted flexibly, taking the test line and the operators with their experience as input, and the optimised plan for the overall production as output. Smart data processing is thus needed not only to monitor the current status of the production line, but also to identify anomalies, e.g., in the condition monitoring of the machines, that predictively indicate a potential machine breakdown. Depending on the expected impact of that breakdown, the production manager can flexibly adjust the plan to either re-schedule or wait until the machine is fixed.
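The condition-monitoring anomaly check described above can be sketched with a simple z-score against recent history; this is only a stand-in for the anomaly detection feeding predictive re-scheduling, and the vibration values are invented:

```python
import statistics

def anomalous(readings, new_value, z_limit=3.0):
    """Flag a condition-monitoring reading whose z-score against
    recent history exceeds the limit."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return new_value != mean
    return abs(new_value - mean) / stdev > z_limit

# Recent vibration readings for one machine (arbitrary units)
history = [0.50, 0.52, 0.49, 0.51, 0.50, 0.48, 0.51]
print(anomalous(history, 0.95))  # far outside the normal vibration band
print(anomalous(history, 0.50))  # within the normal band
```

In the proposed framework, a flagged anomaly would be published on the message bus and, after impact analysis, trigger the re-scheduling decision described above.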
The data are typically provided from different sources, i.e., not only from sensors added to the line but also from operators who raise an urgent issue or a potential idea for improvement. Data processing is therefore concerned not only with reacting to indicators that have passed thresholds but also with identifying potential improvements suggested by operators. This is treated as knowledge, which is gathered and processed by smart data processing to enable further processing in knowledge management systems.
6 Conclusion and Outlook
The proposed IoT framework is currently being realised in the EU project DISRUPT by BOC and Software AG to support two distinct Industrie 4.0 application scenarios. Smart data management is applied to improve operations research in logistics and production management by visualizing information and events coming from the shop-floor to decision makers in a transparent and suitable way. In particular, the traceability of production and process KPIs from the strategic level down to the individual sensor reading offers the advantage of responding more quickly to unforeseen and swiftly changing situations.
An iterative approach is currently being followed: first, the available tools are integrated; second, model-driven semantics are introduced for model-based support; finally, semantic algorithms are introduced to realise smart mechanisms for model-based support. Following the presented approach will enable companies to become more competitive. However, the integration between companies and partners’ processes, and the collaboration of all stakeholders, is key to ensuring the effectiveness of the proposed approach.
References

1. Löffler C, Westkämper E, Unger K (2011) Change drivers and adaptation of automotive manufacturing. International Conference on Manufacturing Systems (ICMS), p 6
2. Westkämper E, Zahn E, Balve P, Tilebein M (2000) Ansätze zur Wandlungsfähigkeit von Produktionsunternehmen. WT Werkstattstechnik 90:22–26
3. Eirinakis P, Buenabad-Chavez J, Fornasiero R, Gokmen H, Mascolo J, Mourtos I, Spieckermann S, Tountopoulos V, Werner F, Woitsch R (2017) A proposal of decentralised architecture for optimised operations in manufacturing ecosystem collaboration. Working Conference on Virtual Enterprises PRO-VE
4. Marz N, Warren J (2015) Big data: principles and best practices of scalable real-time data systems. Manning
5. Software AG (2016) Company white paper: the APAMA platform, under-the-covers: an in-depth view of Apama
6. Software AG (2016) Product fact sheet: universal messaging. https://www.softwareag.com/corporate/images/SAG_UniversalMessaging_FS_Sept13_v3.5_WEB_tcm16-111010.pdf. Accessed 13 Feb 2018
7. Berthold MR, Cebron N, Dill F, Gabriel TR, Kötter T, Meinl T, Ohl P, Sieb C, Thiel K, Wiswedel B (2007) KNIME: the Konstanz Information Miner. Studies in classification, data analysis, and knowledge organization. Springer, Berlin, Heidelberg
9. R Core Team (2013) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna
10. Krumeich J, Zapp M, Mayer D, Werth D, Loos P (2016) Modeling complex event patterns in EPC-models and transforming them into an executable event pattern language. Multikonferenz Wirtschaftsinformatik (MKWI), pp 81–92
12. Software AG (2017) Company white paper: why you need Zementis predictive analytics. https://resources.softwareag.com/products-analytics-decisions/why-zementis-whitepaper. Accessed 13 Feb 2018
13. Frost & Sullivan (2016) Automotive industry IT spending, CIO focus, trends, and highest growth areas. Report
14. Woitsch R, Hrgovcic V (2011) Modelling knowledge: an open model approach. Proceedings of the 11th International Conference on Knowledge Management and Knowledge Technologies
15. Guschlbauer E, Lichka C (2013) Umsetzung des Prozesscontrollings. In: Prozessmanagement für Experten, Impulse für aktuelle und wiederkehrende Themen. Springer Gabler, Berlin, Heidelberg
16. Woitsch R (2004) Process-oriented knowledge management: a service-based approach. PhD thesis, Vienna
17. Woitsch R, Utz W, Hrgovcic V (2013) Integration von Prozess- und Wissensmanagement. In: Prozessmanagement für Experten, Impulse für aktuelle und wiederkehrende Themen. Springer Gabler, Berlin, Heidelberg
18. Lichka C (2006) Der modellbasierte Business Scorecarding-Ansatz zur Strategieoperationalisierung. PhD thesis, University of Vienna
19. Karagiannis D, Woitsch R (2010) Knowledge engineering in business process management. In: Business process management 2: strategic alignment, governance, people and culture. Springer, Berlin, Heidelberg
20. Roussopoulos N, Utz W (2016) Design semantics on accessibility in unstructured data environments. In: Domain-specific conceptual modelling: concepts, methods and tools. Springer, Berlin, Heidelberg
21. Utz W, Woitsch R (2017) A model-based environment for data services: energy-aware behavioral triggering using ADOxx. In: Collaboration in a data-rich world. PRO-VE 2017, vol 506. Springer, Berlin, Heidelberg
23. Wooldridge M (2002) An introduction to multiagent systems. Wiley & Sons, Hoboken
27. Davis A, Parikh J, Weihl WE (2004) Edge computing: extending enterprise applications to the edge of the internet. ACM, New York
30. Kabáč M, Consel C, Volanschi N (2017) Designing parallel data processing for enabling large-scale sensor applications. Personal and Ubiquitous Computing
31. Rossiter J (2003) Model-based predictive control: a practical approach. CRC Press, Boca Raton
32. Bemporad A (2006) Model predictive control design: new trends and tools. Proceedings of the 45th IEEE Conference on Decision and Control
33. Kouvaritakis B, Cannon M (2001) Non-linear predictive control: theory and practice. The Institution of Engineering and Technology, IEE Publishing
34. Park K, Zheng R, Liu X (2012) Cyber-physical systems: milestones and research challenges. Int J Comput Telecommun Ind 36:1–7
35. Stojanovic N, Dinic M, Stojanovic L (2015) Big data process analytics for continuous process improvement in manufacturing. IEEE Big Data, Santa Clara, CA, pp 1398–1407
39. Woitsch R, Hrgovcic V, Buchmann R (2012) Knowledge product modelling for industry: the PROMOTE approach. 14th IFAC Symposium on Information Control Problems in Manufacturing, International Federation of Automatic Control