
The STREAM-0D Project

The STREAM-0D project has been funded by the EU within the Horizon 2020 research and innovation framework programme; the goal of the project is the creation of an innovative integrated control system to be applied in production lines.

The project consortium brings together research institutes and universities, SMEs and end-user companies from seven different European countries. ITAINNOVA is the project coordinator and STANDARD PROFIL a participant; both are partners of the Spanish rubber CONSORCIO, which makes this a clear example of a collaborative project.

The main goal of the project is the creation of an innovative integrated control system to be applied in production lines to ensure Zero-Defect Manufacturing; Data-Driven Models (DDMs) are a crucial element of this system, used to predict potential product defects.

Within the STREAM-0D project, the partners collect and harness the vast amount of data their production lines generate, and create helpful data visualisations to identify patterns and validate improvements to their processes. The IES-SCAN data platform includes an array of data analysis capabilities, which are used in STREAM-0D to enable advanced modelling and analysis of the pilot cases according to their very diverse requirements.

As the STREAM-0D project progressed, the ZF team in Gliwice and ITAINNOVA, one of the other partners involved in the project, implemented a ROM (Reduced Order Model, a mathematical model) and created a new station – the HBubble – that allows the precise determination of one of the most important parameters of the brake booster: the output pressure.

ZF Brake Booster - Lap gap


ZF Brake Booster - Output rod-Disc reaction


ZF Brake Booster - Output rod with data matrix

Getting to know the HBubble station

The HBubble test station performs the reaction disc deflection test, measuring, among other things, the height of the extruded bubble. Based on the results achieved by STREAM-0D, a correlation between the deformation and the height of the reaction disc after crushing led to the identification of the so-called lap gap, which has a direct impact on the output pressure mentioned earlier.

The output rod and reaction disc components used during the test take part in the next steps of the process; the added value is the test data saved in the data matrix code placed on the output rod.

All measuring instruments used to build the station, as well as the test methods and ROM algorithms, were carefully selected and specified by ZF Gliwice and ITAINNOVA, based on data analysis carried out directly on the production line in Gliwice and on several other tests performed during ongoing production. This is how the appropriate force, travel and speed of the HBubble test were determined.

The output rod is permanently marked with a data matrix that also carries the information obtained through the above-mentioned tests. This represents an added value in terms of full traceability in the subsequent steps of the brake booster assembly process: it is now possible to react immediately to any deviation from the process parameters, or even to carry out real-time control of the process based on the mathematical ROM model.

Product Traceability

Thanks to these new implementations, the components included in the finished goods can always be identified at each stage of production. Also, each element of the production process can “communicate” with the others, unifying the real world of production machines with the virtual world of the Internet and information technology.

Thanks to the implementation of the STREAM-0D solution, Standard Profil experienced great improvements in data management: process data are now merged with product data, allowing better control of production. In this article, Standard Profil itself tells the story of this advancement.


Gathering data is not something new: since 2013, some devices had already been capturing data at Standard Profil. However, only a few of them were actually capable of doing so, since there was no connectivity standard for the equipment. Also, we did not process the data beyond storing it. There was also a clear separation between process and product data, as each source had its own software layer. These sources were isolated from each other, and nothing but the eyes of those who queried the data connected them.

The methodology was simple: we stored everything and, in case we found we had a problem, we checked the stored values and attempted to guess what had happened.

Time for improvements

The first improvement was making our system reactive. We had a clear need to know when something went wrong at the moment it happened, not later, when the consequences are worse. With this idea as a base, we started to build a system to monitor the large number of parameters that our operators were using as recipes during production.
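The kind of reactive check described here can be sketched in a few lines; the recipe names and tolerance bands below are hypothetical placeholders, not Standard Profil's actual parameters.

```python
# Minimal sketch of a reactive parameter monitor: each process sample is
# checked against the recipe's tolerance bands as soon as it arrives.
# Parameter names and limits are illustrative assumptions only.

RECIPE = {
    "extruder_temp_C": (180.0, 195.0),   # (min, max) tolerance band
    "line_speed_m_min": (8.0, 12.0),
}

def check_sample(sample):
    """Return the list of parameters that are out of tolerance (or missing)."""
    alarms = []
    for name, (lo, hi) in RECIPE.items():
        value = sample.get(name)
        if value is None or not (lo <= value <= hi):
            alarms.append(name)
    return alarms

# An in-tolerance sample raises no alarms; an out-of-tolerance one does.
print(check_sample({"extruder_temp_C": 188.0, "line_speed_m_min": 10.5}))  # []
print(check_sample({"extruder_temp_C": 201.0, "line_speed_m_min": 10.5}))  # ['extruder_temp_C']
```

The point of the design is the immediacy: the check runs per sample, rather than querying stored data after a defect has already been produced.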

Drawing on our people's experience, we started to learn the key parameters for our process and for the quality of the product. From a developer's point of view, we liked the idea of leaving behind the issue-based ‘data pooling’ strategy we had been using.

Since then, we have defined standards for data capture on new equipment and invested time in developing a methodology that allows us to join our data sources and establish relations between process and product data. Connecting the worlds of process and product data was the next logical step.
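Joining the two data sources amounts to merging records on a shared key; a minimal sketch, assuming a hypothetical batch identifier and field names (not the actual schema):

```python
# Hedged sketch: merge process records and product records that share a
# batch id, producing one combined record per batch. All field names
# ("batch", "mix_temp_C", "hardness_shoreA") are illustrative assumptions.

process = [
    {"batch": "B001", "mix_temp_C": 62.1},
    {"batch": "B002", "mix_temp_C": 63.4},
]
product = [
    {"batch": "B001", "hardness_shoreA": 71},
    {"batch": "B002", "hardness_shoreA": 69},
]

def join_on_batch(process_rows, product_rows):
    """Merge the two sources into one record per batch id."""
    by_batch = {row["batch"]: dict(row) for row in process_rows}
    for row in product_rows:
        by_batch.setdefault(row["batch"], {}).update(row)
    return list(by_batch.values())

merged = join_on_batch(process, product)
print(merged[0])  # {'batch': 'B001', 'mix_temp_C': 62.1, 'hardness_shoreA': 71}
```

In practice the join key and the much higher sampling rate of process data make this harder than the sketch suggests, which is exactly the imbalance discussed below.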

Making the first studies of the relations between process and product parameters brought us our first conclusions, and probably our first frustrations. How is it possible that two apparently similar process batches led to products with different properties? By now, we know that we just needed more data.

Experience comes into play

Our workers’ experience has been the main source of knowledge for a long time because of the complexity of the process and the cost (mostly in time) of the equipment required to measure the properties (both physical and chemical) of the materials used.

Starting with the mix used to elaborate the rubber (our main raw material), followed by extrusion, which gives the profile its main shape, and vulcanization, we work along a production line that can measure more than 100 metres, filled with equipment from different manufacturers. The main properties of the materials are measured, as are those of the resulting products, but at a lower frequency than our process data, which is monitored in real time. There is currently not much affordable equipment to do this in real time, due to our very specific requirements. This turned out to be the main caveat when relating process and product parameters. Even discriminating between good and bad parts is not a direct task, as the flexible nature of the product makes online inspection systems complex.

The amounts of data from process and product sources differ by orders of magnitude. In some of our tests, we have a ratio of 1:18,000 product to process data points.



With the adoption of projects like STREAM-0D, and by working with a wide group of partners, we were able to reach conclusions that enabled us to find these relations and to develop equipment that captures mechanical properties of our product and links them, through data processing, with other properties that are non-trivial to measure. We are now able to generate models that point us in the direction of improving our processes much faster than by user experience alone. Sometimes the conclusion is that we still lack some key data in a part of the process: if that is the case, we check the available solutions for capturing it. If no equipment fits our requirements, we study other measurable items that correlate with the data we are trying to measure.

This way of giving weight to a set of parameters is what makes the difference between having data and having information. Being able to quantify the importance of a variable is key nowadays in the life of a process, but being able to predict the result of a process from its inputs, in real time and at constant computational cost, is what innovative solutions like the methodologies developed within STREAM-0D propose.

Detailed simulations alone are not sufficient to create effective process control: in order to obtain really accurate prediction models of the production process running in real time, ROMs (Reduced Order Models) must also be fed with offline real data coming from the plant.

Evaluating a ROM under certain conditions to forecast an output is something that can be done in real time, but ROM-based control systems are not so common, and the strategy to build them may differ depending on the process characteristics. Why? The reason is that simulations do not cover all possible combinations of inputs, transient periods or sudden changes in input variables and conditions.

The STREAM-0D project proposes a methodology to generate ROMs based on detailed simulations of the manufacturing process. The use of ROMs for process control purposes (namely, making accurate predictions and setting up a methodology for the correction of potential deviations) is essential: with ROMs it is in fact possible to cover a wide scenario of variables, which cannot be done with simulation alone.

More than a simulation

The STREAM-0D proposal relies on inferring new output data from simulated data. With reference to the bearings pilot case, a simplified example scenario would consider a pair of predicted results from simulation:

  • firstly, a simulation would run with a surface temperature of 65ºC and an input dimension of 30.00 mm, providing an output of 29.995 mm for the same dimension at 25ºC;
  • secondly, a simulation would run at 67ºC for the same 30.00 mm dimension, estimating an output of 29.993 mm at 25ºC.

For the unavailable simulation of the same 30.00 mm dimension at 66ºC, a fairly good approximation would therefore be 29.994 mm.
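The inference above is a plain linear interpolation between the two simulated points, and can be sketched directly with the numbers from the text:

```python
# Linear interpolation between two simulated (temperature, output) pairs,
# using the values from the bearings example above.

def interpolate(t, t0, y0, t1, y1):
    """Estimate the output at temperature t from two simulated points."""
    return y0 + (y1 - y0) * (t - t0) / (t1 - t0)

# Simulated results: 65 C -> 29.995 mm, 67 C -> 29.993 mm (both at 25 C).
estimate = interpolate(66.0, 65.0, 29.995, 67.0, 29.993)
print(round(estimate, 3))  # 29.994
```

A real ROM interpolates in many input dimensions at once, but the principle is the same as in this one-dimensional sketch.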

Although this is a feasible way of creating a control model with detailed simulation data, it is still a blind model: after a prediction is made, its accuracy is never checked. Therefore, in order to match the simulations with the real world, the STREAM-0D team works on making the proposed offline control model able to learn from reality. To do so, once the simulation model has been built, it is fed offline with real data from the plant.

The goal is to reproduce the actual output from the real input data and thus validate or correct the data predicted by the ROM.

The principles underlying this process are the same as those used in data analytics or machine learning, that is, inferring relationships between inputs and outputs based on experimental/real data. The STREAM-0D approach is not a brute-force attack from inputs to outputs, but a fine tune from predicted outputs to actual outputs, as most of the task is done beforehand through detailed simulations. This makes the learning process much faster and simpler, so it can be implemented in the control model and run at real-time production pace.
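One simple way to realise this "fine tune from predicted outputs to actual outputs" is an affine correction fitted offline by least squares; the data points below are made up for illustration, not plant data, and the actual STREAM-0D correction method may differ.

```python
# Hedged sketch: fit actual ~ a * predicted + b offline, then use the
# fitted correction on new ROM predictions. Illustrative numbers only.

def fit_affine(pred, actual):
    """Ordinary least-squares fit of actual = a * pred + b."""
    n = len(pred)
    mx = sum(pred) / n
    my = sum(actual) / n
    sxx = sum((x - mx) ** 2 for x in pred)
    sxy = sum((x - mx) * (y - my) for x, y in zip(pred, actual))
    a = sxy / sxx
    b = my - a * mx
    return a, b

rom_pred = [29.995, 29.993, 29.990]          # ROM predictions (mm)
measured = [29.996, 29.994, 29.991]          # plant measurements: +1 um bias
a, b = fit_affine(rom_pred, measured)
corrected = a * 29.992 + b                   # correct a new prediction
print(round(corrected, 3))  # 29.993
```

Because the correction only reshapes predictions the ROM has already produced, evaluating it is a couple of multiplications, cheap enough for real-time production pace.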

Summarizing, the STREAM-0D methodology comprises a detailed simulation of the manufacturing process, a control model based on the simulation data, and a continuous refinement of the model through actual data, all done offline until the prediction confidence is good enough to hand over control of the plant to the model.

At that stage, having a model capable of accurately forecasting the output will dramatically reduce the amount of rejected output.

Predictive models need to be light and portable, and to be usable without the hassle of providing each and every user with a high-capacity internet connection to cloud service providers.

Tensor compression

This is where compression methods come into play. Specifically, for the STREAM-0D project, ECN relies on tensor compression to achieve compact encapsulation and efficient manipulation of multi-dimensional data. As a consequence, simulation predictions can be encoded with little data and decoded on the user's demand, in real time, with little computing effort, on virtually any computing platform.


STREAM-0D successfully links physics-based models with data-driven models (DDMs) and provides a smart adaptive control system for the detection and prevention of defects. In this context, the work of LMS, which is one of the project partners, is crucial.

LMS describes the importance of automated data acquisition within the framework of Industry 4.0 in the direction of Zero-Defect Manufacturing (ZDM). The existing OT systems (SCADA, PLCs) and IIoT devices generate vast amounts of data related to machine and process states, as well as to product quality. Extracting process control value from these requires, on the one hand, a solid data management and processing infrastructure and, on the other hand, robust tools for data transformation[1].

Process control is currently based mainly on static, closed-loop systems with limited parameterization capabilities and simplified functionalities. These aspects lead to poor defect detection and prevention characteristics. Therefore, enhancing the accuracy, adaptability and effectiveness of the current control scheme is essential[2].

The evolution of adaptive control

Physics-based and data-driven models exhibit exceptional characteristics towards a more adaptive and precise control architecture. Although computationally demanding, the former offer determinism and high accuracy by mathematically formulating complex aspects of manufacturing processes and products whose underlying physics are already known[3]. Data-driven models, as the name implies, offer a probabilistic representation of the system, using machine learning and statistical methods for their implementation, along with a substantial amount of training and evaluation data[4].

The integration and real-time use of these models can transform the data from the sensor systems by adding control value and creating additional opportunities in assessing, evaluating and optimizing the manufacturing processes[2]. In addition, pieces of an initial framework for implementing complete digital instances of processes and products can emerge from these developments[5].

It is worth mentioning that the concept of adaptive control within manufacturing is not new; however, the linking of real-time simulation models and process control only became possible with the major developments in ICT over the past decade. Better, more accurate simulation tools, combined with precise data acquisition systems, sensors and monitoring systems, are able to provide a complete approach and exploit correlations and structures within the production data.

LMS - Adaptive Control

In STREAM-0D, we aim to provide these methodologies and systems as a complete, unified solution for the purpose of optimizing production and achieving Zero-Defect Manufacturing. STREAM-0D successfully links physics-based models with data-driven models and provides a smart adaptive control system for the detection and prevention of defects. The per-case customized user interface links the modules together, completing the innovative STREAM-0D solution towards Zero-Defect Manufacturing.

STREAM-0D represents a clear example of the Industry 4.0 paradigm, providing also an interesting model of technological integration.

Videos are available at the following links:




