QUALITY MANAGEMENT BLOG

How can Big Data help to improve QM in the future?

We are currently in the midst of a profound industrial upheaval. Digital transformation and Industry 4.0 are buzzwords that describe this change, and so is Big Data.

Digitalization in the manufacturing industry is advancing at an increasing pace. People and machines are becoming ever more closely networked, and Industry 4.0 is becoming reality: value creation increasingly takes place in networked structures. Analyses based on Big Data are among the most important approaches in this Industry 4.0. Big Data refers to data sets whose sheer volume or variety makes them impossible to process without digital methods. Beyond the manufacturing industry, Big Data analyses are often an integral part of the business model. In the financial sector, for example, it is standard practice to use Big Data analyses to identify market trends; such methods are also used in retail, for instance to develop better customer profiles.

Electronics industry leads the way

The manufacturing industry in Germany uses Big Data methods primarily for quality management. Almost seven out of ten companies that pursue Industry 4.0 approaches use Big Data. These companies evaluate data generated during production on a large scale in order to monitor and ensure product quality. The electronics industry is a pioneer here: three quarters of its companies use Big Data for quality management. In mechanical engineering, by contrast, Big Data is used almost as frequently in product development as in quality management.

Unused potential

Big Data methods are undoubtedly on the rise. Nevertheless, there is still untapped potential: even companies that use Big Data do not exploit all the possibilities of the approach. Only rarely, for example, do companies use the information gained to get to the bottom of the causes of quality problems. There is widespread recognition that valid data is the basis for identifying and solving quality problems, but companies have so far made only limited use of the information obtained during production. Often they are content with documenting when measurement data deviates from specified values or standards. This is only the first of several analytics stages in Big Data, known as Descriptive Analytics. At this stage, companies at least establish transparency in the manufacturing process, but cause-effect relationships cannot yet be investigated; that is the subject of so-called Diagnostic Analytics.
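
To make the descriptive stage concrete, here is a minimal sketch in Python, assuming measurement data is already available in memory; the part, nominal value, and tolerance are illustrative assumptions, not values from this article:

    # Descriptive Analytics in miniature: document when measurement data
    # deviates from the specified values. Nominal value and tolerance are
    # illustrative assumptions.
    NOMINAL = 25.00   # assumed target diameter in mm
    TOLERANCE = 0.05  # assumed permitted deviation in mm

    measurements = [25.01, 24.98, 25.07, 25.00, 24.93, 25.02]

    for i, value in enumerate(measurements):
        deviation = value - NOMINAL
        if abs(deviation) > TOLERANCE:
            # The deviation is documented, but its cause is not investigated;
            # that would already be Diagnostic Analytics.
            print(f"Measurement {i}: {value:.2f} mm deviates by {deviation:+.2f} mm")

As the comments indicate, this sketch stops exactly where most companies currently stop: the deviation is recorded, nothing more.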

IT prerequisites are often lacking

Predictive and Prescriptive Analytics go one step further. Here, the aim is to identify potential problems before they arise, or to rectify difficulties automatically once they have occurred. This ambitious approach requires that the company understands the relevant cause-effect relationships. The manufacturing analytics approach is even more sophisticated: here, the entire quality management is based on the findings of Big Data. However, many companies are not yet in a position to apply this method. They lack the know-how and the necessary technical prerequisites, such as adequate database structures and software. Often, these companies still use traditional QM systems, which are no longer efficient enough as product complexity increases.
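
As a toy illustration of the predictive idea, the following sketch extrapolates a drift in a hypothetical tool-wear series to estimate when a wear limit will be reached; the data, the limit, and the assumption of linear wear are all placeholders:

    import numpy as np

    # Hypothetical tool-wear measurements (mm), one per production cycle.
    cycles = np.arange(10)
    wear = np.array([0.02, 0.03, 0.05, 0.06, 0.08,
                     0.09, 0.11, 0.12, 0.14, 0.15])

    WEAR_LIMIT = 0.25  # assumed wear at which quality problems begin

    # Fit a linear trend to the observed wear and extrapolate it to the limit.
    slope, intercept = np.polyfit(cycles, wear, deg=1)
    cycles_until_limit = (WEAR_LIMIT - intercept) / slope
    print(f"Wear limit predicted to be reached around cycle {cycles_until_limit:.0f}")

Such a forecast is only as good as the assumed cause-effect relationship (here: wear grows roughly linearly with use), which is precisely the knowledge requirement described above.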

How to use your production data effectively

In an increasingly competitive environment, it is negligent not to collect and evaluate company data effectively. The basis for industrial analytics built on Big Data methods is that all relevant machines and production devices are equipped with Internet-of-Things components. In some cases, existing sensors have to be adapted for this purpose; as a rule, however, the necessary adjustments are straightforward. Once these prerequisites are met, the systematic collection and evaluation of data can begin: the data is first collected and then transferred to a central database (a minimal ingest sketch follows the list below). In order to design the Big Data process in a targeted manner, the following factors, often summarized as the five Vs of Big Data, must be taken into account:

  • Volume: What is the total amount of data to be collected and stored?
  • Variety: How different are the types of data to be collected and documented?
  • Velocity: How fast is data generated and transmitted?
  • Veracity: How accurate and reliable is the data quality?
  • Value: What value does the collected data represent for production and value creation?
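
Here is the promised ingest sketch in Python, using SQLite as a stand-in for the central database; in production this would typically be a historian or time-series store, and the machine, sensor, and table names are hypothetical:

    import sqlite3
    from datetime import datetime, timezone

    # SQLite stands in for the central database; schema is an assumption.
    db = sqlite3.connect("production_data.db")
    db.execute("""CREATE TABLE IF NOT EXISTS sensor_readings (
                      machine_id  TEXT,
                      sensor      TEXT,
                      value       REAL,
                      recorded_at TEXT
                  )""")

    def ingest(machine_id: str, sensor: str, value: float) -> None:
        """Store one IoT sensor reading in the central database."""
        db.execute(
            "INSERT INTO sensor_readings VALUES (?, ?, ?, ?)",
            (machine_id, sensor, value,
             datetime.now(timezone.utc).isoformat()),
        )
        db.commit()

    # Example: a networked milling machine reports its spindle temperature.
    ingest("milling_07", "spindle_temp_C", 64.2)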

Storing generated data and compiling it into comprehensive reports is nothing new. It becomes interesting for quality management at the point where even very small changes in the data are documented and evaluated with artificial intelligence or statistical methods. This is where data-mining algorithms come in: they are able to recognize recurring patterns in the data. With industrial analytics methods, it is now possible to map all operational processes comprehensively, whereas previously only individual, limited data sets could be documented and analyzed.
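
As a simple statistical stand-in for this pattern-detection step, the following sketch flags measurements that deviate strongly from recent process behaviour using a rolling z-score; window size, threshold, and the sample data are assumed parameters:

    import statistics

    def rolling_zscore_alerts(values, window=20, threshold=3.0):
        """Flag points that deviate strongly from recent process behaviour."""
        alerts = []
        for i in range(window, len(values)):
            recent = values[i - window:i]
            mean = statistics.fmean(recent)
            stdev = statistics.stdev(recent)
            # A point far outside the recent spread is a candidate anomaly.
            if stdev > 0 and abs(values[i] - mean) / stdev > threshold:
                alerts.append(i)
        return alerts

    # Even a small, sudden shift in otherwise stable data is detected.
    readings = [10.0 + 0.01 * (i % 3) for i in range(40)] + [10.3]
    print(rolling_zscore_alerts(readings))  # -> [40]

Detectors of this kind, applied across all networked machines, are the building blocks from which the comprehensive view of operational processes described above is assembled.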
