Session 3 Poster Session

Wednesday, 11 November 2020, 13:40-15:20


Hendro Wicaksono: Design of Virtual Engineering and Digital Twin Platform as Implementation of Cyber Physical Systems

13:40-13:50

(Paper ID: 1124)

Many industries in Indonesia face several challenges in adopting new technologies from Industry 4.0, especially Digital Twin (DT) and Virtual Engineering (VE), which are integrated with Cyber-Physical Systems (CPS). A lack of human resources and funding remains a significant obstacle to developing DT and VE. This preliminary study introduces the initial design of an open-source platform for creating Virtual Engineering applications that combine Digital Twin concepts and immersive experiences using open-source tools and affordable hardware. The platform consists of three parts: 3D Object Management, the VE Module, and the VE Interface. The developed platform has successfully demonstrated how a physical device can be integrated with its virtual model using a Digital Twin. With this result, industry stakeholders can learn from and build on this design to develop digital twin platforms in their own industries.
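The physical-to-virtual link the abstract demonstrates can be sketched as a minimal publish/subscribe loop. Everything below (class names, the temperature signal) is illustrative rather than the platform's actual API, which builds on open-source tools such as a message broker:

```python
class PhysicalDevice:
    """Simulated physical device that publishes its state to subscribers."""
    def __init__(self):
        self.temperature = 20.0
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def set_temperature(self, value):
        self.temperature = value
        for cb in self.subscribers:       # publish the new state
            cb({"temperature": value})


class DigitalTwin:
    """Virtual model that stays synchronized with its physical counterpart."""
    def __init__(self, device):
        self.state = {}
        device.subscribe(self.on_update)

    def on_update(self, payload):
        self.state.update(payload)        # mirror the physical state


device = PhysicalDevice()
twin = DigitalTwin(device)
device.set_temperature(42.5)
print(twin.state["temperature"])          # the twin reflects the device state
```

In a deployed platform the callback would be replaced by a network transport (e.g. MQTT), but the mirroring pattern stays the same.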

Werner Quint: Big Data Management using Ontologies for CPQ Solutions 

13:50-14:00

(Paper ID: 1115)

In recent years, due to the growing complexity of handling and processing business data, proper big data management has become a challenge, especially for SMEs with limited resources to invest in the required business transformation process.

As a solution, we suggest an ontology-based CPQ software approach and show how the implementation of semantic technologies and ontologies affects data integration processes. We also propose a method called “ontology-based data matching”, which allows the semi-automatic generation of alignments used to formalize the correspondences between ontologies. The proposed method ensures consistency during integration, significantly improving the productivity of enterprises.
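As a loose illustration of what generating candidate alignments could look like at its simplest, string similarity between the class labels of two hypothetical ontologies can propose matches for human review. The labels, threshold, and lexical technique are assumptions for illustration only; the paper's method works on the semantic level:

```python
from difflib import SequenceMatcher

def propose_alignments(classes_a, classes_b, threshold=0.6):
    """Return candidate (a, b, score) alignment pairs for human review."""
    candidates = []
    for a in classes_a:
        for b in classes_b:
            score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if score >= threshold:
                candidates.append((a, b, round(score, 2)))
    return sorted(candidates, key=lambda c: -c[2])

# Invented class labels of a CRM ontology and an ERP ontology
crm_classes = ["Customer", "ProductVariant", "PriceList"]
erp_classes = ["CustomerAccount", "Product", "Pricelist"]

for a, b, score in propose_alignments(crm_classes, erp_classes):
    print(a, "->", b, score)
```

The output is a ranked list of semi-automatic suggestions; a knowledge engineer would confirm or reject each pair before the alignment is formalized.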

Maurice Meyer: Framework for Data Analytics in Data-Driven Product Planning

14:00-14:10

(Paper ID: 1163)

Industry 4.0 and the trend towards digitization have changed today's products significantly. So-called cyber-physical systems are able to capture and process data locally and communicate it to other systems or users. At the same time, concepts such as PLM and the digital twin allow more data to be analyzed along the product life cycle. While many companies are currently trying to use this data to create data-based services, it also opens up new possibilities and potentials for product planning and product engineering. Analyzing product life cycle and usage data allows conclusions to be drawn about the usage of the product, its current state and failures, and thus reveals potentials for optimizing and planning the next product generation. Product life cycle analytics therefore plays an essential role in data-driven product planning. In addition to the actual analysis, an analytics project must always take into account the use case as well as data collection and acquisition.

In this paper we propose a framework for the successful realization of data analytics solutions in product planning. Starting from a thorough analysis of the challenges in data-driven product planning, we derive requirements for structured data analytics solutions in product planning. The proposed solution builds on standard models such as CRISP-DM [1], the four-layer model for analytics use cases and the Analytics Canvas [2], and offers structured approaches to fulfill the specialized requirements of data-driven product planning. It consists of four phases, “use cases”, “data sources”, “data acquisition & integration” and “data analysis”, each with corresponding methods:

Firstly, we present approaches for identifying relevant analytics use cases based on issues within product planning. Secondly, we propose methods for holistically identifying and structuring data sources along the product life cycle. For the third phase, we summarize methods and procedures for acquiring and integrating multiple data sources. Lastly, we present a structured catalogue of methods and a selection procedure that simplifies the definition of the algorithmic data analytics solution.

Based on a specific application example, we illustrate the framework's application potential.

Carlos Solon Soares Guimaraes: IoT Architecture for Interoperability and Monitoring of Industrial Nodes 

14:10-14:20

(Paper ID: 1118)

Rapid advances in industrialization methods, driven by information and communication technologies, sensor networking, ubiquitous computing, and more recently the Internet of Things, have spurred progress in developing the next generation of manufacturing technology. This article addresses the problem of incompatibility between the components required by the automation and robotics industry. The architecture model and its implementation respond to needs identified by the industry, facilitating the interoperability and measurability of industrial technologies. In the industry value chain, monitoring for intelligent maintenance plays a key role: maintaining productive availability and allowing assets to be used throughout their life cycle at the lowest operational cost.

Based on advances in the frameworks and middleware defined by robotics software, an IoT architecture for industrial environments (IIoT) is proposed, based on reference models for Industry 4.0. It provides edge layers for field devices, fog layers for gateways and supervisors, and cloud layers for business logic and services, according to the devices or industrial nodes involved. The validation of the IIoT architecture presents the results of an experimental analysis of sensor tests applied in a manufacturing environment to enable the integration and sharing of resources. Finally, we discuss the contributions of the research to the high cohesion and low coupling of the architecture, based on interoperability between the industrial nodes.

Marvi Michael Müller: Knowledge management on the shop floor through recommender engines

14:20-14:30

(Paper ID: 1131)

Shop floor management is a widespread approach for problem solving in production. Processing problems requires written communication, often on paper, but a rising number of companies use digital systems. The increasing flexibility, internationality and complexity of production systems make it difficult to sustain the results of a problem solving process through standardisation. As a result, already-solved problems recur in similar form and have to be addressed again because the knowledge about their solutions is not shared. Knowledge management methods as developed in the 2000s are not widely used in production because of the high effort they require. To close that gap, this paper presents an intelligent system that recommends knowledge about previously solved problems in the production environment.

Recommender engines are widely used, for example in web shops. However, creating a value-adding, industry-ready application for knowledge management on the shop floor requires innovations to overcome several obstacles:

  • User interface and workflow: The system needs to be embedded in already established workflows to prevent additional effort.
  • Recommendation quality: The recommended elements have to be valuable for the user.
  • Amount of input data: The system can only be as good as the underlying data. The problem solving process and the corresponding data structure need to be easy and quick enough to be applied frequently by workers while still documenting enough information.

To overcome these obstacles four steps are needed: a) Understand workflow and data structure in companies, b) design system and implement, c) test the quality of the recommendations and d) validate via user feedback.

a) Qualitative analyses of problem solving management are conducted in four companies regarding workflows and the generated data: its amount, structure, frequency and content. From these findings, a common data structure and common workflows are developed.

b) An idea management system is adapted to implement the workflow and utilize the recommendation system.

c) A major threat to the quality of recommendations is the extremely short texts generated in daily problem solving. To test the quality of the recommendations under this condition, a dataset of 5584 solved issues from one of the interviewed companies is evaluated with 195 relevant test inputs, and the common recommender metrics precision and recall are measured. To find the ideal setting of the recommender, 20 different scenarios are tested.
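The precision and recall measured in step c) follow directly from the overlap between recommended and relevant items; a minimal sketch with invented issue IDs:

```python
def precision_recall(recommended, relevant):
    """Compute precision and recall from recommended vs. relevant item sets."""
    recommended, relevant = set(recommended), set(relevant)
    hits = recommended & relevant
    precision = len(hits) / len(recommended) if recommended else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Example: 3 of 5 recommended issues are actually relevant,
# out of 4 relevant issues overall.
p, r = precision_recall(["i1", "i2", "i3", "i4", "i5"], ["i1", "i3", "i4", "i9"])
print(p, r)  # 0.6 0.75
```

Sweeping a recommender's settings, as in the 20 scenarios mentioned above, amounts to recomputing these two numbers per configuration and comparing.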

d) To validate the value of the recommendations, the results are discussed with the creators of the test data set. Finally, a quantitative study with 15 industry representatives is conducted to confirm the validity of the results of the qualitative study.

In summary, this work makes three contributions: it suggests a suitable data structure for recommendations on the shop floor, it presents results on how recommender engines perform with shop floor data, and it gives insights into whether industry perceives such a system as creating value.

Hendro Wicaksono: An Automated Information System for Medium- to Short-Term Manpower Capacity Planning in Make-To-Order Manufacturing

14:30-14:40

(Paper ID: 1119)

In today's tough economy, it is important for Make-To-Order (MTO) companies to be responsive to customer demand and market fluctuations while keeping costs as low as possible. Strategies such as holding finished goods in inventory as a buffer against variations in customer demand cannot solve the problem because of both high inventory costs and inflexibility towards customer preferences. Therefore, new cost-efficient strategies that respond directly to changes in market demand are required. Unlike Make-To-Stock (MTS) companies, MTO companies hold capacity in reserve and are thus able to utilize available capacity efficiently to satisfy customer needs. This leads to a constant capacity planning problem: companies face fluctuations between overload due to insufficient capacity and idleness due to excess capacity relative to the level of demand.

Among all planning resources, the available manpower is one of the most essential parts of MTO operations. Therefore, allocating and adjusting manpower capacity to suit different planning horizons is a predominant measure for meeting changing capacity demands. Nonetheless, assigning each individual worker to various types of tasks and orders on a day-to-day basis, continually over a planning horizon of several weeks or months, is difficult.

This paper presents an automated manpower planning model that MTO operations can use to achieve better transparency and synchronization of capacity load for short- to medium-term planning horizons. The approach is implemented as a software tool that automates data processing and analysis, dramatically reducing the corresponding data handling effort and planning time.

The tool connects to different IT systems and extracts data from them automatically. It then analyzes the data and visualizes the capacity load on a day-to-day basis at the work center level. Furthermore, the tool contains a simulation function to run scenarios of adding, deleting and shifting orders in order to manage capacity changes and prioritization.
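The day-by-day capacity-load view and an order-shifting scenario can be sketched as follows. The order data, work-center name, and available hours are invented here, since the real tool extracts them from the connected IT systems:

```python
from collections import defaultdict

def capacity_load(orders):
    """Sum planned hours per (day, work_center)."""
    load = defaultdict(float)
    for order in orders:
        load[(order["day"], order["work_center"])] += order["hours"]
    return dict(load)

orders = [
    {"id": "A-100", "day": "Mon", "work_center": "assembly", "hours": 6.0},
    {"id": "A-101", "day": "Mon", "work_center": "assembly", "hours": 3.0},
    {"id": "A-102", "day": "Tue", "work_center": "assembly", "hours": 4.0},
]
available = 8.0  # manpower hours available per day at this work center

baseline = capacity_load(orders)  # Monday is overloaded (9 h > 8 h available)
# Scenario: shift order A-101 from Monday to Tuesday to relieve the overload.
shifted = [dict(o, day="Tue") if o["id"] == "A-101" else o for o in orders]
scenario = capacity_load(shifted)  # both days now fit within the 8 h available

print(baseline[("Mon", "assembly")], scenario[("Mon", "assembly")])  # 9.0 6.0
```

Running such what-if scenarios against the visualized load is the essence of the simulation function described above.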

This paper also presents the validation of the approach and tool in a real production unit of a small German MTO manufacturing company. The validation shows that the approach gives production planners enough time to take measures to adjust capacity in advance. It provides them with a visualized number of redundant workers on the production line over the next few weeks, allowing them to reassign surplus manpower through measures such as encouraging the use of time accounts and annual leave, supporting other units that need extra capacity, and shifting production slots.

Gunnar Vorwerk-Handing: Consideration of Uncertainty within the Conceptual Integration of Measurement Functions into Existing Systems

14:40-14:50

(Paper ID: 1105)

In order to facilitate the integration of new, intelligent functionalities into components and technical systems, for example in the context of Industry 4.0, meaningful information on characteristic process and state variables is required. Examples are real-time measurements of process-related variables or measurements of the current system status, e.g. with regard to downtimes and maintenance intervals. However, many existing systems, especially those with long lifecycles, were not developed against this background or under these requirements and therefore cannot provide this information satisfactorily. A promising approach for providing the required process and state variables in existing technical systems is the subsequent integration of measurement functions, e.g. in the form of retrofitting. The specific technical implementation of such an integration depends on the individual system. However, the main questions of where to measure and which measurand to use to obtain the required information arise for every system.

Prior works describe how a combination of physical effects, taking the properties of a system into account, can be used to find and establish correlations between a required target quantity and potential measurands. A distinction between the target quantity and the measurand is assumed, whereby quantifying the target quantity is the actual objective and the measurand is only measured for this purpose. The aim of this model is to establish and describe the physical relationship between the system-specific target quantity and potential measurands. Different potentially usable physical effects as well as the properties of components offer a large number of potential solutions at this level. However, there is a significant degree of uncertainty regarding the dependence of the potential solutions on environmental and boundary conditions, e.g. in terms of disturbance variables and/or the properties of processes and components. For this reason, identifying and considering uncertainty is of great importance, especially if it enables an early assessment of the viability of the solutions.

In this paper, existing approaches for the identification of uncertainty are analysed and adapted with regard to the integration of measurement functions into existing systems. In the model described in prior works, the physical relationship between a target quantity and potential measurands is established. The developed extension uses this correlation in reverse: the effects of the identified uncertainty are estimated from the measurand back to the target quantity and can thus be considered in the development process. The identification of uncertainty and the consideration of its effects on the determination of the target quantity are demonstrated using the example of a sensor integration in an elastic claw coupling.
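As a numeric illustration of propagating measurand uncertainty to the target quantity (the reverse use of the correlation described above), first-order error propagation suffices. The torque-from-twist relationship and all numbers below are assumed for illustration, not taken from the paper:

```python
def propagate(f, x, dx, h=1e-6):
    """First-order propagation: |df/dx| * dx, derivative via central difference."""
    dfdx = (f(x + h) - f(x - h)) / (2 * h)
    return abs(dfdx) * dx

# Example: torque (target quantity) inferred from shaft twist angle
# (measurand) through an assumed linear torsional stiffness k.
k = 1200.0  # torsional stiffness in N*m/rad (assumed)

def torque(angle):
    return k * angle

# An angle uncertainty of 0.5 mrad maps to a torque uncertainty of ~0.6 N*m.
uncertainty = propagate(torque, x=0.01, dx=0.0005)
print(round(uncertainty, 3))
```

For a nonlinear relationship the same routine gives a local, first-order estimate, which is exactly the kind of early viability assessment the abstract motivates.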

Linda Salma Angreani: Systematic Literature Review of Industry 4.0 Maturity Model for Manufacturing and Logistics Sectors

14:50-15:00

(Paper ID: 1127)

A maturity model is a widely used technique to measure several aspects of an organization's processes and identify their current state, which can serve as a starting point for business improvement. In the Industry 4.0 context, several terms are used for such models, including readiness assessment model, roadmap, framework, and maturity index. They share the purpose of measuring how capable an organizational unit currently is of adopting and implementing the concept of Industry 4.0 in the future. Many researchers have proposed maturity models for assessing Industry 4.0 readiness and maturity since the term Industry 4.0 was introduced in 2011. However, there has been no attempt to analyze the empirical evidence systematically. This paper aims to analyze currently available maturity models related to Industry 4.0 and provide a synthesis of those models from 2011 to 2019. It describes a systematic literature review (SLR) of empirical studies of maturity models published in several reputable and relevant sources. The review focuses on the manufacturing and logistics sectors, since processes in both sectors can be greatly improved through the introduction of technologies such as cyber-physical systems, the internet of things, and artificial intelligence.

In general, the primary purpose of the review is to address the following questions:

(1) Based on what dimensions do researchers develop Industry 4.0 maturity models, and what are the most used and influencing parameters in those dimensions?

(2) How do those maturity models compare to each other in terms of dimension complexity, techniques, maturity leveling, and kind of application sectors of the model?

In conclusion, the maturity model in the context of Industry 4.0 is a promising guide for adopting Industry 4.0 technologies at the organizational level. However, having a maturity model alone is not enough; more effort is needed to facilitate its application. This paper also provides recommendations for researchers and practitioners on implementing it.

Majid Sodachi: Inspiration of Industry 4.0 to Enable a Proactive Sustainability Assessment Model through the Supply Chain

15:00-15:10

(Paper ID: 1196)

Nowadays, the implementation of sustainability concepts and sustainability assessment frameworks is a crucial factor for ensuring the competitive advantage of many industries. Many pioneering firms have spread sustainability frameworks through their supply chains from the upstream (strategic) level to the downstream (operational) level. Sustainability assessment is thus increasingly considered a valuable tool in senior-level decision making. The long-term and dynamic behavior of parameters at this level poses challenges to the successful accomplishment of sustainability assessment. This becomes even more important as, with the advent of the Industry 4.0 paradigm, firms operate in a more dynamically changing environment and face rapid decision-making. Given the role of sustainability assessment in protecting firms across its three pillars, a tension arises between Industry 4.0 and sustainability assessment in smart manufacturing. This paper focuses on the concept of resilience to address this challenge.

This paper considers the dynamic behavior of data analytics in Industry 4.0 as it affects three domains, the firm, the environment, and society, and uses an analytical approach based on Markov decision processes to address sustainability issues. This enables an adaptive and proactive sustainability assessment model that encompasses the vast and seamless communication of data in Industry 4.0 among factory, environment, and society and supports a robust decision-making structure for realizing sustainability. The paper elaborates on the capabilities of Markov Decision Processes (MDPs) as a statistical tool for treating the dynamics of data analysis in the Industry 4.0 context while fulfilling sustainability assessment through a resilience structure. A case study is designed to investigate how the framework supports decision making at firm senior levels.
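To illustrate the MDP machinery the abstract relies on, a toy two-state sustainability model can be solved by value iteration. The states, actions, transition probabilities, and rewards below are invented for illustration and are not taken from the paper's case study:

```python
# P[state][action] = list of (probability, next_state, reward) outcomes
P = {
    "low_sustainability": {
        "invest": [(0.8, "high_sustainability", 5.0),
                   (0.2, "low_sustainability", -1.0)],
        "wait":   [(1.0, "low_sustainability", 0.0)],
    },
    "high_sustainability": {
        "invest": [(1.0, "high_sustainability", 2.0)],
        "wait":   [(0.6, "high_sustainability", 3.0),
                   (0.4, "low_sustainability", 0.0)],
    },
}
gamma = 0.9                        # discount factor

V = {s: 0.0 for s in P}            # value function, initialized to zero
for _ in range(200):               # value iteration until (near) convergence
    V = {
        s: max(
            sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
            for outcomes in P[s].values()
        )
        for s in P
    }

# Greedy policy with respect to the converged value function
policy = {
    s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2])
                                   for p, s2, r in P[s][a]))
    for s in P
}
print(policy)
```

In this toy model the optimal policy invests while sustainability is low and waits once it is high; the paper's framework uses the same formalism with data-driven states spanning the firm, environment, and society.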

Hendro Wicaksono: How Relevant Are Environmental Factors in Ergonomic Performance Assessments?

15:10-15:20

(Paper ID: 1120)

A suitable working environment is crucial to ensuring workers' safety and health, resulting in higher productivity in production systems. As one of the most important elements of production, assembly processes require the most human involvement. However, research on ergonomics in assembly systems focuses mostly on task-related physical factors such as action forces, posture, movement, and task repetition. This paper investigates the relevance of the environmental factors temperature, humidity, ventilation, noise, lighting, and cleanliness to assembly workers, as well as their importance relative to the task-related physical factors. A survey was conducted among 20 assembly workers and engineers to assess the urgency of integrating environmental characteristics into routinely conducted ergonomic performance assessments.

Discussion and Comments