Part 1: How do model-based approaches impact Configuration Management?
Have you noticed that the speed of advancements in technology is rapidly increasing?
Have you thought about the impact of these advancements on Configuration Management (CM)? To date, the biggest impact on CM was the move from paper-based documents to digital documents, where the document was still recognizable as a document.
Now, with Industry 4.0 and the (Industrial) Internet of Things, the drive and need for a digital twin, and therefore a digital thread, will challenge the ‘classic’ CM view of what a document is.
With A Glimpse into the Future of CM, the Industry 4.0 committee of the IPX Congress will shed light on the impact of Industry 4.0 and the Internet of Things on CM, and on how these are changing the way products and processes are documented.
In the previous article (The True Impact of Industry 4.0 revealed), the model-based enterprise (MBE) was identified as the top high-risk area impacting configuration management.
The role of CM in an organization
To understand the impact of MBE on CM, we first must understand the purpose of CM in a company.
Every company exists because it creates value; if a company fails to create value for long enough, it goes bust. To create this value, each company has a value chain, in which people create and/or use knowledge to contribute to the value-creation activities.
This is where CM comes into the picture: these knowledge artifacts need to be identified, structured, and linked so that people can find, use, and reuse them, and so that consistency and traceability are ensured. Typically, in CM this knowledge is stored in documents that are linked to the items they describe.
The world is dynamic, and change is part of that. This knowledge is not created once but changed many times, so there is a change process with dedicated roles to facilitate impact analysis, implementation planning, and the validation and release of knowledge artifacts to the repository of released knowledge (which in a company consists of many databases). This process ensures that the set of released knowledge artifacts remains consistent, that changes are traceable, and that a user always knows which revision is the effective one; in other words, that you are in control and not in a state of chaos.
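The idea of always knowing the effective revision can be sketched in a few lines. This is a minimal illustration only; the item number, revisions, and dates below are invented for the example, not taken from any real CM system:

```python
from datetime import date

# Hypothetical release records per item: (revision, release date), oldest first.
releases = {
    "PUMP-100": [("A", date(2018, 1, 15)), ("B", date(2018, 6, 1))],
}

def effective_revision(item, on_date):
    """Return the latest released revision on or before `on_date`, else None."""
    candidates = [(rev, d) for rev, d in releases[item] if d <= on_date]
    if not candidates:
        return None  # nothing released yet on that date
    return max(candidates, key=lambda rd: rd[1])[0]

print(effective_revision("PUMP-100", date(2018, 3, 1)))  # "A"
print(effective_revision("PUMP-100", date(2018, 7, 1)))  # "B"
```

However simple, this is the guarantee a controlled repository gives its users: for any item and any date, exactly one revision is the effective one.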
Models instead of documents
But what if this knowledge is no longer captured as a document, written down in a Word document or Excel file, but as a model, expressed in a language that can also be understood by computer programs such as a machine-learning application? What impact will that have on identifying the knowledge artifacts we need to control, and on how you can perform an impact analysis and plan your implementation accordingly?
Let’s first define what a model is:
a digital graphical, conceptual, mathematical, logical and/or physical construct of a real system, subsystem or component designed to support activities in the value chain.
This does not yet give a lot of clues about how to identify knowledge artifacts. Is one model one knowledge artifact? Take, for example, a very simple model: the Bill of Material (BoM). A single-level BoM consists of uses-relationships between items, plus some attributes.
From a CM perspective you do not want to control an individual BoM-line (an individual relationship) but the entire BoM; one BoM-line does not mean much without the rest of the BoM-lines. Also, from an ownership perspective, the BoM has an owner; an individual BoM-line does not.
So the BoM is the knowledge artifact. But what is a knowledge artifact?
a set of information stored in digital or physical form that must be released as a whole and can be released separately from other knowledge artifacts.
The BoM is a set of information that must be released as a whole and can be released separately from other knowledge artifacts. For instance, a BoM can be released before you release the test specification or other BoMs.
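To make the distinction concrete, here is a minimal Python sketch of a single-level BoM in which release state lives on the BoM as a whole, not on the individual BoM-line. The item numbers, attributes, and class shapes are invented for illustration and do not reflect any particular PLM system:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class BomLine:
    """One uses-relationship: the parent uses `quantity` of `child_item`."""
    child_item: str
    quantity: int
    find_number: int  # an example attribute on the relationship

@dataclass
class Bom:
    """Single-level BoM: the knowledge artifact that is released as a whole."""
    parent_item: str
    revision: str
    lines: list = field(default_factory=list)
    released: bool = False

    def release(self):
        # The whole set of lines is released together; a BomLine
        # deliberately has no release state of its own.
        self.released = True

pump_bom = Bom("PUMP-100", "A", [
    BomLine("HOUSING-10", 1, 10),
    BomLine("IMPELLER-20", 1, 20),
    BomLine("SEAL-30", 2, 30),
])
pump_bom.release()
```

The design choice mirrors the argument above: `BomLine` is frozen and anonymous, while identification, ownership, and release status all attach to `Bom`.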
Is this valid for all models? Probably not. Take this example: a flight entertainment system MBSE model that contains more than 120 diagrams and is stored in a single XML file.
These diagrams express the requirements, functional, logical, and physical aspects of the system (often referred to as an RFLP construct) and are most likely not created by one person in the organization. Some diagrams are input to creating other diagrams. So anywhere from 4 to 120 knowledge artifacts could be contained in this model.
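One candidate way of carving such a single file into separately releasable artifacts is to group its diagrams by RFLP aspect. The sketch below assumes a hypothetical, simplified XML export; real MBSE tools use their own schemas, so the element and attribute names here are invented for illustration:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical export of a model file; a real tool's XML schema will differ.
xml_doc = """
<model name="FlightEntertainment">
  <diagram id="d1" aspect="requirements"/>
  <diagram id="d2" aspect="functional"/>
  <diagram id="d3" aspect="functional"/>
  <diagram id="d4" aspect="logical"/>
  <diagram id="d5" aspect="physical"/>
</model>
"""

root = ET.fromstring(xml_doc)
artifacts = defaultdict(list)
for diagram in root.iter("diagram"):
    # Each aspect group becomes one candidate knowledge artifact,
    # so the one file yields several separately releasable sets.
    artifacts[diagram.get("aspect")].append(diagram.get("id"))

for aspect, ids in sorted(artifacts.items()):
    print(aspect, ids)
```

Grouping by aspect gives the "4" end of the 4-to-120 range; treating every diagram as its own artifact gives the "120" end. Where the right cut lies depends on ownership and on what can be released as a whole.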
Why is this important you might ask?
Knowledge artifacts must be identified and structured in a way that facilitates change efficiently. Too granular is not efficient: for impact analysis you need enough detail to build a business case, and for implementation planning you do not want to plan a change to an individual BoM-line. It is easier to plan the whole BoM, with all the changes that need to be done in it, because this is done by one person. A knowledge artifact is the unit of work: the deliverable that can be planned, must be released as a whole, and can still be released separately from other knowledge artifacts.
This means that although models are used to document the requirements, functional, logical, and physical aspects of a system, CM still requires the ability to identify the knowledge artifacts within a model and to perform impact analysis, implementation planning, validation, and release of these artifacts accordingly.
From a CM2 methodology perspective, this means that a model can be a collection of multiple primary and secondary data sets (knowledge artifacts), as shown in the following picture.
The next article will focus on how to structure the various types of models as part of the As Planned/As Released baseline according to the CM2 methodology.
If you are interested you can also read:
This article was originally published on LinkedIn.
The True Impact of Industry 4.0 revealed – Martijn Dullaart, 2018
Industry 4.0 Impact to IPE-CM Rev C – IPX Congress Industry 4.0 Committee, 2017 (only accessible if you have a CM2-Comprehensive certification or higher)
A Glimpse into the Future of CM – Keynote ConX18 – IPX Congress Industry 4.0 Committee
The author is chair of the Industry 4.0 committee of the IPX Congress formed by cross industry leaders from AGCO – Gary DSouza, Airbus – Stephen Watts (Retired), ASML – Martijn Dullaart, Cummins – Greg Russ, FNSS – Mehmet Tunc, Gulfstream – Max Gravel, IpX – Todd Egan & Joe Anderson, and Northrop Grumman – Paul Nelson. Together they are working on a solution to help answer the following questions:
Do you also wonder why Configuration Management never gets mentioned in conjunction with Industry 4.0 or (Industrial) Internet of Things? Would you like to know the impact of Industry 4.0 on CM? What are the risks and opportunities? What will it mean to your CM processes? What requirements should IT tool vendors be made aware of and subsequently support? And what will be the impact to your organization and the people in it? How can you mitigate the risks and how can you prepare your organization to be ready for change?