One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product, and technology, the process of data modeling goes by different names, but fundamentally you can see it in all PLM implementations. It is the process that creates an information model of the products and processes of a specific company. It is not simple, and it requires significant preparation work, which is usually part of implementation services. Data modeling work is never done. Once created for the first implementation, the data model will be extended with new data elements and features as new business requirements arrive.
Fundamentally, the PLM data model is the core element of any PDM / PLM implementation and system. The ability to model design, engineering, and manufacturing data, as well as processes, is obviously very important. However, since modeling is about a company's products and processes, it is always unique to the organization. Old PDM systems were not flexible and required a physical change (rebuild, recompilation) to include specific product and process data model elements. Over the last decade, PDM / PLM systems have included elements of flexible data modeling. These tools have the capability to configure the data model. At the same time, even though coding is not required to extend the data model, the cost of “configuration” can be significant too.
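To make “flexible data modeling” a bit more concrete, here is a minimal sketch in Python of a metadata-driven data model, where item types and attributes are configuration data rather than compiled code. All names here are hypothetical; real PDM / PLM systems expose this through their own configuration tools, not through code like this.

```python
# A minimal sketch of a metadata-driven (configurable) data model.
# All names are hypothetical illustrations.

class ItemType:
    """A data model element defined by configuration, not compiled code."""

    def __init__(self, name, attributes):
        self.name = name
        self.attributes = attributes  # maps attribute name -> expected type

    def create(self, **values):
        """Instantiate an item, validating values against the configured attributes."""
        for attr, value in values.items():
            if attr not in self.attributes:
                raise KeyError(f"'{self.name}' has no attribute '{attr}'")
            if not isinstance(value, self.attributes[attr]):
                raise TypeError(f"'{attr}' must be {self.attributes[attr].__name__}")
        return {"type": self.name, **values}

# Extending the model is a configuration change, not a rebuild:
part = ItemType("Part", {"number": str, "description": str, "weight_kg": float})
bolt = part.create(number="P-1001", description="Hex bolt", weight_kg=0.012)
```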
PLM Data Model Uniqueness
What makes PLM data modeling so unique? Why do we need it? Could we avoid this process by supplying something generic that does not require changes for every customer? There are two extreme examples I want to bring up in the context of these questions.
One is about Excel (or spreadsheets). Basically, we can model almost everything in a spreadsheet. People love it, since it is flexible and simple to use. However, it gets complex over time and creates complicated dependencies between spreadsheets. You end up needing a full-time “Chief Excel Officer” in your organization to maintain the initially simple spreadsheets.
Another idea is to create a “universal PLM data model”. This data model can include “everything” an engineering and manufacturing organization might need. We can identify all the pieces and put them together once and for all. It might work, but over time you will face a growing demand to introduce changes in this “universal data model” to support new business requirements.
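As a small illustration of why this breaks down, here is a hypothetical sketch of such a fixed, “universal” schema. The field names are my own invention; the point is that every new business requirement forces a change to the model itself.

```python
from dataclasses import dataclass

# A sketch of the "universal data model" idea: one fixed schema that tries
# to anticipate everything. The field names are hypothetical.
@dataclass
class UniversalItem:
    number: str
    description: str
    revision: str
    lifecycle_state: str
    # ... dozens more "just in case" fields ...
    # A new requirement (say, a compliance flag) means changing the class
    # itself and releasing a new version of the model:
    # reach_compliant: bool
```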
Standards and Best Practices
For a long time, a “PLM standard” was a standing goal to simplify PLM implementations. It was demanded by customers as something that could resolve proprietary data modeling efforts. Over the last few decades, the industry has developed several useful standards applicable to PLM; one of the most notable is STEP (ISO 10303). In addition, large software and service vendors are introducing so-called “best practices”: a simplified way to deliver a data model for a specific segment of customers or industries. The fundamental difference between standards and best practices, in my view, is the level of “agreement” achieved between the parties involved in the activity.
The importance of PLM data modeling
PLM (or engineering and manufacturing) data models are an interesting topic and a real problem. In many cases, the data model defines the success of the implementation, or of the PLM software in general. It is a technical and a marketing issue at the same time. To the same degree that data modeling influences implementation and product architecture, it is also used as part of the marketing story. The PLM data model is key to the future success of PLM implementations.
PLM Model: Granularity, Bottom-Up and Change
Granularity
Granularity in data modeling is getting traction and, in my view, that is a very good sign. One of the problems in PLM is the diversity of implementations and needs. PLM tools have implemented lots of functional goodies over the past decade. However, customizations are messy and complicated. The current data model organization is outdated in most PLM systems these days. The last major PDM / PLM data modeling change was made back in 1995-2000, when the flexible data model concept was introduced. Since that time, data modeling has remained the same, with only small variations in configuration steps and in better user interfaces for data model configuration.
Bottom-up data modeling
How do you build an efficient data model for a PLM implementation? How do you build a model that answers specific customer needs? The current vendor proposal is to make a selection from a list of all possible “modules”: BOM, change management, project management, etc. It is usually presented as a list of functional modules.
I see a few problems with this process. Selecting big data model blocks can impose too many constraints and create complex dependencies and compatibility issues.
The idea of bottom-up data modeling relies on the capability to define very granular pieces of data and build them up, step by step, to the level that reflects customer requirements, as the sketch below illustrates.
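Here is a minimal sketch of the idea, assuming hypothetical element names: granular data elements are defined once and then composed into the schemas a specific customer needs, instead of adopting big prebuilt blocks.

```python
# Bottom-up data modeling, sketched with hypothetical element names.
# Step 1: define very granular, reusable data elements (the smallest pieces).
identifier = {"number": str}
description = {"description": str}
mass = {"weight_kg": float}
effectivity = {"effective_from": str, "effective_to": str}

def compose(*elements):
    """Build a larger schema step by step out of granular elements."""
    schema = {}
    for element in elements:
        schema.update(element)
    return schema

# Step 2: compose them up to the level a specific customer requires.
part_schema = compose(identifier, description, mass)
bom_line_schema = compose(identifier, effectivity, {"quantity": int})
```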
Cost of Change
Cost of change is one of the most critical characteristics of enterprise software these days. PLM software is a great example. PLM data models become inflexible and accumulate lots of dependencies on PLM system implementations. The combination of custom data models and implementations can create a situation where new features cannot be introduced. In extreme situations, it causes a so-called “version lock”. In other cases, the ROI of a PLM project to introduce new functionality is too low.
The future, in my view, is building very granular functional services alongside a bottom-up data model schema. It will reduce dependencies between components and, in the end, decrease the cost of change.
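A hedged sketch of what such a granular service could look like, continuing the hypothetical schema above: the service declares a dependency on only two data elements, so changes elsewhere in the model do not touch it.

```python
# A granular functional service on top of the bottom-up schema. It depends
# only on the "mass" and "quantity" data elements; the rest of the model
# can change without breaking it. Names are hypothetical.

def mass_rollup(components):
    """Sum the mass contribution of each BOM line."""
    return sum(c["weight_kg"] * c["quantity"] for c in components)

# The service works for any component carrying the two elements it needs,
# no matter what other attributes the data model adds later.
total = mass_rollup([
    {"weight_kg": 0.012, "quantity": 8, "description": "Hex bolt"},
    {"weight_kg": 1.500, "quantity": 1, "description": "Bracket"},
])
print(total)  # 1.596
```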
PLM data modeling has remained unchanged for too long: relational data modeling technologies, a single database schema, organizational dependencies and configurations. Future data modeling in PLM will have to deal with existing systems, but also propose a new way to model data in the age of connected systems, global manufacturing, and supply chains.
First of all, thanks for the wonderful post.
A basic question that comes to my mind is: how dependent are we on a PLM tool to get started on a data model (where we already have a good understanding of the process and data)?
I will also explain the reason I ask this question (it is also an afterthought after going through your post): if we start agnostic of a PLM tool (e.g., in Excel), we may end up defining a monolithic data model and may need to redo it once a tool is introduced. Is this true?
Thank you again