
An Insight to Truth

by Dreamer on 02-10-2016 09:22 PM

For some time, document management, PLM, and ERP systems have been banging the drum of becoming the company's single source of truth. I would like to take a different approach and discuss data management from the framework of an insight to truth. This approach is agnostic to where data is stored, and specific to how it is defined, processed, and presented to stakeholders for management of the product. The following is a flowchart of a typical company risk management process. (I will present a comprehensive view of Medical Device Risk Management in an upcoming blog post.)
Every one of the processes outlined is regulated. Not only is the content regulated, but also the workflow for processing, storage, and approval. For example, when processing a corrective action in the company QMS, correction definition and correction plan approval are potentially completed by different departments of the company. The correction itself is then completed by yet another department. All of this information is then driven to the product risk management system and, if required, to the product.

Frankly, the user doesn't care where the data resides. They only care whether it is current, approved, and easy to access. In a typical scenario, the data is driven to a document and implemented/approved as part of the change management system in the form of a document change order. As systems become more sophisticated, we are no longer just approving documents, but also the data elements that comprise the document, and those data elements are also referenced in other components of the company database. For example, corrections are often referenced not just in the CAPA, but also in a non-conforming material report (NCMR).

In summary: documents are an excellent way to present or compile data. They are a very poor method for interrelating data. What is needed is a comprehensive data model focused on the organization of the:
  • product requirements (inputs)
  • realization elements (outputs)
  • evidence of conformity
  • data/document workflows, including processes used to manage the product (e.g. CAPA, Risk Management, Complaints, etc.)
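To make the idea concrete, here is a minimal sketch of such a logical data element model in Python. All identifiers (element IDs like COR-007, the DataElement/DataModel names, and the link structure) are hypothetical illustrations, not any particular PLM or QMS schema. The point it demonstrates is the one above: a single correction element can be referenced by both a CAPA and an NCMR, rather than being copied into two documents.

```python
from dataclasses import dataclass, field


@dataclass
class DataElement:
    """One approved unit of product data (hypothetical model)."""
    element_id: str
    kind: str            # e.g. "requirement", "output", "evidence", "workflow"
    content: str
    links: set = field(default_factory=set)  # ids of related elements


class DataModel:
    """A pile of data elements related by bidirectional links."""

    def __init__(self):
        self.elements = {}

    def add(self, element):
        self.elements[element.element_id] = element

    def link(self, a, b):
        # Link both ways so either element can reach the other.
        self.elements[a].links.add(b)
        self.elements[b].links.add(a)

    def related(self, element_id, kind=None):
        # Everything directly associated with an element, optionally by kind.
        out = [self.elements[i] for i in self.elements[element_id].links]
        return [e for e in out if kind is None or e.kind == kind]


# One correction element, referenced by both the CAPA and the NCMR.
model = DataModel()
model.add(DataElement("REQ-001", "requirement", "Device shall ..."))
model.add(DataElement("OUT-014", "output", "Part drawing rev B"))
model.add(DataElement("COR-007", "workflow", "Correction: tighten tolerance"))
model.add(DataElement("CAPA-003", "workflow", "Corrective action record"))
model.add(DataElement("NCMR-021", "workflow", "Non-conforming material report"))
model.link("REQ-001", "OUT-014")
model.link("OUT-014", "COR-007")
model.link("CAPA-003", "COR-007")
model.link("NCMR-021", "COR-007")
```

Querying `model.related("COR-007")` returns the affected output, the CAPA, and the NCMR in one step; there is no document to re-approve in three places when the correction changes.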
The maturity and organization of the data pile are typically built in the following hierarchy:
As the design and risk management proofs are built, the data is populated into the elements. Naturally, the data and reports associated with a CAPA would be stored with, or linked to, the CAPA data element. The CAD files for a part would be stored with the part output data element in the context of the BOM (or DMR, in Medical Device vernacular). The DMR includes everything needed to build the product: work instructions, software code, inspection instructions, etc., not just the physical parts used in construction of the build hierarchy. Care must be taken to ensure appropriate approvals and that data are not duplicated, but in this model we don't really care whether the data is in one database or another. We are mostly concerned with logic and context: appropriate information at the point of need, with traceability to any related data element. The following is a comprehensive Risk Management / Design Control traceability flow diagram.
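The build hierarchy described above can be sketched the same way. This is a hedged illustration, not a real DMR structure: the item IDs, artifact names, and the `DmrItem` class are invented for the example. It shows how work instructions, CAD files, and inspection instructions attach to the data elements they belong to, while a single traversal still gathers everything needed to build the product.

```python
from dataclasses import dataclass, field


@dataclass
class DmrItem:
    """A node in the build hierarchy, with its attached artifacts (hypothetical)."""
    item_id: str
    description: str
    artifacts: list = field(default_factory=list)  # CAD files, work/inspection instructions
    children: list = field(default_factory=list)   # sub-items in the build hierarchy

    def all_artifacts(self):
        # Depth-first walk: everything needed to build this item and its children.
        found = [(self.item_id, a) for a in self.artifacts]
        for child in self.children:
            found.extend(child.all_artifacts())
        return found


# Artifacts live with the data element they describe, not in a flat document.
valve = DmrItem("P-100", "Valve body", ["valve_body.step", "WI-12 machining"])
seal = DmrItem("P-101", "Seal", ["seal.step", "INS-04 inspection"])
assembly = DmrItem("A-001", "Valve assembly",
                   ["WI-30 assembly", "fw_v2.bin firmware"],
                   [valve, seal])
```

Walking `assembly.all_artifacts()` yields every artifact in context, each traceable to the part or assembly it belongs to, which is the "appropriate information at the point of need" idea in miniature.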
It is easy to see in this logic diagram that product outputs (parts, assembly instructions, code, etc.) are directly related to, for example, production testing, manufacturing requirements, and product requirements. When the data elements are properly associated, every idea related to an output is easily and logically connected for immediate access.

Conclusion

The current data management paradigm is document management. This model is useful, but rigid, allowing only one-dimensional use, when in fact the data contained in these documents serves many purposes. While documents are an essential tool for collaboration with outside stakeholders, the document paradigm neglects the importance of logical granularity and tends to be static, updated only when the company reviews its base function. A logical data element model could be used to sidestep the rigid model and provide an active, cohesive structure that facilitates frequent update of the logical elements in the product data pile.