
The Importance of Data Ops & Context in Batch Manufacturing

Q&A with Michael Bartlett, Sales Director, Life Sciences & Vinodh Rodrigues, VP Customer Success | Quartic

Tell us about DataOps and context. Why are they important in batch manufacturing?

DataOps is the technical application of governance and standards to data in preparation for its use across an enterprise. This provides standardization of data, equipment, and processes for like systems across an organization. Context in batch manufacturing provides the "where" (equipment) and "when" (batch, unit operation, phase, etc.) for a given recipe. Analytics can then be run within the context of a particular piece of equipment or unit and across all levels of a batch, enabling batch-to-batch comparisons.

 

More specifically, what are ISA95 and ISA88, and why are they significant?

ISA95 is a standard created by the International Society of Automation (ISA) to establish common terminology for describing a manufacturing enterprise, covering both control functions and information exchange. ISA88 is another ISA standard that provides guidelines for batch process control. Together, they give vendors and manufacturers a common "language" for equipment and processes, which can be followed to optimize the configuration of equipment and systems for batch manufacturing.

 

What functions does a DataOps platform provide, looking at the data quality angle in addition to context?

A DataOps platform organizes and prepares data for use in a number of applications.  Features may include data cleansing and filtering, time alignment, enforcement of naming and equipment standards across the enterprise, and contextualizing equipment and process models. This prepared data is then served to end users, analytics, and applications for consumption via robust, secure, and timely data pipelines.
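
As a concrete illustration, here is a minimal sketch of what cleansing and time alignment might look like, using pandas. The sensor names, sample data, and outlier rule are hypothetical, not features of any particular platform.

```python
import pandas as pd

# Two hypothetical sensor streams sampled at different, irregular rates.
temp = pd.Series(
    [72.1, 72.4, 71.9],
    index=pd.to_datetime(
        ["2024-01-01 00:00:01", "2024-01-01 00:00:07", "2024-01-01 00:00:14"]
    ),
)
ph = pd.Series(
    [6.9, 7.2],
    index=pd.to_datetime(["2024-01-01 00:00:03", "2024-01-01 00:00:11"]),
)

def align(stream: pd.Series, period: str = "5s") -> pd.Series:
    """Cleanse one stream and resample it onto a common clock:
    drop nulls, clip extreme values, average per period, forward-fill gaps."""
    s = stream.dropna()
    s = s.clip(s.quantile(0.01), s.quantile(0.99))  # crude outlier filter
    return s.resample(period).mean().ffill()

# One time-aligned frame that downstream analytics can consume directly.
aligned = pd.DataFrame({"reactor_temp": align(temp), "reactor_ph": align(ph)})
print(aligned)
```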

 

What are the different levels of product/batch/ISA88 context and how does availability of data at each level help batch analytics?  

The ISA88 procedural model, which was originally linked with the ISA88 equipment model but can now be linked to the ISA95 equipment model, provides standard terminology for describing batch control. Starting from the recipe, the levels are batch, unit procedure, operation, and phase; some organizations even extend monitoring below the phase level to phase steps. Having data available at each of these discrete levels lets users compare corresponding steps from batch to batch, for example by overlaying sensor trends, and quickly spot differences. Often these differences, such as a cycle-time bottleneck or quality deviation, occur during a specific "problematic" operation or phase rather than over the entire batch. Being able to drill into these batch substeps, and better yet to apply ML-based advanced analytics to them, makes it easier for SMEs to pinpoint the root cause of batch performance issues to specific batch steps and process parameters.
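
To make the hierarchy concrete, here is a minimal sketch in Python of one way the procedural levels and their time windows might be represented so that analytics can slice data by step. The class and field names are illustrative, not part of the ISA88 standard.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProceduralElement:
    """One node in the procedural hierarchy: unit procedure, operation, or phase."""
    name: str
    level: str                       # e.g. "unit_procedure", "operation", "phase"
    start: datetime
    end: datetime
    children: list = field(default_factory=list)

def leaf_steps(element, path=""):
    """Flatten the hierarchy into (path, start, end) rows for the leaf steps,
    so the same step can be joined and compared across batches."""
    path = f"{path}/{element.name}"
    if not element.children:
        yield path, element.start, element.end
    for child in element.children:
        yield from leaf_steps(child, path)

# Comparing batches then reduces to joining these rows on the path and
# diffing step durations or per-step sensor statistics.
```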

 

How do DataOps platforms overcome lack of rich context coming from execution systems (MES and BES)?

A DataOps platform that is batch-enabled will have the inherent capability to use streaming or event data coming from the process control system as triggers for recording batch events. For example, a batch start may be indicated by the closing of an inlet valve, the starting of an agitator, and a level above a certain height; the end of a batch might be indicated by conditions on the same data points or on different ones. With the right inputs, a batch can be tracked at every step.
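
A minimal sketch of that trigger logic, assuming hypothetical tag names, thresholds, and sample data:

```python
# Condition-based batch-event detection from streamed tag values.
# Tag names, thresholds, and the sample feed are hypothetical.

START_CONDITIONS = {
    "inlet_valve_closed": lambda v: v == 1,
    "agitator_running":   lambda v: v == 1,
    "level_pct":          lambda v: v > 40.0,
}

def batch_started(snapshot: dict) -> bool:
    """True when every configured start condition holds on one snapshot."""
    return all(cond(snapshot[tag]) for tag, cond in START_CONDITIONS.items())

# Simulated feed of tag snapshots from the process control system.
tag_stream = [
    {"t": "08:00", "inlet_valve_closed": 0, "agitator_running": 0, "level_pct": 10.0},
    {"t": "08:05", "inlet_valve_closed": 1, "agitator_running": 1, "level_pct": 55.0},
]

in_batch = False
for snap in tag_stream:
    if not in_batch and batch_started(snap):
        in_batch = True
        print(f"batch_start detected at {snap['t']}")  # recorded as a batch event
```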

 

How can a DataOps platform with AI/ML capability overcome lack of rich context?  

AI/ML capability in a DataOps platform is rare, and it goes one step further than simple calculations on trigger tags. Classification models, a family of supervised machine learning methods, can detect and label patterns in the input data to identify the conditions that mark the start and end of batch events.
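
As an illustration only, not a description of any vendor's model, here is a sketch using scikit-learn in which a classifier labels process snapshots as idle or in-batch, so that label transitions mark batch starts. The features, labels, and training data are invented for the example.

```python
# Illustrative only: a supervised classifier labels process snapshots as
# idle (0) or in-batch (1); a 0 -> 1 transition marks a batch start.
from sklearn.ensemble import RandomForestClassifier

# Feature rows: [agitator_power_kw, level_pct, temp_c]
X_train = [[0.0, 8.0, 21.0], [0.1, 10.0, 22.0], [4.2, 60.0, 37.0], [4.0, 58.0, 36.5]]
y_train = [0, 0, 1, 1]  # in practice, labeled by SMEs from historian data

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Label a new stream of snapshots and watch for the idle -> in-batch transition.
stream = [[0.0, 9.0, 21.5], [4.1, 57.0, 36.0]]
labels = clf.predict(stream)
for prev, curr in zip(labels, labels[1:]):
    if prev == 0 and curr == 1:
        print("batch start detected")
```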

 

Tell us about "context-in-motion" and why is it important in batch optimization?  

Context-in-motion is a term that describes a capability of Quartic.ai's DataOps pipeline. A typical DataOps platform will store a reference to a data stream but leave the data in place. The context engine in Quartic.ai goes beyond pointers: the stream processing service automatically contextualizes incoming data as it passes through the DataOps engine. This enables more performant and scalable real-time analytics for closed-loop optimization use cases, where sub-second processing is necessary.
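
To illustrate the general idea (this is a sketch, not Quartic.ai's implementation), a stream processor that enriches each point in-flight might look like the following, with a hypothetical context lookup and tag names:

```python
# General illustration of in-flight contextualization. Each raw point is
# enriched with where/when context as it passes through the pipeline,
# instead of storing a pointer and joining context later.

CONTEXT = {  # hypothetical lookup maintained by a context engine
    "TT-101": {"equipment": "Bioreactor-1", "unit_procedure": "Ferment", "phase": "Heat-Up"},
}

def contextualize(point: dict) -> dict:
    """Attach equipment and batch context to one raw data point."""
    return {**point, **CONTEXT.get(point["tag"], {})}

raw = {"tag": "TT-101", "value": 36.8, "ts": "2024-01-01T08:05:00Z"}
print(contextualize(raw))
# Downstream consumers receive fully contextualized records immediately,
# which is what makes sub-second, closed-loop analytics feasible.
```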

 

Why is industry so interested in moving DataOps functions to the cloud?

The interest in moving digital capabilities to the cloud isn't unique to DataOps. Cloud infrastructure scalability is available to any software that is natively built to take advantage of microservices. Over the last decade, organizations have increasingly embraced the virtually limitless infrastructure resources a cloud vendor can provide, along with offloading the FTEs required to support infrastructure and cybersecurity internally. DataOps functions that run natively in the cloud and have the versatility to connect to many types of data sources, especially at the edge, are in demand because they make cleaned, contextualized manufacturing data available to any digital consumer that also runs in the cloud.

 

What standards and certifications do cloud vendors need to obtain to give customers confidence in security, availability and reliability of their SaaS offering(s)?

It is recommended that cloud vendors obtain, at a minimum, two compliance certifications to show that they conform to high standards in third-party management of customer data: SOC 2 (Type 1 and Type 2) and ISO 27001. The SOC 2 framework was developed by the American Institute of Certified Public Accountants. A SOC 2 Type 1 report details the systems and controls the vendor has in place for security compliance, with auditors verifying whether the appropriate trust principles are met at a point in time. A SOC 2 Type 2 report assesses how effective those processes are at providing the desired level of data security and management over a period of time. ISO 27001 is built around implementing an Information Security Management System (ISMS) appropriate to the organization; once the ISMS is developed, a risk assessment and risk treatment plan, with continual monitoring, are put in place.

 

How can a DataOps platform be made scalable?

A DataOps platform can be made scalable both in how it consumes resources and in how it technically scales to fit an enterprise. From the resource perspective, a platform built from cloud-native microservices and running in containers is best suited to take advantage of the elasticity of both on-premises cloud and virtual private cloud infrastructure. Pipelines that use cloud-native data exchange protocols like Kafka will interact with those microservices more efficiently. Technical scaling follows the principle of "build once, deploy many": the DataOps platform allows creation of equipment and process classes (templates) for each unique asset and process type, which can then be quickly instantiated for any like asset and process across the enterprise.
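
A minimal sketch of the "build once, deploy many" idea, with invented template and asset names:

```python
# Define an equipment class (template) once, then instantiate it for every
# like asset across the enterprise. All names are illustrative.

BIOREACTOR_CLASS = {
    "tags": ["temp_c", "level_pct", "agitator_rpm"],
    "events": {"batch_start": "agitator_rpm > 10 and level_pct > 40"},
}

def instantiate(asset_id: str, template: dict) -> dict:
    """Bind the class template to a concrete asset's tag namespace."""
    return {
        "asset": asset_id,
        "tags": [f"{asset_id}.{tag}" for tag in template["tags"]],
        "events": template["events"],
    }

fleet = [instantiate(a, BIOREACTOR_CLASS) for a in ["BR-101", "BR-102", "BR-201"]]
print(fleet[0]["tags"])  # ['BR-101.temp_c', 'BR-101.level_pct', 'BR-101.agitator_rpm']
```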

 

What types of data source integration and data types are necessary for a comprehensive DataOps platform to support batch optimization?

If life sciences batch manufacturing is the target industry and automated batch optimization is the functionality with the highest value, a DataOps platform should support all data sources that help achieve that goal. For automation systems, the protocol standard is generally OPC UA; if the goal is to send optimized setpoints back to the automation and control system, bi-directional OPC UA is required. MQTT is becoming a widely used communication protocol for IoT and edge sensors. For benchtop analyzers, a validatable file ingestion mechanism should be available to turn batched data files into streaming data. Connectivity to Manufacturing Execution, Batch Execution, and Lab Management Systems is usually done via SQL, and the DataOps platform must be able to create batch events from those systems. Finally, for in-line analysis, the system must support consumption and storage of spectral data and metadata, whether directly via a protocol or through standard file output.
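
For example, edge ingestion over MQTT might look like the following sketch, using the paho-mqtt client (1.x callback API); the broker address, payload shape, and topic hierarchy are hypothetical.

```python
# Sketch of edge sensor ingestion over MQTT into a DataOps pipeline.
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    """Parse one sensor reading and hand it to the pipeline."""
    reading = json.loads(msg.payload)  # e.g. {"value": 7.1, "ts": "..."}
    print(msg.topic, reading)          # a real pipeline would contextualize and store it

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.plant.example", 1883)  # hypothetical broker
client.subscribe("site1/bioreactor1/ph")      # hypothetical topic layout
client.loop_forever()
```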

 
 
The content & opinions in this article are the author’s and do not necessarily represent the views of ManufacturingTomorrow
