Biomanufacturing in the life sciences industry is complex and intensely competitive, with key players turning to data and AI to extract more leverage from their manufacturing processes. For one multinational pharmaceutical and biotech company, developing smarter manufacturing practices was a critical part of their strategy to meet their 5-year business targets. By improving overall equipment effectiveness (OEE), reducing the number and frequency of product deviations, and increasing overall worker productivity, they expected to maximize the financial impact of upcoming blockbuster drugs.
The Challenge
Leadership saw an important opportunity to build better visibility across their manufacturing sites and to deploy new digital and AI-based tools that would help optimize processes and drive productivity. They wanted to compare performance more easily across product lines, sites, and production processes, identify underperformers and inefficiencies, and recommend corrective improvements to increase throughput, quality, and yield.
But while they had a good way of collecting and storing certain enterprise data through onsite historians and data warehouses at each site, the data itself was limited and difficult to use. Data was represented differently from site to site and stored in silos, which made it challenging to find and pull all relevant IT, OT, and ET data for a particular asset or process, and very difficult to get an apples-to-apples comparison of site performance. Additionally, deploying and scaling new digital use cases was incredibly labor-intensive, and the program leadership team didn't see a clear path to leveraging the full potential of AI.
The Journey
The digital transformation team had already started reworking parts of their overall technology architecture to meet current and future needs. But they still needed a way to integrate and contextualize siloed data, including production data, quality data, genealogy information, and lab equipment telemetry, into a unified digital twin for each site.
After careful research, they selected the Cognite Industrial AI and Data Platform, which includes Cognite Atlas AI™ and Cognite Data Fusion®, and began deploying it at key sites worldwide. Uniquely powered by an industrial knowledge graph, high-performance data storage, and an open API, Cognite is the first secure landing point for the customer's data in the cloud, contextualizing historian data (from HiveMQ), SAP, LIMS, and ERP data (from SnapLogic), and key documents (from Veeva), among other data sources. With Cognite, the customer is able to connect, onboard, and contextualize data from any source into standard data models without vendor lock-in.
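Cognite's knowledge graph and APIs handle this contextualization at enterprise scale, but the underlying idea can be illustrated with a minimal sketch. The plain-Python example below (it does not use the Cognite SDK, and all record types, field names, and identifiers are hypothetical) joins historian tags, ERP work orders, and LIMS results onto a shared asset key to build a simple per-asset view of the kind a unified data model makes possible.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified records standing in for the real source systems.
@dataclass
class HistorianTag:            # e.g. process data landed from a historian
    tag: str
    site: str
    local_equipment_id: str

@dataclass
class ErpWorkOrder:            # e.g. a work order extracted from an ERP system
    order_id: str
    site: str
    functional_location: str

@dataclass
class LimsResult:              # e.g. a batch quality result from a LIMS
    sample_id: str
    site: str
    equipment_ref: str
    result: float

@dataclass
class AssetTwin:
    """A unified, per-asset view built from contextualized source data."""
    asset_key: str             # standardized asset identifier shared across sites
    tags: list = field(default_factory=list)
    work_orders: list = field(default_factory=list)
    lab_results: list = field(default_factory=list)


def contextualize(asset_map, tags, orders, results):
    """Link records from each silo to a standardized asset key.

    asset_map maps (site, source-system identifier) to a standardized asset key,
    mirroring the kind of mapping a knowledge graph maintains at scale.
    """
    twins = {}

    def twin_for(site, source_id):
        key = asset_map.get((site, source_id))
        if key is None:
            return None        # unmapped records are left for later review
        return twins.setdefault(key, AssetTwin(asset_key=key))

    for t in tags:
        tw = twin_for(t.site, t.local_equipment_id)
        if tw:
            tw.tags.append(t.tag)
    for o in orders:
        tw = twin_for(o.site, o.functional_location)
        if tw:
            tw.work_orders.append(o.order_id)
    for r in results:
        tw = twin_for(r.site, r.equipment_ref)
        if tw:
            tw.lab_results.append(r.result)
    return twins


if __name__ == "__main__":
    # Each source system refers to the same bioreactor by a different local ID.
    asset_map = {
        ("site-a", "TIC-1001"): "bioreactor-01",
        ("site-a", "FL-BR-01"): "bioreactor-01",
        ("site-a", "EQ-0042"): "bioreactor-01",
    }
    twins = contextualize(
        asset_map,
        tags=[HistorianTag("site-a.TIC-1001.pv", "site-a", "TIC-1001")],
        orders=[ErpWorkOrder("WO-778", "site-a", "FL-BR-01")],
        results=[LimsResult("S-1", "site-a", "EQ-0042", 98.6)],
    )
    for twin in twins.values():
        print(twin)
```

The key design point is the shared asset key: once every silo's local identifiers resolve to the same standardized asset, cross-site, apples-to-apples comparisons and downstream AI use cases become straightforward queries rather than one-off integration projects.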
Unlocking Business Value
The customer sees Cognite as a core enabler for rapidly scaling use cases with agentic user experiences that can solve a myriad of business problems across operations, management, maintenance, and quality control. These AI and data capabilities lay the foundation for easier data access, smarter digital and AI use case development, and rapid enterprise-wide expansion over the next 12-48 months.
Today, the customer is prioritizing key data and AI-driven use cases based on common demand from their manufacturing sites. These include:
- Improving schedule adherence through agentic batch tracking and decision support
- Improving productivity and uptime by equipping production and maintenance teams with agentic maintenance planning capabilities
- Improving productivity for process experts through agentic process intelligence applications
With Cognite, they are able to move faster on a shared data foundation and have deployed asset twins at two sites to date. This is already enabling them to start driving OEE improvements while increasing quality and yield, setting them on track to meet their 5-year business goals.
