When one of the world’s healthcare giants, GE Healthcare, reached out to us with a production challenge, we knew it wouldn’t be a typical project. Operating in more than 100 countries and generating nearly $20 billion in revenue in 2024, GE Healthcare is a household name in the global pharmaceutical and medical technology space.

Their Norwegian branch asked us to design an AI system that could detect anomalies – specifically tipped vials moving along conveyor belts – and alert operators without interrupting existing operations. What sounded straightforward at first quickly proved far more complex than it would be in most industries, given the strict regulations and constraints of pharmaceutical manufacturing.

The Background: Why AI Is Hard to Implement in Pharma

Some industries embrace AI quickly, while others move more carefully. Pharma belongs to the second group – not because of a lack of technology, but because it operates under some of the strictest rules. Every step, from drug formulation to final packaging, must comply with regulations like Good Manufacturing Practice (GMP) and FDA/EMA guidelines. Even the smallest deviation can have serious consequences, both for business and for patient safety.

In sectors like retail or education, AI models can be retrained and deployed quickly. In pharma, however, even the tiniest adjustment to a process or system demands thorough documentation, testing, and regulatory approval. That makes AI adoption slower and more complex, but it also highlights just how valuable and transformative a successful implementation can be.

The Lack of Reliable, Proven Methods for Validating AI-Based Solutions

Pharmaceutical software validation is typically based on GAMP-5 guidelines (Good Automated Manufacturing Practice). Whether a system requires validation depends on its impact on the final product – and ultimately on patient health and safety.

Traditional IT systems can usually be validated through deterministic testing, but AI models are inherently non-deterministic. Their results depend on probabilistic reasoning and data distributions, which makes conventional validation methods hard to apply. If an AI-based system were subject to full validation, it would require extensive documentation, strict lifecycle management, dataset governance, model versioning, and rigorous testing procedures – an enormous overhead that often makes AI adoption nonviable.

In this project, the situation was different. Our anomaly detection system did not directly affect the product or patient safety, but rather helped operators react earlier – and more cost-effectively – to incidents that would eventually be detected anyway. Thus, the solution didn’t require GAMP-5 validation, which made its implementation feasible.

When Modifying the Production Line Isn’t an Option

The main goal of the project was to stop tipped vials from reaching the filling machine – a problem that caused costly downtime. Before our solution, operators had to monitor production manually, either by adjusting machine sensors or reconfiguring equipment depending on the vial type (glass or plastic, ranging from 7 to 100 ml) used in each production cycle. Manual monitoring was time-consuming and led to delayed defect detection, longer downtimes, and higher costs. Early anomaly detection is therefore crucial to prevent such losses.

We needed a smarter approach: a system that could spot anomalies in real time, without constant manual adjustments. This challenge became the starting point for our AI-enabled camera solution – transforming how production monitoring was done at GE Healthcare.

Camera as a Sensor

In pharmaceutical production, making changes to the line itself – like installing new AI sensors – is rarely an option. It’s expensive, time-consuming, and requires layers of regulatory approval. Instead, we had to think differently. Our idea was simple but impactful: use a camera as the sensor.

We designed a solution built around three core components:

  • an industrial-grade Basler ace camera with a low-distortion lens,
  • an edge computing unit optimized for AI (reServer Industrial J4012),
  • and custom machine learning software.

Together, these elements created a system that could detect tipped vials in real time, trigger alarms to alert operators, and – after further training – even distinguish between different vial types. In other words, a camera became the “eyes” of the production line, turning a familiar device into a smart anomaly detection system.

From Idea to Full Functionality

Turning the concept into a working system required careful integration of hardware, software, and production-specific constraints. Step by step, we built a solution that could operate reliably in a demanding pharmaceutical environment.

Physical Setup

We optimized the camera and lens configuration to capture images of vials as they moved along the conveyor belt. In the proof of concept stage, the vision system was placed behind a plastic curtain so we could test it without disrupting production. We also designed custom stands and protective cases that can be easily disinfected, along with mounting fixtures that allow the camera to be installed non-invasively in the production environment.

Computer Vision Analysis

The camera continuously captured images, and each frame was processed in real time to determine whether vials were standing or tipped, to count them, and to identify the vial type. To achieve this, we built dedicated software powered by a deep learning computer vision model, training a convolutional neural network (CNN) to recognize vial positions with high accuracy. Vials classified as standing were then passed to the counting module.
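The production system makes this per-vial decision with a trained CNN, but the decision it feeds downstream can be illustrated with a much simpler geometric stand-in: a standing vial's bounding box is taller than it is wide, while a tipped vial's is wider than tall. The detection structure, names, and aspect-ratio rule below are illustrative assumptions, not the actual model.

```python
# Simplified stand-in for the CNN's per-vial decision: classify each
# detected vial as "standing" or "tipped" from its bounding-box shape.
# (The real system uses a trained convolutional network; the detection
# fields, names, and aspect-ratio rule here are illustrative only.)
from dataclasses import dataclass

@dataclass
class Detection:
    x: int       # bounding-box top-left corner, in pixels
    y: int
    width: int   # bounding-box size, in pixels
    height: int

def classify_vial(det: Detection) -> str:
    """A standing vial appears taller than wide; a tipped one wider than tall."""
    return "standing" if det.height > det.width else "tipped"

def count_frame(detections: list[Detection]) -> dict:
    """Count standing vials and flag the frame if any vial is tipped."""
    labels = [classify_vial(d) for d in detections]
    return {
        "standing": labels.count("standing"),
        "tipped": labels.count("tipped"),
        "alarm": "tipped" in labels,
    }

frame = [Detection(10, 5, 30, 90), Detection(60, 5, 32, 88), Detection(120, 70, 85, 28)]
print(count_frame(frame))  # → {'standing': 2, 'tipped': 1, 'alarm': True}
```

Keeping the classification per vial, rather than per frame, is what lets the same pass feed both the alarm logic and the counting module.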

Sound and Light Alerts

Whenever the system detected a tipped vial, it sent a digital input/output (DIO) signal and immediately triggered both a light and sound alarm on the tower, alerting machine operators without delay.
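The DIO write itself is hardware-specific, but the logic around it can be sketched as a small latch: raise the output on the first frame containing a tipped vial and hold it until an operator acknowledges, so the tower fires once per incident rather than once per frame. The `AlarmLatch` interface and callback below are assumptions for illustration, not the deployed code.

```python
# Sketch of an alarm latch around the DIO output: raise the signal on the
# first tipped-vial frame, hold it until acknowledged. The actual digital
# I/O write on the edge unit is hardware-specific and stubbed out here.
from typing import Callable

class AlarmLatch:
    def __init__(self, set_dio: Callable[[bool], None]):
        self._set_dio = set_dio   # e.g. a wrapper around the edge unit's DIO pin
        self.active = False

    def on_frame(self, tipped_detected: bool) -> None:
        """Called once per analyzed frame."""
        if tipped_detected and not self.active:
            self.active = True
            self._set_dio(True)   # light and sound tower on

    def acknowledge(self) -> None:
        """Operator resolved the incident; drop the signal."""
        if self.active:
            self.active = False
            self._set_dio(False)

signals = []
latch = AlarmLatch(signals.append)
for tipped in [False, False, True, True]:  # two tipped frames, one alarm
    latch.on_frame(tipped)
latch.acknowledge()
print(signals)  # → [True, False]
```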

Data Visualization

All analyzed data was stored in a local InfluxDB database and visualized in Grafana dashboards, providing operators with clear, real-time insights into production performance and anomaly trends.
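At the storage layer, each analyzed frame reduces to a timestamped point. A minimal sketch of such a point in InfluxDB's plain-text line protocol follows; the measurement, tag, and field names are invented for illustration, and a real deployment would more likely go through the official InfluxDB client library.

```python
# Sketch of one frame's results as an InfluxDB line-protocol point:
# measurement, tags, fields, nanosecond timestamp. The measurement, tag,
# and field names are illustrative assumptions, not the deployed schema.
def to_line_protocol(line_id: str, standing: int, tipped: int, ts_ns: int) -> str:
    # Integer fields carry an 'i' suffix in line protocol.
    return f"vials,line={line_id} standing={standing}i,tipped={tipped}i {ts_ns}"

point = to_line_protocol("filler-1", 12, 1, 1_700_000_000_000_000_000)
print(point)  # → vials,line=filler-1 standing=12i,tipped=1i 1700000000000000000
```

Once points like these land in InfluxDB, Grafana dashboards can query them directly for real-time counts and anomaly trends.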

Not Without Challenges

Every custom project comes with its own set of hurdles, especially when tested in a real production environment. During implementation, we faced three main challenges:

Challenge 1: False alarms in case of obstructions in the camera’s view

At first, the system occasionally triggered false alarms when an operator’s hand entered the field of view. The initial model had been trained only on clear images, without obstructions, as such cases were not expected in the production environment. We addressed this by enriching the training dataset with images that included hands and other interruptions. As a result, the model learned to ignore these obstructions, preventing false alarms.

Challenge 2: False alarms caused by vials stuck at the conveyor bends

In the initial proof of concept, the system was designed to recognize vials only along a short, straight section of the conveyor belt. However, we discovered that tipped vials often got stuck at the curved part of the format and never reached the original observation area. To address this, we expanded the region of interest to cover the entire conveyor format, including the bend.

During early production tests, some standing vials were misclassified as tipped when stuck in the bend. This issue occurred because the training dataset had not included such cases, as upright vials rarely got stuck during initial data collection. We resolved this by extending the dataset with new images reflecting these problematic scenarios. This significantly reduced the false alarm rate.
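Expanding the region of interest from a straight segment to the full conveyor format, bend included, essentially means swapping a rectangular ROI for a polygonal one. Below is a sketch using the standard ray-casting point-in-polygon test; the coordinates and shape are invented to suggest a belt with a bend, not taken from the actual line geometry.

```python
# Sketch of a polygonal region of interest covering the full conveyor
# format, bend included, instead of the original straight rectangle.
# Standard ray-casting point-in-polygon test; coordinates are invented.
def in_roi(x: float, y: float, polygon: list[tuple[float, float]]) -> bool:
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from (x, y) cross edge (x1,y1)-(x2,y2)?
        if (y1 > y) != (y2 > y) and x < x1 + (x2 - x1) * (y - y1) / (y2 - y1):
            inside = not inside
    return inside

# L-shaped region: the straight section plus the bend at the end of the belt.
conveyor = [(0, 0), (100, 0), (100, 60), (60, 60), (60, 20), (0, 20)]
print(in_roi(80, 50, conveyor), in_roi(30, 50, conveyor))  # → True False
```

With this in place, a vial stuck in the bend still falls inside the monitored region, so the model sees it instead of it silently never reaching the original observation area.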

Challenge 3: Plastic curtain causing reflections

During testing, we encountered issues with the plastic curtain placed in front of the camera. The curtain caused reflections, blurriness, and occasional ghosting of objects, all of which affected image quality in unpredictable ways. The severity of these distortions varied depending on external conditions – for example, sunlight entering the cleanroom through a nearby window significantly increased reflections on the curtain surface.

These factors reduced the system’s ability to consistently detect vial positions. Moving the camera inside the unidirectional airflow (UDAF) filtered area will provide a clear, unobstructed view of the vials and ensure stable image quality, independent of changing lighting conditions.

Measurable Results

After months of development, testing, and fine-tuning in a real production environment, the system proved its effectiveness. Not only could it detect tipped vials with remarkable precision, but it also added valuable functionalities like vial counting and downtime tracking – demonstrating the full potential of using AI in a highly regulated pharmaceutical setting.

The system enabled operators to react earlier to incidents, reducing production line downtime and associated delay costs.

99.89% Accuracy in Anomaly Detection

Pilot tests conducted under real production conditions demonstrated that a camera-based sensor system can reliably detect tipped vials. Our solution achieved 99.89% accuracy, regardless of vial size or material, proving that AI can operate effectively in a complex pharmaceutical environment.

Downtime Tracking

By analyzing vial movement, the system can track machine downtime with high precision. Stoppages are counted only after 10 seconds of no vial movement, ensuring accurate monitoring without false triggers.
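The 10-second rule above amounts to a small state machine over observed vial movement: a stoppage is counted exactly once when movement has been absent for the threshold duration. A minimal sketch follows; the threshold comes from the description, while the class and method names are assumptions.

```python
# Minimal downtime tracker matching the rule above: a stoppage is counted
# only once no vial movement has been seen for 10 consecutive seconds,
# and only once per stoppage. Timestamps are plain seconds here; the
# real system would derive them from frame times.
STOP_THRESHOLD_S = 10.0

class DowntimeTracker:
    def __init__(self):
        self.last_movement = None
        self.stoppages = 0
        self._stopped = False

    def observe(self, now: float, movement: bool) -> None:
        if movement:
            self.last_movement = now
            self._stopped = False
        elif (self.last_movement is not None
              and not self._stopped
              and now - self.last_movement >= STOP_THRESHOLD_S):
            self.stoppages += 1      # count this stoppage exactly once
            self._stopped = True

tracker = DowntimeTracker()
for t, moving in [(0, True), (5, False), (9, False), (12, False), (20, True), (35, False)]:
    tracker.observe(t, moving)
print(tracker.stoppages)  # → 2
```

Short pauses under 10 seconds never increment the counter, which is what keeps normal belt gaps from registering as false stoppages.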

Scalability

The project confirmed that a single hardware setup can handle multiple tasks – such as anomaly detection and downtime tracking – and can be further expanded to incorporate additional functionalities without new hardware components.

Non-Invasiveness

Using a camera as a sensor eliminated the need to modify or interfere with the machinery itself. This non-invasive approach respects legal and operational constraints, allowing production to continue uninterrupted.

Flexibility

The system is highly flexible: settings can be updated, and new features added, simply by modifying the software – no hardware changes required. This is particularly valuable when the AI model needs to learn from new events that were not previously encountered in the production line.

Summary

This project demonstrates how AI can transform pharmaceutical production, even in highly regulated environments. By using a camera as a sensor, we developed a non-invasive, flexible system capable of detecting tipped vials in real time, tracking machine downtime, and providing actionable insights through intuitive dashboards. The solution achieved 99.89% accuracy, proving that AI can operate reliably under the strict constraints of pharmaceutical manufacturing.

Beyond solving the immediate challenge, the system was designed for adaptability. When GE Healthcare requested additional functionalities, we were able to extend the capabilities, modifying only the software layer. This highlights the flexibility and scalability of AI-driven solutions, paving the way for further innovations on the production line.

In short, the project not only improved operational efficiency and minimized downtime – and its associated costs – but also created a robust foundation for ongoing AI-enabled improvements, demonstrating what’s possible when technology meets real-world manufacturing challenges.