Manufacturing

92% Accuracy: How Students Built a Computer Vision System That Transformed a Manufacturing Line

Students created an advanced computer vision model for a manufacturing company, achieving 92% defect detection accuracy and cutting inspection time by 60%, while deploying the model directly on the factory floor.

Learners: Vikram Joshi, Meera Nair, Siddharth Patil


The Challenge

An automotive manufacturing company produces steel components where every part must pass visual quality inspection. Surface defects like cracks, scratches, dents, and corrosion can cause parts to fail in the field — and in automotive, that means recalls, lawsuits, and reputational damage.

The inspection process was entirely manual. A team of 12 inspectors stood at the end of the production line, visually examining each part under bright lights, marking defects on paper forms, and sorting parts into “pass” and “fail” bins. It was slow, subjective, and inconsistent. One inspector might flag a scratch that another would ignore. By the end of an 8-hour shift, fatigue set in and accuracy dropped further.

The numbers told the story: a 6% defect escape rate (defective parts shipped to customers), a 12% false positive rate (good parts rejected unnecessarily), and inspection itself was the bottleneck — production line speed was capped at 40 parts per minute because that was as fast as the inspectors could work.

The company wanted to automate inspection with computer vision. But this wasn’t an academic problem — they needed a model that could run in real-time on actual factory hardware, in variable lighting conditions, with dusty cameras and vibrating equipment.

Our Approach

Three students — Vikram, Meera, and Siddharth — took this on as their capstone project. Over 12 weeks, they went from understanding the manufacturing domain to deploying a working model on the factory floor.

The first challenge was data. The company had thousands of historical inspection records but almost no labeled images. The students spent the first 3 weeks building a custom dataset — they collected 15,000 images across 4 production shifts, labeled them by cross-referencing with inspector logs, and augmented the dataset with realistic transformations: simulated dust, brightness variations, and motion blur to match real factory conditions.
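The augmentation step can be sketched in a few lines. The team's actual pipeline isn't shown in this write-up (production work typically uses a library such as albumentations), so the function names and parameters below are illustrative, operating on a grayscale image as a list of pixel rows:

```python
import random

def jitter_brightness(img, max_delta=40):
    """Shift every pixel by one random offset, clamped to [0, 255].

    Approximates the lighting variation between factory shifts.
    """
    delta = random.randint(-max_delta, max_delta)
    return [[min(255, max(0, p + delta)) for p in row] for row in img]

def motion_blur_h(img, k=3):
    """Average each pixel with its k-1 right-hand neighbors.

    A crude horizontal motion blur, mimicking parts moving past
    the camera on the line.
    """
    out = []
    for row in img:
        blurred = []
        for x in range(len(row)):
            window = row[x:x + k]
            blurred.append(sum(window) // len(window))
        out.append(blurred)
    return out

# A tiny 2x4 grayscale "image" for illustration.
img = [[10, 200, 10, 200],
       [50, 50, 250, 250]]
augmented = motion_blur_h(jitter_brightness(img))
```

Each original image can be passed through several randomized combinations of transforms like these, multiplying the effective size of the labeled dataset.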

The Model Architecture. They started with a pre-trained ResNet-50 backbone and fine-tuned it on the defect detection task. But after early experiments, they switched to a custom multi-head architecture — instead of just outputting “defective or not,” the model predicted defect type, severity, and location simultaneously. This gave operators actionable information at a glance: a minor scratch (probably fine for non-critical components) versus a crack (always reject).
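The operator-facing logic this enables can be sketched as a small policy over the model's three heads. The actual rules used on the line are not published, so the defect labels and severity thresholds here are hypothetical:

```python
# Hypothetical triage policy over the multi-head output.
# Defect types and thresholds are illustrative, not the
# company's actual rules.

ALWAYS_REJECT = {"crack", "corrosion"}

def triage(defect_type, severity, critical_part):
    """Map (defect type, severity 0-1, part criticality) to an action."""
    if defect_type in ALWAYS_REJECT:
        return "reject"  # structural defects: never ship
    if critical_part:
        return "reject" if severity >= 0.3 else "pass"
    # Non-critical components tolerate minor cosmetic defects.
    return "reject" if severity >= 0.7 else "pass"
```

Because the model reports type and severity rather than a bare pass/fail bit, a policy like this can be tuned per component without retraining the model.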

Edge Deployment. This is where most academic projects fall apart. The model had to run on an NVIDIA Jetson edge device mounted on the production line, processing images in under 200ms per part at 40 parts per minute. The students:

  1. Quantized the model from FP32 to INT8, reducing inference time from 180ms to 45ms
  2. Optimized the image preprocessing pipeline to avoid unnecessary CPU-GPU transfers
  3. Built a fallback system — if the model’s confidence was below 0.85, the part was flagged for manual review instead of auto-rejected
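The fallback in step 3 amounts to gating the model's verdict behind its confidence. The 0.85 threshold comes from the write-up above; the function and label names are illustrative:

```python
CONF_THRESHOLD = 0.85  # the deployed fallback threshold

def route(prediction, confidence):
    """Gate the model's verdict behind its confidence score.

    Low-confidence parts go to a human instead of being
    auto-rejected, which keeps the false-positive cost down.
    prediction: "defective" or "ok" (illustrative labels).
    """
    if confidence < CONF_THRESHOLD:
        return "manual_review"
    return "reject" if prediction == "defective" else "pass"
```

The threshold directly trades off throughput against safety: raising it sends more parts to human review, lowering it trusts the model on more borderline cases.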

Key Metrics

92%
Defect Detection Accuracy
60%
Faster Inspection
6% → 1.2%
Defect Escape Rate
45ms
Inference Time Per Part

Results & Impact

After 4 weeks of shadow deployment (running alongside human inspectors to validate accuracy), the system went live. The results exceeded expectations.

The defect escape rate — the percentage of defective parts that shipped to customers — dropped from 6% to 1.2%. The false positive rate dropped from 12% to 3%, meaning far fewer good parts were being thrown away unnecessarily. At the company’s production volume, reducing false positives alone saved an estimated $340,000 per year in wasted material.

The production line speed increased from 40 to 65 parts per minute because inspection was no longer the bottleneck. The model processed each part in 45ms — faster than any human could blink.
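A quick sanity check on those numbers (back-of-the-envelope arithmetic, not from the original write-up) shows how much headroom the quantized model left:

```python
def cycle_time_ms(parts_per_minute):
    """Time available per part at a given line speed."""
    return 60_000 / parts_per_minute

INFERENCE_MS = 45

old_budget = cycle_time_ms(40)   # 1500 ms per part at the old speed
new_budget = cycle_time_ms(65)   # ~923 ms per part at the new speed
headroom = new_budget / INFERENCE_MS  # inference fits roughly 20x over
```

Even at 65 parts per minute, inference consumes only about 5% of each cycle, so the line speed is now limited by mechanical handling rather than inspection.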

The 12 human inspectors weren’t replaced. Their roles evolved. Instead of standing at the line doing repetitive visual checks, they now manage the system, handle the ~5% of parts flagged for manual review, and focus on process improvement — analyzing defect patterns over time to fix root causes upstream in production.

The most impactful finding came months after deployment: when the team analyzed defect trends from the AI’s logs, they discovered that 70% of surface defects originated from a single stamping machine that had a misaligned die. Fixing that one machine eliminated the majority of defects at the source — something that was invisible when inspection was manual and data wasn’t systematically collected.

What the Learners Say

"The model was the easy part. Deploying it on a vibrating factory floor with dusty cameras and fluorescent lighting that changed between shifts — that's where we learned what 'production' actually means. The real world doesn't have clean test sets."

— Vikram Joshi

"When the plant manager showed us the defect trend analysis and told us it helped them fix a root cause that had been costing them for 2 years — that was the moment I understood why AI engineering matters beyond the model accuracy numbers."

— Meera Nair

Contact us

Email: tribeofprogrammers@gmail.com
Call: +91 7604906337
© 2025 top