A new approach to classification

Achieve extraordinary flexibility with a new classification network

Written by Esben Korre

At JLI, we have created numerous AI-powered vision systems for complex inspection tasks. Leveraging artificial intelligence helps us automate the detection and classification of defects that would otherwise be very difficult to perform with the required degree of accuracy.

However, we continuously try to push the boundaries of what we can achieve with machine vision, and lately we have spent a significant amount of time in our lab working on how to add another layer to the solution.

Flexible adjustment of grades

Even though a well-trained classification network can deliver high accuracy, it also has some limitations.

Say you are inspecting the surface of lacquered boards, and the vision system detects a defect and classifies it as a scratch. Then that’s that. Often that may be enough, but some manufacturing companies might want to look at the severity of the scratch as well. Is the scratch so bad that the item must be scrapped or could it maybe be packed as 2nd grade?

Adjusting the tolerance is particularly relevant in industries where the components used in production vary in quality. Lacquered wood surfaces, for example, can vary slightly over time depending on the season, the quality of the wood, the quality of the lacquer, and so on.

In those cases, you could end up getting a disproportionate amount of scrap at times if you work with a fixed tolerance.

Training on images annotated with “severity”

This is why we are working on developing a more flexible classification network. We do this by training the neural network with images that are annotated not only with the type of defect but also with a percentage indicating how severe a given defect is. A small mark might get a value of 70%, for example, while a pressed corner would be 100%.
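To make the idea concrete, a severity-annotated training set might look something like the sketch below. The record format, field names, and the dual-head training split are illustrative assumptions, not JLI's actual schema or architecture:

```python
# Hypothetical annotation records: each training image carries both a
# defect type and a severity score (1.0 = 100%, unquestionably scrap).
# All names and values here are illustrative assumptions.
annotations = [
    {"image": "board_0001.png", "defect": "scratch",        "severity": 0.70},
    {"image": "board_0002.png", "defect": "pressed_corner", "severity": 1.00},
    {"image": "board_0003.png", "defect": "mark",           "severity": 0.40},
]

def training_targets(record):
    """Split one annotation into (classification target, regression target).

    A network trained this way could use two heads on shared features:
    a classification head for the defect type and a regression head
    for the severity score.
    """
    return record["defect"], record["severity"]

for rec in annotations:
    defect, severity = training_targets(rec)
    assert 0.0 <= severity <= 1.0
```

Separating the two labels like this is what lets the network learn both tasks at once: recognizing the defect type and grading how bad it is.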

In this way, the network not only learns to recognize the difference between defect types but also to grade the individual defects. As an operator, you can use this to adjust the system's tolerance. For example, to avoid excessive scrap, you can increase the tolerance for a period of time and accept marginally worse-graded objects. Those could then be sorted and packed as 2nd grade.
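The operator-adjustable tolerance can be sketched as a simple thresholding step on the predicted severity score. The threshold defaults and grade names below are assumptions for illustration, not values from an actual JLI system:

```python
def grade_item(predicted_severity, scrap_threshold=0.9, second_grade_threshold=0.6):
    """Map a predicted severity score (0.0-1.0) to a production grade.

    The thresholds are operator-adjustable: raising them accepts
    marginally worse items (less scrap), lowering them tightens quality.
    Threshold defaults and grade names are illustrative assumptions.
    """
    if predicted_severity >= scrap_threshold:
        return "scrap"
    if predicted_severity >= second_grade_threshold:
        return "2nd grade"
    return "1st grade"

# With the default thresholds, a 70%-severity scratch goes to 2nd grade.
assert grade_item(0.70) == "2nd grade"

# Relaxing the scrap threshold keeps a borderline item out of the scrap
# bin without retraining the network.
assert grade_item(0.92, scrap_threshold=0.95) == "2nd grade"
```

The key point is that the network itself never changes: only the thresholds applied to its severity output do, which is why the tolerance can be adjusted in either direction on the fly.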

And in the same way, you can easily tighten the tolerance again.

Increased workload upfront

Training a neural network in this way currently demands a larger workload for annotating and a different type of network, but the benefit is substantial: a classification network with unprecedented flexibility.

The ability to easily adjust the accepted grades also eliminates the many retraining iterations that would otherwise be necessary when something changes in production.

So even though a flexible classification network requires more resources upfront, in many cases you will be looking at a quick ROI.