
Machine vision improves quality control

24 September 2008

Machine vision, a powerful inspection tool for improving quality control, has a firm place in food processing. One application is the precision slicing of various products.


AEW Delford Systems, now part of Marel Food Systems, says it pioneered the application of advanced computer and vision technology to examine and measure the cross-section of a product such as bacon prior to slicing.

This allowed the slicers to produce greater volumes of more accurate slices to a closer tolerance with consistent pack weights. Its latest products have been developed in partnership with OEM vision specialist, Firstsight Vision, part of the Stemmer Imaging Group.

Careful selection of cameras and the development of suitable image processing methodology using Firstsight's Common Vision Blox toolset led to a measurement solution that, the company claims, could not be achieved by traditional machine vision technology. It has been adopted on a range of commercially available high-speed, high-volume, continuous feed slicer products for bacon, ham and cheese.

In addition, powerful grading functions were introduced that, it was said, could deal with the variation of product found in meat and maximise the value of the product being processed.

For bacon slicing, a high-speed camera is used to scan the leading face of the product before slicing. The system measures the area, lean/fat ratio and the lean/fat structure, and takes into account different densities, before adjusting the thickness of the next slice and providing grading information to down-line labelling technology. This maximises the on-weight percentages and minimises giveaway.
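The thickness adjustment described above follows from a simple relationship: for a target slice weight, the required thickness depends on the measured face area and an effective density derived from the lean/fat ratio. The sketch below illustrates that calculation only; the density figures and function names are illustrative assumptions, not AEW Delford/Marel's actual algorithm or values.

```python
# Sketch of constant-weight slice thickness control.
# Densities are approximate textbook figures (assumed), not the slicer's.

LEAN_DENSITY = 1.06  # g/cm^3, approximate density of lean meat (assumed)
FAT_DENSITY = 0.92   # g/cm^3, approximate density of fat (assumed)

def effective_density(lean_fraction: float) -> float:
    """Density of the scanned face, weighted by the lean/fat ratio."""
    return lean_fraction * LEAN_DENSITY + (1.0 - lean_fraction) * FAT_DENSITY

def slice_thickness(target_weight_g: float, face_area_cm2: float,
                    lean_fraction: float) -> float:
    """Thickness (cm) needed so the next slice hits the target weight."""
    return target_weight_g / (face_area_cm2 * effective_density(lean_fraction))

# A 40 cm^2 bacon face that is 70% lean, aiming for a 25 g slice:
thickness = slice_thickness(25.0, 40.0, 0.7)
```

Because the face is rescanned before every cut, the thickness is recomputed slice by slice, which is what keeps pack weights consistent as the cross-section changes along the product.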

For products such as cheese and ham, the vision system is used in conjunction with laser profiling. This follows contours so closely at the cutting face that virtually any variation in product composition - however irregular - is detected.

Holes in cheese, voids in ham, areas of fat and even lean/fat ratios are measured at the blade, slice by slice. This means, the company claims, high on-weights, low giveaway, consistently accurate grading, high output and excellent product presentation with minimal manual intervention.

Although the image processing algorithms for slicing bacon, ham and cheese are complex, the product is presented in a fixed orientation to the cameras and the slicer blade, so that the results of the image measurements can be used to automatically adjust the thickness of the slice. The importance of the correct orientation of the product was highlighted in a feasibility study carried out by Iris Vision, the Dutch distributor of Common Vision Blox.

The study was carried out to determine the potential for image processing methods to control slicing off the green top of a carrot during automatic carrot processing. If too little is cut off, some of the unwanted green top remains. If too much is cut off, carrot is wasted, leading to reduced productivity. A conveyor belt was used to transport pre-washed carrots of between 150mm and 300mm in length past the camera at arbitrary angles.

The requirements were that not more than 10% of the total length of the carrot should be cut off and that the part cut off should be less than 3cm in length. An upstream process ensured this.

Twenty carrots per second had to be processed. The top of the carrot is the end at which the carrot is at its thickest perpendicular to the longitudinal axis. From the point of maximum thickness, the carrot decreases in thickness much more quickly toward the green end than toward the root. The algorithm, developed using Common Vision Blox, therefore searches for the cutting point on the green side of the thickest point.
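The search just described can be sketched on a one-dimensional width profile measured along the carrot's axis, assuming the rotation has already been corrected. Everything here is illustrative: the half-maximum threshold, the millimetre scale and the synthetic profile are assumptions, not details from the Iris Vision study; only the two cut limits (10% of length, 3cm) come from the article.

```python
# Sketch of the cut-point search on a width profile along the carrot's
# longitudinal axis (root at index 0, green top at the end).
# The half-maximum threshold is an assumed heuristic, not the study's method.

def find_cut_index(widths, mm_per_pixel, max_cut_fraction=0.10, max_cut_mm=30.0):
    """Return the index at which to cut off the green top, or None."""
    thickest = max(range(len(widths)), key=lambda i: widths[i])
    total_mm = len(widths) * mm_per_pixel
    # Search only on the green side of the thickest point, where the
    # width falls off quickly; cut where it drops below half the maximum.
    for i in range(thickest, len(widths)):
        if widths[i] < 0.5 * widths[thickest]:
            cut_off_mm = (len(widths) - i) * mm_per_pixel
            if cut_off_mm <= max_cut_mm and cut_off_mm <= max_cut_fraction * total_mm:
                return i
            return None  # cut would violate the 10% / 3cm limits
    return None

# Synthetic carrot sampled at 1 mm/pixel: width rises slowly from the
# root, peaks, then falls off sharply toward the green top.
profile = [20 + 0.1 * i for i in range(180)] + [38 - 4 * j for j in range(9)]
cut = find_cut_index(profile, mm_per_pixel=1.0)
```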

Since the carrots are arriving at arbitrary angles, the angle of rotation of the carrot must be determined first. Results from the feasibility study show about 85% of the processing time is spent determining the angle of rotation of the carrot. If mechanical methods are used to present the carrots in a given orientation, the process can be greatly speeded up.
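One common way to determine such a rotation angle (not necessarily the method the study used) is via second-order central image moments of the carrot's silhouette: the principal axis of the binary blob gives its orientation. The sketch below shows this standard technique on a list of silhouette pixel coordinates; the function name and the synthetic blob are illustrative.

```python
# Sketch: orientation of a binary blob from central image moments.
# A standard machine-vision technique, shown here for illustration only.
import math

def orientation_angle(points):
    """Angle (radians) of the principal axis of a set of (x, y)
    silhouette pixels, from the second-order central moments."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    mu20 = sum((x - cx) ** 2 for x, _ in points)
    mu02 = sum((y - cy) ** 2 for _, y in points)
    mu11 = sum((x - cx) * (y - cy) for x, y in points)
    return 0.5 * math.atan2(2.0 * mu11, mu20 - mu02)

# An elongated blob lying along the line y = x, three pixels thick:
blob = [(i, i + dy) for i in range(100) for dy in (-1, 0, 1)]
angle_deg = math.degrees(orientation_angle(blob))  # roughly 45 degrees
```

Computing moments over every silhouette pixel for every carrot is exactly the kind of per-object work that dominated the 85% figure quoted above, which is why mechanically fixing the orientation removes most of the processing cost.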
