Multispectral imaging checks for chicken disease
Automated inspection station relies on three-CCD camera and off-the-shelf software to enhance food safety.
By R. Winn Hardin, Contributing Editor
Poultry inspectors are prone to repetitive stress injuries, such as carpal tunnel syndrome, because they stand on the processing line for several hours continuously and examine up to 35 chickens a minute for diseases, fecal contamination, and a host of consumer concerns. Unfortunately, there has been no other way to meet the requirements of the Poultry Products Inspection Act, which requires that all poultry be inspected before sale to the public. The problem could worsen as growing consumer demand increases pressure for higher-speed poultry production and inspection lines.
Over the past several years, the US Department of Agriculture (USDA) Agricultural Research Service (ARS) has developed several automated poultry-inspection systems based on machine vision, including a multispectral imaging system that uses a common-aperture camera comprising a DuncanTech (now Redlake) three-CCD color camera fitted with a three-interference-filter prism assembly, a National Instruments (NI) frame grabber, and a standard Pentium 4-based PC running The MathWorks MATLAB and NI LabVIEW analysis programs. This system, which continues to undergo industrial testing, has already managed to detect systemic diseases with nearly 100% accuracy for poultry running at kill-line speeds of 140 birds per minute.
Through years of research, Yud-Ren Chen and Kuanglin Chao of the ARS used principal component analysis of hyperspectral images of chicken carcasses to determine that the average intensity of light reflected by the chicken breast within specific spectral bands could be used to identify diseased chickens. To automate the process, a system needed to be developed that could locate a specific region of interest (ROI) on the bird and make calibrated measurements of the light intensity (see Fig. 1).
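The band-selection idea described above can be sketched in Python/NumPy. This is an illustrative example, not the ARS team's actual analysis: the cube dimensions and data are synthetic, and ranking bands by first-principal-component loadings is one simple way such a selection could be made.

```python
import numpy as np

# Illustrative sketch: rank spectral bands of a hyperspectral cube
# (rows x cols x bands) by their loading on the first principal
# component. Cube size and contents are synthetic.
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 32))           # 32-band reflectance cube
pixels = cube.reshape(-1, cube.shape[2])  # one spectrum per pixel
pixels = pixels - pixels.mean(axis=0)     # mean-center each band

# Principal components are eigenvectors of the band covariance matrix.
cov = np.cov(pixels, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                      # loading vector of the first PC

# Bands with the largest |loading| contribute most to the dominant
# variance and are candidates for a few-band multispectral system.
top_bands = np.argsort(np.abs(pc1))[::-1][:3]
print("candidate bands:", top_bands)
```

In practice a handful of such bands, rather than the full hyperspectral cube, is enough to drive a fast three-CCD system like the one described here.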
Chen’s team developed several systems, ranging from fiberoptic spectrophotometers for spot-source spectral measurements to wide-area hyperspectral imaging systems capable of separating dozens of spectral bands per image cube. The most recent and rugged system was designed with price-performance and industrial needs in mind. The team used an altered common-aperture Redlake MS2100 three-CCD color camera. Light from eight 100-W tungsten-halogen lamps reflects off the chicken breasts and is collected by the MS2100 camera. The camera’s internal prism assembly separates the light into three optical channels, which then pass through individual 20-nm FWHM Andover Corporation interference filters centered at 480, 540, and 700 nm (see Fig. 2).
Figure 2. Light reflected off the abdomen of a carcass is captured by a three-CCD color camera. The camera uses interference filters to capture monochrome images of the carcass within the spectral bands. Average intensity measurements of the abdominal ROI identify systemically diseased carcasses.
Each filtered channel hits a separate CCD imaging sensor, and the resulting image is passed via a Camera Link interface to a Broadax Systems industrial PC running a 2.4-GHz Pentium 4 microprocessor. A Camera Link utility handles camera triggering and camera settings, while NI LabVIEW software uses ActiveX to run the MATLAB image-processing algorithms and provide real-time operation.
“To meet the throughput speed of 140 birds per minute, we needed to acquire images in less than 10 ms,” explains ARS’s Chao. “Because of the filtering and the short exposure, we needed as much light as possible. We also needed to make sure that we overpowered any changes caused by ambient light.” Chao placed four lamps above and four below the chicken carcasses to make sure there was sufficient light in the thigh and lower abdomen area. Using a Labsphere Spectralon diffuse-reflectance target for calibration, integration times of 4.5, 5.0, and 9.5 ms were set for the 700-, 540-, and 480-nm channels, respectively, each at 2 dB (see Fig. 3).
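Calibration against a diffuse-reflectance target of the kind mentioned above can be sketched as follows. This is a minimal illustration of standard dark/white reference correction, assuming 8-bit raw frames; the function name, frame contents, and reference reflectance value are assumptions, not the ARS team's actual code.

```python
import numpy as np

def calibrate(raw, dark, white, target_reflectance=0.99):
    """Convert a raw intensity frame to reflectance using a dark
    (lens-cap) frame and a white-reference (e.g., Spectralon) frame."""
    raw = raw.astype(np.float64)
    dark = dark.astype(np.float64)
    white = white.astype(np.float64)
    # Avoid division by zero where the white reference equals the dark frame.
    denom = np.clip(white - dark, 1e-6, None)
    return target_reflectance * (raw - dark) / denom

# Synthetic frames for illustration only.
dark = np.zeros((4, 4), np.uint8)
white = np.full((4, 4), 200, np.uint8)
raw = np.full((4, 4), 100, np.uint8)
refl = calibrate(raw, dark, white)  # ~0.495 everywhere
```

The same correction would be applied per channel, since each CCD has its own integration time and gain.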
An NI PCI-1428 converts each of the three 656 × 493-pixel images to an 8-bit format. Chao says that with 16 Mbytes of on-board memory for first-in-first-out buffering, the PCI-1428 can easily handle the masking, edge detection, and other image-processing functions needed for real-time inspection, or the processing can be done by the PC’s microprocessor using internal RAM.
The process begins with LabVIEW triggering the Camera Link utility to capture a black reference image with the lens cap on. A black backdrop is placed behind the hanging-carcass inspection station, and an image of the backdrop is acquired for flat-field correction of its intensity variations. A value of 0.1 is set as the threshold between background and chicken: the system uses the 700-nm image to create a mask, setting all pixel values below 0.1 to 0, and then applies the resulting mask to the 480- and 540-nm images. This filters out any spurious reflections from sources other than the chicken carcass. Results are displayed through LabVIEW on the PC monitor (see Fig. 4).
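The masking step just described can be sketched in a few lines of NumPy. This assumes the channel images are already flat-field corrected and scaled to [0, 1]; the array values are illustrative.

```python
import numpy as np

def make_mask(img_700nm, threshold=0.1):
    """Binary mask from the 700-nm channel: 1 = carcass, 0 = background."""
    return (img_700nm >= threshold).astype(np.uint8)

def apply_mask(img, mask):
    """Zero out background pixels in the 480- or 540-nm channel."""
    return img * mask

# Tiny synthetic frames for illustration.
img700 = np.array([[0.05, 0.4], [0.6, 0.02]])
img540 = np.array([[0.30, 0.5], [0.7, 0.20]])
mask = make_mask(img700)           # [[0, 1], [1, 0]]
masked = apply_mask(img540, mask)  # background pixels zeroed
```

Pixels below the 0.1 threshold at 700 nm are suppressed in the other channels, so only carcass reflections contribute to the later intensity measurements.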
Applying the mask to the 480- and 540-nm images creates a boundary line around the chicken, but most visible clues to systemic disease are found in the breast and abdominal area of the carcass. To find the boundaries of the ROI on an image, four corner points must be located. The lower-left and lower-right corner points are the conjunction points between abdomen and thigh along the chicken boundary. The upper-left and upper-right corner points are the conjunction points between abdomen and wing along the chicken boundary. The lowest point of the boundary, which is on the thigh, is then located as the start point P1 (see Fig. 5).
Figure 5. The ROI for systemic disease in chickens is in the breast and lower abdominal area (top). A contour of the carcass is created from the mask image and a series of operations performed to locate the four corner points that define the breast and abdominal area (bottom).
From this start point along the boundary, seven points were located, noted as P1(x1, y1) to P7(x7, y7); the spacing d = xi − xi−1 between adjacent points was approximately 5.61 mm (nine pixels in the first batch and six pixels in the second batch). Based on trial and error, it was found that the corner point would be missed if d were shorter or longer than 5.61 mm, because the boundary was not a smooth curve or, in some cases, not even a continuous line. The following relational and logical operations were carried out:
AL = {(yj+1 − yj ≥ 0) and (yj+2 − yj+1 ≥ 0) and (yj+3 − yj+2 ≥ 0)} (1)
BL = {(yj+3 − yj+4 ≥ 0) and (yj+4 − yj+5 ≥ 0) and (yj+5 − yj+6 ≥ 0)} (2)
CL = {(yj+3 − yj+4 ≤ 0) and (yj+4 − yj+5 ≤ 0) and (yj+5 − yj+6 ≤ 0)} (3)
DL = {(yj+3 − yj+4 < yj+4 − yj+5) and (yj+4 − yj+5 < yj+5 − yj+6)} (4)
TL = {(AL) and ((BL) or ((CL) and (DL)))} (5)
When the logical value of TL was false, the point at the boundary line and adjacent to the start point P1 was selected as the new start point, and thus, another seven points were located to repeat the operations. Empirically, at the first instance at which the logical value of TL was true, the point P4 was at the conjunction point between chicken thigh and abdomen, and thus, this point was defined as the lower-left corner point PLL. Similarly, the following relational and logical operations were carried out to determine the lower-right corner point:
AR = {(yj+1 − yj ≤ 0) and (yj+2 − yj+1 ≤ 0) and (yj+3 − yj+2 ≤ 0)} (6)
BR = {(yj+3 − yj+4 ≤ 0) and (yj+4 − yj+5 ≤ 0) and (yj+5 − yj+6 ≤ 0)} (7)
CR = {(yj+3 − yj+4 ≥ 0) and (yj+4 − yj+5 ≥ 0) and (yj+5 − yj+6 ≥ 0)} (8)
DR = {(yj+3 − yj+4 > yj+4 − yj+5) and (yj+4 − yj+5 > yj+5 − yj+6)} (9)
TR = {(AR) and ((BR) or ((CR) and (DR)))} (10)
The operations were repeated until the first instance at which the logical value of TR was true, upon which the point P4 was defined as the lower-right corner point PLR. From the lower-left corner point PLL(xi, yj), the point P1(xi−1, yk) on the boundary line was located, and N pixels from P1(xi−1, yk) to PN(xi−1, yk−(N−1)) were compared to the chicken boundary. The distance between P1 and PN was approximately 9.35 mm; thus, N was 15 for the first batch and 10 for the second batch. This operation was repeated along the chicken boundary until all pixels from P1 to PN fell within the chicken boundary, which indicated the starting point of the chicken wing; the point P1 was then defined as the upper-left corner point PUL. Similarly, from the lower-right corner point PLR(xi, yj), the point P1(xi−1, yk) on the boundary line was located, and N pixels from P1(xi−1, yk) to PN(xi−1, yk+(N−1)) were compared to the chicken boundary.
This operation is repeated along the chicken boundary until the point P1 is determined as the upper-right corner point PUR. After the four corner points have been located, the straight line between PLL and PLR, the straight line between PUL and PUR, and the segments of the chicken boundary between PLL and PUL and between PLR and PUR define the ROI boundary.
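The sliding seven-point test of Eqs. (1)-(5) can be sketched in plain Python. This assumes the carcass boundary has already been traced into an ordered sequence of point heights sampled at the fixed spacing d described above; the function names are illustrative, not the ARS team's.

```python
def tl_test(y):
    """Apply Eqs. (1)-(5) to the heights y[0]..y[6] of seven boundary
    points; True when y[3] (point P4) sits at a corner."""
    AL = (y[1] - y[0] >= 0) and (y[2] - y[1] >= 0) and (y[3] - y[2] >= 0)
    BL = (y[3] - y[4] >= 0) and (y[4] - y[5] >= 0) and (y[5] - y[6] >= 0)
    CL = (y[3] - y[4] <= 0) and (y[4] - y[5] <= 0) and (y[5] - y[6] <= 0)
    DL = (y[3] - y[4] < y[4] - y[5]) and (y[4] - y[5] < y[5] - y[6])
    return AL and (BL or (CL and DL))

def find_lower_left(heights):
    """Slide a seven-point window along the boundary heights until the
    test fires; return the index of the corner point P4, else None."""
    for j in range(len(heights) - 6):
        if tl_test(heights[j:j + 7]):
            return j + 3
    return None

# A rise-then-fall profile puts the corner at the peak (index 3);
# a monotonic profile has no corner.
corner = find_lower_left([0, 1, 2, 3, 2, 1, 0])   # -> 3
```

The lower-right test of Eqs. (6)-(10) is the mirror image, with the inequality directions reversed.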
Using a sample set of unwholesome chickens, the average ROI intensity for both the 480- and 540-nm images is calculated. The same average ROI intensity values are calculated for a set of wholesome chickens. These values are used to set a threshold for each channel that separates wholesome chickens from diseased ones. This calibration step is repeated for each new installation and at periodic intervals to account for seasonal changes in carcass reflectivity.
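The threshold-calibration step can be sketched as follows. This assumes per-carcass average ROI intensities have already been measured for labeled wholesome and unwholesome sample sets; the midpoint rule and all numbers here are illustrative assumptions, not the ARS team's published method.

```python
import numpy as np

def calibrate_threshold(wholesome, unwholesome):
    """Place a per-channel decision threshold midway between the mean
    ROI intensities of the two labeled sample sets (assumed rule)."""
    return (np.mean(wholesome) + np.mean(unwholesome)) / 2.0

def classify(avg_roi_intensity, threshold):
    """Wholesome carcasses reflect more strongly in the ROI than
    systemically diseased ones in this sketch."""
    return "wholesome" if avg_roi_intensity >= threshold else "diseased"

# Synthetic average ROI intensities for one channel.
thr = calibrate_threshold([0.62, 0.65, 0.60], [0.41, 0.38, 0.44])
label = classify(0.60, thr)
```

Re-running this calibration per installation and per season, as the article describes, simply means re-collecting the two labeled sample sets and recomputing the threshold.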
According to Chao, the 540-nm channel is the best single channel discovered so far for identifying diseased chickens, and adding the 480-nm channel raises the system’s accuracy to greater than 95%.
The ARS system has been field-tested on production lines at commercial chicken-processing plants. Chao is now building a system that will replace the fixed three-channel multispectral design with a lightweight ImSpector spectrograph from Spectral Imaging attached to the C-mount of an electron-multiplying CCD (EMCCD) camera. “With the EMCCD camera, we can go to shorter exposure times and higher throughputs in excess of 140 birds per minute, perhaps 200 birds per minute,” Chao notes.
Features, advantages, benefits
Stork Gamco is a manufacturer of poultry-processing equipment that includes automatic transfer of birds from picking to eviscerating, multistage counterflow scalding, high-speed evisceration systems, and intelligent cut-up systems. The company is beginning to test the ARS chicken-inspection system.
“This new technology developed by Chen and his colleagues could possibly revolutionize the poultry-inspection process,” explains Jamie Usrey, application engineer at Stork Gamco. “This approach will allow the inspection process to move to more of a scientific-based methodology and will reduce the subjectivity of the grading process. Incorporating this new process could add a higher level of food safety.”
Company Info
Andover Corporation
Salem, NH, USA
www.andcorp.com
Broadax Systems
City Of Industry, CA, USA
www.bsicomputer.com
Instrumentation and Sensing Laboratory, Henry A. Wallace Beltsville Agricultural Research Center
Beltsville, MD, USA
www.ars.usda.gov
Labsphere
North Sutton, NH, USA
www.labsphere.com
National Instruments
Austin, TX, USA
www.ni.com
Princeton Instruments
Trenton, NJ, USA
www.princetoninstruments.com
Redlake
San Diego, CA, USA
www.redlake.com
Spectral Imaging
Oulu, Finland
www.specim.fi
Stork Gamco
Gainesville, GA, USA
www.stork.com/gamco
The MathWorks
Natick, MA, USA
www.mathworks.com