
HYPERSPECTRAL IMAGING: Spectral vision systems support the food industry

In a collaboration between AIDO and the technological center AINIA, researchers are studying different types of spectral vision systems for food-related applications such as the inspection of fruits and vegetables and the quality assessment of meat, cheese, fish, and bread.
April 1, 2012

Computer vision is currently being used in a number of food-related applications, including inspection and grading of fruits and vegetables as well as assessing the quality of meat, fish, cheese, and bread.

As part of a collaborative project between AIDO, the Technological Institute of Optics, Color and Imaging (Valencia, Spain; www.aido.es), and AINIA (Paterna, Spain; www.ainia.es), a technological center with more than 1000 associated food-industry companies, researchers are currently studying different types of spectral vision systems. The aim of the project, known as Optifood-Spectral, is to explore how spectral image-analysis techniques can be applied to serve the needs of integrators that develop automated analysis systems.

To demonstrate the effectiveness of such systems when applied to food production, Emilio Ribes, PhD, Multi-Spectral Unit manager, and his colleagues at AIDO have developed two prototype systems using hyperspectral and infrared (IR) imaging techniques.

In the development of the hyperspectral imaging system, a Xeva-1.7-640 camera from Xenics (Leuven, Belgium) with a spectral response from 900–1700 nm was coupled to a liquid-crystal tunable filter from Cambridge Research & Instrumentation Inc. (Hopkinton, MA, USA). In this manner, spectral samples with bandwidths of 10 nm could be obtained across the 950–1650-nm spectral range.

Software developed in MATLAB from The MathWorks (Natick, MA, USA) is first used to configure the system, setting the number of spectral bands to be acquired and their corresponding exposure times. The total acquisition time depends on the exposure time and the number of bands: if 70 bands are to be acquired, for example, the total acquisition time is approximately 2.5 min.
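As a rough sketch of how such a band-by-band acquisition might be organized (the article does not publish AIDO's code, and the camera and filter drivers below are hypothetical placeholders rather than the Xenics or CRi SDKs), stepping a tunable filter in 10-nm increments across 950–1650 nm yields roughly 70 bands, and at about 2 s per band the total time lands near the 2.5 min quoted above:

```python
import numpy as np

# Hypothetical driver wrappers standing in for the vendor SDKs; the method
# names are illustrative only and do not correspond to a real API.
def acquire_cube(camera, lctf, exposure_s=2.0):
    """Step the liquid-crystal tunable filter from 950 to 1650 nm in 10-nm
    increments and grab one frame per band, returning the wavelengths and a
    (bands, height, width) hyperspectral data cube."""
    wavelengths = np.arange(950, 1651, 10)   # ~71 bands of 10-nm bandwidth
    frames = []
    for wl in wavelengths:
        lctf.set_wavelength(wl)              # tune the filter to this band
        camera.set_exposure(exposure_s)      # per-band exposure time
        frames.append(camera.grab_frame())   # one InGaAs frame for this band
    return wavelengths, np.stack(frames)

# At roughly 2 s per band, ~71 bands take on the order of 2.5 min,
# consistent with the figure quoted in the article.
```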

Ribes and his colleagues analyzed the spectral response of a piece of pork using ENVI 4.1 image-analysis software from Exelis Visual Information Solutions (Boulder, CO, USA). In the ranges from approximately 1200–1350 nm and from 1500–1650 nm, meat can be clearly distinguished from fat and bone (see Fig. 1). Although the fat and bone profiles seem to exhibit a similar response, some differences exist in the range from 950–1100 nm.

FIGURE 1. Using a hyperspectral imaging system, researchers at AIDO have analyzed the spectral response of pork. In the ranges from approximately 1200–1350 nm and from 1500–1650 nm, meat (bottom) can be clearly distinguished from fat (top) and bone (center).
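As a simplified illustration of the kind of band-window comparison described above (the team performed its analysis in ENVI; the decision rule and thresholds below are arbitrary placeholders, not published values), a reflectance cube can be segmented per pixel by averaging over the spectral windows in which the tissues differ:

```python
import numpy as np

def classify_pork(cube, wavelengths):
    """Toy per-pixel segmentation of a (bands, H, W) reflectance cube using
    the spectral windows discussed in the article: meat separates from fat
    and bone around 1200-1350 nm and 1500-1650 nm, while fat and bone show
    smaller differences around 950-1100 nm. Thresholds are placeholders."""
    def band_mean(lo_nm, hi_nm):
        mask = (wavelengths >= lo_nm) & (wavelengths <= hi_nm)
        return cube[mask].mean(axis=0)       # mean reflectance per pixel

    meat_window = band_mean(1200, 1350)      # window where meat stands apart
    fat_bone_window = band_mean(950, 1100)   # window where fat and bone differ

    labels = np.full(meat_window.shape, "meat", dtype=object)
    non_meat = meat_window > 0.5                           # placeholder threshold
    labels[non_meat & (fat_bone_window > 0.6)] = "fat"     # placeholder threshold
    labels[non_meat & (fat_bone_window <= 0.6)] = "bone"
    return labels
```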

Ribes and his colleagues have also developed systems that highlight how near-IR (NIR) and thermal-IR (TIR) cameras differ when used to classify such products.

One system consists of a UI-6410SE uEye 640 × 480-pixel GigE camera from IDS Imaging (Obersulm, Germany) and a 384 × 288-pixel Miricle TIR camera from Thermoteknix (Cambridge, UK) mounted on a mechanical arm. Results of acquiring images of grapes after removal from refrigeration are shown both for the TIR camera and the NIR camera (see Fig. 2).

FIGURE 2. Results of acquiring images of grapes after removal from refrigeration are shown for both a TIR camera and an NIR camera. In the left (TIR) image, grapes with less water content (i.e., riper grapes) and/or of smaller size (A) appear lighter than grapes with higher water content and/or of larger size (B). This difference is not visible in the NIR image (right).

Fifteen minutes after being removed from refrigeration, the grapes have warmed toward the 25°C ambient room temperature, but grapes with more water content change temperature more slowly than grapes with less water content. The difference in warming rate can also be due to grape size: smaller objects have less thermal mass and therefore reach ambient temperature more quickly.
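This warming behaviour can be approximated with a standard lumped-capacitance (Newton's law of cooling) model. The sketch below is a generic textbook illustration rather than anything published by AIDO, and the temperatures and time constants are arbitrary assumptions, but it shows why a grape with more thermal mass per unit surface area, i.e., one that is larger or holds more water, is still cooler after 15 min and therefore appears darker in the TIR image:

```python
import numpy as np

def warm_up(T0_c, T_ambient_c, tau_s, t_s):
    """Lumped-capacitance warming: T(t) = T_amb + (T0 - T_amb) * exp(-t / tau).
    The time constant tau ~ m*c / (h*A), so more water (higher heat capacity)
    or a larger body (more mass per unit surface area) means slower warming."""
    return T_ambient_c + (T0_c - T_ambient_c) * np.exp(-t_s / tau_s)

t = 15 * 60                    # 15 min after leaving refrigeration, in seconds
T_fridge, T_room = 4.0, 25.0   # assumed starting and ambient temperatures (deg C)

# Arbitrary example time constants for a small/drier grape vs. a large/watery one.
small_dry_grape = warm_up(T_fridge, T_room, tau_s=300, t_s=t)   # ~24.0 deg C
large_wet_grape = warm_up(T_fridge, T_room, tau_s=900, t_s=t)   # ~17.3 deg C
print(small_dry_grape, large_wet_grape)  # the warmer grape appears lighter in TIR
```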

In the left (TIR) image, the grapes with less water content (i.e., riper grapes) and/or of smaller size (A) appear lighter than the grapes with higher water content and/or of larger size (B). However, this difference cannot be observed in the NIR image (right).

Both the hyperspectral and the IR-based systems from AIDO are prototypes; they have not yet been field-tested or commercialized, a task that Ribes and his colleagues hope will be taken up by the many members of the AINIA alliance.

Note: The Optifood-Spectral project was funded during 2011 by the Valencia Regional Government through its Institute for Small and Medium Enterprises (IMPIVA) and by the European Regional Development Fund (ERDF).
