A team of researchers led by computer science Professor Serge Belongie at the University of California, San Diego (San Diego, CA, USA) has developed an iPad app called Visipedia that will identify most North American birds, with a little help from a human user.
The app is essentially an interactive field guide: computer vision algorithms analyze a photo the user submits on the iPad, and the app then calls up pages with pictures and information about the bird species that are the most likely matches.
To narrow the result, the app may ask the user to identify a specific part of the bird in the original picture, such as the head, tail, or wing. Sometimes it also asks what colors appear on that part of the bird. Based on the user's answers, the app refines its results until it arrives at the correct species.
Before asking anything, however, the system analyzes the image to minimize the number of questions the user has to answer. The app currently takes about five or six questions to get the identification right; the goal is to fine-tune it so the right answer emerges after only two or three.
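The general idea behind this kind of human-in-the-loop identification can be illustrated with a minimal sketch. The example below is not the Visipedia code; it assumes a small hand-labeled candidate list (CANDIDATES), a stand-in prompt function (ask), and a simple score-update rule, purely to show how user answers about parts and colors can whittle down the vision system's initial ranking.

```python
# Illustrative sketch only -- not the actual Visipedia implementation.
# Each candidate species carries a prior score from the (hypothetical)
# image analysis, plus hand-labeled plausible colors for a few body parts.

CANDIDATES = {
    # species: (vision score, {part: set of plausible colors})
    "Northern Cardinal": (0.30, {"head": {"red"}, "wing": {"red"}}),
    "Blue Jay":          (0.25, {"head": {"blue", "white"}, "wing": {"blue"}}),
    "American Robin":    (0.20, {"head": {"gray", "black"}, "wing": {"gray"}}),
}

def ask(part):
    """Stand-in for the app's UI prompt, e.g. 'What color is the head?'"""
    return input(f"What color is the bird's {part}? ").strip().lower()

def identify(candidates, parts=("head", "wing"), threshold=0.8):
    """Ask about one part at a time, reweighting candidates after each answer."""
    scores = {name: prior for name, (prior, _) in candidates.items()}
    for part in parts:
        color = ask(part)
        for name, (_, attributes) in candidates.items():
            # Reward species whose known colors for this part match the answer.
            scores[name] *= 1.0 if color in attributes.get(part, set()) else 0.1
        total = sum(scores.values())
        scores = {name: s / total for name, s in scores.items()}
        best = max(scores, key=scores.get)
        if scores[best] >= threshold:  # confident enough: stop asking questions
            return best, scores[best]
    best = max(scores, key=scores.get)
    return best, scores[best]

if __name__ == "__main__":
    species, confidence = identify(CANDIDATES)
    print(f"Most likely match: {species} (confidence {confidence:.2f})")
```

In the real app the candidate list, part locations, and color statistics come from the computer vision analysis of the submitted photo, which is what lets it reach an answer in only a handful of questions.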
The app also lets the user pull up additional information about the species from Wikipedia and Google, as well as more pictures on Flickr.
Belongie has now applied for a grant that would allow him to hire a developer to package the app's code for other tablets and smartphones and to maintain it so that it can be offered in the App Store. Meanwhile, a demo version of the app is available upon request from http://visipedia.org.
Belongie’s app is not the first to take advantage of the cameras built into portable devices. Recently, apps have been developed to measure radioactivity levels, to help farmers determine the quality of rice, to measure the relative speed and distance between a vehicle and those in front of it, and to measure heart rate and rhythm, respiration rate, and blood oxygen saturation levels.
-- by Dave Wilson, Senior Editor, Vision Systems Design