Monday, January 24, 2011

Paper Reading #2: Exploring Interfaces to Botanical Species Classification

Comments
Jaideep Balekar - http://jd-hci.blogspot.com/2011/01/paper-reading-2-early-explorations-of.html
Luke Roberts - http://lroberts-tamuchi.blogspot.com/2011/01/paper-reading-2-ifeelim-innovative-real.html

Reference Information
Title: Exploring Interfaces to Botanical Species Classification
Authors: Sean White and Steven Feiner
Presentation Venue: CHI 2010: 28th ACM Conference on Human Factors in Computing Systems; April 10-15, 2010; Atlanta, GA, USA

Summary
This paper describes five user interfaces for identifying botanical species. As explained in the paper, identifying botanical species takes considerable time and expertise, so the researchers developed image-matching algorithms to speed up the process. In designing the five interfaces, they drew on ethnographic studies of botanists to determine what would best suit them in the field.
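The post does not go into the matching algorithm itself, but the general workflow it describes — extract shape features from a leaf image and rank candidate species by similarity — can be sketched as a toy nearest-neighbor matcher. Everything here is illustrative: the species names, feature values, and function names are my own assumptions, not details from the paper.

```python
import math

# Hypothetical database of leaf shape descriptors: each species maps to
# a small feature vector, e.g. (aspect ratio, perimeter-to-area ratio,
# lobe count). Real systems use far richer shape descriptors.
SPECIES_FEATURES = {
    "Acer rubrum":             (1.1, 0.42, 5.0),
    "Quercus alba":            (2.0, 0.55, 7.0),
    "Betula lenta":            (1.6, 0.30, 0.0),
    "Liriodendron tulipifera": (1.0, 0.38, 4.0),
}

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def top_matches(query, k=3):
    """Rank species by distance between the query leaf's features
    and each stored exemplar (smaller distance = better match)."""
    ranked = sorted(SPECIES_FEATURES.items(),
                    key=lambda item: euclidean(query, item[1]))
    return [name for name, _ in ranked[:k]]
```

Returning a ranked list rather than a single answer matches what the prototypes do: each interface shows the botanist several possible matches and lets the human make the final call.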

Tablet PC
The first prototype they describe uses a Tablet PC, a WiFi camera, and a Bluetooth GPS receiver. After the user takes a picture, the Tablet PC presents the botanist with possible matches and information about the species. It also records where the plant was found using the GPS receiver. The researchers noticed that this prototype encouraged people to collaborate, since one person needed to work the camera while the other operated the Tablet PC.


Ultra Mobile PC

The second prototype uses an Ultra Mobile PC. The camera is part of the device, which makes the overall device smaller and easier to carry. It also makes use of live video to provide direct feedback about the leaf, making it easier to compare with possible matches. This prototype was preferred over the Tablet PC because it was easier to use in the field.

iPhone User Interface


The iPhone prototype is very much like the Ultra Mobile PC prototype except that a GPS receiver is integrated into the phone, making it smaller and easier to use. The algorithm used to match the species is either on a remote server or a secondary server in the field, so a network is required for full functionality.

Augmented Reality Interfaces

The fourth prototype includes two different augmented reality interfaces that use the same architecture as the Tablet PC and Ultra Mobile PC prototypes. Though both make use of a head-worn display, the first interface uses an orientation sensor and two buttons to give the user control over centering and changing the view. To change the magnification or select a different species, the user tilts his head. The second interface uses a clipboard and a square piece of cardboard. The user places a leaf on the clipboard, and matching results are displayed along the right side of the clipboard. To zoom in on the leaf, the user brings the clipboard closer to his eyes.

The researchers found that the augmented reality prototypes made comparison easier for the botanists because they did not have to compare a physical leaf to an image on a display screen. The zooming features were also easier to learn because they felt second nature to the users: bring the leaf closer to zoom in and pull it back to zoom out.

Microsoft Surface

The final prototype uses a Microsoft Surface tabletop, which they plan to use in the Smithsonian rather than in the field. In this interface the user places the leaf on the surface, and matching results are displayed along the top of the surface. By tapping an image, the user can zoom in on it. With this prototype, details of the leaf can be lost more easily, since the image is captured from below.



Discussion
This research is very important for botanists since the process of identifying plants takes a lot of time and knowledge. I found it interesting that the researchers made use of ethnographies to help them better design these different prototypes. Though all the prototypes make use of the same matching algorithm, it is interesting to see how different each user interface is. I also found this to be a unique area of CHI.

Since the focus was on the different prototypes, I found that they did a good job of presenting the material. However, I would also be interested to see the algorithm they used to match the different species.

Future work for this area could be basic follow up studies. All the prototypes have their own strengths and weaknesses, so it will be interesting to see which prototypes emerge as the favorites and if any are better suited for specific tasks in the field.

4 comments:

  1. I have the same desire as you to check out how they made the matching algorithm. With so many different variations of leaves I think it would be extremely difficult to match them by simply using an algorithm. I like the fact that they display many possible matches, and can see why the tablet would be desirable for that reason.

    ReplyDelete
  2. I wonder whether the information used is strictly from the leaf, or whether its arrangement on the main plant can also be considered. I know some plants have distinctive patterns.

    ReplyDelete
  3. It will be interesting to see how well a computer can identify the subtleties between botanical species with bitmap comparisons alone and which models prove to be the most effective additions.

    ReplyDelete
  4. It would be interesting to see what degree of error the software could deal with, in terms of how damaged the plant might be.

    ReplyDelete