Animal detection for photographic censusing
Author
Parham, Jason Remington
Other Contributors
Stewart, Charles V.; Cutler, Barbara M.; Yener, Bülent, 1959-; Radke, Richard J., 1974-; Berger-Wolf, Tanya
Date Issued
2021-12
Subject
Computer science
Degree
PhD
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute (RPI), Troy, NY. Copyright of original work retained by author.; Attribution-NonCommercial-NoDerivs 3.0 United States
Abstract
Animal population monitoring is difficult at large scales. Tracking thousands of animals with invasive tools like ear tags is logistically prohibitive, and methods like aerial surveys and hand-based counts cannot track individuals over time. A database of unique animals and their sightings can be a critical tool for conservation; ecologists gain a more intimate and timely understanding of an endangered species' health when they can estimate life expectancy, visualize migration patterns, and quickly measure the effects of conservation policies. This dissertation proposes photographic censusing, a way to visually track the population of an entire species with as little human effort as possible. The method is based on a two-day event called a photographic censusing rally, structured as a sight-resight study (building on mark-recapture) to estimate the size of the population. Photographic censusing is highly automated, is designed to be bootstrappable for new species, and uses citizen scientists to collect large volumes of photographs across a large geographic area.

A novel five-component animal detection pipeline is proposed to analyze the collected images and filter sightings of animals for identification (ID). The pipeline offers a whole-image classifier for quick filtering, a bounding box localizer to find annotations, an annotation labeler to determine species and viewpoints, a coarse segmentation algorithm to mask the background, and a component to recognize poor sightings; the pipeline is evaluated on new datasets.

This research also presents results from the Great Grevy's Rally (GGR) censusing events of 2016 and 2018, which attempted to catalog the entire resident population of Grevy's zebra (Equus grevyi) in Kenya and, combined, collected over 90,000 images from more than 350 volunteers. The 2018 GGR analysis was done with automated tools but still required a large amount of work (~18,500 human decisions), cost over USD 50,000, and took more than three months. This dissertation discusses the work needed during a photographic census and analyzes the failure modes that require human interaction.

The novel concept of Census Annotation (CA) is introduced to find comparable regions of animals for automated ID, which drastically increases automation. The 56,588 images from GGR 2018 were reprocessed with the latest methods recommended in this work: 11,916 annotations were automatically found for comparable, right-side Grevy's zebra; ID curation required 1,297 human decisions before converging; and 2,820 ± 167 Grevy's zebra were estimated to live in Kenya in 2018. This result is consistent with previous estimates on the GGR 2018 data (within 0.3% of the original estimate of 2,812 ± 171) and was achieved with a 93% reduction in human effort.
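The abstract describes the detection pipeline as a cascade of five components that progressively filter sightings down to those usable for ID. The following Python sketch illustrates how such a cascade could be wired together; every name, type, and signature here is a hypothetical placeholder assumed for illustration, not the dissertation's actual implementation.

```python
# Minimal sketch of a five-stage detection cascade, assuming the stages are
# supplied as callables (e.g., trained models). All names are placeholders.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

BBox = Tuple[int, int, int, int]  # (x, y, width, height) in pixels


@dataclass
class Annotation:
    bbox: BBox
    species: Optional[str] = None
    viewpoint: Optional[str] = None
    mask: Optional[object] = None   # coarse foreground/background mask
    comparable: bool = False        # quality flag in the spirit of Census Annotation


def run_detection_pipeline(
    image,
    classify_image: Callable[[object], bool],                 # 1. whole-image filter
    localize: Callable[[object], List[BBox]],                  # 2. bounding box localizer
    label: Callable[[object, BBox], Tuple[str, str]],          # 3. species + viewpoint labeler
    segment: Callable[[object, BBox], object],                 # 4. coarse segmentation
    assess_quality: Callable[[object, Annotation], bool],      # 5. poor-sighting recognizer
) -> List[Annotation]:
    """Return only the annotations judged comparable enough for individual ID."""
    # 1. Whole-image classifier: cheaply discard photos with no relevant animals.
    if not classify_image(image):
        return []

    annotations: List[Annotation] = []
    # 2. Localizer: propose one bounding box per animal sighting.
    for bbox in localize(image):
        ann = Annotation(bbox=bbox)
        # 3. Labeler: assign species and viewpoint (e.g., right side).
        ann.species, ann.viewpoint = label(image, bbox)
        # 4. Coarse segmentation: mask background pixels inside the box.
        ann.mask = segment(image, bbox)
        # 5. Quality filter: flag sightings too poor (occluded, blurred,
        #    wrong viewpoint) to be worth sending to ID curation.
        ann.comparable = assess_quality(image, ann)
        annotations.append(ann)

    return [a for a in annotations if a.comparable]
```

In this reading, each stage only sees what the previous stage passed through, so cheap whole-image filtering happens before the more expensive per-annotation steps; that ordering is an assumption consistent with, but not stated verbatim in, the abstract.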
Description
December 2021; Dept. of Computer Science
Department
Dept. of Computer Science
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection
Access
CC BY-NC-ND. Users may download and share copies with attribution in accordance with a Creative Commons Attribution-NonCommercial-No Derivative Works 3.0 license. No commercial use or derivatives are permitted without the explicit approval of the author.