Among amphibians, adults have traditionally been identified in capture-mark-recapture studies using invasive marking techniques that carry ethical, cost and logistical concerns. However, species in this group may be strong candidates for photo-identification based on natural skin features, a technique that removes many of these concerns and opens up opportunities for citizen scientists to be involved in animal monitoring programs. We investigated the feasibility of using citizen science to distinguish between individuals of an Australian anuran, the sandpaper frog (Lechriodus fletcheri), based on a visual analysis of their natural skin features. We collected photographs of marked individuals in the field over three breeding seasons using a smartphone. This photo-database was used to create an online survey to determine how easily members of the general public could photo-match individuals by comparing two facial skin features: the black banding that runs horizontally above the tympanum and the background array of tubercles present in this region. Survey participants were given 30 closed multiple-choice questions in which they were asked to match separate images of a query frog within small pools of candidate matches. Participants consistently matched individuals with a low matching error rate (mean ± SD of 26 ± 5), despite the relatively low quality of photographs taken with a smartphone in the field, and most query frogs were matched by a majority of participants (mean ± SD of 86.02 ± 9.52%). These features were found to be unique and stable among adult males and females. Thus, photo-identification is likely to be a valid, non-invasive capture-mark-recapture method for L. fletcheri, and likely for many anurans that display similar facial skin features.
This may become an important alternative to artificial marking techniques, with the challenges of manual photo-matching reduced by spreading workloads among members of the public who can be recruited online.
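The two survey metrics summarised above (a per-participant matching error rate and the share of query frogs matched by a majority of participants) can be sketched in a few lines. This is a minimal illustration with made-up responses, not the study's actual data or analysis code; the response matrix and its values are hypothetical.

```python
import statistics

# Hypothetical survey results: rows = participants, columns = the
# photo-matching questions; True = participant chose the correct match.
responses = [
    [True, True, False, True, True, True],    # participant 1
    [True, False, True, True, False, True],   # participant 2
    [True, True, True, True, False, True],    # participant 3
]

# Per-participant error rate (%): share of questions answered incorrectly.
error_rates = [
    100 * sum(not ok for ok in row) / len(row) for row in responses
]
mean_err = statistics.mean(error_rates)   # reported as mean ± SD
sd_err = statistics.stdev(error_rates)

# Per-question (query frog) agreement: a frog counts as "matched" when
# more than half of the participants identified it correctly.
n_participants = len(responses)
majority_matched = [
    sum(row[q] for row in responses) > n_participants / 2
    for q in range(len(responses[0]))
]
pct_majority = 100 * sum(majority_matched) / len(majority_matched)
```

With this toy matrix, `mean_err` is about 22.22% and `pct_majority` is about 83.33%; the study's reported figures would come from the real 30-question response data.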
Number of pages: 16
Publication status: Published - 9 Apr 2021