
Evaluating Project Participation Through Zooniverse

The following piece was written by OAS Communications Coordinator AnnaKathryn Kruger for the Snapshot Wisconsin newsletter. To subscribe to the newsletter, visit this link.

One of the easiest ways to participate in Snapshot Wisconsin is by classifying photos through a website called Zooniverse. Zooniverse is a crowdsourcing service that is accessible to anyone, anywhere, and the site has hosted Snapshot Wisconsin since 2016. Snapshot Wisconsin’s most prolific Zooniverse volunteer has contributed over 65,000 classifications to the project’s dataset. To date, 1.9 million trail camera photos have been processed through Zooniverse, and more than 7,500 different individuals have registered to participate.

Zooniverse volunteers play a pivotal role in Snapshot Wisconsin. Analyzing volunteer participation gives staff a better idea of how to effectively engage volunteers and can also offer researchers a look at how patterns in participation relate to the overall quality of the data acquired from the platform.

To quantitatively assess volunteer participation in Snapshot Wisconsin through Zooniverse, researchers conducted a Latent Profile Analysis (LPA) of our volunteers. LPA organizes a sample of people into groups based on observable variables, such as user activity over time. Through this, researchers were able to ascertain how many distinct groups of people exist in the sample, which individuals belong to which group, and what characteristics are unique to each group. This allowed researchers to home in on specific patterns in user engagement.

Researchers identified measurable variables unique to each volunteer and their activity on Zooniverse between November 2017 and February 2019. These included the number of days each volunteer was active, the time elapsed between active days, and the amount of time volunteers spent on the site on active days. From this, researchers parsed volunteers into three profiles: temporary, intermittent, and persistent.
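The grouping step can be sketched in a few lines: LPA is closely related to Gaussian mixture modeling, so a mixture model fit to per-volunteer activity features can illustrate how volunteers fall into latent profiles. Everything below (the feature values, group sizes, and the use of scikit-learn's GaussianMixture) is an illustrative assumption, not the study's actual data or code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic per-volunteer features (hypothetical values, for illustration):
# [active days, mean gap between active days, mean minutes per active day]
temporary    = rng.normal([10, 2, 60],  [3, 1, 15], size=(50, 3))
intermittent = rng.normal([25, 20, 30], [5, 5, 10], size=(50, 3))
persistent   = rng.normal([200, 2, 45], [20, 1, 10], size=(50, 3))
X = np.vstack([temporary, intermittent, persistent])

# A three-component Gaussian mixture plays the role of the three latent profiles
gm = GaussianMixture(n_components=3, random_state=0).fit(X)
profiles = gm.predict(X)  # profile assignment for each volunteer
```

In practice the number of profiles is not fixed in advance; analysts fit models with different numbers of components and compare them with information criteria such as BIC.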

Volunteer Groups

Profiles of Snapshot Wisconsin volunteer participation on Zooniverse

Temporary volunteers are those who exhibited rigorous participation, but only for a short period of time. Intermittent volunteers are characterized by significant stretches of time elapsing between a relatively small number of active days. Persistent volunteers are those who demonstrated high levels of activity across the entire period examined.

Measures of accuracy specific to each group revealed that temporary volunteers demonstrated lower accuracy in their classifications than intermittent volunteers. Though intermittent volunteers tended to let more time pass between active days, their consistent practice ultimately made their classifications more accurate.

Here we may turn to an old adage: practice makes perfect. It comes as no surprise that practice and accuracy are correlated, and that volunteers become better at classifying photos with more time spent doing so. In the graphic on the right, all four photos are of porcupines, though they vary in how difficult they are to classify. Classifying photos like these may be tricky at first, but over time certain characteristics begin to stand out more readily; a porcupine may be identified by its lumbering gait, or by the way its quills appear from different angles and in different light. The more frequently one sees these traits, the easier they become to identify. Volunteers who participate at any level, whether temporary, intermittent, or persistent, are of great value to the project, and the more time spent on Zooniverse, the more likely it is that the classifications assigned to each photo are accurate.

Porcupine

Citizen science is an integral part of the Snapshot Wisconsin project, and is in fact core to its mission, which is to rally the knowledge and resources of citizens across Wisconsin and throughout the world to build a comprehensive and highly accurate portrait of Wisconsin wildlife. No two Zooniverse volunteers are quite the same, and each individual informs our understanding of how citizen science can be utilized effectively in research. No matter how one chooses to participate, participation alone brings us closer to our goal.

January #SuperSnap

This month’s #SuperSnap features a mother black bear (Ursus americanus) and her cub from Marathon County. Black bear cubs are born in mid-January with an average litter size of three to four cubs. However, litters of as many as six cubs have been reported, certainly enough to keep mom on her toes!

A huge thanks to Zooniverse and Snapshot WI volunteer Swamp-eye for the #SuperSnap nomination!


Continue classifying photos on Zooniverse and hashtagging your favorites for a chance to be featured in the next #SuperSnap blog post. Check out all of the nominations by searching “#SuperSnap” on the Snapshot Wisconsin Talk boards.

December #SuperSnap

This month’s #SuperSnap features six sandhill cranes (Grus canadensis) from Waupaca County. Did you know there are many different titles for a group of cranes? This group could be referred to as a “dance,” a “construction,” or a “swoop”. Sandhill cranes migrate across our state every year and can often be spotted in open prairies and marshes.

Thank you to Zooniverse volunteer Swamp-eye for nominating this photo!



November #SuperSnap

This month’s #SuperSnap features a tom turkey dressed up in his best bow-tie wattle and ready for Thanksgiving dinner! Turkeys use their wattles for a variety of reasons. This loose skin around their neck allows them to expel extra heat during the hot summer months. Male turkeys (toms) also use their wattle to attract female turkeys (hens) when blood rushes to the area, causing the wattle to turn a bright red color. If a turkey is frightened, blood may also rush out of their wattle, causing it to turn blue.

A tom turkey

Thank you to all our Zooniverse volunteers for nominating their #SuperSnaps. Continue classifying photos on Zooniverse and hashtagging your favorites for a chance to be featured in the next #SuperSnap blog post. Check out all of the nominations by searching “#SuperSnap” on the Snapshot Wisconsin Talk boards.

Un-deer the Weather

Snapshot Wisconsin cameras capture tons of deer throughout the year. In fact, deer account for nearly two-thirds of the wildlife captured on Snapshot Wisconsin trail cameras. Since there are so many photos of deer taken, we see some deer that look like they might be hurt or have a disease. Here are a few examples of deer who are looking a little under the weather and what might be ailing them:

Swollen chest

Nancy (known by her Zooniverse handle @NBus) is a wildlife health expert here at the Wisconsin DNR who let us know that a swollen chest like this is not unusual in deer. Nancy shared the following response to this image, “It is likely either an abscess (pus-filled) from a penetrating wound that carried bacteria under the skin or a seroma (serum-filled; serum is the non-cellular portion of the blood, not the red and white cells) from a blunt trauma to the chest. The chest is a common part of the body for deer to injure as they run and impact something. And gravity then allows the accumulated pus or serum to gather in a bulge on the lower chest. In either case, the body will likely be able to resolve it and the deer will be fine.”

Warts

Another example that shows up semi-frequently is warts. Like many mammals, deer are susceptible to warts caused by a virus. These growths, called cutaneous fibromas, are caused by a papillomavirus. Usually the deer’s immune system can keep the warts in check or get rid of them, but if the warts appear in areas that obstruct the deer’s ability to eat, they can become a larger issue.

Thin and scraggly

Finally, we will touch on thin or scraggly-looking deer. Especially in the spring, deer can start looking very skinny and ragged. The deer above is shedding its winter coat and is probably a little thin, since this photo was taken in the middle of May in Wisconsin, when food can be hard to come by. However, this is not outside the norm for deer at this time of year. We are used to picturing plump deer, but in reality their appearance can vary greatly based on time of year and food availability.

If you want to see more examples of common deer health issues, please visit our previous deer health blog post, “Is this deer sick?” from February 2018. Learn more about Wisconsin wildlife health by visiting this DNR link.

August #SuperSnap

This month’s #SuperSnap features a pair of wood ducks from Richland County! Their colorful heads make them stand out against the early spring growth in this vernal pool. The wood duck (Aix sponsa) has no close relatives in North America (Audubon), making it a unique bird that prefers the shaded waters of woodland areas.


Thank you Zooniverse volunteers Kjreynolds1957 and Nsykora for nominating these birds. Continue classifying photos on Zooniverse and hashtagging your favorites for a chance to be featured in the next #SuperSnap blog post. Check out all of the nominations by searching “#SuperSnap” on the Snapshot Wisconsin Talk boards.

July #SuperSnap

This month’s #SuperSnap features a coyote (Canis latrans) as it approaches a Snapshot Wisconsin camera deployed in Racine County. Snapshot Wisconsin recently surpassed 30 million trail camera images – staff members and volunteers alike are consistently amazed by some of the images coming out of the project. Thank you to Zooniverse volunteers WINature and Swamp-eye for nominating this series!



June #SuperSnap

This month’s #SuperSnap features a mink from Waupaca County, stepping into ice-cold water. Mink are amazing swimmers and divers. Even in the winter, you ask? Yes, thanks to insulation from a thick underfur and oily hair, mink maintain their aquatic lifestyle year-round, although less so when it’s cold.

Thanks @Tjper for nominating this sequence!



May #SuperSnap

This month’s #SuperSnap features one of the best-quality wolf photos ever captured on a Snapshot camera. Thanks to @crazylikeafox and @smuerett for bringing attention to this one from Waupaca County! The Wisconsin DNR, along with other organizations, has monitored wolf populations in numerous ways, including with a network of volunteers who conduct winter tracking surveys. If you want to learn more about wolves in general, visit our wolf fact sheet.



May Science Update: Maintaining Quality in “Big Data”

Snapshot Wisconsin relies on different sources to help classify our growing dataset of more than 27 million photos, including our trail camera hosts, Zooniverse volunteers and experts at Wisconsin DNR. With all these different sources, we need ways to assess the quality and accuracy of the data before it’s put into the hands of decision makers.

A recent publication in Ecological Applications by Clare et al. (2019) looked at the issue of maintaining quality in “big data” by examining Snapshot Wisconsin images. The information from the study was used to develop a model that will help us predict which photos are most likely to contain classification errors. Because Snapshot-specific data were used in this study, we can now use these findings to decide which data to accept as final and which images are best sent through expert review.

Perhaps most importantly, this framework allows us to be transparent with data users by providing specific metrics on the accuracy of our dataset. These confidence measures can be considered when using the data as input for models, when choosing research questions, and when interpreting the data for use in management decision making.

False-positive, false-negative

The study examined nearly 20,000 images classified on the crowdsourcing platform Zooniverse. Classifications for each species were analyzed to identify the false-negative error probability (the likelihood that a species is recorded as absent when it is present) and the false-positive error probability (the likelihood that a species is recorded as present when it is not).


Figure 2 from Clare et al. 2019 – false-negative and false-positive probabilities by species, estimated from expert classification of the dataset. Whiskers represent 95% confidence intervals and the gray shading in the right panel represents the approximate probability required to produce a dataset with less than 5% error.

The authors found that classifications were 93% correct overall, but the rate of accuracy varied widely by species. This has major implications for wildlife management, where data are analyzed and decisions are made on a species-by-species basis. The graphs below show how variable the false-positive and false-negative probabilities were for each species, with the whiskers representing 95% confidence intervals.

Errors by species

We can conclude from these graphs that each species has a different set of considerations regarding these two errors. For example, deer and turkeys both have low false-negative and false-positive error rates, meaning that classifiers are good at correctly identifying these species and few are missed. Elk photos do not exhibit the same trends.

When a classifier identifies an elk in a photo, it is almost always an elk, but there are a fair number of photos of elk that are classified as some other species. For blank photos, the errors go in the opposite direction: if a photo is classified as blank, there is a ~25% probability that there is an animal in the photo, but there are very few blank photos that are incorrectly classified as having an animal in them.
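The two error directions above can be made concrete by comparing crowd classifications against expert review of the same photos. The labels and the `error_rates` helper below are hypothetical, a minimal sketch rather than the study’s actual method:

```python
# Hypothetical crowd vs. expert labels for a handful of photos (illustration only)
crowd  = ["deer", "blank", "elk", "deer", "blank", "turkey", "deer"]
expert = ["deer", "deer",  "elk", "deer", "blank", "turkey", "bear"]

def error_rates(species, crowd, expert):
    """False-negative: expert saw the species but the crowd did not.
       False-positive: the crowd reported the species but the expert did not."""
    fn = sum(e == species and c != species for c, e in zip(crowd, expert))
    fp = sum(c == species and e != species for c, e in zip(crowd, expert))
    n_expert = sum(e == species for e in expert)  # photos that truly contain it
    n_crowd = sum(c == species for c in crowd)    # photos the crowd labeled with it
    return (fn / n_expert if n_expert else 0.0,
            fp / n_crowd if n_crowd else 0.0)

fn_rate, fp_rate = error_rates("deer", crowd, expert)
```

Computed per species over the full dataset, these two rates yield the kind of species-by-species error profile the study reports.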

Assessing species classifications with these two types of errors in mind helps us understand what we need to consider when determining final classifications of the data and its use for wildlife decision support.

Model success

When tested, the model successfully identified 97% of misclassified images. Factors considered in developing the model included differences in camera placement between sites, the way in which Zooniverse users interacted with the images, and more.

In general, the higher the proportion of users that agreed on the identity of the animal in the image, the greater the likelihood it was correct. Even seasonality was useful in evaluating accuracy for some species – snowshoe hares were found to be easily confused with cottontail rabbits in the summertime, when they both sport brown pelage.
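The agreement signal described above can be sketched in a few lines: for each image, compute the proportion of users whose vote matches the plurality label. The `agreement` helper and vote sets below are hypothetical illustrations, not the published model:

```python
from collections import Counter

def agreement(votes):
    """Return the plurality label and the share of users who chose it."""
    label, n = Counter(votes).most_common(1)[0]
    return label, n / len(votes)

# Hypothetical vote sets: a near-unanimous image vs. a contested one
label1, p1 = agreement(["deer"] * 9 + ["elk"])
label2, p2 = agreement(["snowshoe hare"] * 3 + ["cottontail"] * 2)
```

A full error-prediction model would combine this proportion with covariates such as camera site and season; the point here is only that low agreement flags an image as a candidate for expert review.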


Not only does the information derived from this study have major implications for Snapshot Wisconsin, but the framework it presents for determining and remediating data quality can also benefit a broad range of big-data projects.