Blog Post
by Amy Freitag / Virginia Sea Grant, NOAA Chesapeake Bay Office

Good Science and Bad Science in Democratized Science

January 22, 2014

A common complaint about the increasing reliance on citizen science to understand our changing environment is that the information collected will not be as rigorous as professional data, and that the data might be used incorrectly to support opinions rather than scientific conclusions. In our investigation of incorporating citizen science into marine protected area monitoring, this concern is quite valid: marine protected areas are politically charged, and general scientific understanding of the ocean is still in its infancy. However, these concerns are not unique to citizen science - and what's more, citizen science may be more scrutinized about rigor despite equivalent data quality.

(photo: the arsenic life story proves that professional scientists make mistakes too, from dimland)

The point at which science is judged by peers and the public is usually when results are summarized and publicized through scientific literature and media. There are two points at which citizen science groups might struggle more than professional scientists in meeting their judges' expectations: first, in how the raw data are summarized and presented as 'results', and second, in what conclusions are drawn from those data. This is mostly because not all citizen science groups have a trained statistician available to produce these documents. But as every professional scientist can attest, these steps are difficult even for someone with appropriate training.

For most citizen scientists, volunteer time goes toward data collection - they are trained, handed a data sheet, and told where to go. Protocol development and verification is an important part of ensuring rigor here, and for many citizen science groups it is a step performed by professional scientists. In more well-coordinated research areas, government agencies offer standardized protocols that can be deployed across a wide spatial scale and handle at least some of the data management. For example, the Environmental Protection Agency hosts water quality protocols and the resulting data. Trust in a protocol can also be eased by technology that is verified by the manufacturer and then deployed by volunteers - somewhat "idiot-proof" equipment. Smartphones in particular aid protocol development by offering relatively easy, trusted ways to take geographic coordinates, photographs, and even spectroscopy readings. There are, therefore, a number of ways citizen science groups can verify their methods. While this requires great effort, it's not quite as gray an area in the rigor judges' eyes.
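As a thought experiment, here is a minimal Python sketch of what one standardized, smartphone-friendly observation record might look like; every field name and check below is hypothetical, not drawn from any particular program's protocol.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical record for one smartphone-collected observation; the field
# names and checks are illustrative, not from any real program's protocol.
@dataclass
class Observation:
    volunteer_id: str        # who collected the record
    timestamp: datetime      # device clock, usually network-synchronized
    latitude: float          # from the phone's GPS fix
    longitude: float
    species_code: str        # chosen from a controlled vocabulary
    count: int               # number of individuals observed
    photo_uri: Optional[str] = None  # optional photo for later verification

    def is_plausible(self) -> bool:
        """Basic range checks a coordinator might run before accepting data."""
        return (-90.0 <= self.latitude <= 90.0
                and -180.0 <= self.longitude <= 180.0
                and self.count >= 0)
```

A structure like this does quietly what a good paper data sheet does: it constrains what volunteers can record, so much of the verification burden is handled before a human ever reviews the data.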

The point when data are all collected and entered into a computer is the first point where authoritative decisions must be made. For instance, if your group is counting marine debris on a beach, how do you count objects that have persisted on the beach since the last count? If counting human activity on a beach, do you count yourself? What time periods do you use to break up the data for comparison - weekly, monthly, annually? How do you account for foggy days, when visibility is low, while counting birds? These are all questions encountered by citizen science groups in the Central Coast. Answering them largely depends on what scientific questions drive the program, and ultimate responsibility for the decisions falls on the program coordinator's shoulders. Answers to these sorts of questions are often whittled out of professional science programs by reviewers once the resulting article goes for peer review. Not all citizen science programs publish in peer-reviewed journals, so they need some other venue for bouncing ideas off others about how best to tackle the data. In the end, every person in charge of data must tackle those questions - and as humans, they are likely to make good calls and bad calls. It's up to the scientific community as a whole to fix the bad ones.
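To make this concrete, here is a small Python sketch (with invented file and column names) of how a couple of those judgment calls - excluding foggy days, choosing a monthly comparison period - can be made explicit in code where others can review and challenge them.

```python
import pandas as pd

# A minimal sketch of how a coordinator's judgment calls can be written
# down as explicit, reviewable steps. The file and column names
# ('survey_counts.csv', 'date', 'bird_count', 'visibility_km') are
# invented for illustration.
df = pd.read_csv("survey_counts.csv", parse_dates=["date"])

# Decision 1: exclude foggy days rather than record artificially low counts.
MIN_VISIBILITY_KM = 1.0
clear_days = df[df["visibility_km"] >= MIN_VISIBILITY_KM]

# Decision 2: compare monthly totals; grouping weekly or annually instead
# would change which trends the summary can show.
monthly = clear_days.groupby(clear_days["date"].dt.to_period("M"))["bird_count"].sum()
print(monthly)
```

Writing the decisions down this way serves much the same purpose as peer review: the calls are visible, so the community can catch and fix the bad ones.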

Providing raw data is the safest way to gain trust, because people can look at the raw data, analyze it for their own purposes, and draw their own conclusions. But since not everyone has the time or skills to do so, every scientist - professional or not - must summarize their data in a way that makes sense to their audience. It's even better if the summary includes a paragraph on why the research is important and what impact the knowledge will have on management, education, or basic scientific understanding. Citizen science groups, as well as many industry scientists, are scrutinized at this step because of fears that they are writing their opinions or agenda into the summary in addition to pure conclusions from the data. To a certain degree, it's human nature to look for trends or differences, and mistakes are made in all flavors of science. Think back to when you first learned how to read a graph - and a classic book says it all in its title: "How to Lie With Statistics". Bottom line: it's really easy to lie unintentionally, and readers need a basic understanding of statistics to evaluate your conclusions. Approaching any claim with a grain of salt is smart practice.
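As a toy illustration of how an accurate number can still mislead, consider summarizing the same hypothetical week of debris counts with a mean versus a median:

```python
from statistics import mean, median

# Toy illustration: the same hypothetical daily debris counts summarized
# two ways. One post-storm spike dominates the mean but not the median.
daily_debris_counts = [3, 2, 4, 3, 2, 5, 120]  # last value: storm-day spike

print(f"mean:   {mean(daily_debris_counts):.1f}")   # ~19.9 items per day
print(f"median: {median(daily_debris_counts):.1f}") # 3.0 items per day
```

Both numbers are "true", but they tell very different stories about a typical day on the beach - which is exactly why readers need that basic statistical footing.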

In the end, one needs to pay attention to assumptions when evaluating science. Citizen science has to fight the assumption that its data are not rigorous, while professional science can rest on the societal assumption that its data are trustworthy and authoritative. These assumptions are constructed partly through the institutions around science that lend authority to professionals, like peer-reviewed journals, without similar options for citizen science groups. If such institutions reached out to all types of science, it would be easier to navigate the evaluation of scientific claims. Fundamentally, healthy skepticism combined with open-mindedness about what science looks like will help find the best information out there.
