Publication:
Assessing the Effectiveness of Citizen Science: A Case Study of Publications Produced by Earthwatch Projects

Date

2017-12-15

Abstract

Worldwide, decision makers and non-governmental organizations are increasing their use of citizen science-based projects to improve how they manage natural resources, track species at risk, and conserve protected areas (Conrad, 2009). Engaging the public provides access to a large labor force, enabling scientists to collect data on unprecedented spatial and temporal scales. Despite its benefits, citizen science (CS) is frequently stigmatized, and many in the scientific community believe that data collected by volunteers are of lower quality than those collected by experts. There is a strong need to assess citizen science on the merits of its research outcomes, as well as to explore the variables that may be associated with the production of quality data.

This thesis builds upon the findings of a 2016 publication that used Earthwatch, an environmental non-profit that has run field-based citizen science research expeditions since 1971, as a case study. That publication evaluated the quality of Earthwatch-supported citizen science data based on each project's ability to produce peer-reviewed publications and contribute to conservation policies (Chandler et al., 2016), measuring publication and policy outputs for 62 research projects that partnered with Earthwatch between 2008 and 2014. The analysis examined seven independent variables, but it did not evaluate the sustainability of funding, which is essential for a project to continue its operations on the ground. Nor did it consider the implementation of quality assurance and quality control (QAQC) measures, which are important for mitigating volunteer error in data collection.

This thesis addresses these gaps by augmenting the 2008-2014 dataset, adding the number of peer-reviewed publications produced by most of those same projects over an expanded 2000-2016 window. It also considers two new independent variables: 1) a measure of whether projects are meeting their required annual budgets to cover core costs, and 2) a scoring system quantifying the QAQC measures in place for each project. The results of this study can inform Earthwatch's decision-making for project selection and help bolster research outputs for existing projects. More importantly, the results can also inform other organizations and agencies that support research through citizen science, providing insight into whether funding and QAQC measures play a significant role in helping citizen science projects produce tangible contributions to science.

Overall, the study found a negative correlation between a project's assurance-of-data-quality score and the total number of peer-reviewed publications produced by the principal investigators supported by Earthwatch. Contrary to my expectations, there was also a negative relationship between a project's QAQC score and the mean number of publications produced annually. A stepwise Akaike Information Criterion (AIC) procedure was also performed to select the best-fitting model. The results indicated that the ratio of funds provided to a project's approved budget best explained the pattern of mean publications produced annually, and that the affiliation of the principal investigator best explained the pattern of projects' mean MoS score (a scaled metric of publication outputs).
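The abstract refers to stepwise AIC model selection. Below is a minimal sketch of how a forward stepwise AIC comparison of candidate predictors might look in Python with statsmodels; the column names ("mean_pubs_per_year", "funding_ratio", "qaqc_score", "pi_affiliation") and the forward (rather than backward or bidirectional) direction are assumptions for illustration only, since the thesis's exact procedure and software are not stated here.

```python
# Sketch of forward stepwise AIC model selection, assuming a pandas
# DataFrame with one row per project. Column names are hypothetical
# placeholders, not the thesis's actual variable names.
import statsmodels.formula.api as smf


def forward_aic(df, response, candidates):
    """Greedily add the candidate predictor that lowers AIC the most."""
    selected = []
    remaining = list(candidates)
    # Intercept-only model is the baseline to beat.
    best_aic = smf.ols(f"{response} ~ 1", data=df).fit().aic
    while remaining:
        # AIC of each model formed by adding one more candidate predictor.
        trials = sorted(
            (smf.ols(f"{response} ~ {' + '.join(selected + [c])}", data=df).fit().aic, c)
            for c in remaining
        )
        if trials[0][0] >= best_aic:
            break  # no remaining candidate improves the fit; stop
        best_aic, chosen = trials[0]
        selected.append(chosen)
        remaining.remove(chosen)
    return selected, best_aic


# Hypothetical usage:
# predictors, aic = forward_aic(projects_df, "mean_pubs_per_year",
#                               ["funding_ratio", "qaqc_score", "pi_affiliation"])
```

In a greedy forward search like this, each step keeps only the single predictor that most reduces AIC, so the final model balances fit against the number of parameters; categorical predictors such as a PI-affiliation column are expanded into dummy variables automatically by the formula interface.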

Keywords

Environmental Sciences

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.
