Gateway to Think Tanks
Source Type | Article |
Document Type | Other |
DOI | 10.1371/journal.pone.0069958 |
Title | Comparing the quality of crowdsourced data contributed by expert and non-experts |
Authors | See L; Comber A; Salk CF; Fritz S; van der Velde M; Perger C; Schill C; McCallum I |
Publication Date | 2013 |
Source | PLoS ONE 8 (7): e69958 |
Publication Year | 2013 |
Language | English |
Abstract | There is currently a lack of in-situ environmental data for the calibration and validation of remotely sensed products and for the development and verification of models. Crowdsourcing is increasingly being seen as one potentially powerful way of increasing the supply of in-situ data, but there are a number of concerns over the subsequent use of the data, in particular over data quality. This paper examined crowdsourced data from the Geo-Wiki crowdsourcing tool for land cover validation to determine whether there were significant differences in quality between the answers provided by experts and non-experts in the domain of remote sensing, and therefore the extent to which crowdsourced data describing human impact and land cover can be used in further scientific research. The results showed that there was little difference between experts and non-experts in identifying human impact, although results varied by land cover, while experts were better than non-experts in identifying the land cover type. This suggests the need to create training materials with more examples in those areas where difficulties in identification were encountered, and to offer some method for contributors to reflect on the information they contribute, perhaps by feeding back the evaluations of their contributed data or by making additional training materials available. Accuracies were also found to be higher when the volunteers were more consistent in their responses at a given location and when they indicated higher confidence, which suggests that these additional pieces of information could be used in the development of robust measures of quality in the future. |
Subjects | Ecosystems Services and Management (ESM); Postdoctoral Scholars (PDS) |
URL | http://pure.iiasa.ac.at/id/eprint/10370/ |
Source Think Tank | International Institute for Applied Systems Analysis (Austria) |
Citation Statistics | |
Resource Type | Think Tank Publication |
Item Identifier | http://119.78.100.153/handle/2XGU8XDN/129595 |
Recommended Citation (GB/T 7714) | See L, Comber A, Salk CF, et al. Comparing the quality of crowdsourced data contributed by expert and non-experts. 2013. |
Files in This Item |
File Name/Size | Resource Type | Version Type | Access Type | License |
Comparing%20the%20qu(2407KB) | Think Tank Publication | | Restricted Access | CC BY-NC-SA |
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.