FAIR Computational Workflows

Abstract:

Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products. They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance. These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right. This paper argues that FAIR principles for workflows need to address their specific nature in terms of their composition of executable software steps, their provenance, and their development.
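To illustrate the idea that workflows can "create metadata themselves during the processing of data", the following is a minimal, hypothetical Python sketch (not taken from the paper): a two-step workflow whose runner records a simple provenance entry for each step. All names here (run_step, clean, summarise) are invented for illustration; real workflow systems express steps declaratively and capture far richer provenance.

    # Hypothetical sketch: a two-step workflow that records provenance
    # metadata (timestamps and input/output checksums) as it runs.
    import datetime
    import hashlib
    import json

    def run_step(name, func, inputs, provenance):
        """Run one workflow step and append a provenance record for it."""
        started = datetime.datetime.now(datetime.timezone.utc).isoformat()
        output = func(inputs)
        provenance.append({
            "step": name,
            "started": started,
            "ended": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "input_sha256": hashlib.sha256(repr(inputs).encode()).hexdigest(),
            "output_sha256": hashlib.sha256(repr(output).encode()).hexdigest(),
        })
        return output

    def clean(data):
        # Data preparation step: drop missing values.
        return [x for x in data if x is not None]

    def summarise(data):
        # Analytics step: compute a simple mean.
        return sum(data) / len(data)

    provenance = []                      # provenance log built up during the run
    raw = [1.0, None, 2.0, 3.0]          # example input data
    prepared = run_step("clean", clean, raw, provenance)
    result = run_step("summarise", summarise, prepared, provenance)
    print(result)
    print(json.dumps(provenance, indent=2))

The resulting provenance log is itself a small metadata record of how the output was produced, which is the property the abstract highlights as aiding data quality assessment and reuse.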

SEEK ID: https://testing.sysmo-db.org/publications/77

DOI: 10.1162/dint_a_00033

Projects: Xiaoming Test

Publication type: Journal

Journal: Data Intelligence

Publisher: MIT Press - Journals

Citation: Data Intelligence, 2(1-2):108-121

Date Published: 2020

URL:

Registered Mode: manually

Authors: Carole Goble, Sarah Cohen-Boulakia, Stian Soiland-Reyes, Daniel Garijo, Yolanda Gil, Michael R. Crusoe, Kristian Peters, Daniel Schober

Citation
Goble, C., Cohen-Boulakia, S., Soiland-Reyes, S., Garijo, D., Gil, Y., Crusoe, M. R., Peters, K., & Schober, D. (2020). FAIR Computational Workflows. In Data Intelligence (Vol. 2, Issues 1-2, pp. 108–121). MIT Press - Journals. https://doi.org/10.1162/dint_a_00033
Activity

Views: 568

Created: 2nd Dec 2021 at 13:44

Last updated: 11th Mar 2024 at 18:14

Tags

This item has not yet been tagged.

Attributions

None
