"WP No.";5;"Lead beneficiary";ARC;"Start month";9;"End month";34 "WP title";"Development of tools and practices for communities" "ObjectivesThis work package will develop, extend and/or adapt practical reproducibility-related tools (incl. software, checklists, standards, workflows, monitoring dashboards & policy creation guides) for funders, publishers, & researchers, to be used in the WP4 pilot activities. An agile, continuous, co-creation design & development approach (in constant dialogue with Tasks 4.2-4.4) will be used to ensure that the respective needs & requirements of pilot users are met.NOTE: The tools earmarked for development here are selected on the basis of the priorities identified in our scoping work for this proposal. Given our co-creative approaches, however, the exact direction of development will be oriented to user-needs discovered through the project implementation. We recognise the list of activities below is highly ambitious. However, we point out that TIER2 development partners have been selected for the tools & initiatives they already bring to the project (many already in production – see section 1.2.2.2), whose effective integration will deliver results at scale & cost." "Description of workTask 5.1 Practical tools and practices for researchers (M9-M34; ARC [lead], KNOW, GESIS, OpenAIRE, UOXF)This task implements, customises, & deploys practical tools & practices for researchers that promote & facilitate reproducibility of scientific results during planning, data collection & results analysis phases of the research lifecycle, based on the scoping & co-creation activities in WPs3/4. Planned activities (to be further refined through the WP4 co-creation design processes) are:5.1.1) Reproducibility Checklist: Building upon the scoping work from WP3 we will create an interactive Reproducibility Checklist to guide researchers in best reproducibility practices for their epistemic context. Hosted via the Reproducibility Hub (WP2), the Checklist will also include well-established, discipline specific & domain agnostic recommendations & standards to report digital objects, listed in & tracked in FAIRsharing via community contributions.5.1.2) Extending DMPs to RMPs: Effective Data Management Plans (DMPs) are already a very important element of ensuring reproducible workflows, & actionable DMP tools are increasing this usefulness, but TIER2 believes the potential is even greater. This activity will extend DMPs into a new concept (“Reproducibility Management Plans” - RMPs), that will include additional reproducibility-related metadata for the reported research. To this end, Argos machine-actionable DMP tool will be extended to enable qualified references*15 across research resources & outputs for the whole lifecycle, connecting to FAIRsharing for the community standards, assisting researchers to self-assess & ensure the reproducibility of their research.5.1.3) Reproducible Workflows: We will next leverage software containerisation technologies, workflow description languages (e.g., CWL), & experiment packaging specifications (e.g. RO-crate) to create a detailed methodology to ensure the reproducibility of computational workflows in different epistemic contexts. Our focus will be on adapting SCHeMa, an open-source tool of this type already successfully used within life sciences, to investigate its use in different epistemic contexts (e.g., extending to social sciences for survey data & computer science for Machine Learning). 
Task 5.2 Practical tools and practices for publishers (M9-M34; UOXF [lead], KNOW, ARC, GESIS, OpenAIRE)
This task will involve all activities for the implementation, customisation, validation & deployment of the set of practical tools & practices that promote, facilitate, monitor & assess reproducibility of research outputs, taking into consideration the special needs & requirements of publishers.
5.2.1) Workflows to review research datasets & code: Various mechanisms have been proposed as workflows for checking data (Wilkinson et al. 2019) & code (Nüst and Eglen 2021), while existing publishing schemes & venues (e.g., Octopus, F1000) have processes in place for advanced screening and/or review of datasets & software regarding reproducibility. To further this work, TIER2 will develop streamlined workflows for the review of data & code at manuscript submission time, including scoping of a common system of ‘stamps’ or validity marks to indicate that work has been checked, validated and/or reused (see the illustrative sketch below). We also plan to exploit ARC’s Reproducibility Assessment Toolkit (ROAL) to spot & highlight research artefacts (e.g., datasets, software, models) stated in the manuscript, indicating potential missing metadata elements that should accompany & enrich research output descriptions.
5.2.2) Threaded publications: Incremental publishing models (e.g., Octopus, ResearchEquals), where work is published in smaller units (e.g., protocols, methods, data, code) & then linked (or “threaded”) together at the end to form a cohesive whole, are seen as potential routes to greater reproducibility: publication bias is avoided through early, chronological & continuous reporting of work, & greater scrutiny is enabled through early sharing of work (Hartgerink and van Zelst 2018). The concept essentially links published outputs arising at different stages of a research project or programme together to help ensure that insights & usable outputs are not missing from the research system, thus supporting reproducibility & minimising research waste. Greater uptake of such models amongst publishers requires development to establish shared technical standards for linking entities (i.e., research objects, as well as grants, funders, individuals), & to establish who should create & share this information & how, enabling the provenance of research artefacts. TIER2 will do this development work, including investigating potential protocols such as RO-Crate, Nanopublications, DocMaps & others, & implement pilot activities.
5.2.3) Registered reports/open publishing models for new contexts: From our scoping work (sec. 1.2.1.3), extension of publishing models which minimise publication bias & maximise transparency is a key priority for publishers. The TIER2 Associated Partner & its parent company Taylor & Francis have been key players in introducing such models. This task will further support two lines of action:
- extension of the F1000 transparent publishing model to new contexts – especially the recently launched Routledge Open Research (for the humanities/social sciences) & Open Research Europe (covering all Horizon research) – including definition of reporting standards/guidelines for qualitative research;
- extension of T&F’s workflow for registered reports to new disciplines (e.g., education) to investigate the efficacy (or not) of this model in new epistemic contexts. In addition, with leading conferences, we will also investigate the potential for registered reports in Computer Science publishing.
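As an illustration of the kind of submission-time check envisaged in 5.2.1, the sketch below flags research artefacts declared in a manuscript whose descriptions are missing metadata expected by the review workflow, and assigns a provisional validity mark. The artefact structure, required fields & ‘stamp’ labels are hypothetical placeholders, not the actual interface of ARC’s ROAL toolkit or an agreed publisher policy.

```python
from dataclasses import dataclass, field

# Metadata elements a review workflow might require per artefact type.
# These requirements are illustrative, not an agreed TIER2/publisher policy.
REQUIRED_FIELDS = {
    "dataset": {"identifier", "licence", "repository", "access_conditions"},
    "software": {"identifier", "licence", "repository", "version"},
}

@dataclass
class Artefact:
    kind: str                       # "dataset" or "software"
    name: str
    metadata: dict = field(default_factory=dict)

def check_submission(artefacts):
    """Return a per-artefact report of missing metadata and a provisional 'stamp'."""
    report = []
    for a in artefacts:
        missing = sorted(REQUIRED_FIELDS.get(a.kind, set()) - a.metadata.keys())
        stamp = "checked" if not missing else "needs-enrichment"
        report.append({"artefact": a.name, "kind": a.kind,
                       "missing": missing, "stamp": stamp})
    return report

if __name__ == "__main__":
    submission = [
        Artefact("dataset", "Survey responses (wave 1)",
                 {"identifier": "doi:10.1234/example", "licence": "CC-BY-4.0",
                  "repository": "Zenodo", "access_conditions": "open"}),
        Artefact("software", "analysis-v1.2",
                 {"repository": "github.com/example/analysis"}),
    ]
    for row in check_submission(submission):
        print(row)
```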
Task 5.3 Practical tools and practices for funders (M9-M34; ARC [lead], KNOW, VUmc, GESIS, OpenAIRE, UOXF)
This task will implement, customise & deploy practical tools & practices that enable & support funding institutions in prioritising & tracking reproducibility within their funded projects. Building upon existing production-ready tools from TIER2 partners, this task will develop:
5.3.1) Reproducibility Promotion Plans: TIER2 will produce practical advice for funders on how to create a plan to boost the reproducibility of their funded results. Modelled on the “Research Integrity Promotion Plan (RIPP)” created by VUmc/AU in SOPs4RI, we will work with funders & other stakeholders to develop a tool that assists research funding (& performing) organisations to create “Reproducibility Promotion Plans” (RePPs), including considerations for reforming incentives to foster reproducible practices. Each plan will describe how a specific funder will ensure, foster & promote reproducible research practices, avoid detrimental practices & handle misconduct, assisting funders to form their own RePPs that take disciplinary, organisational & national differences into account.
5.3.2) Funder extension of RMPs: This activity involves the extension of the machine-actionable RMP tools developed in activity 5.1.2 to offer evaluation & reporting functionalities for officers in funding organisations. Such tools will automate the identification & classification of research artefacts produced during a project & indicate potential enrichment of DMPs with relevant research output descriptions, leveraging the reporting standards in FAIRsharing to also assist in assessing levels of FAIRness.
5.3.3) Reproducibility monitoring dashboard: Building upon algorithms created by ARC for the DG-RTD study “Assessment of reproducibility of research results in EU Framework Programmes”, as well as the OpenAIRE Knowledge Graph & the FAIRsharing registry, this activity involves the development of tools that enable funding agencies to track & monitor the reusability of research artefacts (datasets, software, tools/systems, etc.) produced within projects of interest, across different programmes, topics & disciplines. An automatically generated report will facilitate assessment & quantification of the impact of policies for data-sharing, code-sharing, etc. (see the illustrative sketch following the milestones below). We will also scope requirements for making such a dashboard available to publishers.

Deliverables:
D5.1: Reproducibility toolset (tools & practices) for researchers (ARC, M34)
D5.2: Reproducibility toolset (tools & practices) for publishers (KNOW, M34)
D5.3: Reproducibility toolset (tools & practices) for funders (ARC, M34)
Milestones:
M5.1: TIER2 researcher reproducibility toolset first release (M22)
M5.2: TIER2 publisher reproducibility toolset first release (M22)
M5.3: TIER2 funder reproducibility toolset first release (M22)
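To indicate the kind of aggregate view the monitoring dashboard in 5.3.3 could provide, the sketch below computes simple sharing & reuse rates per funding programme from a flat list of project records. The record fields & metrics are illustrative placeholders; the actual dashboard would draw on the OpenAIRE Knowledge Graph & FAIRsharing rather than a hand-built list, and its indicators remain to be defined with funders.

```python
from collections import defaultdict

# Illustrative project records; in practice these would be harvested from the
# OpenAIRE Knowledge Graph & enriched with FAIRsharing standard information.
projects = [
    {"programme": "HE",    "shared_data": True,  "shared_code": True,  "artefact_reused": True},
    {"programme": "HE",    "shared_data": True,  "shared_code": False, "artefact_reused": False},
    {"programme": "H2020", "shared_data": False, "shared_code": False, "artefact_reused": False},
    {"programme": "H2020", "shared_data": True,  "shared_code": True,  "artefact_reused": False},
]

def programme_report(records):
    """Per-programme data-sharing, code-sharing & reuse rates (0..1)."""
    grouped = defaultdict(list)
    for r in records:
        grouped[r["programme"]].append(r)
    report = {}
    for programme, rows in grouped.items():
        n = len(rows)
        report[programme] = {
            "projects": n,
            "data_sharing_rate": sum(r["shared_data"] for r in rows) / n,
            "code_sharing_rate": sum(r["shared_code"] for r in rows) / n,
            "reuse_rate": sum(r["artefact_reused"] for r in rows) / n,
        }
    return report

if __name__ == "__main__":
    for programme, metrics in programme_report(projects).items():
        print(programme, metrics)
```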