Call for Artifact Evaluation
All authors of accepted papers at ISSRE 2023 are encouraged to submit artifacts for Artifact Evaluation (AE), a valuable opportunity to enhance the reproducibility and quality of your research. By submitting your artifacts, you not only contribute to the progress of our field but also have the chance to earn badges, displayed on your paper in the conference proceedings, that showcase the credibility and rigour of your work. Each submitted artifact will be reviewed by at least three members of the Artifact Evaluation Committee (AEC).
Before submitting your artifact, please read the information below. Should you have any questions or concerns, please reach out to the AEC Chairs.
What are Artifacts?
Artifacts are digital objects, such as code or data, that are either created by authors to support their research or generated during their experiments. The AE process encourages authors to make these artifacts openly available, fostering open-science practices and promoting research reproducibility. In the context of software reliability, artifacts can include tools, data (e.g., logs, raw data), or a combination of both. For example, an Artifact Evaluation package might consist of a runnable tool along with sample data that demonstrates its functionality. Note that, while you are encouraged to release the source code of your tools, you may instead release only executable binaries. If you are uncertain whether your artifact qualifies for the AE process, please do not hesitate to contact the AE Chairs for guidance.
Evaluation Objectives and Badging
Following IEEE standards, the evaluation of artifacts aims to achieve three main objectives:
- Available Badge: The code and/or datasets, along with any associated data and documentation provided by the authors, should be reasonable and complete, and should have the potential to support reproducibility of the published results.
- Reviewed Badge: The code and/or datasets, along with any associated data and documentation provided by the authors, should be reasonable, complete, and capable of producing the outputs described in the paper, and should support reproducibility of the published results.
- Reproducible Badge: This badge signifies an additional level of certification: an independent party has successfully regenerated the computational results using the research objects, methods, code, and analysis conditions created by the authors. The badges are cumulative: to earn the Reviewed badge, authors must also obtain the Available badge, and to earn the Reproducible badge, authors must obtain both the Available and Reviewed badges. Further details on each badge are provided below.
How to Prepare and Submit an Artifact?
Your artifact should contain a README file, preferably with a .txt, .md, or .html extension. The README file should include four sections:
- Artifact Description: Explain the content of the artifact, detailing the structure of folders, files, and any other relevant information.
- Environment Setup: Specify the requirements to run the artifact, such as operating system, minimum RAM, and CPU. If your artifact requires a specific environment, we recommend providing a Docker image or a virtual machine for ease of access. In that case, also provide instructions on how to run the Docker image or virtual machine (illustrated in the sample README further below).
- Getting Started: Describe how to set up the artifact and validate its general functionality using a small example dataset. This setup and execution should ideally take no more than 30 minutes.
- Reproducibility Instructions: Describe in detail how to reproduce the paper's claims and results.
Which sections you must include depends on the type of artifact you are submitting and the badges you are requesting (a sample README skeleton is sketched after the list below). Specifically:
- Artifact Description and Environment Setup sections are required for the Available, Reviewed, and Reproducible badges.
- The Getting Started section is required for the Reviewed and Reproducible badges.
- The Reproducibility Instructions section is required for the Reproducible badge.
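For concreteness, here is a minimal, hypothetical README skeleton. All names, commands, and timings below are placeholders; adapt the structure and level of detail to your own artifact:

```
# MyTool: Artifact for "Paper Title" (ISSRE 2023)

## Artifact Description
- src/          source code of MyTool
- data/sample/  small example dataset used in Getting Started
- data/full/    full dataset used in the paper
- scripts/      scripts that reproduce the paper's results

## Environment Setup
Requires Linux or macOS, 8 GB RAM, and Docker 20+.
Load and start the prebuilt image (image name is a placeholder):
    docker load -i mytool-image.tar
    docker run --rm -it mytool-image

## Getting Started (about 15 minutes)
Inside the container, run MyTool on the sample data and check that
the output matches data/sample/expected-output.txt:
    ./run.sh data/sample

## Reproducibility Instructions
scripts/reproduce-all.sh regenerates Tables 2-3 and Figure 4 of the
paper (about 6 hours on the hardware above).
```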
Once you have prepared your artifact and README file, upload the artifact to Zenodo or a similar service (e.g., figshare) to acquire a DOI. This DOI must be included both in the camera-ready version of your paper and in the artifact submission.
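Most authors use the Zenodo web interface, but the upload can also be scripted. The sketch below uses Zenodo's public REST deposit API; the access token, file name, and metadata are placeholders, and it is only an illustration of the flow (create a deposition, upload the archive, add metadata, publish to mint the DOI), not an official submission script:

```python
import requests

ZENODO = "https://zenodo.org/api"
TOKEN = "YOUR-ZENODO-ACCESS-TOKEN"  # placeholder: create a token in your Zenodo account settings
params = {"access_token": TOKEN}

# 1. Create an empty deposition.
r = requests.post(f"{ZENODO}/deposit/depositions", params=params, json={})
r.raise_for_status()
deposition = r.json()
bucket_url = deposition["links"]["bucket"]

# 2. Upload the artifact archive (file name is a placeholder).
with open("artifact.zip", "rb") as fp:
    requests.put(f"{bucket_url}/artifact.zip", params=params, data=fp).raise_for_status()

# 3. Add minimal metadata, then publish to mint the DOI.
metadata = {"metadata": {
    "title": "Artifact for <paper title> (ISSRE 2023)",  # placeholder title
    "upload_type": "software",
    "description": "Artifact Evaluation package: tool and sample data.",
    "creators": [{"name": "Lastname, Firstname"}],
}}
requests.put(f"{ZENODO}/deposit/depositions/{deposition['id']}",
             params=params, json=metadata).raise_for_status()
r = requests.post(f"{ZENODO}/deposit/depositions/{deposition['id']}/actions/publish",
                  params=params)
r.raise_for_status()
print("DOI:", r.json()["doi"])
```

Keep in mind that a published record cannot be deleted; uploading a new version mints a new DOI, consistent with the Available badge requirements below.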
To submit the artifact for evaluation, provide the DOI and additional information about the artifact (e.g., the paper abstract) via the EasyChair platform. You also need to provide a copy of the accepted paper and specify which badges you are applying for.
More information about each badge follows.
Available Badge
To earn the Available badge, your artifact should be made available via Zenodo, a publicly funded platform supporting open science. Ensure that the artifact is self-contained and versioned during the upload process. The platform will generate a DOI, which is required for the artifact evaluation submission.
Please note that the artifact will be immediately accessible to the public and cannot be modified or deleted after submission. However, you may upload an updated version of the artifact (e.g., to address reviewer comments), which will receive a new DOI.
To award this badge, reviewers will verify that the DOI is valid and assess whether the artifact and README file provide a sufficiently clear and detailed description.
Reviewed Badge
Reviewers will examine the artifact's basic functionality using the information provided in the "Getting Started" section of the README file. You will earn the badge if the reviewers are able to set up the artifact and validate its general functionality on the small example data in no more than 30 minutes.
Reproducible Badge
Earning the Reproducible badge requires an additional level of certification: the reviewers will regenerate the computational results using your research objects, methods, code, and analysis conditions. As such, the Reproducibility Instructions section is mandatory for this badge. Reviewers will conduct a comprehensive assessment of the artifact to validate that it supports the paper's key claims. Note that if reproducing the results requires long computation times or substantial computational resources, it is at the reviewers' discretion whether to run the experiments; if obtaining the results requires days or months of computation, reviewers cannot be expected to reproduce them in full.
Note that high-quality documentation and instructions are required to earn any of the badges.
Should you have any questions or concerns, please do not hesitate to contact the AEC Chairs.
Important dates (AoE)
- Artifact Evaluation submission: August 21st, 2023
- Artifact Evaluation rebuttal: from September 10th, 2023 to September 12th, 2023
- PC discussion: from September 13th, 2023 to September 16th, 2023
- Artifact Evaluation notification: September 19th, 2023
Artifact Evaluation Chairs
Valerio Terragni
The University of Auckland
New Zealand
Bo Fang
Pacific Northwest National Laboratory
USA