Call for Artifact Evaluation


All authors of accepted papers at ISSRE 2023 are encouraged to submit artifacts for Artifact Evaluation (AE), an essential opportunity to enhance the reproducibility and quality of your research. By submitting your artifacts, you not only contribute to the progress of our field but also stand a chance to earn badges that will be displayed on your papers in the conference proceedings, showcasing the credibility and rigour of your work. Each submitted artifact will be reviewed by at least three members of the Artifact Evaluation Committee (AEC).

Before submitting your artifact, please review the information below. Should you have any questions or concerns, please contact the AEC chairs.

What are Artifacts?

Artifacts refer to digital objects, such as code or data, that are either created by authors to support their research or generated during experiments. The AE process encourages authors to make these artifacts openly available, fostering open science practices and promoting research reproducibility. In the context of software reliability, artifacts can include Tools, Data (e.g., logs, raw data), or a combination of both. For example, an Artifact Evaluation Package might consist of a runnable tool along with sample data to demonstrate its functionality. Note that, while you are encouraged to release the source code of your tools, you may instead release only executable binaries. If you are uncertain about whether your artifact qualifies for the AE process, please do not hesitate to contact the AE Chairs for guidance.

Evaluation Objectives and Badging

Following IEEE standards, the evaluation of artifacts aims to achieve three main objectives:

How to Prepare and Submit an Artifact?

Your artifact should contain a README file, preferably with a .txt, .md, or .html extension. The README file should include four sections:

Which sections are required depends on the type of artifact you are submitting and the badges you are requesting for review.

Specifically:
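As an illustration, a README for an Artifact Evaluation Package might be organized along the following lines. This is only a sketch: the tool name and sample-data path are placeholders, and apart from "Getting Started" and "Detailed Instructions" (which this call refers to explicitly in the badge descriptions), the section names shown are hypothetical.

```markdown
# MyTool — Artifact for "Paper Title" (ISSRE 2023)
<!-- "MyTool" and all paths below are placeholders for illustration. -->

## Requirements
Hardware and software needed (OS, runtime versions, memory, disk).

## Getting Started
Steps to install the artifact and validate its basic functionality
on a small example (e.g., `examples/sample-data/`) in ~30 minutes.

## Detailed Instructions
Steps to reproduce the computational results supporting the paper's
key claims, including expected running time and resource usage.
```

Keeping the "Getting Started" example small and fast, and stating expected running times up front in "Detailed Instructions", makes the reviewers' job considerably easier.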

Once you have prepared your artifact and README file, upload it to Zenodo or a similar service (e.g., figshare) to acquire a DOI. This DOI should be included in both the camera-ready version of your paper and during the artifact submission.

To submit the artifact for evaluation, provide the DOI and additional information about the artifact (e.g., the paper abstract) using the EasyChair platform. You also need to provide a copy of the accepted paper and specify which badges you are applying for.

More information about each badge follows.

Available Badge

To earn the Available badge, your artifact should be made available via Zenodo, a publicly-funded platform supporting open science. Ensure that the artifact is self-contained and versioned during the upload process. The platform will generate a DOI, which is necessary for artifact evaluation submission.

Please note that the artifact becomes immediately accessible to the public and cannot be modified or deleted after submission. However, you may upload an updated version of the artifact (e.g., to address reviewer comments), which will receive a new DOI.

To award this badge, reviewers will verify the DOI's validity and assess whether the artifact and README file provide a sufficiently clear and detailed description.

Reviewed Badge

Reviewers will examine the artifact's basic functionality using the information provided in the "Getting Started" section of the README file. You will earn the badge if the reviewers are able to set up the artifact and validate its general functionality on a small example dataset within 30 minutes.

Reproducible Badge

Earning the Reproducible badge requires an additional level of certification. The reviewers will regenerate computational results using your research objects, methods, code, and analysis conditions. As such, the Detailed Instructions section is mandatory for this badge. Reviewers will conduct a comprehensive assessment of the artifact to validate its support for the paper's key claims. Note that if reproducing the results requires a long computation time or substantial computational resources, it is at the reviewers' discretion whether to run the experiments. Naturally, if obtaining the results takes days or months of computation, the reviewers cannot be expected to reproduce them.

Note that high-quality documentation and instructions are required to earn any of the badges.

Should you have any questions or concerns, please do not hesitate to contact the AEC Chairs.

Important dates (AoE)

Artifact Evaluation Chairs

Valerio Terragni
 The University of Auckland
 New Zealand

Bo Fang
 Pacific Northwest National Laboratory
 USA