Call for Artifacts
We invite all authors of papers accepted at the ISSRE 2025 Research Track (REG, PER, and TAR papers) and Industry Track (full papers only) to submit artifacts for Artifact Evaluation (AE). Submitting an artifact showcases the reproducibility and quality of your work, and gives you the chance to earn badges displayed on your paper in the conference proceedings, highlighting its credibility and rigor. Artifacts participating in the evaluation are also eligible for the ISSRE 2025 Best Artifact Award, which recognizes authors' efforts in creating and disseminating exceptional research outcomes. Each submitted artifact will be reviewed by at least three members of the Artifact Evaluation Committee (AEC).
Before submitting your artifact, please review the information below. If you have any questions or concerns, you can contact the AEC chairs.
Understanding Artifacts
Artifacts are digital objects central to research, such as code or data, whether created by the authors to support their investigations or generated during their experiments. The AE process encourages the transparent sharing of these artifacts, fostering open-science principles and strengthening research reproducibility. In the field of software reliability, artifacts may include tools, data (such as logs or raw data), or a combination of both. For instance, an Artifact Evaluation package might comprise an executable tool together with sample data demonstrating its functionality. Note that, while you are encouraged to release the source code of your tools, you may also release only the executable binaries. If you are uncertain whether your artifact is eligible for the AE process, feel free to contact the AE Chairs for guidance.
Objectives of Evaluation and Badge Certification
Aligned with IEEE standards, artifact evaluation pursues three primary objectives:
- Available Badge: The code and/or datasets, along with any associated data and documentation provided by the authors, should be reasonable and complete. They should have the potential to support reproducibility of the published results.
- Reviewed Badge: The code and/or datasets, along with any associated data and documentation provided by the authors, should be reasonable, complete, and capable of producing the outputs described in the paper. They should also support reproducibility of the published results.
- Reproducible Badge: This badge signifies an additional level of certification. It indicates that an independent party has successfully regenerated computational results using the research objects, methods, code, and analysis conditions created by the authors.
To earn the Reviewed badge, authors must also obtain the Available badge. Similarly, to earn the Reproducible badge, authors must obtain both the Available and Reviewed badges. Further details are provided below.
How to Prepare and Submit an Artifact?
Your artifact should contain a README file with a .txt extension. The README file should include seven elements:
- TARGET BADGE: Clarify the badge you are targeting (Available, Reviewed, Reproducible).
- INFO: The title of the accepted paper, its submission number, and contact information.
- EXPECTED BEHAVIOUR: What the artifact is intended to do, including a description of its scope and the output it will produce.
- ARTIFACT DESCRIPTION: Explain the content of the artifact, detailing the structure of folders, files, and any other relevant information.
- ENVIRONMENT SETUP: Specify the requirements to run the artifact, including minimum RAM and CPU. If your artifact depends on a specific operating system, you are required to provide a suitable virtualization environment. In any case, we strongly recommend providing a Docker image for ease of access; if you do so, you also need to explain how to run the Docker image.
- GETTING STARTED: Describe how to set up the artifact and validate its general functionality using a small example dataset. This setup and execution should ideally take no more than 30 minutes.
- REPRODUCIBILITY: Describe in detail how to reproduce the paper's claims and results.
As a guideline, we recommend that you prepare your submission such that set-up, execution and analysis can be completed within 4 hours.
Which sections you must include depends on the type of artifact you are submitting and the badges you are requesting. Specifically, the GETTING STARTED section is required for the Reviewed and Reproducible badges, and the REPRODUCIBILITY section is required for the Reproducible badge.
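As an illustration only, a minimal README.txt following the structure above might look like the sketch below. All names, paths, figures, and the Docker image are hypothetical placeholders, not requirements:

```
TARGET BADGE
  Reproducible

INFO
  Paper: "An Example Paper Title" (submission #123)
  Contact: Jane Doe <jane.doe@example.org>

EXPECTED BEHAVIOUR
  The tool analyzes failure logs and reports detected anomalies as a CSV file.

ARTIFACT DESCRIPTION
  src/       tool source code
  data/      sample dataset (data/sample/) and full dataset (data/full/)
  scripts/   setup and experiment scripts
  LICENSE    open-source license

ENVIRONMENT SETUP
  Requires at least 8 GB RAM and 4 CPU cores. A Docker image is provided:
    docker load -i artifact-image.tar
    docker run -it artifact-image

GETTING STARTED
  Inside the container, run ./scripts/smoke_test.sh on data/sample/
  (about 10 minutes) and check that results/sample_report.csv is produced.

REPRODUCIBILITY
  Run ./scripts/reproduce_all.sh to regenerate the tables and figures
  reported in the paper (about 3 hours on the recommended hardware).
```

Adapting such a skeleton to your artifact helps reviewers locate each required element quickly.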
Note that you are also requested to include a LICENSE file containing an open-source license that clearly describes the distribution rights for your project.
Once you have prepared your artifact, including both the required README file and LICENSE file, upload it to Zenodo or a similar service (e.g., figshare) to acquire a DOI. This DOI should be included both in the camera-ready version of your paper and in the artifact submission.
To submit the artifact for evaluation, provide the DOI and additional information about the artifact (e.g., the paper abstract) using the EasyChair platform. You also need to provide a copy of the accepted paper and specify which of the badges you are applying for.
More information about the badges and the best artifact awards follows.
Available Badge
To qualify for the Available badge, your artifact must be accessible through a hosting platform. While Zenodo is commonly used and recommended for its reliability and integration with scholarly practices, authors are free to choose the platform that best suits their needs. Ensure that your artifact is self-contained and versioned during the upload process. Upon upload to your chosen platform, a DOI will be generated, which is essential for artifact evaluation submission.
Kindly note that once submitted, your artifact will be immediately accessible to the public and cannot be modified or deleted. However, you retain the option to upload an updated version (e.g., to address reviewer feedback), which will be assigned a new DOI.
Furthermore, please ensure that your artifact is accompanied by a LICENSE file clearly outlining the distribution rights. To be eligible for the 'Available' badge or higher, the file must contain an open-source license.
Reviewers will validate the authenticity of the DOI and LICENSE file, and then assess the clarity and comprehensiveness of your artifact and README file.
Reviewed Badge
Reviewers will evaluate your artifact's fundamental functionality based on the instructions provided in the "Getting Started" section of the README file. The Reviewed Badge will be awarded if reviewers can successfully set up the artifact and verify its basic functionality using a small sample dataset within 30 minutes.
Reproducible Badge
Earning the Reproducible badge entails an additional level of scrutiny. Reviewers will reproduce computational results using your research objects, methods, code, and analysis conditions. Therefore, the inclusion of a Reproducibility Instructions section is imperative for this badge. Reviewers will conduct a thorough assessment to validate your artifact's support for the paper's primary claims.
Please note that if reproducing results requires extensive computation or substantial computational resources, reviewers may exercise discretion in running experiments. If obtaining results necessitates significant computation time (e.g., days or months), reviewers may not be able to reproduce them fully.
It is crucial to emphasize that clear, detailed documentation and instructions are essential for badge certification.
Should you require further clarification or assistance, please feel free to contact the AEC Chairs.
2025 ISSRE Best Artifact Awards
The 2025 ISSRE Best Artifact Award recognizes the authors' dedicated efforts in creating and disseminating exceptional research artifacts, celebrating outstanding contributions that advance the field.
Evaluation procedure
Artifacts will be evaluated by the Artifact Evaluation Committee. Committee members may contact the authors to clarify open issues with the execution of the artifacts. Under no circumstances will amendments to the declared procedures or the submitted deliverables be accepted; clarifications are intended only to prevent factual mistakes by the Evaluation Committee.
Composition of the Evaluation Committee
The Evaluation Committee is open to candidates. If you wish to join the Evaluation Committee, please contact the AE Chairs. Note that the Evaluation Committee will be most active between late August and the end of September, and that we expect each member to be assigned two artifacts for review.
Important dates (tentative)
- Artifact Evaluation submission: August 19th, 2025 (AoE)
- Artifact Evaluation notification: September 21st, 2025 (AoE)
All dates refer to AoE time (Anywhere on Earth)
Submission page
The link for submitting the artifacts will be provided soon.
Artifact Evaluation Chairs
- João R. Campos, University of Coimbra, Portugal
- Andrea Ceccarelli, University of Florence, Italy