NDSS Symposium 2024 Call for Artifacts

Call for Artifacts is now closed.

NDSS 2024 introduces an Artifact Evaluation (AE) process, allowing authors to submit an artifact alongside accepted papers.

The artifact may include source code, scripts, datasets, models, test suites, benchmarks, and/or any other material underlying the paper’s contributions. Each submitted artifact will be reviewed by the NDSS Artifact Evaluation Committee (AEC).

The AE process promotes the reproducibility of experimental results and the dissemination of artifacts to benefit our community as a whole. Publishing an artifact makes it easier for peers to build on the work, use it as a comparison point, or answer questions about cases the original authors did not consider.

Authors of NDSS papers have the option of submitting their artifacts shortly after the notification of the (conditional) acceptance of their papers. Papers that pass artifact evaluation may include an additional 1-2 page appendix detailing the artifact and will have evaluation badges on their first page.

The AE process recognizes authors who devote effort to make their work reusable and reproducible by others. This includes making artifacts publicly available, documenting and packaging their work in a way that facilitates reuse, and structuring experiments such that they can be repeated and the results reproduced by other researchers. The AEC will consider outstanding artifacts for Distinguished Artifact Awards.

Call for Artifacts

Before submitting your artifact, please check the information and submission guidelines provided on the Artifact Evaluation website.

Important Dates

Summer Deadline:

  • Wed, 21 Jun 2023: Paper notification to authors
  • Wed, 28 Jun 2023: Artifact registration deadline
  • Wed, 5 Jul 2023: Artifact submission deadline
  • Thu, 6 Jul to Fri, 14 Jul 2023: Kick-the-tires stage (answering AEC reviewer questions)
  • Tue, 5 Sep 2023: Artifact decisions
  • Fri, 8 Sep 2023: Camera-ready deadline for papers

Fall Deadline:

  • Wed, 13 Sep 2023: Paper notification to authors
  • Wed, 20 Sep 2023: Artifact registration deadline
  • Wed, 27 Sep 2023: Artifact submission deadline
  • Thu, 28 Sep to Fri, 6 Oct 2023: Kick-the-tires stage (answering AEC reviewer questions)
  • Fri, 24 Nov 2023: Artifact decisions
  • Wed, 29 Nov 2023: Camera-ready deadline for papers

Evaluation Process

Authors are invited to submit artifacts soon after receiving the paper notification. At least one contact author must be reachable and respond to questions promptly throughout the evaluation period to allow round-trip communication between the AEC and the authors. Artifacts can be submitted only in the AE time frame associated with the paper submission round.

In addition to accepted papers, papers that receive a major or minor revision decision are eligible for AE: at artifact submission time, their authors should describe the changes they intend to make to the initially submitted paper and explain how those changes relate to the submitted artifact.

At submission time, authors choose which badges (see below) they want to be evaluated for. Members of the AEC will evaluate each artifact using the artifact appendix and instructions as guides, as detailed later on this page. Evaluators will communicate anonymously with authors through HotCRP to resolve minor issues and ask clarifying questions.

Evaluation starts with a kick-the-tires period during which evaluators ensure they can access their assigned artifacts and perform basic operations such as building and running a minimal working example. During this stage, evaluators provide feedback about the artifact, and authors may use it to address any significant issues that block the evaluation. After the kick-the-tires stage ends, communication is limited to resolving questions about interpreting the produced results or minor issues in the submitted materials.

For prospective authors: Your goal should be to present and document your artifact so that AEC members can use it and complete the evaluation with minimal (and ideally no) interaction. To ensure that your instructions are complete, we suggest that you run through them on a fresh setup prior to submission, following exactly the instructions you have provided.

Badges

Authors can request their artifact to be evaluated towards one, two, or all of the following badges:

Available. To earn this badge, the AEC must judge that the artifact associated with the paper has been made publicly and permanently available for retrieval. However, an artifact undergoing AE often evolves in response to the feedback it receives from the AEC. Therefore, if authors opt for "volatile" storage for the initial submission, they must commit to uploading their materials to a public service for permanent storage if the badge is awarded. We encourage authors to consider services such as Zenodo, FigShare, or Dryad, which also create a citable DOI that must be mentioned in the final version of the artifact appendix. Other options, such as GitHub and GitLab, are also acceptable if a stable reference to the evaluated version (e.g., a commit hash or tag) is provided in the final appendix version. Furthermore, for this badge, the artifact should include a README file that explicitly references the paper and a LICENSE file.

Functional. To earn this badge, the AEC must judge that the artifact conforms to the expectations set by the paper for functionality, usability, and relevance. The artifact must also be usable on machines other than the authors', including when specialized hardware is required (for example, paths, addresses, and identifiers must not be hardcoded). The AEC will particularly consider three aspects:

  • Documentation: is the artifact sufficiently documented to be exercised by readers of the paper?
  • Completeness: does the submitted artifact include all of the key components described in the paper?
  • Exercisability: does the submitted artifact include the scripts and data needed to run the experiments described in the paper, and can the software be successfully executed?

Reproduced. To earn this badge, the AEC must judge that they can use the submitted artifact to obtain the main results presented in the paper. In short, is it possible for the AEC to independently repeat the experiments and obtain results that support the main claims made by the paper? The goal of this effort is not to reproduce the results exactly, but to generate results independently, within an allowed tolerance, that validate the main claims of the paper. For example, in the case of lengthy experiments, scaled-down versions can be proposed if their significance is clearly and convincingly explained.

Artifact Evaluation Committee

Artifact Evaluation Chair

Daniele Cono D’Elia, Sapienza University of Rome

The call for AEC members is open, and applications can be submitted at: https://secartifacts.github.io/ndss2024/aec-call