Journal of Open Source Software (JOSS): design and first-year review

PeerJ Computer Science

Introduction

Background and Motivation

Why publish software?

Challenges of publishing software

The Journal of Open Source Software

Goals and principles

  • Aside from their short length, JOSS articles are conventional in every sense: the journal has an ISSN, articles receive Crossref DOIs with high-quality submission metadata, and articles are appropriately archived.

  • Because software articles are “advertising” and simply pointers to the actual scholarship (the software), short abstract-length submissions are sufficient for these “advertisements.”

  • Software is a core product of research and therefore the software itself should be archived appropriately when submitted to and reviewed in JOSS.

  • Code review, documentation, and contributing guidelines are important for open-source software and should be part of any review. In JOSS, they are the focus of peer review. (While a range of other journals publish software, with various peer-review processes, the focus of the review is usually the submitted article and reviewers might not even look at the code.) The JOSS review process itself, described in Section ‘Peer review in JOSS’, was based on the on-boarding checklist for projects joining the rOpenSci collaboration (Boettiger et al., 2015).

  • The software must be open source by the Open Source Initiative (OSI) definition (https://opensource.org).

  • The software must have a research application.

  • The submitter should be a major contributor to the software they are submitting.

  • The software should be a significant new contribution to the available open-source software that either enables some new research challenge(s) to be addressed or makes addressing research challenges significantly better (e.g., faster, easier, simpler).

  • The software should be feature-complete, i.e., it cannot be a partial solution.

How JOSS works

The JOSS web application and submission tool

Open peer review on GitHub

Whedon and the Whedon-API

Business model and content licensing

  • Crossref membership: $275. This is a yearly fixed cost for the JOSS parent entity—Open Journals—so that article DOIs can be registered with Crossref.

  • Crossref article DOIs: $1. This is a fixed cost per article.

  • JOSS web application hosting (currently with Heroku): $19 per month.
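
Taken together, these figures imply a very low cost per article. The following back-of-the-envelope sketch uses only the figures listed above; the annual publication volumes are hypothetical, chosen purely for illustration:

```python
# Back-of-the-envelope JOSS operating cost, using only the figures listed above.
CROSSREF_MEMBERSHIP = 275   # USD per year (fixed)
DOI_FEE = 1                 # USD per article
HOSTING_MONTHLY = 19        # USD per month (Heroku)

def annual_cost(articles_per_year: int) -> int:
    """Total yearly cost in USD for a given publication volume."""
    fixed = CROSSREF_MEMBERSHIP + 12 * HOSTING_MONTHLY  # $503 per year
    return fixed + DOI_FEE * articles_per_year

for n in (100, 200, 500):   # hypothetical publication volumes
    total = annual_cost(n)
    print(f"{n:>3} articles/year: ${total:,} total, ${total / n:.2f} per article")
```

At a few hundred articles per year, the fixed costs amortize quickly and the marginal cost per article approaches the $1 DOI fee.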

Comparison with other software journals

Peer review in JOSS

The JOSS process

  1. Leland McInnes submitted the hdbscan software and article to JOSS on 26 February 2017 using the web application and submission tool. The article is a Markdown file named paper.md, visible in the software repository (here, as in many cases, placed together with auxiliary files in a paper directory); a minimal sketch of such a file appears after this list.

  2. Following a routine check by a JOSS administrator, a “pre-review” issue was created in the joss-reviews GitHub repository (hdbscan JOSS pre-review, 2017). In this pre-review issue, an editor (Daniel S. Katz) was assigned, who then identified and assigned a suitable reviewer (Zhao Zhang). Editors generally identify one or more reviewers from a pool of volunteers, based on the programming-language and/or domain expertise those volunteers provide when signing up. (Potential reviewers can volunteer via http://joss.theoj.org/reviewer-signup.html; that sign-up form has since been replaced, and the list of volunteers is now maintained at https://docs.google.com/spreadsheets/d/1PAPRJ63yq9aPC1COLjaQp8mHmEq3rZUzwUYxTulyu78/edit?usp=sharing.)

    The editor then asked the automated bot Whedon to create the main submission review issue via the command @whedon start review magic-word=bananas. (“magic-word=bananas” is a safeguard against accidentally creating a review issue prematurely.)

  3. The reviewer then conducted the submission review (hdbscan JOSS review, 2017) (see Fig. 2) by working through a checklist of review items, as described in Section ‘JOSS review criteria’. The author, reviewer, and editor discussed any questions that arose during the review, and once the reviewer completed their checks, they notified the submitting author and editor. Compared with traditional journals, JOSS offers the unique feature of holding the review discussion in the open, within a GitHub issue, among the reviewer(s), author(s), and editor. Like a true conversation, the discussion can go back and forth in seconds or minutes, with all parties contributing at will. This contrasts with traditional journal reviews, where the process is an exchange between the reviewer(s) and author(s) mediated by the editor, each communication can take months, and that delay limits the process in practice to two or three exchanges (Tennant et al., 2017).

    Note that JOSS reviews are subject to a code of conduct (Smith & Niemeyer, 2016), adapted from the Contributor Covenant Code of Conduct (Ehmke, 2016). Authors confirm at submission, and reviewers as part of their review, that they have read and will adhere to it.

  4. After the review was complete, the editor asked the submitting author to make a permanent archive of the software (including any changes made during review) with a service such as Zenodo or Figshare, and to post a link to the archive in the review thread. This link, in the form of a DOI, was associated with the submission via the command @whedon set 10.5281/zenodo.401403 as archive.

  5. The editor-in-chief used the Whedon RubyGem library on his local machine to produce the compiled PDF, update the JOSS website, deposit Crossref metadata, and issue a DOI for the submission (https://doi.org/10.21105/joss.00205).

  6. Finally, the editor-in-chief updated the review issue with the JOSS article DOI and closed the review. The submission was then accepted into the journal.
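
For reference, the paper.md file mentioned in step 1 follows a simple structure: a YAML front-matter block carrying the article metadata, followed by a short Markdown body. The sketch below is illustrative, patterned on the JOSS author guidelines; all names and values are placeholders, not those of the hdbscan submission.

```markdown
---
title: 'ExampleTool: a Python package for doing X'
tags:
  - Python
  - example
authors:
  - name: Jane Q. Researcher
    affiliation: 1
affiliations:
  - name: Example University
    index: 1
date: 26 February 2017
bibliography: paper.bib
---

# Summary

A short, roughly abstract-length description of the software: the problems
it is designed to solve, its target audience, and how it relates to other
work, with citations drawn from paper.bib.

# References
```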

JOSS review criteria

  • Conflict of interest

    • As the reviewer I confirm that I have read the JOSS conflict of interest policy and that there are no conflicts of interest for me to review this work.

  • Code of Conduct

  • General checks

    • Repository: Is the source code for this software available at the repository URL?

    • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI-approved software license?

    • Version: Does the release version given match the GitHub release?

    • Authorship: Has the submitting author made major contributions to the software?

  • Functionality

    • Installation: Does installation proceed as outlined in the documentation?

    • Functionality: Have the functional claims of the software been confirmed?

    • Performance: Have any performance claims of the software been confirmed?

  • Documentation

    • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?

    • Installation instructions: Is there a clearly stated list of dependencies? Ideally these should be handled with an automated package management solution.

    • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?

    • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?

    • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?

    • Community guidelines: Are there clear guidelines for third parties wishing to (1) contribute to the software, (2) report issues or problems with the software, and (3) seek support?

  • Software paper

    • Authors: Does the paper.md file include a list of authors with their affiliations?

    • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?

    • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
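
Within the review issue, these criteria are presented as a GitHub-flavored Markdown task list that the reviewer ticks off as the review proceeds. The abbreviated example below illustrates the form; it is not the verbatim JOSS template.

```markdown
## Review checklist for @reviewer

### General checks
- [x] **Repository:** Is the source code for this software available at the
      repository URL?
- [x] **License:** Does the repository contain a plain-text LICENSE file with
      the contents of an OSI-approved software license?

### Functionality
- [x] **Installation:** Does installation proceed as outlined in the
      documentation?
- [ ] **Performance:** Have any performance claims of the software been
      confirmed?
```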

Fast track for reviewed rOpenSci contributions

A Review of the First Year

The Second Year for JOSS

Conclusions

Additional Information and Declarations

Competing Interests

Daniel S. Katz is an Academic Editor for PeerJ CS. Abigail Cabunoc Mayes is an employee of Mozilla Foundation. Tracy K. Teal is an employee of Data Carpentry.

Author Contributions

Arfon M. Smith conceived and designed the experiments, wrote the paper, prepared figures and/or tables, performed the computation work, reviewed drafts of the paper, edited submissions.

Kyle E. Niemeyer conceived and designed the experiments, analyzed the data, wrote the paper, prepared figures and/or tables, performed the computation work, reviewed drafts of the paper, edited submissions.

Daniel S. Katz and Lorena A. Barba conceived and designed the experiments, wrote the paper, reviewed drafts of the paper, edited submissions.

George Githinji, Melissa Gymrek, Pjotr Prins, Ariel Rokem and Roman Valls Guimera reviewed drafts of the paper, edited submissions.

Kathryn D. Huff, Christopher R. Madan, Abigail Cabunoc Mayes, Kevin M. Moerman, Karthik Ram, Tracy K. Teal and Jacob T. Vanderplas conceived and designed the experiments, reviewed drafts of the paper, edited submissions.

Data Availability

The following information was supplied regarding data availability:

Niemeyer, Kyle (2017): JOSS first-year publication data and figures. figshare. https://doi.org/10.6084/m9.figshare.5147722.

Source code of JOSS, on GitHub: https://github.com/openjournals/joss.

Funding

This work was supported in part by the Alfred P. Sloan Foundation. Work by K. E. Niemeyer was supported in part by the National Science Foundation (No. ACI-1535065). Work by P. Prins was supported by the National Institutes of Health (R01 GM123489, 2017–2022). Work by K. Ram was supported in part by The Leona M. and Harry B. Helmsley Charitable Trust (No. 2016PG-BRI004). Work by A. Rokem was supported by the Gordon & Betty Moore Foundation and the Alfred P. Sloan Foundation, and by grants from the Bill & Melinda Gates Foundation, the National Science Foundation (No. 1550224), and the National Institute of Mental Health (No. 1R25MH112480). There was no additional external funding received for this study. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
