JOSS Documentation: Open Journals
3.3.3 Other considerations
    3.3.3.1 Authorship
    3.3.3.2 An important note about 'novel' software and citations of relevant work
    3.3.3.3 What happens if the software I'm reviewing doesn't meet the JOSS criteria?
    3.3.3.4 What about submissions that rely upon proprietary languages/development environments?
3.4 Review checklist
    3.4.1 Conflict of interest
    3.4.2 Code of Conduct
    3.4.3 General checks
    3.4.4 Functionality
    3.4.5 Documentation
    3.4.6 Software paper
3.5 Editorial Guide
    3.5.1 Pre-review
        3.5.1.1 How papers are assigned to editors
        3.5.1.2 Finding reviewers
        3.5.1.3 Starting the review
    3.5.2 Review
        3.5.2.1 Adding a new reviewer once the review has started
    3.5.3 After reviewers recommend acceptance
    3.5.4 Handling of papers published together with AAS publishing
    3.5.5 Processing of rOpenSci-reviewed and accepted submissions
    3.5.6 Rejecting a paper
        3.5.6.1 Voting on papers flagged as potentially out of scope
    3.5.7 Sample messages for authors and reviewers
        3.5.7.1 Sample email to potential reviewers
        3.5.7.2 Query scope of submission
        3.5.7.3 GitHub invite to potential reviewers
        3.5.7.4 Message to reviewers at the start of a review
        3.5.7.5 Message to authors at the end of a review
    3.5.8 Overview of editorial process
    3.5.9 Visualization of editorial flow
    3.5.10 Expectations on JOSS editors
        3.5.10.1 Responding to editorial assignments
        3.5.10.2 Continued attention to assigned submissions
    3.5.11 Out of office
    3.5.12 Editorial buddy
    3.5.13 Managing notifications
        3.5.13.1 Things you should do when joining the editorial team
3.6 Interacting with Whedon
    3.6.1 Author commands
        3.6.1.1 Compiling papers
            3.6.1.1.1 Compiling papers from a non-default branch
        3.6.1.2 Finding reviewers
    3.6.2 Editorial commands
        3.6.2.1 Assigning an editor
        3.6.2.2 Inviting an editor
        3.6.2.3 Adding and removing reviewers
        3.6.2.4 Starting the review
        3.6.2.5 Reminding reviewers and authors
        3.6.2.6 Setting the software archive
        3.6.2.7 Changing the software version
    3.6.3 Accepting a paper (dry run)
        3.6.3.1 Check references
    3.6.4 Accepting a paper (for real)
3.7 Installing the JOSS application
The Journal of Open Source Software (JOSS) is a developer friendly journal for research software packages.
JOSS is an academic journal (ISSN 2475-9066) with a formal peer-review process that is designed to improve the
quality of the software submitted. Upon acceptance into JOSS, we mint a CrossRef DOI for your paper and we list it
on the JOSS website.
This site contains documentation for authors interested in submitting to JOSS, reviewers who have generously volunteered their time to review submissions, and editors who manage the JOSS editorial process.
If you’re interested in learning more about JOSS, you might want to read:
• Our announcement blog post describing some of the motivations for starting a new journal
• The paper in Computing in Science and Engineering introducing JOSS
• The paper in PeerJ CS describing the first year of JOSS
• The about page on the main JOSS site
If you’d like to submit a paper to JOSS, please read the author submission guidelines in the Submitting a paper to
JOSS section.
JOSS is a proud affiliate of the Open Source Initiative. As such, we are committed to public support for open source software and the role OSI plays therein. In addition, Open Journals (the parent entity behind JOSS) is a NumFOCUS-sponsored project.
If you’ve already developed a fully featured research code, released it under an OSI-approved license, and written
good documentation and tests, then we expect that it should take perhaps an hour or two to prepare and submit your
paper to JOSS. But please read these instructions carefully for a streamlined submission.
• Your paper (paper.md and BibTeX files, plus any figures) must be hosted in a Git-based repository. Placing
these items together with your software (rather than in a separate repository) is strongly encouraged.
In addition, the software associated with your submission must:
• Be stored in a repository that can be cloned without registration.
• Be stored in a repository that is browsable online without registration.
• Have an issue tracker that is readable without registration.
• Permit individuals to create issues/file tickets against your repository.
JOSS publishes articles about research software. This definition includes software that: solves complex modeling
problems in a scientific context (physics, mathematics, biology, medicine, social science, neuroscience, engineering);
supports the functioning of research instruments or the execution of research experiments; extracts knowledge from
large data sets; offers a mathematical library, or similar.
JOSS publishes articles about software that represent substantial scholarly effort on the part of the authors. Your
software should be a significant contribution to the available open source software that either enables some new
research challenges to be addressed or makes addressing research challenges significantly better (e.g., faster, easier,
simpler).
As a rule of thumb, JOSS’ minimum allowable contribution should represent not less than three months of work for
an individual. Some factors that may be considered by editors and reviewers when judging effort include:
• Age of software (is this a well-established software project) / length of commit history.
• Number of commits.
• Number of authors.
• Total lines of code (LOC). Submissions under 1000 LOC will usually be flagged, those under 300 LOC will be
desk rejected.
• Whether the software has already been cited in academic papers.
• Whether the software is sufficiently useful that it is likely to be cited by your peer group.
In addition, JOSS requires that software should be feature-complete (i.e. no half-baked solutions) and designed for
maintainable extension (not one-off modifications of existing tools). “Minor utility” packages, including “thin” API
clients, and single-function packages are not acceptable.
Authors wishing to publish software deemed out of scope for JOSS have a few options available to them:
• Follow GitHub’s guide on how to create a permanent archive and DOI for your software. This DOI can then be
used by others to cite your work.
• Enquire whether your software might be considered by communities such as rOpenSci and pyOpenSci.
While we are happy to review submissions in standalone repositories, we also review submissions that are significant
contributions made to existing packages. It is often better to have an integrated library or package of methods than a
large number of single-method packages.
Important: Begin your paper with a summary of the high-level functionality of your software for a non-specialist
reader. Avoid jargon in this section.
JOSS welcomes submissions from broadly diverse research areas. For this reason, we require that authors include in
the paper some sentences that explain the software functionality and domain of use to a non-specialist reader. We also
require that authors explain the research applications of the software. The paper should be between 250 and 1000 words.
Your paper should include:
• A list of the authors of the software and their affiliations, using the correct format (see the example below).
• A summary describing the high-level functionality and purpose of the software for a diverse, non-specialist
audience.
• A clear Statement of Need that illustrates the research purpose of the software.
• A list of key references, including to other software addressing related needs.
• Mention (if applicable) a representative set of past or ongoing research projects using the software and recent
scholarly publications enabled by it.
• Acknowledgement of any financial support.
As this short list shows, JOSS papers are only expected to contain a limited set of metadata (see example below), a
Statement of Need, Summary, Acknowledgements, and References sections. You can look at an example accepted
paper. Given this format, a "full length" paper is not permitted, and software documentation such as API (Application Programming Interface) functionality should not be in the paper and instead should be outlined in the software
documentation.
Important: Your paper will be reviewed by two or more reviewers in a public GitHub issue. Take a look at the review
checklist and review criteria to better understand how your submission will be reviewed.
This example paper.md is adapted from Gala: A Python package for galactic dynamics by Adrian M. Price-Whelan
http://doi.org/10.21105/joss.00388:
---
title: 'Gala: A Python package for galactic dynamics'
tags:
  - Python
  - astronomy
  - dynamics
  - galactic dynamics
  - milky way
authors:
  - name: Adrian M. Price-Whelan^[Custom footnotes for e.g. denoting who the corresponding author is can be included like this.]
    orcid: 0000-0003-0872-7098
    affiliation: "1, 2" # (Multiple affiliations must be quoted)
  - name: Author Without ORCID
    affiliation: 2
  - name: Author with no affiliation
    affiliation: 3
affiliations:
  - name: Lyman Spitzer, Jr. Fellow, Princeton University
    index: 1
  - name: Institution Name
    index: 2
  - name: Independent Researcher
    index: 3
date: 13 August 2017
bibliography: paper.bib

# Optional fields if submitting to a AAS journal too, see this blog post:
# https://blog.joss.theoj.org/2018/12/a-new-collaboration-with-aas-publishing
aas-doi: 10.3847/xxxxx <- update this with the DOI from AAS once you know it.
aas-journal: Astrophysical Journal <- The name of the AAS journal.
---
# Summary
The forces on stars, galaxies, and dark matter under external gravitational
fields lead to the dynamical evolution of structures in the universe. The orbits
of these bodies are therefore key to understanding the formation, history, and
future state of galaxies. The field of "galactic dynamics," which aims to model
the gravitating components of galaxies to study their structure and evolution,
is now well-established, commonly taught, and frequently used in astronomy.
Aside from toy problems and demonstrations, the majority of problems require
efficient numerical tools, many of which require the same base code (e.g., for
performing numerical orbit integration).
# Statement of need
# Mathematics
Single dollars ($) are required for inline mathematics e.g. $f(x) = e^{\pi/x}$
$$\Theta(x) = \left\{\begin{array}{l}
0\textrm{ if } x < 0\cr
1\textrm{ else}
\end{array}\right.$$
# Citations
If you want to cite a software repository URL (e.g. something on GitHub without a preferred citation) then you can do it with the example BibTeX entry below for @fidgit.
# Figures
# Acknowledgements
# References
@book{Binney:2008,
    Adsnote = {Provided by the SAO/NASA Astrophysics Data System},
    Adsurl = {http://adsabs.harvard.edu/abs/2008gady.book.....B},
    Author = {{Binney}, J. and {Tremaine}, S.},
    Booktitle = {Galactic Dynamics: Second Edition, by James Binney and Scott Tremaine. ISBN 978-0-691-13026-2 (HB). Published by Princeton University Press, Princeton, NJ USA, 2008.},
    Publisher = {Princeton University Press},
    Title = {{Galactic Dynamics: Second Edition}},
    Year = 2008
}
@article{gaia,
author = {{Gaia Collaboration}},
title = "{The Gaia mission}",
journal = {\aap},
archivePrefix = "arXiv",
eprint = {1609.04153},
primaryClass = "astro-ph.IM",
keywords = {space vehicles: instruments, Galaxy: structure, astrometry,
˓→parallaxes, proper motions, telescopes},
year = 2016,
month = nov,
volume = 595,
doi = {10.1051/0004-6361/201629272},
adsurl = {http://adsabs.harvard.edu/abs/2016A%26A...595A...1G},
}
@article{astropy,
author = {{Astropy Collaboration}},
title = "{Astropy: A community Python package for astronomy}",
journal = {\aap},
archivePrefix = "arXiv",
eprint = {1307.6212},
year = 2013,
month = oct,
volume = 558,
doi = {10.1051/0004-6361/201322068},
adsurl = {http://adsabs.harvard.edu/abs/2013A%26A...558A..33A}
}
@misc{fidgit,
author = {A. Smith},
title = {Fidgit: An ungodly union of GitHub and Figshare},
year = {2020},
publisher = {GitHub},
journal = {GitHub repository},
url = {https://github.com/arfon/fidgit}
}
Note that the paper ends with a References heading, and the references are built automatically from the content in the
.bib file. You should enter in-text citations in the paper body following correct Markdown citation syntax.
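For reference, Pandoc-style in-text citation syntax looks like the following. The keys are those from the example paper.bib above; the rendered output shown is only indicative and depends on the citation style used:

@fidgit                  ->  Smith (2020)
[@fidgit]                ->  (Smith, 2020)
[@fidgit; @Binney:2008]  ->  (Smith, 2020; Binney & Tremaine, 2008)
[-@fidgit]               ->  (2020)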
JOSS uses Pandoc to compile papers from their Markdown form into a PDF. You can test that your paper is properly
structured using the JOSS paper preview service.
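If you want a rough local check before using the preview service, a plain Pandoc invocation along the following lines will render the Markdown and resolve citations from paper.bib. This is only an approximation: it assumes a recent Pandoc with citeproc support and a LaTeX engine installed, and it does not use the JOSS article template.

pandoc paper.md --citeproc --bibliography=paper.bib -o paper.pdf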
There are no fees for submitting or publishing in JOSS. You can read more about our cost and sustainability model.
Authors are welcome to submit their papers to a preprint server (arXiv, bioRxiv, SocArXiv, PsyArXiv etc.) at any
point before, during, or after the submission and review process.
Submission to a preprint server is not considered a previous publication.
3.1.9 Authorship
Purely financial (such as being named on an award) and organizational (such as general supervision of a research
group) contributions are not considered sufficient for co-authorship of JOSS submissions, but active project direction
and other forms of non-code contributions are. The authors themselves assume responsibility for deciding who should
be credited with co-authorship, and co-authors must always agree to be listed. In addition, co-authors agree to be
accountable for all aspects of the work, and to notify JOSS if any retraction or correction of mistakes is needed after publication.
We strongly prefer software that doesn’t rely upon proprietary (paid for) development environments/programming
languages. However, provided your submission meets our requirements (including having a valid open source license)
then we will consider your submission for review. Should your submission be accepted for review, we may ask you,
the submitting author, to help us find reviewers who already have the required development environment installed.
After submission:
• An Associate Editor-in-Chief will carry out an initial check of your submission, and proceed to assign a handling
editor.
• The handling editor will assign two or more JOSS reviewers, and the review will be carried out in the JOSS
reviews repository.
• Authors will respond to reviewer-raised issues (if any are raised) on the submission repository’s issue tracker.
Reviewer and editor contributions, like any other contributions, should be acknowledged in the repository.
• Upon successful completion of the review, authors will make a tagged release of the software, and deposit a
copy of the repository with a data-archiving service such as Zenodo or figshare, get a DOI for the archive, and
update the review issue thread with the version number and DOI.
• After we assign a DOI for your accepted JOSS paper, its metadata is deposited with CrossRef and listed on the
JOSS website.
• The review issue will be closed, and an automatic tweet from @JOSS_TheOJ will announce it!
If you want to learn more details about the review process, take a look at the reviewer guidelines.
Firstly, thank you so much for agreeing to review for the Journal of Open Source Software (JOSS), we’re delighted to
have your help. This document is designed to outline our editorial guidelines and help you understand our requirements
for accepting a submission into JOSS. Our review process is based on the tried-and-tested approach of the rOpenSci collaboration.
We like to think of JOSS as a ‘developer friendly’ journal. That is, if the submitting authors have followed best
practices (have documentation, tests, continuous integration, and a license) then their review should be rapid.
For those submissions that don’t quite meet the bar, please try to give clear feedback on how authors could improve
their submission. A key goal of JOSS is to raise the quality of research software generally and you (the experienced
reviewer) are well placed to give this feedback.
A JOSS review involves checking submissions against a checklist of essential software features and details in the
submitted paper. This should be objective, not subjective; it should be based on the materials in the submission as
perceived without distortion by personal feelings, prejudices, or interpretations.
We encourage reviewers to file issues against the submitted repository’s issue tracker. When you have completed
your review, please leave a comment in the review issue saying so.
You can include in your review links to any new issues that you the reviewer believe to be impeding the acceptance of
the repository. (Similarly, if the submitted repository is a GitHub repository, mentioning the review issue URL in the
submitted repository’s issue tracker will create a mention in the review issue’s history.)
The definition of a conflict of interest in peer review is a circumstance that makes you "unable to make an impartial
scientific judgment or evaluation.” (PNAS Conflict of Interest Policy). JOSS is concerned with avoiding any actual
conflicts of interest, and being sufficiently transparent that we avoid the appearance of conflicts of interest as well.
As a reviewer, a COI is a present or previous association with any author of a submission: recent (past four years) collaboration on funded research or published work; and, for family members, business partners, and thesis student/advisor or mentor relationships, a lifetime association. In addition, recent (past year) association with the same organization as a submitter is a COI, for example, being employed at the same institution.
If you have a conflict of interest with a submission, you should disclose the specific reason to the submission's editor.
This may lead to you not being able to review the submission, but some conflicts may be recorded and then waived, and
if you think you are able to make an impartial assessment of the work, you should request that the conflict be waived.
For example, if you and a submitter were two of 2000 authors of a high energy physics paper but did not actually collaborate. Or if you and a submitter worked together 6 years ago, but due to delays in the publishing industry, a paper from that collaboration with both of you as authors was published 2 years ago. Or if you and a submitter are both employed by the same very large organization but in different units without any knowledge of each other.
Declaring actual, perceived, and potential conflicts of interest is required under professional ethics. If in doubt: ask
the editors.
As outlined in the submission guidelines provided to authors, the JOSS paper (the compiled PDF associated with this
submission) should only include:
• A list of the authors of the software and their affiliations.
• A summary describing the high-level functionality and purpose of the software for a diverse, non-specialist
audience.
• A clear statement of need that illustrates the purpose of the software.
• A description of how this software compares to other commonly-used packages in this research area.
• Mentions (if applicable) of any ongoing research projects using the software or recent scholarly publications
enabled by it.
• A list of key references including a link to the software archive.
Important: Note the paper should not include software documentation such as API (Application Programming
Interface) functionality, as this should be outlined in the software documentation.
Important: Note, if you’ve not yet been involved in a JOSS review, you can see an example JOSS review checklist
here.
There should be an OSI approved license included in the repository. Common licenses such as those listed on choosealicense.com are preferred. Note there should be an actual license file present in the repository, not just a reference to the license.
Acceptable: A plain-text LICENSE or COPYING file with the contents of an OSI approved license
Not acceptable: A phrase such as 'MIT license' in a README file
Reviewers should verify that the software represents substantial scholarly effort. As a rule of thumb, JOSS’ minimum
allowable contribution should represent not less than three months of work for an individual. Signals of effort may
include:
• Age of software (is this a well-established software project) / length of commit history.
• Number of commits.
• Number of authors.
• Lines of code (LOC): These statistics are usually reported by Whedon in the pre-review issue thread.
• Whether the software has already been cited in academic papers.
• Whether the software is sufficiently useful that it is likely to be cited by other researchers working in this domain.
These guidelines are not meant to be strictly prescriptive. Recently released software may not have been around long
enough to gather citations in academic literature. While some authors contribute openly and accrue a long and rich
commit history before submitting, others may upload their software to GitHub shortly before submitting their JOSS
paper. Reviewers should rely on their expert understanding of their domain to judge whether the software is of broad
interest (likely to be cited by other researchers) or more narrowly focused around the needs of an individual researcher
or lab.
Note: The decision on scholarly effort is ultimately one made by JOSS editors. Reviewers are asked to flag submissions of questionable scope during the review process so that the editor can bring this to the attention of the JOSS
editorial team.
3.3.2.3 Documentation
There should be sufficient documentation for you, the reviewer, to understand the core functionality of the software under review. A high-level overview of this documentation should be included in a README file (or equivalent). The documentation should cover the following areas.
The authors should clearly state what problems the software is designed to solve and who the target audience is.
There should be a clearly-stated list of dependencies. Ideally these should be handled with an automated package
management solution.
Good: A package management file such as a Gemfile or package.json or equivalent
OK: A list of dependencies to install
Bad (not acceptable): Reliance on other software not listed by the authors
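As a purely illustrative sketch of the 'Good' case (an 'equivalent' of a Gemfile or package.json, here for a hypothetical Python package managed with conda), an environment.yml might look like this:

name: example-research-tool   # hypothetical package name used for illustration
channels:
  - conda-forge
dependencies:
  - python=3.9
  - numpy
  - scipy
  - pytest

The point is that a reviewer, or any other user, can recreate the environment with a single command rather than guessing at undeclared dependencies.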
The authors should include examples of how to use the software (ideally to solve real-world analysis problems).
Reviewers should check that the software API is documented to a suitable level.
Good: All functions/methods are documented including example inputs and outputs
OK: Core API functionality is documented
Bad (not acceptable): API is undocumented
Note: The decision on API documentation is left largely to the discretion of the reviewer and their experience of
evaluating the software.
3.3.2.4 Functionality
Reviewers are expected to install the software they are reviewing and to verify the core functionality of the software.
3.3.2.5 Tests
Authors are strongly encouraged to include an automated test suite covering the core functionality of their software.
Good: An automated test suite hooked up to an external service such as Travis-CI or similar
OK: Documented manual steps that can be followed to objectively check the expected functionality of the software (e.g., a sample input file to assert behavior)
Bad (not acceptable): No way for you, the reviewer, to objectively assess whether the software works
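As an illustrative sketch only (assuming a hypothetical Python package whose tests run with pytest), a minimal Travis-CI configuration that runs the test suite on every push could look like this:

# .travis.yml: install the package and run its tests on each push
language: python
python:
  - "3.8"
install:
  - pip install .
  - pip install pytest
script:
  - pytest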
3.3.3.1 Authorship
As part of the review process, you are asked to check whether the submitting author has made a 'substantial contribution' to the submitted software (as determined by the commit history), and to check that the full list of paper authors seems appropriate and complete.
As discussed in the submission guidelines for authors, authorship is a complex topic with different practices in different
communities. Ultimately, the authors themselves are responsible for deciding which contributions are sufficient for
co-authorship, although JOSS policy is that purely financial contributions are not considered sufficient. Your job as a
reviewer is to check that the list of authors appears reasonable, and if it’s not obviously complete/correct, to raise this
as a question during the review.
3.3.3.2 An important note about ‘novel’ software and citations of relevant work
Submissions that implement solutions already solved in other software packages are accepted into JOSS provided that
they meet the criteria listed above and cite prior similar work. Reviewers should point out relevant published work
which is not yet cited.
3.3.3.3 What happens if the software I’m reviewing doesn’t meet the JOSS criteria?
We ask that reviewers grade submissions in one of three categories: 1) Accept, 2) Minor Revisions, 3) Major Revisions. Unlike some journals, we do not outright reject submissions requiring major revisions; we're more than happy to give the authors as long as they need to make these modifications/improvements.
3.3.3.4 What about submissions that rely upon proprietary languages/development environments?
As outlined in our author guidelines, submissions that rely upon a proprietary/closed source language or development
environment are acceptable provided that they meet the other submission requirements and that you, the reviewer, are
able to install the software & verify the functionality of the submission as required by our reviewer guidelines.
If an open source or free variant of the programming language exists, feel free to encourage the submitting author to
consider making their software compatible with the open source/free variant.
3.4 Review checklist
JOSS reviews are checklist-driven. That is, there is a checklist for each JOSS reviewer to work through when completing their review. A JOSS review is generally considered incomplete until the reviewer has checked off all of their
checkboxes.
Below is an example of the review checklist for the Yellowbrick JOSS submission.
Important: Note this section of our documentation only describes the JOSS review checklist. Authors and reviewers
should consult the review criteria to better understand how these checklist items should be interpreted.
3.4.1 Conflict of interest
• I confirm that I have read the JOSS conflict of interest policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
3.4.2 Code of Conduct
• I confirm that I read and will adhere to the JOSS code of conduct.
3.4.3 General checks
• Repository: Is the source code for this software available at the repository url?
• License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software
license?
• Contribution and authorship: Has the submitting author made major contributions to the software? Does the
full list of paper authors seem appropriate and complete?
3.4.4 Functionality
• Installation: Does installation proceed as outlined in the documentation?
• Functionality: Have the functional claims of the software been confirmed?
• Performance: If there are any performance claims of the software, have they been confirmed?
3.4.5 Documentation
• A statement of need: Do the authors clearly state what problems the software is designed to solve and who the
target audience is?
• Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with
an automated package management solution.
• Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
• Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g.,
API method documentation)?
• Automated tests: Are there automated tests or manual steps described so that the functionality of the software
can be verified?
• Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2)
Report issues or problems with the software 3) Seek support
3.4.6 Software paper
• Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
• A statement of need: Do the authors clearly state what problems the software is designed to solve and who the
target audience is?
• State of the field: Do the authors describe how this software compares to other commonly-used packages?
• Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing
quality)?
• References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g.,
papers, datasets, software)? Do references in the text use the proper citation syntax?
3.5 Editorial Guide
The Journal of Open Source Software (JOSS) conducts all peer review and editorial processes in the open, on the
GitHub issue tracker.
JOSS editors manage the review workflow with the help of our bot, @whedon. The bot is summoned with commands
typed directly on the GitHub review issues. For a list of commands, type: @whedon commands.
Note: To learn more about @whedon’s functionalities, take a look at our dedicated guide.
3.5.1 Pre-review
Once a submission comes in, it will be in the queue for a quick check by the Editor-in-chief (EiC). From there, it
moves to a PRE-REVIEW issue, where the EiC will assign a handling editor, and the author can suggest reviewers.
Initial direction to the authors for improving the paper can already happen here, especially if the paper lacks some
requested sections.
Important: If the paper is out-of-scope for JOSS, editors assess this and notify the author in the PRE-REVIEW issue.
Editors can flag submissions of questionable scope using the command @whedon query scope.
The EiC assigns an editor (or a volunteering editor self-assigns) with the command @whedon assign @username as editor in a comment.
Note: If a paper is submitted without a recommended editor, it will show up in the weekly digest email under the category 'Papers currently without an editor.' Please review this weekly email and volunteer to edit papers that look to be in your domain. If you choose to edit a submission, type the command @whedon assign @yourhandle as editor in the issue thread, or simply @whedon assign me as editor.
By default, unless an editor volunteers, the Associate Editor-in-Chief (AEiC) on duty will attempt to assign an incoming paper to the most suitable handling editor. While AEiCs will make every effort to match a submission with the most appropriate editor, there are a number of situations where an AEiC may assign a paper to an editor whose research domains don't entirely cover the submission:
• If there’s no obvious fit to any of the JOSS editors
• If the most suitable editor is already handling a large number of papers
• If the chosen editor has a lighter editorial load than other editors
In most cases, an AEiC will ask one or more editors to edit a submission (e.g. @editor1, @editor2 - would one of you be willing to edit this submission for JOSS?). If the editor doesn't respond within ~3 working days, the AEiC may assign the paper to the editor regardless.
Editors may also be invited to edit over email when an AEiC runs the command @whedon invite @editor1 as editor.
At this point, the handling editor’s job is to identify reviewers who have sufficient expertise in the field of software
and in the field of the submission. JOSS papers have to have a minimum of two reviewers per submission, except for
papers that have previously been peer-reviewed via rOpenSci. In some cases, the editor also might want to formally
add themself as one of the reviewers. If the editor feels particularly unsure of the submission, a third (or fourth)
reviewer can be recruited.
To recruit reviewers, the handling editor can mention them in the PRE-REVIEW issue with their GitHub handle, ping
them on Twitter, or email them. After expressing initial interest, candidate reviewers may need a longer explanation
via email. See sample reviewer invitation email, below.
Reviewer Considerations
• It is rare that all reviewers have the expertise to cover all aspects of a submission (e.g., knows the language really
well and knows the scientific discipline well). As such, a good practice is to try and make sure that between the
two or three reviewers, all aspects of the submission are covered.
• Selection and assignment of reviewers should adhere to the JOSS COI policy.
Potential ways to find reviewers
Finding reviewers can be challenging, especially if a submission is outside of your immediate area of expertise. Some
strategies you can use to identify potential candidates:
• Search the reviewer spreadsheet of volunteer reviewers.
– When using this spreadsheet, pay attention to the number of reviews this individual is already doing to
avoid overloading them.
– It can be helpful to use the “Data > Filter Views” capability to temporarily filter the table view to include
only people with language or domain expertise matching the paper.
• Ask the author(s): You are free to ask the submitting author to suggest possible reviewers by using the reviewer
spreadsheet and also people from their professional network. In this situation, the editor still needs to verify that
their suggestions are appropriate.
• Use your professional network: You’re welcome to invite people you know of who might be able to give a good
review.
• Search Google and GitHub for related work, and write to the authors of that related work.
– You might like to try this tool from @dfm.
• Ask on social networks: Sometimes asking on Twitter for reviewers can identify good candidates.
• Check the work being referenced in the submission:
– Authors of software that is being built on might be interested in reviewing the submission.
– Users of the software that is being submitted might be interested in reviewing the submission.
• Avoid asking JOSS editors to review: If at all possible, avoid asking JOSS editors to review as they are generally
very busy editing their own papers.
Once a reviewer accepts, the handling editor runs the command @whedon assign @username as reviewer
in the PRE-REVIEW issue. Add more reviewers with the command @whedon add @username as reviewer.
Under the uncommon circumstance that a review must be started before all reviewers have been identified (e.g., if
finding a second reviewer is taking a long time and the first reviewer wants to get started), an editor may elect to start
the review and add the remaining reviewers later. To accomplish this, the editor will need to hand-edit the review checklist to create space for the reviewers added after the review issue is created.
Note: The assign command clobbers all reviewer assignments. If you want to add an additional reviewer use the
add command.
Next, run the command @whedon start review. If you haven’t assigned an editor and reviewer, this command
will fail and @whedon will tell you this. This will open the REVIEW issue, with prepared review checklists for each
reviewer, and instructions. The editor should close the PRE-REVIEW issue at this point and move the conversation to the separate REVIEW issue.
3.5.2 Review
The REVIEW issue contains some instructions, and reviewer checklists. The reviewer(s) should check off items of the
checklist one-by-one, until done. In the meantime, reviewers can engage the authors freely in a conversation aimed at
improving the paper.
If a reviewer recants their commitment or is unresponsive, editors can remove them with the command @whedon
remove @username as reviewer. You can also add new reviewers in the REVIEW issue, but in this case, you
need to manually add a review checklist for them by editing the issue body.
Comments in the REVIEW issue should be kept brief, as much as possible, with more lengthy suggestions or requests
posted as separate issues, directly in the submission repository. A link-back to those issues in the REVIEW is helpful.
When the reviewers are satisfied with the improvements, we ask that they confirm their recommendation to accept the
submission.
Sometimes you’ll need to add a new reviewer once the main review (i.e. post pre-review) is already underway. In this
situation you should do the following:
• In the review thread, do @whedon add @newreviewer as reviewer.
• Manually edit the first message in the thread to add a checklist for @newreviewer.
In the future, this will be more automated but for now, there’s some manual work required.
When a submission is ready to be accepted, we ask that the authors issue a new tagged release of the software (if
changed), and archive it (on Zenodo, figshare, or other). The authors then post the version number and archive DOI
in the REVIEW issue. The handling editor executes the pre-publication steps, and pings the EiCs for final processing.
Pre-publication steps:
• Get a new proof with the @whedon generate pdf command.
• Download the proof, check all references have DOIs, follow the links and check the references.
– Whedon can help check references with the command @whedon check references
• Proof-read the paper and ask authors to fix any remaining typos, badly formed citations, awkward wording, etc.
• Ask the author to make a tagged release and archive, and report the version number and archive DOI in the
review thread.
• Check the archive deposit has the correct metadata (title and author list), and request the author edit it if it
doesn’t match the paper.
• Run @whedon set <doi> as archive.
• Run @whedon set <v1.x.x> as version if the version was updated.
• Run @whedon accept to generate the final proofs, which has Whedon notify the @openjournals/joss-eics team that the paper is ready for final processing.
At this point, the EiC/AEiC will take over to make final checks and publish the paper.
It’s also a good idea to ask the authors to check the proof. We’ve had a few papers request a post-publication change
of author list, for example—this requires a manual download/compile/deposit cycle and should be a rare event.
JOSS is collaborating with AAS publishing to offer software review for some of the papers submitted to their journals.
A detailed overview of the motivations/background is available in the announcement blog post, here we document the
additional editorial steps that are necessary for JOSS to follow:
Before/during review
• If the paper is a joint publication, make sure you apply the AAS label to both the pre-review and the review
issues.
• Before moving the JOSS paper from pre-review to review, ensure that you (the JOSS editor) make the
reviewers aware that JOSS will be receiving a small financial donation from AAS publishing for this review
(e.g. like this).
After the paper has been accepted by JOSS
• Once the JOSS review is complete, ask the author for the status of their AAS publication, specifically if they
have the AAS paper DOI yet.
• Once this is available, ask the author to add this information to their paper.md YAML header as documented
in the submission guidelines.
# Optional fields if submitting to a AAS journal too, see this blog post:
# https://blog.joss.theoj.org/2018/12/a-new-collaboration-with-aas-publishing
aas-doi: 10.3847/xxxxx <- update this with the DOI from AAS once you know it.
aas-journal: Astrophysical Journal <- The name of the AAS journal.
• Pause the review (by applying the paused label) to await notification that the AAS paper is published.
If a paper has already been reviewed and accepted by rOpenSci, the streamlined JOSS review process is:
• Assign yourself as editor and reviewer
• Add a comment in the pre-review issue pointing to the rOpenSci review
• Add the rOpenSci label to the pre-review issue
• Start the review issue
• Add a comment in the review issue pointing to the rOpenSci review
• Add the rOpenSci label to the review issue
• Compile the paper and check it looks ok
• Tick off all the review checkboxes
• Go to the source code repo and grab the Zenodo DOI
• Accept and publish the paper
If you believe a submission should be rejected, for example, because it is out of scope for JOSS, then you should:
• Ask Whedon to flag the submission as potentially out of scope with the command @whedon query scope.
This command adds the query-scope label to the issue.
• Mention to the author your reasons for flagging the submission as possibly out of scope, and give them an
opportunity to defend their submission.
• The EiC on rotation will make a final determination of whether a submission is in scope, taking into account the
feedback of other editors.
Once per week, an email is sent to all JOSS editors with a summary of the papers that are currently flagged as
potentially out of scope. Editors are asked to review these submissions and vote on the JOSS website if they have an
opinion about a submission.
3.5.7 Sample messages for authors and reviewers
3.5.7.1 Sample email to potential reviewers

I found you following links from the page of The Super Project and/or on Twitter. This message is to ask if you can help us out with a submission to JOSS (The Journal of Open Source Software).

JOSS publishes articles about open source research software. The submission I'd like you to review is [title and repository URL of the submission].

The review process at JOSS is unique: it takes place in a GitHub issue, is open, and author-reviewer-editor conversations are encouraged.

JOSS reviews involve downloading and installing the software, and inspecting the repository and submitted paper for key elements.

Editors and reviewers post comments on the Review issue, and authors respond to the comments and improve their submission until acceptance (or withdrawal, if they feel unable to satisfy the review).

Would you be able to review this submission for JOSS? If not, can you recommend someone from your team to help out?

Kind regards,
JOSS Editor.
3.5.7.2 Query scope of submission

:wave: thanks for your submission to JOSS. From a quick inspection of this submission it's not entirely obvious that it meets our [submission criteria](https://joss.readthedocs.io/en/latest/submitting.html#submission-requirements). In particular, this item:

Could you confirm here that there _is_ a research application for this software (and explain what that application is)? The section [_'what should my paper contain'_](https://joss.readthedocs.io/en/latest/submitting.html#what-should-my-paper-contain) has some guidance for the sort of content we're looking for in the `paper.md`.

Many thanks!
3.5.7.3 GitHub invite to potential reviewers

:wave: @reviewer1 & @reviewer2, would any of you be willing to review this submission for JOSS? We carry out our checklist-driven reviews here in GitHub issues and follow these guidelines: https://joss.readthedocs.io/en/latest/review_criteria.html
3.5.7.4 Message to reviewers at the start of a review

@authorname @reviewer1 @reviewer2 this is the review thread for the paper. All of our communications will happen here from now on.

Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention `openjournals/joss-reviews#REVIEW_NUMBER` so that a link is created to this thread (and so I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread as you come across them instead of waiting until you've reviewed the entire package.

We aim for reviews to be completed within about 2-4 weeks. Please let me know if any of you require some more time. We can also use Whedon (our bot) to set automatic reminders.
3.5.7.5 Message to authors at the end of a review

- [ ] Check that the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also need to edit the deposit metadata if it doesn't match the paper.
3.5.9 Visualization of editorial flow
Editorial flow diagram
As documented above, usually, papers will be assigned to you by one of the AEiCs. We ask that editors do their best
to respond in a timely fashion (~ 3 working days) to invites to edit a new submission.
As an editor, part of your role is to ensure that submissions you’re responsible for are progressing smoothly through
the editorial process. This means that once or twice per week we ask that you check your GitHub notifications and/or
your editorial dashboard (e.g. http://joss.theoj.org/dashboard/youreditorname) for updates to the
papers you are handling.
If reviews go stale
Sometimes reviews go quiet, either because a reviewer has failed to complete their review or an author has been slow to respond to a reviewer's feedback. As the editor, we need you to prompt the author and/or reviewer(s) to revisit the submission if there has been no response within 7-10 days, unless there's a clear statement in the review thread that says an action is coming at a slightly later time, perhaps because a reviewer committed to a review by a certain date, or an author is making changes and says they will be done by a certain date.
Whedon has functionality to remind an author or reviewer to return to a review at a certain point in the future. For example:
@whedon remind @reviewer in two weeks
3.5.11 Out of office
Sometimes we need time away from our editing duties at JOSS. The joss-reviews repository has the OoO bot installed, which means you can mark yourself as out of the office (and unable to respond to reviews) for a period of time. Mark yourself as OoO in one of the reviews you're editing in the joss-reviews repository; the OoO bot will then respond to any mentions in the joss-reviews repository to let people know you're away.
Note, if you’re planning on being out of the office for more than two weeks, please let the JOSS editorial team
know.
New editors are assigned an editorial ‘buddy’ from the existing editorial team. The buddy is there to help the new editor
onboard successfully and to provide a dedicated resource for any questions they might have but don’t feel comfortable
posting to the editor mailing list.
Buddy assignments don't have a fixed term but generally require a commitment for 1-2 months.
Some things you might need to do as a buddy for a new editor:
• Respond to questions via email or on GitHub review issues.
• Check in with the new editor every couple of weeks if there hasn’t been any other communication.
• (Optionally) keep an eye on the new editor’s submissions.
Being on the JOSS editorial team means that there can be a lot of notifications from GitHub if you don’t take some
proactive steps to minimize noise from the reviews repository.
Unwatch the reviews repository
Please note that by not watching the reviews repository, you will still receive notifications for issues (reviews) where you are @mentioned.
Sometimes another editor might mention you in a review issue (for example to ask you a question). If you’ve responded
and no longer want to receive messages for that review, you can manually unsubscribe by clicking the button in the
right-hand column on the review issue page.
Curate your GitHub notifications experience
GitHub has extensive documentation on managing notifications which explains when and why different notifications
are sent from a repository.
Set up email filters
Email filters can be very useful for managing incoming email notifications, here are some recommended resources:
• A GitHub blog post describing how to set up email filters.
If you use Gmail:
• https://gist.github.com/ldez/bd6e6401ad0855e6c0de6da19a8c50b5
• https://github.com/jessfraz/gmailfilters
• https://hackernoon.com/how-to-never-miss-a-github-mention-fdd5a0f9ab6d
3.6 Interacting with Whedon
Whedon, or @whedon on GitHub, is our editorial bot that interacts with authors, reviewers, and editors on JOSS reviews.
@whedon can do a bunch of different things. If you want to ask @whedon what it can do, simply type the following
in a JOSS review or pre-review issue:
# List all of Whedon's capabilities
@whedon commands
# Set the software archive DOI at the top of the issue e.g.
@whedon set 10.0000/zenodo.00000 as archive
EDITORIAL TASKS
# Ask Whedon to do a dry run of accepting the paper and depositing with Crossref
@whedon accept
EiC TASKS
# Reject a paper
@whedon reject
# Withdraw a paper
@whedon withdraw
# Ask Whedon to actually accept the paper and deposit with Crossref
# (supports custom branches too)
@whedon accept deposit=true
When a pre-review or review issue is opened, @whedon will try to compile the JOSS paper by looking for a
paper.md file in the repository specified when the paper was submitted.
If it can’t find the paper.md file it will say as much in the review issue. If it can’t compile the paper (i.e. there’s
some kind of Pandoc error), it will try and report that error back in the thread too.
Note: If you want to see what command @whedon is running when compiling the JOSS paper, take a look at the
code here.
Anyone can ask @whedon to compile the paper again (e.g. after a change has been made). To do this, simply comment on the review thread as follows:
@whedon generate pdf
By default, Whedon will look for papers in the default git branch. If you want to compile a paper from a non-default branch, this can be done as follows:
@whedon generate pdf from branch custom-branch-name
Sometimes submitting authors suggest people they think might be appropriate to review their submission. If you want the link to the current list of JOSS reviewers, type the following in the review thread:
@whedon list reviewers
Editors can either assign themselves or other editors as the editor of a submission as follows:
@whedon assign @username as editor
Whedon can be used by EiCs to send email invites to registered editors as follows:
@whedon invite @username as editor
This will send an automated email to the editor with a link to the GitHub pre-review issue.
Note: The assign command clobbers all reviewer assignments. If you want to add an additional reviewer use the
add command.
Once the reviewer(s) and editor have been assigned in the pre-review issue, the editor starts the review with:
@whedon start review
Important: If a reviewer recants their commitment or is unresponsive, editors can remove them with the command
@whedon remove @username as reviewer. You can also add new reviewers in the REVIEW issue, but in
this case, you need to manually add a review checklist for them by editing the issue body.
Whedon can remind authors and reviewers to return to the review issue after a specified amount of time. Reminders can only be set by editors, and only for REVIEW issues. For example:
@whedon remind @reviewer in two weeks
Important: For reviewers, the reminder will only be triggered if the reviewer’s review is outstanding (i.e. outstanding
checkboxes).
When a submission is accepted, we ask that the authors create an archive (on Zenodo, figshare, or other) and post the archive DOI in the REVIEW issue. The editor should add the accepted label on the issue and ask @whedon to add the archive to the issue as follows:
@whedon set 10.0000/zenodo.00000 as archive
Sometimes the version of the software changes as a consequence of the review process. To update the version of the software, do the following:
@whedon set v1.0.1 as version
Whedon can accept a paper from the review issue. This includes generating the final paper PDF and Crossref metadata, and depositing this metadata with the Crossref API.
JOSS topic editors can ask for the final proofs to be created by Whedon with the following command:
@whedon accept
On issuing this command, Whedon will also check the references of the paper for any missing DOIs. This check can also be triggered separately:
@whedon check references
Note: Whedon can verify that DOIs resolve, but cannot verify that the DOI associated with a paper is actually correct.
In addition, DOI suggestions from Whedon are just that - i.e. they may not be correct.
If everything looks good with the draft proofs from the @whedon accept command, JOSS editors-in-chief can take the additional step of actually accepting the JOSS paper with the following command:
@whedon accept deposit=true
Note: This command is only available to the JOSS editor-in-chief or associate editors-in-chief.
3.7 Installing the JOSS application
To be written...
For now, please take a look at the JOSS codebase.