All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.
Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.
I have not heard from the reviewers yet. However, since it is holiday time, they may be away. Moreover, only one of them voted "minor changes", and I believe you have adequately addressed those concerns. Thus, I am happy to let you know that your manuscript has been accepted.
[# PeerJ Staff Note - this decision was reviewed and approved by Paula Soares, a PeerJ Section Editor covering this Section #]
Two reviewers mentioned a lack of access to the original survey raw data. Please provide a DOI for the data, or a statement explaining why the data will not be accessible upon publication.
Figure 2 could benefit from absolute numbers in the bars.
I could not identify a link to the raw shared data.
no comment
I missed a statement about the availability of the raw survey data, or about why it cannot be provided.
I like your effort and willingness to understand the question of code publication and sharing.
The manuscript is well written, properly structured and easy to understand. Context and previous findings are adequately provided by references.
It would be helpful if the authors shared the raw anonymized survey results.
The research questions are clearly defined, and the data collection - 188 completed responses to a survey generated for this study - offers the data required to answer them. The data analysis is solid, and several figures help to communicate the core findings efficiently. Table 4 is rather overwhelming and could be translated into a figure to make the data easier to understand.
The analysis is sound and understandable, and it is supported by several references. The authors contextualize the outcome well in the elaborate discussion section. Furthermore, the authors, as employees of PLOS Computational Biology, stated that, based on the survey results, the journal has made the strategic decision to focus on policies rather than offering further technological solutions to facilitate code sharing in their community.
no comment
no comment
The article is clear and unambiguous. There are many well-chosen literature references. All of the figures and raw data are accessible and well-structured.
188 researchers in computational biology were surveyed. The findings also include other research areas, some with a very low number of responses; the relevance of the statistics shown for those areas is somewhat questionable.
no comment
The idea to include the new code notebook initiative, Neurolibre, in the survey was very good. It gave feedback on how those tools are accepted; it may take more time to further increase this acceptance.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.