Review History


To increase transparency, PeerJ operates a system of 'optional signed reviews and history'. This takes two forms: (1) peer reviewers are encouraged, but not required, to provide their names (if they do so, then their profile page records the articles they have reviewed), and (2) authors are given the option of reproducing their entire peer review history alongside their published article (in which case the complete peer review process is provided, including revisions, rebuttal letters and editor decision letters).


Summary

  • The initial submission of this article was received on January 21st, 2014 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on February 18th, 2014.
  • The first revision was submitted on March 28th, 2014 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on April 3rd, 2014.

Version 0.2 (accepted)

Academic Editor

Accept

Your manuscript is accepted for publication in PeerJ.

Version 0.1 (original submission)

Academic Editor

Minor Revisions

Consider the suggestion made by reviewer 1 and provide arguments if you do not find it valuable.

Reviewer 1

Basic reporting

The reporting of the paper is very clear and thorough, in terms of both background and methods.

Experimental design

The research question is clear as are the methods used for conducting the meta-analyses and the meta-regression analyses. The methods used are of a high technical standard.

Validity of the findings

I think the conclusions are sound and based on high-quality statistical analyses and well-established systematic review methods.

I have one suggestion for improving the analyses, but it is relatively minor and it is up to the authors whether they want to consider it:

I think it makes sense that you extracted 188 effect sizes from the 9 randomised controlled experiments, as this is a good use of the data reported in the papers. The discussion mentions the limitation of non-independence in the synthesis, but I wonder whether it is worth doing a sensitivity analysis using robust standard errors as a further way of justifying the validity of this approach. See, for example: http://onlinelibrary.wiley.com/doi/10.1002/jrsm.5/pdf
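As a rough illustration of the kind of sensitivity check described here, the sketch below computes a pooled mean effect with cluster-robust (sandwich) standard errors for effect sizes nested within studies, in the spirit of the robust variance estimation approach of Hedges, Tipton and Johnson (2010), the paper linked above. The effect sizes, sampling variances, and the assumed between-study variance `tau2` are hypothetical placeholders, not data from the article under review; in practice one would normally use dedicated tools such as the R packages robumeta or metafor.

```python
import numpy as np

# Minimal sketch: cluster-robust (sandwich) variance for a pooled mean effect
# when several effect sizes come from the same study, so that within-study
# correlation does not invalidate the standard error.
# All numbers are hypothetical placeholders.

# Effect sizes and sampling variances grouped by study (one array per study).
effects = [np.array([0.42, 0.35, 0.51]),   # study 1: 3 correlated effects
           np.array([0.10, 0.05]),         # study 2: 2 correlated effects
           np.array([0.63])]               # study 3: 1 effect
variances = [np.array([0.04, 0.05, 0.04]),
             np.array([0.03, 0.03]),
             np.array([0.06])]

tau2 = 0.02  # assumed between-study variance component (placeholder value)

def study_weight(v, tau2):
    # Correlated-effects weighting: each effect in study j gets
    # 1 / (k_j * (mean variance_j + tau2)), so a study contributes roughly
    # equally however many effects it reports.
    return 1.0 / (len(v) * (v.mean() + tau2))

# Weighted mean effect (intercept-only meta-regression).
num = sum(study_weight(v, tau2) * T.sum() for T, v in zip(effects, variances))
den = sum(study_weight(v, tau2) * len(T) for T, v in zip(effects, variances))
b = num / den

# Robust variance: sum, over studies, of the squared weighted residual totals.
# Summing residuals within a study before squaring is what keeps the estimate
# valid under within-study dependence.
meat = sum((study_weight(v, tau2) * (T - b).sum()) ** 2
           for T, v in zip(effects, variances))
se_robust = np.sqrt(meat) / den

print(f"pooled effect = {b:.3f}, robust SE = {se_robust:.3f}")
```

Comparing this robust standard error with the one from the original model would indicate how sensitive the conclusions are to the non-independence of effect sizes within studies.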

Reviewer 2

Basic reporting

Clearly and concisely written throughout, this article conforms to the PeerJ structure whilst incorporating the standard sections used in reporting the findings of systematic reviews and meta-analyses. The introduction and discussion are adequate to place the study aims and findings within the context of existing research and demonstrate how the study fills a knowledge gap in the subject. Figures are of sufficient quality for publication, with the exception of the formatting of the final column in Table 1: the column needs to be widened to properly accommodate the term 'hemicryptophyte'.

Experimental design

The investigation has been undertaken rigorously and meets the standard requirements of systematic review and meta-analysis.

Validity of the findings

As with all properly undertaken systematic reviews, the findings of this article not only attempt to summarise a general outcome relating to a given intervention but also shine a light on the needs of future research. The article is carefully argued throughout; it does not overstate the findings and is thorough in its treatment of the limitations of existing research. I hope the plant conservation community is alerted to the existence of this article and takes its recommendations on board in order to improve future practice and research.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.