Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on March 29th, 2015 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on April 21st, 2015.
  • The first revision was submitted on May 5th, 2015 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on May 11th, 2015.

Version 0.2 (accepted)

· May 11, 2015 · Academic Editor

Accept

The manuscript has been greatly improved after revision; I now suggest acceptance.

Version 0.1 (original submission)

· Apr 21, 2015 · Academic Editor

Major Revisions

It is nice to see the idea of combining bootstrap and mRMR to find robust and non-redundant biomarkers. Both reviewers find the work interesting and give some positive comments, but they also raise major concerns that need to be addressed. I suggest the authors provide a point-by-point response letter along with the revision.

Regarding the removal of redundancy among selected features, an important reference is missing. The recent method in Nucl. Acids Res. (2013) 41(4): e53 can outperform mRMR on gene expression data.
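For readers unfamiliar with the approach the editor refers to, the sketch below illustrates one way bootstrap resampling can be wrapped around an mRMR-style selector so that only features chosen consistently across resamples are retained. This is an illustrative outline under assumed choices (mutual information for relevance, absolute correlation for redundancy, a selection-frequency threshold); it is not the authors' MetaBoot implementation nor the method cited above.

```python
# Illustrative sketch (not the authors' MetaBoot code): bootstrap-wrapped
# mRMR-style selection that keeps features chosen consistently across resamples.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mrmr_select(X, y, k):
    """Greedy mRMR-style selection: maximize relevance (MI with y) minus
    mean redundancy (absolute correlation with already-selected features)."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = np.mean([abs(np.nan_to_num(np.corrcoef(X[:, j], X[:, s])[0, 1]))
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

def bootstrap_mrmr(X, y, k=10, n_boot=100, threshold=0.8, seed=0):
    """Run mRMR on bootstrap resamples; keep features selected in at least
    `threshold` of the resamples as a proxy for robustness."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(X.shape[1])
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), size=len(y))
        counts[mrmr_select(X[idx], y[idx], k)] += 1
    return np.where(counts / n_boot >= threshold)[0]
```

The selection-frequency threshold is the knob that trades sensitivity for robustness: features surviving most resamples are less likely to be artefacts of a single split.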

Reviewer 1 ·

Basic reporting

none

Experimental design

none

Validity of the findings

none

Additional comments

none

Annotated reviews are not available for download in order to protect the identity of reviewers who chose to remain anonymous.

Reviewer 2 ·

Basic reporting

The authors give a sufficient introduction to and background on OTU-based microbial community analysis, but there is a minor mistake. At the end of the 4th paragraph it is stated that "none of these methods directly identify biological features responsible for group relationships". However, tools like Mothur do calculate beta-diversity from OTU abundances and can find the OTUs (at the species level) that have significantly different abundances.
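As a hypothetical illustration of the kind of per-OTU differential-abundance testing the reviewer alludes to, one simple scheme is a nonparametric test per OTU followed by FDR correction. The sketch below assumes a Kruskal-Wallis test; it is not how Mothur itself implements these tests.

```python
# Hypothetical per-OTU differential-abundance test (illustrative only;
# Mothur's own tests are implemented differently).
import numpy as np
from scipy.stats import kruskal
from statsmodels.stats.multitest import multipletests

def differentially_abundant_otus(otu_table, groups, alpha=0.05):
    """otu_table: (samples x OTUs) abundance matrix; groups: per-sample labels."""
    groups = np.asarray(groups)
    labels = np.unique(groups)
    pvals = [kruskal(*[otu_table[groups == g, j] for g in labels]).pvalue
             for j in range(otu_table.shape[1])]
    reject, _, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return np.where(reject)[0]  # indices of OTUs that differ between groups
```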

Experimental design

In the "Classification accuracy analysis based on synthetic datasets", the authors need to clarify a key point of the experiment design. Was the MetaBoot feature selection carried out on the training set (50 samples) or on the training+testing sets (60 samples)? If it's done on 60 samples, the leaks of testing samples' information will lead to underestimation of classification error rate. In addition, 10 testing samples are not enough to get stable estimate of the classification accuracy. A better way is to run a 6-fold cross-validation.

Validity of the findings

For the soil dataset, the selected biomarkers are at the phylum level, so it is impossible to understand their functional roles from a biological or environmental point of view. It would be better to use real datasets that can be resolved at the genus level. The authors may be interested in the following papers studying the human gut microbiome.
(1) A human gut microbial gene catalogue established by metagenomic sequencing.
(2) Diet rapidly and reproducibly alters the human gut microbiome.

Additional comments

In general, the article was meaningful and well organized.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.