Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.

Summary

  • The initial submission of this article was received on November 10th, 2020 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on January 2nd, 2021.
  • The first revision was submitted on April 27th, 2021 and was reviewed by 2 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on May 4th, 2021.

Version 0.2 (accepted)

· May 4, 2021 · Academic Editor

Accept

The authors have addressed the reviewer comments.

[# PeerJ Staff Note - this decision was reviewed and approved by Paula Soares, a PeerJ Section Editor covering this Section #]

[# PeerJ Staff Note: Although the Academic and Section Editors are happy to accept your article as being scientifically sound, a final check of the manuscript shows that it would benefit from further English editing. Therefore, if you can identify further edits, please work with our production group to address them while in proof stage #]

Reviewer 1 ·

Basic reporting

The authors have adequately addressed all the comments raised in the previous round of review.

Experimental design

None.

Validity of the findings

None.

Additional comments

The authors have adequately addressed all the comments raised in the previous round of review.

Reviewer 2 ·

Basic reporting

The manuscript needs English proofreading.

Experimental design

None.

Validity of the findings

None.

Additional comments

The manuscript needs English proofreading.

Version 0.1 (original submission)

· Jan 2, 2021 · Academic Editor

Major Revisions

Careful proofreading is necessary.

[# PeerJ Staff Note: Please ensure that all review comments are addressed in a rebuttal letter and any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.  It is a common mistake to address reviewer questions in the rebuttal letter but not in the revised manuscript. If a reviewer raised a question then your readers will probably have the same question so you should ensure that the manuscript can stand alone without the rebuttal letter.  Directions on how to prepare a rebuttal letter can be found at: https://peerj.com/benefits/academic-rebuttal-letters/ #]

[# PeerJ Staff Note: The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title) #]

Reviewer 1 ·

Basic reporting

(1) The manuscript needs English proofreading.

Experimental design

(1) Several of the datasets are imbalanced, so it is better to use the g-mean instead of classification accuracy (an illustrative sketch follows this list).
(2) The authors need to specify how they tuned the SVM parameters.
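A minimal sketch of what these two suggestions could look like in Python with scikit-learn, assuming an RBF-kernel SVM and synthetic imbalanced data standing in for a gene-expression matrix; the data, grid values, and fold count are illustrative assumptions, not the authors' actual setup. The g-mean is computed as the geometric mean of per-class recall and used as the scoring metric for an explicit grid search over C and gamma.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import make_scorer, recall_score
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def g_mean(y_true, y_pred):
    # Geometric mean of per-class recall (sensitivity and specificity in the binary case)
    per_class_recall = recall_score(y_true, y_pred, average=None)
    return float(np.sqrt(np.prod(per_class_recall)))

# Hypothetical imbalanced two-class data standing in for a gene-expression matrix
X, y = make_classification(n_samples=200, n_features=50, weights=[0.8, 0.2], random_state=0)

# Explicit hyperparameter tuning: grid search over C and gamma, scored by the g-mean
param_grid = {"C": [0.1, 1, 10, 100], "gamma": ["scale", 0.01, 0.001]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, scoring=make_scorer(g_mean), cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))

Reporting the grid that was searched, the winning (C, gamma) pair, and the cross-validated g-mean would address both points above.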

Validity of the findings

None.

Additional comments

Several recent papers need to be added, such as:
https://doi.org/10.1007/s11634-018-0334-1

Reviewer 2 ·

Basic reporting

The whole manuscript needs editing by an English-language expert.

Experimental design

The authors need to specify how they chose the SVM hyperparameters.

Validity of the findings

Ok.

Additional comments

None.

Reviewer 3 ·

Basic reporting

The authors of the paper titled "Robust Proportional Overlapping Analysis for Feature Selection in Binary Classification within Functional Genomic Experiments" have proposed a new feature selection method. The authors have performed various analyses to establish their points. I think the authors should address my queries before this paper can be accepted.

Experimental design

line 194: Why is cross-validation not considered here?

line 196: Why are 500 runs considered; is there any specific reason for this? What is the reason behind choosing random forest (RF), support vector machine (SVM), and k-nearest neighbours?

Table 1: The gene count for the nki dataset is very low, which is not acceptable for classification. The class sizes (96/48) are not clear to me; can you please explain them?

line 303: If there is a possibility of improving the method, as you have mentioned in the last paragraph before the References section, then why have you not used it in this paper?

Validity of the findings

line 222: Please provide the selected gene names and their biological significance. Why do they provide better classification results? Please justify that.

Table 4: Why is the maximum gene count 30?

line 294: "median absolute deviation (MAD) than the measure of the interquartile range (IQR) used in ?". What is the reason for the "?"?

Additional comments

The references are very old; please update them with the latest references.

The authors should read the entire manuscript carefully and improve the quality of the language. There are many typographical errors.

In my opinion, the paper lacks novelty. A discussion of the latest related research is required to judge the performance of the proposed feature selection method.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.