Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on October 31st, 2022 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on December 18th, 2022.
  • The first revision was submitted on February 24th, 2023 and was reviewed by 2 reviewers and the Academic Editor.
  • A further revision was submitted on April 6th, 2023 and was reviewed by the Academic Editor.
  • A further revision was submitted on May 23rd, 2023 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on June 1st, 2023.

Version 0.4 (accepted)

· Jun 1, 2023 · Academic Editor

Accept

The authors have adequately addressed all issues raised and extensively revised the manuscript which is now acceptable for publication.

[# PeerJ Staff Note - this decision was reviewed and approved by Brenda Oppert, a PeerJ Section Editor covering this Section #]

Version 0.3

· Apr 17, 2023 · Academic Editor

Minor Revisions

The authors have adequately addressed all points raised by the referees. The English language needs editing.

[# PeerJ Staff Note: The Academic Editor has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title). #]

Version 0.2

· Mar 8, 2023 · Academic Editor

Minor Revisions

Please address all issues raised.

Reviewer 2 ·

Basic reporting

All concerns have been addressed.

Experimental design

pass

Validity of the findings

pass

Reviewer 4 ·

Basic reporting

no comment

Experimental design

If possible, it could be useful to describe more specifically what the authors define as DEGs in the section "Disease-Causing Genes", in particular the groups used to make the comparison and obtain the fold change (group 1 vs group 2).

Validity of the findings

no comment

Additional comments

A simple table (in the main text, within the Results and Discussion or Conclusions sections) with the names of the drugs predicted by multiple methods in each dataset (when applicable/useful) could help to summarize the results.

E.g.:

Dataset             Clinical use   Drug     Method 1   Method 2   Method 3
Colorectal cancer   Y              drug_1   X          X
                    N              drug_2   X          X
Melanoma            Y              drug_3   X          X          X
...

Version 0.1 (original submission)

· Dec 18, 2022 · Academic Editor

Major Revisions

Please exhaustively address all the points raised by the referees.

[# PeerJ Staff Note: Please ensure that all review and editorial comments are addressed in a response letter and any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate. #]

[# PeerJ Staff Note: The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title) #]

Reviewer 1 ·

Basic reporting

1- There is no comparison with other state-of-the-art algorithms/tools.
Some important similar methods:
https://doi.org/10.1186/s12967-020-02541-3

https://doi.org/10.1002/psp4.12670

https://doi.org/10.1093/bib/bbab319

Experimental design

pass

Validity of the findings

2- AUC is a very important performance measure for evaluating the repurposed drugs. However, other performance measures such as accuracy (ACC), specificity (Spec), sensitivity (SN), … are also very useful for obtaining a clear and unbiased picture of a model's performance.

Additional comments

In the present study, Cuvitoglu et al. used network neighborhoods for drug repositioning. They also applied the method to melanoma, colorectal, and prostate cancers, for which several candidate drugs were predicted using AUC values of 0.6 or higher. The problem is of utmost importance. Below are my comments:

1- There is no comparison with other state-of-the-art algorithms/tools.
Some important similar methods:
https://doi.org/10.1186/s12967-020-02541-3

https://doi.org/10.1002/psp4.12670

https://doi.org/10.1093/bib/bbab319

2- AUC is a very important performance measure for evaluating the repurposed drugs. However, other performance measures such as accuracy (ACC), specificity (Spec), sensitivity (SN), … are also very useful for obtaining a clear and unbiased picture of a model's performance.
3- Figure 1 does not have a clear and informative caption. For example, the authors could use color-coded nodes instead of arrows to point to certain nodes.
The manuscript lacks informative visualizations.
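The reviewer's point about reporting metrics beyond AUC could be illustrated with a minimal sketch. The labels and predictions below are invented for illustration only; they are not the study's data.

```python
# Illustrative sketch: computing accuracy (ACC), sensitivity (SN), and
# specificity (Spec) from binary predictions, as suggested alongside AUC.

def confusion_counts(y_true, y_pred):
    """Return (TP, TN, FP, FN) for binary labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    acc = (tp + tn) / (tp + tn + fp + fn)
    sn = tp / (tp + fn) if (tp + fn) else 0.0   # sensitivity (recall)
    spec = tn / (tn + fp) if (tn + fp) else 0.0  # specificity
    return acc, sn, spec

# Hypothetical candidate labels (1 = effective drug) and model calls
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]
acc, sn, spec = metrics(y_true, y_pred)
```

Reporting these alongside AUC makes class-specific errors (missed effective drugs vs. false positives) visible, which a single AUC value can hide.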

Reviewer 2 ·

Basic reporting

The English is not clear and professional and needs to be revised thoroughly.

Experimental design

1- The authors should provide more information about their hypothesis. Why should DGN and DPAN be so similar, when we know that a reasonable treatment should decrease the expression of disease genes that are up-regulated and increase the expression of disease genes that are down-regulated?

2- Authors need to clarify method section.
For example: "Our DR model is based on several network structures" — yet it seems that they use only the FIN network.
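The "reversal" intuition behind point 1 can be made concrete with a small sketch: for genes shared between a disease signature and a drug-perturbation signature, an effective drug should show fold-changes of opposite sign. This is not the authors' method, only an illustration of the reviewer's argument; gene names and values are invented.

```python
# Illustrative sketch of a signature-reversal score (not the study's DR model).

def reversal_score(disease_fc, drug_fc):
    """Fraction of shared genes whose fold-change sign is reversed by the drug."""
    shared = set(disease_fc) & set(drug_fc)
    if not shared:
        return 0.0
    reversed_genes = [g for g in shared if disease_fc[g] * drug_fc[g] < 0]
    return len(reversed_genes) / len(shared)

# Hypothetical log fold-changes: disease vs control, and treated vs untreated
disease_fc = {"GENE_A": 2.1, "GENE_B": -1.5, "GENE_C": 0.8}
drug_fc = {"GENE_A": -1.2, "GENE_B": 1.0, "GENE_C": 0.5}
score = reversal_score(disease_fc, drug_fc)  # 2 of 3 shared genes reversed
```

A similarity-based model instead rewards agreement between the two signatures, which is the apparent tension the reviewer asks the authors to resolve.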

Validity of the findings

The method has been tested on melanoma, colorectal cancer, and prostate cancer. What is the correlation between IC50/EC50 and drug combined.auc values?

There is no comprehensive comparison with the previous methods.

Additional comments

In table 1, there are two different combined.auc values for gefitinib (0.63 and 0.61).

·

Basic reporting

1) The English language is properly used, so an international audience can follow the text. Some typos that need to be corrected are as follows:
- In lines 228 and 229, 232, 237 space is missing before AUC.
- In line 83, instead of "1 METHOD" it should be just "METHOD".
- In line 222, instead of "2 RESULTS AND DISCUSSION" it should be just "RESULTS AND DISCUSSION".

2) This article includes some introduction and background on the significance of computational drug repurposing studies, especially network-based studies, and on why repurposing existing drugs is needed. However, computational drug repositioning is also widely used for COVID-19. This needs to be added to the Introduction section, and related papers need to be cited. There are several network-based and signature-based (transcriptome-profiling, expression-based) computational drug repositioning studies conducted for COVID-19. These, in particular, should be cited, and how this study differs from them should be emphasized.

3) In terms of related work, the authors need to give more detail on the other cited studies. For example, in line 49 they state that "They reached a 0.92 AUC in their experiments." Which dataset is used here, related to which disease, and which cross-validation technique is used? LOOCV, n-fold, etc.? For the other related work, these details should likewise be provided.

4) The structure of the article conforms to the standard sections. Figure 1 and the tables of the manuscript are relevant in general and their qualities are OK.

5) The manuscript is self-contained.

Experimental design

The article is scientifically and methodologically sound. The submission clearly defines the research question, which is relevant and meaningful. The proposed method was tested on melanoma, colorectal and prostate cancer datasets.

Validity of the findings

The authors state that "Several candidate drugs were predicted by applying 0.6 or higher AUC values." An AUC threshold of 0.6 may be low. Do other competing methods also obtain low AUC values on those datasets?

Some of the predictions are supported by clinical-phase trials or other in-vivo studies found in the literature.

Additional comments

The use of "our" and "we" throughout the manuscript makes the writing quite informal, and I would recommend that this be changed throughout. The manuscript is the presentation of a scientific investigation. It is not owned by, nor does it belong to, any set of authors, as the same study can be proposed and implemented by others. Therefore, it is more appropriate to refer to "the study" or "the results" and what explicitly occurred during the investigation, rather than "our study", "our dataset", "we applied", etc.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.