Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.

Summary

  • The initial submission of this article was received on February 27th, 2019 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on April 2nd, 2019.
  • The first revision was submitted on May 1st, 2019 and was reviewed by 2 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on May 19th, 2019.

Version 0.2 (accepted)

· May 19, 2019 · Academic Editor

Accept

=====================================================

Reviewer 1 ·

Basic reporting

The revision implements the changes requested in the first round of the review process, so I believe that the article is suitable to be published as it is.

Experimental design

The revision implements the changes requested in the first round of the review process.

Validity of the findings

The revision better explains the policy and practical implications of the findings for policy-makers.

Reviewer 3 ·

Basic reporting

The paper has been revised well. All my concerns have been addressed.

Experimental design

N/A

Validity of the findings

N/A

Additional comments

N/A

Version 0.1 (original submission)

· Apr 2, 2019 · Academic Editor

Minor Revisions

=========================

[# PeerJ Staff Note: Please ensure that all review comments are addressed in an appropriate rebuttal letter, and please ensure that any edits or clarifications mentioned in the rebuttal letter are also inserted into the revised manuscript (where appropriate). Direction on how to prepare a rebuttal letter can be found at: https://peerj.com/benefits/academic-rebuttal-letters/ #]

Reviewer 1 ·

Basic reporting

The paper “Predicting the results of evaluation procedures of academics” aims to assess whether it is possible (i) to predict the results of the Italian Scientific Habilitation (ASN) using only the information contained in the candidates’ CVs, and (ii) to identify a small set of predictors of the ASN results.
The paper is successful in reaching its targets, and I admire the authors’ capacity to develop a very wide and rich dataset, which is less exposed to biases, as they stress in the comparative evaluation of their model.

Experimental design

While I am not an expert in machine learning algorithms and techniques, the paper and its findings seem convincing.

Validity of the findings

I have just a few minor tips for the authors.
First, the Discussion and Conclusions should stress not only the predictive ability of the model developed, but also discuss the policy and practical implications of the findings for policy-makers. For instance, the maximum number of years of affiliation with the same university for obtaining the ASN is a finding that contradicts the literature stream stressing mobility as a positive feature of the academic career. Some tentative explanation of this unexpected result, perhaps related to the specificities of the Italian context in which the analysis took place, would be welcome.
The practical implications of the findings should also be stressed. For example, should policy-makers rely more on machine learning techniques to make habilitation decisions, thus saving academic staff from time-consuming peer review?
Finally, a very minor point: in the Discussion and Conclusions section, I would suggest replacing the variable names as defined in Figure 4 with the corresponding descriptions, to facilitate reading.

Reviewer 2 ·

Basic reporting

While the research presented here is of very high quality, the structure of the paper can be improved on several accounts.

For one, while the literature review is both impressive and thorough, I wonder if the "prediction of bibliometric indicators" subsection is absolutely necessary here, as the works discussed in detail there are of no "direct" relevance to the research and are not referred to or used as benchmarks in the "evaluation" section. I suggest that the authors sum up this subsection in a few lines, add it to the introductory paragraph of the "Related Work" section, and remove the "Prediction of the Results of Evaluation Procedures" subsection title.

Also, the "results" section contains methodological procedures but no actual results. I suggest that the subsections contained in the "results" section be moved to the "methods and material" section, in which they clearly belong, and that the "evaluation" section then be renamed "results".

Finally, while the content of the "Italian Scientific Habilitation" section is of crucial importance to the paper, I am not sure whether devoting a distinct section to it is the best option: some of its content clearly belongs in the introduction, while the segment on bibliometric and non-bibliometric indicators would clearly fit in the "Methods and Material" section.

In light of the above comments, I think this paper would benefit from sticking closer to the IMRaD structural standard.

Experimental design

Additional information on why the 2012 ASN data is a representative and exhaustive sample of the whole population would have been welcome.

Validity of the findings

No comment.

Reviewer 3 ·

Basic reporting

In this manuscript, the authors develop a data mining approach to predict the results of the ASN and identify some important features. The data used in this paper cover 59,149 researchers and 1,910,873 papers spanning 184 recruitment fields, which is relatively large compared to related works. The treatment of the problem is technically sound.
1 - The acronyms for Recruitment Field and Random Forests are both “RF”. I think this may lead to ambiguity.
2 - The quality of Figure 1 needs to be improved.

Experimental design

no comment

Validity of the findings

no comment

Additional comments

The subject and the proposed method are valuable, and the problems and solutions in the paper are clearly stated. So, I believe that the article is suitable to be published in PeerJ Computer Science after minor revision.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.