All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.
Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.
=====================================================
The revision implements the changes requested in the first round of the review process. Therefore, I believe that the article is suitable to be published as it is.
The revision implements the changes requested in the first round of the review process.
The revision better explains the policy and practical implications of the findings for policy-makers.
The paper has been revised well. All my concerns have been addressed.
N/A
N/A
N/A
=========================
[# PeerJ Staff Note: Please ensure that all review comments are addressed in an appropriate rebuttal letter, and please ensure that any edits or clarifications mentioned in the rebuttal letter are also inserted into the revised manuscript (where appropriate). Direction on how to prepare a rebuttal letter can be found at: https://peerj.com/benefits/academic-rebuttal-letters/ #]
The paper “Predicting the results of evaluation procedures of academics” aims to assess whether it is possible (i) to predict the results of the Italian Scientific Habilitation (ASN) using only the information contained in the candidates’ CVs, and (ii) to identify a small set of predictors of the ASN results.
The paper is successful in reaching its targets, and I admire the capacity of the author(s) to develop a very wide and rich dataset, which is less exposed to biases, as they stress in the comparative evaluation of their model.
Although I am not an expert in machine learning algorithms and techniques, the paper and its findings seem convincing.
I have just a few minor tips for the authors.
First, the discussion and conclusions should stress not only the predictive ability of the model developed, but also discuss the policy and practical implications of the findings for policy-makers. For instance, the finding concerning the maximum number of years of affiliation with the same university for obtaining the ASN contradicts the literature stream that stresses mobility as a positive feature in an academic career. Some tentative explanation of this unexpected result, perhaps related to the specificities of the Italian context in which the analysis took place, would be welcome.
The practical implications of the findings should also be stressed. For example: should policy-makers rely more on machine learning techniques to make habilitation decisions, thus saving academic staff from time-consuming peer review?
Finally, a very minor point: in the Discussion and Conclusions section, I would suggest replacing the variable names as defined in Figure 4 with the corresponding descriptions, to facilitate reading.
While the research presented here is of very high quality, the structure of the paper can nevertheless be improved on several accounts.

For one, while the literature review is both impressive and thorough, I wonder whether the "prediction of bibliometric indicators" subsection is absolutely necessary here, as the works discussed in detail there are of no "direct" relevance to the research and are not referred to or used as benchmarks in the "evaluation" section. I suggest that the authors sum up this subsection in a few lines, add it to the introductory paragraph of the "Related Work" section, and remove the "Prediction of the Results of Evaluation Procedures" subsection title.

Also, the "results" section contains methodological procedures rather than actual results. I suggest that the subsections of the "results" section be moved to the "methods and material" section, where they clearly belong, and that the "evaluation" section then be renamed "results".

Finally, while the content of the "Italian Scientific Habilitation" section is of crucial importance to the paper, I am not sure whether devoting a distinct section to it is the best option: some of its content clearly belongs in the introduction, while the segment on bibliometric and non-bibliometric indicators would clearly fit in the "Methods and Material" section.

In light of the above comments, I think that this paper would benefit from sticking closer to the IMRaD structural standard.
Additional information on why the 2012 ASN data is a representative and exhaustive sample of the whole population would have been welcome.
No comment.
In this manuscript, the authors develop a data mining approach to predict the results of the ASN and to identify some important features. The data used in this paper cover 59,149 researchers and 1,910,873 papers spanning 184 recruitment fields, which is relatively large compared to related works. The treatment of the problem is technically sound.
1 - The acronyms for Recruitment Field and Random Forests are both “RF”. I think this may lead to ambiguity.
2 - The quality of Figure 1 needs to be improved.
no comment
no comment
The subject and the proposed method are valuable, and the problems and solutions in the paper are clearly stated. I therefore believe that the article is suitable for publication in PeerJ Computer Science after a minor revision.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.