To increase transparency, PeerJ operates a system of 'optional signed reviews and history'. This takes two forms: (1) peer reviewers are encouraged, but not required, to provide their names (if they do so, then their profile page records the articles they have reviewed), and (2) authors are given the option of reproducing their entire peer review history alongside their published article (in which case the complete peer review process is provided, including revisions, rebuttal letters and editor decision letters).
Thank you again for your attention to all the reviewers' comments. You have addressed each comment thoroughly and edited your manuscript accordingly.
We found your paper to be well written overall with a clear rationale for the study.
The main concern, which is noted by both reviewers, is that it lacks explanation of some basic terminology from the start of the manuscript. Unfortunately, this contributes to confusion throughout and takes away from the general clarity as well as the message of your paper. Please see the reviews, which give helpful comments in this regard.
Abstract Methods -
- what is a Q-sort exercise?
- how many people involved?
- what did these people do - as opposed to what the authors did
- where did the Milestones come from?
- what is an innovative transition of care curriculum?
- should the "game boards" be explained?
- are 'simulation' and 'simulation innovation' the same thing?
- are 'discharge clinic feedback' and 'discharge clinic' the same thing?
- are 'Tracer and milestones' and 'tracer' the same thing?
- is there any overlap in the 5/22, 9/22 and 7/22?
I don't understand this - I don't get this feeling from the results presented in the abstract
61- IM not defined - assume it is Internal Medicine
71 APDIM not defined
I need an example of some of these 142 curricular milestones- I don't understand what was actually done
What is a Q-sort?
How many people were involved in the research process? i.e., how many units of 2-3 people were involved?
Did they rank the 22 or the 144?
109-116 I would have thought 100% of the top 8 came from the list of 22 milestones?
132-136 what is PC C2? What is IPC A5?
158- Is Tracer a person?
171- what is a game board?
no idea if this is supported by the results
Given the relative complexity of the project, the clarity of writing is a strength.
The case for the research, and significance of the issue is built nicely in the introduction, the methods are explained well, and the results reported in an understandable format.
The abstract could be modified for further simplicity eg using 'Q-sort' terminology without explanation in the abstract may be confusing. Maybe 'validated ranking process' or similar.
The tables add value and are clear.
The researchers have explained the retrospective limitations.
However, I think the mapping approach is a good way to deal with matching existing curricula to new frameworks (or vice versa), and it provides readers with a pragmatic approach that might prevent losing useful approaches in a rush to the new framework.
The biases and limitations of 'expert consensus' are acknowledged.
The processes for deriving the data are well described and would be reproducible.
The nature of EPAs as the 'endpoints' is somewhat subjective and this affected the prioritization process. However it is now an important framework and this project illustrates a way to approach that subjectivity.
Importantly, I don't think the authors 'overreach' in their conclusions.
Thank you for the opportunity to review.
This is probably more for educators than practitioners as the target audience.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.