All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.
Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.
Thank you again for your attention to all the reviewers' comments. You have addressed each comment thoroughly and edited your manuscript accordingly.
We found your paper to be well written overall with a clear rationale for the study.
The main concern, noted by both reviewers, is that the manuscript lacks explanation of some basic terminology from the start. Unfortunately this contributes to confusion throughout and detracts from the general clarity as well as the message of your paper. Please see the reviews, which give helpful comments in this regard.
Question
Abstract Methods -
- what is a Q-sort exercise?
- how many people involved?
- what did these people do - as opposed to what the authors did
- where did the Milestones come from?
- what is an innovative transition of care curriculum?
Abstract Results -
- should the "game boards" be explained?
- are simulation and simulation innovation the same thing?
- are discharge clinic feedback and discharge clinic the same thing?
- are "Tracer and milestones" and "Tracer" the same thing?
- is there any overlap in the 5/22, 9/22 and 7/22?
Conclusion
I don't understand this - I don't get this sense from the results presented in the abstract
FULL TEXT
61- IM not defined - assume it is Internal Medicine
71 APDIM not defined
METHODS
I need an example of some of these 142 curricular milestones- I don't understand what was actually done
What is a Q-sort?
How many people were involved in the research process? i.e. how many units of 2-3 people were involved?
did they rank the 22 or the 144?
109-116 I would have thought 100% of the top 8 came from the list of 22 milestones?
132-136 what is PC C2? What is IPC A5?
158- Is Tracer a person?
171- what is a game board
Conclusion
no idea if this is supported by the results
not sure
not sure
not sure
Given the relative complexity of the project, the clarity of writing is a strength.
The case for the research, and significance of the issue is built nicely in the introduction, the methods are explained well, and the results reported in an understandable format.
The abstract could be modified for further simplicity, e.g. using 'Q-sort' terminology without explanation in the abstract may be confusing. Maybe 'validated ranking process' or similar.
The tables add value and are clear.
The researchers have explained the retrospective limitations.
However, I think the mapping approach is a good way to deal with matching existing curricula into new frameworks (or vice versa), and it provides readers with a pragmatic method that might prevent losing useful approaches in a rush to the new framework.
The biases and limitations of 'expert consensus' are acknowledged.
The processes for deriving the data are well described and would be reproducible.
The nature of EPAs as the 'endpoints' is somewhat subjective and this affected the prioritization process. However it is now an important framework and this project illustrates a way to approach that subjectivity.
Importantly, I don't think the authors 'overreach' in their conclusions.
Thank you for the opportunity to review.
Important topic.
Well done.
Probably more for educators than practitioners as the target audience.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.