Review History


To increase transparency, PeerJ operates a system of 'optional signed reviews and history'. This takes two forms: (1) peer reviewers are encouraged, but not required, to provide their names (if they do so, then their profile page records the articles they have reviewed), and (2) authors are given the option of reproducing their entire peer review history alongside their published article (in which case the complete peer review process is provided, including revisions, rebuttal letters and editor decision letters).


Summary

  • The initial submission of this article was received on December 20th, 2014 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on January 20th, 2015.
  • The first revision was submitted on February 13th, 2015 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on February 14th, 2015.

Version 0.2 (accepted)

Academic Editor

Accept

Thank you again for your attention to all the reviewers' comments. You have addressed each comment thoroughly and edited your manuscript accordingly.

Version 0.1 (original submission)

Academic Editor

Minor Revisions

We found your paper to be well written overall with a clear rationale for the study.

The main concern, noted by both reviewers, is that the manuscript lacks explanation of some basic terminology from the start. Unfortunately, this contributes to confusion throughout and detracts from the general clarity as well as the message of your paper. Please see the reviews, which give helpful comments in this regard.

Reviewer 1

Basic reporting

Abstract Methods -
- what is a Q-sort exercise?
- how many people were involved?
- what did these people do - as opposed to what the authors did?
- where did the Milestones come from?
- what is an innovative transition of care curriculum?
Abstract Results -
- should the "game boards" be explained?
- are simulation and simulation innovation the same thing?
- are discharge clinic feedback and discharge clinic the same thing?
- are Tracer and milestones and tracer the same thing?
- is there any overlap in the 5/22, 9/22 and 7/22?
Conclusion
I don't understand this - I don't get this feeling from the results presented in the abstract

FULL TEXT
61 - IM not defined - assume it is Internal Medicine
71 - APDIM not defined

METHODS
I need an example of some of these 142 curricular milestones - I don't understand what was actually done
What is a Q-sort?
How many people were involved in the research process? i.e., how many units of 2-3 people were involved?
Did they rank the 22 or the 144?
109-116 - I would have thought 100% of the top 8 came from the list of 22 milestones?
132-136 - what is PC C2? What is IPC A5?
158 - Is Tracer a person?
171 - what is a game board?

Conclusion
no idea if this is supported by the results

Experimental design

not sure

Validity of the findings

not sure

Comments for the author

not sure

Reviewer 2

Basic reporting

Given the relative complexity of the project, the clarity of writing is a strength.
The case for the research, and significance of the issue is built nicely in the introduction, the methods are explained well, and the results reported in an understandable format.

The abstract could be modified for further simplicity; e.g., using 'Q-sort' terminology without explanation in the abstract may be confusing. Maybe 'validated ranking process' or similar.

The tables add value and are clear.

Experimental design

The researchers have explained the retrospective limitations.
However, I think the mapping approach is a good way to match existing curricula to new frameworks (or vice versa), and it provides readers with a pragmatic method that might prevent useful material being lost in a rush to the new framework.

The biases and limitations of 'expert consensus' are acknowledged.

Validity of the findings

The processes for deriving the data are well described and would be reproducible.
The nature of EPAs as the 'endpoints' is somewhat subjective, and this affected the prioritization process. However, it is now an important framework, and this project illustrates a way to approach that subjectivity.

Importantly, I don't think the authors 'overreach' in their conclusions.

Comments for the author

Thank you for the opportunity to review.
Important topic.
Well done.
Probably more for educators than practitioners as the target audience.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.