Review History


To increase transparency, PeerJ operates a system of 'optional signed reviews and history'. This takes two forms: (1) peer reviewers are encouraged, but not required, to provide their names (if they do so, then their profile page records the articles they have reviewed), and (2) authors are given the option of reproducing their entire peer review history alongside their published article (in which case the complete peer review process is provided, including revisions, rebuttal letters and editor decision letters).


Summary

  • The initial submission of this article was received on October 24th, 2016 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on November 11th, 2016.
  • The first revision was submitted on December 30th, 2016 and was reviewed by the Academic Editor.
  • A further revision was submitted on January 4th, 2017 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on January 5th, 2017.

Version 0.3 (accepted)

Academic Editor

Accept

Thanks for attending to those last couple of comments so quickly. The manuscript is now ready for publication.

Version 0.2

Academic Editor

Minor Revisions

Thank you for your thoughtful and considered response to all the reviewer comments. I am completely in agreement with you that the interpretation of the sensitivity analysis by one of the reviewers was erroneous, and I am of the opinion that the way in which this part of the analysis was conducted was valid and adds value to the overall study. Similarly, your explanation of why static optimisation would not be useful or appropriate is very thorough. As such, I haven't sent the manuscript back out to reviewers but I do just have a couple of very minor corrections to request before the manuscript is ready for publication.

1. The Supplementary Table should be cited somewhere in the text of the article.
2. Could you add some information to the methods about how many retroreflective markers you used and where they were placed? Perhaps a table in the supplementary information would be most appropriate for this.
3. Raw data. It is the ethos of PeerJ that as much data from an analysis as possible be made freely available. I take your point that the raw marker co-ordinates and ground reaction forces would be a vast quantity of data. However, it would be great if the data from some of the earlier stages of processing could be made available. I leave it to you to decide what data would be the most useful. Alternatively, despite the large file sizes, it may be quicker for you to simply upload the raw data files to a repository (Dryad and Figshare are both good choices - I have used them for large FEA datasets). Whatever you decide, please make sure the location of the data is noted in the manuscript.

I look forward to seeing the final version of the manuscript.

Phil Cox

Version 0.1 (original submission)

Academic Editor

Major Revisions

Please find the reviewers' comments below. On the whole, reviewer 1 is very positive, reviewer 2 a little less so. Reviewer 2 has concerns regarding the sensitivity analysis - however, I have a feeling these have arisen from a misunderstanding of how and why the analysis was conducted. For that reason, I think greater detail is needed regarding the implementation of the sensitivity analysis, and also what the result of the sensitivity analysis means for your study.

Reviewer 2 also suggests that a funding statement and the raw data are missing, but I can see that both of these things are present in your submission.

Otherwise, both reviewers make a number of valid points that need addressing. I look forward to seeing your revised manuscript.

Reviewer 1

Basic reporting

This study compares the contact loading in the intact limb of a unilateral limb loss group versus a matched control group. The limb loss group show increased peak contact force and contact force loading rate.

Overall, this manuscript is very well presented. The scope of the study is clear and well informed, methods are appropriately described and sufficiently detailed, results are well presented but not over interpreted and the discussion provides useful and critical insight into the results. Figures are well made, informative and clear and the supplementary material is useful and supportive. I therefore believe this study is suitable for publication.

Experimental design

The low number of subjects in each group (10) was initially a worry; however, the authors handle this well, providing appropriate recognition and acknowledgement of it, evidenced by a focussed analysis and comments in the discussion. I therefore do not see this as preventing publication.

Validity of the findings

The results appear scientifically sound and are supported by a sensitivity analysis that increases confidence in the results.

Comments for the author

I do have a number of comments that I would like the authors to consider:

Abstract:
In the manuscript I received there were two subtly different versions of the abstract. The version that includes the quantification of the JCF data is more informative.

Introduction:
L65-67 The authors suggest that “abrupt changes in gait mechanics” could “overwhelm the adaptive response of cartilage”. Are the authors aware of any studies that might support this statement? Over what time period does cartilage adapt to changes in mechanical stimulus? As this sentence essentially reflects the authors’ “mechanical overloading” hypothesis, can the authors also justify the JCF variables analysed (peak, loading rate and impulse) in the context of “overwhelming the adaptive response”?
L72 The hypothesis you state is not precisely the hypothesis you test. Your hypothesis implies that you statistically examine the entire gait cycle, when in fact your results examine three discrete representations of the JCF: the peak, the loading rate and the impulse. I recommend that you either make your hypothesis more specific to these 0D (discrete) variables or consider a more suitable statistical approach that would evaluate the JCF over the entire gait cycle (a 1D analysis, e.g. statistical parametric mapping, www.spm1d.org, or functional data analysis).
L75-77 The rationale for examining the medial knee is based on the prevalence of medial knee OA. As the study of Wise et al. (2012) is not based on amputees, and as the subsequent studies using the external KAM do not have convincing outcomes, can this choice be strengthened? For example, by referring to other studies observing other features of amputee gait (e.g. trunk kinematics or knee abduction angles) or to the prevalence of medial knee OA in veterans.
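The 1D analysis suggested at L72 could, in the spirit of statistical parametric mapping, be realised as a pointwise t-statistic with a permutation-based maximum-t threshold. The sketch below is illustrative only: it uses numpy on synthetic data rather than the spm1d package, and the group sizes, curve shapes and noise level are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_sample_t(a, b):
    # Pointwise two-sample t-statistic (axis 0 = subjects, axis 1 = % gait cycle)
    na, nb = a.shape[0], b.shape[0]
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    sp = np.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (a.mean(axis=0) - b.mean(axis=0)) / (sp * np.sqrt(1 / na + 1 / nb))

def max_t_threshold(a, b, n_perm=1000, alpha=0.05):
    # Critical value for the maximum |t| over the cycle, via group-label permutation;
    # this controls the family-wise error rate across all cycle points at once.
    pooled = np.vstack([a, b])
    na = a.shape[0]
    maxima = np.empty(n_perm)
    for i in range(n_perm):
        idx = rng.permutation(pooled.shape[0])
        maxima[i] = np.abs(two_sample_t(pooled[idx[:na]], pooled[idx[na:]])).max()
    return np.quantile(maxima, 1 - alpha)

# Synthetic JCF curves: 10 subjects per group, normalised to 101 gait-cycle points
x = np.linspace(0, 1, 101)
controls = rng.normal(2.0, 0.3, (10, 101)) + np.sin(np.pi * x)
limb_loss = rng.normal(2.0, 0.3, (10, 101)) + 2.5 * np.sin(np.pi * x)

t_curve = two_sample_t(limb_loss, controls)
t_crit = max_t_threshold(limb_loss, controls)
significant = np.abs(t_curve) > t_crit  # cycle regions where the groups differ
```

The appeal of this approach is that the hypothesis is then tested over the whole gait cycle rather than at three pre-selected scalars, so no 0D summary has to be chosen in advance.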

Methods:
L95: If the authors have the variability data for the time spent walking independently, please could this be added?
L175: The authors report standardised effect sizes to quantitatively judge the size of the effect observed. However, my preference would be to present the confidence intervals of the difference, along with some indication of what magnitude of difference between the two groups the authors would consider meaningful. Ideally this threshold would be based on an understanding of what magnitude of JCF difference might “overwhelm the adaptive response” and would therefore in turn support the “mechanical overloading” hypothesis.
L181: Would it be possible to add the model input parameters into the supplementary material to increase transparency and aid easy replication?
L187-189: I very much liked the idea of the sensitivity analysis of the model parameters to further evaluate the results but at the end of the paragraph describing the analysis I was not clear how the outcome of the analysis was to be interpreted. Please add a brief sentence to clarify.
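The confidence-interval presentation suggested at L175 amounts to a pooled-variance CI for the difference in group means. A minimal sketch, with entirely hypothetical peak medial JCF values (in units of body weight) for the two groups of 10:

```python
import numpy as np

def mean_difference_ci(a, b, t_crit):
    # Pooled-variance CI for mean(a) - mean(b); t_crit is the two-tailed
    # critical t-value for df = len(a) + len(b) - 2.
    na, nb = len(a), len(b)
    diff = np.mean(a) - np.mean(b)
    sp2 = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    half = t_crit * np.sqrt(sp2 * (1 / na + 1 / nb))
    return diff - half, diff + half

# Hypothetical peak medial JCF values (BW), 10 subjects per group
limb_loss = np.array([2.9, 3.1, 3.4, 2.8, 3.2, 3.0, 3.3, 2.7, 3.5, 3.1])
controls  = np.array([2.5, 2.7, 2.6, 2.9, 2.4, 2.8, 2.6, 2.5, 2.7, 2.6])

lo, hi = mean_difference_ci(limb_loss, controls, t_crit=2.101)  # df = 18
```

If a meaningful difference were declared in advance (say, 0.25 BW), the reader could then check whether the whole interval lies above that threshold, rather than interpreting a standardised effect size in isolation.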

Results:
L219: I do not follow the interpretation of transtibial subjects walking with less knee flexion (fig 6). It appears from figure 6 (although the flexion/extension direction is not labelled) that they have more flexion than the control group during early stance.

Conclusions:
L291: In my opinion, to suggest that the increased loads observed may be a risk factor for OA would require more substantial insight into the magnitude of the load increase you have observed (see also my second comment in the methods section). Whilst you have observed “relatively high loads”, can you link this to a meaningful change/adaptation in the cartilage to qualify it as a risk factor?

Reviewer 2

Basic reporting

The text is clearly written. Units of %BW are not typical and I think should be changed to just the ratio BW (not multiplied by 100). The introduction adequately describes and references the background and motivation of the work. Structure is fine, except that there is no statement of funding, which I think is required and is standard:
"Funding Disclosure
Separately from declaring Competing Interests, PeerJ also requires that authors disclose the financing which made their work possible.
The Funding statement is published in the final article. This disclosure provides added transparency."

Figures are good, except that the units of %BW instead of BW make the numbers much larger and should be changed. The submission seems to be sufficiently 'self-contained'. Are the raw data and code available? I only see the processed/final outcome measures (average? medial contact force throughout stance for each subject; average? peaks and impulses for each subject). I believe raw data (which you have not included) is required as per the PeerJ guidelines:

https://peerj.com/about/author-instructions/#data-and-materials
"
Data and Materials
All authors are responsible for making materials, code, raw data and associated protocols relevant to the submission available without delay.
Please ensure that all relevant datasets, code, images and information are available in one of the following possible ways and provide a link to the appropriate location: uploaded as Supplemental Files, deposited in a public repository, or hosted in a publicly accessible database. There are very few circumstances in which we can accept a manuscript without raw data (see point 4 in 'Preparing your submission').
"
Obviously the raw data would be a lot, with all the markers and ground reaction forces for all used trials for all subjects, but there are repositories available for sharing. Some of the code and models may not be yours to freely distribute, or you may not have access to the code, but I believe the intent behind this guideline is to make as much data freely available as possible so that others could similarly analyse the results.

"Some examples of invalid reasons for not submitting raw data or code:
The data is owned by a third-party who have not given permission to publish it within this article.
Please add a note to PeerJ staff to not publish the raw data alongside your article.
The raw data is too large.
Please upload the raw data to an online repository (e.g. Figshare, Dryad etc)."

Experimental design

The work seems to be mostly original, with a well-defined question that is relevant to the issues presented and seemingly meaningful. The knowledge gap is presented and explored. However, the investigation does not seem to have been conducted very rigorously, and is to a basic/low rather than high technical standard regarding the estimation of muscle and joint reaction forces. The methods are sufficiently described. The research seems ethical.

Validity of the findings

I do not feel comfortable with the validity of these findings due to the uncertain nature of the perturbation analyses, which you have shown but only poorly described. It appears that these perturbations of the model parameters were required in order to obtain statistically significant differences for both the peak and impulse measures, since at iteration 0 for both of these you show 0% limb loss > controls. How do you know that the perturbations are realistic and physiological? Many other studies do not perform perturbations to the model and would thus not show any statistical difference for these measures. It seems suspect that the standard scaled model does not support your conclusions regarding impulse and peak. I think you should report a detailed description of which parameters had to be perturbed to obtain the significant differences between your groups for the peak and impulse measures, together with an analysis of whether this perturbation is physiological/realistic. This is especially important if the perturbation is physiological, and is thus something that other researchers should also replicate to ensure their studies perturb the same measures.
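The kind of reporting requested above could be structured as a Monte Carlo perturbation within explicitly stated physiological bounds. In the sketch below, the model function, parameter names, baseline values and bounds are entirely hypothetical stand-ins for the authors' musculoskeletal model; the point is only that each iteration records its parameter values so readers can judge whether the perturbations that flip significance are physiological.

```python
import numpy as np

rng = np.random.default_rng(42)

def medial_jcf_peak(muscle_gain, adduction_moment):
    # Hypothetical stand-in for the full musculoskeletal model: a simplified
    # mapping from two parameters to peak medial JCF (units of BW).
    return 1.5 + 0.8 * muscle_gain + 2.0 * adduction_moment

def sensitivity(base_params, rel_bounds, n_iter=500):
    # Perturb each parameter uniformly within its stated relative bounds and
    # record both the outcome and the parameter draw for each iteration.
    outcomes = np.empty(n_iter)
    draws = []
    for i in range(n_iter):
        perturbed = {k: v * rng.uniform(1 - rel_bounds[k], 1 + rel_bounds[k])
                     for k, v in base_params.items()}
        draws.append(perturbed)
        outcomes[i] = medial_jcf_peak(**perturbed)
    return outcomes, draws

# Hypothetical baselines with +/-10% and +/-20% bounds (to be justified from
# the literature in a real analysis).
base = {"muscle_gain": 1.0, "adduction_moment": 0.4}
bounds = {"muscle_gain": 0.10, "adduction_moment": 0.20}
dist, draws = sensitivity(base, bounds)
```

Reporting `draws` alongside `dist` (e.g. the parameter sets at which the group difference becomes significant) would directly answer the question of which perturbations were required and whether they are realistic.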

Comments for the author

This is a good study that requires a little more work for me to suggest it as being methodologically sound:
"PeerJ evaluates articles based only on an objective determination of scientific and methodological soundness, not on subjective determinations of 'impact,' 'novelty' or 'interest'"
I have attempted to follow the PeerJ guidelines regarding review diligently and effectively, and have provided comments based on their suggested points. I have classified this paper as requiring major revision, primarily due to the uncertain appropriateness and completeness of some of the methods, and minor technicalities regarding the lack of raw data and the missing funding disclosure.

This is a good initial attempt to model the behaviors of limb-loss subjects relative to controls in order to better understand and highlight the possible bio-mechanisms by which limb-loss subjects experience higher rates of knee OA. The authors clearly demonstrate knowledge of the issues surrounding the phenomena. However, I believe the modelling efforts are not solid enough to say with any certainty that the results are accurate, primarily due to the inherent uncertainty of muscle co-contraction and activation during gait and the sensitivity of medial/lateral contact forces and ratios to the muscle force estimations. Obviously you point this out, and perhaps that is all you can do. I think at the very least, however, that a static optimization method should be employed to estimate muscle forces (not mentioned in the methods). Also, I would assume these are soldiers of relatively large muscle mass, and I would think that some level of muscle scaling may be warranted based on EMG/dynamometer testing or other means if possible; e.g. the standard scaled model's muscle maximums may not be appropriate for these subjects.
Perhaps it would be much easier, and less modelling-focused and parameter-sensitive, if you could simply show the inverse dynamic flexion and adduction moments/angles and compare the peaks/impulses/rates of these. An increased flexion moment seems to indicate increased muscle activation, based on your and other people's findings, and perhaps this is enough to show, together with a statement that increased muscle activation about the knee is likely to compress the joint more and thus increase the joint reaction force.

As of now, you are assuming the muscle activation to be dependent solely on the flexion angle and flexion moment, and then calculating the medial force using those muscle activation estimates, the assumed relative proportioning of muscle maximums, and the adduction moment. Are you confident that the muscles do not depend on the adduction moment as well? It seems possible, given the broad range of muscles around the knee that could contribute in varying degrees to either adduction or abduction. Also, are you confident that the distribution of PCSA given in the standard model, based on a limited cadaveric study, is reflective of your special population group? It seems unlikely, given the increased musculature demand and training required for basic training relative to the general population.
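The static optimization suggested above can be sketched for a single joint. In the minimum-sum-of-squared-activations formulation, the unconstrained quadratic program has a closed-form solution; the moment arms and maximum isometric forces below are hypothetical, and the clipping means the moment constraint only holds exactly when no muscle saturates.

```python
import numpy as np

def static_optimization(moment, moment_arms, f_max):
    # Minimise sum(a_i^2) subject to sum(r_i * F_i * a_i) = moment.
    # Lagrange conditions give a_i proportional to each muscle's moment
    # capacity r_i * F_i; activations are then clipped to [0, 1].
    capacity = moment_arms * f_max            # N*m each muscle produces at a = 1
    lam = moment / np.sum(capacity ** 2)
    activations = np.clip(lam * capacity, 0.0, 1.0)
    return activations, activations * f_max

# Hypothetical knee extensor group: moment arms (m) and max isometric forces (N)
arms = np.array([0.04, 0.05, 0.045])
fmax = np.array([3000.0, 2500.0, 1500.0])
acts, forces = static_optimization(60.0, arms, fmax)  # 60 N*m knee moment
reproduced = float(np.sum(arms * forces))             # equals 60 when unsaturated
```

In a full gait analysis this solve would be repeated at every time frame against the inverse-dynamics moments, and the resulting muscle forces would then feed the joint contact force calculation; OpenSim's Static Optimization tool implements essentially this, with activation bounds handled properly.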

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.