Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on March 14th, 2025 and was peer-reviewed by 4 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on May 12th, 2025.
  • The first revision was submitted on June 22nd, 2025 and was reviewed by 2 reviewers and the Academic Editor.
  • A further revision was submitted on July 30th, 2025 and was reviewed by 1 reviewer and the Academic Editor.
  • The article was Accepted by the Academic Editor on September 19th, 2025.

Version 0.3 (accepted)

Academic Editor

Accept

As all of the reviewer's comments have been addressed in this newest revision, the manuscript can be published.

[# PeerJ Staff Note - this decision was reviewed and approved by Massimiliano Fasi, a PeerJ Section Editor covering this Section #]

Reviewer 2

Basic reporting

-

Experimental design

-

Validity of the findings

-

Version 0.2

Academic Editor

Minor Revisions

**PeerJ Staff Note:** Please ensure that all review, editorial, and staff comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

**Language Note:** When you prepare your next revision, please either (i) have a colleague who is proficient in English and familiar with the subject matter review your manuscript, or (ii) contact a professional editing service to review your manuscript. PeerJ can provide language editing services - you can contact us at [email protected] for pricing (be sure to provide your manuscript number and title). – PeerJ Staff

Reviewer 1

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

The latest revision has improved the manuscript.

Reviewer 2

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

I have minor suggestions that would improve the quality of the manuscript.
The manuscript is generally clear, but light copyediting is recommended to improve fluency, grammar, and professional tone.
Figure 6: The x- and y-axis labels are missing. Although the title suggests these are time values, the units are unclear: milliseconds, microseconds, or seconds?
Figure 7: Provide the exact labels of the x and y axes.
Figure 8: Provide the exact labels of the x and y axes.
Figure 9: Provide the exact labels of the x and y axes.
Figure 10: Provide exact labels for the signaling cost and handover failure values.

Version 0.1 (original submission)

Academic Editor

Major Revisions

**PeerJ Staff Note:** Please ensure that all review and editorial comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

**Language Note:** The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title). Alternatively, you should make your own arrangements to improve the language quality and provide details in your response letter. – PeerJ Staff

Reviewer 1

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

The paper does not present any significant novelty compared to existing literature. The methodology lacks scientific rigor, and the results section lacks in-depth discussion. Additionally, there is a lack of statistical validation or performance metrics to support the claims.

Reviewer 2

Basic reporting

The introduction is repetitive; condense the content and highlight the contribution earlier.

The literature review lacks synthesis; organize thematically and clearly state how MSCHF is different.

Justify the selection of [67] as the only comparison baseline.

Define all acronyms (e.g., MIH, RSS, RAT) on first use.

Figures 1, 4, 5, and 7 are low resolution; regenerate them in high quality.

Figures 6–9 lack axis labels, units, and legends; add these for clarity.

Captions should describe the takeaway from each figure.

Experimental design

Simulation parameters like user speed, signal thresholds, and mobility models are missing; include a complete parameter table.

The algorithm's pseudo-code is too abstract; expand it with real control flow and decision logic.

Threshold values are fixed and not justified; clarify if they were empirically tuned.

Buffer monitoring in MSCHF is simplistic; it lacks time-based trends or multi-factor decisions.

±10% buffer/data rate adjustment lacks justification; consider parameterizing it.

The use of random delays is not tied to performance impact; make it meaningful.

Packet-level buffer behavior (e.g., overflow, underrun) is not tracked; this limits insight.

calculateSignalingCost(), calculateHandoverFailure(), and calculateBufferConsumption() return random values; base them on actual simulation state (see the sketch after these comments).

Non-MSCHF uses a 50% random handover decision; this is unrealistic, so the decision logic should be revisited (e.g., driven by signal-strength thresholds, as in the sketch below).

Scalability testing across user/node densities is not demonstrated.
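As a concrete illustration of the two points above about the metric functions and the random handover decision, the following minimal Python sketch shows metrics derived from counters and measured signal strength rather than random draws. All names, thresholds, and the per-message cost model here are hypothetical placeholders, not the authors' implementation:

```python
from dataclasses import dataclass

# Hypothetical simulation state: the simulator updates these counters during
# the run, so the reported metrics are functions of what actually happened.
@dataclass
class SimState:
    handover_attempts: int = 0
    handover_failures: int = 0
    signaling_messages: int = 0      # measurement reports, HO requests/acks, etc.
    buffered_bytes: int = 0
    buffer_capacity: int = 1_000_000

def calculate_signaling_cost(state: SimState, cost_per_message: float = 1.0) -> float:
    # Cost follows from the number of signaling messages actually exchanged,
    # not from a random draw.
    return state.signaling_messages * cost_per_message

def calculate_handover_failure(state: SimState) -> float:
    # Failure ratio over all handovers attempted in the run.
    if state.handover_attempts == 0:
        return 0.0
    return state.handover_failures / state.handover_attempts

def calculate_buffer_consumption(state: SimState) -> float:
    # Fraction of the buffer currently occupied.
    return state.buffered_bytes / state.buffer_capacity

def should_handover(serving_rss_dbm: float, target_rss_dbm: float,
                    hysteresis_db: float = 3.0, min_target_dbm: float = -85.0) -> bool:
    # Threshold-plus-hysteresis decision instead of a 50% coin flip: hand over
    # only if the target is acceptably strong AND sufficiently better than the
    # serving cell.
    return (target_rss_dbm >= min_target_dbm and
            target_rss_dbm >= serving_rss_dbm + hysteresis_db)
```

The simulator would then increment signaling_messages whenever a measurement report or handover request is sent, and increment handover_failures when admission is rejected or the link drops mid-handover, so every reported number traces back to observed events.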

Validity of the findings

The mathematical model uses assumed constants (k, γ, δ) without evidence; clarify how they are derived.

Only one baseline algorithm is used; include more for stronger validation.

No statistical analysis (e.g., confidence intervals, variance) is presented; add these to support claims.

Performance claims are not backed by formal hypothesis testing; include statistical tests where appropriate (an illustrative example follows).
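To make the requested statistical reporting concrete, a sketch along these lines would suffice, assuming per-run results are collected for MSCHF and the baseline over repeated, independently seeded simulation runs (the arrays below are placeholders, not real data):

```python
import numpy as np
from scipy import stats

# Placeholder per-run signaling costs; replace with the actual measurements.
mschf_runs    = np.array([112.0, 108.5, 115.2, 110.1, 109.7])
baseline_runs = np.array([131.4, 128.9, 135.0, 130.2, 133.6])

def mean_ci(x: np.ndarray, confidence: float = 0.95):
    """Sample mean with a t-based confidence interval."""
    m, se = x.mean(), stats.sem(x)
    half = se * stats.t.ppf((1 + confidence) / 2.0, len(x) - 1)
    return m, (m - half, m + half)

m_mschf, ci_mschf = mean_ci(mschf_runs)
m_base,  ci_base  = mean_ci(baseline_runs)

# Welch's t-test (no equal-variance assumption) for the difference in means.
t_stat, p_value = stats.ttest_ind(mschf_runs, baseline_runs, equal_var=False)

print(f"MSCHF:    mean={m_mschf:.1f}, 95% CI={ci_mschf}")
print(f"Baseline: mean={m_base:.1f}, 95% CI={ci_base}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```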

Reviewer 3

Basic reporting

The manuscript proposes an innovative MSCHF algorithm designed to minimize signaling costs and improve vertical handover efficiency in wireless communication networks. The algorithm operates by optimizing the signaling procedure during the handover process between heterogeneous networks (e.g., cellular to Wi-Fi networks) and ensures minimal latency and resource usage. The paper describes the application of Markov decision processes (MDPs) to model the handover decision-making process and employs an approximate dynamic programming approach to find the optimal solution. The algorithm's performance is evaluated and compared with traditional methods in terms of signaling cost reduction and handover efficiency.

Content and Structure Review

Abstract
The abstract provides a concise summary of the problem, methodology, and key findings. It introduces the MSCHF algorithm, its optimization of handover efficiency, and its comparison with traditional methods. However, the abstract could benefit from a more explicit mention of the quantitative improvements achieved by the algorithm.

Suggestions:
Simplify the mention of technical terms (e.g., Markov decision processes) for broader accessibility, or provide a brief definition.

Include a clearer mention of specific performance metrics or improvement percentages (e.g., cost reduction, latency decrease) to make the impact more tangible.

Introduction
The introduction is clear and sets the context well by identifying the challenges in vertical handover within heterogeneous networks. It effectively justifies the need for cost optimization and efficiency improvement in handovers and introduces the MSCHF algorithm as a potential solution. However, the introduction could be expanded to provide more detail on the current state of vertical handover methods and their limitations.

Suggestions:
More discussion on current approaches and their limitations would better emphasize the novelty and advantages of the MSCHF algorithm.

A brief mention of real-world applications or industries that could benefit from the algorithm would strengthen the introduction's relevance.

Literature Review
The literature review provides a good overview of existing techniques for handover management and signaling cost minimization in wireless networks. It mentions relevant methods, such as Markov decision processes (MDPs), and compares them to the proposed algorithm. However, it could benefit from deeper comparisons of the MSCHF algorithm with recent works specifically focused on 5G networks and heterogeneous networks.

A more detailed comparison with other optimization techniques (e.g., genetic algorithms, Q-learning) would provide further justification for the choice of MDPs in the MSCHF algorithm.

Experimental design

Methodology

The methodology section clearly outlines the components of the MSCHF algorithm, including the use of MDPs and approximate dynamic programming for optimization. The paper describes how the algorithm is applied to vertical handover decisions and evaluates its performance. However, the section could benefit from a more detailed explanation of the optimization process, including how parameters are chosen and how the handover decision-making process is modeled.

Suggestions:
Provide more detail on how the MDPs are constructed, specifically the state and action spaces, and how the rewards are defined (a sketch of one possible formulation is given after these suggestions).

Explain the approximate dynamic programming approach in more depth, especially how it contributes to improving efficiency without excessive computational cost.
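For illustration only, one possible way to make the state, action, and reward spaces explicit is sketched below in Python. The discretizations, weights, and penalty structure are hypothetical and would need to be replaced by (and justified against) the authors' actual model:

```python
from dataclasses import dataclass
from enum import Enum

class Network(Enum):
    CELLULAR = 0
    WIFI = 1

@dataclass(frozen=True)
class State:
    serving: Network        # currently serving RAT
    rss_level: int          # discretized received signal strength, e.g. 0..4
    buffer_level: int       # discretized buffer occupancy, e.g. 0..4
    speed_class: int        # discretized user speed, e.g. 0 (static) .. 2 (vehicular)

class Action(Enum):
    STAY = 0                # remain on the serving network
    HANDOVER = 1            # hand over to the candidate network

def reward(state: State, action: Action,
           w_signaling: float = 1.0, w_failure: float = 5.0, w_qos: float = 0.5) -> float:
    # Illustrative reward: a handover pays a signaling cost and risks failure,
    # while staying on a weak link pays a QoS penalty. The weights are
    # placeholders that the paper would need to justify.
    r = 0.0
    if action is Action.HANDOVER:
        r -= w_signaling                        # signaling overhead of the handover
        if state.rss_level <= 1:
            r -= w_failure * 0.2                # higher failure risk at low RSS
    else:
        if state.rss_level <= 1:
            r -= w_qos * (2 - state.rss_level)  # degraded service on a weak link
    return r
```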

Technical Review

Methodology and Algorithms
The choice of MDPs for modeling vertical handover decision-making is appropriate, as it allows for systematic optimization of the handover process. The use of approximate dynamic programming is also a promising approach for reducing computational complexity, but the methodology lacks some detail regarding how the approximation is performed and its impact on performance.

Suggestions:
Provide a more explicit explanation of the approximation in dynamic programming: how does it reduce computation while maintaining accuracy? (One standard formulation is sketched after these suggestions.)

Discuss the potential trade-offs in computational complexity and solution accuracy due to the approximate dynamic programming approach.
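One standard way to present such an approximation (not necessarily the authors' exact scheme) is the Bellman optimality equation combined with a parametric value function:

```latex
% Exact Bellman optimality equation over states s and actions a:
\[
  V^{*}(s) \;=\; \max_{a}\Big[\, r(s,a) \;+\; \gamma \sum_{s'} P(s' \mid s,a)\, V^{*}(s') \,\Big]
\]

% Approximate dynamic programming replaces V^{*} with a parametric form,
% for example a linear combination of state features \phi(s):
\[
  \hat{V}_{\theta}(s) \;=\; \theta^{\top} \phi(s)
\]

% The parameters \theta are fitted on a sample of visited states instead of
% sweeping the full state space, which is where the computational saving
% (and the accuracy trade-off) comes from.
```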

Hyperparameter Tuning and Validation Techniques
The paper does not provide detailed information on how the hyperparameters of the MDP model are tuned or validated. This lack of detail makes it challenging to evaluate the robustness and generalizability of the proposed solution.

Suggestions:
Describe the hyperparameter tuning process for the MSCHF algorithm, including the search methods used for selecting parameters.

Explain the validation techniques used to assess model performance. Was k-fold cross-validation or any form of cross-validation applied?

Performance Evaluation

Result Presentation
The results are presented with clear metrics on signaling cost reduction and handover efficiency. The comparison with traditional methods demonstrates the algorithm’s effectiveness. However, the results could be presented more comprehensively with additional visualizations.

Suggestions:
Provide graphs comparing the performance of the MSCHF algorithm against traditional methods (e.g., bar charts, line graphs).

Include a confusion matrix or more detailed performance breakdowns to show the model’s strengths and weaknesses in different scenarios.

Validity of the findings

Statistical Tests
The paper presents promising results, but it lacks statistical validation of the reported improvements. The performance differences between the MSCHF algorithm and traditional methods are not statistically tested.

Suggestions:
Conduct statistical tests (e.g., t-tests or ANOVA) to validate the significance of the reported improvements.

Report confidence intervals or standard deviations for key performance metrics to provide a better understanding of the model's stability.

Visualization and Analysis

Figures and Tables
The figures and tables are generally well-organized but could benefit from more context. For example, some of the graphs are not clearly labeled, making them difficult to interpret without additional explanation.

Suggestions:
Include more detailed captions for each figure, explaining the significance of the data and the key takeaways.

Add more visualizations that illustrate the optimization process, such as flowcharts or diagrams showing the decision-making flow within the MSCHF algorithm.

Reviewer 4

Basic reporting

In general, the work is sufficient.

Experimental design

The methods have novelty.

Validity of the findings

The validity of the findings is fine and acceptable.

Additional comments

No further comments.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.