Review History

All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

Summary

  • The initial submission of this article was received on June 30th, 2025 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on September 18th, 2025.
  • The first revision was submitted on October 29th, 2025 and was reviewed by 2 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on November 24th, 2025.

Version 0.2 (accepted)

Academic Editor

Accept

The authors have addressed all the reviewers' comments and have done a good job with their revisions. This manuscript is now ready for publication.

[# PeerJ Staff Note - this decision was reviewed and approved by Xiangjie Kong, a PeerJ Section Editor covering this Section #]

Reviewer 1

Basic reporting

In my opinion, the authors did a great job in improving the manuscript. I have no further comments and recommend acceptance.

Experimental design

In my opinion, the authors did a great job in improving the manuscript. I have no further comments and recommend acceptance.

Validity of the findings

In my opinion, the authors did a great job in improving the manuscript. I have no further comments and recommend acceptance.

Additional comments

In my opinion, the authors did a great job in improving the manuscript. I have no further comments and recommend acceptance.

Reviewer 2

Basic reporting

All suggestions have been incorporated correctly.

Experimental design

The comments on the experimental design section have been addressed correctly.

Validity of the findings

The results are presented well.

Version 0.1 (original submission)

Academic Editor

Major Revisions

**PeerJ Staff Note:** Please ensure that all review, editorial, and staff comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

**Language Note:** When preparing your next revision, please ensure that your manuscript is reviewed either by a colleague who is proficient in English and familiar with the subject matter, or by a professional editing service. PeerJ offers language editing services; if you are interested, you may contact us at [email protected] for pricing details. Kindly include your manuscript number and title in your inquiry. – PeerJ Staff

Reviewer 1

Basic reporting

The manuscript addresses a significant gap in computer-aided hematology by presenting reliable methods for detecting schistocytes, which are tiny, morphologically diverse red blood cell fragments critical for diagnosing thrombotic microangiopathies. The authors introduce two innovations: (i) the first public dataset to explicitly label schistocytes (SBSD) and (ii) MCS-Net, a YOLOv10-based detector augmented with four bespoke modules (MSCA, CSD, SHA, and NWD). Reported results demonstrate impressive gains over strong baselines, with favorable trade-offs between speed and accuracy.

While the paper is well organized and largely clear, several methodological details are missing or under-explained. Additionally, the experimental validation, though extensive, could be strengthened to convince a high-impact computer vision audience.

In Section 3, the architectural components are described individually. However, a comprehensive diagram illustrating the full data flow through the YOLOv10 backbone and into the four proposed modules (MSCA, CSD, SHA, NWD) is missing. This makes it difficult to visualize precisely how they integrate. Which feature maps do they operate on? Do they function in parallel or sequentially?

The NWD loss function in Section 3 is described at a high level. For full reproducibility, the formal description should be more detailed. For example, how are the 2D Gaussian distributions specifically parameterized from the bounding box coordinates? What were the exact values or ranges for the hyperparameters used in this loss function during training?
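For reference, the standard formulation from the tiny-object detection literature, which I assume the NWD module follows (the authors should confirm or note any deviation), models each horizontal box $(c_x, c_y, w, h)$ as a 2D Gaussian:

$$\mu = \begin{pmatrix} c_x \\ c_y \end{pmatrix}, \qquad \Sigma = \begin{pmatrix} w^2/4 & 0 \\ 0 & h^2/4 \end{pmatrix},$$

so that the squared 2-Wasserstein distance between two boxes $A$ and $B$ has the closed form

$$W_2^2(\mathcal{N}_A, \mathcal{N}_B) = \left\lVert \left(c_{x_A},\, c_{y_A},\, \tfrac{w_A}{2},\, \tfrac{h_A}{2}\right)^{\mathsf{T}} - \left(c_{x_B},\, c_{y_B},\, \tfrac{w_B}{2},\, \tfrac{h_B}{2}\right)^{\mathsf{T}} \right\rVert_2^2,$$

with the normalized metric

$$\mathrm{NWD}(\mathcal{N}_A, \mathcal{N}_B) = \exp\left(-\frac{\sqrt{W_2^2(\mathcal{N}_A, \mathcal{N}_B)}}{C}\right),$$

where $C$ is a dataset-dependent constant, commonly tied to the average absolute object size. Stating whether this is the exact form used, and the value chosen for $C$, would resolve the reproducibility concern.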

The term "lightweight" is used to describe the CSD+NWD configuration, but this is a relative term. A more rigorous definition should be provided, perhaps by defining a specific threshold for parameters or GFLOPs below which a model is considered "lightweight" in the context of clinical deployment hardware.
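As a purely hypothetical example of how such a criterion could be made operational (the 10M-parameter budget below is illustrative, not a value from the manuscript):

```python
# Illustrative "lightweight" check: the parameter budget is a hypothetical
# threshold for clinical deployment hardware, not a value from the paper.
import torch.nn as nn

LIGHTWEIGHT_PARAM_BUDGET = 10_000_000  # hypothetical 10M-parameter budget

def count_parameters(model: nn.Module) -> int:
    """Total trainable parameters of a PyTorch model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

def is_lightweight(model: nn.Module) -> bool:
    """True if the model fits within the declared deployment budget."""
    return count_parameters(model) <= LIGHTWEIGHT_PARAM_BUDGET
```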

The SBSD dataset is introduced as a major contribution. However, per the current manuscript content, the precise details regarding public access or raw data sharing (such as providing a DOI or data repository) are not clearly stated in the data-sharing statement. This must be addressed explicitly to meet data-sharing policy requirements—clarification is needed on how researchers may access SBSD for reproducibility.

Experimental design

The work presents two key novel contributions:

1) The SBSD Dataset: This is the first publicly described, expert-annotated blood smear dataset that specifically includes the challenging and clinically significant schistocyte cell type. This is a novel data contribution to the field, but no public link is provided.

2) The MCS-Net Model: This is a new neural network architecture. While it builds on an existing backbone (YOLOv10), it introduces four purpose-built modules (MSCA, CSD, SHA, NWD) designed to solve specific, well-documented problems in medical object detection (multi-scale features, tiny object localization, etc.). This represents novel methodological research.

In the experimental section, why not include more relevant comparisons against other state-of-the-art methods specifically designed for medical imaging or small object detection? The chosen baselines are all general-purpose detectors. A comparison against specialized architectures would make the claims of MCS-Net's superiority more convincing. The complete absence of transformer-based detectors (e.g., DETR variants) is a notable omission.

Figures 15 and 16 are unclear because of overlapping bounding boxes and low image resolution. I suggest adding zoomed-in insets to highlight the detections.

In the figures, the meaning of the bounding box colors should be explicitly stated in the caption (e.g., red for schistocytes, etc.). The figures would also be stronger if annotations (e.g., arrows) pointed out the defining morphological features of each cell type. If possible, provide high-resolution figures.

The manuscript must include an ethics statement regarding the collection and use of patient data for the SBSD dataset, including information on patient consent and institutional review board (IRB) approval. This is a critical omission.

Validity of the findings

The discussion of the results lacks statistical validation. All performance improvements are presented as point estimates (e.g., a 3.6% gain in mAP). To be rigorous, these comparisons require confidence intervals or statistical significance tests (e.g., paired t-tests) over multiple training runs to demonstrate that the observed improvements are not due to random initialization or training variance. In addition, it should be discussed and justified why the results provided in Tables 2 and 3 (above all) are quite similar to each other, and why the use of MCS-Net is motivated.
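For illustration, such a test could be as simple as the following sketch, where the per-seed mAP values are hypothetical placeholders, not results from the manuscript:

```python
# Illustrative sketch: paired t-test over per-seed mAP of a baseline detector
# vs. MCS-Net. All values below are hypothetical placeholders.
from scipy import stats

baseline_map = [0.861, 0.858, 0.864, 0.859, 0.862]  # hypothetical, 5 seeds
mcsnet_map   = [0.895, 0.891, 0.898, 0.893, 0.896]  # hypothetical, 5 seeds

t_stat, p_value = stats.ttest_rel(mcsnet_map, baseline_map)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # p < 0.05 would support a real gain
```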

The rationale for not subdividing the granulocyte category is mentioned in the Discussion, but this limitation should be acknowledged earlier in the Methods section when describing the SBSD dataset, since it clearly affects this WBC category.

Reviewer 2

Basic reporting

The article titled "MCS-Net: A hierarchical multi-scale neural framework for schistocyte detection in blood smears" is well written and clearly described. The quality of the manuscript would be further improved by addressing the following comments:

1. The article should follow the journal's manuscript preparation guidelines.

2. The authors should proofread the manuscript for typographical and spelling errors.

3. The literature review is not properly presented in the manuscript. The authors have omitted several relevant studies in the domain; they should therefore state the inclusion and exclusion criteria used when selecting studies.

Experimental design

The experimental design of the study is well described.

Validity of the findings

Well presented.

Additional comments

The authors should include the latest studies from the literature.

Reviewer 3

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

1. The Introduction section should be enriched with more up-to-date studies from the literature.

2. The motivation and goal of the study should be presented in more detail, explaining how the proposed method fills a gap in the literature.

3. The proposed method is presented in Figure 1 and should be discussed and explained in more detail.

4. All datasets are split in an 8:1:1 ratio. To obtain more reliable results, it would be good to perform k-fold cross-validation; a minimal sketch follows this list.
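A minimal sketch of such a protocol, assuming image-level splits (the dataset layout is a hypothetical placeholder, and the training call would be the authors' own pipeline):

```python
# Minimal 5-fold cross-validation sketch. The path pattern is a hypothetical
# placeholder for the SBSD layout; training is left as a comment because it
# depends on the authors' MCS-Net pipeline.
from glob import glob
import numpy as np
from sklearn.model_selection import KFold

image_paths = np.array(sorted(glob("SBSD/images/*.png")))  # hypothetical layout
kf = KFold(n_splits=5, shuffle=True, random_state=42)

for fold, (train_idx, val_idx) in enumerate(kf.split(image_paths), start=1):
    print(f"fold {fold}: {len(train_idx)} train / {len(val_idx)} val images")
    # Here: train MCS-Net on image_paths[train_idx], evaluate mAP on
    # image_paths[val_idx], and collect the per-fold scores so the final
    # result can be reported as mean ± standard deviation.
```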

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.