Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on February 26th, 2025 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on May 8th, 2025.
  • The first revision was submitted on September 3rd, 2025 and was reviewed by 2 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on October 17th, 2025.

Version 0.2 (accepted)

· · Academic Editor

Accept

The authors have addressed all the reviewers' comments.

[# PeerJ Staff Note - this decision was reviewed and approved by Vladimir Uversky, a PeerJ Section Editor covering this Section #]

Reviewer 1 ·

Basic reporting

This is a revision; the authors claimed they had addressed the reviewers' comments. In fact, they did not, and they show no respect for the comments.

Experimental design

-

Validity of the findings

-

·

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

All previously submitted comments have been well addressed; this is greatly appreciated.

Version 0.1 (original submission)

· · Academic Editor

Major Revisions

**PeerJ Staff Note:** Please ensure that all review and editorial comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

**Language Note:** The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title). Alternatively, you should make your own arrangements to improve the language quality and provide details in your response letter. – PeerJ Staff

Reviewer 1 ·

Basic reporting

-

Experimental design

-

Validity of the findings

-

Additional comments

1. The decision to reject this manuscript without further review is based on several major concerns. The authors list results without fully explaining the motivation for the experiments, the methods and analysis procedures, or the observations; in other words, the authors list results without telling a scientific story. Data presentation and description need to be more precise and rigorous. These issues make it difficult to evaluate the science accurately and thoroughly. In addition, some important references are missing, such as PMID: 37601374; PMID: 36068190; PMID: 35252042; PMID: 35222296; PMID: 39854809; PMID: 36591809; and PMID: 20889547.

**PeerJ Staff Note:** It is PeerJ policy that additional references suggested during the peer-review process should only be included if the authors are in agreement that they are relevant and useful.

2. Moreover, this paper is poorly written and its logic is poor, especially in the Abstract and Introduction. There are so many grammatical errors, incorrect word choices, and awkward expressions in the paper that it took me a long time to understand the content. I suggest rejection; the manuscript should be rewritten and revised by an English editing expert before publication. In addition, initial checks revealed numerous passages with excessive similarity (more than 40%) to previously published works.

·

Basic reporting

1. The manuscript generally uses clear language; however, several grammar and syntax issues reduce readability. Examples include:
- Line 38: "which can save the economy and time cost" should be rephrased to "which can reduce economic and time costs".
- Line 45: Keywords listed as "blood; nucleic acid; virus" are too generic. Suggest replacing with more targeted terms like "nucleic acid testing", "blood screening", or "PCR-based diagnostics".
- Line 69: "passed the enzyme-linked immunosorbent assay" is ambiguous; clarify whether these samples were deemed suitable for NAT (Nucleic Acid Testing).
- Line 94: Sentence lacks clarity; suggest rewording to clearly define each component of the Roche instrumentation.
- Line 248: “number of pools that responded to the mixed test was the largest” should be revised to “had the highest number of reactive pools”.
- Line 285: “in most cases in the process of testing” is redundant; simplify to “during testing”.
- Line 294: “saved economic and time costs” should be revised to “reduced both time and economic costs”.
Similarly, other typographical and language errors throughout the manuscript reduce readability; it is strongly recommended that these be corrected.

2. Consistency and terminology need improvement:
- Use "reactive samples" consistently instead of alternating with "reaction samples".
- Define "effective resolution rate" clearly when first introduced (Line 18 or 113).
- Consider adding a short glossary or explanations for uncommon terms such as “invalid single pools” and “split reactive pools”, as well as the terms at Lines 60-62.

3. Abbreviations:
- RS (Reactive Specimens) is used without clear definition in the abstract. Please define all abbreviations upon first use and maintain consistency.

4. Figures and tables:
- Tables are informative but captions are minimal. Provide brief interpretations or indications of statistical significance in each caption.

5. Literature and citations:
- The reference list is somewhat short, and most of the references are pre-2021. Including more recent studies (2022–2024) on blood screening or NAT platform comparisons, and adding references for key terms, would improve the quality of the manuscript.

Experimental design

The study presents a clearly defined objective: to compare the Kehua and Roche nucleic acid detection systems. It includes a comprehensive dataset covering multiple years (2016–2024), which adds robustness. The following changes are recommended to enhance the quality of the manuscript.
1. Methods are described with reasonable detail, although information on sample inclusion/exclusion criteria, ethical considerations, and data handling could be elaborated.
2. It is recommended to clarify the procedural differences between the Kehua and Roche systems, how they differ in automation, and how this automation can affect testing performance.

Validity of the findings

Results are presented with statistical analysis (chi-square tests) and appropriate P-values. The findings support the conclusion that Kehua is comparable to Roche in most performance indicators, with some differences (e.g., HCV detection rates). The following aspects should be addressed before the manuscript is accepted (an illustrative sketch of the chi-square comparison appears after the list below).

1. Could you elaborate on the criteria for designating a pool as invalid in both systems?
2. Was any verification performed on the discordant samples between the two systems?
3. How did the transition in pooling strategy for the Kehua system in 2018 affect comparative analysis?
4. Can you provide additional insights into why Kehua's HCV detection rate was significantly lower?
5. Were there any significant trends in false negatives or positives across the two systems?
6. CT value distributions are informative, but conclusions would benefit from further discussion of the clinical relevance of CT thresholds.
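As context for the chi-square comparison referenced above, here is a minimal sketch of how reactive counts from two screening systems could be compared with a 2×2 chi-square test. It is illustrative only: the counts, and the choice of Python with SciPy, are assumptions for demonstration and are not taken from the manuscript under review.

```python
# Illustrative only: hypothetical counts, not data from the manuscript.
from scipy.stats import chi2_contingency

# Rows: detection system; columns: reactive vs. non-reactive donations.
# These counts are placeholders chosen purely to show the calculation.
counts = [
    [120, 49880],   # hypothetical Kehua system: reactive, non-reactive
    [150, 49850],   # hypothetical Roche system: reactive, non-reactive
]

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.3f}, degrees of freedom = {dof}, P = {p_value:.4f}")
```

A P-value below the chosen significance level (e.g., 0.05) would indicate that the two systems' reactive rates differ by more than chance alone would explain.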

Additional comments

The study is timely and addresses an important issue in transfusion safety. Strengths include the longitudinal data and the side-by-side analysis of invalid results and CT values.
1. While comprehensive, the manuscript would benefit from professional language editing.
2. Consider adding a flowchart to clarify the comparison process between the two systems.

·

Basic reporting

The authors provide an in-depth comparison of the Kehua and Roche nucleic acid detection systems. The data are backed by pertinent statistical evaluation.

Experimental design

A substantial sample size combined with multi-year data offers robust statistical strength.

Validity of the findings

The findings are presented clearly and backed by statistical analysis. The conclusion that both systems are effective, albeit with different advantages, is well-founded.

Additional comments

Table 1 has a header formatting issue that should be corrected.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.