Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.

Summary

  • The initial submission of this article was received on November 29th, 2024 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on March 14th, 2025.
  • The first revision was submitted on May 7th, 2025 and was reviewed by 2 reviewers and the Academic Editor.
  • A further revision was submitted on September 3rd, 2025 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on October 7th, 2025.

Version 0.3 (accepted)

Academic Editor

Accept

The authors positively addressed the minor concerns of the reviewers from the previous round of review. The manuscript is ready for publication now.

[# PeerJ Staff Note - this decision was reviewed and approved by Vicente Alarcon-Aquino, a PeerJ Section Editor covering this Section #]

Version 0.2

Academic Editor

Minor Revisions

The reviewers appreciated the revised version of the paper; all previously highlighted concerns have been positively addressed. They thus recommend acceptance after a minor revision. In particular, the reviewers advocate adding a brief discussion of how the current work compares with related approaches based on artificial intelligence, and they point out two papers that the work should be compared against.

**PeerJ Staff Note:** It is PeerJ policy that additional references suggested during the peer-review process should only be included if the authors agree that they are relevant and useful.

**PeerJ Staff Note:** Please ensure that all review, editorial, and staff comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

Reviewer 5

Basic reporting

Cybersecurity issues, lack of proper awareness, lack of a proper framework, absence of skillful staff, lack of quick response in emergencies, high expenses, lack of privacy, management issues, malicious activities, resource absences, confusion over system measurement, human error during software development, and non-reliability.

Experimental design

They prioritized and evaluated the severity of each cybersecurity problem using the Fuzzy-TOPSIS technique to highlight the importance of their work.

Validity of the findings

This work incorporates the cutting-edge Fuzzy-TOPSIS method, a computational technique that has proven effective in handling fuzziness and ambiguity in many areas when applied to decision-making problems.
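For context on the method referred to above, a minimal sketch of the classical (crisp) TOPSIS ranking step follows, using hypothetical ratings and weights; the paper under review applies the fuzzy variant, which replaces crisp ratings with fuzzy numbers before these same steps, so this is an illustration rather than the authors' implementation:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with classical (crisp) TOPSIS.

    matrix  : list of rows, one per alternative (here: a cybersecurity challenge)
    weights : one weight per criterion, summing to 1
    benefit : True where a higher rating is better, False for cost criteria
    """
    n_crit = len(weights)
    # 1. Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # 2. Ideal best / worst per criterion (direction depends on benefit vs. cost).
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    # 3. Closeness to the ideal solution: d- / (d+ + d-); higher means ranked higher.
    scores = []
    for row in v:
        d_best = math.dist(row, best)
        d_worst = math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Hypothetical severity ratings for three challenges on two benefit criteria.
scores = topsis([[7, 9], [8, 6], [5, 8]], weights=[0.6, 0.4], benefit=[True, True])
```

The ratings, weights, and criteria here are invented for illustration; in the paper they come from the expert survey.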

Additional comments

Fascinating and timely article. It deserves publication, and I am recommending acceptance with corrections. Some issues require your attention. I list these corrections below as feedback/comments, and I look forward to reading this article's updated version.

• Given the complexity involved, the author has produced many positive and welcome outcomes. The literature review offers a useful overview of current research and policy, and the resulting bibliography provides a very useful resource for current practitioners.

• This research reveals an interesting finding about 'Cyber Security Challenges for Software Vendors through A Fuzzy-TOPSIS Approach'. However, I would like to see more discussion of exactly what this finding means and its implications for the related topic of cybersecurity threats, exploits, and vulnerabilities in new software bills of materials with artificial intelligence (see: https://journals.sagepub.com/doi/pdf/10.1177/15485129241267919) and for the related topic of 'AI security and cyber risk in IoT systems' (see: https://doi.org/10.3389/fdata.2024.1402745). It would be interesting to see a few sentences reviewing and comparing your work in relation to these recent studies on related topics.

**PeerJ Staff Note:** It is PeerJ policy that additional references suggested during the peer-review process should only be included if the authors agree that they are relevant and useful.


• This is a well-written article that identifies an important gap.

• While this study is largely confirmatory, it is still a useful and welcome contribution.

I hope the comments and feedback are helpful, and well done for writing such an interesting article. I am looking forward to reading the updated version.


Reviewer 6

Basic reporting

The keywords must be sorted in alphabetical order.

I suggest reviewing the research questions, as the same ones appear twice (Introduction and SLR).

I suggest improving the quality of the figures.

Experimental design

In section “III. Research Design,” it should be noted that Barbara Kitchenham's guidelines detail Planning the SLR (identifying the need for an SLR; developing a review protocol: a) research questions, b) search strategy, c) inclusion and exclusion criteria, d) quality assessment), Conducting the SLR (identification of research (scientific databases), selection of studies, quality assessment), and Reporting the SLR (answering the research questions). However, in “3) REPORTING THE REVIEW,” only “3.1 QUALITY EVALUATION” and “3.2 PUBLICATION YEAR” are carried out. This causes confusion, as the research questions defined in the SLR have not been answered. In the subsection “A. SLR Findings,” it is stated that “For RQ1, the proposed method utilized the SLR technique to identify 13 critical cybersecurity issues,” but it is not explained how those 13 critical cybersecurity issues were extracted.

Validity of the findings

The discussion should detail the findings and compare them with other similar results. I therefore suggest improving the discussion section so that a summary of the findings is presented.

The conclusions of the study are very general, and I therefore suggest that they be improved.


Version 0.1 (original submission)

Academic Editor

Major Revisions

The reviewers agreed on the importance of the topic and found the work solid. Still, some improvements should be made before publication: the presentation of the paper should be improved, some aspects of the study clarified, and some discussion added. In addition to addressing all comments from the reviewers, the revision should consider the following points.

[1] Expand the related work discussion by comparing the current submission with similar approaches. More recent publications should also be added.

[2] Better explain the SLR process (e.g., inclusion and exclusion criteria, people involved, statistical significance of the expert survey experiment) as indicated by the reviewers.

[3] Strengthen the discussion about the findings of the study and elaborate on the implications of prioritized cybersecurity challenges on software development practice.

Please also proofread the whole paper and fix the typesetting errors highlighted by the reviewers.

**PeerJ Staff Note:** Please ensure that all review and editorial comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.

**Language Note:** The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at [email protected] for pricing (be sure to provide your manuscript number and title). Alternatively, you should make your own arrangements to improve the language quality and provide details in your response letter. – PeerJ Staff

Reviewer 3

Basic reporting

The paper performs an SLR to detect the most important challenges for software vendors in the cybersecurity context. The security problems found have been validated by consulting experts and prioritized using the Fuzzy-TOPSIS technique to rank and assess the risks associated with security aspects.

The introduction is too general in the context of cybersecurity. Cybercrimes also include personal contact, for example via social networks, which is not affected by cybersecurity issues during software development.

Regarding the classification and prioritization of cybersecurity challenges, several risks and vulnerabilities could be used as mechanisms for assessing the prioritization, but they have not been included in the paper.

There is a more recent ‘Cybercrime’ analysis published by Cisco in 2024, which is more up to date than the ‘Norton Cybercrime Report 2011’.

The related work section is too general and could serve as the introduction of any paper on cybersecurity; it is structured more like an introduction than a related work section. I recommend three parts: previous papers that have analysed cybersecurity challenges, a paragraph focused on software vendors, and a final one about where prioritization techniques have been applied in the cybersecurity context.

There are several prioritization techniques, such as AHP or VIKOR. The paper does not explain why TOPSIS has been selected in this case.

Figures and tables are placed very far from the text where they are referenced.

I would use a bar diagram in Figure 3.

The definition included on page 8, line 293 is not clear: the definition of what? Additionally, the linguistic scale is not presented.

The ‘Fuzzy TOPSIS’ subsection includes several references where the TOPSIS technique has been applied, but they are not related to cybersecurity. Obviously, there are many examples where Fuzzy-TOPSIS can be applied, but references related to the case study would be preferable.

Why has “Lack of Quality, Liability, and Reliability” been treated as a single challenge instead of 3 different ones?

The matrix on page 9 has some typos, such as ‘…….’, R(nm) without ‘()’, c11, c12…….Cm, and the small squares. Additionally, the left part, where D=A2 appears, is not explained. The possible alternatives and criteria should also be stated clearly.

In equations (2), (3) and (6), the variable ‘U’ appears with one, two and three subindexes. What is ‘min lij’?

There are several mistakes in the equations, which makes the steps very difficult to follow. Please review both the equations and the steps carefully, adding explanations and correcting the mistakes.

In the SLR findings section, awareness, management, security and eminence are listed, but they are not related to the text and the proposal.

Section 3.2 is referred to, but the sections are not numbered; letters are used.

Additionally, there are several grammar typos, some of them are:
- challenges. Like
- In the proposed paper a -> In the proposed paper, a
- million Indians have fallen victim to 88 cybercrime in India -> million Indians have fallen victim to 88 cybercrimes.
- In Stage 1, With the
- In Stage 3, We u
- technique. [32-35]
- 1981. [39-41] The
- criteria. [42-44].
- challenges. [45-46]
- 1,2,3,...
- upto ”n”
- upto “m”
- Cieuro???(0,1)
- for example, In an extended
- criterion is "eminence," and
- cyber-attacks," CCSC2

Experimental design

The form used with the experts should be available.

The publication period is from 2001 to 2021; no explanation is given for why there are no publications from 2022 to 2024. If there are no relevant papers in the last 3 years, is it possible to claim that this is a ‘hot topic’? Additionally, the persons involved and the role of each in the SLR have not been included. Who performed each phase?

The snowballing process has not been included in the SLR. This process can be very relevant for extracting additional relevant papers.

I appreciate the incorporation of software engineering experts, and I recognize that this is not easy; however, only 5 experts have been consulted, and there are no details about their profiles, such as area, experience, or responsibilities in the company.

Validity of the findings

In general, the paper analyses a relevant topic, combining an SLR study with the expert-based prioritization of the challenges found. However, several mistakes make the paper difficult to follow and raise doubts about the correctness of the steps followed.

SLR papers usually include a link to the specific papers that have been selected and, in this case, the challenges extracted from each of them.


·

Basic reporting

The manuscript contains numerous grammatical errors and awkward phrasings, which hinder readability. Below are a few examples:
- Line 23: "like cyber security"
- Line 166: "issues prioritization as shown in Figure 1."
Equations, figures and tables should be carefully edited.

Experimental design

- The manuscript lacks details on how industry experts were selected. Were they from diverse geographical regions or industries?
- More explanation is needed on how weights were assigned to cybersecurity challenges.

Validity of the findings

- Were any reliability tests conducted to ensure consistency in expert opinions?
- A sensitivity analysis on the ranking of cybersecurity challenges would strengthen the findings.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.