All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.
Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.
I am pleased to inform you that your work has now been accepted for publication in PeerJ Computer Science.
Please be advised that you are not permitted to add or remove authors or references post-acceptance, regardless of the reviewers' request(s).
Thank you for submitting your work to this journal. On behalf of the Editors of PeerJ Computer Science, we look forward to your continued contributions to the Journal.
With kind regards,
[# PeerJ Staff Note - this decision was reviewed and approved by Claudio Ardagna, a PeerJ Section Editor covering this Section #]
All set as per my previous review.
All set as per my previous review.
All set as per my previous review.
All of my earlier comments have been addressed. I recommend acceptance.
I have received reviews of your manuscript from scholars who are experts on the cited topic. They find the topic very interesting; however, several concerns must be addressed regarding experimental results (more datasets), the research gap in education, and comparisons with current approaches. These issues require a major revision. Please refer to the reviewers’ comments listed at the end of this letter, and you will see that they are advising that you revise your manuscript. If you are prepared to undertake the work required, I would be pleased to reconsider my decision. Please submit a list of changes or a rebuttal against each point that is being raised when you submit your revised manuscript.
Thank you for considering PeerJ Computer Science for the publication of your research.
With kind regards,
**PeerJ Staff Note:** Please ensure that all review, editorial, and staff comments are addressed in a response letter and any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.
The article seems to be written in professional English.
The article references various studies, such as the works by Smith et al., Asif et al., and Latif et al. This indicates a foundation in existing literature and provides context.
The article has sections like "Related Work", "Dataset Collection and Preprocessing", and "Methodology". It also references tables and figures. The raw data collection process is described, especially in the context of online student behavior during the pandemic.
The article mentions various methodologies and results.
The article provides definitions for various terms, especially in the context of the dataset.
The study aims to evaluate the influence of students' online learning data on their learning performance, especially during the COVID-19 pandemic.
The study proposes three research questions:
RQ 1: Can the chosen model effectively predict using online learning data?
RQ 2: Which features have the most predictive power, and how do they differ from those in other studies?
RQ 3: Is the model behavior perceived as reasonable by course teachers in practice?
The study mentions obtaining written informed consent from all participants before enrollment. The research seems to be conducted rigorously, using various machine learning algorithms and feature selection techniques.
The study details the machine learning techniques used, including Naive Bayes, LibSVM, Multi-Layer Perceptron (MLP), and others. It also describes feature selection techniques, such as filter-based feature ranking and subset selection. The use of the WEKA tool for applying EDM techniques is also mentioned.
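For readers unfamiliar with the workflow described above, a minimal sketch of filter-based feature ranking followed by classification might look like the following. This is an illustrative Python analogue (the paper itself used WEKA); the synthetic dataset, feature counts, and the choice of mutual information as the ranking criterion are assumptions for demonstration only, not the authors' actual data or settings.

```python
# Hedged sketch: filter-based feature ranking, then Naive Bayes
# classification on the top-ranked features. All data here is
# synthetic; it only illustrates the general pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for an online-learning log dataset
# (228 samples mirrors the Course A size mentioned by a reviewer)
X, y = make_classification(n_samples=228, n_features=10,
                           n_informative=4, random_state=0)

# Filter step: rank features by mutual information with the label
scores = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(scores)[::-1]
print("feature ranking (best first):", ranking)

# Evaluate a Naive Bayes classifier on the top-4 ranked features
top = ranking[:4]
acc = cross_val_score(GaussianNB(), X[:, top], y, cv=5).mean()
print(f"5-fold CV accuracy on top-4 features: {acc:.3f}")
```

A filter method like this ranks features independently of any classifier, which is cheap but can miss feature interactions; the subset-selection techniques the paper also mentions search over feature combinations instead.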
The study focuses on student learning behaviors during the COVID-19 pandemic, evaluating the performance of various techniques. The research questions posed aim to understand the effectiveness of the chosen model, the significance of features affecting academic performances, and the model's perceived reasonability by course teachers.
The study collects datasets from online courses during the COVID-19 Pandemic and evaluates the performance of each technique. The dataset construction and preprocessing are detailed, and various machine learning algorithms and feature selection techniques are employed.
The paper concludes by summarizing its main contributions, such as conducting a study on student learning behaviors during the pandemic, collecting datasets, and evaluating the performance of each technique. The paper also highlights the change in significant features affecting academic performances during the pandemic.
Strengths:
1. The study is timely, focusing on student learning behaviors during the COVID-19 pandemic, a topic of significant interest given the shift to online learning.
2. The article employs various machine learning algorithms and feature selection techniques, showcasing a thorough approach.
3. The study mentions obtaining written informed consent, indicating ethical considerations in the research process.
4. The dataset construction and preprocessing are detailed, providing transparency in the data gathering process.
5. The study poses clear and relevant research questions, guiding the research's direction.
Weaknesses:
1. While the study mentions various methodologies and results, a more in-depth analysis of each method's performance might enhance the article's value.
2. A comparison with pre-pandemic student behaviors might provide more insights into the changes brought about by the pandemic.
The authors have made significant steps toward the future education system, which is much appreciated. However, the following points are suggested to improve the standard of the article.
1. The standard of writing and presentation of the article requires improvement.
2. A few more recent articles (2022, 2023) related to the objective of the work should be reviewed and included, to validate the relevance of this work in 2023, now that the world has returned from the "new normal" to normal life.
3. The dataset (228 records for Course A and 129 for Course B) looks very small; at this scale a machine learning model may not even be required.
4. Because the dataset is small, the model's prediction rate looks high. The authors need to show that the same scores are maintained on a larger dataset.
5. The results are insufficient, with very few courses being considered (just two: Course A and Course B).
The knowledge gap under investigation should be sharpened and highlighted.
The authors need to validate how the study helps bridge the gap between the physical education system and online education.
The authors need to investigate the research goals exhaustively and with a higher technical caliber. The research should also be validated against current requirements, in a normal environment without a pandemic situation.
Much more system validation and verification, with additional courses and a larger dataset, must be included to evidence the performance, efficiency, and accuracy.
The findings are stated clearly and presented well, providing a good summary of the article.
It is much appreciated that the authors have taken important measures toward the betterment of online education in the future. However, addressing the suggestions above would raise the article's quality.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.