All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.
Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.
All reviewers have confirmed that the authors have addressed all of their comments.
[# PeerJ Staff Note - this decision was reviewed and approved by Xiangjie Kong, a PeerJ Section Editor covering this Section #]
The authors carefully addressed the comments in the revised draft, so I highly endorse the manuscript in its current form for publication.
The authors carefully addressed the comments in the revised draft, so I highly endorse the manuscript in its current form for publication.
The authors carefully addressed the comments in the revised draft, so I highly endorse the manuscript in its current form for publication.
Having thoroughly reviewed the manuscript and the authors' responses to my previous comments, I am pleased to report that all of my concerns and suggestions have been adequately addressed.
The authors have diligently incorporated my suggestions, clarifications, and revisions throughout the manuscript, enhancing its clarity, coherence, and scholarly rigor. I am particularly impressed by their attention to detail and their willingness to engage constructively with the feedback provided.
In light of the revisions made, I believe that the manuscript is now in an acceptable state for publication. Therefore, I recommend that it be accepted for publication.
Thank you for the opportunity to review this manuscript.
See section 1 for comments
See section 1 for comments
See section 1 for comments
Please see both reviewers' detailed comments. Reviewers suggest improvements such as providing a more thorough explanation of the model's methodology, offering additional details on the dataset, incorporating visual aids for comparison, addressing ethical considerations, clearly articulating the paper's contribution to existing research, and ensuring clarity and coherence in language throughout the manuscript.
**PeerJ Staff Note:** Please ensure that all review, editorial, and staff comments are addressed in a response letter and that any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.
**Language Note:** PeerJ staff have identified that the English language needs to be improved. When you prepare your next revision, please either (i) have a colleague who is proficient in English and familiar with the subject matter review your manuscript, or (ii) contact a professional editing service to review your manuscript. PeerJ can provide language editing services - you can contact us at [email protected] for pricing (be sure to provide your manuscript number and title). – PeerJ Staff
This study examines the expanding phenomenon of online self-learning in the context of lifelong learning, highlighting the growing number of users who offer unstructured feedback on their actual learning experiences. The focus on high school mathematics courses on the Bilibili platform adds a further practical dimension to the investigation of sentiment analysis in educational contexts.
The integration of an AlBERT-BiGRU+LDA hybrid model for sentiment analysis on a dataset of actual text comments from high school students is a noteworthy attempt. The model selection is also notable, as it demonstrates the authors' commitment to using cutting-edge methods for better analysis.
However, the following improvements could make the paper better.
Methodology and data sources: give a more thorough explanation of each stage in the AlBERT-BiGRU+LDA hybrid model. Readers would benefit from a clearer account of how each component contributes to the overall sentiment analysis task.
Provide further details about the nature of the high school students' unstructured comment data. A brief summary of the types of comments, their typical length, and any particular difficulties encountered when processing this dataset would improve the paper's context and aid readers' comprehension.
Consider adding other standard evaluation measures frequently used in sentiment analysis tasks, such as recall, F1 score, or area under the ROC curve, alongside the accuracy and loss rates already reported. This will offer a more thorough evaluation of the model's performance.
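To make this suggestion concrete, the metrics mentioned above can be derived directly from a confusion matrix. The following is a minimal illustrative sketch (not taken from the manuscript; the counts are hypothetical) showing how recall and F1 score complement a plain accuracy figure:

```python
# Hypothetical sketch: deriving recall and F1 score from confusion-matrix
# counts, as a complement to the accuracy/loss rates already reported.
def recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Compute recall and F1 from true positives, false positives,
    and false negatives."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return recall, f1

# Example counts (assumed, for illustration only):
# 80 true positives, 10 false positives, 20 false negatives.
r, f = recall_f1(80, 10, 20)
print(round(r, 3), round(f, 3))  # → 0.8 0.842
```

A library such as scikit-learn (`sklearn.metrics.precision_recall_fscore_support`, `sklearn.metrics.roc_auc_score`) would compute the same quantities, plus ROC AUC from predicted probabilities, without hand-rolled formulas.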
Acknowledge the study's limitations, including any potential biases in the dataset, issues with generalizability, or assumptions made when developing the model. Addressing these issues improves the transparency and reliability of the findings.
Describe the study's practical implications for platform administrators and instructors. In what ways might the practical application of sentiment analysis findings enhance teaching strategies and course materials?
Please see the attached file.
Please see the attached file.
Please see the attached file.
Please see the attached file.
All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.