Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.

Summary

  • The initial submission of this article was received on March 23rd, 2023 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on April 26th, 2023.
  • The first revision was submitted on May 29th, 2023 and was reviewed by 2 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on June 8th, 2023.

Version 0.2 (accepted)

· Jun 8, 2023 · Academic Editor

Accept

The author has successfully addressed the reviewers' concerns. The current version may be accepted. Congratulations!

[# PeerJ Staff Note - this decision was reviewed and approved by Claudio Ardagna, a PeerJ Computer Science Section Editor covering this Section #]

Reviewer 1 ·

Basic reporting

This paper proposes the BERT-PAGG relation extraction model that combines BERT, a PCNN, a self-attention mechanism, a gating mechanism, and a graph convolutional network (GCN). The experimental results demonstrate the effectiveness of the proposed BERT-PAGG model, which outperforms several baseline models on the macro-F1 score.

Experimental design

This article fully utilizes the relative positions of entities and combines the local and global features extracted by the PAG module with the relevant features extracted by GCN. Furthermore, the authors provide a detailed analysis of the hyperparameters used in the model, as well as an ablation study to identify the contributions of each component.

Validity of the findings

The experimental results of this article come from two publicly available datasets and demonstrate that the PAGG module proposed in this paper can effectively improve the performance of Chinese relation extraction models.

Additional comments

The paper is written in a clear and concise style, with well-structured sentences and paragraphs that are easy to understand. The grammatical and spelling errors noted in the previous round have also been corrected.

Reviewer 2 ·

Basic reporting

In the revised version of this paper, the authors provide a detailed introduction to the current state of research and the challenges faced in Chinese relation extraction. Based on existing research, the authors propose the BERT-PAGG model and provide a clear introduction to it. The experimental results are also impressive, showing that the BERT-PAGG model outperforms several baseline models on the benchmark datasets.

Experimental design

The experimental design of this article is relatively complete, with the authors providing detailed parameters and conducting sufficient experiments to verify the effectiveness of the BERT-PAGG model.

Validity of the findings

The findings presented in the paper are well-supported by the experimental results, and demonstrate the effectiveness of the proposed BERT-PAGG model for Chinese relation extraction. The authors provide a thorough evaluation of the model using a standard benchmark dataset, and show a clear improvement over several baseline models.

Additional comments

The authors give a more detailed and vivid account of the peculiarities of Chinese relation extraction, and in the related work section they cite more recent works and compare them with the BERT-PAGG model in the experimental section.

Version 0.1 (original submission)

· Apr 26, 2023 · Academic Editor

Major Revisions

The work is interesting and solid. However, there are some issues. Please revise the paper accordingly.

[# PeerJ Staff Note: Please ensure that all review comments are addressed in a rebuttal letter and any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate. It is a common mistake to address reviewer questions in the rebuttal letter but not in the revised manuscript. If a reviewer raised a question then your readers will probably have the same question so you should ensure that the manuscript can stand alone without the rebuttal letter. Directions on how to prepare a rebuttal letter can be found at: https://peerj.com/benefits/academic-rebuttal-letters/ #]

[# PeerJ Staff Note: The review process has identified that the English language must be improved. PeerJ can provide language editing services - please contact us at copyediting@peerj.com for pricing (be sure to provide your manuscript number and title) #]

Reviewer 1 ·

Basic reporting

This paper studies the task of Chinese relation extraction. To better exploit external information, such as entity position information and syntactic structure information, this work proposes a BERT-based relation extraction model. Specifically, the model uses a piecewise convolutional neural network (PCNN) and a self-attention mechanism to extract local and global features respectively, and then a gating mechanism to fuse the two. Furthermore, the work deploys an entity-specific graph convolutional network to capture both semantic and structural information via entity-specific masks.
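The pipeline described above (piecewise pooling around the two entity positions, self-attention for global context, and a learned gate that blends the two) can be illustrated with a small NumPy sketch. This is not the authors' code: the toy shapes, the single-head attention, and the gate parameters `W_l`, `W_g`, `b` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy token representations for a 10-token sentence (hidden size 8).
seq_len, hidden = 10, 8
H = rng.standard_normal((seq_len, hidden))

def piecewise_max_pool(H, e1_pos, e2_pos):
    """PCNN-style piecewise max-pooling: split the sentence at the two
    entity positions and max-pool each of the three segments."""
    segments = [H[:e1_pos + 1], H[e1_pos + 1:e2_pos + 1], H[e2_pos + 1:]]
    return np.concatenate([seg.max(axis=0) for seg in segments])

def self_attention(H):
    """Single-head scaled dot-product self-attention over the sentence,
    mean-pooled into one sentence vector."""
    scores = H @ H.T / np.sqrt(H.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return (weights @ H).mean(axis=0)

local_feat = piecewise_max_pool(H, e1_pos=2, e2_pos=6)   # 3 * hidden dims
global_feat = self_attention(H)                          # hidden dims

# Gating: project the local feature to the global feature's size, then
# blend the two adaptively (W_l, W_g, b are hypothetical learned weights).
W_l = rng.standard_normal((hidden, 3 * hidden)) * 0.1
W_g = rng.standard_normal((hidden, hidden)) * 0.1
b = np.zeros(hidden)
gate = 1.0 / (1.0 + np.exp(-(W_l @ local_feat + W_g @ global_feat + b)))
fused = gate * (W_l @ local_feat) + (1.0 - gate) * global_feat

print(local_feat.shape, global_feat.shape, fused.shape)
```

The sigmoid gate weighs local entity-centered evidence against sentence-wide context per dimension, which is one plausible reading of the adaptive fusion the reviewer describes.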

Experimental design

This work uses several effective techniques to better extract and fuse sentence-level and entity-level information, achieving good performance.

Validity of the findings

1. Various experiments were conducted to evaluate the performance of the proposed approach.
2. The proposed method achieves good performance on the benchmark datasets.

Additional comments

1. This paper is well written, although there are some minor writing errors, such as incorrect capitalization and word choice.

Reviewer 2 ·

Basic reporting

1. This paper proposes the BERT-PAGG relation extraction model, motivated by the influence of the relative positions of entity pairs and of sentence-level information on the performance of relation extraction models. The model introduces entity position information and combines the local and global features extracted by the PAG module with the dependency features extracted by the GCN. The experimental results on two publicly available datasets show that the proposed method outperforms the other compared models.
2. The main contribution and significance of this paper need to be outlined further. For example, the authors could explain the research challenges in existing works in more detail.
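The entity-specific GCN step mentioned in point 1 can likewise be sketched: run one graph-convolution layer over a dependency adjacency matrix, then mask out everything except each entity's tokens before pooling. The toy tree, entity spans, and weights below are made up for illustration; the model's actual propagation rule and mask form may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: 6 tokens, hidden size 4, with a made-up dependency tree.
n, hidden = 6, 4
H = rng.standard_normal((n, hidden))

# Symmetric adjacency built from dependency edges, plus self-loops.
edges = [(0, 1), (1, 2), (1, 4), (2, 3), (4, 5)]
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

def gcn_layer(H, A, W):
    """One GCN layer: degree-normalized neighborhood aggregation + ReLU."""
    deg = A.sum(axis=1, keepdims=True)
    return np.maximum((A / deg) @ H @ W, 0.0)

W = rng.standard_normal((hidden, hidden)) * 0.5
H_gcn = gcn_layer(H, A, W)

# Entity-specific masks: zero out all tokens outside each entity span,
# then max-pool to get per-entity structural features.
e1_span, e2_span = [0, 1], [4, 5]   # hypothetical entity positions

def masked_pool(H, span):
    mask = np.zeros((len(H), 1))
    mask[span] = 1.0
    return (H * mask).max(axis=0)

e1_feat = masked_pool(H_gcn, e1_span)
e2_feat = masked_pool(H_gcn, e2_span)
pair_feat = np.concatenate([e1_feat, e2_feat])   # fed to the classifier
print(pair_feat.shape)
```

Masking before pooling keeps only the syntax-informed representations of the two entities, so the final pair feature reflects the dependency structure around them rather than the whole sentence.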

Experimental design

1. This paper introduces the relative positions of the entities and extracts the local features of the sentences via a piecewise convolutional neural network (PCNN).
2. A self-attention mechanism is employed to capture global dependency features, and a gating mechanism is used to adaptively fuse the global and local features.

Validity of the findings

1. The experimental results on two publicly available datasets show that the proposed method outperforms the other compared models.

Additional comments

1. The particularity of Chinese relation extraction needs to be further explained.
2. It would be better to discuss more state-of-the-art works in the related work section.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.