Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.

View examples of open peer review.

Summary

  • The initial submission of this article was received on January 25th, 2021 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on March 24th, 2021.
  • The first revision was submitted on May 18th, 2021 and was reviewed by 1 reviewer and the Academic Editor.
  • The article was Accepted by the Academic Editor on June 6th, 2021.

Version 0.2 (accepted)

· Jun 6, 2021 · Academic Editor

Accept

The reviewer recommended accepting the manuscript.

Reviewer 3 ·

Basic reporting

no comment

Experimental design

no comment

Validity of the findings

no comment

Additional comments

The authors have made suitable modifications in response to the suggestions.

Version 0.1 (original submission)

· Mar 24, 2021 · Academic Editor

Major Revisions

The authors should address all review comments and revise the manuscript accordingly. The revised manuscript may then be resubmitted for further review.

[# PeerJ Staff Note: Please ensure that all review comments are addressed in a rebuttal letter and any edits or clarifications mentioned in the letter are also inserted into the revised manuscript where appropriate.  It is a common mistake to address reviewer questions in the rebuttal letter but not in the revised manuscript. If a reviewer raised a question then your readers will probably have the same question so you should ensure that the manuscript can stand alone without the rebuttal letter.  Directions on how to prepare a rebuttal letter can be found at: https://peerj.com/benefits/academic-rebuttal-letters/ #]

Reviewer 1 ·

Basic reporting

This paper proposes a multi-scale meta-relational network as a classification approach for images with few samples, based on an optimized initialization representation. The idea of Meta-SGD is adopted, and a model-agnostic meta-learning algorithm is introduced to find the optimal parameters of the model. The contents and aims of the paper will interest readers.

Experimental design

The experiments demonstrate performance consistent with the theoretical aspects discussed in the paper. The experimental sections provide the necessary comparisons between models and tasks.

Validity of the findings

The proposed meta-learning approach addresses a very important topic in recent deep learning research, since its principle is based on the idea of "how to learn by using previous experience" with a small number of sample images.

Additional comments

The multi-scale meta-relational network proposed in this paper has the advantage of achieving the highest recorded accuracy on the meta-validation set. The paper shows that MAML and Meta-SGD require fine-tuning on new tasks, whereas the multi-scale relational network based on metric learning can achieve good generalization performance on new tasks without fine-tuning.
The experiments demonstrate performance consistent with the theoretical aspects discussed in the paper. Detailed descriptions are provided, and the paper is well written in both its theoretical and experimental parts.
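The distinction the reviewer draws — gradient-based methods (MAML, Meta-SGD) must fine-tune on each new task, while metric-based relation networks classify by comparison alone — can be sketched in a toy example. This is an editorial illustration under assumed toy models, not the paper's code; all function names and data here are hypothetical.

```python
import numpy as np

def maml_style_adapt(theta, support_x, support_y, lr=0.1, steps=5):
    """Gradient-based adaptation (MAML/Meta-SGD style): the model
    parameters are fine-tuned on the support set of each new task."""
    for _ in range(steps):
        pred = support_x * theta                      # toy linear model
        grad = 2 * np.mean((pred - support_y) * support_x)
        theta = theta - lr * grad                     # inner-loop gradient step
    return theta

def metric_style_classify(query_emb, class_prototypes):
    """Metric-based inference (relation-network style): no fine-tuning;
    the query embedding is simply compared to class prototypes."""
    dists = np.linalg.norm(class_prototypes - query_emb, axis=1)
    return int(np.argmin(dists))

# MAML-style: parameters must be updated for the new task (true theta = 2).
theta_adapted = maml_style_adapt(0.0, np.array([1.0, 2.0]), np.array([2.0, 4.0]))

# Metric-style: frozen embeddings generalize to new tasks without adaptation.
protos = np.array([[0.0, 0.0], [5.0, 5.0]])
label = metric_style_classify(np.array([4.5, 5.2]), protos)
```

The sketch shows why the metric-based route avoids test-time optimization entirely: classification reduces to a nearest-prototype comparison.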

Reviewer 2 ·

Basic reporting

The title does not reflect the contribution of this paper: new learning and classification methods are introduced, rather than research on an image classification method.
The paper proposes a new classification approach addressing small-dataset problems. The derivation of the proposed method is provided, and the approach is evaluated with standard datasets and measures.

Suggestions for improving the presentation of the paper are given below.

1. Chronology is important; however, it is redundant to keep mentioning years in the text, as the citations already include the publication year -- e.g., "In 2016, .... (citation, 2016)".

2. Capital and lowercase letters are quite often used inconsistently, e.g.:
ARTIFICIAL intelligence
deep Learning
This method USES the idea of…..

3. Consistency in citation conventions is needed throughout the presentation.

4. The last reference in the reference list is incomplete.

Experimental design

Yes, a clear comparative evaluation was provided.

Validity of the findings

All datasets and measures were clearly provided.

Additional comments

All my comments, largely on the presentation, are given above.

The following sections of the paper need further clarification:

1. Sentence 219: "In the multi-scale meta-relational network, we hope to find a set of characterizations (θ) that can be fine-tuned efficiently according to a small number of samples, where θ is composed of the feature extractor parameter and the metric learner parameter." These two parameters are mentioned but not explained; elaboration on them is needed.

2. Sentence 517, at the end: "a better way of yuan learning"
- Is this referring to "MAML learning Yu Zaiyuan task distribution of …"?
- Elaboration on "yuan learning" (possibly an untranslated rendering of 元, "meta") is needed.

Reviewer 3 ·

Basic reporting

The iterative process is clearly described, but convergence and convergence speed are not addressed.

Experimental design

no comment

Validity of the findings

no comment

Additional comments

To classify images based on small samples, this paper presents a novel parameter-iteration method to improve the multi-scale meta-relational network. The classification results illustrate the superiority of the method, but some problems remain.
1. There are some mistakes in the serial numbering of the formulas. For example, line 237: Equations (3-6) and (3-7) should be (6) and (7).
2. The iterative process is clearly described, but convergence and convergence speed are not addressed.
3. The results are better than those of existing methods, but are the differences significant enough? Does the gap depend on the sample?

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.