Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on April 13th, 2023 and was peer-reviewed by 2 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on June 27th, 2023.
  • The first revision was submitted on July 25th, 2023 and was reviewed by the Academic Editor.
  • The article was Accepted by the Academic Editor on July 26th, 2023.

Version 0.2 (accepted)

· Jul 26, 2023 · Academic Editor

Accept

Thanks for providing detailed answers to all the reviewers' comments and revising the manuscript accordingly. This is an app that I hope will be used by, for example, the increasing number of ecologists using acoustic methods. The manuscript is now ready for publication.

[# PeerJ Staff Note - this decision was reviewed and approved by Paula Soares, a PeerJ Section Editor covering this Section #]

Version 0.1 (original submission)

· Jun 27, 2023 · Academic Editor

Minor Revisions

Both reviewers appreciated your work, and they have made constructive comments that should improve the final version. This is a nice contribution to the software section of the journal.

Reviewer 1 ·

Basic reporting

Good English, good references.

Some problems with the lay-out:
(1) the first author's name appears twice in the title;
(2) missing opening parentheses in all references in the text, e.g. "ecological monitoring Ross and Allen (2014)" instead of e.g. "ecological monitoring (Ross and Allen 2014)"; this occurs with every reference throughout the text;
(3) perhaps related to the previous: p.3 "R packages, such as Attali (2021); Bailey (2022);..." Those are authors' names, not names of packages.

Experimental design

I think that description of software falls within the scope of this journal ("BioInformatics Software tools").

The software is mostly explained clearly, although only at the very end ("upload") did I realize that the standard workflow involves uploading data to a server, so it seems to be a web app, even though it is claimed that the app can also work locally. This could be made clearer.
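For context on the local-use claim: a Shiny app can be launched entirely on the user's machine, with the browser talking to a local R process rather than a remote server. A minimal sketch, assuming the app's source directory has been obtained (the path below is hypothetical):

    # install.packages("shiny")        # if not already installed
    library(shiny)

    # Run the app locally; no audio data leaves the machine.
    runApp("path/to/neal", launch.browser = TRUE)

    # Apps hosted on GitHub can often be run directly from a local R session:
    # runGitHub(repo = "<repo>", username = "<user>")

Whether NEAL ships with a wrapper for such a local launch is for the authors to clarify.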

p.2 "We present the NEAL app" sounds strange, after already having introduced the exact same thing as "the Shiny app". Probably a copy-paste thingy.

To replicate, I would like to know what "istft" is doing. Presumably this is some sort of vocoder; is the Griffin-Lim algorithm used, or something with better quality? Perhaps you are keeping the complex-valued spectral slices?
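To make the question concrete: if the complex-valued spectral slices are retained, the selection can be resynthesised by a plain overlap-add inverse STFT, with no Griffin-Lim-style phase estimation needed. A minimal base-R sketch of that approach (window, hop and matrix layout are my assumptions, not a description of NEAL's implementation):

    # Overlap-add inverse STFT. `stft` is a complex matrix with one column per
    # frame and n rows (full FFT bins); Hann analysis window and hop of n/2 assumed.
    istft_ola <- function(stft, n = 512, hop = n / 2) {
      win  <- 0.5 - 0.5 * cos(2 * pi * (0:(n - 1)) / n)   # periodic Hann window
      out  <- numeric(hop * (ncol(stft) - 1) + n)
      norm <- numeric(length(out))
      for (j in seq_len(ncol(stft))) {
        frame <- Re(fft(stft[, j], inverse = TRUE)) / n   # invert one complex slice
        idx   <- (j - 1) * hop + seq_len(n)
        out[idx]  <- out[idx]  + win * frame
        norm[idx] <- norm[idx] + win^2
      }
      out / pmax(norm, 1e-8)   # compensate for window overlap
    }

If instead only magnitudes are stored, some phase-reconstruction scheme (e.g. Griffin-Lim) becomes unavoidable, which is why I ask.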

The date-time format is ambiguous with respect to time zone and daylight saving time. Such details are relevant in bioacoustics. Why not use ISO 8601?
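For example, an ISO 8601 timestamp with an explicit UTC offset removes both ambiguities, and base R can emit and parse it directly (the example times are invented):

    # Write a recording start time as ISO 8601 with an explicit UTC offset.
    t <- as.POSIXct("2023-04-13 05:30:00", tz = "Europe/Dublin")
    format(t, "%Y-%m-%dT%H:%M:%S%z")     # "2023-04-13T05:30:00+0100"

    # Parse it back unambiguously, whatever the local time zone is.
    as.POSIXct("2023-04-13T05:30:00+0100",
               format = "%Y-%m-%dT%H:%M:%S%z", tz = "UTC")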

Validity of the findings

The authors make a big deal out of the app's refreshing strategies. But other state-of-the-art software is *very* fast at rendering full-screen spectrograms (much faster than running the spectral analysis), so the authors might consider computing the visual spectrogram pixel by pixel instead of as 720,000 tiles.
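If the current implementation draws one polygon per time-frequency cell (e.g. ggplot2::geom_tile()), a single raster layer over the same data is usually far cheaper to render; a hedged sketch, with illustrative column names not taken from NEAL:

    library(ggplot2)

    # `spec` assumed: one row per time-frequency cell, with columns
    # time (s), freq (kHz) and amp (dB).
    ggplot(spec, aes(time, freq, fill = amp)) +
      geom_raster() +                      # one bitmap instead of ~720,000 tiles
      scale_fill_viridis_c() +
      labs(x = "Time (s)", y = "Frequency (kHz)", fill = "dB")

    # Or bypass ggplot2 and draw the spectrogram matrix directly:
    # image(spec_matrix, useRaster = TRUE, col = hcl.colors(256))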

Additional comments

Not so clear: "the phase of the sounds outside the selection are collapsed to zero". Can you elaborate why this is so?
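My reading of that sentence is that complex STFT values outside the selected time-frequency box are replaced by their magnitude, i.e. given zero phase, before inversion; a tiny sketch of that interpretation (the masking scheme is my guess, not the authors' code):

    # `stft`: complex STFT matrix; `sel`: logical matrix of the same dimensions,
    # TRUE inside the selected time-frequency box.
    collapse_phase_outside <- function(stft, sel) {
      out <- stft
      out[!sel] <- Mod(stft[!sel])   # keep magnitude, set phase to zero
      out
    }

If that is what is meant, an explanation of why zero phase (rather than the original phase) is used outside the selection would help.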

Reviewer 2 ·

Basic reporting

The authors have developed an open-source application named NEAL to aid in the efficient and consistent examination and annotation of audio data. The tool enables detailed annotations of vocalisations in terms of time and frequency, vocalisation type, and an open text field for notes, thereby capturing multifaceted information. This makes the application rich in terms of its utility for simultaneous multiple labelling tasks, and each of these labels could serve as a target for machine learning algorithms. The tool leverages the Shiny package for R, providing an interactive front-end with reactive functionality, without requiring knowledge of R programming from the users. Despite its primary design for bird audio, it can be extended to other bioacoustic taxonomies. The authors also envision potential extensions of the app to include more contextual data and novel visualisations, and its open-source nature enables other researchers to contribute to its development. I am very happy to have read this paper, and the authors have done a great job of describing the features and usability of the application. Here are some minor comments I would like to bring to the authors' notice:

1. While the authors accurately point out that high-quality annotations produced by experts are essential for training supervised learning algorithms, it is important to clarify that not all machine learning algorithms fall under the supervised learning category. Indeed, there are self-supervised and unsupervised learning algorithms as well. Therefore, a more accurate phrasing could be, "These models, when operating as supervised learning algorithms, require training on high-quality annotations produced by experts." This maintains the authors' original intent while accommodating the broader scope of machine learning methodologies.

2. On line 328, and again in the caption of Table 3, the authors employ the phrase "species identified by NEAL" to actually refer to annotations made using NEAL. 'Species recognition' has a very specific meaning in audio-ML and should not be conflated with manual annotation in this manner.

3. The authors have made elaborate comparisons between NEAL and several other labelling tools, but one notable omission is 'Sonic Visualiser,' a longstanding tool widely employed for similar tasks. I am wondering if there is a specific reason why 'Sonic Visualiser' wasn't included in the authors' comparative analysis.

Experimental design

no comment

Validity of the findings

no comment

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.