Review History


All reviews of published articles are made public. This includes manuscript files, peer review comments, author rebuttals and revised materials. Note: This was optional for articles submitted before 13 February 2023.

Peer reviewers are encouraged (but not required) to provide their names to the authors when submitting their peer review. If they agree to provide their name, then their personal profile page will reflect a public acknowledgment that they performed a review (even if the article is rejected). If the article is accepted, then reviewers who provided their name will be associated with the article itself.


Summary

  • The initial submission of this article was received on April 29th, 2016 and was peer-reviewed by 3 reviewers and the Academic Editor.
  • The Academic Editor made their initial decision on May 31st, 2016.
  • The first revision was submitted on July 28th, 2016 and was reviewed by 3 reviewers and the Academic Editor.
  • The article was Accepted by the Academic Editor on August 18th, 2016.

Version 0.2 (accepted)

· Aug 18, 2016 · Academic Editor

Accept

Dear Authors,

Thank you very much for your careful work addressing all of the reviewers' comments and suggestions. It is a pleasure to see such a thorough and conscientious job of this. I look forward to reading your article in its final form!

Best wishes,

Mary Baker
HP Inc Labs, Palo Alto

Reviewer 1 ·

Basic reporting

Pass. The article reads well, is clear and connects new knowledge with the broader field of knowledge.

Experimental design

Pass. Experimental design is reasonable and presented well.

Validity of the findings

Pass. Findings are reasonable and interesting.

Additional comments

No comments.

Reviewer 2 ·

Basic reporting

The authors addressed my comments.

Experimental design

N/A

Validity of the findings

N/A

Additional comments

Thank you for your response letter with details on specific changes performed to address my comments.

Reviewer 3 ·

Basic reporting

No comments.

Experimental design

No comments.

Validity of the findings

No comments.

Additional comments

After reading through the revisions to the original manuscript, I feel that all of my original comments have been adequately addressed. I recommend accepting this version of the paper.

Version 0.1 (original submission)

· May 31, 2016 · Academic Editor

Minor Revisions

Dear Authors,

Thank you for submitting this interesting manuscript on electronic laboratory notebooks. The reviewers are generally positive and (except for the first reviewer) suggest only minor revisions. I believe that the suggestions of the first reviewer are too substantial to consider for this submission. The suggestions of the other two reviewers are perhaps minor, but there are quite a few of them, and I think they are very useful. The submission will benefit greatly, for instance, from more clarity around the user study and its participants.

I would also like to see some mention, even if there are no solutions to suggest, of the digital preservation problem. The writing in a paper notebook tends not to become obsolete very quickly, while electronic formats come and go, and even cloud providers come and go. Could this be a problem for the review of some kinds of experimental data and procedure long after the fact?

In addition, I believe the manuscript would benefit from a good proofreading. Fixing the many grammatical errors would help polish the manuscript. For instance, in the first sentence of the abstract "This report shares the experience during selection, implementation and maintenance phase" should probably be "This report shares the experience during the selection, implementation and maintenance phases". In Figure 4, please either spell out "Number" or change the "No" to "No." so it's clearly an abbreviation of the word number. Also, in 4a, please say "No. of" or "Number of" and not just "No" since otherwise it looks as if you're saying there were no experiments :-) I know these are little things, but this will help present your work as smoothly as possible.

I'm looking forward to your revision that addresses these and the reviewers' recommended changes. If you have any questions, please don't hesitate to contact me directly or the reviewer who kindly provided an open review.

Thanks very much,
Mary Baker
HP Labs, Palo Alto

Reviewer 1 ·

Basic reporting

Language is clear and the article is generally well written.

Experimental design

The experimental design was clear. The authors wrangle with the very difficult problem of how to construct a user-friendly electronic notebook and how to test users' satisfaction with the design.

Validity of the findings

The conclusions that the authors draw based on their study are valid. However, the study could be improved by having a more sophisticated ELN system to test. Particular features of the ELN interface could be isolated and the user experience tested for each, for example, drawing pictures or version control (e.g., per the git model). However, the reviewer understands that this is a lot of work and may be outside of the resources available to the authors. This should be considered in the future.

Additional comments

Plots in Fig. 4 should be overlaid on the same axes.

Reviewer 2 ·

Basic reporting

The article is well written and contains sufficient related work.

Experimental design

The methodology is well documented.

Validity of the findings

The findings seem valid and conclusions are appropriate.

Additional comments

The paper presents an experience of implementing an ELN in a large collaborative project. The implementation methodology, user experiences, and feedback are adequately presented. I think such experiences would be useful for others trying to develop or adopt an ELN. My comments are mostly requests for clarification, and some are suggestions.


Suggestions/clarification points:

- (Table 1) What does 'Chemical (sub)structure search' mean? Expand on that a little.
- (Line 122) 'The most important items of the collected user requirement are listed in Table 2'. How was the importance of an item determined?
- (Table 2) What does GLP stand for? Good Laboratory Practice?
- (Line 128) What does 'chemical and biological search' mean? This might be obvious to people in the biological field, but not to me (computer science background).
- (Table 2) How were requirements classified as core vs. non-core?
- (Table 2) Explain 'chemical and Biological notebook'. Do you just mean a notebook for chemical and biological experiments? If so, rephrase.
- (Line 139) 'two vendors were selected'. Based on what?
- (Line 144) 'The final decision was based on the higher number of positive features in the chosen system'. Just the higher number of features does not seem like a good criterion. Were all features equally important? Perhaps explain the choice a little more here. It would be useful to know the key features that led to choosing one vendor over the other.
- (Table 3) The fifth point ('Users should take responsibility ...') is not clear. It does not read like a feature either.
- (Table 3) 'The potential to continually use the ELN after five year ...' How was this evaluated? Was it just based on cost?
- (Line 174) 'The initially predefined templates were rarely adopted.' I can see that happening. Was there any common format/pattern among users that could be used to revise the default templates? If so, maybe the templates could be revised based on ELN usage after the first few months?
- (Line 195) What was defined as acceptable performance?
- (Line 196) Expand on the 'denial of access' case. Was it due just to unexpectedly heavy usage of the ELN cloud, or was it due to an external factor?
- (Line 213) Here you say 100 users overall, but in Supplement Article S2 you mention that the number of overall users was less than 80. In the introduction you say 90 bench scientists. A paragraph on study participants explaining their demographics (age, gender, title, and previous familiarity with ELNs) would help. In that paragraph you can give the overall number of users, the number of parallel users, the number of users who took the survey, and so forth.
- (Table 4) Are survey results different for users who were familiar with ELN vs. who weren't? If the results are different, it is worth pointing out.
- (Table 4) Mention the number of users in parentheses when saying 'Most user' and 'Many users', e.g., 'Most user (n=?)'.
- (Table 4) Just curious, did the frequent users who realized an increase in the quality of documentation say they would recommend the ELN?
- (Table 4) Among users who were not satisfied (seventh point), was there any common theme/reason for their dissatisfaction?
- (Line 262) 'For many users'. How many exactly? Give a number, e.g., 'For many users (n=?)'.
- (Line 321) '... not to the quality of results.' Not just results. Isn't it also about the quality of documentation/reproducibility? It was good to see this paragraph.

- Suggestion for a discussion point: In this experiment it seems to me that the users were forced to i) change the way they document their experiments, and ii) document their experiments on a computer instead of in a paper notebook. Would it have been an easier transition if a paper notebook had been printed with a template (from the ELN), the users had first been asked to use that paper notebook, and they had then later transitioned to the ELN? This is of course speculation (permitted by PeerJ). It might be interesting to add a small paragraph on this from the authors' perspective based on their experience.

- (Line 458) Very appropriate quote from one of the users. Incorporate more such quotes from users throughout the paper wherever applicable. You already have many quotes from users (Survey supplement).


Minor things:

- Some table captions are below the table, some are on top. The convention is to keep captions on top for tables and at the bottom for figures. You can choose to keep table captions on top or at the bottom, but be consistent.
- Show number of users (n) in table/figure captions. E.g., Table 4 (how many users responded to the survey?), Figure 4 (number of users might have varied over time, but you can show median or minimum number of users).
- Bad break for Table 2 and Table 3. It's easier to read if the entire table is on one page.
- (Line 18) typo. feedback
- (Line 46) 'Ioannidis et al., 2014' not 'Ioannidis, et al., 2014'. No comma between first author and et al. Fix this for other similar citations as well (Line 54, Line 57, ..).
- (Table 3) typo. First point. '... selected solution should *be* intuitive'
- (Line 101) Figure 3 is referenced before Figure 2.
- (Line 240) typo. '... or biweekly *mails* about the ELN ...'
- (Line 208) typo. weekends
- (Line 234) in Aug/Sep 2015 *were* related ...
- (Line 242) subsequent decline in Nov 2014 (no and) is correlated ...
- (Table 4) typo in fifth point. skeptical
- (Table 6) Change 'about searching' to 'about the quality of search results' (if that's what you mean; if not, clarify what you mean by searching).

Reviewer 3 ·

Basic reporting

This is an experience paper that describes the rollout of an electronic laboratory notebook (ELN) system used in a collaborative research program that identifies new mechanisms for antibiotic delivery. The introduction provides good motivation as to why an ELN is important; however, there is no related work section that compares and contrasts the design choices made in the implementation of this ELN with those of other systems. If there are other examples of similar systems, it would be extremely helpful to compare and contrast your design choices to the choices of prior work.

Experimental design

Since this is an experience paper, it doesn't seem that this work answers any particular research question in the traditional sense. This paper provides many details as to how the ELN was implemented, but there was no formal experimental design.

Validity of the findings

The results presented in this paper come in two forms: quantitative (number of experiments performed, number of support tickets) and qualitative (a user survey that contrasts experience with expectations). The high-level results provided from the survey are quite helpful (Table 4), but I feel some details are missing. It would be helpful to have another table that describes some statistics about the user studies: how many users, and the distribution of PC vs. Mac vs. Linux users. Some of this is provided in lines 213–222 on page 13, but the numbers aren't precise; it would be better to pull these out into a table where possible. Finally, some analytics on the survey results could provide more meaningful insights into precisely why the users felt the system was frustrating to use. This would help to better justify the findings in Tables 5 and 6.

Additional comments

Overall, I found this work pretty interesting. There is currently a large push in the medical and scientific communities to digitize health and experiment data. This paper provides a nice overview of the challenges and pitfalls one can face when implementing a large scale collaborative system for sharing experiment findings. The information presented here should be quite helpful to others attempting to implement a similar system.
The supplemental files included with this submission are quite helpful, but I was left with a couple of questions regarding the actual implementation of the ELN. Was data actually entered into spreadsheets such as the ones presented in tables S3 and S4? It might be really helpful to include a couple of screenshots from the actual ELN being used in a browser to give the reader a better understanding as to what the users were dealing with. Currently, I’m not sure how the spreadsheets fit within the web-based GUI used for experiment reporting.
As I mentioned in my comments above, I think you could do a better job quantifying the results from the user surveys. Where quantified results are available, you should present them. When you present the high-level results in Tables 4–6, it would be good to have numbers to bring out how important the findings are. For example, "Windows users were unhappy about the performance of the system": if there were a table showing how many users used which platform, we could better understand how big an impact this really is. Ideally, you should present the data in Tables 4–6 in histogram format rather than in the currently presented qualitative manner.
Overall, I think the community should find this paper valuable but some improvements can be made to the presentation of results.

All text and materials provided via this peer-review history page are made available under a Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.