Melissa Haendel is one of the authors of the “On the reproducibility of science: unique identification of research resources in the biomedical literature” article, which we published yesterday. This important article received quite a bit of attention when it was published.
Dr Haendel is an assistant professor in the Oregon Health & Science University Library and the Dept. of Medical Informatics and Epidemiology. She holds a Ph.D. in neuroscience and completed post-doctoral work in the field, studying early development in mouse, chick, and zebrafish.
She currently leads several semantic data integration projects, such as CTSAconnect (aimed at integrating research profiling data and research resources for the purposes of expertise representation), and the Monarch Initiative (focused on integrating model systems phenotype and genotype data with other biological data for the purposes of disease discovery). She also leads the ontology and curation team for the eagle-i project, which aims to make information about research resources accessible.
Her primary interest is in using ontologies as a nexus for linking many types of data (genomic, biomedical, evolutionary, bibliometric, etc.) to support high-quality end-user application development.
PJ: Dr Haendel, can you tell us a bit about yourself and your own experience with science’s reproducibility problem?
MH: Science is in a crisis of (non-)reproducibility. I am interested in identifying mechanisms that can help publishers and editors address this critical problem. My PhD is in developmental neuroscience. During my training, I often found it difficult to replicate previous scientific results. Later, when working at the Zebrafish Model Organism Database (ZFIN), I was frustrated by my inability to identify the precise organisms, probes, antibodies, and other scientific materials that underpinned genotype–phenotype assertions in the literature. My curator colleagues and I spent enormous amounts of time tracking down specifics that should have been in the publications, even going so far as to call the authors. The lack of specificity in the literature was initially shocking to me, and it is now the subject of my efforts to develop tools and policies that will help address this problem in the scientific literature.
PJ: Can you tell us about the research you published with us?
MH: As a scientist, one uses the literature as a recipe book to guide one’s experimental plans. When some of the ingredients are unidentifiable and/or unobtainable, it becomes very difficult to make experiments work. I wanted to publish with PeerJ in part because the very open quality of PeerJ can inspire researchers to publish the materials and data that will make their work accessible and reproducible. Science is about more than the stories we tell; it means sharing the materials and methods necessary for anyone to reproduce the results.
PJ: Your results led you to develop reporting guidelines for life science resources. How could they contribute to efforts to solve the reproducibility problem?
MH: My goal is to inspire journal editors, reviewers, and authors to implement guidelines to support reproducibility. Ideally, tools should be implemented to help support this goal. Reviewers should be encouraged to identify key aspects of reproducibility before publication. Further, my hope is that authors will recognize that the incorporation of their results into the public biomedical knowledge space depends upon the specificity of the resources used in their research.
PJ: What are the advantages, for you, of publishing with us?
MH: I wanted a journal that would have a broad audience, and a readership with a modern approach to scholarly communication.
PJ: So, what is the audience that you wish to reach by publishing with us?
MH: All scientists, authors, reviewers, and editors.
PJ: What do you think of our “Pay once, Publish for life” Membership?
MH: I love it. I like that it makes me feel like a partner in the publishing process, rather than a participant in a one-time event.
PJ: You have published in several journals in the past. From your prior experiences, what opinion have you formed about the publication process in general, and how would you compare it to your experience with us?
MH: Submission to PeerJ was the easiest of any journal to which I have submitted my work. As someone who spends a fair amount of time thinking about user-centered design, I could tell that PeerJ had a quality team of developers dedicated to making the system simple and efficient.
PJ: Thanks! And were our author instructions and policies clear to you as well?
MH: Yes, very straightforward.
PJ: What was your experience of our production process?
MH: It seemed straightforward. I really liked the focus on visibility and on reaching out to the community regarding newsworthy content.
PJ: What did you think of the appearance of the published article?
MH: I didn’t like how there was a blank left sidebar on most pages, except for where the figures were, as I prefer not to see whitespace. Otherwise the formatting looked great, and I like the PeerJ color scheme.
PJ: Was there anything that surprised you with your overall PeerJ experience?
MH: I was surprised that one of my reviewers did not disclose their name. I had thought that, given the open nature of PeerJ, everyone would want credit for his or her review. However, I am grateful that it is optional, as I think having a choice means that people can be honest in their reviews.
PJ: Why did you choose to reproduce the complete peer-review history of your article?
MH: I like the idea that one can see the evolution of the paper. The reviewers are critical to making science the best it can be, and I wanted them to receive attribution for their effort. I also think that having open reviews can be instructive for younger investigators.
PJ: In your opinion, are your funding bodies supportive of Open Access and PeerJ?
MH: I would say that, given current public access policies, funding agencies are more supportive than ever of open access journals. I would also say that traditional impact metrics are not long for this world, and journals like PeerJ, with their partnerships with altmetrics measures, will help pave the way for the science itself to be impactful, not the venue where it is published.
PJ: Did any of your colleagues express anything to you about your publication with us?
MH: They all said, “cool!”
PJ: So… would YOU submit again?
MH: Of course! I am now a lifetime member.
PJ: Anything else you would like to talk about?
MH: I would like PeerJ to consider adopting our research resource reporting guidelines to support reproducibility. Further, I think it would be great if PeerJ included biocurators on its review panels. This would be similar to having a statistician participate in the review, and it could greatly aid reproducibility and decrease inefficiencies in data capture.
PJ: Thanks for your feedback! In conclusion, could you describe PeerJ in three words?
MH: Open. Innovative. Scholarly.
PJ: Many thanks for your time!
PeerJ is currently returning first decisions to authors with a median time of 24 days, and we have hundreds of highly satisfied authors. If you would like to experience the PeerJ process for yourself, then submit your next article to us!