A peer-reviewed article of this Preprint also exists.
You write: "Systematic reviews have not however been as frequent, i.e. there are approximately 400 meta-analyses in ecology and only 26-30 systematic reviews to date (Web of Knowledge searches with appropriate search terms)."
I believe it would improve the rigour of your paper if you included these data as a supplementary data file in an appropriate format, e.g. BibTeX (http://en.wikipedia.org/wiki/BibTeX), hosted by a third-party research data repository such as figshare. If you use the 'Web of Science' tab within Web of Knowledge, you'll find you can export your search query results (see screenshot here: http://www.flickr.com/photos/79472036@N07/9205701378/ ).

Furthermore, to aid re-use (and further analysis) of this bibliographic data in future, you should include the exact details of your search terms and the date you ran the search in another supplementary data file, e.g. "Search performed 2013-07-03. Topic=(meta-analyses) Timespan=All years. Databases=SCI-EXPANDED". I'd be particularly curious to know how you refined your search to just 'ecology'. Did you perhaps use Web of Knowledge's "ENVIRONMENTAL SCIENCES ECOLOGY" research area to define that?

All this would make your estimate more reproducible, transparent and re-usable by providing machine-readable (BibTeX) evidence. It also matters because earlier in this manuscript you state that "primary research articles need to more effectively report evidence" - the same is true of editorials and opinion pieces ;)
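To illustrate the machine-readable re-use I have in mind: once the search results are exported as a .bib file, anyone can re-derive the headline count (e.g. "~400 meta-analyses") programmatically. This is only a minimal sketch; the filename `wos_export.bib` is a hypothetical example, not anything from the manuscript, and a real analysis would likely use a proper BibTeX parser rather than this simple pattern match.

```python
import re

def count_bib_entries(path):
    """Count bibliographic records in a BibTeX export.

    Records look like '@article{key, ...}', '@book{key, ...}', etc.
    '@comment', '@preamble' and '@string' blocks are BibTeX syntax,
    not bibliographic entries, so they are excluded from the count.
    """
    with open(path, encoding="utf-8") as fh:
        text = fh.read()
    entry_types = re.findall(r"@(\w+)\s*\{", text)
    return sum(
        1 for t in entry_types
        if t.lower() not in ("comment", "preamble", "string")
    )

# Hypothetical usage against an exported search-result file:
# print(count_bib_entries("wos_export.bib"))
```

Paired with a recorded search string and date, a reader could repeat the query years later and compare counts directly.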
Another comment: "Oikos is currently ranked fifth in ecological journals publishing meta-analyses"
Whose ranking, for what year, and how large is the subset of ecological journals that publish meta-analyses? [Citation and more context needed] Google Scholar Metrics currently (2013-07-04) gives Oikos an h5-index of 43 and an h5-median of 58; by those metrics and that ranking, Oikos is NOT fifth among ecological journals that publish meta-analyses. I presume you are using the Thomson Reuters Journal Citation Reports® Impact Factor journal ranking (shudders) - a "statistically illiterate" scheme so ingrained in some corners of academia that one can speak of a journal's 'rank' without pausing to consider that readers might not know of, or might disagree with, the ranking scheme being referred to.
I implore you to read Brembs et al. 2013, "Deep impact: unintended consequences of journal rank" (http://dx.doi.org/10.3389/fnhum.2013.00291), and plead with you to reconsider touting the Journal Impact Factor-based ranking of journals in this or any other manuscript. It is not contested that articles vary hugely in quality and impact within journals, not just between journals. Thus, if any ranking were desired (is ranking really necessary?), I'd look at article-level metrics as a more objective measure of worth than journal-based metrics.
Stephen Curry (2012) http://occamstypewriter.org/scurry/2012/08/13/sick-of-impact-factors/
Thanks for both of the above comments. Working backwards: I am going to cut the ranking comment; it is not necessary. As to the first suggestion - wow, I had no idea you could export search data. I did restrict searches to titles only, for very specific terms, and for ecology as a sub-discipline only. Cadotte et al. do an excellent job of effectively reporting a far more rigorous search; I just wanted to do a quick cut to see whether their findings still stand, which of course they do, since their paper is only about a year old. Thanks!! Oh, and yes, I do love editorials with data - even just hints of trends in them make thinking about the evidence more critical and more concrete.