The majority (70% or so) of submissions are desk-rejected without even being sent for review, and the ability to do that well is something that's learned over time with extensive detailed knowledge of the particular field served by the journal. Note that there are more kinds of editors than just academic editors, too, even at places like PLOS & eLife.
Hey Gwern, big fan of your GPT2 work. I notice I'm surprised to hear you say you struggle daily to fix broken links to the Elsevier catalog at ScienceDirect, because those links are used by libraries all over the world & we don't hear the same feedback from them. Would you have a few examples available for me to send to the folks responsible?
Nature does it all the time. Here's one I fixed just this morning when I noticed it by accident: http://www.nature.com/mp/journal/vaop/ncurrent/full/mp201522... (Note, by the way, how very helpfully Nature redirects it to the homepage without an error. That's what the reader wants, right? To go to the homepage and for Nature to deliberately conceal the error from the website maintainer? This is definitely what every 'archival quality' journal should do, IMO, just to show off their top-notch quality and helpful ways and why we pay them so much taxpayer money.) Oh, SpringerLink broke a whole bunch which I am still fixing; here are two from yesterday: http://www.springerlink.com/content/5mmg0gmtg69g6978/ http://www.springerlink.com/content/p26143p057591031/ And here's an amusing ScienceDirect example: https://www.sciencedirect.com/science/article/pii/S000632071... (I would have loads more specifically ScienceDirect examples, except I learned many years ago never to link ScienceDirect PDFs because the links expire or otherwise break.)
Isn't this exactly the intended use-case for the DOI?
Your first article has the DOI 10.1038/mp.2015.225, and the resulting link (https://doi.org/10.1038/mp.2015.225) properly directs to the article's present location.
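(If you want to check where a DOI currently points, a quick sketch in Python with the requests library will do it; this just follows the resolver's redirects, and the only assumption is that the publisher answers HEAD requests:)

    import requests

    # The DOI mentioned above; doi.org answers with HTTP redirects that
    # end at whatever URL the publisher currently uses for the article.
    doi = "10.1038/mp.2015.225"
    resp = requests.head(f"https://doi.org/{doi}", allow_redirects=True, timeout=30)

    print("resolves to:", resp.url)          # the publisher's current landing page
    print("status code:", resp.status_code)  # 200 if the whole chain resolved
    # (Some publishers reject HEAD requests; swap in requests.get(...) if so.)

The point being: if the publisher moves the article again, the DOI link keeps working while the old direct link rots.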
DOIs link to paywalls or temporarily-unembargoed papers, have to be hunted down (many places hide the DOIs in tabs or, like JSTOR, actually bury it in the HTML source itself!), and break things like section links as well. Adding yet another level of indirection is not my idea of a solution and hardly speaks well of 'archive-quality publishers' that we have to resort to third parties to work around their hideously broken websites which, like Nature, go out of their way to make links not just break but actively misleading.
To solve your immediate problem, just grab the DOI here: https://apps.crossref.org/SimpleTextQuery
They also have an API from which you can fetch DOIs in various ways.
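For example, a rough sketch of that lookup against the public Crossref REST API (api.crossref.org) might look like this in Python; the query.bibliographic parameter and the message/items/DOI fields are from their documented JSON, but treat the rest as illustrative rather than production code:

    import requests

    def find_doi(reference_text):
        """Ask the Crossref REST API for the best-matching DOI for a free-text reference."""
        resp = requests.get(
            "https://api.crossref.org/works",
            params={"query.bibliographic": reference_text, "rows": 1},
            timeout=30,
        )
        resp.raise_for_status()
        items = resp.json()["message"]["items"]
        return items[0]["DOI"] if items else None

    # Paste the reference string you are trying to resolve:
    print(find_doi("Author(s), article title, journal name, year"))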
DOIs are a solution to the issue of having persistent, publisher-independent links that will always resolve, even if a journal changes publisher or goes out of business. Academia uses them because link rot is unavoidable across the web, but there must always be a link to the publication that resolves so that when someone in 2070 wants to follow a citation in the references of a work published today, they can do that. It's the same thinking that underlies people pointing to the internet archive in Wikipedia citations. It's a layer of redirection, but in a way that preserves accessibility for the long term. It's also the same thinking that underlies DNS. There shouldn't be one company that controls how to resolve an IP address to a domain name, and likewise you shouldn't have to go through one publisher to resolve a reference to a research article.
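To make that concrete: the doi.org resolver supports HTTP content negotiation, so the same persistent link that redirects a browser to the publisher can also hand back citation metadata directly. A minimal Python sketch (assuming a Crossref-registered DOI, here the Nature one from upthread):

    import requests

    # Ask the resolver for citation metadata instead of a redirect to the
    # publisher's site; BibTeX ("application/x-bibtex") and CSL JSON
    # ("application/vnd.citationstyles.csl+json") work the same way.
    resp = requests.get(
        "https://doi.org/10.1038/mp.2015.225",
        headers={"Accept": "text/x-bibliography; style=apa"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.text)  # an APA-formatted reference, independent of the publisher's website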
As a side note, Crossref is staffed with exactly the sort of web geeks that you would see at an Internet Archive get-together (#).
So I hear your frustrations, but I think you're giving DOIs short shrift.
Do Nature's spinoffs have any prestige any more? Anything in a Nature spinoff related to batteries comes across as PR Newswire level material. If that.
Journals do transfer among publishers, go out of business, etc so you shouldn't expect a direct link like that to be stable. The recommended practice is to use the DOI. Would using a DOI meet your needs?
As a researcher, I understand the frustrations with the publishing process. I spent years complaining about it, then I decided to do something. A few years later, my company was acquired by Elsevier & everyone was calling me a sellout. What changed? The same thing that changes every time you get your hands dirty trying to fix something - you see all the hidden complexity that wasn't apparent before.
Are there legacy components to academic publishing? Sure there are. Is research assessment & funding messed up? Yep. Will posting preprints or research blogging fix everything? Nope.
If you take a step back and look at research as an enterprise, the scale is absolutely staggering. Tens of billions of public & private money needs to be allocated to researchers every year & it needs to be done in a way that is insulated from political & social tides, so that big problems like cancer, aging, antibiotic resistance & pandemics can be worked on consistently over the decades it takes to make real progress. You don't want a system like this to change quickly. That said, it is changing.
Information and analytics services that support researchers and clinicians have been the fastest-growing part of Elsevier's business for many years now, and these businesses only become more valuable as more and more content is available openly.
At the same time, Elsevier continues to provide all the back-end services that scientific societies, funders, researchers, and their institutions need to keep the system running so they can focus on their research.
What are these systems?
Starting with societies, many of them get the funding they use to support the mission of the society - advocating on policy issues important to their research community - through the society journal. Elsevier makes running the journal financially sustainable by hosting it, recruiting peer reviewers, attracting and maintaining a good editorial board, handling ethics complaints, and providing a cheap platform.
Elsevier helps funders understand how to allocate their funds in alignment with the funder's mission, not just by conferring status, but with more advanced ways of understanding the broader impact of a work. Elsevier (including me personally) has worked to undo the negative effects of over-reliance on the impact factor: https://www.elsevier.com/authors-update/story/impact-metrics...
Researchers and their institutions use all this stuff to showcase their work, recruit faculty, attract funding, make their case for tenure & decide who should get it.
After spending years working on projects with these different groups, I developed a much more nuanced understanding of how everything works & what the levers of change actually are. Happy to discuss with anyone!
> If you take a step back and look at research as an enterprise, the scale is absolutely staggering. Tens of billions of public & private money needs to be allocated to researchers every year & it needs to be done in a way that is insulated from political & social tides, so that big problems like cancer, aging, antibiotic resistance & pandemics can be worked on consistently over the decades it takes to make real progress. You don't want a system like this to change quickly. That said, it is changing.
First off, in reply to this part: "You don't want a system like this to change quickly." ... I don't accept this as a first principle.
It is useful to think about how research and funding interrelates with publishing and peer-review mechanisms. However, I would not advocate a "go slow" approach with regards to modernizing publishing, e.g. out of some concern for the ability of research and funding aspects to "keep up".
Generally speaking, I advocate for finding leverage points in systems to drive change. Right now, there is considerable leverage to apply to the big academic publishers. So, now, we should push. The big publishers will respond; there will be friction and academic and political fighting. If we're successful, there will be change.
I don't worry much about how such changes will hurt the research and funding system. The system will adapt.
I am mindful that people have jobs in these industries, and that change may threaten them. But it would be a fallacy to only blame promoters of change for risking the status-quo jobs. I think a big responsibility falls on the companies, too. They are (presumably) intelligent actors. So what is stopping the companies from reforming themselves internally? Doing so could provide continuity to their employees, preserving tacit knowledge.
When a company can fight change with PR and lobbying more affordably than adapting, I am rarely surprised at what happens.
Thanks for the thoughtful reply. I agree publishers could do more to change & I especially think we should do more to make all the changes that are happening under the hood more visible. I mean, that's literally my job. You gave me a sincere response & deserve one in return, but we need to work towards a shared understanding of what the current situation is if we want to have a conversation that's not just talking past one another.
My understanding of the situation includes the following:
Elsevier has a new CEO.
Elsevier has been reporting for several years now that revenue from services has been one of the fastest growing parts of the business, so much so that the company now calls itself an information and analytics company, not a publisher (1).
Elsevier, though slow initially, is now fully behind open access. 9/10 of the journals launched last year were open access (2).
Elsevier is pursuing a number of what the industry calls "transformative agreements" with libraries, consortia, and whole countries, which involve full access to all Elsevier content and built-in open access publishing for everyone covered under the arrangement (2).
This specific issue was about one way of structuring such an agreement to reduce the financial burden on MIT while still ensuring all their content was published open access. It was even designed to make it easier for librarians to keep a collection of the intellectual output of their institution by automatically pushing manuscripts into the institutional repository, which is something librarians have been asking for for a long time (3).
So given all this, the only way I can answer your question about what's stopping change is to say that nothing is stopping it. It's happening & has been happening for years. I am tempted to ask, looking at some of the comments in the parent thread, what's stopping change in people's perceptions of Elsevier? I don't just mean that rhetorically. I really would be interested in understanding why people have the views they do and how they're different.
What's your current understanding of the situation and does it differ in ways from mine that you'd like to highlight?
I appreciate your comment. I don't have much time at the moment to reply, but I'll say this:
First, I would encourage you to seek out this kind of feedback broadly and systematically (as you probably already are).
Second, perceptions change slowly.
Third, with regards to viewing established players with skepticism, savvy people follow the money. Can you break down the financials of Elsevier and its parent company, the RELX Group? How much of these profits come from closed-access journals versus some of the newer initiatives?
Fourth, though it is less common, some organizations do put effort into long-term initiatives that may cannibalize their cash cows. Let's talk about what history has to tell us about those companies and those transitions.
Preprints are great & everyone should post a preprint as soon as they're ready to share what they've been working on. That said, there's a big difference between journals and preprints, not just in the production quality or the improvements made after review, but in the whole hidden infrastructure that supports the discovery of articles, the indexing, preservation, linking, etc.
Again, Dmitri, there are plenty of reasons to criticize Elsevier without making things up. RELX doesn't break out profit by division, but the operating margin for Elsevier is around 22%, which anyone can read for themselves in their annual report.
The figure of 40% appears in quite a number of sources. Do they all make things up?
Does the "operating margin" somehow refute it? Does "operating" here mean 22% after subtracting all additional (possibly luxurious or generously allocated) expenses?
No one really wants to criticise anyone, and all work should be fairly rewarded. It is the unfair part that people are objecting to.
This is something I hear all the time about publishers, and it used to resonate with me, too, until I started to work for a publisher and realized how much goes into the system we have beyond just putting manuscripts online. The real eye-opening thing for me was talking to editors and seeing all the behind-the-scenes stuff that they do. They have to know enough about their field to know what's worth sending out for review in the first place, manage the review process so that you don't have nasty, unhelpful reviews or personal vendettas getting exercised, manage ethical concerns, deal with authorship disputes, etc., and that's just the review piece of things. There's a whole information infrastructure behind the scenes making sure that once something is published, it can be found, indexed, searched for, aggregated by author, connected to the data and code and protocols and other entities that it mentions... I mean, I've been at this for 8 years and there's still so much I don't know.
All that just to make the point that the value proposition is still very much there, though I'll agree publishers could do more to make this apparent.
That's true, but I'm not so sure all of that is still really necessary with the way science dissemination is changing. In computer science, for example, the de-facto standard is to self-publish papers on arXiv, where there is no peer review prior to publication (beyond arXiv's moderators, who check that papers are properly categorised, formatted, etc.). The "peer review" comes in the form of the community reading and citing (or not citing) papers in later publications.
You could argue that publishers only ever needed reviewers - and all the administrative baggage you mention that comes with them - because they had to choose what to compile into each printed issue that would be mailed to subscribers. If we remove the concept of "issues" and just have everyone self-publish on arXiv, a lot of the value you mention regarding journals is no longer needed.
Of course, everyone publishing on arXiv has downsides. It's no longer easy to just read Nature/Science/Physical Review Letters/etc. to find the best research in the field - some other mechanism will be needed to show scientists the best papers without them spending huge portions of their time reading - but I am sure we will find solutions to these problems in time. In fact, some of my astronomer colleagues find it pretty normal to spend an hour each morning skimming through the 10 or so new papers posted to arXiv.
Aren't most editors unpaid volunteers? There are paid typesetters and web people, etc., but the editors who are knowledgeable in the field are not paid, except maybe at the top 10 journals like Science and Nature.
Yes, the common belief (and it's mine too) is that gatekeepers like Elsevier use free or low-paid experts to pick the good papers and edit them, and then keep the expensive fee you pay them. Is this only in CS? In my experience all the work was done by free-to-the-publisher editors, and what the editors got in return was a listing that they were on that journal.
Yea, I brought this up because I have published in a few Earth science journals and that was how it worked in that field.
The editors look at the papers and are the first level of rejection. The ones they think are decent, they send to people they think would be good at reviewing the content. The reviews come back (or not, in which case they send the paper to someone else to review), the editor reads the reviews and, if the reviewers think the paper should be published, the editor sends the reviews to the author for the author to make changes to the paper as needed. The author returns the paper with the edits and explanations for why some suggestions from the reviewers were not followed. The editor now usually accepts the paper for publication and hands the author off to a typesetter, who will help get the Word doc or TeX-formatted paper into the style that the journal wants. This last person is paid by the journal, but in no way needs a PhD in the field or much knowledge of the material he/she is reading.
Agreed. I worked (a long time ago) at BioMedCentral - an open access journal company, and was surprised at the amount of work that went into validating and editing a single paper. Things like paying statisticians to do a proper unbiased statistical review of the methodology, or to co-ordinate peer reviewers and their relationships.
There are perfectly good reasons to criticize Elsevier - you may feel like they should have a way for patients and caregivers to access research about their condition, for example - so there's really no need to be disingenuous.
Giving a precise quote with a full source link for everyone interested is deceptive? How so?
> RELX, the parent company of Elsevier, hasn't been involved in this for over a decade
Yes, the article is dated 2005. It was clearly a major issue for the journal's reputation at the time, given that they went to the trouble of writing it. The Guardian article is from 2008, so 3 years after. Does 2008 being 10 years ago erase that part of the history?
> If anyone doubts the value provided by academic publishers, they should read this story of the inner workings of @TheLancet, one of @ElsevierConnect 's leading journals http://bit.ly/relx-lancet, the link celebrating 200 years of history of the journal!
Either Elsevier takes credit for 200 years of The Lancet, including the last 20 years, or we need to erase all the history up until 10 years ago? :) Wasn't that tweet perhaps deceptive?
> There are perfectly good reasons to criticize Elsevier - you may feel like they should have a way for patients and caregivers to access research about their condition, for example - so there's really no need to be disingenuous.