I've always found the information (or at least knowing that it exists) to be helpful. With SI, articles can be kept shorter and flow better - the details are elsewhere, much like a footnote. Articles often face page restrictions, but SI, being electronic, has no such limits.
The editors of the noted journal see all of this as problematic for both authors and reviewers.
For the authors,
"With few restrictions on space, reviewers may place additional demands on authors, requiring them to perform and add new analyses and experiments to the supplemental data. Often these additions are “invariably subordinate or tangential,” Maunsell maintains, but represent significant work from the author and thus delay the publication process. Supplemental data thus changes the expectations of both author and reviewer, leading to what he describes as an “arms race:”

And for the reviewers,
"Reviewer demands in turn have encouraged authors to respond in a supplemental material arms race. Many authors feel that reviewers have become so demanding they cannot afford to pass up the opportunity to insert any supplemental material that might help immunize them against reviewers’ concerns."

But there is also the truly laughable perspective of the journal:
"Validating supplementary data adds to the already overburdened job of the reviewer, Maunsell writes. Consequently, these materials do not receive the same degree of rigorous review, if any at all. At the same time, the journal certifies that they have been peer-reviewed." (emphasis added)

I say laughable because, while I greatly value peer review, am an active participant, and strongly believe it should continue, I never look to a journal to certify anything. A peer-reviewed article has simply been sent out to reviewers, and they have sent reviews back. I don't know what the reviews were, whether they were done by competent people, whether they loved the article, hated it, or whatever, and I also don't know if the reviews influenced the editor in the least. I certainly know that everyone reviews articles in their own manner and that their manner may not even be consistent from paper to paper. (Mine isn't.)
Add to this the knowledge that any paper can later be retracted, and the supposed certification is reduced to vapor. Peer review: three people looked at it and liked it. That's all, nothing more. It is not a proof of correctness; it's just proof that three people looked at it and (probably) liked it.
Tip of the hat to Matteo Cavalleri for the lead.