Monday, February 11, 2008

Problems with peer-review, Part One

In Chapter 4 (Free Software and the Scientific Practice of Computer Science) of Decoding Liberation, while writing about the ideal of objectivity in science and the role of free software in ensuring it in computer science, we spent some time examining the phenomenon of peer review. In this post (there will actually be two), I want to revisit that discussion by way of amplifying one of the points made there. I'll quote a couple of small excerpts in this first post, and then follow up tomorrow with my own comments.

In Chapter 4, we started by noting that
Free software and current scientific practice share a reliance on peer review to help ensure that results are of the highest possible objective quality. Peer review’s role in science was formalized in the eighteenth century, when the Royal Society of London’s “Committee on Papers” was granted the power to “solicit expert opinions.” Peer review became an indispensable part of scientific practice due to the sharp increase in scientific work after the Second World War (Drummond 2003). Just as the increased complexity of science, due to its increasingly mathematical nature, required scientists to conduct peer review in the era of patron-funded science during the Renaissance, the increase in both variety of disciplines and volume of submissions drove formerly self-reliant journal editorial boards to seek larger pools of reviewers.

From this point onwards, though, we note a problem, which will ultimately be the subject of these posts:
But peer review, especially its anonymous variant, might not improve the rigor of the review process and thus not adequately facilitate objectivity (van Rooyen et al. 1999). Instead, anonymous peer review might act as a damper on innovation, by placing guardians at the gates to science: paradigms remain unchallenged as the authority of powerful scientists remains unquestioned (Horrobin 1990). The discipline of computer science is not immune to these problems; anecdotal evidence seems to suggest that practitioners are disgruntled about this process. Anonymous critique of papers, they point out, results in a familiar attendance list at premier academic conferences. But a more serious charge can be leveled against anonymous peer review: it provides no guarantee of the quality of published work (Horrobin 1990, 1996, 1981). An examination (Rothwell and Martyn 2000) of the correlation among reviewers’ evaluations of submissions to neuroscience journals and conferences revealed that
For one journal, the relationships among the reviewers’ opinions were no better than that obtained by chance. For the other journal, the relationship was only fractionally better. For the meeting abstracts, the content of the abstract accounted for only about 10 to 20 percent of the variance in opinion of referees, and other factors accounted for 80 to 90 percent of the variance. (Horrobin 2001)
It is difficult to value this form of peer review when little distinguishes it from arbitrary selection.

That's the problem; we go on to talk about open, non-anonymous peer review as a particular solution, and about free software's methods of peer review and their value as an ideal for the practice of computer science at large. In the second post, I want to talk a bit about how badly, it seems to me, peer review is broken in the sciences. This will be anecdotal, insofar as I will be relying on my own experiences and observations. Still, considered as a report from the trenches, it might have some value for the reader. I should also qualify my comments by saying that while peer review seems to work reasonably well for journal articles, it is undeniably broken in conference article and grant proposal review, two fairly large and important parts of the practice of science today. We can then return to the solutions mentioned above.
