Reviewing papers
The purpose of this page is to support the members of the GOC by providing guidelines on how to review a paper. Overall: does the research question posed make sense, does the methodology outlined address the question, and do the data and results support the conclusions?
Assessing the scientific merit
- The analysis should rest on a minimal number of simplifying assumptions
- The authors show an awareness of the rigor curators apply when inferring annotations from sequence similarity
- The authors take into account the relative completeness or incompleteness of annotation
- BLAST scores should be used intelligently, not with simply arbitrary cut-offs
- The authors should validate their methods or results against examples of incorrect annotations
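As an illustration of using BLAST scores intelligently, a reviewer might check whether hits are filtered on several alignment properties together rather than a lone e-value cut-off. A minimal sketch; the field names, identifiers, and thresholds below are invented examples, not GOC recommendations:

```python
# Illustrative filter over tabular BLAST output (outfmt 6-style fields).
# The thresholds below are hypothetical examples, not recommended values.
def is_credible_hit(evalue, percent_identity, query_coverage):
    """Combine several criteria instead of a single e-value cut-off."""
    return (
        evalue <= 1e-10                # statistical significance of the match
        and percent_identity >= 40.0   # enough identity to suggest homology
        and query_coverage >= 70.0     # alignment spans most of the query
    )

hits = [
    # (subject_id, evalue, percent_identity, query_coverage) -- made-up data
    ("P12345", 1e-50, 85.0, 95.0),   # strong, near-full-length hit
    ("Q99999", 1e-12, 30.0, 98.0),   # significant e-value but low identity
    ("O00000", 1e-30, 90.0, 20.0),   # local match only; low query coverage
]

credible = [h[0] for h in hits if is_credible_hit(*h[1:])]
print(credible)  # only the first hit passes all three criteria
```

Only the combination of criteria keeps the first hit; a bare e-value cut-off would have accepted all three.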
Materials, that is, the data sets used as input
- Are the datasets used appropriate to answer the research question?
- Is the publication explicit about which data sources it has used and how it has partitioned these resources (e.g. the version of the ontology and the date of the annotation files)?
Source Data Interpretation
- The authors take account of the DAG structure in their analysis, and show a consideration of the consequences of misusing the DAG (direct vs. indirect annotations)
- Understanding of current curation best practices
- Understanding of procedures used by contributing MODs for the evaluation of similarity/orthology for functional transfer
- Evidence codes (meaning and use), particularly with respect to inferring functions based on sequence similarity (i.e. only from experimental sources)
- Awareness of the possible incomplete nature of the annotations due to the curation backlog
- Qualifier awareness (especially NOT)
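The distinction between direct and indirect annotations can be made concrete with a small sketch: an annotation to a term implies annotation to all of its ancestors in the DAG, so counting only direct annotations undercounts. The toy ontology fragment and term IDs below are invented for illustration; a real analysis would load the full ontology file:

```python
# Toy is_a fragment of the GO DAG (child -> parents); IDs are invented.
parents = {
    "GO:child": ["GO:mid1", "GO:mid2"],
    "GO:mid1":  ["GO:root"],
    "GO:mid2":  ["GO:root"],
    "GO:root":  [],
}

def ancestors(term):
    """All terms reachable via is_a edges (the indirect annotations)."""
    seen = set()
    stack = [term]
    while stack:
        for parent in parents[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# A gene annotated directly to GO:child is implicitly annotated
# to every ancestor of that term as well.
direct = {"GO:child"}
indirect = set().union(*(ancestors(t) for t in direct))
print(sorted(direct | indirect))
# An analysis counting only direct annotations would miss
# GO:mid1, GO:mid2 and GO:root entirely.
```

Note that a term can reach the root by more than one path, so ancestors must be collected as a set, not a list.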
Methodology
- Is the reviewer familiar with the specific type of analysis used in the manuscript?
- Is the test relevant to the research question?
- Are the algorithms developed, and the reasoning used to evaluate the data, robust for the purpose?
- Are any statistical tests used appropriate to evaluate significance?
- Use of third-party software: are versions, parameters, and cut-offs specified?
- Reproducibility: are the methods for reaching the conclusions fully described and reproducible?
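As an example of checking that a statistical test matches the question, over-representation of a GO term is commonly assessed with a one-sided hypergeometric (Fisher's exact) test, which a reviewer can sanity-check in a few lines. The counts below are made up purely for illustration:

```python
from math import comb

def hypergeom_pvalue(k, n, K, N):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the probability of
    drawing at least k annotated genes in a study set of n, when K of
    the N background genes carry the annotation (one-sided
    over-representation test)."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(n, K) + 1)
    ) / comb(N, n)

# Hypothetical counts: 40 of 200 study genes hit a term that covers
# 500 of 10,000 background genes (expected by chance: about 10).
p = hypergeom_pvalue(40, 200, 500, 10000)
print(f"p = {p:.3g}")
```

Observing 40 hits where about 10 are expected gives a vanishingly small p-value; a reviewer should also check that reported p-values are corrected for the many terms tested.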
Are all appropriate citations made to
- Support scientific statements
- Recognise previous work
- Describe input data sources
General
- Is this work a significant advance over the previous literature?
- Is there anything in the paper that seems uncertain or poorly supported?
- Provide specific citations for the claims made in the review
Different types of papers
We should also probably have different guidelines for different types of papers:
- Technical papers which use GO
  - annotation accuracy assessment
  - functional prediction using GO
- GO Tools
- Papers that evaluate biological results using GO enrichment
- etc...
Other notes
Perhaps we should also look at journals' guidelines and criteria for assessing informatics papers, and especially multidisciplinary papers (as GO papers increasingly are). A reviewer might be competent to review the biological content but not the technical content, or vice versa.