Conventional guidelines for referee reports


How to write a good review (general conventional guidelines)

Some key points
  • Cite evidence and reference specific parts of the research when giving feedback.

  • Justify your critiques and claims in a reasoning-transparent way, rather than merely "passing judgment." Avoid comments like "this does not pass the smell test."

  • Provide specific, actionable feedback to the author where possible.

  • Try to restate the authors’ arguments, clearly presenting the most reasonable interpretation of what they have written. See "steelmanning."

  • Be collegial and encouraging, but also rigorous. Criticize and question specific parts of the research without suggesting criticism of the researchers themselves.

We're happy for you to use whichever process and structure you feel comfortable with when writing your evaluation content.

One possible structure

Core

  1. Briefly summarize the work in context.

  2. Highlight the paper's strengths and contributions, considered in the context of existing research.

  3. Most importantly: Identify and assess the paper's most important and impactful claim(s). Are these supported by the evidence provided? Are the assumptions reasonable? Are the authors using appropriate methods? (See "the 4 validities" and "claim identification and assessment.")

  4. Note major limitations and potential ways the work could be improved. Where possible, reference methodological literature and work that models what you are suggesting.

Optional/desirable

  • Offer suggestions for increasing the impact of the work, for incorporating the work into global priorities research and impact evaluations, and for supporting and enhancing future work.

  • Discuss minor flaws and how they might be addressed.

  • Desirable: formal

Please don't spend time copyediting the work. If you like, you can give a few specific suggestions and then ask the author to make similar changes throughout.

Remember: The Unjournal doesn’t “publish” and doesn’t “accept or reject.” So don’t give an “Accept,” “Revise and Resubmit,” or “Reject”-type recommendation. Instead, we ask for quantitative metrics, written feedback, and expert discussion of the validity of the paper's main claims, methods, and assumptions.

Writing referee reports: resources and benchmarks

Economics

  • How to Write an Effective Referee Report and Improve the Scientific Review Process (Berk et al., 2017)

  • Semi-relevant: Econometric Society: Guidelines for referees

  • Report: Improving Peer Review in Economics: Stocktaking and Proposal (Charness et al., 2022)

Open Science

  • PLOS (conventional but open access; simple and brief)

  • Peer Community In... Questionnaire (open-science-aligned; perhaps less detail-oriented than we are aiming for)

  • Open Reviewers Reviewer Guide (journal-independent “PREreview”; detailed; targets early-career researchers)

General, other fields

  • The Wiley Online Library (conventional; general)

  • “Peer review in the life sciences” (Fraser) (extensive resources; only some of this is applicable to economics and social science)

Other templates and tools

  • Collaborative template: RRR assessment peer review

  • Introducing Structured PREreviews on PREreview.org