"Applied and Policy" Track

David Reinstein, Nov 2024: Over the last six months we have considered and evaluated a small body of work under this “Applied & Policy Stream”. We plan to continue the stream for the foreseeable future.

Why have an “Applied & Policy Stream”?

Much of the most impactful research is not aimed at academic audiences and may never be submitted to academic journals. It is written in formats that are very different from traditional academic outputs, and cannot be easily judged by academics using the same standards. Nonetheless, this work may use technical approaches developed in academia, making it important to gain expert feedback and evaluation.

The Unjournal can help here. However, to avoid confusion, we want to keep this clearly distinct from our main agenda, which focuses on impactful academically aimed research.

Our “Applied & Policy Stream” will be clearly labeled as separate from our main stream, and may constitute roughly 10–15% of the work we cover. Below, we refer to it as the “applied stream” for brevity.

What should be included in the Policy stream?

Our considerations for prioritizing this work are generally the same as for our academic stream: Is it in the fields we focus on, using approaches that enable meaningful evaluation and rating? Is it already having an impact (e.g., influencing grant funding in globally important areas)? Does it have the potential for impact, and if so, is it high-quality enough that we should consider boosting its signal?

We will particularly prioritize policy and applied work whose technical methods call for evaluation by research experts, often academics.

This could include a range of applied research from EA/GP/LT-linked organizations such as GPI, Rethink Priorities, Open Philanthropy, FLI, HLI, and Faunalytics, as well as from EA-adjacent organizations, along with relevant government white papers.

How should our (evaluation etc.) policies differ here?

Ratings/metrics: As in the academic stream, this work will be evaluated for its credibility, usefulness, communication/logic, etc. However, we are not seeking to have it assessed by the standards of academia in a way that yields a comparison to traditional journal tiers. Evaluators: please ignore those parts of our interface; if you are unsure whether a metric applies, feel free to ask.

Evaluator selection, number, pay: Generally, we want to continue to select academic research experts, or non-academic researchers with strong academic and methodological backgrounds, to do these evaluations. Part of the aim is to bring expert scrutiny, particularly from academia, to work that is not normally scrutinized by such experts.

Compensation may be flexible as well; in some cases the work will be more involved than for the academic stream, and in some cases less so. As a starting point, we will offer the same compensation as for the academic stream.

Careful flagging and signposting: To preserve the reputation of our academic-stream evaluations, we need to make it clear, wherever people might see this work, that it is not being evaluated by the same standards as the academic stream and does not “count” toward those metrics.

Authors' permission in the Applied Stream

Previous"Direct evaluation" trackNext'Conditional embargos' & exceptions

Last updated 5 months ago

Was this helpful?

This research is more likely to fall into the category of "Direct evaluation of 'impactful work'": research that is "already influencing a substantial amount of funding in impact-relevant areas, or substantially influencing policy considerations".

If the research itself is being funded by a global-impact-focused foundation or donor, this will also constitute a strong prima facie reason to commission an evaluation (without requiring the authors' consent). See this post on the EA Forum.