Pilot: Setting up platforms




Set up the basic platforms for posting and administering reviews and evaluations, and for offering curated links and categorizations of papers and projects.

Progress reports

Update 7 Sep 2022; partial update 22 Dec 2022

  • We are setting up processes and forms in Kotahi.

    • The submissions form is fairly usable, though imperfect; e.g., we need to ask people to click 'submit a URL instead' on page one.

  • Evaluations form: using a Google Doc for now while trying out Airtable, Qualtrics, and other solutions; we aim to integrate it into Kotahi.

  • See Mapping evaluation workflow for how projects will enter, be evaluated, and produce output.

  • We will outline specific requests for developers.

  • Sciety group set up with a Hypothes.is feed; working on processing the first evaluations.

Submission, evaluation and management platform: Kotahi (may be phasing out?)

7 Feb 2023

  • Set up our Kotahi page.

  • Configured it for submission and management.

  • Mainly configured for evaluation, but it needs bespoke configuration to be efficient and easy for evaluators, particularly for the quantitative ratings and predictions. Thus we are using Google Docs (or CryptPads) for the pilot; we will configure Kotahi further when funds allow.

Sciety group (curated evaluations and research)

Evaluations are curated in our Sciety.org group, which integrates them with the publicly hosted research.

7 Feb 2023: We are working on:

  • the best ways to get evaluations from submissions on Kotahi into Sciety,

  • ... with curated links to the publicly hosted papers (or projects) on a range of platforms, including NBER;

  • ways to get DOIs for each evaluation and author response;

  • ways to integrate evaluation details as 'collaborative annotations' (with hypothes.is) into the hosted papers.

(We currently use a hypothes.is workaround to feed these into Sciety, so they show up as 'evaluated preprints' in their public database, each gaining a DOI.)
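As a rough sketch of how such a feed might be assembled: the public Hypothes.is API exposes annotations by group, and each annotation carries the URI of the paper it targets. The group ID, sample record, and field mapping below are illustrative placeholders, not The Unjournal's actual configuration.

```python
# Sketch (not the actual Unjournal pipeline): pull a Hypothes.is group's
# annotations and reduce each to the fields a Sciety-style evaluation feed
# would need. The endpoint is the real public Hypothes.is search API; the
# group ID used in any call is a placeholder.
import json
import urllib.parse
import urllib.request

API = "https://api.hypothes.is/api/search"

def search_url(group_id, limit=20):
    """Build a search URL for all annotations in a given Hypothes.is group."""
    return API + "?" + urllib.parse.urlencode({"group": group_id, "limit": limit})

def to_evaluation(row):
    """Map one annotation record to a minimal evaluation entry:
    the evaluated paper's URI, the evaluation text, and a timestamp."""
    return {
        "paper_uri": row["uri"],
        "evaluation": row.get("text", ""),
        "created": row["created"],
    }

def fetch_evaluations(group_id):
    """Fetch and map all annotations for one group (network call)."""
    with urllib.request.urlopen(search_url(group_id)) as resp:
        return [to_evaluation(r) for r in json.load(resp)["rows"]]
```

For example, `fetch_evaluations("<your-group-id>")` would return a list of entries ready to be matched against preprints on the hosting platforms; DOI registration for each evaluation would still happen in a separate step.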
