Pilot: Building a founding committee

7 Feb 2023: We have an organized founding/management committee, as well as an advisory board (see Our team). We are focusing on pushing research through the evaluation pipeline, communicating this output, and making it useful. We have a working division of labor, e.g., assigning "managing editors" to specific papers. We are likely to expand our team after our pilot, conditional on further funding.

Progress: the team (continual update)

Our team

Key elements of plan

Put together the founding committee, hold meetings, make public posts, and gather feedback (done)
  1. Build a "founding committee" of 5–8 experienced and enthusiastic EA-aligned or adjacent researchers at EA orgs, research academics, and practitioners (e.g., draw from speakers at recent EA Global meetings).

    1. Create private Airtable with lists of names and organizations

    2. Added element: List of supporter names for credibility, with little or no commitment required

  2. Host a meeting (and shared collaboration space/document) to come to a consensus on a set of practical principles. [26 May 2022: First meeting held; shared notes being written up.]

  3. Post and present our consensus (coming out of this meeting) on key fora. After a brief follow-up period (~1 week), consider adjusting the above consensus plan in light of the feedback, repost, and move forward.

... Excerpts from the successful ACX grant proposal, reiterated in the follow-up FTX Future Fund application (for further funding; unsuccessful).

How was this founding committee recruited?

  • The creation of an action plan can be seen in the Gdoc discussion

Three key relevant areas from which to draw candidates

DR: I think I need to draw people from a few relevant areas:

  1. Academia, in the subject fields relevant to The Unjournal: economics, quantitative social science, and perhaps more

  2. Effective altruism, to assess the value and scope of the journal and the research

  3. Open science, academic reform, and applied metascience: people with practical ideas and knowledge

Plus people with strong knowledge of journal and bibliometric processes and systems.

First: direct outreach to a list of pivotal, prominent people

  1. Assemble a list of the most relevant and respected people, using more or less objective criteria and justification.

    1. Ask them to join the founding committee.

    2. Ask them to join the list of supporters.

  2. Add people who have made past contributions.

28 May 2022: The above has mostly been done, at least in terms of people attending the first meeting. We probably need a more systematic approach to getting the list of supporters.

Second: public call for interest

Further posts on social media, academic websites, message boards, etc.

See also public Gdoc discussion of "the committee and consensus"

  • "Procedure for choosing committee"
  • EA Forum question post: Soliciting lists and names
  • Unjournal: Call for participants and research - EA Forum
  • The twelve-month plan