The Unjournal (see our "in a nutshell" overview) wants your involvement, help, and feedback. We offer rewards and strive to compensate people for their time and effort.
Join our team: Complete this form to apply for one of the following:
Evaluator pool: to be eligible to be commissioned and paid to evaluate and rate research, mainly in quantitative social science and policy
Field specialist teams: help identify, prioritize, and manage research evaluation in a particular field or cause area
Management team or advisory board: to be part of our decision-making
Do an Independent Evaluation to build your portfolio, receive guidance, and be eligible for promotion and prizes. See details at Independent evaluations (trial).
Suggest "Pivotal questions" for us to focus on
Give us feedback: Is anything unclear? What could be improved? Email contact@unjournal.org. We will offer rewards for the most useful suggestions.
David Reinstein is the founder and co-director of The Unjournal. The organization is currently looking for field specialists and evaluators, as well as suggestions for relevant work for The Unjournal to evaluate.
The Unjournal team is building a system for credible, public, journal-independent feedback and evaluation of research.
We maintain an open call for participants for four different roles:
Management Committee members (involving honorariums for time spent)
Advisory Board members (no time commitment)
Field Specialists (who will often also be on the Advisory Board)
A pool of Evaluators (who will be paid for their time and their work; we also draw evaluators from outside this pool)
The roles are explained in more detail here. You can express your interest (and enter our database) here.
We will reach out to evaluators (a.k.a. "reviewers") on a case-by-case basis, as appropriate for each paper or project being assessed, depending on expertise, the researcher's interest, and the absence of conflicts of interest.
Time commitment: determined case by case. For each evaluation, we provide guidelines for the amount of time to spend.
Compensation: We pay a minimum of $200 (updated Aug. 2024) for a prompt and complete evaluation, and $400 for experienced evaluators. We offer additional prizes and incentives, and are committed to an average compensation of at least $450 per evaluator. See here for more details.
Who we are looking for: We are putting together a list of people interested in being an evaluator and doing paid referee work for The Unjournal. We generally prioritize the pool of evaluators who signed up for our database before reaching out more widely.
Interested? Please fill out this form (about 3–5 min, same form for all roles or involvement).
Ready to get started doing evaluations and building a track record? See our new Independent evaluations (trial) initiative, offering prizes and recognition for the best work. You can evaluate work in our public database, or suggest and evaluate work.
We are looking for high-quality, globally pivotal research projects to evaluate, particularly those embodying open science practices and innovative formats. We are putting out a call for relevant research. Please suggest research here. (We offer bounties and prizes for useful suggestions.) For details of what we are looking for, and some potential examples, see this post and accompanying links.
You can also put forward your own work.
Note: This is under continual refinement; see our policies for more details.
As of December 2023, the prizes below have been chosen and will soon be announced. We are also scheduling an event linked to this prize. However, we are preparing for even larger author and evaluator prizes for our next phase. Submit your research to The Unjournal or serve as an evaluator to be eligible for future prizes (details to be announced).
Submit your work to be eligible for our “Unjournal: Impactful Research Prize” and a range of other benefits including the opportunity for credible public evaluation and feedback.
First-prize winners will be awarded $, and runners-up will receive $1000.
Note: these are the minimum amounts; we will increase these if funding permits.
Prize winners will have the opportunity (but not the obligation) to present their work at an online seminar and prize ceremony co-hosted by The Unjournal, Rethink Priorities, and EAecon.
To be eligible for the prize, submit a link to your work for public evaluation here.
Please choose “new submission” and “Submit a URL instead.”
The latter link requires an ORCID ID; if you prefer, you can email your submission to contact@unjournal.org.
The Unjournal, with funding from the Long Term Future Fund and the Survival and Flourishing Fund, organizes and funds public, journal-independent feedback and evaluation. We focus on research that is highly relevant to global priorities, especially in economics, social science, and impact evaluation, and aim to expand this widely. We encourage better research by making it easier for researchers to get feedback and credible ratings on their work.
We aim to publicly evaluate 15 papers (or projects) within our pilot year. This award will honor researchers doing robust, credible, transparent work with a global impact. We especially encourage the submission of research in "open" formats such as hosted dynamic documents (Quarto, R-markdown, Jupyter notebooks, etc.).
The research will be chosen by our management team for public evaluation by 2–3 carefully selected, paid reviewers based on an initial assessment of a paper's methodological strength, openness, clarity, relevance to global priorities, and the usefulness of further evaluation and public discussion. We sketch out these criteria here.
All evaluations, including quantitative ratings, will be made public by default; however, we will consider "embargoes" on this for researchers with sensitive career concerns (the linked form asks about this). Note that submitting your work to The Unjournal does not imply "publishing" it: you can submit it to any journal before, during, or after this process.
If we choose not to send your work out to reviewers, we will try to at least offer some brief private feedback.
All work evaluated by The Unjournal will be eligible for the prize. Engagement with The Unjournal, including responding to evaluator comments, will be a factor in determining the prize winners. We also have a slight preference for giving at least one of the awards to an early-career researcher, but this need not be determinative.
Our management team and advisory board will vote on the prize winners in light of the evaluations, with possible consultation of further external expertise.
Deadline: Extended until 5 December (to ensure eligibility).
Note: In a subsection below, Recap: submissions, we outline the basic requirements for submissions to The Unjournal.
The prize winners for The Unjournal's Impactful Research Prize were selected through a multi-step, collaborative process involving both the management team and the advisory board. The selection was guided by several criteria, including the quality and credibility of the research, its potential for real-world impact, and the authors' engagement with The Unjournal's evaluation process.
Initial Evaluation: All papers that were evaluated by The Unjournal were eligible for the prize. The discussion, evaluations, and ratings provided by external evaluators played a significant role in the initial shortlisting.
Management and Advisory Board Input: Members of the management committee and advisory board were encouraged to write brief statements about papers they found particularly prize-worthy.
Meeting and Consensus: A "prize committee" meeting was held with four volunteers from the management committee to discuss the shortlisted papers and reach a consensus. The committee considered both the papers and the content of the evaluations. Members of the committee allocated a total of 100 points among the 10 candidate papers. We used this to narrow the list down to a shortlist of five papers.
Point Voting: The above shortlist and the notes from the accompanying discussion were shared with all management committee and advisory board members. Everyone in this larger group was invited to allocate up to 100 points among the shortlisted papers (and asked to allocate fewer points if they were less familiar with the papers and evaluations).
Special Considerations: We decided that at least one of the winners had to be a paper submitted by the authors or one where the authors substantially engaged with The Unjournal's processes. However, this constraint did not prove binding. Early-career researchers were given a slight advantage in our consideration.
Final Selection: The first and second prizes were given to the papers with the first- and second-most points, respectively.
This comprehensive approach aimed to ensure that the prize winners were selected in a manner that was rigorous, fair, and transparent, reflecting the values and goals of The Unjournal.
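To illustrate the point-allocation and tallying mechanics described above, here is a minimal sketch in Python; the member names, paper labels, and point allocations are hypothetical placeholders, not the actual votes or results.

```python
# Illustrative tally of the point-voting step described above.
# Members, papers, and point allocations are hypothetical placeholders.
from collections import defaultdict

# Each committee/board member allocates up to 100 points across the shortlist;
# members less familiar with the papers may allocate fewer points.
allocations = {
    "member_a": {"paper_1": 50, "paper_2": 30, "paper_3": 20},
    "member_b": {"paper_2": 60, "paper_4": 40},
    "member_c": {"paper_1": 25, "paper_2": 25, "paper_5": 25},
}

totals = defaultdict(int)
for member, points in allocations.items():
    assert sum(points.values()) <= 100, f"{member} allocated more than 100 points"
    for paper, pts in points.items():
        totals[paper] += pts

# First and second prizes go to the papers with the most and second-most points.
ranking = sorted(totals.items(), key=lambda item: item[1], reverse=True)
print("First prize:", ranking[0])
print("Second prize:", ranking[1])
```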
I (David Reinstein) am an economist who left UK academia after 15 years to pursue a range of projects (see my web page). One of these is The Unjournal:
The Unjournal (with funding from the Long Term Future Fund and the Survival and Flourishing Fund) organizes and funds public, journal-independent feedback and evaluation, paying reviewers for their work. We focus on research that is highly relevant to global priorities, especially in economics, social science, and impact evaluation. We encourage better research by making it easier for researchers to get feedback and credible ratings on their work.
We are looking for your involvement...
We want researchers who are interested in doing evaluation work for The Unjournal. We pay an average of at least $450 per evaluation, and we award monetary prizes for the strongest work. Right now we are particularly looking for economists and people with quantitative and policy-evaluation skills. We describe what we are asking evaluators to do here: essentially a regular peer review with some different emphases, plus providing a set of quantitative ratings and predictions. Your evaluation content would be made public (and receive a DOI, etc.), but you can choose whether or not to remain anonymous.
To sign up to be part of the pool of evaluators or to get involved in The Unjournal project in other ways, please fill out this brief form or email contact@unjournal.org.
We welcome suggestions for particularly impactful research that would benefit from (further) public evaluation. We choose research for public evaluation based on an initial assessment of methodological strength, openness, clarity, relevance to global priorities, and the usefulness of further evaluation and public discussion. We sketch these criteria here, and discuss some potential examples here (see research we have chosen and evaluated at unjournal.pubpub.org, and a larger list of research we're considering here).
If you have research—your own or others'—that you would like us to assess, please fill out this form. You can submit your own work here (or by contacting contact@unjournal.org). Authors of evaluated papers will be eligible for our Impactful Research Prizes.
We are looking for both feedback on and involvement in The Unjournal project. Feel free to reach out at contact@unjournal.org.
View our data protection statement
19 Feb 2024: We are not currently hiring, but expect to do so in the future.
To indicate your potential interest in roles at The Unjournal, such as those described below, please fill out this form and link (or upload) your CV or webpage.
If you already filled out this form for a role that has changed titles, don’t worry. You will still be considered for relevant and related roles in the future.
If you add your name to this form, we may contact you to offer you the opportunity to do paid project work and paid work tasks.
Furthermore, if you are interested in conducting paid research evaluation for The Unjournal, or in joining our advisory board, please complete the form linked above.
Feel free to contact contact@unjournal.org with any questions.
The Unjournal, a not-for-profit collective operating under the umbrella and fiscal sponsorship of a parent nonprofit, is an equal-opportunity employer and contractor. We are committed to creating an inclusive environment for all employees, volunteers, and contractors. We do not discriminate on the basis of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetic information, disability, age, or veteran status.
See our data protection statement.
In addition to the jobs and paid projects listed here, we are expanding our management team, advisory board, field specialist team pool, and evaluator pool. Most of these roles involve compensation/honorariums.
If you are interested in discussing any of the above in person, please email us at contact@unjournal.org to arrange a conversation.
Did you just write a brilliant peer review for an economics (or social science, policy, etc.) journal? Your work should not go to waste; there should be a way to share your insights and get credit!
Consider transforming these insights into a public "independent evaluation" for The Unjournal. This will benefit the community and help make research better and more impactful. We can share your work and provide you with feedback. This will help you build a portfolio with The Unjournal, making it more likely we'll hire you for paid work and compensate you at the higher rate. We also offer prizes for the best work.
You can do this anonymously or sign your name.
To say this in more detail:
Journal peer review is critical for assessing and improving research, but too often these valuable discussions remain hidden behind closed doors. By publishing a version of your review, you can:
1. Amplify the impact of your reviewing efforts by contextualizing the research for a broader audience
2. Facilitate more transparent academic discussions around the strengths and limitations of the work
3. Get public recognition for your peer review contributions, which are often unseen and unrewarded
4. Reduce overall reviewing burdens by allowing your assessment to be reused
5. Support a culture of open scholarship by modeling constructive feedback on public research
According to a COPE discussion document, "Who 'owns' peer reviews?" (emphasis added):
While the depth of commentary may vary greatly among reviews, given the minimal thresholds set by copyright law, it can be presumed that most reviews meet the requirements for protection as an “original work of authorship”. As such, in the absence of an express transfer of copyright or a written agreement between the reviewer and publisher establishing the review as a “work for hire”, it may be assumed that, by law, the reviewer holds copyright to their reviewer comments and thus is entitled to share the review however the reviewer deems fit...
The COPE council notes precisely the benefits we are aiming to unlock. They mention an 'expectation of confidentiality' that seems incompletely specified.
For example, reviewers may wish to publish their reviews in order to demonstrate their expertise in a subject matter and to contribute to their careers as a researcher. Or they may see publication of their reviews as advancing discourse on the subject and thus acting for the benefit of science as a whole. Nevertheless, a peer reviewer’s comments are significantly different from many other works of authorship in that they are expressly solicited as a work product by a journal and—whatever the peer review model—are subject to an expectation of confidentiality. However, without an express agreement between the journal and the reviewer, it is questionable whether such obligation of confidentiality should be considered to apply only until a final decision is reached on the manuscript, or to extend indefinitely.
Several journals explicitly agree that reviewers are welcome to publish the content of their reviews, with some important caveats. The Publish Your Reviews initiative gathered public statements from several journals and publishers confirming that they support reviewers posting their comments externally. However, they generally ask reviewers to remove any confidential information before sharing their reviews. This includes: the name of the journal, the publication recommendation (e.g., accept, revise, or reject), and any other details the journal or authors considered confidential, such as unpublished data.
For these journals, we are happy to accept and share/link the verbatim content as part of an independent Unjournal evaluation.
But even for journals that have not signed on to this, as the COPE document notes, your peer review is your intellectual property; it is not owned by the journal!
There may be some terms and conditions you agreed to as part of submitting a referee report. Please consult these carefully.
However, you are still entitled to share your own expert opinions on publicly-shared research. You may want to rewrite the review somewhat. You should make it clear that it refers to the publicly-shared (working paper/preprint) version of the research, not the one the journal shared with you in confidence. As above, you should probably not mention the journal name, the decision, or any other sensitive information. You don't even need to mention that you did review the paper for a journal.
Even if a journal considers the specific review confidential, this doesn't prevent the reviewer from expressing their independent assessment elsewhere.
As an expert reviewer, you have unique insights that can improve the quality and impact of research. Making your assessment available through The Unjournal amplifies the reach and value of your efforts. You can publish evaluations under your name or remain anonymous.
Ready to make your peer reviews work harder for science? Consider submitting an independent evaluation for recognition and rewards, and to improve research. Contact us anytime at contact@unjournal.org for guidance. We look forward to unlocking your valuable insights!
Nov. 2023 update: We have paused this process to focus on our other open positions. We hope to come back to hiring researchers to implement these projects soon.
We are planning to hire 3–7 researchers for a one-off paid project.
There are two opportunities: Contracted Research (CR) and Independent Projects (IP).
Project Outline
What specific research themes in economics, policy, and social science are most important for global priorities?
What projects and papers are most in need of further in-depth public evaluation, attention, and scrutiny?
Where does "Unjournal-style evaluation" have the potential to be one of the most impactful uses of time and money? By impactful, we mean in terms of some global conception of value (e.g., the well-being of living things, the survival of human values, etc.).
This is an initiative that aims to identify, summarize, and conduct an in-depth evaluation of the most impactful themes in economics, policy, and social science to answer the above questions. Through a systematic review of selected papers and potential follow-up with authors and evaluators, this project will enhance the visibility, understanding, and scrutiny of high-value research, fostering both rigorous and impactful scholarship.
Contracted Research (CR): This is the main opportunity, a unique chance to contribute to the identification and in-depth evaluation of impactful research themes in economics, policy, and social science. We're looking for researchers and research users who can commit to a one-off 15–20 hours of work. CR candidates will:
Summarize a research area or theme, its status, and why it may be relevant to global priorities (~4 hours).
We are looking for fairly narrow themes. Examples might include:
The impact of mental health therapy on well-being in low-income countries.
The impact of cage-free egg regulation on animal welfare.
Public attitudes towards AI safety regulation.
Identify a selection of papers in this area that might be high-value for UJ evaluation (~3 hours).
Choose at least four of these from the NBER/"top-10" working paper series (or from work submitted to The Unjournal – we can share this with you – or from work where the author has expressed interest to you).
For a single paper or a small set of these papers (or projects) (~6 hours):
Read the paper fairly carefully and summarize it, explaining why it is particularly relevant.
Discuss one or more aspects of the paper that need further scrutiny or evaluation.
Identify 3 possible evaluators, and explain why they might be particularly relevant to evaluate this work. (Give a few sentences we could use in an email to these evaluators).
Possible follow-up task: email and correspond with the authors and evaluators (~3 hours).
We are likely to follow up on your evaluation suggestions. We also may incorporate your writing into our web page and public posts; you can choose whether you want to be publicly acknowledged or remain anonymous.
Independent Projects (IP)
We are also inviting applications to do similar work as an "Independent Project" (IP), a parallel opportunity designed for those eager to engage but not interested in working under a contract, or not meeting some of the specific criteria for the Contracted Research role. This involves work similar to the above.
If you are accepted to do an IP, we will offer some mentoring and feedback. We will also offer prize rewards/bounties for particularly strong IP work. We will also consider working with professors and academic supervisors on these IP projects, as part of university assignments and dissertations.
You can apply to the CR and IP positions together; we will automatically consider you for each.
Get Involved!
Nov. 2023: We are currently prioritizing bringing in more people to build our teams in a few areas, particularly in:
Catastrophic risks, AI governance and safety
Animal welfare: markets, attitudes
As well as:
Quantitative political science (voting, lobbying, attitudes)
Social impact of AI/emerging technologies
Macro/growth, finance, public finance
Long-term trends and demographics
In addition to the "work roles," we are looking to engage researchers, research users, meta-scientists, and people with experience in open science, open access, and management of initiatives similar to The Unjournal.
We are continually looking to enrich our general team and board. These roles come with some compensation and incentives.
(Please see the links above and consider submitting an expression of interest.)
Kickstarter incentive: After the first 8 quality submissions (or by Jan. 1, 2025, whichever comes later), we will award a prize of $500 to the strongest evaluation.
The Unjournal is seeking academics, researchers, and students to submit structured evaluations of the most impactful research. Strong evaluations will be posted or linked on our platform, offering readers a perspective on the implications, strengths, and limitations of the research. These evaluations can be submitted using our structured forms (one for academic-targeted research and one for work not aimed at academic journals); evaluators can publish under their name or maintain anonymity; we also welcome collaborative evaluation work. We will facilitate, promote, and encourage these evaluations in several ways, described below.
We are particularly looking for people with research training, experience, and expertise in quantitative social science and statistics, including cost-benefit modeling and impact evaluation. This could include professors, other academic faculty, postdocs, researchers outside of academia, quantitative consultants and modelers, PhD students, and students aiming towards PhD-level work (pre-docs, research MSc students, etc.). But anyone is welcome to give this a try — when in doubt, please go for it.
We are also happy to support collaborations and group evaluations. There is a good track record for this — see, for example, ASAPBio's work and other efforts in this vein. We may also host live events and/or facilitate asynchronous collaboration on evaluations.
Instructors/PhD, MRes, Predoc programs: We are also keen to work with students and professors to integrate ‘independent evaluation assignments’ (aka ‘learn to do peer reviews’) into research training.
Your work will support The Unjournal's core mission — improving impactful research through journal-independent public evaluation. In addition, you'll help research users (policymakers, funders, NGOs, fellow researchers) by providing high-quality, detailed evaluations that rate and discuss the strengths, limitations, and implications of research.
Doing an independent evaluation can also help you. We aim to provide feedback to help you become a better researcher and reviewer. We’ll also give prizes for the strongest evaluations. Lastly, writing evaluations will help you build a portfolio with The Unjournal, making it more likely we will commission you for paid evaluation work in the future.
We focus on rigorous, globally impactful research in quantitative social science and policy-relevant work. (See here for details.) We're especially eager to receive independent evaluations of:
Research we publicly prioritize: see the research we've prioritized or evaluated.
Research we previously evaluated.
Work that other people and organizations suggest as having high potential impact or value of information.
1. The Unjournal's structured evaluation forms: We encourage evaluators to use one of our two structured forms, described below (one for academic-targeted research and one for work not aimed at academic journals).
2. Bounties: We will offer prizes for the 'most valuable independent evaluations'.
All evaluation submissions will be eligible for these prizes and “grandfathered in” to any prizes announced later. We will announce and promote the prize winners (unless they opt for anonymity).
3. Evaluator pool: People who submit evaluations can elect to join our evaluator pool. We will consider and (time permitting) internally rate these evaluations. People who do the strongest evaluations in our focal areas are likely to be commissioned as paid evaluators for The Unjournal.
4. We are reaching out to PhD programs and pre-PhD research-focused programs. Some curricula already involve "mock referee report" assignments. We hope professors will encourage their students to do these through our platform. In return, we'll offer the incentives and promotion mentioned above, as well as resources, guidance, and some further feedback.
5. Fostering a positive environment for anonymous and signed evaluations
We want to preserve a positive and productive environment. This is particularly important because we will be accepting anonymous content. We will take steps to ensure that the system is not abused. If the evaluations have an excessively negative tone, have content that could be perceived as personal attacks, or have clearly spurious criticism, we will ask the evaluators to revise this, or we may decide not to post or link it.
Crowdsourced feedback can add value in itself; encouraging this can enable some public evaluation and discussion of work that The Unjournal doesn’t have the bandwidth to cover
Improving our evaluator pool and evaluation standards in general.
Students and ECRs can practice and (if possible) get feedback on independent evaluations
They can demonstrate this ability publicly, enabling us to recruit and commission the strongest evaluators
Examples will help us build guidelines, resources, and insights into ‘what makes an evaluation useful’.
This provides us with opportunities to engage with academia, especially in PhD programs and research-focused instruction.
We will compensate you for your time at a rate reflecting your experience and skills ($25–$65/hour). This work also has the potential to serve as a "work sample" for future roles at The Unjournal, as it is highly representative of what our field specialists and evaluators are commissioned to do.
If you are interested in involvement in either the CR or IP side of this project, please let us know.
You can also suggest research yourself and then do an independent evaluation of it.
We're looking for careful methodological/technical evaluations that focus on research credibility, impact, and usefulness. We want evaluators to dig into the weeds, particularly in areas where they have aptitude and expertise. See our guidelines for details.
Our form for academic-targeted research: if you are evaluating research aimed at an academic journal or similar outlet.
Our form for applied and policy research: if you are evaluating research that is probably not aimed at an academic journal. This may include somewhat less technical work, such as reports from policy organizations and think tanks, or impact assessments and cost-benefit analyses.
Other public evaluation platforms: We are also open to engaging with evaluations done on existing public evaluation platforms. Evaluators: If you prefer to use another platform, please let us know about your evaluation using one of the forms above. If you like, you can leave most of our fields blank and provide a link to your evaluation on the other public platform.
Academic (~PhD) assignments and projects: We are also looking to build ties with research-intensive university programs; we can help you structure academic assignments and provide external reinforcement and feedback. Professors, instructors, and PhD students: please contact us at contact@unjournal.org.
We will encourage all these independent evaluations to be publicly hosted, and will share links to them. We will further promote the strongest independent evaluations.
However, when we host or link these, we will keep them clearly separated and signposted as distinct from the commissioned evaluations; independent evaluations will not be considered official, and their ratings won't be included in our main ratings data (see our dashboard).
As a start, after the first eight quality submissions (or by Jan. 1, 2025, whichever comes later), we will award a prize of $500 to the most valuable evaluation.
Further details tbd.
We're also moving towards a two-tiered base rate: we will offer a higher rate to people who can demonstrate previous strong review/evaluation work. These independent evaluations will count towards this 'portfolio'.
Our previously published evaluations provide examples of strong work.
We will curate guidelines and learning materials from relevant fields, as well as from applied and impact-evaluation work.
The Unjournal commissions public evaluations of impactful research in quantitative social science fields. We are an alternative and a supplement to traditional academic peer-reviewed journals – separating evaluation from journals unlocks a range of benefits. We ask expert evaluators to write detailed, constructive, critical reports. We also solicit a set of structured ratings focused on research credibility, methodology, careful and calibrated presentation of evidence, reasoning transparency, replicability, relevance to global priorities, and usefulness for practitioners (including funders, project directors, and policymakers who rely on this research). While we have mainly targeted impactful research from academia, our applied and policy stream covers impactful work that uses formal quantitative methods but is not aimed at academic journals. So far, we've commissioned about 50 evaluations of 24 papers, and published these evaluation packages, linked to academic search engines and bibliometrics.
These are principally not research roles, but familiarity with research and research environments will be helpful, and there is room for research involvement depending on the candidate’s interest, background, and skills/aptitudes.
There is currently one such role:
Freelance writer/communications role, described below (as of November 2023, we are still seeking freelancers).
Further note: We previously considered a “Management support and administrative professional” role. We are not planning to hire for this role currently. Those who indicated interest will be considered for other roles.
As of November 2023, we are soliciting applications from freelancers with skills in particular areas.
The Unjournal is looking to work with a proficient writer who is adept at communicating with academics and researchers (particularly in economics, social science, and policy), journalists, policymakers, and philanthropists. As we are in our early stages, this is a generalist role. We need someone to help us explain what The Unjournal does and why, make our processes easy to understand, and ensure our outputs (evaluations and research synthesis) are accessible and useful to non-specialists. We seek someone who values honesty and accuracy in communication; someone who has a talent for simplifying complex ideas and presenting them in a clear and engaging way.
The work is likely to include:
Promotion and general explanation
Spread the word about The Unjournal, our approach, our processes, and our progress in press releases and short pieces, as well as high-value emails and explanations for a range of audiences
Make the case for The Unjournal to potentially skeptical audiences in academia/research, policy, philanthropy, effective altruism, and beyond
Keeping track of our progress and keeping everyone in the loop
Help produce and manage our external (and some internal) long-form communications
Help produce and refine explanations, arguments, and responses
Help provide reports to relevant stakeholders and communities
Making our rules and processes clear to the people we work with
Explain our procedures and policies for research submission, evaluation, and synthesis; make our systems easy to understand
Help us build flexible communications templates for working with research evaluators, authors, and others
Other communications, presentations, and dissemination
Write and organize content for grants applications, partnership requests, advertising, hiring, and more
Potentially: compose non-technical write-ups of Unjournal evaluation synthesis content (in line with interest and ability)
Most relevant skills, aptitudes, interests, experience, and background knowledge:
Understanding of The Unjournal project
Strong written communications skills across a relevant range of contexts, styles, tones, and platforms (journalistic, technical, academic, informal, etc.)
Familiarity with academia and research processes and institutions
Familiarity with current conversations and research on global priorities within government and policy circles, effective altruism, and relevant academic fields
Willingness to learn and use IT, project management, data management, web design, and text-parsing tools (such as those mentioned below), with the aid of GPT/AI chat
Further desirable skills and experience:
Academic/research background in areas related to The Unjournal’s work
Operations, administrative, and project management experience
Experience working in a small nonprofit institution
Experience with promotion and PR campaigns and working with journalists and bloggers
Proposed terms:
Project-based contract "freelance" work
$30–$55/hour USD (TBD, depending on experience and capabilities). Hours for each project include some onboarding and upskilling time.
Our current budget can cover roughly 200 hours of this project work. We hope to increase and extend this (depending on our future funding and expenses).
This role is contract-based and supports remote and international applicants. We can contract people living in most countries, but we cannot serve as an immigration sponsor.
We are again considering applications for the 'evaluation metrics/meta-science' role. We will also consider all applicants for our other positions, and for roles that may come up in the future.
The potential roles discussed below combine research-linked work with operations and administrative responsibilities. Overall, this may include some combination of:
Assisting and guiding the process of identifying strong and potentially impactful work in key areas, explaining its relevance, its strengths, and areas warranting particular evaluation and scrutiny
Interacting with authors, recruiting, and overseeing evaluators
Synthesizing and disseminating the results of evaluations and ratings
Aggregating and benchmarking these results
Helping build and improve our tools, incentives, and processes
Curating outputs relevant to other researchers and policymakers
Doing "meta-science" work
See also our field specialist team pool and evaluator pool. Most of these roles involve compensation/honorariums.
(Nov. 2023: Note, we cannot guarantee that we will be hiring for this role, because of changes in our approach.)