Text to accompany the Impactful Research Prize discussion
Note: This section largely repeats content in our guide for researchers/authors, especially our FAQ on "why engage."
Jan. 2024: We have lightly updated this page to reflect our current systems.
We describe the nature of the work we are looking to evaluate, along with examples, in . Update 2024: This is now better characterized under and .
If you are interested in submitting your work for public evaluation, we are looking for research that is relevant to global priorities, especially in the quantitative social sciences and impact evaluation. Work that would benefit from further feedback and evaluation is also of interest.
Your work will be evaluated using our evaluation guidelines and metrics. You can read these before submitting.
Important Note: We are not a journal. By having your work evaluated, you will not be giving up the opportunity to have your work published in a journal. We simply operate a system that allows you to have your work independently evaluated.
If you think your work fits our criteria and would like it to be publicly evaluated, please submit your work through .
If you would like to submit more than one of your papers, you will need to complete a new form for each paper you submit.
By default, we would like Unjournal evaluations to be made public. We think public evaluations are generally good for authors, as explained . However, in special circumstances and particularly for very early-career researchers, we may make exceptions.
If there is an early-career researcher on the author team, we will allow authors to "embargo" the publication of the evaluation until a later date. This date is contingent but not indefinite. The embargo lasts until after a PhD/postdoc’s upcoming job search, or until the work has been published in a mainstream journal, unless:
the author(s) give earlier permission for release; or
a fixed upper limit of 14 months is reached.
If you would like to request an exception to a public evaluation, you will have the opportunity to explain your reasoning in the submission form.
The Unjournal presents an additional opportunity for evaluation of your work with an emphasis on impact.
Substantive feedback will help you improve your work—especially useful for young scholars.
Ratings can be seen as markers of credibility for your work that could help your career advancement at least at the margin, and hopefully help a great deal in the future. You also gain the opportunity to
If we consider your work for public evaluation, we may ask for some of the items below, although most are optional. We will aim to make this a very light touch for authors.
A link to a non-paywalled, hosted version of your work (in any format—PDFs are not necessary) that can be given a Digital Object Identifier (DOI). Again, we will not be "publishing" this work, just evaluating it.
A link to data and code, if possible. We will work to help you to make it accessible.
Assignment of two evaluators who will be paid to assess your work. We will likely keep their identities confidential, although this is flexible depending on the reviewer. Where it seems particularly helpful, we will facilitate a confidential channel to enable a dialogue with the authors. One person on our managing team will handle this process.
By completing the submission form, you are providing your permission for us to post the evaluations publicly unless you request an embargo.
You will have a two-week window to respond through our platform before anything is posted publicly. Your responses can also be posted publicly.
For more information on why authors may want to engage and what we may ask authors to do, please see .
Nov. 2023: We are currently prioritizing bringing in more to build our teams in a few areas, particularly in:
Catastrophic risks, AI governance and safety
I (David Reinstein) am an economist who left UK academia after 15 years to pursue a range of projects (). One of these is :
The Unjournal (with funding from the and the Survival and Flourishing Fund) organizes and funds public-journal-independent feedback and evaluation, paying reviewers for their work. We focus on research that is highly relevant to global priorities, especially in economics, social science, and impact evaluation. We encourage better research by making it easier for researchers to get feedback and credible ratings on their work.
You will gain visibility and a connection to the EA/Global Priorities communities and the Open Science movement.
You can take advantage of this opportunity to gain a reputation as an ‘early adopter and innovator’ in open science.
You can win prizes: You may win a “best project prize,” which could be financial as well as reputational.
Entering into our process will make you more likely to be hired as a paid reviewer or editorial manager.
We will encourage media coverage.
Have evaluators publicly post their evaluations (i.e., 'reviews') of your work on our platform. As noted above, we will ask them to provide feedback, thoughts, suggestions, and some quantitative ratings for the paper.
Animal welfare: markets, attitudes
As well as:
Quantitative political science (voting, lobbying, attitudes)
Social impact of AI/emerging technologies
Macro/growth, finance, public finance
Long-term trends and demographics
In addition to the "work roles," we are looking to engage researchers, research users, meta-scientists, and people with experience in open science, open access, and management of initiatives similar to The Unjournal.
We are continually looking to enrich our general team and board, including our . These roles come with some compensation and incentives.
(Please see links and consider submitting an expression of interest).
We want researchers who are interested in doing evaluation work for The Unjournal. We pay an average of $400-$500 per complete and on-time evaluation, and we award monetary prizes for the strongest work. Right now we are particularly looking for economists and people with quantitative and policy-evaluation skills. We describe what we are asking evaluators to do here: essentially a regular peer review with some different emphases, plus providing a set of quantitative ratings and predictions. Your evaluation content would be made public (and receive a DOI, etc.), but you can choose if you want to remain anonymous or not.
To sign up to be part of the pool of evaluators or to get involved in The Unjournal project in other ways, please fill out this brief form or contact theunjournal@gmail.com.
We welcome suggestions for particularly impactful research that would benefit from (further) public evaluation. We choose research for public evaluation based on an initial assessment of methodological strength, openness, clarity, relevance to global priorities, and the usefulness of further evaluation and public discussion. We sketch out these criteria here, and give some potential examples here.
If you have research—your own or others’—that you would like us to assess, please fill out this form. You can submit your own work here (or by contacting ). Authors of evaluated papers will be eligible for our Impactful Research Prizes (details).
We are looking for both feedback on and involvement in The Unjournal project. Feel free to reach out at .
View our data protection statement
19 Feb 2024. We are not currently hiring, but expect to do so in the future.
To indicate your potential interest in roles at The Unjournal, such as those described below, please fill out this quick survey form and link (or upload) your CV or webpage.
If you already filled out this form for a role that has changed titles, don’t worry. You will still be considered for relevant and related roles in the future.
If you add your name to this form, we may contact you to offer you the opportunity to do paid project work and paid work tasks.
Furthermore, if you are interested in conducting paid research evaluation for The Unjournal, or in joining our advisory board, please complete the form linked .
Feel free to contact contact@unjournal.org with any questions.
The Unjournal, a not-for-profit collective under the umbrella and fiscal sponsorship of the , is an equal-opportunity employer and contractor. We are committed to creating an inclusive environment for all employees, volunteers, and contractors. We do not discriminate on the basis of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetic information, disability, age, or veteran status.
See our data protection statement .
In addition to the jobs and paid projects listed here, we are expanding our management team, advisory board, field specialist team pool, and evaluator pool. Most of these roles involve compensation/honorariums. See
The Unjournal call for participants and research
See for an overview of The Unjournal.
In brief (TLDR): If you are interested in being on The Unjournal's management committee, advisory board, or evaluator pool, please fill out this form (about 3–5 min).
If you have research you would like us to assess, please fill out this form. You can also submit your own work here, or by contacting .
Please note that while data submitted through the above forms may be shared internally within our Management Team, it will not be publicly disclosed. Data protection statement linked .
I am , founder and co-director of The Unjournal. We have an open call for committee members, board members, reviewers, and suggestions for relevant work for The Unjournal to evaluate.
The Unjournal is building a system for credible, public, journal-independent feedback and evaluation of research.
Identify, invite, or select contributions of relevant research that is publicly hosted on any open platform or archive in any format.
We maintain an open call for participants for four different roles:
(involving honorariums for time spent)
members (no time commitment)
(who will often also be on the Advisory Board)
You can express your interest (and enter our database) .
We're interested in researchers and research-users who want to help us prioritize work for evaluation, and manage evaluations, considering
... research in any social science/economics/policy/impact-assessment area, and
... research with the potential to be among the most globally-impactful.
We will reach out to evaluators (a.k.a. "reviewers") on a case-by-case basis, as appropriate for each paper or project being assessed. This depends on expertise, the evaluator's interest, and the absence of conflicts of interest.
Time commitment: Determined case by case. For each evaluation, see for the suggested amount of time to spend.
Compensation: We pay a minimum of $200 (updated Aug. 2024) for a prompt and complete evaluation, and $400 for experienced evaluators. We offer additional prizes and incentives, and are committed to an average compensation of at least $450 per evaluator.
Who we are looking for: We are putting together a list of people interested in being an evaluator and doing paid referee work for The Unjournal. We generally prioritize the pool of evaluators who signed up for our database before reaching out more widely.
Interested? Please fill out (about 3–5 min, same form for all roles or involvement).
We are looking for high-quality, globally pivotal research projects to evaluate, particularly those embodying open science practices and innovative formats. We are putting out a call for relevant research. Please suggest research . (We offer bounties and prizes for useful suggestions – see note.) For details of what we are looking for, and some potential examples, see the accompanying links.
You can also .
We provide a separate form for research suggestions. We may follow up with you individually.
If you are interested in discussing any of the above in person, please email us () to arrange a conversation.
We invite you to fill in to leave your contact information and to outline which parts of the project you may be interested in.
Note: This is under continual refinement; see our for more details.
These are principally not research roles, but familiarity with research and research environments will be helpful, and there is room for research involvement depending on the candidate’s interest, background, and skills/aptitudes.
There is currently one such role:
Communications, Writing, and Public Relations Specialist (As of November 2023, still seeking freelancers)
Further note: We previously considered a “Management support and administrative professional” role. We are not planning to hire for this role currently. Those who indicated interest will be considered for other roles.
As of November 2023, we are soliciting applications from freelancers with skills in particular areas.
The Unjournal is looking to work with a proficient writer who is adept at communicating with academics and researchers (particularly in economics, social science, and policy), journalists, policymakers, and philanthropists. As we are in our early stages, this is a generalist role. We need someone to help us explain what The Unjournal does and why, make our processes easy to understand, and ensure our outputs (evaluations and research synthesis) are accessible and useful to non-specialists. We seek someone who values honesty and accuracy in communication; someone who has a talent for simplifying complex ideas and presenting them in a clear and engaging way.
The work is likely to include:
Promotion and general explanation
Spread the word about The Unjournal, our approach, our processes, and our progress in press releases and short pieces, as well as high-value emails and explanations for a range of audiences
Make the case for The Unjournal to potentially skeptical audiences in academia/research, policy, philanthropy, effective altruism, and beyond
Most relevant skills, aptitudes, interests, experience, and background knowledge:
Understanding of The Unjournal project
Strong written communications skills across a relevant range of contexts, styles, tones, and platforms (journalistic, technical, academic, informal, etc.)
Familiarity with academia and research processes and institutions
Further desirable skills and experience:
Academic/research background in areas related to The Unjournal’s work
Operations, administrative, and project management experience
Experience working in a small nonprofit institution
Proposed terms:
Project-based contract "freelance" work
$30–$55/hour USD (TBD, depending on experience and capabilities). Hours for each project include some onboarding and upskilling time.
Our current budget can cover roughly 200 hours of this project work. We hope to increase and extend this (depending on our future funding and expenses).
Nov. 2023 update: We have paused this process to focus on our field specialist positions. We hope to return to hiring researchers to implement these projects soon.
We are planning to hire 3–7 researchers for a one-off paid project.
There are two opportunities: Contracted Research (CR) and Independent Projects (IP).
Project Outline
What specific research themes in economics, policy, and social science are most important for global priorities?
What projects and papers are most in need of further in-depth public evaluation, attention, and scrutiny?
Where does "Unjournal-style evaluation" have the potential to be one of the most impactful uses of time and money? By impactful, we mean in terms of some global conception of value (e.g., the well-being of living things, the survival of human values, etc.).
This is an initiative that aims to identify, summarize, and conduct an in-depth evaluation of the most impactful themes in economics, policy, and social science to answer the above questions. Through a systematic review of selected papers and potential follow-up with authors and evaluators, this project will enhance the visibility, understanding, and scrutiny of high-value research, fostering both rigorous and impactful scholarship.
Contracted Research (CR) This is the main opportunity: a unique chance to contribute to the identification and in-depth evaluation of impactful research themes in economics, policy, and social science. We’re looking for researchers and research users who can commit to a one-off 15–20 hours. CR candidates will:
Summarize a research area or theme, its status, and why it may be relevant to global priorities (~4 hours).
We are looking for fairly narrow themes. Examples might include:
The impact of mental health therapy on well-being in low-income countries.
We will compensate you for your time at a rate reflecting your experience and skills ($25–$65/hour). This work also has the potential to serve as a “work sample” for future roles at The Unjournal, as it is highly representative of what our and are commissioned to do.
We are likely to follow up on your evaluation suggestions. We also may incorporate your writing into our web page and public posts; you can choose whether you want to be publicly acknowledged or remain anonymous.
Independent Projects (IP)
We are also inviting applications to do similar work as an “Independent Project” (IP), a parallel opportunity designed for those eager to engage but not interested in working under a contract, or not meeting some of the specific criteria for the Contracted Research role. This involves similar work to above.
If you are accepted to do an IP, we will offer some mentoring and feedback, and we will offer prizes/bounties for particularly strong IP work. We will also consider working with professors and academic supervisors on these IP projects, as part of university assignments and dissertations.
You can apply to the CR and IP positions together; we will automatically consider you for both.
Get Involved!
If you are interested in involvement in either the CR or IP side of this project, please let us know .
We are again considering applications for the 'evaluation metrics/meta-science' role. We will also consider all applicants for our positions, and for roles that may come up in the future.
The potential roles discussed below combine research-linked work with operations and administrative responsibilities. Overall, this may include some combination of:
Disambiguation: The Unjournal focuses on commissioning expert evaluations, guided by an ‘evaluation manager’ and compensating people for their work. (See the outline of our main process). We plan to continue to focus on that mode. Below we sketch an additional parallel but separate approach.
Note on other versions of this content.
Keeping track of our progress and keeping everyone in the loop
Help produce and manage our external (and some internal) long-form communications
Help produce and refine explanations, arguments, and responses
Help provide reports to relevant stakeholders and communities
Making our rules and processes clear to the people we work with
Explain our procedures and policies for research submission, evaluation, and synthesis; make our systems easy to understand
Help us build flexible communications templates for working with research evaluators, authors, and others
Other communications, presentations, and dissemination
Write and organize content for grants applications, partnership requests, advertising, hiring, and more
Potentially: compose non-technical write-ups of Unjournal evaluation synthesis content (in line with interest and ability)
Familiarity with current conversations and research on global priorities within government and policy circles, effective altruism, and relevant academic fields
Willingness to learn and use IT, project management, data management, web design, and text-parsing tools (such as those mentioned below), with the aid of GPT/AI chat
Experience with promotion and PR campaigns and working with journalists and bloggers
This role is contract-based and supports remote and international applicants. We can contract people living in most countries, but we cannot serve as an immigration sponsor.
The impact of cage-free egg regulation on animal welfare.
Public attitudes towards AI safety regulation.
Identify a selection of papers in this area that might be high-value for UJ evaluation (~3 hours).
Choose at least four of these from among NBER/"top-10 working paper" series (or from work submitted to the UJ – we can share – or from work where the author has expressed interest to you).
For a single paper, or a small set of these papers (or projects) (~6 hours):
Read the paper fairly carefully and summarize it, explaining why it is particularly relevant.
Discuss one or more aspects of the paper that need further scrutiny or evaluation.
Identify 3 possible evaluators, and explain why they might be particularly relevant to evaluate this work. (Give a few sentences we could use in an email to these evaluators).
Possible follow-up task: email and correspond with the authors and evaluators (~3 hours).
Pay evaluators to give careful feedback on this work, with prizes and incentives for strong evaluation work.
Elicit quantifiable and comparable metrics of research quality as credible measures of value (see: evaluator guidelines). Synthesize the results of these evaluations in useful ways.
Publicly post and link all reviews of the work. Award financial prizes for the work judged strongest.
Allow evaluators to choose if they wish to remain anonymous or to "sign" their reviews.
Aim to be as transparent as possible in these processes.
A pool of Evaluators (who will be paid for their time and their work; we also draw evaluators from outside this pool)
- Biological & pandemic risk
- AI governance, AI safety
- Animal welfare, markets for animal products
- Long-term trends, demography
- Macroeconomics/growth/(public) finance
- Quantitative political science (voting, lobbying, etc.)
- Social impact of new technology (including AI)
Interacting with authors, recruiting, and overseeing evaluators
Synthesizing and disseminating the results of evaluations and ratings
Aggregating and benchmarking these results
Helping build and improve our tools, incentives, and processes
Curating outputs relevant to other researchers and policymakers
Doing "meta-science" work
See also our field specialist team pool and evaluator pool. Most of these roles involve compensation/honorariums. See Advisory/team roles (research, management)
Potential focus areas include global health; development economics; markets for products with large externalities (particularly animal agriculture); attitudes and behaviors (altruism, moral circles, animal consumption, effectiveness, political attitudes, etc.); economic and quantitative analysis of catastrophic risks; the economics of AI safety and governance; aggregation of expert forecasts and opinion; international conflict, cooperation, and governance; etc.
Work (likely to include a combination of):
Identify and characterize research (in the area of focus) that is most relevant for The Unjournal to evaluate
Summarize the importance of this work, its relevance to global priorities and connections to other research, and its potential limitations (needing evaluation)
Help build and organize the pool of evaluators in this area
Assist evaluation managers or serve as evaluation manager (with additional compensation) for relevant papers and projects
Synthesize and communicate the progress of research in this area and insights coming from Unjournal evaluations and author responses; for technical, academic, policy, and intelligent lay audiences
Participate in Unjournal meetings and help inform strategic direction
Liaise and communicate with relevant researchers and policymakers
Help identify and evaluate prize winners
Meta-research and direct quantitative meta-analysis (see "Project" below)
Desirable skills and experience:
Note: No single skill or experience is necessary independently. If in doubt, we encourage you to express your interest or apply.
Understanding of the relevant literature, methodology, and policy implications (to an upper-postgraduate level) in this or a related field
Research and policy background and experience
Strong communication skills
Proposed terms:
300 hours (flexible, extendable) at $25–$55/hour USD (TBD, depending on experience and skills)
This is a contract role, open to remote and international applicants. However, the ability to attend approximately weekly meetings and check-ins at times compatible with the New York timezone is essential.
Length and timing:
Flexible; to be specified and agreed with the contractor.
We are likely to hire one role starting in Summer 2023, and another starting in Autumn 2023.
Extensions, growth, and promotions are possible, depending on performance, fit, and our future funding.
Express your interest here. (Nov. 2023: Note, we cannot guarantee that we will be hiring for this role, because of changes in our approach.)
The Unjournal is seeking academics, researchers, and students to submit structured evaluations of the most impactful research emerging in the social sciences. Strong evaluations will be posted or linked on our PubPub community, offering readers a perspective on the implications, strengths, and limitations of the research. These evaluations can be submitted using this form for academic-targeted research or this form for more applied work; evaluators can publish their name or maintain anonymity; we also welcome collaborative evaluation work. We will facilitate, promote, and encourage these evaluations in several ways, described below.
We are particularly looking for people with research training, experience, and expertise in quantitative social science and statistics, including cost-benefit modeling and impact evaluation. This could include professors, other academic faculty, postdocs, researchers outside of academia, quantitative consultants and modelers, PhD students, and students aiming towards PhD-level work (pre-docs, research MSc students, etc.). But anyone is welcome to give this a try — when in doubt, please go for it.
We are also happy to support collaborations and group evaluations. There is a good track record for this — see “What is a PREreview Live Review?”, ASAPBio’s Crowd preprint review, I4replication.org, and repliCATS for examples in this vein. We may also host live events and/or facilitate asynchronous collaboration on evaluations.
Instructors/PhD, MRes, Predoc programs: We are also keen to work with students and professors to integrate ‘independent evaluation assignments’ (aka ‘learn to do peer reviews’) into research training.
Your work will support The Unjournal’s core mission — improving impactful research through journal-independent public evaluation. In addition, you’ll help research users (policymakers, funders, NGOs, fellow researchers) by providing high quality detailed evaluations that rate and discuss the strengths, limitations, and implications of research.
Doing an independent evaluation can also help you. We aim to provide feedback to help you become a better researcher and reviewer. We’ll also give prizes for the strongest evaluations. Lastly, writing evaluations will help you build a portfolio with The Unjournal, making it more likely we will commission you for paid evaluation work in the future.
We focus on rigorous, globally-impactful research in quantitative social science and policy-relevant research. (See “What specific areas do we cover?” for details.) We’re especially eager to receive independent evaluations of:
Research we publicly prioritize: see our public list of research we've prioritized or evaluated. (Also...)
Research we previously evaluated (see public list, as well as https://unjournal.pubpub.org/ )
Work that other people and organizations suggest as having high potential for impact/value of information (also see)
You can also suggest research yourself here and then do an independent evaluation of it.
We’re looking for careful methodological/technical evaluations that focus on research credibility, impact, and usefulness. We want evaluators to dig into the weeds, particularly in areas where they have aptitude and expertise. See our guidelines.
The Unjournal’s structured evaluation forms: We encourage evaluators to do these using either:
Our Academic (main) stream form: If you are evaluating research aimed at an academic journal or
Our ‘Applied stream’ form: If you are evaluating research that is probably not aimed at an academic journal. This may include somewhat less technical work, such as reports from policy organizations and think tanks, or impact assessments and cost-benefit analyses.
Other public evaluation platforms: We are also open to engaging with evaluations done on existing public evaluation platforms such as PREreview.org. Evaluators: If you prefer to use another platform, please let us know about your evaluation using one of the forms above. If you like, you can leave most of our fields blank, and provide a link to your evaluation on the other public platform.
Academic (~PhD) assignments and projects: We are also looking to build ties with research-intensive university programs; we can help you structure academic assignments and provide external reinforcement and feedback. Professors, instructors, and PhD students: please contact us (contact@unjournal.org).
We will encourage all these independent evaluations to be publicly hosted, and will share links to them. We will further promote the strongest independent evaluations, potentially re-hosting them on our platforms (such as unjournal.pubpub.org).
However, when we host or link these, we will keep them clearly separated and signposted as distinct from the commissioned evaluations; independent evaluations will not be considered official, and their ratings won’t be included in our ‘main data’ (see dashboard here; see discussion).
Bounties: We will offer prizes for the ‘most valuable independent evaluations’.
As a start, after the first eight quality submissions (or by Jan. 1 2025, whichever comes later), we will award a prize of $500 to the most valuable evaluation.
Further details tbd. As a reference...
All evaluation submissions will be eligible for these prizes and “grandfathered in” to any prizes announced later. We will announce and promote the prize winners (unless they opt for anonymity).
Evaluator pool: People who submit evaluations can elect to join our evaluator pool. We will consider and (time-permitting) internally rate these evaluations. People who do the strongest evaluations in our focal areas are likely to be commissioned as paid evaluators for The Unjournal.
We’re also moving towards a two-tiered base compensation for evaluations. We will offer a higher rate to people who can demonstrate previous strong review/evaluation work. These independent evaluations will count towards this ‘portfolio’.
Our PubPub page provides examples of strong work, including the prize-winning evaluations.
We will curate guidelines and learning materials from relevant fields and from applied work and impact-evaluation. For a start, see "Conventional guidelines for referee reports" in our knowledge base. We plan to build and curate more of this...
We are reaching out to PhD programs and pre-PhD research-focused programs. Some curricula already involve “mock referee report” assignments. We hope professors will encourage their students to do these through our platform. In return, we’ll offer the incentives and promotion mentioned above, as well as resources, guidance, and some further feedback.
Crowdsourced feedback can add value in itself; encouraging this can enable some public evaluation and discussion of work that The Unjournal doesn’t have the bandwidth to cover.
Improving our evaluator pool and evaluation standards in general.
Students and ECRs can practice and (if possible) get feedback on independent evaluations
They can demonstrate this ability publicly, enabling us to recruit and commission the strongest evaluators
Examples will help us build guidelines, resources, and insights into ‘what makes an evaluation useful’.
This provides us opportunities to engage with academia, especially in PhD programs and research-focused instruction.
The Unjournal commissions public evaluations of impactful research in quantitative social sciences fields. We are an alternative and a supplement to traditional academic peer-reviewed journals; separating evaluation from journals unlocks a range of benefits. We ask expert evaluators to write detailed, constructive, critical reports. We also solicit a set of structured ratings focused on research credibility, methodology, careful and calibrated presentation of evidence, reasoning transparency, replicability, relevance to global priorities, and usefulness for practitioners (including funders, project directors, and policymakers who rely on this research).[2] While we have mainly targeted impactful research from academia, our ‘applied stream’ covers impactful work that uses formal quantitative methods but is not aimed at academic journals. So far, we’ve commissioned about 50 evaluations of 24 papers, and published these evaluation packages on our PubPub community, linked to academic search engines and bibliometrics.
Ability to work independently, as well as to build coalitions and cooperation
Statistics, data science, and "aggregation of expert beliefs"
As of December 2023, the prizes below have been chosen and will soon be announced. We are also scheduling an event linked to this prize. However, we are preparing for even larger author and evaluator prizes for our next phase. Submit your research to The Unjournal or serve as an evaluator to be eligible for future prizes (details to be announced).
Submit your work to be eligible for our “Unjournal: Impactful Research Prize” and a range of other benefits including the opportunity for credible public evaluation and feedback.
First-prize winners will be awarded $2500, and the runners-up will receive $1000.
Note: these are the minimum amounts; we will increase these if funding permits.
Prize winners will have the opportunity (but not the obligation) to present their work at an online seminar and prize ceremony co-hosted by The Unjournal, , and
To be eligible for the prize, submit a link to your work for public evaluation.
Please choose “new submission” and “Submit a URL instead.”
The Unjournal, with funding from the and the , organizes and funds public, journal-independent feedback and evaluation. We focus on research that is highly relevant to global priorities, especially in economics, social science, and impact evaluation, and aim to expand this widely. We encourage better research by making it easier for researchers to get feedback and credible ratings on their work.
We aim to publicly evaluate 15 papers (or projects) within our pilot year. This award will honor researchers doing robust, credible, transparent work with global impact. We especially encourage the submission of research in "open" formats such as hosted dynamic documents (Quarto, R-markdown, Jupyter notebooks, etc.).
The research will be chosen by our management team for public evaluation by 2–3 carefully selected, paid reviewers based on an initial assessment of a paper's methodological strength, openness, clarity, relevance to , and the usefulness of further evaluation and public discussion. We sketch out .
All evaluations, including quantitative ratings, will be made public by default; however, we will consider "embargoes" on this for researchers with sensitive career concerns (the linked form asks about this). Note that submitting your work to The Unjournal does not imply "publishing" it: you can submit it to any journal before, during, or after this process.
If we choose not to send your work out to reviewers, we will try to at least offer some brief private feedback (please on this).
All work evaluated by The Unjournal will be eligible for the prize. Engagement with The Unjournal, including responding to evaluator comments, will be a factor in determining the prize winners. We also have a slight preference for giving at least one of the awards to an early-career researcher, but this need not be determinative.
Our management team and advisory board will vote on the prize winners in light of the evaluations, with possible consultation of further external expertise.
Deadline: Extended until 5 December (to ensure eligibility).
Note: In a subsection below, , we outline the basic requirements for submissions to The Unjournal.
The prize winners for The Unjournal's Impactful Research Prize were selected through a multi-step, collaborative process involving both the management team and the advisory board. The selection was guided by several criteria, including the quality and credibility of the research, its potential for real-world impact, and the authors' engagement with The Unjournal's evaluation process.
Initial Evaluation: All papers that were evaluated by The Unjournal were eligible for the prize. The discussion, evaluations, and ratings provided by external evaluators played a significant role in the initial shortlisting.
Management and Advisory Board Input: Members of the management committee and advisory board were encouraged to write brief statements about papers they found particularly prize-worthy.
Meeting and Consensus: A "prize committee" meeting was held with four volunteers from the management committee to discuss the shortlisted papers and reach a consensus. The committee considered both the papers and the content of the evaluations.
Point Voting: The above shortlist and the notes from the accompanying discussion were shared with all management committee and advisory board members. Everyone in this larger group was invited to allocate up to 100 points among the shortlisted papers (and asked to allocate fewer points if they were less familiar with the papers and evaluations).
Special Considerations: We decided that at least one of the winners had to be a paper submitted by the authors or one where the authors substantially engaged with The Unjournal's processes. However, this constraint did not prove binding. Early-career researchers were given a slight advantage in our consideration.
Final Selection: The first and second prizes were given to the papers with the first- and second-most points, respectively.
This comprehensive approach aimed to ensure that the prize winners were selected in a manner that was rigorous, fair, and transparent, reflecting the values and goals of The Unjournal.
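To make the point-voting step concrete, here is a minimal illustrative sketch of how such a tally could work. All voter and paper names, and the point totals, are hypothetical; this is not the actual tool or data The Unjournal used.

```python
from collections import defaultdict

# Hypothetical ballots: each voter allocates up to 100 points among papers.
# A voter less familiar with the papers may allocate fewer total points.
ballots = {
    "voter_a": {"paper_1": 60, "paper_2": 30, "paper_3": 10},
    "voter_b": {"paper_2": 50, "paper_3": 25},  # allocated only 75 points
    "voter_c": {"paper_1": 40, "paper_2": 40, "paper_3": 20},
}

def tally(ballots):
    """Sum each paper's points across all ballots and rank papers by total."""
    totals = defaultdict(int)
    for voter, allocation in ballots.items():
        # Enforce the 100-point cap per voter.
        assert sum(allocation.values()) <= 100, f"{voter} exceeded 100 points"
        for paper, points in allocation.items():
            totals[paper] += points
    # Rank papers from most to fewest points.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ranking = tally(ballots)
# First and second prizes go to the top two papers by total points.
first_prize, second_prize = ranking[0][0], ranking[1][0]
```

In this sketch, "paper_2" (120 points) takes first prize and "paper_1" (100 points) takes second; a real process would layer the special considerations above on top of the raw point totals.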


