LLPW Request for Proposals: Supplementary Information

Completeness Check & Internal Review Criteria

A completeness check of each submission will be conducted by TWCF staff/advisors. All entries that pass the completeness check will move forward to the next round. Completeness check criteria will be:

  • All questions are answered adequately.
  • Project Description section completed with appendices attached (if applicable).
  • Budget section completed with correct Budget Template attached.
  • At least one entry completed for the Output section.
  • At least one entry completed for the Outcomes section.
  • At least one entry completed for the Team Members section with CV attached as required.
  • The entire application contains all related entries and supporting materials.
  • Proposal has been submitted and confirmed.
  • Applicant is a part of a research institution or university.
  • The project description is anonymized.

Internal Review

Entries passing the completeness check will go through a round of internal review conducted by TWCF staff/advisors. All entries that pass the internal review will be sent to external reviewers. Internal review criteria will be:

  • The project leader and the project team have the subject-matter knowledge and practical expertise required to carry out the project effectively.
  • The host institution is appropriate and able to provide the necessary support.
  • The budget is feasible for the work proposed in the project description.
  • The budget is compliant with our policies.
  • The project aligns well with Sir John Templeton’s Donor Intent.
  • The team demonstrates willingness to engage in open and collaborative research practices.
  • Entries include Outputs with sharing plans that align with Appendix 2 - Open Research Requirements and Recommendations for this RFP.
  • Entries respond to the Challenge Statement and Criteria appropriately.

External Review Criteria

Reviewers will score and comment only on the Project Description, Outputs, and Outcomes sections, using the criteria noted below; they will not review or consider other typical criteria such as personnel, environment/facilities, or budget.

Peer Review: All applications will be evaluated according to the criteria below.

Potential for Impact

  • What impact can reasonably be expected from the project, and how effective and beneficial is this impact likely to be? 
  • What potential is there for increasing scholarly or public understanding of the subject area of the project, and for influencing opinion leaders?

Originality
  • How does the proposed project relate to the current state of the field, including other work that has been conducted or planned? 
  • Is it asking new questions as well as seeking new answers? 
  • What capacity does it have to break new ground?

Study Design and Objectives 

  • Are the project’s assumptions justified by evidence (including citations of scholarly work if applicable), and is there a persuasive rationale for why the project is needed?
  • How well are the hypotheses or objectives, aims, experimental design, methods and analyses developed and integrated into the project?
  • Does the proposal describe designs, methods, procedures, and/or activities that are carefully planned and appropriate for the project’s objectives?
  • Are plans for data collection, analysis, storage, and dissemination adequately addressed?
  • Does the proposal fulfill the Open Research requirements for all valuable research outputs including preregistration documents, peer-reviewed publications and preprints, datasets, code, software, protocols, and research materials? Are these plans realistic and effective?

Overall Quality of the Proposal
Taking the proposal as a whole, what is your judgment of its quality and potential? This could include the following considerations: How enthusiastic are you about this project? Is there anything about the project that you would like to highlight that may not be captured in the criteria above? 

Scoring Guidelines
Each reviewer will assign a separate score for each of these review criteria using the scale below:

Numerical Score

  • 1: Low quality, even if substantially revised.
  • 2: Low quality, but with substantial revisions could be high quality.
  • 3: Moderate quality, and with minor revisions could be high quality.
  • 4: High quality, with or without the suggestions mentioned.

External Review Pool Information

External reviews will be managed by the American Institute of Biological Sciences (AIBS). We have instructed AIBS to source external reviewers with relevant expertise. We aim to source reviewers globally with a geographic distribution that reflects that of the applications being reviewed.

Open Research Requirements and Recommendations for this RFP

This funding call requires project teams to:

  • Preregister your studies. Preregister all empirical elements, including both quantitative and qualitative/mixed-methods work. We encourage grantees to preregister their studies using the Open Science Framework platform (OSF) or a similar public registry. Preregistrations should be shared under a Creative Commons Attribution (CC BY) license to enable sharing and reuse.
  • Publish Open Access. All journal publications must be published in Open Access journals, on Open Access platforms, or made immediately available through Open Access repositories without embargo. They must be published under a Creative Commons Attribution (CC BY) license.
  • Share Datasets. Submit datasets to a FAIR (Findable, Accessible, Interoperable, and Reusable) data repository that 1) assigns a globally unique and persistent identifier (such as a DOI or accession number); and 2) indexes the dataset, metadata, and persistent identifier in a reliable, searchable resource. All relevant dataset files must be described by adequate metadata. Examples of data repositories include Zenodo, Dryad, and subject-specific data repositories. Assign a CC BY or CC0 license. Note that sharing protected health information (PHI) or electronic protected health information (ePHI) is not encouraged unless the data can be fully anonymized.
  • Share Software and Applications. Maintain original code and software developed for use in TWCF-funded research in a reliable code repository, such as GitHub. Reliable code repositories are services that are supported by the coding community and allow for easy export of code. TWCF encourages sharing the code used in data analyses reported in the preprint or research article.
  • Share Protocols. Share protocols used in original research in an appropriate repository, such as Zenodo, with a persistent identifier and a CC BY or CC0 license.

We encourage you to:

  • Post Preprints. Post a preprint with a CC BY license, before or at the time of submission of an article to a journal, on a suitable preprint server that 1) does not put up a paywall or other barrier to access; 2) assigns a digital object identifier (DOI) to each preprint; and 3) allows the assignment of a CC BY license.
  • Share your Lab and Field Resources. Register lab and field resources used in original research work, such as cell lines, antibodies, novel instruments, and other tangible items, with an appropriate research resource repository. Reliable repositories for resources include Addgene, animal model databases, and other services found through SciCrunch that assign a Research Resource Identifier (RRID) and provide others with information on how to find or use the resource if possible.

Process Overview

Application Phase

  • Applications open.
  • Workshop for applicants (to be recorded).
  • Workshop for last-minute Q&A (to be recorded).

Selection Phase

  • Internal Review
    • Completeness check.
    • Initial quality screen. 
    • Down-selection of proposals.
  • External Review
    • AIBS will source reviewers.
    • Blinded external review of the Project Description, Outputs, and Outcomes sections.
    • Conferral process between reviewers.
    • Weighted scores and comments will be returned to TWCF.
  • Partial Lottery
    • The lowest-scoring proposals, as determined by the distribution of scores, will be declined.
    • From the remaining proposals, a random selection will be made for approval consideration.
    • Proposals not selected in the lottery will be screened for any outstanding opportunities and may also be considered for approval; all others will be declined.
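As an illustrative sketch only, the partial-lottery steps above could be modeled as follows. The function name, parameters (`decline_fraction`, `lottery_picks`), and data shapes are assumptions for illustration, not TWCF policy.

```python
import random

def partial_lottery(scored_proposals, decline_fraction=0.5, lottery_picks=10, seed=None):
    """Sketch of a partial lottery over (proposal_id, weighted_score) pairs.

    Returns (selected, screened_pool, declined):
      selected      -- randomly chosen for approval consideration
      screened_pool -- remaining eligible proposals, screened for
                       outstanding opportunities
      declined      -- lowest-scoring proposals, declined outright
    """
    rng = random.Random(seed)
    ranked = sorted(scored_proposals, key=lambda p: p[1], reverse=True)

    # Step 1: decline the lowest-scoring proposals based on the score distribution.
    cutoff = int(len(ranked) * (1 - decline_fraction))
    eligible, declined = ranked[:cutoff], ranked[cutoff:]

    # Step 2: randomly select from the eligible pool for approval consideration.
    selected = rng.sample(eligible, min(lottery_picks, len(eligible)))

    # Step 3: the remainder is screened for outstanding opportunities;
    # anything not picked up there would then be declined.
    screened_pool = [p for p in eligible if p not in selected]
    return selected, screened_pool, declined
```

The sketch separates the deterministic cut (step 1) from the random draw (step 2), which is what distinguishes a partial lottery from a pure lottery.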

Approval Phase

  • This phase is not competitive, and only for proposals to be considered for approval. All proposals that make it to this stage will have a budget allocated and will be funded subject to approval by the foundation’s trustees, satisfactory compliance review, and successful negotiation of a grant agreement.
  • Applicants will receive an opportunity to respond to reviewer feedback.
  • Proposals will be subject to compliance review.
  • The final decisions for approval will be made by a committee composed of trustees of TWCF.
  • After approval, projects must be preregistered in the Open Science Framework before the grant agreement is signed.


Please download our Application Guidance Document (PDF) for full instructions.