This notice has expired. Check the NIH Guide for active opportunities and notices.

Department of Health and Human Services
Part 1. Overview Information
Participating Organization(s)

National Institutes of Health (NIH)

Components of Participating Organizations

National Institute of Mental Health (NIMH)

Funding Opportunity Title

Pragmatic Strategies for Assessing Psychotherapy Quality in Practice (R01)

Activity Code

R01 Research Project Grant

Announcement Type

New

Related Notices

  • NOT-OD-16-004 - NIH & AHRQ Announce Upcoming Changes to Policies, Instructions and Forms for 2016 Grant Applications (November 18, 2015)

Funding Opportunity Announcement (FOA) Number

RFA-MH-17-500

Companion Funding Opportunity

None

Catalog of Federal Domestic Assistance (CFDA) Number(s)

93.242

Funding Opportunity Purpose

This funding opportunity announcement (FOA) supports the development and testing of pragmatic strategies for assessing the quality of the delivery of psychosocial interventions (defined here as provider-delivered behavioral, cognitive, interpersonal or other psychosocial/psychotherapeutic approaches) for the treatment or prevention of mental health disorders. Specifically, the goal is to develop assessment tools and strategies that are both psychometrically rigorous (i.e., reliable, valid and strongly predictive of therapy outcomes and associated with other gold standard metrics of quality) and pragmatic (i.e., feasible for use in community practice settings and useful for advancing efforts at training, supervision, quality monitoring, and/or quality improvement).

Key Dates
Posted Date

February 26, 2016

Open Date (Earliest Submission Date)

May 8, 2016

Letter of Intent Due Date(s)

May 8, 2016

Application Due Date(s)

June 8, 2016, by 5:00 PM local time of applicant organization. All types of non-AIDS applications allowed for this funding opportunity announcement are due on this date.

Applicants are encouraged to apply early to allow adequate time to make any corrections to errors found in the application during the submission process by the due date.

AIDS Application Due Date(s)

Not Applicable

Scientific Merit Review

October 2016

Advisory Council Review

January 2017

Earliest Start Date

April 2017

Expiration Date

June 9, 2016

Due Dates for E.O. 12372

Not Applicable

Required Application Instructions

It is critical that applicants follow the instructions in the SF424 (R&R) Application Guide, except where instructed to do otherwise (in this FOA or in a Notice from the NIH Guide for Grants and Contracts). Conformance to all requirements (both in the Application Guide and the FOA) is required and strictly enforced. Applicants must read and follow all application instructions in the Application Guide as well as any program-specific instructions noted in Section IV. When the program-specific instructions deviate from those in the Application Guide, follow the program-specific instructions. Applications that do not comply with these instructions may be delayed or not accepted for review.

Table of Contents

Part 1. Overview Information
Part 2. Full Text of the Announcement

Section I. Funding Opportunity Description
Section II. Award Information
Section III. Eligibility Information
Section IV. Application and Submission Information
Section V. Application Review Information
Section VI. Award Administration Information
Section VII. Agency Contacts
Section VIII. Other Information

Part 2. Full Text of Announcement
Section I. Funding Opportunity Description

This funding opportunity announcement (FOA) supports the development and testing of pragmatic strategies for assessing the quality of the delivery of psychosocial interventions (defined here as provider-delivered behavioral, cognitive, interpersonal or other psychosocial/psychotherapeutic approaches) for the treatment or prevention of mental health disorders. Specifically, this FOA supports (1) the initial development of pragmatic tools and strategies to assess the quality of delivery of psychosocial interventions and (2) psychometric testing of the assessment strategy to examine the feasibility, reliability, validity, and utility of the approach for prospectively assessing the quality of psychosocial intervention delivery in a practice setting. The goal is to support the development and testing of assessment tools and strategies that are both psychometrically rigorous (i.e., reliable, valid and strongly predictive of therapy outcomes and associated with other gold standard metrics of quality) and pragmatic (i.e., feasible for use in community practice settings and useful for advancing efforts at training, supervision, quality monitoring, and/or quality improvement).

Research Objectives

Much progress has been made in developing and testing efficacious psychosocial interventions for mental disorders among youth and adults. But as noted in the NIMH Director’s Blog (Aug 6, 2012), there is no easy way to assure that a behavioral intervention provided in practice is the same as the intervention tested in a research study. Likewise, a number of recent priority statements and initiatives highlight the need for more attention to the assessment of the quality of psychosocial interventions delivered in routine practice. The fourth Objective of the NIMH Strategic Plan, which focuses on strengthening the public health impact of NIMH-supported research, specifically identifies "developing valid and reliable measures of treatment quality and outcomes that can be applied at the person, clinic, system, and population levels" as a research priority (http://www.nimh.nih.gov/about/strategic-planning-reports/strategic-research-priorities/srp-objective-4/priorities-for-strategy-41.shtml). Moreover, a recently issued report from the Institute of Medicine, Psychosocial Interventions for Mental and Substance Use Disorders: A Framework for Establishing Evidence-Based Standards (http://iom.nationalacademies.org/Reports/2015/Psychosocial-Interventions-Mental-Substance-Abuse-Disorders.aspx), emphasized a fundamental, rate-limiting factor in ensuring the availability of high-quality psychosocial interventions: the lack of efficient, portable approaches for assessing psychotherapy quality and its concordance with evidence-based practices (EBPs). The Mental Health Parity Act and the Patient Protection and Affordable Care Act also compel attention to quality metrics and accountability.

Information about psychotherapy quality could be used by consumers, for selecting therapists; by therapists/supervisors, for training therapists to initial competence in the delivery of an EBP or for monitoring or improving quality; by regulators, who credential training programs or license providers; and/or by administrators and policy makers, including private and public purchasers, for reimbursement decisions. Efficient measures could also facilitate research, including effectiveness trials that examine how research-based approaches perform when they are transported to typical practice settings, implementation studies that test strategies for promoting the sustained use of evidence-based practices in community practice settings, and quality-improvement studies that explore strategies for introducing research-based strategies into routine care to improve service delivery and outcomes. Overall, there is a pressing need for reliable and valid assessment approaches that are feasible for use in practice.

Fidelity measures used in efficacy studies have limited utility for community practice, as they narrowly assess adherence to specific manualized therapies and typically involve labor-intensive coding of session content using rigorously trained independent observers. Strategies for use in community practice settings must balance assessment rigor (i.e., reliability and validity) with feasibility and efficiency, taking into account financial and workforce resources available in routine practice and the implications of disruptions to clinical and administrative routines. Accordingly, the prospect of assessing the delivery of therapy in routine practice necessitates negotiating various challenges and decision points, often with limited research evidence to inform these decisions. Relevant considerations include the following:

  • Focus of the assessment (What should be measured?). Depending on the intended use of the quality metric, tools to assess therapy delivery might focus on therapy quality for specific conditions (e.g., depression, ADHD) or on the delivery of psychotherapy in general. Assessments might focus on broad principles/practices (e.g., systematic monitoring of outcome, assignment of homework with review of between-session progress) or on the delivery of specific evidence-based practices (e.g., Cognitive Behavioral Therapy (CBT) or Interpersonal Psychotherapy (IPT) for depression) or specific strategies (e.g., exposure for anxiety); assessment might focus on adherence (i.e., whether or not prescribed elements are present and proscribed elements are avoided) or on competence.
  • Information source (Who or what is the best information source?). Various informants or information sources might be used to collect data on session content. The choice of information source involves a number of important considerations. For example, under what circumstances can supervisors, therapists themselves, or even clients or patients provide objective, valid reports regarding the content of therapy? Alternatively, can routinely collected information (electronic medical records) be used to index the quality of therapy?
  • Timing and frequency of sampling (When and how often should session content be measured?). Another consideration involves the timing and frequency of sampling that is necessary to obtain a valid and stable estimate of quality. A related issue concerns the timeframe for reporting (e.g., whether the assessment should focus on therapy provided in a specific session versus overall quality for the course of therapy). Sampling strategies depend on a number of factors, including the nature of the intervention and the unit of analysis or focus of the assessments. For example, in the delivery of many EBPs, the agenda and content of therapy sessions vary across the course of therapy, with specific elements introduced in a prescribed sequence.
  • Data collection methods (How should the data be obtained?). Depending on the focus of the assessment and the data source, assessment might involve reports from respondents (e.g., supervisors, therapists, clients/patients) or from medical records. Rapidly evolving technologies offer the prospect of more real-time data collection from respondents (e.g., experience sampling); real-time, automated sampling of session content and quality (e.g., video- or audio-recording with natural language processing to describe content and estimate quality); or automated extraction from medical records or other text materials (e.g., text mining with natural language processing analytics). A minimal illustration of automated transcript scoring appears after this list.
  • Operationalizing and scoring quality (What constitutes high quality care?). For purposes of this FOA, a critical index of quality is the degree to which the intervention is associated with good outcomes; quality might also be operationalized in terms of its concordance with research-supported strategies. Depending on the approach and the intended use of the measure, a related challenge involves metrics for gauging quality and cutoffs for identifying good versus substandard therapy quality.
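
As a purely illustrative sketch of the technology-assisted, automated scoring of session content mentioned above, the following Python example trains a simple bag-of-words text classifier to flag whether an expert-coded therapy element appears in a session transcript. The data file, column names, and modeling choices are hypothetical assumptions for illustration only, not methods endorsed or required by this FOA.

    # Illustrative only: bag-of-words scoring of session transcripts against
    # expert coding. "labeled_session_transcripts.csv" and its columns are
    # hypothetical placeholders.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Each row = one session transcript with an expert label
    # (1 = target element delivered, 0 = absent).
    sessions = pd.read_csv("labeled_session_transcripts.csv")

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2),
                        LogisticRegression(max_iter=1000))

    # Cross-validated agreement with expert coding (area under the ROC curve).
    auc = cross_val_score(clf, sessions["transcript"],
                          sessions["element_present"], cv=5, scoring="roc_auc")
    print(f"Mean cross-validated AUC vs. expert coding: {auc.mean():.2f}")

How well such automated scores track expert ratings, and at what cost in recording and transcription burden, would itself be an empirical question for studies under this FOA.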

Just as the assessment of quality might involve different approaches, the development and testing of quality metrics might take a variety of research approaches. For example, identification of an initial item pool for an assessment tool might start with existing research-based fidelity tools and archival data (e.g., fidelity items and fidelity rating data from efficacy/effectiveness trials) to identify crucial elements of therapy delivery and corresponding fidelity rating items that are most strongly associated with high fidelity and positive outcomes. Alternatively, machine-learning or other analytic approaches might be applied to extant trial data or medical records data in order to empirically (agnostically) identify practices associated with clinical benefit.
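
As one hedged illustration of how such an empirical item-reduction step might look in practice, the Python sketch below fits an L1-penalized regression of archival fidelity-item ratings on treatment outcomes to identify a reduced set of candidate items; the data file and column names are hypothetical assumptions.

    # Illustrative only: selecting fidelity items most associated with outcomes
    # in archival trial data. "archival_fidelity_ratings.csv" is a placeholder.
    import pandas as pd
    from sklearn.linear_model import LassoCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Each row = one treated case; item_* = fidelity ratings,
    # outcome = standardized symptom change at end of treatment.
    df = pd.read_csv("archival_fidelity_ratings.csv")
    X = df.filter(like="item_")
    y = df["outcome"]

    # The L1 penalty shrinks uninformative items to zero, leaving a smaller
    # candidate item pool that could seed a pragmatic measure.
    model = make_pipeline(StandardScaler(), LassoCV(cv=5))
    model.fit(X, y)

    coefs = pd.Series(model.named_steps["lassocv"].coef_, index=X.columns)
    print(coefs[coefs != 0].sort_values(key=abs, ascending=False))

Any items retained this way would still require replication in an independent sample before being treated as a stable measure.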

Refinement of the assessment tool might involve pilot testing and iterative feedback from experts. For example, feedback from therapy developers and intervention experts might be used to help ensure that the measure content meaningfully taps therapy quality. Input from other stakeholders, including therapists, supervisors, and administrators, might be used to determine whether the assessment strategy is acceptable and feasible and to refine the item content or wording (e.g., using cognitive interviewing with respondents to ensure the scale items are comprehensible and face valid).

Beyond the initial development of an item pool and pilot testing, a variety of research approaches might be used to test the psychometric properties and utility of the assessment strategy. This would include examining the reliability, validity, and utility of the approach, including prospectively testing whether the candidate measure of quality is associated with patient outcomes (and other established metrics of quality, e.g., gold standard fidelity ratings based on fidelity rating systems from efficacy research, as appropriate) in a community practice setting.
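
For example, two basic psychometric checks, inter-rater reliability and association with patient outcomes, might be computed as in the minimal Python sketch below; the data files, column names, and use of the pingouin package are assumptions for illustration rather than a prescribed analytic plan.

    # Illustrative only: inter-rater reliability (ICC) and predictive validity
    # of a candidate quality score. File and column names are placeholders.
    import pandas as pd
    import pingouin as pg
    from scipy.stats import pearsonr

    ratings = pd.read_csv("session_ratings_long.csv")   # session_id, rater, quality_score
    outcomes = pd.read_csv("patient_outcomes.csv")      # session_id, symptom_change

    # Inter-rater reliability: intraclass correlations across raters.
    icc = pg.intraclass_corr(data=ratings, targets="session_id",
                             raters="rater", ratings="quality_score")
    print(icc[["Type", "ICC", "CI95%"]])

    # Predictive validity: does the mean quality score track symptom change?
    merged = (ratings.groupby("session_id")["quality_score"].mean()
              .to_frame().join(outcomes.set_index("session_id")))
    r, p = pearsonr(merged["quality_score"], merged["symptom_change"])
    print(f"Quality-outcome correlation: r = {r:.2f}, p = {p:.3f}")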

This FOA is intended to support the development and testing of measures that: explicitly incorporate end-user perspectives, to ensure tools are feasible in community practice; are strongly associated with patient outcomes; and are useful for informing practice/policy (e.g., ratings would be useful for informing quality improvements or compelling to administrators or policy makers). As outlined above, development and testing of pragmatic measurement strategies involve specifying and justifying both the assessment approach (e.g., the rationale and empirical basis for the focus of the assessments, information source, session sampling, data collection strategy) and the research strategy that will be used to develop and refine the quality measure and prospectively test it in a community practice setting. The scope of work should also include plans to quantify the resources necessary to implement the assessment strategy (e.g., therapist time and burden). While a principal goal is to support the development and testing of specific pragmatic quality measures, this FOA encourages applications that are poised to contribute to a body of generalizable knowledge regarding best-practices for the assessment of quality of psychosocial intervention delivery (e.g., Under what conditions might therapists serve as reliable and valid informants for gauging therapy quality? In general, when in the course of therapy and how frequently should sessions be sampled to index quality?).

Given the focus on practice-relevant questions in community and practice settings, collaborations between academic researchers and clinical or community practice partners or networks are expected. When possible, studies should capitalize on existing infrastructure (e.g., existing research networks, electronic medical records, administrative databases, institutions with Clinical and Translational Science Awards) to increase the efficiency of participant recruitment (i.e., more rapid identification and enrollment) and to facilitate data collection.

Examples of studies that would be considered responsive to the goals of this FOA include but are not limited to:

  • Approaches that use existing fidelity/competence measures and extant data from efficacy/effectiveness trials to identify core items that can be feasibly administered in routine practice to capture critical therapist behaviors that are associated with therapy quality and positive outcomes.
  • Designs that parametrically vary the timing/frequency of assessments to inform understanding of how sessions should be sampled in order to obtain valid and stable estimates of therapy quality.
  • Studies that compare informant sources (e.g., supervisors, therapists, clients) to help identify conditions under which various respondents can provide unbiased, valid reports of session content and quality.
  • Approaches that employ the use of technology to facilitate real-time data collection and/or quantify session content and quality in an automated fashion (e.g., natural language processing).
  • Studies that develop algorithms to quantify therapy quality using readily available data from routine records (e.g., electronic medical records that capture patient outcomes/progress and provider behaviors).

Examples of studies that are not responsive to this FOA and will not be reviewed include the following:

  • Applications whose scope of work does not involve a focus on psychosocial interventions specifically for mental disorders.
  • Applications whose scope of work is limited to scale development without plans to prospectively assess the quality of psychotherapy in a relevant practice setting to examine reliability, validity, and utility of the approach.
  • Applications that propose to develop and test quality metrics in academic research settings as opposed to in community practice clinics/settings (e.g., studies in research clinics that involve research therapists or other features that are not representative of typical practice settings and substantially impact generalizability).
  • Applications whose scope of work does not include plans to systematically quantify the resources required to implement the assessment strategy or assess the feasibility and acceptability of the proposed quality assessment strategies.

Potential applicants are strongly encouraged to contact Scientific/Research contacts as far in advance as possible to discuss the match between potential research applications and current NIMH priorities.

See Section VIII. Other Information for award authorities and regulations.
Section II. Award Information
Funding Instrument

Grant: A support mechanism providing money, property, or both to an eligible entity to carry out an approved project or activity.

Application Types Allowed

New

The OER Glossary and the SF424 (R&R) Application Guide provide details on these application types.

Funds Available and Anticipated Number of Awards

NIMH intends to commit $2,500,000 in FY 2017 to fund 5-7 awards.

Award Budget

Application budgets are not limited but need to reflect the actual needs of the proposed project.

Award Project Period

The maximum project period is 5 years; however, applicants are strongly encouraged to limit the proposed project period to 3-4 years.

NIH grants policies as described in the NIH Grants Policy Statement will apply to the applications submitted and awards made in response to this FOA.

Section III. Eligibility Information
1. Eligible Applicants
Eligible Organizations

Higher Education Institutions

  • Public/State Controlled Institutions of Higher Education
  • Private Institutions of Higher Education

The following types of Higher Education Institutions are always encouraged to apply for NIH support as Public or Private Institutions of Higher Education:

    • Hispanic-serving Institutions
    • Historically Black Colleges and Universities (HBCUs)
    • Tribally Controlled Colleges and Universities (TCCUs)
    • Alaska Native and Native Hawaiian Serving Institutions
    • Asian American Native American Pacific Islander Serving Institutions (AANAPISIs)

Nonprofits Other Than Institutions of Higher Education

  • Nonprofits with 501(c)(3) IRS Status (Other than Institutions of Higher Education)
  • Nonprofits without 501(c)(3) IRS Status (Other than Institutions of Higher Education)

For-Profit Organizations

  • Small Businesses
  • For-Profit Organizations (Other than Small Businesses)

Governments

  • State Governments
  • County Governments
  • City or Township Governments
  • Special District Governments
  • Indian/Native American Tribal Governments (Federally Recognized)
  • Indian/Native American Tribal Governments (Other than Federally Recognized)
  • Eligible Agencies of the Federal Government
  • U.S. Territory or Possession

Other

  • Independent School Districts
  • Public Housing Authorities/Indian Housing Authorities
  • Native American Tribal Organizations (other than Federally recognized tribal governments)
  • Faith-based or Community-based Organizations
  • Regional Organizations
  • Non-domestic (non-U.S.) Entities (Foreign Institutions)
Foreign Institutions

Non-domestic (non-U.S.) Entities (Foreign Institutions) are eligible to apply.
Non-domestic (non-U.S.) components of U.S. Organizations are eligible to apply.
Foreign components, as defined in the NIH Grants Policy Statement, are allowed.

Required Registrations

Applicant Organizations

Applicant organizations must complete and maintain the following registrations as described in the SF 424 (R&R) Application Guide to be eligible to apply for or receive an award. All registrations must be completed prior to the application being submitted. Registration can take 6 weeks or more, so applicants should begin the registration process as soon as possible. The NIH Policy on Late Submission of Grant Applications states that failure to complete registrations in advance of a due date is not a valid reason for a late submission.

  • Dun and Bradstreet Universal Numbering System (DUNS) - All registrations require that applicants be issued a DUNS number. After obtaining a DUNS number, applicants can begin both SAM and eRA Commons registrations. The same DUNS number must be used for all registrations, as well as on the grant application.
  • System for Award Management (SAM) (formerly CCR) Applicants must complete and maintain an active registration, which requires renewal at least annually. The renewal process may require as much time as the initial registration. SAM registration includes the assignment of a Commercial and Government Entity (CAGE) Code for domestic organizations which have not already been assigned a CAGE Code.
  • eRA Commons - Applicants must have an active DUNS number and SAM registration in order to complete the eRA Commons registration. Organizations can register with the eRA Commons as they are working through their SAM or Grants.gov registration. eRA Commons requires organizations to identify at least one Signing Official (SO) and at least one Program Director/Principal Investigator (PD/PI) account in order to submit an application.
  • Grants.gov Applicants must have an active DUNS number and SAM registration in order to complete the Grants.gov registration.

Program Directors/Principal Investigators (PD(s)/PI(s))

All PD(s)/PI(s) must have an eRA Commons account.  PD(s)/PI(s) should work with their organizational officials to either create a new account or to affiliate their existing account with the applicant organization in eRA Commons. If the PD/PI is also the organizational Signing Official, they must have two distinct eRA Commons accounts, one for each role. Obtaining an eRA Commons account can take up to 2 weeks.

Eligible Individuals (Program Director/Principal Investigator)

Any individual(s) with the skills, knowledge, and resources necessary to carry out the proposed research as the Program Director(s)/Principal Investigator(s) (PD(s)/PI(s)) is invited to work with his/her organization to develop an application for support. Individuals from underrepresented racial and ethnic groups as well as individuals with disabilities are always encouraged to apply for NIH support.

For institutions/organizations proposing multiple PDs/PIs, visit the Multiple Program Director/Principal Investigator Policy and submission details in the Senior/Key Person Profile (Expanded) Component of the SF424 (R&R) Application Guide.

2. Cost Sharing

This FOA does not require cost sharing as defined in the NIH Grants Policy Statement.

3. Additional Information on Eligibility
Number of Applications

Applicant organizations may submit more than one application, provided that each application is scientifically distinct.

The NIH will not accept duplicate or highly overlapping applications under review at the same time.  This means that the NIH will not accept:

  • A new (A0) application that is submitted before issuance of the summary statement from the review of an overlapping new (A0) or resubmission (A1) application.
  • A resubmission (A1) application that is submitted before issuance of the summary statement from the review of the previous new (A0) application.
  • An application that has substantial overlap with another application pending appeal of initial peer review (see NOT-OD-11-101).
Section IV. Application and Submission Information
1. Requesting an Application Package

Applicants must obtain the SF424 (R&R) application package associated with this funding opportunity using the Apply for Grant Electronically button in this FOA or following the directions provided at Grants.gov.

2. Content and Form of Application Submission

It is critical that applicants follow the instructions in the SF424 (R&R) Application Guide, including Supplemental Grant Application Instructions except where instructed in this funding opportunity announcement to do otherwise. Conformance to the requirements in the Application Guide is required and strictly enforced. Applications that are out of compliance with these instructions may be delayed or not accepted for review.

For information on Application Submission and Receipt, visit Frequently Asked Questions Application Guide, Electronic Submission of Grant Applications.

Letter of Intent

Although a letter of intent is not required, is not binding, and does not enter into the review of a subsequent application, the information that it contains allows IC staff to estimate the potential review workload and plan the review.

By the date listed in Part 1. Overview Information, prospective applicants are asked to submit a letter of intent that includes the following information:

  • Descriptive title of proposed activity
  • Name(s), address(es), and telephone number(s) of the PD(s)/PI(s)
  • Names of other key personnel
  • Participating institution(s)
  • Number and title of this funding opportunity

The letter of intent should be sent to:

Email: NIMHreferral@mail.nih.gov

Page Limitations

All page limitations described in the SF424 Application Guide and the Table of Page Limits must be followed.

Instructions for Application Submission

The following section supplements the instructions found in the SF424 (R&R) Application Guide and should be used for preparing an application to this FOA.

SF424(R&R) Cover

All instructions in the SF424 (R&R) Application Guide must be followed.

SF424(R&R) Project/Performance Site Locations

All instructions in the SF424 (R&R) Application Guide must be followed.

SF424(R&R) Other Project Information

All instructions in the SF424 (R&R) Application Guide must be followed.

Other Attachments: Applicants should upload a single attachment that includes the following information relevant to the proposed project. Applicants should use the headers below in their description. This attachment must be no more than 4 pages. Applications that exceed this limit will not be reviewed.

I. Participant Recruitment and Retention Procedures: Applications must provide a clear description of the following. Applications lacking the following information will not be reviewed:

  • Recruitment and Referral sources, including detailed descriptions of the census/rate of new participants and anticipated yield of eligible participants (including provider-/therapist-/supervisor- and client-/patient- participants) from each recruitment/referral source;
  • Procedures that will be used to monitor enrollment and track/retain participants;
  • Strategies that will be used to ensure a diverse, representative sample;
  • Potential recruitment/enrollment challenges and strategies that can be implemented in the event of enrollment shortfalls (e.g., additional outreach procedures, alternate/back-up referral sources);
  • Evidence to support the feasibility of enrollment, including descriptions of prior experiences and yield from research efforts employing similar referral sources and/or strategies.

II. Study Milestones and Timeline: Applications must provide a clear description of:

  • Objective, quantifiable, and scientifically justified study milestones; and
  • A proposed timeline for reaching important study milestones such as: (a) finalizing the initial quality assessment tool and data collection strategy; (b) pilot testing the assessment tool and data collection strategies, as appropriate; (c) initiating prospective psychometric testing in the community practice setting; (d) enrolling 25%, 50%, 75% and 100% of the sample; (e) completing all subject assessments and data collection activities, including data quality checks; (f) analyzing and interpreting results; and (g) preparing de-identified data and relevant documentation to facilitate data sharing. The proposed milestones should describe specific, measurable, and achievable progress projected throughout the project period, which can be used as an indicator of success.
SF424(R&R) Senior/Key Person Profile

All instructions in the SF424 (R&R) Application Guide must be followed.

R&R or Modular Budget

All instructions in the SF424 (R&R) Application Guide must be followed.

R&R Subaward Budget

All instructions in the SF424 (R&R) Application Guide must be followed.

PHS 398 Cover Page Supplement

All instructions in the SF424 (R&R) Application Guide must be followed.

PHS 398 Research Plan

All instructions in the SF424 (R&R) Application Guide must be followed, with the following additional instructions:

Applications should not duplicate information provided in the attachment described in Section IV.2, "SF424 (R&R) Other Project Information," unless needed to provide context.

Research Strategy:

Significance:

  • Describe the potential public health impact of the proposed strategy, in terms of the implications of improved metrics for monitoring quality (e.g., given the number of individuals affected/receiving the intervention, the severity or burden associated with the target condition (e.g., in the case of prevention of suicide)).
  • Describe how the proposed research would contribute to a body of generalizable knowledge regarding best-practices for the assessment of quality of psychosocial intervention delivery (e.g., Under what conditions might therapists serve as reliable and valid informants for gauging therapy quality? In general, when in the course of therapy and how frequently should sessions be sampled to index quality?)

Innovation:

  • Describe the use of any innovative data collection methods (e.g., use of technology to sample or quantify session content) or analytic strategies (e.g., use of existing data and innovative analytic approaches to identify metrics of quality) to develop and test the quality assessment strategy, as appropriate.

Approach:

  • Describe the proposed method for assessing therapy quality and the rationale for using the proposed focus or unit of analysis (e.g., broad principles/practices, specific research-supported interventions or strategies), the data source (e.g., respondent report, medical records data), sampling frame (i.e., timing and number of assessment points), and data collection method (e.g., informant self-report, technology-assisted data collection).
  • Specify the research strategy that will be used for the initial development and operationalization of the quality metric (e.g., identification of an item pool or other metrics that can be used to quantify quality).
  • Describe plans to incorporate stakeholder perspectives and take into account financial/workforce resources and clinical/administrative routines in order to ensure a feasible, practice-ready quality assessment approach.
  • Describe plans for examining the psychometric properties of the assessment strategy and plans for prospectively testing whether the candidate measure of quality is associated with patient outcomes (and other established metrics of quality, e.g., gold standard fidelity ratings based on fidelity rating systems from efficacy research, as appropriate) in a relevant community practice setting.
  • Detail procedures that will be used to systematically assess the acceptability/feasibility of the assessment strategy and quantify required resources (e.g., respondent time/burden) for the assessment approach.
  • Document anticipated power for analyses that will be used to examine validity of the assessment approach (i.e., describe power to determine whether the quality measure is associated with/predictive of patient outcomes and other established measures of quality, as appropriate).
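
As a hedged illustration of the kind of power documentation requested in the final bullet above, the Python sketch below approximates the sample size needed to detect a correlation between a candidate quality score and patient outcomes using the Fisher z approximation; the target effect size is an assumption chosen for illustration only.

    # Illustrative only: approximate N to detect a quality-outcome correlation,
    # via the Fisher z approximation. The effect size below is hypothetical.
    from math import atanh, ceil
    from scipy.stats import norm

    def n_for_correlation(r, alpha=0.05, power=0.80):
        """Approximate N for a two-sided test of a correlation of size r."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return ceil(((z_alpha + z_beta) / atanh(r)) ** 2 + 3)

    # e.g., detecting a modest association of r = .25 at 80% power
    print(n_for_correlation(0.25))  # about 124 cases

Designs with clients nested within therapists would typically need to account for clustering, which increases the required sample.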

Protections for Human Subjects:

  • Applications with data collection plans that involve multiple respondent groups (e.g., clients/patients, therapists/providers, supervisors, administrators) should address provisions for human subject protections and consenting procedures for all participant groups, accordingly.
  • The NIMH has published updated policies and guidance for investigators regarding human research protection and clinical research data and safety monitoring (NOT-MH-15-025).  The application’s Protection of Human Subjects section and data and safety monitoring plans should reflect the policies and guidance in this notice.  Plans for the protection of research subjects and data and safety monitoring will be reviewed by the NIMH for consistency with NIMH and NIH policies and federal regulations.

Resource Sharing Plan: Individuals are required to comply with the instructions for the Resource Sharing Plans as provided in the SF424 (R&R) Application Guide, with the following modification:

  • All applications, regardless of the amount of direct costs requested for any one year, should address a Data Sharing Plan.

Appendix: Do not use the Appendix to circumvent page limits. Follow all instructions for the Appendix as described in the SF424 (R&R) Application Guide.

PHS Inclusion Enrollment Report

When conducting clinical research, follow all instructions for completing PHS Inclusion Enrollment Report as described in the SF424 (R&R) Application Guide.

PHS Assignment Request Form

All instructions in the SF424 (R&R) Application Guide must be followed.

Foreign Institutions

Foreign (non-U.S.) institutions must follow policies described in the NIH Grants Policy Statement, and procedures for foreign institutions described throughout the SF424 (R&R) Application Guide.

3. Unique Entity Identifier and System for Award Management (SAM)

See Section III.1 of this FOA for information regarding the requirement for obtaining a unique entity identifier and for completing and maintaining active registrations in the System for Award Management (SAM), NATO Commercial and Government Entity (NCAGE) Code (if applicable), eRA Commons, and Grants.gov.

4. Submission Dates and Times

Part I. Overview Information contains information about Key Dates and times. Applicants are encouraged to submit applications before the due date to ensure they have time to make any application corrections that might be necessary for successful submission. When a submission date falls on a weekend or Federal holiday, the application deadline is automatically extended to the next business day.

Organizations must submit applications to Grants.gov (the online portal to find and apply for grants across all Federal agencies). Applicants must then complete the submission process by tracking the status of the application in the eRA Commons, NIH’s electronic system for grants administration. NIH and Grants.gov systems check the application against many of the application instructions upon submission. Errors must be corrected and a changed/corrected application must be submitted to Grants.gov on or before the application due date and time. If a Changed/Corrected application is submitted after the deadline, the application will be considered late. Applications that miss the due date and time are subject to the NIH Policy on Late Application Submission.

Applicants are responsible for viewing their application before the due date in the eRA Commons to ensure accurate and successful submission.

Information on the submission process and a definition of on-time submission are provided in the SF424 (R&R) Application Guide.

5. Intergovernmental Review (E.O. 12372)

This initiative is not subject to intergovernmental review.

6. Funding Restrictions

All NIH awards are subject to the terms and conditions, cost principles, and other considerations described in the NIH Grants Policy Statement.

Pre-award costs are allowable only as described in the NIH Grants Policy Statement.

7. Other Submission Requirements and Information

Applications must be submitted electronically following the instructions described in the SF424 (R&R) Application Guide.  Paper applications will not be accepted.

Applicants must complete all required registrations before the application due date. Section III. Eligibility Information contains information about registration.

For assistance with your electronic application or for more information on the electronic submission process, visit Applying Electronically. If you encounter a system issue beyond your control that threatens your ability to complete the submission process on-time, you must follow the Guidelines for Applicants Experiencing System Issues. For assistance with application submission, contact the Application Submission Contacts in Section VII.

Important reminders:

All PD(s)/PI(s) must include their eRA Commons ID in the Credential field of the Senior/Key Person Profile Component of the SF424(R&R) Application Package. Failure to register in the Commons and to include a valid PD/PI Commons ID in the credential field will prevent the successful submission of an electronic application to NIH. See Section III of this FOA for information on registration requirements.

The applicant organization must ensure that the DUNS number it provides on the application is the same number used in the organization’s profile in the eRA Commons and for the System for Award Management. Additional information may be found in the SF424 (R&R) Application Guide.

See more tips for avoiding common errors.

Upon receipt, applications will be evaluated for completeness and compliance with application instructions by the Center for Scientific Review, and for responsiveness by NIMH. Applications that are incomplete, non-compliant and/or nonresponsive will not be reviewed.

In order to expedite review, applicants are requested to notify the NIMH Referral Office by email at NIMHreferral@mail.nih.gov when the application has been submitted. Please include the FOA number and title, PD/PI name, and title of the application.

Use of Common Data Elements in NIH-funded Research

NIMH encourages the use of common data elements (CDEs) in basic, clinical, and applied research, patient registries, and other human subject research to facilitate broader and more effective use of data and advance research across studies. CDEs are data elements that have been identified and defined for use in multiple data sets across different studies. Use of CDEs can facilitate data sharing and standardization to improve data quality and enable data integration from multiple studies and sources, including electronic health records. NIH ICs have identified CDEs for many clinical domains (e.g., neurological disease), types of studies (e.g., genome-wide association studies (GWAS)), types of outcomes (e.g., patient-reported outcomes), and patient registries (e.g., the Global Rare Diseases Patient Registry and Data Repository). NIH has established a Common Data Element (CDE) Resource Portal (http://cde.nih.gov/) to assist investigators in identifying NIH-supported CDEs when developing protocols, case report forms, and other instruments for data collection. The Portal provides guidance about and access to NIH-supported CDE initiatives and other tools and resources for the appropriate use of CDEs and data standards in NIH-funded research. Investigators are encouraged to consult the Portal and describe in their applications any use they will make of NIH-supported CDEs in their projects.

Post Submission Materials

Applicants are required to follow the instructions for post-submission materials, as described in NOT-OD-13-030.

Section V. Application Review Information
1. Criteria

Only the review criteria described below will be considered in the review process. As part of the NIH mission, all applications submitted to the NIH in support of biomedical and behavioral research are evaluated for scientific and technical merit through the NIH peer review system.

Overall Impact

Reviewers will provide an overall impact score to reflect their assessment of the likelihood for the project to exert a sustained, powerful influence on the research field(s) involved, in consideration of the following review criteria and additional review criteria (as applicable for the project proposed).

Scored Review Criteria

Reviewers will consider each of the review criteria below in the determination of scientific merit, and give a separate score for each. An application does not need to be strong in all categories to be judged likely to have major scientific impact. For example, a project that by its nature is not innovative may be essential to advance a field.

Significance

Does the project address an important problem or a critical barrier to progress in the field? Is there a strong scientific premise for the project? If the aims of the project are achieved, how will scientific knowledge, technical capability, and/or clinical practice be improved? How will successful completion of the aims change the concepts, methods, technologies, treatments, services, or preventative interventions that drive this field?

Does the application provide a compelling argument for the public health benefit of the proposed strategy, in terms of the implications of improved metrics for monitoring quality (e.g., given the number of individuals affected/receiving the intervention, the severity or burden associated with the target condition (e.g., prevention of suicide))?

Do the study results have potential to contribute to a body of generalizable knowledge regarding best-practices for the assessment of quality of psychosocial intervention delivery (e.g., Under what conditions might therapists serve as reliable and valid informants for gauging therapy quality? In general, when in the course of therapy and how frequently should sessions be sampled to index quality)?

Investigator(s)

Are the PD(s)/PI(s), collaborators, and other researchers well suited to the project? If Early Stage Investigators or New Investigators, or in the early stages of independent careers, do they have appropriate experience and training? If established, have they demonstrated an ongoing record of accomplishments that have advanced their field(s)? If the project is collaborative or multi-PD/PI, do the investigators have complementary and integrated expertise; are their leadership approach, governance and organizational structure appropriate for the project?

Does the project involve collaborations and/or input from community practice partners/providers, consumers, and relevant policy makers in a manner that informs the research and helps to ensure the results will have utility?

Innovation

Does the application challenge and seek to shift current research or clinical practice paradigms by utilizing novel theoretical concepts, approaches or methodologies, instrumentation, or interventions? Are the concepts, approaches or methodologies, instrumentation, or interventions novel to one field of research or novel in a broad sense? Is a refinement, improvement, or new application of theoretical concepts, approaches or methodologies, instrumentation, or interventions proposed?

Are innovative data collection methods (e.g., use of technology to sample or quantify session content), analytic strategies (e.g., use of existing data and innovative analytic approaches to identify metrics of quality), or other innovative approaches used to facilitate the development/testing of the measurement approach or in the proposed quality assessment strategy, as appropriate?

Approach

Are the overall strategy, methodology, and analyses well-reasoned and appropriate to accomplish the specific aims of the project? Have the investigators presented strategies to ensure a robust and unbiased approach, as appropriate for the work proposed? Are potential problems, alternative strategies, and benchmarks for success presented? If the project is in the early stages of development, will the strategy establish feasibility and will particularly risky aspects be managed? Have the investigators presented adequate plans to address relevant biological variables, such as sex, for studies in vertebrate animals or human subjects? 

Does the application describe the empirical basis and provide a compelling rationale for the proposed method for assessing therapy quality, including the rationale for the proposed focus or unit of analysis (e.g., broad strategies, specific techniques), the data source (e.g., respondent report, medical records data), sampling frame (i.e., timing and number of assessment points), and data collection method (e.g., informant self-report, technology-assisted data collection)?

Does the application specify a sound research strategy that will be used for the initial development and operationalization of the quality metric (e.g., identification of an item pool or other metrics that can be used to quantify quality)?

Are there appropriate plans to incorporate stakeholder perspectives and take into account financial/workforce resources and clinical/administrative routines in order to ensure a feasible, practice-ready quality assessment approach?

Does the application detail appropriate procedures for systematically assessing the acceptability/feasibility of the assessment strategy and quantifying required resources (e.g., respondent time/burden) for the assessment approach?

Does the application describe well-reasoned plans for examining the psychometric properties of the assessment strategy and for prospectively testing whether the candidate measure of quality is associated with patient outcomes (and other established metrics of quality, e.g., gold standard fidelity ratings based on fidelity rating systems from efficacy research, as appropriate)?

Is a compelling power analysis provided for analyses that will be used to examine validity of the assessment approach (i.e., power to determine whether the quality measure is associated with/predictive of patient outcomes and other established measures of quality, as appropriate)?

If the project involves human subjects and/or NIH-defined clinical research, are the plans to address 1) the protection of human subjects from research risks, and 2) inclusion (or exclusion) of individuals on the basis of sex/gender, race, and ethnicity, as well as the inclusion or exclusion of children, justified in terms of the scientific goals and research strategy proposed?

Environment

Will the scientific environment in which the work will be done contribute to the probability of success? Are the institutional support, equipment and other physical resources available to the investigators adequate for the project proposed? Will the project benefit from unique features of the scientific environment, subject populations, or collaborative arrangements?

Are the research and clinical resources appropriate for supporting the proposed research? Does the application document appropriate collaborations/partnerships with and resources within the clinical practice setting where the quality assessment approach will be validated? Does the application describe existing infrastructure (e.g., CTSAs, clinical practice networks) that will be used to maximize efficiency, as appropriate?

Additional Review Criteria

As applicable for the project proposed, reviewers will evaluate the following additional items while determining scientific and technical merit, and in providing an overall impact score, but will not give separate scores for these items.

Protections for Human Subjects

For research that involves human subjects but does not involve one of the six categories of research that are exempt under 45 CFR Part 46, the committee will evaluate the justification for involvement of human subjects and the proposed protections from research risk relating to their participation according to the following five review criteria: 1) risk to subjects, 2) adequacy of protection against risks, 3) potential benefits to the subjects and others, 4) importance of the knowledge to be gained, and 5) data and safety monitoring for clinical trials.

For research that involves human subjects and meets the criteria for one or more of the six categories of research that are exempt under 45 CFR Part 46, the committee will evaluate: 1) the justification for the exemption, 2) human subjects involvement and characteristics, and 3) sources of materials. For additional information on review of the Human Subjects section, please refer to the Guidelines for the Review of Human Subjects.

Inclusion of Women, Minorities, and Children 

When the proposed project involves human subjects and/or NIH-defined clinical research, the committee will evaluate the proposed plans for the inclusion (or exclusion) of individuals on the basis of sex/gender, race, and ethnicity, as well as the inclusion (or exclusion) of children to determine if it is justified in terms of the scientific goals and research strategy proposed. For additional information on review of the Inclusion section, please refer to the Guidelines for the Review of Inclusion in Clinical Research.

Vertebrate Animals

The committee will evaluate the involvement of live vertebrate animals as part of the scientific assessment according to the following criteria: (1) description of proposed procedures involving animals, including species, strains, ages, sex, and total number to be used; (2) justifications for the use of animals versus alternative models and for the appropriateness of the species proposed; (3) interventions to minimize discomfort, distress, pain and injury; and (4) justification for euthanasia method if NOT consistent with the AVMA Guidelines for the Euthanasia of Animals. Reviewers will assess the use of chimpanzees as they would any other application proposing the use of vertebrate animals. For additional information on review of the Vertebrate Animals section, please refer to the Worksheet for Review of the Vertebrate Animal Section.

Biohazards

Reviewers will assess whether materials or procedures proposed are potentially hazardous to research personnel and/or the environment, and if needed, determine whether adequate protection is proposed.

Resubmissions

Not Applicable

Renewals

Not Applicable

Revisions

Not Applicable

Additional Review Considerations

As applicable for the project proposed, reviewers will consider each of the following items, but will not give scores for these items, and should not consider them in providing an overall impact score.

Applications from Foreign Organizations

Reviewers will assess whether the project presents special opportunities for furthering research programs through the use of unusual talent, resources, populations, or environmental conditions that exist in other countries and either are not readily available in the United States or augment existing U.S. resources.

Select Agent Research

Reviewers will assess the information provided in this section of the application, including 1) the Select Agent(s) to be used in the proposed research, 2) the registration status of all entities where Select Agent(s) will be used, 3) the procedures that will be used to monitor possession, use, and transfer of Select Agent(s), and 4) plans for appropriate biosafety, biocontainment, and security of the Select Agent(s).

Resource Sharing Plans

Reviewers will comment on whether the following Resource Sharing Plans, or the rationale for not sharing the following types of resources, are reasonable: (1) Data Sharing Plan; (2) Sharing Model Organisms; and (3) Genomic Data Sharing Plan (GDS).

Authentication of Key Biological and/or Chemical Resources:

For projects involving key biological and/or chemical resources, reviewers will comment on the brief plans proposed for identifying and ensuring the validity of those resources.

Budget and Period of Support

Reviewers will consider whether the budget and the requested period of support are fully justified and reasonable in relation to the proposed research.

2. Review and Selection Process

Applications will be evaluated for scientific and technical merit by (an) appropriate Scientific Review Group(s) convened by NIMH, in accordance with NIH peer review policy and procedures, using the stated review criteria. Assignment to a Scientific Review Group will be shown in the eRA Commons.

As part of the scientific peer review, all applications:

  • May undergo a selection process in which only those applications deemed to have the highest scientific and technical merit (generally the top half of applications under review) will be discussed and assigned an overall impact score.
  • Will receive a written critique.

Appeals of initial peer review will not be accepted for applications submitted in response to this FOA.

Applications will be assigned to the appropriate NIH Institute or Center. Applications will compete for available funds with all other recommended applications submitted in response to this FOA. Following initial peer review, recommended applications will receive a second level of review by the National Advisory Mental Health Council. The following will be considered in making funding decisions:

  • Scientific and technical merit of the proposed project as determined by scientific peer review.
  • Availability of funds.
  • Relevance of the proposed project to program priorities.
3. Anticipated Announcement and Award Dates

After the peer review of the application is completed, the PD/PI will be able to access his or her Summary Statement (written critique) via the eRA Commons. Refer to Part 1 for dates for peer review, advisory council review, and earliest start date.

Information regarding the disposition of applications is available in the NIH Grants Policy Statement.

Section VI. Award Administration Information
1. Award Notices

If the application is under consideration for funding, NIH will request "just-in-time" information from the applicant as described in the NIH Grants Policy Statement.

A formal notification in the form of a Notice of Award (NoA) will be provided to the applicant organization for successful applications. The NoA signed by the grants management officer is the authorizing document and will be sent via email to the grantee’s business official.

Awardees must comply with any funding restrictions described in Section IV.6, Funding Restrictions. Selection of an application for award is not an authorization to begin performance. Any costs incurred before receipt of the NoA are at the recipient's risk. These costs may be reimbursed only to the extent considered allowable pre-award costs.

Any application awarded in response to this FOA will be subject to terms and conditions found on the Award Conditions and Information for NIH Grants website.  This includes any recent legislation and policy applicable to awards that is highlighted on this website.

2. Administrative and National Policy Requirements

All NIH grant and cooperative agreement awards include the NIH Grants Policy Statement as part of the NoA. For these terms of award, see the NIH Grants Policy Statement Part II: Terms and Conditions of NIH Grant Awards, Subpart A: General  and Part II: Terms and Conditions of NIH Grant Awards, Subpart B: Terms and Conditions for Specific Types of Grants, Grantees, and Activities. More information is provided at Award Conditions and Information for NIH Grants.

Recipients of federal financial assistance (FFA) from HHS must administer their programs in compliance with federal civil rights law. This means that recipients of HHS funds must ensure equal access to their programs without regard to a person’s race, color, national origin, disability, age and, in some circumstances, sex and religion. This includes ensuring your programs are accessible to persons with limited English proficiency. HHS recognizes that research projects are often limited in scope for many reasons that are nondiscriminatory, such as the principal investigator’s scientific interest, funding limitations, recruitment requirements, and other considerations. Thus, criteria in research protocols that target or exclude certain populations are warranted where nondiscriminatory justifications establish that such criteria are appropriate with respect to the health or safety of the subjects, the scientific study design, or the purpose of the research.

For additional guidance regarding how the provisions apply to NIH grant programs, please contact the Scientific/Research Contact that is identified in Section VII under Agency Contacts of this FOA. HHS provides general guidance to recipients of FFA on meeting their legal obligation to take reasonable steps to provide meaningful access to their programs by persons with limited English proficiency. Please see http://www.hhs.gov/ocr/civilrights/resources/laws/revisedlep.html. The HHS Office for Civil Rights also provides guidance on complying with civil rights laws enforced by HHS. Please see http://www.hhs.gov/ocr/civilrights/understanding/section1557/index.html; and http://www.hhs.gov/ocr/civilrights/understanding/index.html. Recipients of FFA also have specific legal obligations for serving qualified individuals with disabilities. Please see http://www.hhs.gov/ocr/civilrights/understanding/disability/index.html. Please contact the HHS Office for Civil Rights for more information about obligations and prohibitions under federal civil rights laws at http://www.hhs.gov/ocr/office/about/rgn-hqaddresses.html or call 1-800-368-1019 or TDD 1-800-537-7697. Also note it is an HHS Departmental goal to ensure access to quality, culturally competent care, including long-term services and supports, for vulnerable populations. For further guidance on providing culturally and linguistically appropriate services, recipients should review the National Standards for Culturally and Linguistically Appropriate Services in Health and Health Care at http://minorityhealth.hhs.gov/omh/browse.aspx?lvl=2&lvlid=53.

Cooperative Agreement Terms and Conditions of Award

Not Applicable

3. Reporting

When multiple years are involved, awardees will be required to submit the Research Performance Progress Report (RPPR) annually and financial statements as required in the NIH Grants Policy Statement.

A final progress report, invention statement, and the expenditure data portion of the Federal Financial Report are required for closeout of an award, as described in the NIH Grants Policy Statement.

The Federal Funding Accountability and Transparency Act of 2006 (Transparency Act), includes a requirement for awardees of Federal grants to report information about first-tier subawards and executive compensation under Federal assistance awards issued in FY2011 or later. All awardees of applicable NIH grants and cooperative agreements are required to report to the Federal Subaward Reporting System (FSRS) available at www.fsrs.gov on all subawards over $25,000. See the NIH Grants Policy Statement for additional information on this reporting requirement.

Section VII. Agency Contacts

We encourage inquiries concerning this funding opportunity and welcome the opportunity to answer questions from potential applicants.

Application Submission Contacts

eRA Service Desk (Questions regarding ASSIST, eRA Commons registration, submitting and tracking an application, documenting system problems that threaten submission by the due date, post submission issues)
Finding Help Online: http://grants.nih.gov/support/ (preferred method of contact)
Telephone: 301-402-7469 or 866-504-9552 (Toll Free)

Grants.gov Customer Support (Questions regarding Grants.gov registration and submission, downloading forms and application packages)
Contact Center Telephone: 800-518-4726
Web ticketing system: https://grants-portal.psc.gov/ContactUs.aspx
Email: support@grants.gov

GrantsInfo (Questions regarding application instructions and process, finding NIH grant resources)
Email: GrantsInfo@nih.gov (preferred method of contact)
Telephone: 301-710-0267

Scientific/Research Contact(s)

Joel Sherrill, Ph.D.
National Institute of Mental Health (NIMH)
Telephone: 301-443-2477
Email: jsherril@mail.nih.gov

Peer Review Contact(s)

David Armstrong, Ph.D.
National Institute of Mental Health (NIMH)
Telephone: 301-443-3534
Email: armstrda@mail.nih.gov

Financial/Grants Management Contact(s)

Tamara Kees
National Institute of Mental Health (NIMH)
Telephone: 301-443-8811
Email: tkees@mail.nih.gov

Section VIII. Other Information

Recently issued trans-NIH policy notices may affect your application submission. A full list of policy notices published by NIH is provided in the NIH Guide for Grants and Contracts. All awards are subject to the terms and conditions, cost principles, and other considerations described in the NIH Grants Policy Statement.

Authority and Regulations

Awards are made under the authorization of Sections 301 and 405 of the Public Health Service Act as amended (42 USC 241 and 284) and under Federal Regulations 42 CFR Part 52 and 45 CFR Part 75.
