Data Collection

Race, Ethnicity, and Gender Data (REG)

NIFA requires Extension professionals to collect REG data on the participants of their program activities. A program activity provides an educational benefit to a closed or defined group of people and has the following characteristics:
  • A fixed instructor
  • A fixed set of participants
  • Five or more class meetings
  • Use of an established Extension curriculum

Collecting REG Data

All REG data should be self-reported by participants on individual survey forms, which protects participant confidentiality. Each survey form should include a separate item for each construct. NIFA requires collecting and reporting data on specific categories for each of the REG constructs.

 

Race

An assessment for participant race is required to include the following categories, with the option for participants to select more than one:

  • American Indian or Alaska Native
  • Asian
  • Black or African American
  • Native Hawaiian or Other Pacific Islander
  • White

Example Race Item
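For instance (the wording here is illustrative, not official NIFA language), a race item might ask "Which of the following describes your race? Select all that apply," followed by a checkbox for each of the five categories above.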

 

Ethnicity

An assessment for participant ethnicity is required to include the following categories:

  • Hispanic or Latino
  • Not Hispanic or Latino

Example Ethnicity Item

 

Gender

An assessment for participant gender is required to include the following categories:

  • Female
  • Male

Example Gender Item

Reporting REG Data

NIFA requires reporting the prevalence of specific categories for each of the REG constructs.

 

Race

  • American Indian or Alaska Native
  • Asian
  • Black or African American
  • Native Hawaiian or Other Pacific Islander
  • White
  • Other/Unspecified, for those who did not respond to the item or selected an 'Other' category
  • For participants who selected more than one category, either:
    • a single Multiple-Race category, or
    • all applicable combinations of the groups selected (e.g., Asian/White)
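
The sketch below shows one way to tally self-reported race responses into these reporting categories. It is a minimal example, not an official NIFA tool: it assumes each response is stored as a list of the categories a participant selected (an empty list for non-response) and uses the single Multiple-Race option rather than reporting combinations.

```python
from collections import Counter

RACE_CATEGORIES = [
    "American Indian or Alaska Native",
    "Asian",
    "Black or African American",
    "Native Hawaiian or Other Pacific Islander",
    "White",
]

def tally_race(responses):
    """Tally self-reported race selections into reporting categories."""
    counts = Counter()
    for selections in responses:
        recognized = [s for s in selections if s in RACE_CATEGORIES]
        if not recognized:
            # No response, or only an 'Other' selection
            counts["Other/Unspecified"] += 1
        elif len(recognized) == 1:
            counts[recognized[0]] += 1
        else:
            # More than one category selected
            counts["Multiple-Race"] += 1
    return counts

# Hypothetical responses from four participants
responses = [
    ["White"],
    ["Asian", "White"],             # counted as Multiple-Race
    [],                             # no response -> Other/Unspecified
    ["Black or African American"],
]
print(tally_race(responses))
```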

 

Ethnicity

  • Hispanic or Latino
  • Not Hispanic or Latino
  • Other/Unspecified, for those who did not respond to the item or selected an 'Other' category

 

Gender

  • Female
  • Male
  • Other/Unspecified, for those who did not respond to the item or selected an 'Other' category

Assessing Parity

NIFA parity assessments expect program participant demographic rates to be equal to or greater than 80% of the corresponding state or county demographic rates. Based on state and county demographic rates from the 2020 Decennial Census, the parity benchmarks are as follows (each benchmark equals 80% of the corresponding Census rate, so a program reaches parity for a category when its participant rate meets or exceeds the benchmark; a worked example follows the tables):

 

Race

State/County | American Indian or Alaska Native | Asian | Black or African American | Native Hawaiian & Other Pacific Islander | White | Two or More Races | Other Race
Connecticut | 0.36% | 3.83% | 8.62% | 0.04% | 53.14% | 7.39% | 6.64%
Fairfield | 0.37% | 4.29% | 8.92% | 0.03% | 48.81% | 8.59% | 8.99%
Hartford | 0.28% | 4.79% | 11.31% | 0.03% | 49.29% | 7.15% | 7.15%
Litchfield | 0.22% | 1.51% | 1.45% | 0.03% | 68.59% | 5.62% | 2.59%
Middlesex | 0.19% | 2.43% | 4.17% | 0.02% | 65.62% | 5.53% | 2.04%
New Haven | 0.38% | 3.47% | 11.00% | 0.04% | 50.31% | 7.62% | 7.18%
New London | 0.74% | 3.22% | 4.77% | 0.08% | 60.04% | 7.21% | 3.94%
Tolland | 0.15% | 4.52% | 2.91% | 0.03% | 65.62% | 4.92% | 1.84%
Windham | 0.56% | 1.37% | 1.65% | 0.02% | 65.30% | 6.56% | 4.54%

Ethnicity

State/County | Hispanic or Latino | Not Hispanic or Latino
Connecticut | 16.72% | 66.17%
Fairfield | 21.84% | 62.84%
Hartford | 18.14% | 65.21%
Litchfield | 6.84% | 73.70%
Middlesex | 6.26% | 74.19%
New Haven | 19.58% | 64.27%
New London | 10.41% | 70.79%
Tolland | 5.54% | 74.82%
Windham | 11.34% | 70.06%

Gender

State/County | Female | Male
Connecticut | 41.18% | 38.82%
Fairfield | 41.21% | 38.79%
Hartford | 41.37% | 38.63%
Litchfield | 40.53% | 39.47%
Middlesex | 41.05% | 38.95%
New Haven | 41.58% | 38.42%
New London | 40.49% | 39.51%
Tolland | 40.22% | 39.78%
Windham | 40.48% | 39.52%
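
As a worked example of the 80% rule, the sketch below compares a hypothetical Connecticut program's participant rates against the race benchmarks in the table above (which already equal 80% of the Census rates); a county-level program would compare against its county's row instead. The participant numbers are invented for illustration.

```python
# Benchmarks from the Connecticut row of the race table above
# (each value is already 80% of the 2020 Census rate).
CT_RACE_BENCHMARKS = {
    "American Indian or Alaska Native": 0.36,
    "Asian": 3.83,
    "Black or African American": 8.62,
    "Native Hawaiian & Other Pacific Islander": 0.04,
    "White": 53.14,
    "Two or More Races": 7.39,
    "Other Race": 6.64,
}

# Hypothetical program data: participants per category (100 total)
participants = {
    "Asian": 6,
    "Black or African American": 14,
    "White": 70,
    "Two or More Races": 10,
}
total = sum(participants.values())

for category, benchmark in CT_RACE_BENCHMARKS.items():
    rate = 100 * participants.get(category, 0) / total  # participant rate, in percent
    status = "meets parity" if rate >= benchmark else "below parity"
    print(f"{category}: {rate:.2f}% vs. benchmark {benchmark}% -> {status}")
```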

 


Statewide Indicators

UConn Extension developed a set of indicators used to measure and report on Extension professionals' impact and efforts toward the programs outlined in the NIFA Plan of Work:
  • Adaptation and resilience to a changing climate
  • Enhancing health and well-being
  • Sustainable agriculture and food supply
  • Sustainable landscapes across urban-rural interfaces
Each year, Extension professionals are required to report on the indicators, which are then summarized and submitted to NIFA to meet requirements associated with federal funding for Extension work.

Development, Implementation, and History

Our indicators were initially based on a set of indicators used by UF/IFAS Extension. This initial list then went through a multi-stage feedback and revision process, which is outlined in the image below:

In summer/fall 2024, Extension's Data Analyst and Evaluation Specialist will meet with Extension professionals to receive feedback on the indicators, which may lead to changes in the indicators for the 2025 reporting cycle.

Reporting and Revisions Timeline

This timeline shows all the steps involved in the reporting and revision of the statewide indicators. Extension professionals are directly involved in the steps in green, whereas the steps in red are carried out only by Extension's Data Analyst and Evaluation Specialist.

Data Collection and Management

UConn Extension does not have requirements for how Extension professionals store and manage their indicator data during the year, only that reported indicator data meet specific requirements when submitted through the indicator survey. The following recommendations are intended to make the process more straightforward:

 

Data Archive

It is recommended that you keep an archive of your past Extension work. An archive provides an organized way to store your records, which improves data quality (e.g., accuracy, contextual information) and makes the records easier to use later (e.g., summarizing past work or reviewing previous work with the same client). The archive should be hosted on a cloud service so that it is backed up and accessible from multiple devices; UConn provides OneDrive and SharePoint as options. Within the cloud service, create a dedicated folder to serve as your data archive, containing a folder for direct evidence of your work, a dataset that lists your work with relevant information, and a dataset that summarizes your work in relation to the statewide indicators. Here is an example of such a folder:
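
For example, the archive might be organized as follows (folder and file names are illustrative only):
  • Extension Data Archive
    • Extension Work Records (sub-folders of direct evidence, e.g., one per month)
    • Extension Work Records Dataset.xlsx
    • Statewide Indicators Tracker.xlsx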

 

Extension Work Records

This folder stores all direct evidence of your Extension work, which is important if you ever need to provide evidence for auditing purposes or need access to that information for future work. Direct evidence can include a wide variety of files, such as emails, receipts, meeting recordings, post-meeting notes, participant surveys, and sign-in sheets. Depending on your work, these files can accumulate quickly and lead to a disorganized folder, so it is recommended that you organize them into a series of sub-folders, for example by month of the year.

Within each sub-folder, give every file a detailed, descriptive title so that its contents are easy to determine and the file you need at a given time is easy to find.

 

Extension Work Records Dataset

The data archive folder should also contain an Excel spreadsheet that summarizes all of the work evidence stored in the sub-folders of the Extension Work Records folder. This spreadsheet can include the date of the work (for contextual information and sorting), a description of the work, additional relevant information (e.g., activity, client), and a list of the files associated with the work. If a funder requested evidence of funded work carried out several months earlier, you could use this spreadsheet to quickly identify the day the activity took place, who received the work/benefit, and which of the many files in the archive contain additional information about the work (if needed).
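
A hypothetical entry (all details invented for illustration) might look like:

Date | Description | Activity | Client | Associated Files
2024-03-12 | Soil health workshop for community gardeners | Workshop | Community garden group | 2024-03-12 sign-in sheet.pdf; 2024-03-12 participant survey.xlsx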

 

Statewide Indicators Dataset

Since the Extension Work Records Dataset contains information on work applicable to the statewide indicators, you could read through the whole dataset at reporting time to aggregate your data across the previous year. However, it is usually easier and more manageable to aggregate this information periodically throughout the year. To help with this, we created a Statewide Indicators Tracker that lets you enter the relevant data for each indicator for each month of the year and then aggregates the data for the whole year in a separate tab. You can also mark an indicator as not applicable to your work (by changing its 'Applicable' value to No and activating the filter button on the header cell), so that the spreadsheet lists only the indicators applicable to your work. When it is time to report on the indicators survey, simply copy the aggregated values for the relevant indicators from the aggregate tab and enter them into the survey.
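
The sketch below mirrors the year-end aggregation the tracker performs. The indicator names and monthly values are hypothetical, and the tracker itself is an Excel workbook; this is only a minimal illustration of the calculation.

```python
# Hypothetical monthly entries: {indicator: [Jan, Feb, ..., Dec]}
monthly_counts = {
    "Hypothetical indicator A": [12, 0, 8, 20, 5, 0, 0, 15, 30, 10, 0, 6],
    "Hypothetical indicator B": [0, 1, 0, 2, 0, 0, 3, 0, 1, 0, 0, 0],
}

# Annual totals, as entered into the indicators survey
annual_totals = {indicator: sum(months) for indicator, months in monthly_counts.items()}

for indicator, total in annual_totals.items():
    print(f"{indicator}: {total}")
```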

 

This is the current recommendation for collecting and managing data for the statewide indicators. It still requires time and effort from the Extension professional, and we will continue to pursue additional methods and platforms to reduce the time Extension professionals spend on record-keeping while ensuring data quality and ease of access.

NIFA Report

The statewide indicators data are analyzed and summarized for NIFA's Annual Report each year; these data were first submitted in the NIFA Annual Report for Fiscal Year 2023. Specifically, the indicator results were summed across all Extension professionals, showcasing Extension's total effort toward the work described by each indicator. Several indicators were omitted because of non-response or very low impact/relevance. The remaining indicators were then grouped and presented in the report according to underlying themes identified in the results, which are listed under each critical issue in the report.

Changes and Updates to the Indicators

Each year, Extension revises the indicators based on 1) results from the most recent analysis of the indicators, 2) feedback from Extension professionals, and 3) any changes to the Extension goals and program areas. In summer/fall, Extension's Data Analyst and Evaluation Specialist hold meetings with professionals in the following groups and Extension work areas:

  • 4-H
  • CLEAR
  • Climate Adaptation & Resilience
  • Health & Well-being
  • Food Systems & Agriculture
  • IPM
  • Sustainable Landscapes

In these meetings, the Evaluation Specialist describes the current state of the indicators, related principles of evaluation, and ideas for updates to the indicators. This is an opportunity for Extension professionals to discuss their perspectives and ideas. Following the meeting, the Data Analyst sends survey invitations to the Extension professionals for final feedback after they have had a chance to reflect on what was discussed in the meeting. The feedback is then aggregated and considered when making any changes to the indicators for the upcoming reporting cycle.

FAQ

Am I required to complete the indicators survey and if so, why?

UConn Extension is required to report to NIFA on work funded by Smith-Lever funds, and many Extension professionals' salaries and work are funded through them. We send the indicators survey to those who receive Smith-Lever funds so that we can meet this requirement. Additionally, the indicators help showcase the work of individual Extension professionals, which can be used to promote their professional identity and scholarship.

When I try to access the indicators survey, it directs me to a blank page saying I already submitted it.

This is an issue with Qualtrics that can occur for several reasons. One common reason is that you accessed the survey at a previous time and accidentally submitted it while looking through the indicators. Another common reason is that survey invitation links are personalized to each invitee, so sharing your link with someone else may lead to them completing your survey instead of theirs. If you run into this issue, reach out to the Data Analyst to receive a new survey link.

I work with a colleague(s) on the same projects. How do we avoid duplicate data when reporting on the indicators?

When completing the indicators survey, enter data for all of your direct work. There is currently no ideal solution for situations in which Extension professionals report on shared work; we disclose this limitation when reporting the indicator results to NIFA. Additionally, while we aggregate the data to showcase the overall impact of Extension, we want each individual's data to showcase their own impact, even if a colleague reports the same data.

What qualifies as documentation for evidence used in reporting for the indicators?

There are many potential forms of evidence depending on the type of work being carried out, including emails, post-meeting notes, recordings, receipts, surveys, etc. But generally, there should be an effort to have some reliable evidence for past work.

What is the difference between "reported" and "demonstrated" in the indicators?

For each indicator describing an impact, "reported" refers to an individual's self-report of the impact described by the indicator, whereas "demonstrated" refers to an individual showing the impact or providing observable evidence of the impact.

How should I respond to an indicator that is not applicable to my work?

One can simply report a zero or leave their response to the indicator blank.

I have other work that I carry out that does not fit with the current list of indicators. How do I add them to the indicators?

The purpose of the indicators is to describe the impacts of work shared by Extension professionals. In some cases, someone may be the only person carrying out a specific type of work, so it is not something that would typically be included as an indicator for NIFA reporting purposes. However, everyone is encouraged to create their own indicators for such work, which can help showcase their efforts and scholarship. In such cases, they can reach out to 1) the Data Analyst for help with recording and reporting that data and 2) the Evaluation Specialist for ideas on how to create indicators for that specific work. Additionally, even though the work is not currently in the indicator list, it may be worth adding in the future as additional professionals join Extension and/or other Extension professionals adopt similar work.
