Blog & News
A New Brief Examines the Collection of Sexual Orientation and Gender Identity (SOGI) Data at the Federal Level and in Medicaid
October 2021: Thirteen million people identify as part of a sexual or gender minority (SGM) in the United States, with an estimated 1.17 million who have Medicaid as their primary source of health insurance coverage.[1],[2] While the COVID crisis shed a unique light on the need for more equitable health data collection across all populations, one area in which efforts are still falling short is sexual orientation and gender identity (SOGI) data – especially for those covered by Medicaid. However, change is happening, and states are leading the way.
A new State Health and Value Strategies brief, authored by SHADAC researchers, documents a few examples of SOGI data collection efforts occurring at the federal level and in Medicaid, and highlights the efforts of an Oregon community stakeholder process that identified several key areas for SOGI data improvement.
SOGI Data at the Federal Level
Unfortunately, there is currently no federal data standard for sexual orientation and gender identity. The data standard from the Department of Health and Human Services (HHS), released in 2011, defines the category of “sex” only as biological sex and makes no mention of gender or gender identity. Additionally, when SOGI data collection efforts have been made over the past decade, certain aspects have been measured more routinely than others. For instance, most of the research to date has focused on how to collect sexual orientation, and much less is known about the best way to collect gender identity. A 2016 review, for example, found that of the 12 federal surveys collecting some aspect of SOGI information, only half collect information on gender identity.
SOGI Data in Medicaid
The collection of SOGI data in Medicaid is even less common than in federal surveys. SHADAC’s review of state Medicaid applications identified only two states that provided applicants an opportunity to select something other than “male” or “female” when asked to indicate their sex/gender (these terms were used interchangeably to refer to biological sex). Connecticut’s paper application has an open-text write-in option for “gender.” Oregon asks applicants to indicate their “sex assigned at birth” as well as their “gender identity.”
Improving the Collection of SOGI Data – The Oregon Model
There are still many unresolved methodological and conceptual issues when it comes to the collection of SOGI information, and very limited research assessing the validity of SOGI data specifically collected via the Medicaid application process. But one state leading the way in this effort is Oregon.
In 2018, the Oregon Health Authority (OHA) Office of Equity and Inclusion convened a SOGI Data Collection Workgroup to develop a set of SOGI data standards. The workgroup was composed of stakeholders who interact with the Lesbian, Gay, Bisexual, Transgender, Queer, and other (LGBTQ+) community and health systems, many of whom also identify as LGBTQ+ themselves. The group recommended a core set of five demographic questions and response options, some open-ended so that respondents can write in an answer and some allowing respondents to check more than one categorical box, if appropriate. Options were also provided for respondents to indicate that they did not want to answer or did not know what the question was asking. The workgroup also identified a set of additional questions needed to ensure respectful communication.
Soon these standards will be put into practice. A bill requiring OHA and the Oregon Department of Human Services to expand their existing race, ethnicity, language, and disability data collection standards to include SOGI was recently passed by the state legislature. The state plans to convene a rule-making advisory committee in the winter of 2022 to finalize SOGI standards using the workgroup’s draft standards as a starting point.
Looking Forward
Absent any federal standard, states looking to address their SOGI data collection gaps may need to modify SOGI questions depending on a variety of factors, including the age, cultural background, and language preferences of their target population, and should be mindful of the evolving terminology used to describe gender identity. Oregon serves as an excellent model for how to undertake a thoughtful community stakeholder process that can inform new data collection efforts on a topic that is fluid and rapidly changing, yet vitally important for the health of Medicaid populations.
[1] Conron, K.J. & Goldberg, S.K. (April 2020). LGBT people in the US not protected by state non-discrimination statutes. UCLA School of Law, The Williams Institute. https://williamsinstitute.law.ucla.edu/wp-content/uploads/LGBT-ND-Protections-Update-Apr-2020.pdf
[2] Conron, K.J. & Goldberg, S.K. (January 2018). Over half a million LGBT adults face uncertainty about health insurance coverage due to HHS guidance on Medicaid requirements. UCLA School of Law, The Williams Institute. https://williamsinstitute.law.ucla.edu/wp-content/uploads/LGBT-Medicaid-Coverage-US-Jan-2018.pdf
Current Population Survey Shows 2020 National Uninsured Rate Stable, Rising in Three States
September 22, 2021: On September 14, the U.S. Census Bureau released 2020 national health insurance estimates from the Current Population Survey along with public use microdata. These data will serve as one of very few sources of information on 2020 state-level health insurance because the U.S. Census Bureau will not release its normal set of 1-year estimates from the 2020 American Community Survey (ACS), due to impacts of the pandemic that resulted in nonresponse bias and substantially lower response rates.
Given its large sample size, SHADAC typically relies on the ACS to study state and sub-state (e.g., county-level or state-level coverage by race) health insurance trends and posts detailed state estimates on State Health Compare. However, because the ACS data are not being released as usual this year, we recommend that analysts use the CPS, and we have posted 2020 estimates from this survey on State Health Compare for analysts and policymakers who need 2020 state-level information on coverage. Differences between the ACS and CPS and considerations for their use are summarized here.
This post presents highlights from the 2020 state-level coverage estimates on State Health Compare and compares 2020 estimates to 2018, a pre-COVID baseline unaffected by pandemic-related data collection challenges.
Uninsurance was stable nationally and in most states
In 2020, 8.6 percent of Americans (nearly 28 million people) were uninsured all year, statistically unchanged from a pre-pandemic baseline of 8.5 percent in 2018. Rates of uninsurance were unchanged in most states, though three states (Arizona, Missouri, and Tennessee) saw increases and five states (Florida, Maryland, Oregon, Vermont, and Virginia) experienced decreases. Tennessee had the largest increase at 4.1 percentage points (PP) (11.4 percent vs. 7.3 percent), and Virginia had the largest decrease at 3.3 PP (5.5 percent vs. 8.8 percent).
More Americans had public coverage and fewer had private coverage
The percent of Americans with public coverage at some point during 2020 increased to 32.8 percent from 32.3 percent in 2018. This equates to 2.4 million more people with public coverage at some point in 2020 as compared to 2018. At the state level, seven states had increases in rates of public coverage (Maryland, Massachusetts, Michigan, New Hampshire, Ohio, Oklahoma, and Wyoming), and only Virginia had a decrease in rates of public coverage. Of these states, Maryland had the largest increase in public coverage at 6.2 PP (31.2 percent vs. 25.0 percent) and Virginia had the largest and only decrease in public coverage at 3.4 PP (25.1 percent vs. 28.5 percent).
The percent of Americans with private coverage at some point during 2020 fell to 58.6 percent from 59.2 percent in 2018, which represents 934,000 fewer people with private coverage. In most states, however, the percent with private coverage remained stable. Just two states (Virginia and West Virginia) saw increases in private coverage, and five states (Colorado, Delaware, Massachusetts, Ohio and Tennessee) saw decreases. Of these states, Virginia had the largest increase at 6.7 PP (69.5 percent vs. 62.7 percent), and Tennessee had the largest decrease at 7.1 PP (54.7 percent vs. 61.7 percent).
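For readers replicating these comparisons, each change above is a simple difference of percents, reported in percentage points (PP). A minimal sketch using the rounded private-coverage rates cited in the text; note that SHADAC computes changes from unrounded estimates, so recomputing from rounded rates can differ by about 0.1 PP:

```python
# Percentage-point (PP) change = 2020 rate minus 2018 rate, both in percent.
# Rates below are the rounded private-coverage figures cited in the text.
rates = {
    "U.S.": (59.2, 58.6),
    "Virginia": (62.7, 69.5),
    "Tennessee": (61.7, 54.7),
}

for name, (rate_2018, rate_2020) in rates.items():
    change_pp = rate_2020 - rate_2018
    direction = "increase" if change_pp > 0 else "decrease"
    print(f"{name}: {abs(change_pp):.1f} PP {direction}")
```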
2020 coverage estimates available on State Health Compare
In addition to state-level estimates by broad coverage type (Insured, Private, Public, and Uninsured) and by more detailed coverage type (Employer, Individual, Medicaid/CHIP, Medicare, and Uninsured), coverage estimates are available by age group (0-18, 0-64, 19-64, and 65+) and by health status (Good/Very Good/Excellent and Fair/Poor). Estimates by poverty level will likely be forthcoming. However, coverage estimates by race/ethnicity for most states do not meet our standards for statistical reliability and precision due to the relatively small sample size of the CPS.
Data users should also note that the 2020 State Health Compare coverage estimates from the CPS are not comparable to estimates from the ACS, since the two surveys use different concepts of health insurance coverage and uninsurance. The CPS asks respondents if they had a particular type of coverage at any point during the previous year or if they were uninsured for the entire year. The ACS asks respondents about their health insurance coverage at the time of the interview. More information on this topic will be available in a forthcoming related brief.
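The difference between the two uninsurance concepts can be made concrete with a toy example. The monthly coverage histories below are entirely illustrative, not survey data; the point is only that "uninsured all year" (CPS) and "uninsured at the time of interview" (ACS) classify the same people differently:

```python
# Toy monthly coverage histories (True = insured that month), illustrative only.
people = {
    "A": [True] * 12,                # insured all year
    "B": [False] * 12,               # uninsured all year
    "C": [True] * 6 + [False] * 6,   # lost coverage mid-year
}

# CPS concept: uninsured for the ENTIRE previous calendar year.
cps_uninsured = [p for p, months in people.items() if not any(months)]

# ACS concept: uninsured AT THE TIME of the interview (say, December).
acs_uninsured = [p for p, months in people.items() if not months[11]]

print(cps_uninsured)  # ['B'] -- only B was uninsured all 12 months
print(acs_uninsured)  # ['B', 'C'] -- C also lacked coverage in December
```

Person C counts as uninsured under the ACS point-in-time concept but not under the CPS full-year concept, which is why the two surveys' uninsured rates are not directly comparable.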
New York State of Health Pilot Yields Increased Race and Ethnicity Question Response Rates
September 9, 2021: The following content is cross-posted from State Health and Value Strategies. It was first published on September 9, 2021.
Author: Colin Planalp, SHADAC
Race response rate grew 20 percentage points, ethnicity grew 8 percentage points
Summary
- New York set out to improve race and ethnicity response rates by piloting changes to the question on the Marketplace application.
- The state enhanced its explanation on the importance of the question for applicants and assistors, and it provided new training for assistors and navigators.
- Applicants did not have to share their race or ethnicity, but they could not leave the question blank; instead, they could respond with “don’t know” or “choose not to answer”.
- Among participants in the test, race response rates increased 20 percentage points and ethnicity response rates increased 8 percentage points, while response rates for a comparison group saw minimal change.
- Based on the pilot findings, New York is expanding changes to the race and ethnicity questions system-wide for the next open enrollment period, and the state is considering additional revisions in hopes of further enhancing the quality and completeness of its data.
Introduction
Even during the COVID-19 pandemic, states are striving to enhance health equity. In addition to racial justice movements that arose during 2020, the disproportionate impact of the pandemic itself on people of color highlighted existing health inequities in the United States. These factors have influenced many states to redouble their focus on closing health gaps. However, in order to identify priorities and evaluate improvement efforts, states need high-quality and more complete data—a challenge when state health agencies’ data on race and ethnicity commonly contain large gaps. For instance, a recent report by the Centers for Medicare and Medicaid Services (CMS) found that 19 states’ race and ethnicity data were more than 20 percent incomplete, with some more than 50 percent incomplete.
Many states are looking to fill those gaps in race and ethnicity data for Medicaid and related agencies. This expert perspective highlights an effort by New York’s official state-based marketplace, NY State of Health, to improve the completeness of race and ethnicity data that applicants share when applying for Medicaid; Child Health Plus, the state’s Children’s Health Insurance Program (CHIP); the Essential Plan, New York’s Basic Health Program (BHP); or Qualified Health Plan (QHP) coverage through its Marketplace.
Setting Out to Improve Race and Ethnicity Response Rates
In the fall of 2020, staff in NY State of Health began a project to improve response rates in the collection of race and ethnicity data that people are asked to share during the application process. Historically, the agency had seen substantial gaps in those data, with roughly 40 percent of respondents skipping the question on race and 15 percent skipping the question on ethnicity. Because NY State of Health serves as a single application point for all of the health insurance programs administered by the state, any issues with missing data affect all of those programs.
The state started with a pilot project to test strategies for improving race and ethnicity question response rates. By employing a smaller-scale pilot, the approach would allow them to test a “proof of concept” to determine whether their changes resulted in the intended improvements in question response rates before embarking on a larger effort that would apply to all new and existing program applicants. Demonstrating proof of concept would also provide the state with results to bolster stakeholder engagement, which was particularly important in the New York context because the state has many health plans and assister organizations, which provide assistance to 80 percent of NY State of Health enrollees.
Test Strategies to Improve Question Response Rates
Working with the State Health and Value Strategies (SHVS) technical assistance team, New York tested multiple strategies aimed at encouraging applicants to answer the optional race and ethnicity questions.
Making the case to applicants
Under the Affordable Care Act (ACA), states are prohibited from requiring applicants to share information that is not necessary for determining eligibility for coverage, including race and ethnicity. However, best practices demonstrate that people may be more willing to voluntarily share that information if told how the data are intended to be used (e.g., responses to these questions will be used to inform health equity efforts by identifying health care gaps and enhancing outreach efforts), and how they will not be used (e.g., responses to these questions will not affect program eligibility, plan choices, or access to programs).i Based on that understanding, New York deployed a revised script for the assistors and navigators participating in the pilot:
“Here at [agency name], we are testing a new approach for the next two questions which are on race and ethnicity. Obtaining this information can help us reach and possibly bridge healthcare gaps in traditionally underserved communities.
Please answer the following questions on race and ethnicity. We use this data to improve services to the community and to enhance outreach efforts. You do not have to answer these questions and giving us this information will not affect your eligibility, plan choices, or access to programs.”
While the message remained similar, and crucially still notified applicants that providing race and ethnicity information was optional, the revised introductory language made two key changes. First, it provided additional detail on how the information may be used, “to improve services to the community and to enhance outreach efforts,” whereas the original version stated more generically that “answering them can help us serve your community better.”
Second, the revised language flipped the order of the paragraph, starting with the request to share the information and a brief explanation of how the information may be used, followed by an acknowledgment that providing the information is optional. Conversely, the original introductory paragraph began by stating that “you do not have to answer any questions about race or ethnicity,” which could have discouraged applicants before they even learned why they might consider it.
Requiring a response
Although states are prohibited from requiring applicants to share race and ethnicity information, New York wanted to discourage people from simply skipping past the question. To achieve that, the state asked the organizations participating in the pilot to treat the question as if it required a response. However, applicants were given the options of “Don’t Know” and “Choose Not to Answer” to ensure they could still opt out of sharing the information (Figure 1). That way, applicants could decline to share the requested information, but requiring a response made answering the question just as easy as declining to answer it.
Figure 1. New York’s race and ethnicity pilot question
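The “required but declinable” pattern described above can be sketched in a few lines of validation logic. The opt-out labels mirror the pilot’s choices, but the function name and category list here are illustrative, not NY State of Health’s actual implementation:

```python
# Sketch of "required but declinable" form validation: a blank answer is
# rejected, but explicit opt-out responses are always accepted.
# Option labels are illustrative, not NY State of Health's production code.
RACE_OPTIONS = {
    "American Indian or Alaska Native", "Asian", "Black or African American",
    "Native Hawaiian or Other Pacific Islander", "White",
    "Don't Know", "Choose Not to Answer",
}

def validate_race_response(selected: set) -> bool:
    """Valid if at least one listed option is selected (opt-outs count)."""
    return bool(selected) and selected <= RACE_OPTIONS

print(validate_race_response(set()))                     # blank -> False
print(validate_race_response({"Choose Not to Answer"}))  # opt-out -> True
print(validate_race_response({"Asian", "White"}))        # multi-select -> True
```

Structuring the question this way means declining takes exactly as much effort as answering, which is the behavioral nudge the pilot relied on.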
Training for assistors and navigators
In addition to piloting changes to its race and ethnicity question, New York also emphasized the importance of the data to the health plan and assistor organization that participated in the pilot. The state provided training for those entities and their navigators and assistors. That training—including a computer-based presentation and written materials—described in detail how assistors and navigators should ask the race and ethnicity questions, including presenting a standardized script, and it explained the importance and purpose of those data: allowing the state to better understand who they are reaching with coverage and who is still being missed.
The state felt that approach was important because most people who obtain health insurance through NY State of Health do so with the help of an assistor (80 percent in 2020), rather than filling out the application entirely on their own. For that reason, navigators and assisters can serve as champions to help collect those data—or alternatively, they could discourage applicants from sharing that data if they do not understand the potential value it carries.
Evaluating the pilot
In addition to testing changes to how the state collects race and ethnicity data, New York also wanted to evaluate whether the pilot appeared to improve response rates. While it is difficult to definitively attribute any changes to the pilot—especially because it coincided with complicating factors related to the pandemic—the state found encouraging results.
In the health plan and assister organization that participated in the pilot, the response rates for race increased 20 percentage points, to 64.0 percent in the test period during 2021, compared to 44.0 percent during a similar period in 2020 (see Figure 2). The comparison group of other assistor organizations in the state, meanwhile, saw an increase of only 0.9 percentage points. The same health plan and assister group also saw the response rate for ethnicity increase 8.0 percentage points (from 80.3 percent to 88.3 percent), compared to a decline of 1.4 percentage points in the comparison group.
Figure 2. Race and ethnicity response rates, pilot test and comparison groups
Group | Race: 2020 | Race: 2021 | Race: Change | Ethnicity: 2020 | Ethnicity: 2021 | Ethnicity: Change
Comparison group | 48.8% | 49.7% | +0.9 pp | 83.0% | 81.6% | -1.4 pp
Test group | 44.0% | 64.0% | +20.0 pp | 80.3% | 88.3% | +8.0 pp
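The Figure 2 comparison can be recomputed directly, including the test-versus-comparison contrast (a simple difference-in-differences, which is an interpretive framing of ours rather than a calculation the state reports):

```python
# Recomputing the Figure 2 changes and the test-vs-comparison contrast.
# Rates are response-rate percents: {"question": (2020 rate, 2021 rate)}.
fig2 = {
    "comparison": {"race": (48.8, 49.7), "ethnicity": (83.0, 81.6)},
    "test":       {"race": (44.0, 64.0), "ethnicity": (80.3, 88.3)},
}

for question in ("race", "ethnicity"):
    changes = {g: v[question][1] - v[question][0] for g, v in fig2.items()}
    net = changes["test"] - changes["comparison"]  # difference-in-differences
    print(f"{question}: test {changes['test']:+.1f} pp, "
          f"comparison {changes['comparison']:+.1f} pp, net {net:+.1f} pp")
```

The net contrasts (+19.1 pp for race, +9.4 pp for ethnicity) are what suggest the improvement is attributable to the pilot rather than to statewide trends.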
Future Considerations
Based on the successful results of its test, New York plans to implement the piloted changes on its application used by all people to enroll in and renew their coverage. Additionally, the state has continued to consider other revisions to the way in which it collects race and ethnicity data from applicants. The state has not yet implemented changes related to the following strategies, but it is considering making them in time for the next Marketplace open enrollment period.
Single race and ethnicity question
Research from the U.S. Census Bureau has found that a single, combined race and ethnicity question yields a greater response rate and improved accuracy as compared to two separate questions for race (e.g., American Indian/Alaska Native, Asian, White, Black/African American, etc.) and ethnicity (i.e., Hispanic/Latino) (see Figure 3). Historically, New York has combined the two questions into a single page of its enrollment system, but it is considering combining them into a single question that lists all response options (race and ethnicity) together with instructions to select all that apply.
Figure 3. U.S. Census Bureau-tested combined race and ethnicity question
Tailoring race and ethnicity response options
New York is also considering revisions to the race and ethnicity options it lists as part of the state’s question, with two potential benefits. First, offering options that reflect how individuals identify may make them more likely to respond. Second, offering more granular and intuitive response options could allow the state to analyze the data with more specificity than the typical response categories allow. For instance, the state has a relatively large Middle Eastern and North African population, which typically is considered part of the White response category. However, that may not be intuitive to applicants, and it limits the state’s ability to analyze health inequities for that population. To facilitate its consideration of that option, the state worked with SHVS technical assistance providers to pull Census Bureau data on the size of various racial and ethnic populations within the state.
Conclusion
With a growing focus on health equity, high quality and complete data on beneficiaries’ race and ethnicity can help Medicaid and other state health programs to identify and mitigate gaps. This example from New York shows how states can test and evaluate strategies to improve the collection of race and ethnicity data to determine and build on approaches that work. And the state’s consideration of new strategies illustrates how improving these data is likely to require persistent and iterative efforts over time rather than any single, easy fix.
i Baker, D.W., et al. (2005). Patients’ attitudes toward health care providers collecting information about their race and ethnicity. J Gen Intern Med, 20(10), 895-900.
New Brief Examines Collection and Availability of Data on Race, Ethnicity, and Immigrant Groups in Federal Surveys
September 9, 2021: Measurement of race, ethnicity, and immigration (REI) status is a critical component of research used to inform policy. However, the standards for measuring these concepts are often confusing for analysts. A new SHADAC brief aims to assist state and federal analysts with survey development and/or analysis of existing survey data to generate estimates of health insurance coverage and access to care across racial and ethnic groups and according to nativity and/or immigrant status.
Survey Background
The new brief presents the collection and classification of survey data for populations defined by race, ethnicity, and nativity/immigrant (REI) status as well as the availability of these data in public use files. We focus in particular on surveys that are conducted by federal agencies and that collect information about health insurance coverage and access to care on an annual or periodic basis for the general population of the United States. Because policy decisions and funding priorities are typically made at the state and local level, the brief emphasizes federal sources that afford state-level estimates as well.
The table below lists the surveys included in the brief, the federal agency responsible for each survey, and whether national and state estimates are available.
Survey | Federal Agency | National Estimates | State Estimates
American Community Survey (ACS) | Census Bureau | Yes | Yes |
Behavioral Risk Factor Surveillance System (BRFSS) | Centers for Disease Control and Prevention (CDC) | Yes | Yes |
Current Population Survey (CPS) | Census Bureau | Yes | Yes |
Medical Expenditure Panel Survey – Household Component (MEPS) | Agency for Healthcare Research and Quality (AHRQ) | Yes | No |
National Health Interview Survey (NHIS) | National Center for Health Statistics (NCHS) | Yes | No |
National Survey of Children’s Health (NSCH) | Census Bureau | Yes | Yes |
Survey of Income and Program Participation (SIPP) | Census Bureau | Yes | Yes |
While all of these surveys must adhere to Office of Management and Budget (OMB) standards for the collection and classification of race and ethnicity in federal surveys, the organizations conducting each survey implement these guidelines differently, resulting in race and ethnicity data that are not always comparable across surveys. Differing design components across the surveys, such as target population, sample design and size, and the year and frequency of data collection, also make comparisons difficult.
An Illustration of Measurement Variation: Ethnicity Measures
Ethnicity Group Categories Collected in Selected Federal Surveys
Ethnicity Group | ACS (2019) | BRFSS (2019) | CPS (2019) | MEPS (2018) | NHIS (2019) | NSCH (2019) | SIPP (2018)
Not Hispanic or Latino | X | X | X | X | X | X | X | |
Hispanic or Latino | X | X | X | X | X | X | X | |
Mexican, Mexican-Am., Chicano | X | X | X | X | X | |||
Mexican | X | X | X | |||||
Mexican-American | X | X | ||||||
Chicano | X | X | ||||||
Puerto Rican | X | X | X | X | X | X | X | |
Cuban/Cuban-American | X | |||||||
Cuban | X | X | X | X | X | X | ||
Cuban-American | X | X | ||||||
Dominican (Republic) | X* | X* | X | X | ||||
Central or South American | X | |||||||
Central American | X | |||||||
Central American, exc. Salv. | X* | |||||||
Costa Rican | X* | |||||||
Guatemalan | X* | |||||||
Honduran | X* | |||||||
Nicaraguan | X* | |||||||
Panamanian | X* | |||||||
Salvadoran | X* | X* | ||||||
Other Central American | X* | |||||||
South American | X* | X | ||||||
Argentinean | X* | |||||||
Bolivian | X* | |||||||
Chilean | X* | |||||||
Colombian | X* | |||||||
Ecuadorian | X* | |||||||
Paraguayan | X* | |||||||
Peruvian | X* | |||||||
Uruguayan | X* | |||||||
Venezuelan | X* | |||||||
Other South American | X* | |||||||
Spaniard | X* | |||||||
Other Latin American | X | |||||||
Other Hispanic/Latino/Spanish | X | X | X | X | X | X | X | |
Multiple Categories Allowed | X | X | X | X | ||||
Other Category Specified | X | X | X | X | ||||
*Asterisks denote categories that are not specified in survey questions but are reported as open-ended “other” responses.
The differential implementation of OMB standards can be seen in the measurement of ethnicity across the seven surveys included in this brief. In federal surveys, “ethnicity” refers specifically to Hispanic ethnicity. The minimum OMB standard for collecting data on ethnicity includes the options: 1) Hispanic/Latino or 2) Not Hispanic/Latino. All seven surveys ask about Hispanic or Latino origin—and in some cases, Spanish as well—using three main formulations for the ethnicity question: whether the respondent is of Hispanic, Latino, or Spanish origin; what Hispanic/Latino group represents the respondent’s ethnic background; or whether the person considers themselves to be Hispanic or Latino. All seven surveys also include further identifying categories of Hispanic origin as a follow-up question to the primary ethnicity question.
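When pooling ethnicity data across surveys, analysts typically collapse each survey's detailed categories to the OMB minimum standard, the common denominator across all seven. A minimal sketch of such a recode; real crosswalks would also need codes for missing and refused responses, which are omitted here:

```python
# Sketch: collapsing detailed Hispanic-origin responses to the two-category
# OMB minimum standard so estimates are comparable across surveys.
OMB_HISPANIC = "Hispanic or Latino"
OMB_NOT_HISPANIC = "Not Hispanic or Latino"

def to_omb_ethnicity(detailed: str) -> str:
    # In these surveys, every detailed category other than
    # "Not Hispanic or Latino" identifies a Hispanic-origin subgroup.
    return OMB_NOT_HISPANIC if detailed == OMB_NOT_HISPANIC else OMB_HISPANIC

# Illustrative detailed responses drawn from the table above.
for response in ("Puerto Rican", "Salvadoran", "Not Hispanic or Latino"):
    print(f"{response} -> {to_omb_ethnicity(response)}")
```

The cost of this harmonization is exactly the loss of detail the table documents: a survey that distinguishes Guatemalan from Honduran respondents and one that only offers "Other Central American" become indistinguishable after the recode.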
State and Local Level Data
Also included in this new brief are sample sizes for selected REI groups by state for the five surveys that contain state identifiers in their public use files (ACS, BRFSS, CPS, NSCH, and SIPP). However, it is important to note that sample sizes vary depending on the state and the REI group of interest. Even with publicly available geographic identifiers, the sample sizes for specific REI groups at lower levels of geography may be too small to produce reliable estimates. For example, the BRFSS has relatively large national 2019 samples of American Indian and Alaska Native (AIAN) (n=6,639) and Asian (n=8,531) respondents. However, the 2019 Minnesota sample includes only 170 AIAN and 287 Asian respondents, precluding comprehensive health equity research on these groups in a state with a large Asian population and one of the larger American Indian populations.
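A rough back-of-the-envelope calculation shows why a subgroup sample of 170 is limiting. The sketch below assumes simple random sampling and an illustrative rate of 10 percent; BRFSS's complex sample design would inflate the margin of error further (design effect > 1), so treat this as a lower bound, not BRFSS methodology:

```python
import math

# Approximate 95% margin of error (MOE) for an estimated proportion p
# from a sample of size n, under simple random sampling.
def moe_95(p: float, n: int) -> float:
    return 1.96 * math.sqrt(p * (1 - p) / n)

p = 0.10  # illustrative rate, e.g., 10 percent uninsured
for n in (6639, 170):  # national vs. Minnesota AIAN sample sizes cited above
    print(f"n={n:>5}: estimate {p:.0%} +/- {moe_95(p, n):.1%}")
```

At n=6,639 the margin of error is under one percentage point; at n=170 it is roughly 4.5 percentage points, wide enough to swamp most real differences between groups.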
Download a PDF of the brief for further information.
Related Reading
Collection of Race, Ethnicity, Language (REL) Data in Medicaid Applications
State Health and Value Strategies Brief by SHADAC, 2021
New SHADAC Brief Looks at Changes in Federal Surveys During COVID Pandemic
August 27, 2021: The impacts of the coronavirus pandemic have been far-ranging—on education, work, healthcare, and almost every other aspect of daily life. However, another effect of COVID that is just now coming to light is the interruption of data collection processes and falling response rates for yearly federal surveys that, among other measures, provide estimates of health insurance coverage for the United States population. Under normal circumstances, such surveys would be ideally suited to measure changes in insurance coverage during such a tumultuous time.
In a new brief, SHADAC summarizes COVID-era changes and challenges for four major federal surveys—the American Community Survey (ACS), the Current Population Survey (CPS), the Medical Expenditure Panel Survey (MEPS), and the National Health Interview Survey (NHIS)—such as shifting in-person collection methods to phone and email, adding COVID-related questions to questionnaires, extending survey fielding in order to address falling response rates, and delayed data release dates, among others.
All four surveys made significant changes to their Survey Operations:
In-person: The ACS, CPS, and NHIS all suspended in-person operations from March to June 2020, with limited in-person interviews resuming in either July or September 2020. The MEPS Insurance Component (MEPS-IC) suspended in-person interviews entirely.
Mailing: Mailing centers for the ACS and the CPS were shut down in March 2020, but re-opened with limited staffing in July 2020, allowing previously completed surveys to be collected and counted. The MEPS Household Component (MEPS-HC) shifted to dual web-and-mailing collection methods in the fall of 2020 while the MEPS-IC shifted to dual web-and-phone data collection at that time.
Telephone: All surveys (ACS, CPS, MEPS-HC, MEPS-IC, and NHIS) shifted to phone-only data collection while in-person fielding was shut down (or, in the case of the MEPS-IC, while mailing operations were suspended).
Web: Although the MEPS-HC and MEPS-IC initially shifted to telephone-only collection in March 2020, both added a web mode in the fall of 2020: the former proposed a shift to dual web-and-mail data collection while the latter opted for dual web-and-phone methods.
Three surveys made changes to their Questionnaires:
The ACS and the MEPS-IC opted not to make any changes to their questionnaires in response to COVID-19.
The CPS added five new questions regarding COVID’s impact on employment to the questionnaire. The NHIS similarly added new questions to its survey form, but instead asked respondents about COVID testing, contraction and symptoms, and prevention measures.
The MEPS-HC was the only survey to create an entirely new questionnaire: the “Social and Health Experiences” survey, which asked respondents about changes or delays in care due to COVID.
Each survey experienced falling Response Rates:
Where response rates could be measured, they dropped significantly. Neither the MEPS-HC nor the NHIS has yet released response rates for 2020, but data from previous years show a 20 percentage point drop for the NHIS (80 percent in 2018 to 60 percent in 2019). Both the ACS and CPS saw falling response rates between 2019 and 2020: the ACS dropped 15 percentage points (86 percent in 2019 to 71 percent in 2020), while the CPS fell by roughly 6 percentage points (67.6 percent in 2019 to 61.1 percent in 2020). The MEPS-IC saw a smaller decrease, from 59.6 percent in 2019 to 56.1 percent in 2020.
SHADAC plans to continue monitoring reported changes to these and other federal surveys for 2020 and, where applicable, into 2021. As 2020 data become available, our researchers will continue to produce resources explaining what effects these adaptations may have on interpretation of the data and whether alternative sources may help fill gaps where 2020 data are incomplete or cannot be used reliably.