A New Framework for Assessing Safety Net Delivery
The existing system for evaluating state safety net programs does not adequately assess program delivery because it fails to capture the human experience of accessing services.
The National Safety Net Scorecard offers a more meaningful set of metrics: "vital signs" that assess the true state of the program delivery landscape and measure progress over time. These metrics would reframe what quality service looks like for benefits administration, driving us closer to a human-centered safety net.
Safety net delivery and progress should be measured against these three key indicators of program success:
Equitable Access
Equitable access means removing barriers so that all people can reach services regardless of language, racial or ethnic identity, age, or level of ability. It also means providing a dignified, respectful, and welcoming experience that makes getting benefits as easy as possible. An accessible program has many open doors that successfully and equitably invite in all eligible people.
Online Accessibility
Is the online experience of the service simple and easy to use?
Measure the following:
- % of applications submitted via paper, desktop, mobile, landline phone, in person
- % of applications with at least one verification document submitted electronically among applications where verification was required
- Of website visitors who begin the application process, % who are unable to successfully create an account, if applicable
- Of website visitors who begin the application process, % who are unable to successfully complete remote identity proofing, if applicable
Account creation and remote ID proofing can be significant barriers to entry for many online applicants.
Mobile Accessibility
Does the service work easily on a mobile phone?
Measure the following:
- Of applications submitted online, % submitted from mobile, tablet, and desktop
- % of recertifications and periodic reports completed via mobile, tablet, and desktop
To be truly accessible, websites need to be built with a mobile-first design.
“My phone is my computer.”
Call Center Accessibility
Is the call center easily accessible to all users?
Measure the following:
- Average # of calls per day
- Average call wait times (by language)
- Call abandonment rate: % of calls dropped before the customer speaks to an agent (by language)
- First call resolution rate: % of issues resolved with one call (by language)
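As a rough illustration, the call center measures above can be computed from call logs. This is a minimal sketch under assumed data: the record shape `(language, wait_seconds, abandoned, resolved_first_call)` is hypothetical, not any state phone system's actual schema.

```python
from collections import defaultdict

def call_center_metrics(calls):
    """Compute per-language call center vital signs.

    Each record: (language, wait_seconds, abandoned, resolved_first_call).
    The field names are illustrative assumptions for this sketch.
    """
    by_lang = defaultdict(list)
    for lang, wait, abandoned, resolved in calls:
        by_lang[lang].append((wait, abandoned, resolved))

    metrics = {}
    for lang, recs in by_lang.items():
        n = len(recs)
        # First-call resolution is measured among answered calls only
        answered = [resolved for _, abandoned, resolved in recs if not abandoned]
        metrics[lang] = {
            "avg_wait_min": sum(w for w, _, _ in recs) / n / 60,
            "abandonment_rate": sum(a for _, a, _ in recs) / n,
            "fcr_rate": sum(answered) / len(answered) if answered else None,
        }
    return metrics
```

Breaking every metric out by language, as the list above recommends, is what surfaces disparities: an abandonment rate that looks acceptable overall can hide a much worse rate for a single language queue.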
“Honestly, I have like a million accounts with CAFÉ because I done forgot my CAFÉ information and I just don't have the time to … be waiting on the phone for an hour or two trying to get them on the phone.”
“Yes I know how to call. I have been calling every day for several weeks. The system tells me there are too many callers and then ends the call.”
Local Office Accessibility
Are in-person locations easily accessible to all visitors?
Measure the following:
- # of visitors to each local office relative to staffing levels
- Average wait times (by language)
Application Burden
How difficult is the application process for the service?
Measure the following:
- Average # of minutes to complete an application and renewal, via desktop and mobile
- Application completion rate: # of complete, submitted applications compared with applications started via desktop or mobile (by language)
Our 50-state report found that the time it takes to complete online applications varies dramatically, from 15 minutes in some states to 120 minutes in others. For our GetCalFresh service in California, the average application completion time is 15 minutes with less than 1% of applicants submitting minimal applications.
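The completion-rate measure above is a simple ratio, but it only becomes diagnostic when segmented. A minimal sketch, using made-up counts (the segment keys and numbers are illustrative, not real program data):

```python
def completion_rate(started, submitted):
    """Application completion rate: share of started applications
    that were actually submitted. A real system would segment its
    counts by device and language before calling this."""
    return submitted / started if started else 0.0

# Illustrative counts only: (device, language) -> (started, submitted)
segments = {
    ("mobile", "en"): (1000, 780),
    ("mobile", "es"): (400, 240),
    ("desktop", "en"): (600, 510),
}
rates = {seg: completion_rate(s, c) for seg, (s, c) in segments.items()}
# A gap between segments (here 78% vs. 60% on mobile by language)
# flags a barrier worth investigating for that group.
```

Computing the rate per device and language, rather than overall, is what connects this measure back to equitable access.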
Customer Satisfaction
How satisfied are clients with the application experience?
Measure customer satisfaction rates via user responses to these recommended questions:
- How easy or difficult was it to complete the application?
- How confident were you about which programs to apply for?
- How confident are you that you answered the questions correctly?
- How confident are you that you know what the next steps in the process are?
Customer satisfaction questions can be thoughtfully integrated at multiple stages of the process. For our benefits application pilot in Minnesota, we provide a space for quick feedback at the end of the application, and we also send an automated post-application survey to applicants within one hour of applying. We track responses over time to know how clients are feeling about the application experience, then adjust the design accordingly.
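Tracking responses over time can be as simple as averaging scores by month. A sketch under assumed data: the `(yyyy_mm, score)` pair format is a hypothetical survey export, and the 1-5 scale is an assumption.

```python
from collections import defaultdict

def satisfaction_by_month(responses):
    """Average a 1-5 'how easy or difficult was it?' score per month
    so design changes can be checked against the trend.

    responses: list of (yyyy_mm, score) pairs -- an assumed export
    format, not a specific survey tool's schema.
    """
    buckets = defaultdict(list)
    for month, score in responses:
        buckets[month].append(score)
    return {m: sum(s) / len(s) for m, s in sorted(buckets.items())}
```

A rising or falling monthly average after a design change is the signal this section describes: adjust the design, then watch whether the trend responds.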

Effective Delivery
Safety net programs should operate smoothly to serve all people who are eligible. Programs need to get things right the first time—like determining someone’s eligibility and calculating the correct benefit amount—so the experience can be fast, easy, and positive for clients.
Application Outcomes
Who is approved and who is denied for benefits?
Measure the following:
- Total application volume: # of cases per week/month
- % approvals and denials
  - By application type (online, mobile, in person, paper, landline phone)
  - By race/ethnicity, language preferences, and other key demographics
  - By annual household income (e.g. $0, at or below 100% of the federal poverty level, above 100% FPL)
  - By common denial reasons
Procedural Denials
How many applicants are denied for reasons outside of financial eligibility?
Measure: % of applicants denied for procedural or administrative reasons
- By denial reason (e.g. missed interview, missing documents)
- By race/ethnicity, language preferences, and other key demographics
Exits from safety net programs spike when paperwork is due. In California, an estimated 500,000 income-eligible households leave SNAP each year, and Code for America found that in Los Angeles County, at least 1 in 3 clients were denied SNAP benefits because of a missed interview.
“[My interview] was at 9:00 AM but I didn't even receive a phone call…and I had to work so I sat by the phone and I waited and waited and I never got a call.”
Timeliness
How long does it take for people to receive benefits?
Measure the following:
- Average # of days from application date to determination, for both approvals and denials
  - By expedited and regular service
- Average # of days from case approval to EBT card activation
- % of Medicaid determinations completed within 24 hours
- % of SNAP applications determined within the mandated time frame (30 days for regular service, 7 days for expedited service)
- % of Medicaid applications determined within the mandated time frame (45 days for most applications, 90 days for disability-based applications)
It can also be helpful to break out application processing times as a distribution: for example, the share of applicants approved or denied within one day, two days, and so on, up to and beyond the mandated time frame. A distribution surfaces nuance that averages obscure, such as clumping of approvals or denials at the shortest and longest processing times, and it gives administrators flexibility to set more local goals or targets.
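The distribution described above can be sketched as a simple binning of days-to-determination, with one bucket per day up to the mandated time frame and an overflow bucket beyond it. The function name and input shape are assumptions for illustration.

```python
from collections import Counter

def processing_time_distribution(days_to_determination, mandated_days=30):
    """Bin processing times (in days) into a distribution: one bucket
    per day from 0 through the mandated time frame, plus an 'over'
    bucket for cases that exceed it. mandated_days=30 matches the
    regular SNAP standard; use 7 for expedited service."""
    # Collapse everything past the mandate into a single overflow bucket
    counts = Counter(min(d, mandated_days + 1) for d in days_to_determination)
    total = len(days_to_determination)
    labels = [str(d) for d in range(mandated_days + 1)] + [f">{mandated_days}"]
    return {label: counts.get(d, 0) / total
            for d, label in zip(range(mandated_days + 2), labels)}
```

Plotting the resulting shares makes the clumping visible at a glance, for instance a spike exactly at the mandated deadline.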
Expedited Service (SNAP)
Are clients who qualify receiving reliable expedited service?
Measure the following:
- % of applications processed as expedited
- % of expedited applications processed within 7 days
- % of expedited participants whose benefits continue
Interview Completion
Are clients able to complete the required interview?
Measure the following:
- % of interviews completed over the phone
- # of cases denied for missed interview (by language and other key demographics)
“I never know when my interviews are until after the date and I get mail about that set date appointment. Who can I call to give them information to get my application approved?”
Notifications
Are notifications effective at reaching clients?
Measure the following:
- % of clients opted in for text messages
- % of clients opted in for email
- % of email notifications opened
- % of notices returned as undeliverable (by paper and email)
Code for America’s research has shown that difficult-to-understand or untimely notices are one of the most significant pain points clients face while trying to access benefits. Caseworkers across the country confirm that bad notices make the process harder:
“They get too many notices. They get notices that contradict a notice. In one weekend they can get 5 different notices. The system is sending them out, the workers are sending them out. That's where we get a lot of our in-person traffic. They come in [because] they don’t understand what’s going on.”
Verifications
Are clients able to submit accurate verification documentation?
Measure the following:
- % of cases that request verification
- % of cases denied for missing verification (by race/ethnicity, language preferences, and other key demographics)
If missing verification is a consistent reason for benefit denial, agencies may want to consider whether every verification document is absolutely necessary. Over-verification can put excess strain on applicants and lead them to drop out of the application or renewal process:
“...too many documents are required. ID and proof of income are needed, which is understandable, but also asking for proof of ownership for my home, my mortgage bills and insurance documents, made it feel like there is no privacy.”
Renewals
Are clients able to successfully renew their benefits?
Measure the following, and break out by key demographics when possible:
- % of SNAP periodic reports approved and % denied
  - % denied for form not returned
  - % denied for missing verification
- % of SNAP recertifications approved and % denied
  - % denied for form not returned
  - % denied for missed interview
  - % denied for missing verification
- % of SNAP periodic reports and recertifications submitted online
- % of Medicaid renewals approved and % denied
  - % of renewals completed ex parte
    - By MAGI / non-MAGI
  - % denied for form not returned
  - % denied for failure to submit verification
- % of cases that lose benefits mid-certification
  - By reason
“I never received the sar7 renewal, now I received a notice of action about my CalFresh benefits being discontinued! I tried contacting my worker number but it does not allow me to leave messages. I’m 78 yrs old and on disability I cannot have my benefits expire as it’s my only source for food. Can someone please help me with this issue so I don’t lose my benefits? My income or any information hasn’t changed since my original application. Please help.”
Churn
How often are clients churning off of programs they are eligible for?
Measure the following, and break out by key demographics when possible:
- % of new applicants that received benefits in the past 60 days
- % of cases with a renewal or report due that then reapply for benefits within the following 30, 60, 90 days (by periodic report vs. recertification)
Churn measures how often eligible participants lose benefits for procedural reasons and then reapply to regain benefits within a short period of time. High churn rates indicate unnecessary rework on the part of clients and state employees. The first of the measures above captures the impact of churn on state agency workloads, but may be clouded by other explanations of changes in application trends. The second captures how common churn is for households that have lost benefits, but does not capture how frequently case closures occur. These measures are therefore most informative when used together.
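The second measure above (closed cases that reapply within a window) can be sketched from case records. A minimal illustration under assumed data: the `closures` and `applications` structures are hypothetical, not any state system's actual schema.

```python
from datetime import date, timedelta

def churn_reapplication_rate(closures, applications, window_days=60):
    """Share of procedurally closed cases that reapply within a window.

    closures: {case_id: closure_date} for cases closed at a renewal or
    periodic report. applications: {case_id: [application_dates]}.
    Both structures are illustrative assumptions for this sketch.
    """
    if not closures:
        return 0.0
    churned = 0
    for case_id, closed_on in closures.items():
        for applied_on in applications.get(case_id, []):
            # Count a case once if any reapplication falls in the window
            if timedelta(0) < (applied_on - closed_on) <= timedelta(days=window_days):
                churned += 1
                break
    return churned / len(closures)
```

Running this at 30, 60, and 90 days, as the measure suggests, shows how quickly churned households come back, which is itself a signal of continued eligibility.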
Compassionate Integrity
People should receive the benefits to which they are entitled. Benefits should be delivered correctly according to policy, and people should benefit to the fullest extent from available policy flexibilities.
Participation Rate
Is the program reaching everyone who’s eligible, with a focus on identifying and prioritizing populations that are often underserved by government?
Measure the following:
- # of SNAP participants compared with the total # of estimated eligible people
- # of Medicaid participants compared with the total # of estimated eligible people
- Participation rates by race/ethnicity, language, age group, and other demographic markers
- Cross-program enrollment rates (e.g. % of Medicaid enrollees who are also enrolled in SNAP, % of SNAP households with a child under six who are also enrolled in WIC, etc.)
- % of applicants who are applying for the first time
This is an area where it is especially important to break out data by key demographics to better understand the participation gap and identify outreach or streamlined enrollment opportunities. It can also be difficult to accurately gauge participation rates without a clear understanding of how big the eligible population is. States can build an understanding of their eligible population by using data sources like the number of weekly unemployment claims, a state’s monthly numbers on people in the workforce, and statistics on state residents’ personal incomes.
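The participation rate itself is simple division; the hard part is the denominator, which is always an estimate. A sketch with made-up figures (the enrollment and eligibility numbers below are illustrative, not actual state data):

```python
def participation_rate(enrolled, estimated_eligible):
    """Program participation rate: enrolled participants as a share of
    the estimated eligible population. The denominator is itself an
    estimate (e.g. modeled from unemployment claims, workforce counts,
    and income statistics), so treat the result as approximate."""
    if estimated_eligible <= 0:
        raise ValueError("estimated eligible population must be positive")
    return enrolled / estimated_eligible

# Illustrative figures only (not actual state data):
overall = participation_rate(enrolled=820_000, estimated_eligible=1_000_000)
subgroup = participation_rate(enrolled=30_000, estimated_eligible=60_000)
gap = overall - subgroup  # a 32-point gap flags an underserved population
```

Comparing the overall rate against subgroup rates, as in the last lines, is exactly the demographic breakout the paragraph above recommends.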
Accuracy
How accurate are benefit allotments?
Measure:
- Total amount of overpayments and underpayments as a share of issuance
  - % of cases with overpayments
  - % of cases with underpayments
- Accuracy of determinations
  - % of cases inaccurately denied
  - % of cases inaccurately approved
- # of overpayments filed against clients
  - % for intentional program violation
  - % for agency error
  - % for household error
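The first measure above combines over- and underpayment dollars into one share of issuance. A minimal sketch (similar in spirit to SNAP's quality control payment error rate, though the official methodology is sample-based and more involved):

```python
def payment_error_rate(overpayment_dollars, underpayment_dollars, total_issuance):
    """Combined payment error rate: over- and underpayment dollars as
    a share of total benefits issued. Summing both directions matters
    because overpayments and underpayments would otherwise cancel out,
    hiding inaccuracy that harms clients in both directions."""
    if total_issuance <= 0:
        raise ValueError("total issuance must be positive")
    return (overpayment_dollars + underpayment_dollars) / total_issuance
```

Note that an underpayment is still an error against the client, which is why it is added rather than netted against overpayments.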
“Yes, I was approved for one dollar ($1.00) per-month. Reason given [was that] I was overpaid $400 … $1.00 per month. What can I do to receive more aid?”
Appeals/Hearings
How smooth and responsive is the appeals process?
Measure the following:
- # of appeals made
- % of appeals where the agency decision is upheld
- % of hearings where agency decision is reversed
“We shouldn’t measure data for the sake of measuring it. Let’s say that … we identify a disparity. Unless [technical assistance] is able to truly help me understand policy or business processes that may be contributing to that disparity, or is able to show me what other states with better performance on that measure are doing, it doesn’t help. You can’t just measure something for the sake of measuring it … You have to actually help states drive outcomes on that measure.”
Federal Agencies
Federal agencies should be collecting comparison data to track changes and trends across the nation over time. These comparisons also highlight opportunities for technical assistance and peer-to-peer learning, and can direct policy initiatives and pilot programs for solutions like demonstration waivers.
State Agencies
State agencies should be tracking data progressions over time, observing trends in key “vital signs” metrics to measure progress.
State agencies should be comparing metrics across jurisdictions (counties, offices) to identify trends, bright spots, and challenge areas. These comparisons highlight opportunities for business process changes, technical assistance, and peer-to-peer learning.
Vendors or IT Systems
State agencies should ensure their vendors or IT systems regularly monitor and report back on the usability of client-facing digital products and tools, and confirm they can be responsive if a usability issue is discovered. Usability includes things like:
- Is the language easily understood by all users (written in plain language, at a fifth-grade reading level, etc.)?
- Are websites accessible to all users (readable by screen readers, readable on all browsers, etc.)?
- Are systems able to quickly recover from bugs and breaks (monthly hours of website downtime, time to recover from downtime incidents, etc.)?
For more on building people-centered digital services, check out our Blueprint for a Human-Centered Safety Net.