
CX Metrics Dashboard Learnings Report

One of Secretary Alejandro Mayorkas’s top priorities for 2023 was to “innovate and transform our delivery of services to advance mission execution, improve the customer experience, and increase access to services.”


The DHS Customer Experience (CX) Directorate was charged to “identify and harmonize customer experience metrics across DHS” in a dashboard available to the public on the DHS website. The CX Steering Committee created the Data Collection and Measurement working group to support this effort by working to identify outcome[1] (or customer goal)-based metrics among the operational Components.

Identifying Key Metrics

After consulting with component agencies about typical activities, the working group identified types of key metrics to share, narrowing down reporting to these two categories:

  1. Time to next step or decision (wait times)
  2. Requests for information or participation (volume of requests for service)

Initially, the working group thought that selecting two categories of metrics would fulfill the harmonization portion of the goal. However, we realized that for many components, simply identifying operational CX data was a success. We received operational metrics from the Federal Emergency Management Agency (FEMA), the Transportation Security Administration (TSA), and U.S. Citizenship and Immigration Services (USCIS), which we assembled into an online CX Metrics Dashboard.
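The report does not describe how the dashboard's entries were structured internally. Purely as an illustration of the two metric categories above, one could model per-Component dashboard entries with a record shape like the following (every name and figure in this sketch is hypothetical, not drawn from the actual dashboard):

```python
from dataclasses import dataclass
from enum import Enum


class MetricCategory(Enum):
    """The two categories of key metrics the working group identified."""
    WAIT_TIME = "time to next step or decision"
    REQUEST_VOLUME = "requests for information or participation"


@dataclass(frozen=True)
class CXMetric:
    """One dashboard entry: a single number plus the context to read it."""
    component: str            # e.g. "FEMA", "TSA", "USCIS"
    category: MetricCategory
    label: str                # plain-language description for the public
    value: float
    unit: str                 # e.g. "months", "applications"
    period: str               # reporting period, e.g. "FY2023"


# Hypothetical entries, only to show how metrics from several
# Components could be grouped by category on one page.
metrics = [
    CXMetric("USCIS", MetricCategory.WAIT_TIME,
             "Median time to a decision", 4.5, "months", "FY2023"),
    CXMetric("FEMA", MetricCategory.REQUEST_VOLUME,
             "Applications for assistance received",
             120000, "applications", "FY2023"),
]

# Grouping by category is what lets disparate Component services
# appear side by side under a shared, harmonized frame.
wait_times = [m for m in metrics if m.category is MetricCategory.WAIT_TIME]
```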

[1] We use “outcomes” to refer to the concrete experiences and life changes that customers can achieve through services. Because “outcomes” can be a term of art in performance and evaluation, we also used the term “customer goals” to reflect the same meaning. The purpose in either case is to shift the focus from the service itself to the actual experience that customers have with and because of the service.


Stories. As a field, CX is currently dominated by satisfaction metrics, analytics, and surveys. Government, in general, tends toward big numbers in big lists. This project approached metrics from the angle of telling specific, intentional stories about key services.

Operational data. This project attempts to bring operational data into the conversation about CX metrics. When paired with solid user research and human-centered design practices, how might operational data unlock new ways of thinking about customer experience or point out areas of opportunity?


Purpose. This report summarizes learnings from the CX Metrics Dashboard development process. It covers what we did, why we did it, and what we learned.

Motivations. There are two broad motivations for this work: policy and strategy. In part, the dashboard responds to the Secretary and HSAC priority to identify and harmonize CX metrics across DHS. Strategically, as we establish CX as a function across DHS, we see a need to measure the outcomes of CX work. Building capacity to measure operational data helps set a baseline for CX measurement and gives context to surveys and other data.

Democratizing measurement. Long-term success of CX in the organization depends partly on a measurement mindset that asks,

“What are the experiences that the public expects of us?”

“What experience do we want our customer to have?”

Metrics allow us to point at the measurable, concrete ways that we change people’s lives for the better—or show us the distance we have left to go.

Metrics as a medium. Data and metrics are one means by which government communicates with the public and federal staff. The current norm for communicating data within DHS is to embed statistics within long PDF reports. This project shifts that practice to deliver bite-sized numbers with context in a clear, accessible, and digestible format. It emphasizes the value of data with minimal interpretation: direct and to the point.
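As a sketch of the “bite-sized number with context” idea above (the report does not describe the dashboard's implementation; the function and the figure here are invented for illustration):

```python
def metric_card(value: str, label: str, context: str) -> str:
    """Render one statistic as a short, self-contained text card:
    the number first, then a plain-language label, then one line of context."""
    return "\n".join([value, label, context])


# Hypothetical card; the figure is illustrative, not a real DHS statistic.
card = metric_card(
    "4.5 months",
    "Median time to a decision",
    "Illustrative figure for FY2023",
)
```

The design choice is to lead with the number itself and attach just enough context to interpret it, rather than burying the statistic inside a long report.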


Approach. We approached the dashboard as a capacity-building exercise that became more focused on product delivery over time. We sought to build organizational muscle for identifying, collecting, and sharing metrics, both internally and externally. Throughout, we continually discussed the work’s purpose and honed our intended outcomes.

Stories. From the beginning, we brought a focus on human-centered data storytelling. If data and metrics are a means to tell a story, that raises the question,

“What story and to what end?”

Similarly, the questions “Whose story, and how?” run through this dashboard. Although we set out to include a range of DHS components, many stories remain unwritten.

Research. Very early user research suggests that the dashboard is a helpful first step toward greater transparency. Both our stakeholders and test users have shared this sentiment. We have an opportunity to further this research by asking,

“What helps CX practitioners in telling the story of CX?”

Conversations. This work evolved through a series of conversations. In particular, back-and-forth between CX and Performance[2] helped us make sense of the scope of the project and what we could do with what we had.

[2] The Government Performance and Results Act (GPRA) and the GPRA Modernization Act established and strengthened Performance as a government function. Performance teams help government agencies set performance goals and objectives and measure progress towards meeting them. OMB considers CX to fall under performance measurement. There is ample ground for partnership between CX and Performance offices.


Diversity. Showing DHS metrics poses a particular challenge because the various Component services differ widely in nature. We found it fitting to ask,

“What benefit does a casual visitor from the public get from seeing CX metrics from across various DHS agencies on one webpage?”

Burden. We wrestled with the burden that data collection imposed on federal staff, even while arguing that the effort was worthwhile. Government largely remains in a reactive posture when it comes to data, which probably informed the sense of burden.

How do we consider burden in requests for data?

As DHS grows its measurement maturity and moves towards a strategic data posture, we can look for opportunities to dovetail efforts and make good use of existing data.

Public needs. Based on our initial user research, we hypothesize that the public wants to see granular, localized data about services, as close to real-time as possible. At the same time, we wonder whether generalized service metrics can be useful.

Could metrics such as the ones we shared help set basic expectations and build back public trust?

Risk. When it comes to public services, time-based metrics for customer experience tend to vary by location and size of service center. However, we have limitations in government around what we can share. Before sharing, we must consider what might be fodder for litigation or testimony.

How do we balance carefulness with transparency in what we share?

Decentralization. A central challenge of working with data and metrics from the vantage point of HQ is a lack of cohesion and control. We do not control the data from where we sit. This required working with Components individually to understand what data was available and appropriate to share, which in turn meant meeting Components where they were on data. Components had varying levels of measurement maturity.


Consider what metrics may be useful to field employees.

How might the metrics we create be useful to those in the field and headquarters alike? Metrics can be a medium for transparency not only between government and the public, but also between HQ and field offices.

Field observation can help you learn how field staff use metrics to enhance their own work. You may find that leaders in the field are using metrics creatively to drive continuous improvement in ways that leaders at HQ do not get to see.

Consider the unintended consequences of metrics on the work of staff at various levels in the organization.

There is a tension between what executives and the public might look for from metrics, which might also differ from what Component and field-level staff would find useful. In short, metrics are not one-size-fits-all. There may be many possible metrics futures, with different kinds of data products that serve different users—executive, field, and public.


Lessons. We created the CX Metrics Dashboard to meet the Secretary’s priority to identify customer experience metrics across DHS. In the process, we encountered challenges in gathering operational data across a complex government agency. Moreover, we found that operational data has limits in its ability to speak to customer experience. Data drawn from operations can shine a light on moments or stages of a customer experience. But can it measure what matters most to customers or detect change due to customer experience efforts?

Harmonize. In a reframe of the Secretary’s priority to “harmonize” CX metrics, we’ve identified an opportunity to achieve greater harmony across three types of customer experience data: user research, customer feedback, and operational data.

Envision. CX metrics should come from a deep understanding of one's customer. Base your metrics on a vision of how you seek to improve the customer experience – grounded in user research. Use metrics to roadmap your way to that vision, with progress metrics to inform and drive outcomes along the way. 

Weave. We will continue to partner with components across DHS to guide outcomes-based CX measurement. We look forward to weaving relationships within and across DHS HQ and components. Together, we can build a shared picture of the impact of our work.

  • Connect With Us

    • Learn about the work we did and lessons learned
    • Have a sounding board for your metrics work
    • Build relationships and partnerships
    • Get guidance and training on creating CX metrics

    Email the CX Directorate

Last Updated: 06/13/2024