Enhancing the customer-centricity of GSA websites

How to assess wildly different websites in a consistent way
Oct 30, 2024

Designing an evaluation process

The public evaluates our mission and services primarily through online interactions. In 2021, GSA’s Service Design program in the Office of Customer Experience launched a strategy to enhance our customers’ digital experience. Building upon our case study on composite indicators, we asked ourselves: How do we define compliance? How do we measure it in a consistent way? This led us to a critical question: How do we get these wildly different websites to behave in ways customers expect, while aligning to federal policy and law?

Our objective was clear. We needed to design an evaluation process that captured qualitative and quantitative data to help us determine which websites needed improvement.

Over the next three years, we launched a series of digital analytics tools, successfully inventoried over 180 public-facing websites, and interviewed close to 90 website managers to gain a clear understanding of our agency’s digital portfolio. We also produced over 160 recommendations reports (one for each website we evaluated) to help each web team improve digital experience and increase compliance with federal web policies. We learned strategies to integrate both qualitative and quantitative data, ensuring our transformation process remains tightly aligned with our customers’ needs.

The evaluation toolkit: Qualitative and quantitative data in concert

Customer-centricity, meaning the understanding of customer needs and expectations, became the central theme of our website evaluation project. Using a combination of quantitative and qualitative approaches, we created a holistic view of each website’s performance and compliance. Our team also designed the Enterprise Digital Experience Index to improve the customer-centricity of GSA’s websites and to ground our work in GSA strategic goals and federal web policy, including the requirements for delivering a digital-first public experience.

The index uses qualitative and quantitative measures to determine whether sites are well managed, meet customer needs, and support the agency mission. Our team leveraged a series of free, accessible digital analytics tools to evaluate the quantitative side. We also met with every website team in GSA to gather qualitative customer-centricity data.
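To make the scoring model concrete, here is a minimal sketch of how a composite index like this can roll qualitative and quantitative measures into a single score. The indicator names, weights, and 0-100 scale below are illustrative assumptions, not the published definition of the Enterprise Digital Experience Index.

```python
# A minimal sketch of a composite index that rolls indicator scores into one number.
# Indicator names, weights, and the 0-100 scale are illustrative assumptions only.

from typing import Dict

# Hypothetical weights for each measurement area (must sum to 1.0).
WEIGHTS: Dict[str, float] = {
    "accessibility": 0.25,        # quantitative: automated accessibility scans
    "performance_seo": 0.20,      # quantitative: performance and SEO scores
    "required_links": 0.15,       # quantitative: FOIA, privacy, accessibility statement
    "uswds_usage": 0.15,          # quantitative: design system adoption
    "customer_centricity": 0.25,  # qualitative: interview-based rubric score
}

def composite_score(indicators: Dict[str, float]) -> float:
    """Weighted average of indicator scores, each already normalized to 0-100."""
    missing = set(WEIGHTS) - set(indicators)
    if missing:
        raise ValueError(f"Missing indicators: {missing}")
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# Example: one website's normalized scores.
site_scores = {
    "accessibility": 92.0,
    "performance_seo": 78.0,
    "required_links": 100.0,
    "uswds_usage": 60.0,
    "customer_centricity": 85.0,
}
print(f"Composite index: {composite_score(site_scores):.1f}")  # 83.9
```

A weighted average like this keeps each measurement area visible on its own while still producing one comparable number per website.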

Image: Two silhouettes face each other, comparing quantitative and qualitative research terms. (Whale Design/iStock via Getty Images)

The qualitative component of the index uses human-centered design interviews that have been compiled into evaluation documents. These evaluations identify such things as opportunities for additional coaching, sites that are not properly resourced to meet customer needs, and candidates for website modernization or decommissioning. We assessed whether each web team could identify its primary audience and site purpose, whether it used repeatable customer feedback mechanisms, and whether it took action based on that feedback. Additionally, we evaluated whether teams possessed the necessary skills to improve their websites, and whether they used robust methodologies (in addition to Digital Analytics Program data) to measure the impact of these improvements.
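As a rough illustration, the interview criteria described above could be captured in a simple rubric structure like the one below. The field names and the one-point-per-criterion scoring rule are hypothetical; the actual evaluations are qualitative documents, not a fixed checklist.

```python
# A hypothetical rubric structure for the qualitative criteria described above.
# Field names and the equal-weight scoring rule are assumptions for illustration.

from dataclasses import dataclass, fields

@dataclass
class QualitativeAssessment:
    identifies_primary_audience: bool
    identifies_site_purpose: bool
    has_repeatable_feedback_mechanism: bool
    acts_on_customer_feedback: bool
    team_has_needed_skills: bool
    measures_impact_beyond_dap: bool  # robust methods in addition to DAP data

    def score(self) -> float:
        """Share of criteria met, scaled to 0-100."""
        criteria = [getattr(self, f.name) for f in fields(self)]
        return 100.0 * sum(criteria) / len(criteria)

assessment = QualitativeAssessment(
    identifies_primary_audience=True,
    identifies_site_purpose=True,
    has_repeatable_feedback_mechanism=False,
    acts_on_customer_feedback=False,
    team_has_needed_skills=True,
    measures_impact_beyond_dap=False,
)
print(f"Customer-centricity score: {assessment.score():.0f}")  # 50
```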

The quantitative component analyzed each website’s accessibility, performance and search engine optimization, user behavior, U.S. Web Design System (USWDS) usage, and presence of required links. We designed a Google Chrome extension for our agency that draws on free and accessible analytics tools such as:

  • Site Scanner and custom crawlers: These tools help us evaluate compliance with agency-determined performance indicators, including the presence of certain USWDS components and required links (FOIA, accessibility statement, privacy policy, etc.). A simplified version of this kind of check is sketched after this list.
  • Digital Analytics Program (DAP): Integrated with Google Analytics, DAP offers a broad view of how users interact with the website, identifying such things as the top referring and outbound sites, page load times, and bounce rates.
  • Google Lighthouse: Assesses performance, search engine optimization, accessibility, and mobile optimization, and provides actionable insights to enhance site performance.
  • Accessibility testing: GSA has acquired and adopted a standard accessibility testing tool to help our web teams assess conformance with Section 508 standards. We scan for accessibility issues, focusing on critical metrics such as keyboard accessibility and alt text for images.
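For a sense of what an automated check looks like, the following sketch fetches a page, looks for the required links, and checks for a USWDS marker class. It assumes the third-party requests and beautifulsoup4 packages, and it is not GSA’s Site Scanner or Chrome extension; the link phrases and the "usa-" class heuristic are simplified for illustration.

```python
# A minimal sketch of a custom crawler check. Requires "requests" and
# "beautifulsoup4" (pip install requests beautifulsoup4). Illustrative only;
# this is not GSA's Site Scanner, and the checks below are simplified.

import requests
from bs4 import BeautifulSoup

REQUIRED_LINK_PHRASES = ["foia", "accessibility", "privacy"]

def check_site(url: str) -> dict:
    """Fetch a page and report required-link presence and a USWDS hint."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    # Collect the visible text of every link on the page, lowercased.
    link_texts = [a.get_text(" ", strip=True).lower() for a in soup.find_all("a")]

    results = {
        phrase: any(phrase in text for text in link_texts)
        for phrase in REQUIRED_LINK_PHRASES
    }
    # USWDS components use class names prefixed with "usa-" (e.g., usa-banner).
    results["uswds_detected"] = "usa-" in response.text
    return results

if __name__ == "__main__":
    for check, passed in check_site("https://www.gsa.gov").items():
        print(f"{check}: {'PASS' if passed else 'FAIL'}")
```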

Tightening up digital experience

Sharing the data generated by these tools, along with appropriate context to help our teams understand the meaning of the data, led to significant improvements in website management and user experience.

In the first year of the project, we implemented a digital registry that streamlined the data collection process. Within one month, 100% of our teams had participated and registered their sites, a marked improvement over the previous year’s 70% participation rate.

In the second year, we identified a responsible manager for 100% of our websites and partnered with GSA’s Office of Human Resources to develop a 3-part training series to orient people to the expectations and requirements of this role. This effort was based largely on what we learned through our qualitative analysis and has proven transformative for our agency. The management chain for each digital property is now visible at the enterprise level, and each website manager has a strong foundation in digital property management, in alignment with federal requirements and agency best practices. This project also inspired us to contribute to the design and launch of a new enterprise tool to display critical information about each GSA website, including linking each website to common service categories or themes. This helps our web teams gain a holistic perspective of their website and its role in the broader digital experience we offer GSA customers.

In year three, we piloted annual website self-assessments via our Digital Lifecycle Program, which provides a framework for website management at GSA. The program offers implementation guidance for complying with more than 100 federal requirements and serves as a roadmap for launching and managing websites and assessing how we invest in our digital portfolio.

Over the course of this work, we discovered teams that hadn’t assessed the impact of their website on business operations and customer experience, and we identified several outdated products. These teams needed guidance on how to decommission websites that no longer served the public or supported agency mission. In 2024, we developed and published a decommissioning guide.

Through the Digital Lifecycle Program, our agency has reduced our digital portfolio by over 35%, resulting in huge cost savings for the agency, and a better digital experience for our customers. This demonstrates that our collaborative approach to advancing information technology strategies and continually improving our customers’ digital experience is working.

Embracing digital transformation

As GSA leadership continues to refine our agency vision for customer-centric digital service delivery, we can all demonstrate commitment through actions and resource allocation. Integrating both qualitative and quantitative data ensures our transformation process is tightly aligned with our customers’ needs, and positions GSA as a leader in customer centricity and digital excellence.

What can I do next?

Review an introduction to analytics to learn how metrics and data can improve understanding of how people use your website.

You can also join the Digital.gov Web Analytics Community of Practice to connect with government web practitioners who share strategies and make better, data-informed decisions using web analytics and other optimization techniques.

If you work at a U.S. federal government agency, and would like to learn more about this work, reach out to GSA’s Service Design team at [email protected].

Disclaimer: All references to specific brands, products, and/or companies are used only for illustrative purposes and do not imply endorsement by the U.S. federal government or any federal government agency.