Design Validation Through Data:
Implementing a SUS Survey at HRSA

At the Health Resources and Services Administration (HRSA), I used the System Usability Scale (SUS) to evaluate redesigned grant management pages. SUS is a simple, standardized questionnaire that measures usability in terms of ease of use, complexity, confidence, and learnability, producing a score from 0 to 100 that can be benchmarked against industry standards.

This is an ideal tool to:

  • Quantify usability without requiring extensive user testing

  • Ensure objective, consistent data collection

  • Complement qualitative insights from user feedback

Overview

Project: Health Resources and Services Administration (HRSA) – Grant Submissions
Timeline: May – June 2025
Team: UX Designer (Hannah), Business Analyst, Project Manager, Federal Stakeholders

To evaluate the usability of three newly redesigned pages—Submissions, Program Links, and Linked Grants—I implemented a System Usability Scale (SUS) survey. This evidence-based tool helped us quantify the impact of our UX work, establish a usability baseline, and guide future iterations with real user input.

Challenge

After releasing redesigned versions of key HRSA grant management pages, our team needed a way to validate whether these updates improved the user experience. Traditional usability testing wasn’t feasible due to time and access constraints, so we turned to SUS as a lightweight, scalable solution for gathering consistent, data-driven insights from end users.

Project Goals

  • Validate whether design changes improved usability

  • Establish a usability benchmark for future design iterations

  • Strengthen stakeholder trust in UX methods through measurable results

  • Identify areas for improvement through both quantitative and qualitative feedback

My Role

As the Senior UX Designer, I led every phase of the survey initiative:

  • Designed and customized the SUS questionnaire

  • Aligned questions with core user flows and tasks

  • Built the survey in Microsoft Forms and created a scoring spreadsheet in Excel

  • Synthesized the data and presented key takeaways in a stakeholder-ready report

Discovery & Survey Strategy

The SUS survey was tailored to assess three core user flows:

  • Submissions: Reviewing and managing federal grant submissions

  • Program Links: Connecting grants to specific program types

  • Linked Grants: Managing related or exception grants

We selected SUS for its standardized format and fast implementation. It offered a reliable way to measure usability in terms of ease of use, complexity, confidence, and learnability—with a simple 10-question format and a 5-point Likert scale.

Each question was intentionally aligned to specific tasks across the three pages. Alternating positive and negative statements helped mitigate response bias and ensured a balanced view of the user experience.

Survey Questions with Purpose Statements 

  1. I think that I would like to use the submissions page frequently.

    • Purpose: Assesses overall user satisfaction and the perceived usefulness of the submissions page.

  2. I found the submissions page unnecessarily complex.

    • Purpose: Identifies perceived complexity and usability issues in the submissions page’s layout or functions.

  3. I found it easy to add a program link.

    • Purpose: Evaluates the intuitiveness and ease of the “link” action on the program links page.

  4. I think I would need technical support to add program links.

    • Purpose: Measures users' perceived need for assistance when interacting with the program links page, indicating difficulty or confusion.

  5. I found the functions across the submissions, program links, and linked grants pages to be well integrated.

    • Purpose: Evaluates how well users perceive consistency and cohesion between the three pages.

  6. I thought there was too much inconsistency across the submissions, program links, and linked grants pages.

    • Purpose: Highlights design or interaction inconsistencies that might disrupt the user experience across the different pages.

  7. I think most people would learn to use the linked grants page very quickly.

    • Purpose: Gauges the learnability of the linked grants page, especially for new or infrequent users.

  8. I found the linked grants page difficult to use.

    • Purpose: Captures any usability or navigational challenges specific to the linked grants page.

  9. I felt very confident adding an exception grant.

    • Purpose: Evaluates user confidence and task success when performing the add exception grant action.

  10. I needed to learn many things before I could easily add exception grants.

    • Purpose: Measures task complexity and learning curve for the add exception grant process.

  11. Please share any additional feedback about your experience with the submissions, program links, or linked grants pages. Any suggestions for improvement or other comments are welcome!

    • Purpose: Captures open-ended comments that add qualitative context to the quantitative scores and surface specific areas for improvement.

Scoring & Analysis

The SUS score is calculated on a 0–100 scale:

  • Positive questions are scored as (response − 1)

  • Negative questions are scored as (5 − response)

  • The total is multiplied by 2.5 for a final score

This scoring method provides a benchmark for usability and enables easy comparison across future design phases.

SUS Score and Usability Rating

  • 85–100: Excellent

  • 70–84: Good

  • 50–69: OK

  • 25–49: Poor

  • 0–24: Awful
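The scoring rules and rating bands above can be sketched in code. This is a minimal illustration, assuming the standard SUS convention that odd-numbered questions are positively worded and even-numbered questions are negatively worded (which matches the questionnaire in this case study); the function and variable names are my own, not part of the actual Excel scoring spreadsheet.

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:
            total += r - 1   # positive statement: (response - 1)
        else:
            total += 5 - r   # negative statement: (5 - response)
    return total * 2.5       # scale the 0-40 sum to 0-100

def sus_rating(score):
    """Map a SUS score to the rating bands listed above."""
    if score >= 85:
        return "Excellent"
    if score >= 70:
        return "Good"
    if score >= 50:
        return "OK"
    if score >= 25:
        return "Poor"
    return "Awful"

# A respondent who strongly agrees (5) with every positive statement
# and strongly disagrees (1) with every negative one scores 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
print(sus_rating(72.5))                            # Good
```

In practice, each respondent's score is computed this way and the scores are averaged to produce the benchmark figure.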

Impact

When implemented, this SUS survey will help the HRSA team:

  • Quantitatively validate design changes

  • Prioritize enhancements based on user feedback

  • Establish a repeatable process for usability measurement

  • Align product decisions with user needs and expectations

  • Advocate for continued investment in UX improvements

Current State

Although the project is currently paused due to government restrictions, the survey has been developed and is ready to launch once development resumes. It will serve as a key step in validating and refining future iterations of HRSA’s grant management platform.
