
Usability research findings for Q3 2023

Anne Petersen (they/them) edited this page Nov 15, 2023 · 5 revisions

Summary

In August through September of 2023, we performed usability testing with five participants with visual impairments to assess the usability and accessibility of several U.S. Web Design System (USWDS) components. Participants used a mix of assistive technology, including screen readers and screen magnification software. We found that users experienced the most friction with these components:

  • combo box
  • date picker
  • input mask
  • validation
  • buttons styled as links

Other components performed well, with some, such as character count and file input, only needing minor improvements. Components with the best performance were step indicator and accordion. After discussing the findings, the USWDS team has created issues to track necessary improvements. We will test the improved components again within the next few sprints.

Background

The USWDS team aims to set up a process to regularly test its components for accessibility and UX with people with disabilities. This test was designed to be the first of ongoing component testing. We are building on User Acceptance Testing (UAT) conducted by the Inclusive Interactions Team in 2022 and 2023. We tested the same components, some of which were updated since the last round of testing.

Goals

  1. Assess how USWDS components perform for people with visual impairments so that improvements can be made to enhance accessibility and the user experience.
  2. Test drive the process of conducting usability testing with people with disabilities in order to learn, build, and refine the process for future testing.

Method

  • Moderated usability testing sessions of 1 hour and 20 minutes, conducted via video conference.
  • Semi-structured interviews and a think-aloud protocol.
  • Participants were given a fictional scenario and asked to interact with a semi-functional prototype. We observed and took note of user pain points and other noteworthy interactions.

Participants

Five people with vision impairments participated: three were blind, and two were legally blind with some vision.

Assistive technology and tech setups used in sessions:

Three participants used only a screen reader:

  • VoiceOver on MacBook Air, Safari browser
  • NVDA on PC laptop with Windows 10 or 11, Edge browser
  • JAWS 2023, mini PC, typically uses Edge browser

One participant used only screen magnification:

  • Fusion, PC, Chrome browser

One participant used a combination of screen reader and screen magnification:

  • Fusion (screen magnification & JAWS), Dell desktop PC with Windows 11, Chrome browser

Experience levels with using assistive technology (AT) ranged from advanced beginner (two years using AT) to expert (20 years using AT).

General accessibility insights

Most common challenges they face with websites or apps

  • Items that are not clearly marked or labeled. All five users said this. It’s important to organize the page well with proper headings, so users can easily parse the information. One user said, “I feel sometimes like instead of being given a nice organized book, you're just given like this pile of notes and it's just like one by one.”

“If somebody doesn't have headings designated in the website and you're having to tab or arrow through each tiny little thing that's on the page, that can be incredibly time consuming, especially if you're just kind of trying to do a quick look up”

“Wikipedia [is] a good example of how they organize the page. So I can see the, like their tree, so to speak of, of outline of how the content below is laid out.”

  • Popups and advertising that you can’t skip. Two users said this.
  • Calendar pickers. “...why don't you let me just type in the date or use a dropdown menu or something? Because a lot of those times those calendar pickers don't work.”
  • No alt text for images.
  • Interactive maps with a pin instead of an address are unhelpful and unusable.

Advice for making online forms easier to use

  • Keep it simple and use clear labels.

“Just keeping it as simple as possible as far as forms and clearly indicating if it's a required field or not is helpful”

“If those are not labeled correctly, then the person doesn't actually know what type of control they're interacting with. So the screen reader will pick up that something is there but then won't tell you what it is or how you interact with it. So then you're doing a lot of fumbling around trying to figure out how to make it work.”

  • Provide clear ways to navigate multi-page forms, like Previous and Next buttons.
  • Provide helpful error messages upon form submission. For instance, if there’s a missing required field but the error message doesn’t indicate where, the user has to hunt down the problem.

Component testing results

High level findings

| Component | Major friction experienced | Minor friction experienced | Good performance | Needs more research |
| --- | --- | --- | --- | --- |
| Accordion | | | X | |
| Character count | | X | | |
| Combo box | X | | | |
| Date picker | X | | | |
| File input | | X | | |
| Input mask | X | | | |
| Step indicator | | | X | |
| Validation | X | | | |
| Sign-in link styled as a button | X | | | X |
| Banner | | | | X |

Components that caused the most friction

Input mask

  • When participants entered incorrect characters, the component did not offer proper feedback, so they were not made aware that the characters were unacceptable. Four participants commented about this issue.
  • Users couldn’t tell whether the characters they typed appeared in the field or were automatically deleted by the mask.
  • One user said it was not apparent to her which characters were acceptable to begin with.
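The friction above comes from the mask dropping characters silently. As a hypothetical sketch (the pattern syntax and the `applyMask` name are illustration, not USWDS code), a mask routine could collect the characters it rejects and return them alongside the masked value, so the UI can announce them, for example through an aria-live region:

```javascript
// Hypothetical sketch, not USWDS code: apply a mask pattern and report
// which typed characters were dropped, so the UI can announce them.
// Pattern characters: "A" accepts a letter, "#" accepts a digit,
// anything else is a literal the mask inserts automatically.
function applyMask(input, pattern) {
  let out = '';
  const rejected = [];
  let i = 0; // position in the typed input
  for (const p of pattern) {
    if (p === 'A' || p === '#') {
      const ok = p === 'A' ? /[a-zA-Z]/ : /[0-9]/;
      // skip typed characters that don't fit this slot, remembering them
      while (i < input.length && !ok.test(input[i])) {
        rejected.push(input[i]);
        i++;
      }
      if (i >= input.length) break;
      out += input[i++];
    } else {
      if (i >= input.length) break;
      out += p; // literal (e.g., "-") inserted by the mask itself
    }
  }
  // any leftover typed characters don't fit the mask at all
  while (i < input.length) rejected.push(input[i++]);
  return { value: out, rejected };
}
```

Returning the rejected characters instead of discarding them gives the component something concrete to announce (for example, “a was removed; only numbers are allowed”).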

Date picker

  • Keyboard controls did not work as expected. All three screen reader users we tested with had difficulty using the date picker. They could navigate it only with the up and down arrows, not with the left arrow, right arrow, Page Up, or Page Down.
  • Preferences for entering dates varied: two participants prefer to type dates in, and one likes memorable date dropdowns.
  • One person was not able to tab to the date picker calendar icon and had to use a mouse.
  • One participant entered an incorrect date format but received no feedback (e.g., entering 10/1/2023 instead of 10/01/2023).
  • Date picker calendar icon disappeared for one person who was using a special dark mode and high-contrast extensions for her setup.
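The recommendations later in this report suggest entering slashes automatically. As a hypothetical sketch of that idea (assuming a US MM/DD/YYYY format; `formatDateInput` is an assumed name, not USWDS code), the field’s display value can be derived from the typed digits alone, with slashes inserted for the user:

```javascript
// Hypothetical sketch, not USWDS code: insert slashes into a typed date
// so "10012023" displays as "10/01/2023". Assumes MM/DD/YYYY and strips
// any slashes or other non-digits the user types themselves.
function formatDateInput(raw) {
  const digits = raw.replace(/\D/g, '').slice(0, 8); // keep at most MMDDYYYY
  const mm = digits.slice(0, 2);
  const dd = digits.slice(2, 4);
  const yyyy = digits.slice(4, 8);
  return [mm, dd, yyyy].filter(Boolean).join('/');
}
```

Note that auto-slashing alone does not catch the 10/1/2023 case above: the single-digit month or day still parses wrong, so the explicit error message for incorrect formats remains necessary.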

Combo box

Four out of five participants experienced friction:

  • Four users expected to be able to use first-letter navigation (that “T” would take them to states starting with “T”). One person said it acted as expected (searched using the full word “Texas”), and this participant used the word “combobox” and seemed familiar with combo box conventions.
  • One person didn’t understand that the “X” on the right side of the box clears the results. She said she had only seen the “X” in combo boxes that allow multiple selections, so she thought the component had a deselect function rather than a clear function.
  • Positive: One person liked that when you enter something wrong, the combo box gives “No results found” feedback. She often sees combo boxes that give no feedback at all.

Past evidence about combo box bolsters these findings:

  • Combo box problems documented in this combobox bug report. USWDS may consider recommending the “Select” component for state dropdowns instead of a combo box.
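Four participants expected first-letter navigation. As a hypothetical sketch of that convention (`nextMatch` is an assumed name; real listbox code would also manage focus and a multi-character type-ahead buffer), pressing a letter jumps to the next option starting with it, wrapping around the list:

```javascript
// Hypothetical sketch, not USWDS code: first-letter navigation for a
// listbox. Pressing "T" jumps to the first option starting with "T";
// pressing it again cycles to the next match, wrapping around.
function nextMatch(options, letter, currentIndex) {
  const l = letter.toLowerCase();
  const n = options.length;
  for (let step = 1; step <= n; step++) {
    const i = (currentIndex + step) % n; // search forward with wraparound
    if (options[i].toLowerCase().startsWith(l)) return i;
  }
  return currentIndex; // no option starts with that letter: stay put
}
```

This matches the behavior of a native HTML select, which is likely where participants’ expectation comes from.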

Validation

  • Four participants were confused about the purpose of this validation message. In previous testing, two others said the same thing. Users expect an error message (in-line validation or error upon hitting the Submit button) about invalid input instead.
  • No one noticed the state changing (the checkmark) once they entered the correct email.

Components that caused minor friction

File input

Overall, this component is usable. Four participants who tested it clearly understood it was an area where they needed to upload a file. They could all open the dialog box by pressing the spacebar to select a file.

Issues:

  • One person commented that some sort of instructions might be nice. Instead of or in addition to “No file chosen,” guide the user on how to choose a file.
  • One person could open the dialog box but was unable to navigate files to select one. He kept getting lost and accidentally closed the prototype. He said he wasn’t experienced in uploading a file.
  • One person thought it might make more sense for “Drag file here” and “Choose from folder” to be two separate elements. She thought it was a little confusing that they were combined into one button.

Character count

Overall, three participants gave very positive feedback. They liked how the component offered delayed feedback and let you know when you’ve gone over the character limit.

“Oh, it worked really well. It was, it was giving me updates. It wasn't like being overzealous with it and trying to tell me how many characters there were every single time I typed a character like it was waiting until it was done. I think that works pretty well.”

  • That said, a majority of participants (three) prefer a hard cutoff: they want the system to stop accepting characters once the limit is reached. They said it’s annoying to type a lot of text into a box and only learn you’ve gone over the limit when you stop typing, which is especially frustrating for fast typists. Having to go back and find what to cut is a pain, so they’d rather be prevented from exceeding the limit, with some audible feedback. They still want feedback on how many characters are left.
  • The screen magnification user needs more visual affordance when she’s gone over the limit. She expected the component to outline the box in red or stop her from typing when she exceeded the limit. She didn’t notice when she went over the limit and didn’t notice the little white hint box saying she went over the cutoff.
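A hypothetical sketch of the hard cutoff participants asked for (`enforceLimit` is an assumed name, not USWDS code): truncate input at the limit while still exposing the remaining-character count and a flag the UI can use to trigger an audible or visual cue, such as the red outline mentioned above:

```javascript
// Hypothetical sketch, not USWDS code: enforce a hard character cutoff
// while still reporting how many characters remain.
function enforceLimit(text, limit) {
  const value = text.slice(0, limit); // stop accepting input past the limit
  return {
    value,
    remaining: limit - value.length,
    atLimit: text.length >= limit, // hook for an audible/visual cue
  };
}
```

A plain HTML maxlength attribute gives a similar hard stop, but it truncates silently; the point of the sketch is that the cutoff should come with the `remaining` count and `atLimit` signal so feedback is still possible.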

Components that performed well

Accordion

Though the accordion was not well placed in the prototype itself, most participants understood that it was a collapsed element they could interact with to see more information.

Step indicator

Performed well for four out of five participants. They felt it oriented them well to what step they were at in the form. They also felt they could anticipate how much of the form was left.

Components where more research is needed

Links styled as buttons

In the prototype, we had a “Sign in” button on the front page, but it was actually coded as a link and only styled as a button. This setup was not intentional and goes against USWDS’s current guidance for buttons. Nevertheless, the response from users was strong enough that we want to look into it more.

All participants who used a screen reader had trouble finding and interacting with the sign-in link. People expect “Submit” or “Sign in” to be an actual button, so that is what they looked for.

Screen reader users often use keyboard shortcuts to navigate by element type (like “b” to cycle through buttons). If a button is coded as a link, screen reader users might miss it. One person pointed out that sites sometimes have hundreds of links, so a sign-in control coded as a link would be hard to find and easy to lose track of.

Based on these results, our team continues to study and discuss links styled as buttons.

Banner

Two users were confused by the banner component.

  • One user mistook it for a header. She said, “I was thinking then that might be more menu options. Because usually any buttons like that that are collapsed at the top of the page like that are usually menu, navigation things.”
  • Another user (who has some vision) commented that it was another example of something labeled as a button that wasn’t a button. To this user, the banner looked more like a dropdown menu, combo box, or link.

Recommendations

Prioritize making changes to components where users experienced the most friction and test again with a new set of participants.

Prioritize improvements for:

  • “Sign in” link styled as a button
    • Study use cases for links styled as buttons, and adjust our guidance accordingly.
    • Consider testing again with the link coded with role="button" to see if the user experience improves.
  • Input mask
    • Provide audible and visible feedback when an incorrect format is typed.
    • Continue work on #5227.
  • Date picker
    • Fix keyboard controls within the date picker.
    • Provide a helpful error message when an incorrect format is entered.
    • Automatically enter slashes so the user does not have to type them.
    • Explicitly state in our guidance that the component should always allow the date to be typed manually.
  • Character count
    • Consider implementing a hard cutoff (i.e., do not allow users to keep typing) when users go over the character limit. More research may be needed to determine the best approach.
    • Implement a salient visual cue, such as a red outline on the text box, when users go over the limit.
  • Combo box
    • Either change the combo box to meet user expectations around first-letter navigation, or discuss whether state and country dropdowns should use the “Select” component instead of the combo box. If necessary, update the guidance that recommends against using a select for more than 15 options.
  • Validation
    • Reevaluate the proper use cases and implementations for validation.
    • Guidance should emphasize giving helpful feedback (e.g., proper error states) to users at the point of need. For example, provide inline validation and error messages upon form submission.
    • We need more error and validation guidance in general (as stated in this presentation).

Next steps

We will convert all of these recommendations into GitHub issues so that we can begin to make improvements to components. Our aim is to make what improvements we can in the next couple of sprints, and then test again with a different group of participants to see how the components perform.

