Recap: Component workflow session

Meeting recap: Component workflow session (11-14-2017)

Summary

On Tuesday, November 14, 2017, the US Web Design Standards Core team met for an hour to talk about why and how we add new components to the Standards. While we had some existing guidelines (https://github.com/18F/web-design-standards/wiki/Component-prioritization) and came to the table with good additional research (https://github.com/18F/web-design-standards/wiki/Techniques-for-evaluating-and-developing-components), we quickly came to understand that we had something of a philosophical impasse over the role the Standards and the Core team play in the development of components. That is: given the choice, do we spend our Core resources building components (more like a conventional design system), or building the tools and guidance necessary for others to build components (more like Material Design)? This question is important, but it was beyond the scope of the meeting. We understand that it is a key component of our mission, identity, and vision, and we ended the meeting asking ourselves what questions we want to ask of ourselves and our users to help us find a bigger answer.

Attendees

  • Maya Benari
  • John Donmoyer
  • Andrea Sigritz
  • Dan Williams, product owner

What we did

  • Assembled a list of a few common decision frameworks for evaluating a) why we should develop a component; and b) how to develop that component once we’ve identified it as important.
  • Spent a pre-meeting comment period responding to the list in a Google doc.
  • Reviewed the techniques and agreed on the distinction between “why” and “how.”
  • Looked to the example of Sites and considered the value of flexibility and customization.
  • Discussed focus and value: since time is a limited resource, how do we spend it where it matters most?
  • Decided that focus and value are evaluated through the lens of vision.
  • Brainstormed a list of questions we want to ask ourselves and users of the Standards that might help us determine our present value and opportunities to grow that value.

What we learned

  1. Components are commitments with costs. Developing and maintaining components takes time, and each new component adds incremental cost to the system. How do we balance the cost against the value of the component?
  2. Value is determined by context. The question is: valuable to whom? And in what context? If we want to determine what’s valuable to the Standards, we need to know more about what makes us valuable to our users, then optimize around that value. It’s the transitive property, but it’s also a loop. It is an ouroboros of value.
  3. The opportunity for change is a blocker, and that’s serious. Knowing that we have the opportunity to change the vision for the Standards is preventing us from making decisions now, since we’re worried that decisions we make now could be invalidated by future decision making. But vision work takes time, so the fact that this can be a blocker to present work is a serious concern. How can we make necessary lightweight decisions in a context of uncertainty? How can we quickly define what we know to be certain?
  4. We need to understand our core value and vision. Defining any value-oriented process (and which of our processes isn’t value-oriented?) depends on understanding the product vision and value. And, as we saw above, uncertainty about whether we’ll need or want to adjust any aspect of our core vision makes it hard to commit now to decisions that will have downstream effects.
  5. We need to understand who we are now and where our users are now. It’s increasingly clear that we need to know who we are now in order to make decisions now and to lay the groundwork for any future changes. We need to understand our current position, value, vision, and success in the context of what our users need and expect before we can make any kind of rational change.

Potential questions for the team and our users (raw)

We brainstormed the following list of questions we'd like to answer. (Unedited)

  • Existing user: What components do you want, that aren’t available?
  • Existing user: What type of components do you want more of?
  • Existing user: Are you doing a redesign/refresh of your site(s) in 2018?
  • Existing user: How can we better ‘harvest’ your contributions?
  • Existing user: What type of guidance is useful for you around using the Standards?
  • Existing user: Do you go back and update once new versions are out?
  • Non user: Why haven’t you adopted the Standards?
  • Non user: What do you need to adopt components of the Standards?
  • How/when do you decide when to customize an existing component?
  • When you chose to build your own version of something that already exists, why did the team come to that decision?
  • What factors influence the look and feel of the components you are using in your site/application?
  • When you choose to customize an existing component from the standards, what are you changing?
  • How do you change something in the standards (style overrides, copy/paste code, manual edits etc)?
  • What do you expect out of the components that the Standards are already offering? What do you expect to be there, how feature-rich do you expect them to be? How many variants of something should there be?
  • What kind of guidance do you look for when you’re modifying an existing component in the standards? Are you worried about “breaking” things? How would you know if something were broken?
  • What do we need to deliver as a statement of our vision?
  • What research do we need to support any vision statement?
  • Who do we need to talk to?
  • What do we need to ask these folks?
  • What does ongoing research look like?
  • What can we measure and measure against?
  • What does the process of creating and “unveiling” this vision look like?
  • What is the final deliverable/process?
  • How do we set ourselves up for success, both in developing a vision and communicating it to community and stakeholders?
  • What do we need research to know?
  • Is there anything we don’t need research to know?
  • What problem is the Standards solving for you?
  • How can the Standards better solve that problem?
  • What problems do you wish the Standards could solve?
  • Did the Standards make your life easier? How?
  • Did the Standards make your product better? How?
  • Did the Standards save you time or money?
  • What do end users (i.e. not developers) value about the Standards?
  • How can the Standards act as a valuable proxy for baseline user needs?
  • How can the Standards continue to improve the lives of the people who use them?
  • What is the role of the Standards?
    • Product owner notes:
      • End-user advocate; something of a proxy for baseline UX
      • We’ve baked our research and best practices into the building blocks and guidance we provide
      • Thus, we have a strong need for research-backed solutions
      • Accessibility is a killer feature
      • Responsiveness is a killer feature
      • Human centered design advocate
      • Development and design facilitator
  • How did you build new components for the Standards when there was no existing component?
  • What techniques/frameworks have you developed outside the Standards?
  • Would you want to submit your new component back to the Standards for official inclusion?
  • How did you override the standards?
  • Do you plan to upgrade to a new version of the Standards in the future on this site/project?
  • What build guidance did you use from the Standards? What was most helpful?
  • How did using the Standards affect your ability to pitch your project?
  • How can the Standards help make the case for doing valuable work?
  • How can the Standards help that work actually happen?
  • Do you have experience building with utility classes?
  • How important is customization to your agency/client?
  • What is the killer feature of the Standards?
  • What do you love about the Standards?
  • What is our vision?
  • What does success look like (in 5 years, 10 years)? What effect do we want to have?
  • Should we spend time researching other component libraries?
  • Where are we in terms of composability, variation, and robustness of our current components?
  • Metrics from current users: how much time has the Standards saved you? Would you use it again and recommend it to others?
  • Who are our primary users?
  • How do we prioritize needs from different agencies?

Product owner notes on a component workflow

  • I’d like to flesh out more of what UX research means, and I’d like to think about what this means in the service of a) lowering the barrier to prototype; and b) raising (or perhaps hardening, making more rigorous) the barrier for official inclusion.
  • What research do we need to begin design work? How do we document this research?
  • In that vein, how can this component workflow function as an inclusion path for new work — potentially new work not done by the Core team?
  • I wonder if some of the design/build/test work could be done via a prototype/beta program. That is, we could release some new components for testing via a beta/prototype path (proposed → beta → supported, or something simpler than the previous maturity scale) where they’re exposed to system users and we have a chance to collect feedback, potentially before they’re even named (via a utility class prototype path, something like what Tailwind describes here; see the sketch after this list).
  • How might the component prototype/beta program overlap with continuous lifecycle research — specifically, how do we research and test new components as part of a research model that covers before/during/after phases of USWDS development?
  • A strong focus of any new component work is accessibility coherence with the rest of the system. How can we think in terms of the Minimum Accessible Component (MAC), and how might that drive our system architecture, our theming, and our build guidance? I think building with the MAC is the heart of the system, the core around which we iterate, optimize, and refine.
  • Earlier work I’ve done with design principles led me to an evaluative framework I call CARED (outlined below; as in “Have we CARED?”), based to some degree on Spotify’s TUNE framework described in Design Doesn’t Scale. Could this have relevance to this discussion? We talk about building and testing, but less about evaluating in connection with those stages, and that deserves to be significantly more explicit. CARED is an acronym that stands for:
    • Clear
    • Accessible
    • Resilient
    • Efficient
    • Direct
  • How is all this work tied to our presentation of the component on our site? How much of this work can/should we capture, document, and present on the site? How much can/should a user be able to see the process that went into developing the component? (For instance, if this were a blog, each component would be a tag — and we’d document our process as posts, tagging by components. Then, each component could link to a “tag page” that’d be a record of development work done on that piece.)
  • A lot of what we’re thinking about here bears on the question of how much the USWDS is a design system (like Polaris: a focus on remixing existing parts with little customization; new components developed in house and deployed) and how much it’s a framework (like Bootstrap: a focus on building, with more customization expected; new components often developed in the field based on guidance) plus a collection (a gallery?) of components that may or may not be officially supported by the system, like beta or proposed community components. I lean to the side of Framework+: we should focus on providing a flexible system for building; clear development guidance centered on accessibility and supported/described by a palette of utility-based building blocks; a MAC kit of supported patterns that are built to be customized; a public pathway for contributing new components; and a gallery of components, both officially supported and not-yet-officially supported.
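
To make the utility class prototype path a bit more concrete, here is a minimal sketch of what it could look like in markup. It is purely illustrative: the utility class names are hypothetical, the supported example only loosely follows the existing alert component’s markup, and the promotion step itself is the proposal here, not something the Standards do today.

    <!-- Prototype phase: the pattern circulates as plain markup composed
         from (hypothetical) utility classes, with no official name yet -->
    <div class="border-1px padding-2 bg-warning-light" role="alert">
      <h3 class="margin-top-0">Prototype alert</h3>
      <p class="margin-bottom-0">Exposed to system users for feedback before naming.</p>
    </div>

    <!-- Supported phase: once the pattern stabilizes, it is promoted to a
         named, versioned component class and documented on the site -->
    <div class="usa-alert usa-alert-warning" role="alert">
      <div class="usa-alert-body">
        <h3 class="usa-alert-heading">Supported alert</h3>
        <p class="usa-alert-text">Markup and styles are now owned and documented by the Standards.</p>
      </div>
    </div>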

Next steps

  • Launch a research project around the current state of the USWDS. (#2244)
  • Collect existing research to identify existing core value and needs. (#2245)
  • Produce a report on the state of the Standards and suggestions for the future by the end of this Core increment. (#2246)