Techniques for evaluating and developing components

Dan O. Williams edited this page Nov 16, 2017 · 2 revisions

1. Ways to rate/evaluate new components

Ratings

  • don’t care/don’t need
  • nice to have
  • must have

“I Made This. Does It Go in the System?” by Nathan Curtis

  • Is it relevant to any other product? If so, how many?
  • Is it consistent with the system’s vision?
  • How much will it cost to make and maintain?
  • Does it trigger momentum in a new direction?
  • How deeply can YOU guide its use?
  • Is the timing right for contributor AND system?

Weighted shortest job first (WSJF; part of the Scaled Agile Framework)

  • WSJF = Cost of delay / duration

Cost of delay

  • User-business value – Do our users prefer this over that? What is the revenue impact on our business? Is there a potential penalty or other negative impact if we delay?
  • Time criticality – How does the user/business value decay over time? Is there a fixed deadline? Will they wait for us or move to another solution? Are there Milestones in the critical path impacted by this?
  • Risk reduction-opportunity enablement value – What else does this do for our business? Does it reduce the risk of this or a future delivery? Is there value in the information we will receive? Will this feature open up new business opportunities?
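The WSJF calculation above can be sketched in a few lines. This is a minimal illustration, not part of the original page: the function name, the 1–10 relative scoring scale, and the sample component scores are all hypothetical, and it assumes the standard SAFe convention that cost of delay is the sum of the three factors listed above.

```python
def wsjf(user_business_value, time_criticality, risk_opportunity, duration):
    """Weighted Shortest Job First: cost of delay divided by job duration.

    Cost of delay is the sum of the three SAFe factors; all inputs are
    relative estimates (e.g. on a 1-10 scale), not absolute units.
    """
    cost_of_delay = user_business_value + time_criticality + risk_opportunity
    return cost_of_delay / duration

# Hypothetical scores for two candidate components:
alert_banner = wsjf(user_business_value=8, time_criticality=5,
                    risk_opportunity=3, duration=4)   # (8+5+3)/4 = 4.0
date_picker = wsjf(user_business_value=6, time_criticality=2,
                   risk_opportunity=4, duration=8)    # (6+2+4)/8 = 1.5

# Higher WSJF means build it sooner.
print(alert_banner > date_picker)  # → True
```

Because the inputs are relative estimates, WSJF is only meaningful for comparing items scored on the same scale, not as an absolute measure of value.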

Component value (from Nathan Curtis’ book)

  • scale of reuse
  • similarity
  • specificity
  • significance
  • sustainability
  • sophistication
  • sociability
  • sync

Existing documentation

Contributor guidelines: contributing.md

Component Maturity Scale (deprecated): 🔒 https://docs.google.com/document/d/1Ycvxf5V--hnkWG4-PwZi4vnxokUFUWAmuBsP5ulHFPI/edit

Note: deprecated because the scale was too cumbersome and fell out of use

2. New component build workflow

“Design System Features, Step-by-Step,” by Nathan Curtis

  • Discover
  • Design
  • Build
  • Document
  • Publish

Original USWDS process

  • UX research
  • UX design - wireframes and notation
  • Visual design
  • FE development
  • Documentation (implementation, accessibility, and usability: when to use, when to consider something else, guidance)
  • QA (mobile and browser testing)
  • Validation + user testing

Federal Front Door Trello process

  • Discover
  • Research
  • Design
  • Build
  • Test
  • Release