This week, Deque Systems hosted the inaugural Axe-Con. I wish I could have attended more sessions, but the ones I did attend were fantastic, and I learned a lot. Here are some of my takeaways.

Glenda Sims on Automation and Intelligent Guided Testing

See my livetweets from Glenda's talk.

In her talk, Glenda Sims shared Deque Systems' data (PDF) about accessibility defects uncovered in audits, and talked about how automation can help us surface those defects earlier in the process.

According to the Deque report, of all accessibility defects uncovered by audits, 57.38% were surfaced by automation. That's… way more than I would have thought.

When we use automation to catch accessibility defects, we're usually surfacing issues that are lintable or easy to express as binary, pass/fail checks — things like "Do all images have alt text?" or "Does all text meet color contrast requirements?" Indeed, 30% of the surfaced accessibility defects were color contrast issues.
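Color contrast is a good example of a rule that reduces to a pass/fail formula. As a minimal sketch (not any particular tool's implementation), here's the WCAG 2.x contrast ratio calculation that checks like these are built on:

```typescript
// WCAG 2.x relative luminance: each sRGB channel is linearized, then weighted.
function relativeLuminance(r: number, g: number, b: number): number {
  const linearize = (channel: number): number => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal-sized text.
function passesAA(
  fg: [number, number, number],
  bg: [number, number, number]
): boolean {
  return contrastRatio(fg, bg) >= 4.5;
}
```

Because the answer is a single yes/no per element, it's exactly the kind of defect automation surfaces cheaply.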

Can automation help us surface defects from more complex interactions like modals? Glenda says yes, by working in tandem with developers. Axe DevTools has introduced intelligent guided testing (IGT). The extension walks the developer through some of the more complex flows in their application, and asks whether the given experience is expected and intuitive. Based on the developer's feedback, Axe determines whether to surface a defect.

Glenda expects that, between traditional automated accessibility testing and intelligent guided testing, automation will be able to help surface 70%, maybe 80%, of issues. I'm excited to keep following this space.

Anna E. Cook on Auditing Design Systems for Accessibility

See my livetweets from Anna's talk.

"There's a misconception in design communities that accessibility tends to be mostly developers' responsibility, but developers can't fix design issues when they're design-centric. In fact, a Deque case study from last year found that 67% of accessibility issues originate in design."


Anna E. Cook shared how design systems can contribute to a site's accessibility, as well as their process for auditing those design systems.

I was particularly struck by the relationship between atomic design and accessibility. Design systems can make a site more accessible by establishing accessible color palettes, focus styles, typography, and more. However, atomic approaches can also introduce inaccessibility when they fail to establish guidance for holistic accessibility concerns. For instance, does your design system enforce/encourage proper heading order, or determine accessible form validation and error handling experiences?
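Heading order is the kind of holistic concern a design system could even lint for. As a hypothetical sketch (my own illustration, assuming heading levels have been collected in document order):

```typescript
// Flags skipped heading levels (e.g. an h2 followed directly by an h4),
// which break the outline that screenreader users navigate by.
function findSkippedHeadings(levels: number[]): string[] {
  const problems: string[] = [];
  let previous = 0; // 0 means no heading seen yet
  levels.forEach((level, index) => {
    if (previous > 0 && level > previous + 1) {
      problems.push(`Heading ${index + 1} jumps from h${previous} to h${level}`);
    }
    previous = level;
  });
  return problems;
}
```

A check like this could run against a page template or a composed set of components, catching outline problems no single atomic component can see.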

When auditing a design system, Anna includes the following in their feedback:

  • The component audited
  • The WCAG principle impacted (Perceivable, Operable, Understandable, Robust)
  • The WCAG Success Criterion impacted
  • A description of the defect
  • A recommended fix
  • The impact to end users
  • The audit date
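Anna's checklist maps neatly onto a structured record. As a hypothetical sketch (the type and field names are mine, not Anna's), each finding might look like:

```typescript
// One row of design-system audit feedback, following the checklist above.
// All names here are illustrative, not from Anna's talk.
type WcagPrinciple = "Perceivable" | "Operable" | "Understandable" | "Robust";

interface AuditFinding {
  component: string;         // the component audited
  principle: WcagPrinciple;  // the WCAG principle impacted
  successCriterion: string;  // e.g. "1.4.3 Contrast (Minimum)"
  description: string;       // what the defect is
  recommendedFix: string;    // how to address it
  userImpact: string;        // who is affected, and how
  auditDate: string;         // e.g. "2021-03-15"
}
```

Structuring findings this way makes it easy to sort an audit by component, by principle, or by user impact when triaging fixes.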

The Readability Group on Readable Typefaces

See my livetweets from The Readability Group's talk.

The Readability Group, who recently performed a survey to measure the readability of several fonts, started by setting forth what they consider their three pillars of accessibility, which they use to gauge typefaces:

  1. Emotional accessibility: Is it appealing?
  2. Technical accessibility: Is it built correctly?
  3. Functional accessibility: Does it work?

Oftentimes, when accessibility advocates pore over data and statistics, they're focusing on the technical and functional components of accessibility. However, typefaces depend on delicately balancing all three, and in how they communicate their survey data, they try to take all three factors into account.

A common theme throughout their analysis of their survey data was that the data often did not support the conventional wisdom about what makes typefaces readable. Namely…

  • Symmetrical, mirror-shaped characters (b, d, p, q) are often hailed as a readability aid, but fonts that portray these characters as symmetrical reflections of each other performed within the margin of error of asymmetrical fonts.
  • Fonts hailed as good for dyslexic readers (Open Dyslexic, Dyslexie, Comic Sans) performed very poorly overall.
  • Conventional wisdom holds that sans serif fonts are more readable than serif fonts. While the most readable fonts from the survey were sans serif fonts, by and large, serif fonts performed just fine.

The Readability Group will keep poring over this data to glean more insights, but for now, it seems like the biggest, consistent indicator of readability for the typefaces that were studied was letter spacing.

Sarah Fossheim on Accessible Data Visualization

See my livetweets from Sarah's talk.

Sarah Fossheim shared the accessibility problems that can come from data visualization, and ways we can remedy them. From the get-go, they shared something that hadn't quite clicked for me before: data visualization itself is already a way to make information more accessible.

However, data visualization relies on visual cues (it's in the name!) such as color, contrast, opacity, shapes, groups, and animations to convey the story of that information. This locks out people with visual disabilities. Sarah recommends using tools like Colorable to generate color palettes where the colors each contrast with each other and with the background, but they also caution against making your colors too bright — some people, including many autistic people, are sensitive to very bright colors.

Other visual approaches to ensure colorblind users can understand your data visualization include patterns and icons. Sarah cautioned against overloading users with patterns, since they can clutter your page and, at worst, clash heavily with each other, causing sensory discomfort. Icons can work well — just make sure you provide screenreader alternatives and consider multicultural interpretations of your icons.

Many data visualizations lean on mouse interactions, such as displaying labels and legends on hover. Make sure that anything the user can do with their mouse, they can also do with their keyboard. This will help anyone who uses keyboard navigation, including screenreader users.
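One way to mirror hover affordances on the keyboard is to make each data point focusable and map arrow keys to movement between points. A framework-agnostic sketch of the key-handling logic (the key names are standard `KeyboardEvent.key` values; the function itself is my own illustration, not from Sarah's talk):

```typescript
// Given the currently focused data point's index, return the index that
// should receive focus after a keypress, so keyboard users can step
// through points the way mouse users sweep across them.
function nextFocusIndex(key: string, current: number, pointCount: number): number {
  switch (key) {
    case "ArrowRight":
    case "ArrowDown":
      return Math.min(current + 1, pointCount - 1);
    case "ArrowLeft":
    case "ArrowUp":
      return Math.max(current - 1, 0);
    case "Home":
      return 0; // jump to the first data point
    case "End":
      return pointCount - 1; // jump to the last data point
    default:
      return current; // unrelated key: keep focus where it is
  }
}
```

Moving focus to the new point can then trigger the same label/legend display that hover does, keeping the two input methods equivalent.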

One of the biggest takeaways I had from Sarah's talk was that when building a data visualization, you should consider what the purpose of the dataviz is. What are you hoping the end user will get from the chart? The answer to that question will likely depend on your target audience — an audience of laypeople will likely be looking for something very different than an audience of academic researchers. What does that target audience care about? Whatever that takeaway is, whatever your audience is looking for, you should curate the accessible experience to focus on delivering that bottom line.

One way we can do this is by building redundancy into the visualization. Combine color, text, placement, and any other tricks you've got so that end users can get the information they care about quickly.

Andrew Hayward on Accidental Advocacy

See my livetweets from Andrew's talk.

I thought Andrew Hayward's talk on advocacy and steering your organization's ship towards accessibility was very moving, and I know I'll be sitting on these thoughts for a while and musing on how I can be a more proactive advocate going forward.

Andrew defined advocacy as:

"Support or argument for a person or community, helping them to express their views and needs, and standing up for their rights."

For Andrew, advocacy relies on proactively challenging normativity: the status quo centered on abled cis white men's experiences. To be proactive advocates, we have to:

  • Decenter ourselves and our own experiences
  • Use inclusive language, and consider who we might be excluding with our language
  • Proactively ask how to provide access
  • Consider our (many) audiences, and ask how advocating for one audience might impact others
  • Engage other people in our mission
  • Take our time and recognize our limits

In some cases, we're in a position where we have influence to create top-down change. In these cases, we have to ensure that we foster a culture where advocacy can thrive instead of getting squashed out. This environment requires:

  • The psychological safety and blameless culture to ensure people can safely challenge inaccessibility
  • Framing work as a learning problem, and not an execution problem — we're all constantly learning and improving, and we won't get it right the first try
  • Challenging solution aversion (or, the tendency to reject big problems whose solutions we don't like)

Andrew cautions that change may come slowly, but encourages us that over time, our influence and mission will grow until our advocacy becomes the norm.

Gerard Cohen on ARIA

See my livetweets from Gerard's talk.

Gerard Cohen put on an introduction to ARIA for the uninitiated. I attended because, although I'm pretty familiar with ARIA by now and have written about it before, I'm super interested in new ways to introduce it to beginners.

Gerard believes that ARIA is often excessively vilified — see, for instance, all the times that the First Rule of ARIA, "Don't use ARIA if you can use HTML instead," is unceremoniously flattened to the quippier, less-nuanced "Don't use ARIA."

That said, Gerard believes ARIA has been so troublesome for web developers for two reasons:

  1. Developers don't take the time to learn it.
  2. Support can be inconsistent across browsers and assistive technology.

Gerard spent some time walking through the ARIA specs, demonstrating how the ARIA we write can impact a screenreader user's experience.

At one point, he turned his attention to the ARIA Authoring Practices. This document by the World Wide Web Consortium provides tons of code snippets for implementing accessible widgets using ARIA attributes. He noted a few interesting caveats about the ARIA Authoring Practices that I hadn't seen before, so I wanted to call those out here:

  • It assumes perfect browser/assistive technology support for ARIA
  • It's not designed for mobile/touch support
  • As a testbed for ARIA practices, it may use superfluous ARIA where semantic markup would have sufficed

I'll take these caveats to heart the next time I'm introducing someone to ARIA.

Kyle Boss on Accessibility and the Jamstack

See my livetweets from Kyle's talk.

Kyle Boss talked about how Jamstack developers, particularly those developing single-page apps built with component frameworks (so, for instance, sites built with Gatsby or Next), can build accessibility into their sites.

For instance, many single-page apps optimize routing by prefetching route contents and assets when the user hovers near a link — that way, when they click, everything the route needs to render is already right there. However, because single-page applications rerender on route changes rather than triggering a hard page load, screenreaders don't give the user any feedback when they click a link. As far as the user is concerned, they clicked the link and nothing happened.

Kyle showed how we could use ARIA to create a RouteAnnouncer component — a live region that announces the new page's title. His implementation was inspired by the accessibility work from the Jamstack community, particularly the work that Marcy Sutton and Madalyn Parker put in at Gatsby.
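A framework-agnostic sketch of the idea (my own simplification, not Kyle's or Gatsby's actual implementation) might look like this — a visually hidden element marked as a polite live region, whose text we update after each route change:

```typescript
// Minimal shape of the DOM element the announcer needs.
interface LiveRegion {
  setAttribute(name: string, value: string): void;
  textContent: string | null;
}

class RouteAnnouncer {
  constructor(private region: LiveRegion) {
    // aria-live="polite" queues announcements without interrupting
    // whatever the screenreader is currently saying.
    region.setAttribute("aria-live", "polite");
    region.setAttribute("role", "status");
  }

  // Call this after a client-side route change completes.
  announce(pageTitle: string): void {
    // Changing the text content of a live region prompts screenreaders
    // to read it, giving the feedback a hard page load would have.
    this.region.textContent = `Navigated to ${pageTitle}`;
  }
}
```

In a real app, the region would also be styled as visually hidden (off-screen but not `display: none`) so it's available to assistive technology without appearing in the layout.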

He also showed off how the component model, particularly the higher-order component pattern, could be used to create components that require accessibility practices or ensure sensible accessibility defaults. In this particular case, Kyle created an Image higher-order component that wrapped a given static-site generator's optimized image component, requiring developer-supplied alt text and falling back to an empty string by default.
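In spirit, the wrapper looks something like this framework-agnostic sketch (the names and the string-returning renderer are my own simplification, not Kyle's code):

```typescript
interface ImageProps {
  src: string;
  alt?: string; // optional at the call site...
}

// Wraps any image renderer so alt is always present: developers can
// supply descriptive text, and purely decorative images fall back to
// alt="", which tells screenreaders to skip them entirely.
function withDefaultAlt(
  render: (props: { src: string; alt: string }) => string
): (props: ImageProps) => string {
  return (props) => render({ src: props.src, alt: props.alt ?? "" });
}
```

Wrapping the static-site generator's image component this way guarantees every rendered image has an `alt` attribute, even when the developer omits one — the key being that an empty `alt` is meaningfully different from a missing one.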

I think the issues Kyle addressed impact single-page applications well beyond static sites, but he provided clear, approachable solutions to them.