How might machine learning lower barriers to accessibility?

The NCSU first-year Master of Graphic Design studio took on just this question over the course of the semester. We began with a series of conversational interface studies, moved on to a Future Artifact workshop with the amazing Krissi Xenakis, and then capped off the semester with a 6-week collaboration with IBM’s Watson Health team.

Over the course of this project, the grad students investigated how machine learning might lower barriers to accessibility for blind or visually impaired (BVI) and deaf or hard-of-hearing (DHH) users.

Enjoy these scenario videos of their final concepts, and scroll down for a look at their user-centered design process.

NICO Project

Designers: Ellis Anderson, Alysa Buchanan, Matt Babb

Here-U Project

Designers: Shadrick Addy, Jessye Holmgren-Sidell, Matt Lemmond, Krithika Sathyamurthy

The Design Process

  1. Matrix exercise: mapping Watson tools to problematic tasks for DHH and BVI users
  2. Benchmarking and User Interviews
  3. Personas and Scenarios
  4. User journey map of the current "as is" user experience
  5. Ideation: a "What If?" exercise to explore possibilities, improv-style
  6. Sketches
  7. Roughs
  8. Storyboards of the user experience
  9. Crits with IBM
  10. User Testing
  11. Revised User Journey Maps
  12. Hi-fi prototypes
  13. Scenario videos and final presentations