Video – Innovating the Future of Work with Blind People

Last April, I was delighted to engage in conversation about my lab’s research as part of the HCI and the Future of Work and Wellbeing dialogue series, hosted virtually at Wellesley College. Alongside my PhD students Ali Abdolrahmani, Kevin Storer, and Emory Edwards, I shared what we have learned about the future of work based on the experiences of people who are blind or low vision. The title, abstract, and video recording of our lively dialogue can be found below.

Title: Innovating the Future of Work with Blind People

Abstract: Time and again, when technologists have imagined the future of work, they have done so without consideration of people who are blind. Look no further than the display you are currently reading—the first displays and touchscreens appeared in the 1960s and 70s, while the first screen reader to make them accessible wasn’t invented until 1986. This is not atypical; most technologies are indeed “retrofit” for accessibility, often years or decades after their first introduction. Given this, how exactly do blind people work in the 21st century? What technical barriers do they face, and to what extent are barriers technical as opposed to sociocultural? How do we break the innovate-retrofit cycle, and what role can HCI scholars and practitioners play? For the past 7 years, my research has explored these questions with blind students and collaborators, through qualitative inquiry and participatory design—an approach that, I argue, not only results in accessible technologies from the start, but can also lead to radical innovation that improves work for all. I look forward to engaging these ideas in dialogue with you.

CHI21 – Paper Accepted – Latte: Automating use case testing for accessibility

I was fortunate to collaborate with some of my colleagues in Software Engineering here at UCI on this work, led by PhD student Navid Salehnamadi. Latte builds on the pervasive practice of GUI use case testing, but executes those use cases by simulating screen reader (for people who are blind) and switch (for people who have limited dexterity) navigation through Android apps. Check out the technical details, how this approach outperforms contemporary methods, and what we learned about the future of accessible app development.

Salehnamadi, N., Alshayban, A., Lin, J.-W., Ahmed, I., Branham, S.M., Malek, S. “Latte: Use-Case and Assistive-Service Driven Automated Accessibility Testing Framework for Android.” In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’21), Online Virtual Conference (originally Yokohama, Japan), May 8-13, 2021. (acceptance rate: 26%)

CHI21 – Paper Accepted – Voice interfaces and childhood literacy

I was fortunate to collaborate with recent UCI graduate Ying Xu, from the UCI School of Education, on this exciting study of voice-based communication apps targeting children. When we compared recommended adult-child communication patterns for building early literacy skills with those currently available through voice interfaces, we found the latter very much lacking. Check out our video preview and full paper for design recommendations.

Xu, Y., Branham, S.M., Deng, X., Collins, P., Warschauer, M. “Are Current Voice Interfaces Designed to Support Children’s Language Development?” In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’21), Online Virtual Conference (originally Yokohama, Japan), May 8-13, 2021. (acceptance rate: 26%)

CHI21 – Paper Accepted – Transactional voice assistants

Building from our collaboration with Toyota, my senior PhD student, Ali Abdolrahmani, led this paper on how we can make voice assistants work better for both blind and sighted folks in contexts outside of the home. Check out our short video preview, and read the full paper!

Abdolrahmani, A., Gupta, M.H., Vader, M.-L., Kuber, R., Branham, S.M. “Towards More Transactional Voice Assistants: Investigating the Potential for a Multimodal Voice-Activated Indoor Navigation Assistant for Blind and Sighted Travelers.” In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’21), Online Virtual Conference (originally Yokohama, Japan), May 8-13, 2021. (acceptance rate: 26%)

Blog – Academic Job Market – Pt.1 – Materials

Quick links: Research Statement | Teaching Statement | Diversity Statement | Cover Letter

Going on the academic job market in search of a tenure-track position at a research-focused institution can be scary––it was for me, at least. By the time I got up the nerve, I had a non-linear career path (4 years post-PhD in a teaching-focused position). I had dramatically changed research topics twice (advisor change in grad school, and once again for my postdoc). And, I didn’t really understand the current landscape of academia in my field, HCI, in part because my advisor never had to navigate it (Steve Harrison came directly from industry and was a Professor of Practice).

Fast forward to 2019. I am very happily seated in an office in Donald Bren Hall at UC Irvine’s Department of Informatics in my second year as an Assistant Professor. And, when I look back, I realize much of that fear was truly unnecessary. I have been collecting stories of other scholars with non-linear paths (mostly through Geraldine Fitzpatrick’s Changing Academic Life podcast, which I highly recommend), and reflecting on what I wish I had known just a couple years ago. So, in this post––which I plan to extend in chunks over time––I will share some of the resources and advice from kind mentors who helped me make it through, as well as some things I would do differently if I could have another go. I hope, wherever you may be on your journey and whatever you ultimately decide, you find parts of this post useful as you plan next steps.

Materials

I benefitted immensely from the job materials posted publicly by scholars like Jon Froehlich and Erika Poole, and materials shared by mentors like Amy Hurst. Don’t be shy to poke around the websites of your academic heroes, or even ask them directly, for copies of their materials. In the spirit of paying it forward, I am happy to share my:

  • Research Statement
    Notes: I decided to go for a two-page statement, though for a TT research position, longer statements are common. My assumption is that most faculty don’t have time to read more.
  • Teaching Statement
    Notes: I put aside advice to (1) make this one page only, and (2) make it about my philosophy as opposed to my practice. Having worked three years as a full-time Lecturer, I had a significant amount of teaching experience under my belt, so I opted to showcase this in two pages with evidence. Your mileage may vary.
  • Diversity Statement
    Notes: As with my teaching statement, I opted to focus on my practice. Diversity and inclusion are a core part of my identity and the research, teaching, and service I seek out. If this isn’t the case for you, my example may be less useful.
  • Cover Letter
    Notes: The cover letter should be highly tailored for each institution, but it also needs to tell the core story of your research, teaching, and service. In this copy, I’ve removed the bits that were specific to my plans at UCI.

Preview: Rounding Up Job Ads

The next section will cover the mailing lists I joined and the websites I scoured, as well as how I managed all of the positions in a spreadsheet. Perhaps the best advice I will give will relate to how you can make the job opportunities come to you:) Stay tuned for this and other sections, including:

  • Getting Feedback on Your Materials
  • Knowing When You’re Ready & the Narrative of “Fit”
  • Preparing for Phone and On-Site Interviews

Video – 10 min summary of Voice Assistant research

Brews and Brains at UCI is a student-led initiative to support science communication to the general public, a topic near and dear to my heart. So, when they invited me to share my team’s research on voice assistants and people with vision impairments at a local pub, I was all in. This event took place on October 15, 2019. As of December, the work I draw on is or will soon be reported in academic-ese in various venues:

  • Storer, K., Judge, T.K., & Branham, S.M. “‘All in the Same Boat’: Tradeoffs of Voice Assistant Ownership for Mixed-Visual-Ability Families.” CHI 2020, forthcoming
  • Abdolrahmani, A., Storer, K.M., Mukkath Roy, A.R., Kuber, R., & Branham, S.M. “Blind Leading the Sighted: Drawing Design Insights from Blind Users Towards More Productivity-Oriented Voice Interfaces.” TACCESS, forthcoming
  • Branham, S.M. & Mukkath Roy, A.R. “Reading Between the Guidelines: How Commercial Voice Assistant Guidelines Hinder Accessibility for Blind Users.” ASSETS 2019
  • Storer, K. & Branham, S.M. “That’s the Way Sighted People Do It: What Blind Parents Can Teach Technology Designers About Co-Reading with Children.” DIS 2019

This was fun to make, and I hope you find it fun and accessible to watch. Many thanks to Brews and Brains, who honored my request to caption the video, and who didn’t tease me when I went for a wine glass instead of a stein:)

CHI20 – Paper Cond. Accepted – navigation needs across disabilities!

Congratulations to my MS thesis student Maya Gupta and the numerous other (under)graduate students who made our forthcoming paper (a collaboration with Ravi Kuber’s group at UMBC) “Towards More Universal Wayfinding Technologies: Navigation Preferences Across Disabilities” possible. This interview study examined the navigation preferences of people across disability groups––primarily older adults, people with vision impairments, and people with mobility impairments––to identify which features are most universally preferred in routing technologies. Please stay tuned for a pre-print and more details about our findings.

CHI20 – Paper Cond. Accepted – mixed-visual-ability use of VAs!

I am excited and thankful to our CHI 2020 reviewers for supporting the publication of our latest paper, “‘All in the Same Boat’: Tradeoffs of Voice Assistant Ownership for Mixed-Visual-Ability Families.” This paper was a collaboration between Kevin Storer (my talented PhD student), Tejinder Judge (Senior UX Researcher in Google’s Voice Assistant group), and myself. In this study, we looked at how blind parents with sighted partners and children negotiated tensions around the use of smart speaker voice assistants in their homes. Please stay tuned for a pre-print and more details about our findings.