I am excited by, and thankful to our CHI 2020 reviewers for supporting, the publication of our latest paper “‘All in the Same Boat’: Tradeoffs of Voice Assistant Ownership for Mixed-Visual-Ability Families.” This paper was a collaboration among Kevin Storer––my talented PhD student––Tejinder Judge––Senior UX Researcher in Google’s Voice Assistant group––and me. In this study, we looked at how blind parents with sighted partners and children negotiated tensions around the use of smart speaker voice assistants in their homes. Please stay tuned for a pre-print and more details about our findings.
Thanks to generous sponsorship from Toyota, and in collaboration with UMBC, the INsite Lab will spend the next two years developing mobility technologies for people with a range of disabilities. More details here.
My MS thesis student, Antony Rishin, and I are looking forward to sharing our latest study on Voice Assistants for accessibility. Check out an advance copy here: “Reading Between the Guidelines: How Commercial Voice Assistant Guidelines Hinder Accessibility for Blind Users.”
Voice assistants like Google Assistant and Siri hold great potential for people who are blind––no screens (sort of)! But research done by my student Ali Abdolrahmani uncovered significant usability and accessibility challenges. Antony Rishin and I decided to find out whether these challenges can be explained by the voice assistant design guidelines published by companies. We did a content analysis of hundreds of pages of guidelines from Google, Amazon, Microsoft, Apple (which had shockingly sparse design documentation), and Alibaba. Long story short, there’s a lot of work to be done to bring these documents up to speed on accessible, usable experiences for people who are blind, and for others.
More details coming soon. Until then, you can download a pre-print of the paper below:
Storer, K. & Branham, S.M. “‘That’s the Way Sighted People Do It’: What Blind Parents Can Teach Technology Designers About Co-Reading with Children.” In Proceedings of the ACM Conference on Designing Interactive Systems (DIS ’19), San Diego, CA, June 23–28, 2019. 10 pages. (acceptance rate: 25%) (Honorable Mention – top 2%) (forthcoming)
I am proud of my MS Thesis student, Antony Rishin Mukkath Roy, for taking home First Place at UMBC’s “Gritty Talk” graduate student competition during GEARS 2019. He also presented preliminary work at iConference 2018, in a poster titled “Beyond being human: The (in)accessibility consequences of modeling VAPAs after human-human conversation.”
I am thrilled to announce that the NSF CRII program is supporting my new research program around disabled parents and early childhood literacy development. The official abstract for the project, titled “CRII: CHS: Making Universally Usable Technologies to Enhance Parent-Child Co-Reading and Early Literacy Skills at Home,” is now published on the NSF website. The $175,000, two-year grant will primarily go toward funding the studies of a PhD student and compensating research participants for their time and expertise. Read the full story for more details, and to learn what my talented colleague, Daniel Epstein, has in store under his new NSF CRII grant. :)
My outstanding PhD student, Kevin Storer, and I submitted our first paper together as an advisor-advisee team this past January (squee!), and we’ve just been notified of its acceptance to DIS 2019🙂 The paper, titled “‘That’s the Way Sighted People Do It’: What Blind Parents Can Teach Technology Designers About Co-Reading with Children,” is the first HCI research to approach parent-child co-reading practices from the perspective of parents with disabilities. Stay tuned for the camera-ready publication; until then, here’s our abstract:
Co-reading (when parents read aloud with their children) is an important literacy development activity for children. HCI has begun to explore how technology might support children in co-reading, but little empirical work examines how parents currently co-read, and no work examines how people with visual impairments (PWVI) co-read. PWVI’s perspectives offer unique insights into co-reading, as PWVI often read differently from their children, and (Braille) literacy holds particular cultural significance for PWVI. We observed discussions of co-reading practices in a blind parenting forum on Facebook, to establish a grounded understanding of how and why PWVI co-read. We found that PWVI’s co-reading practices were highly diverse and affected by a variety of socio-technical concerns – and visual ability was less influential than other factors like ability to read Braille, presence of social supports, and children’s literacy. Our findings show that including blind parents in the design process offers key insights into co-reading, which help technologies in this space better meet the needs of both blind and sighted parents and children.
Recently, a paper I co-authored with Cindy Bennett and Erin Brady about “Interdependence” was accepted to my favorite conference and, icing on the cake, received an award. This has been wildly exciting for me, because when I started to write about this very topic four years ago, others in my academic circle were uncertain that the topic was worthwhile. Especially as a PhD student or a young postdoc, it can be difficult to turn skepticism from your academic heroes into productive argument-fortification. For me, this is a story of “stick-with-it-ness” that helps me remember nothing is impossible if you keep questioning, keep reading, and––most importantly––keep coalition-building. Thank you, Cindy, Erin, ASSETS 2018, and all the people from feminist / DS / STS communities and the disability activists who inspired us.
Our paper “Safe Spaces and Safe Places” earned a Recognition of Contribution to Diversity and Inclusion along with four other awesome papers at CSCW 2018! This community continues to show its commitment to inclusive excellence as it helps us raise awareness of the safety and wellbeing of transgender people. Thank you CSCW and SIGCHI!
Cindy Bennett, Erin Brady, and I are so pleased to learn our paper “Interdependence: A Frame for Assistive Technology Research and Design” earned a Student Best Paper Award (top 2% of submissions) at ASSETS 2018! This is one of my favorite papers of all time––so happy others are enjoying it, too.