Going on the academic job market in search of a tenure-track position at a research-focused institution can be scary––it was for me, at least. By the time I got up the nerve, I had a non-linear career path (4 years post-PhD in a teaching-focused position). I had dramatically changed research topics twice (advisor change in grad school, and once again for my postdoc). And, I didn’t really understand the current landscape of academia in my field, HCI, in part because my advisor never had to navigate it (Steve Harrison came directly from industry and was a Professor of Practice).
Fast forward to 2019. I am very happily seated in an office in Donald Bren Hall at UC Irvine’s Department of Informatics in my second year as an Assistant Professor. And, when I look back, I realize much of that fear was truly unnecessary. I have been collecting stories of other scholars with non-linear paths (mostly through Geraldine Fitzpatrick’s Changing Academic Life podcast, which I highly recommend), and reflecting on what I wish I had known just a couple years ago. So, in this post––which I plan to extend in chunks over time––I will share some of the resources and advice from kind mentors who helped me make it through, as well as some things I would do differently if I could have another go. I hope, wherever you may be on your journey and whatever you ultimately decide, you find parts of this post useful as you plan next steps.
I benefitted immensely from the job materials posted publicly by scholars like Jon Froehlich and Erika Poole, and materials shared by mentors like Amy Hurst. Don’t be shy to poke around the websites of your academic heroes, or even ask them directly, for copies of their materials. In the spirit of paying it forward, I am happy to share my:
Research Statement Notes: I decided to go with a two-page statement, even though longer statements are common for TT research positions. My assumption was that most faculty simply don’t have time to read more.
Teaching Statement Notes: I put aside advice to (1) keep it to one page only and (2) focus on my philosophy rather than my practice. Having worked three years as a full-time Lecturer, I had a significant amount of teaching experience under my belt, so I opted to showcase it in two pages, with evidence. Your mileage may vary.
Diversity Statement Notes: As with my teaching statement, I opted to focus on my practice. Diversity and inclusion are a core part of my identity and the research, teaching, and service I seek out. If this isn’t the case for you, my example may be less useful.
Cover Letter Notes: The cover letter should be highly tailored for each institution, but it also needs to tell the core story of your research, teaching, and service. In this copy, I’ve removed the bits that were specific to my plans at UCI.
Preview: Rounding Up Job Ads
The next section I write will cover which mailing lists I joined and which websites I scoured, as well as how I managed all of the positions in a spreadsheet. Perhaps the best advice I will give will relate to how you can make the job opportunities come to you :) Stay tuned for this and other sections, including:
Getting Feedback on Your Materials
Knowing When You’re Ready & the Narrative of “Fit”
Brews and Brains at UCI is a student-led initiative to support science communication to the general public, a topic near and dear to my heart. So, when they invited me to share my team’s research on voice assistants and people with vision impairments at a local pub, I was all in. This event took place on October 15, 2019. As of December, the work I draw on is or will soon be reported in academic-ese in various venues:
Storer, K., Judge, T.K., & Branham, S.M. “‘All in the Same Boat’: Tradeoffs of Voice Assistant Ownership for Mixed-Visual-Ability Families.” CHI 2020, forthcoming.
Abdolrahmani, A., Storer, K.M., Mukkath Roy, A.R., Kuber, R., & Branham, S.M. “Blind Leading the Sighted: Drawing Design Insights from Blind Users Towards More Productivity-Oriented Voice Interfaces.” TACCESS Journal, forthcoming.
Branham, S.M. & Mukkath Roy, A.R. “Reading Between the Guidelines: How Commercial Voice Assistant Guidelines Hinder Accessibility for Blind Users.” ASSETS 2019.
This was fun to make, and I hope you find it fun and accessible to watch. Many thanks to Brews and Brains, who honored my request to caption the video, and who didn’t tease me when I went for a wine glass instead of a stein :)
Congratulations to my MS thesis student Maya Gupta and the numerous other (under)graduate students who made our forthcoming paper (a collaboration with Ravi Kuber‘s group at UMBC) “Towards More Universal Wayfinding Technologies: Navigation Preferences Across Disabilities” possible. This interview study examined the navigation preferences of people across disability groups––primarily older adults, people with vision impairments, and people with mobility impairments––to identify which features are most universally preferred and should be included in routing technologies. Please stay tuned for a pre-print and more details about our findings.
I am excited and thankful to our CHI 2020 reviewers for supporting the publication of our latest paper “‘All in the Same Boat’: Tradeoffs of Voice Assistant Ownership for Mixed-Visual-Ability Families.” This paper was a collaboration between Kevin Storer (my talented PhD student), Tejinder Judge (Senior UX Researcher in Google’s Voice Assistant group), and myself. In this study, we looked at how blind parents with sighted partners and children negotiated tensions around use of smart speaker voice assistants in their homes. Please stay tuned for a pre-print and more details about our findings.
Thanks to generous sponsorship from Toyota, in collaboration with UMBC, over the next two years INsite Lab will be developing mobility technologies for people with a range of disabilities. More details here.
My MS thesis student, Antony Rishin, and I are looking forward to sharing our latest study on Voice Assistants for accessibility. Check out an advance copy here: “Reading Between the Guidelines: How Commercial Voice Assistant Guidelines Hinder Accessibility for Blind Users.”
Voice assistants like Google Assistant and Siri hold great potential for people who are blind––no screens (sort of)! But research done by my student Ali Abdolrahmani uncovered significant usability and accessibility challenges. Antony Rishin and I decided to find out whether these challenges can be explained by the voice assistant design guidelines published by companies. We did a content analysis of hundreds of pages of guidelines from Google, Amazon, Microsoft, Apple (which had shockingly sparse design documentation), and Alibaba. Long story short, there’s a lot of work to be done to bring these documents up to speed on accessible, usable experiences for people who are blind and others.
More details coming soon. Until then, you can download a pre-print of the paper below:
Storer, K. & Branham, S.M. “‘That’s the Way Sighted People Do It’: What Blind Parents Can Teach Technology Designers About Co-Reading with Children.” In Proceedings of the ACM Conference on Designing Interactive Systems (DIS ’19), San Diego, CA, June 23–28, 2019. 10 pages. (acceptance rate: 25%) (Honorable Mention, top 2%), forthcoming.
I am proud beyond words of my PhD student, @kmstorer! His first first-author paper, about how blind parents co-read with their children, has received an #HonorableMention at #DIS2019. Celebration when I return from #CHI2019!
I am thrilled to announce that the NSF CRII program is supporting my new research program around disabled parents and early childhood literacy development. The official abstract for the project, titled “CRII: CHS: Making Universally Usable Technologies to Enhance Parent-Child Co-Reading and Early Literacy Skills at Home,” is now published on the NSF website. The $175,000, two-year grant will primarily go toward funding the studies of a PhD student and compensating research participants for their time and expertise. Read the full story for more details, and to learn what my talented colleague, Daniel Epstein, has in store under his new NSF CRII grant :)
My outstanding PhD student, Kevin Storer, and I submitted our first paper together as advisor-advisee team this past January (squee!), and we’ve just been notified of its acceptance to DIS 2019🙂 The paper, titled “‘That’s the Way Sighted People Do It’: What Blind Parents Can Teach Technology Designers About Co-Reading with Children,” is the first HCI research to approach parent-child co-reading practices from the perspective of parents with disabilities. Stay tuned for the camera ready publication; until then, here’s our abstract:
Co-reading (when parents read aloud with their children) is an important literacy development activity for children. HCI has begun to explore how technology might support children in co-reading, but little empirical work examines how parents currently co-read, and no work examines how people with visual impairments (PWVI) co-read. PWVI’s perspectives offer unique insights into co-reading, as PWVI often read differently from their children, and (Braille) literacy holds particular cultural significance for PWVI. We observed discussions of co-reading practices in a blind parenting forum on Facebook, to establish a grounded understanding of how and why PWVI co-read. We found that PWVIs’ co-reading practices were highly diverse and affected by a variety of socio-technical concerns – and visual ability was less influential than other factors like ability to read Braille, presence of social supports, and children’s literacy. Our findings show that including blind parents in the design process offers key insights into co-reading, which help technologies in this space better meet the needs of both blind and sighted parents and children.