This award marks an exciting collaboration between my lab and those of my colleagues Sam Malek and Iftekhar Ahmed in Software Engineering. We'll be developing automated software engineering tools to enhance the accessibility of Android applications, while also seeking to educate developers about the experiences of screen-reader and switch users. See the official abstract below.
The ability to use software with ease is important for everyone, and especially so for the approximately 15% of the world's population with disabilities. Even the simplest operations, taken for granted by most users, can be daunting tasks for disabled users. Unfortunately, in the current state of affairs, software inaccessibility is widespread. This can be attributed, at least in part, to deficiencies in the techniques and tools available to software engineers. Automated solutions for validating the accessibility of software are woefully insufficient: they either fail to detect many real accessibility issues or report too many superficial issues that are irrelevant in practice. Automated repair techniques, shown to be quite effective for improving other quality attributes of software (e.g., reliability, security), are scarce for accessibility. Existing interaction modalities are too rigid and cumbersome, seriously hindering disabled users' use and enjoyment of the profound advances in software technology.
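To make the kind of issue at stake concrete, here is a minimal, hypothetical sketch of a rule-based accessibility check: it flags interactive elements that have no label a screen reader (such as Android's TalkBack) could announce. The dict-based view hierarchy, field names, and detection rule are illustrative assumptions, not the project's actual technique, which analyzes real Android UIs.

```python
# Illustrative sketch only: a toy detector for one common accessibility
# issue -- clickable views with no screen-reader label. View hierarchies
# are modeled here as plain dicts; real tools inspect the rendered UI tree.

def find_unlabeled_elements(view):
    """Recursively collect ids of clickable views lacking any label."""
    issues = []
    clickable = view.get("clickable", False)
    # A view is announceable if it has visible text or a contentDescription.
    has_label = bool(view.get("contentDescription") or view.get("text"))
    if clickable and not has_label:
        issues.append(view.get("id", "<unknown>"))
    for child in view.get("children", []):
        issues.extend(find_unlabeled_elements(child))
    return issues

hierarchy = {
    "id": "root",
    "children": [
        {"id": "share_btn", "clickable": True},                               # flagged
        {"id": "save_btn", "clickable": True, "text": "Save"},                # ok
        {"id": "icon", "clickable": True, "contentDescription": "Settings"},  # ok
    ],
}

print(find_unlabeled_elements(hierarchy))  # -> ['share_btn']
```

Checks of this shallow, syntactic kind are exactly what existing scanners do well; the harder issues the abstract alludes to only surface when an assistive service actually drives the app through its use cases.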
This project lays the groundwork for innovative technologies that will enable end-users with vision and motor impairments to interact more effectively with software. Combining empirical data-driven research and tool-building activities, the project will advance the state-of-the-art in several ways. First, the team of researchers will devise a use-case- and assistive-service-driven accessibility issue detection technique capable of automatically identifying accessibility issues that are not detectable using existing state-of-the-art techniques. Second, the researchers will develop an automated program repair solution that employs a combination of novel deep-learning and search-based strategies for fixing a variety of accessibility issues. Furthermore, the team will construct the means for automatically identifying use-case macros for navigation optimization, allowing a disabled user to rapidly execute frequently accessed use cases through intuitive commands. Finally, using a mixed-methods approach of user studies and interviews with both disabled users and software engineers, the researchers will evaluate the efficacy of the techniques developed in this project. Ultimately, the project will result in a publicly available suite of tools that help developers improve the accessibility of the software systems they construct.
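The repair idea can be sketched in the same toy terms: given a clickable view with no screen-reader label, synthesize one. The naive heuristic below (deriving a label from the view id) is a stand-in assumption of mine; the project proposes deep-learning and search-based strategies in place of anything this simplistic.

```python
# Illustrative sketch only: a toy "repair" that fills in a missing
# contentDescription for unlabeled clickable views. The label heuristic
# (derive from the view id) is a placeholder, not the project's method.

def repair_unlabeled(view):
    """Return a copy of the hierarchy with missing labels filled in."""
    fixed = dict(view)
    has_label = bool(fixed.get("contentDescription") or fixed.get("text"))
    if fixed.get("clickable") and not has_label:
        # e.g., "share_btn" -> "share"
        raw = fixed.get("id", "button")
        fixed["contentDescription"] = raw.removesuffix("_btn").replace("_", " ")
    fixed["children"] = [repair_unlabeled(c) for c in view.get("children", [])]
    return fixed

before = {"id": "share_btn", "clickable": True, "children": []}
after = repair_unlabeled(before)
print(after["contentDescription"])  # -> share
```

Even in this toy form, the split mirrors the abstract's structure: detection locates the fault, and repair transforms the UI definition so an assistive service can announce the element.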