AI-Powered Systems for Access and Equity
How can we leverage advances in artificial intelligence to better our society? I believe that technology should be used to enable rather than disable people. In particular, I strive to design enabling systems that promote equity and improve quality of life for marginalized populations.
I research intelligent interactive systems that enhance the perception and ability of people with disabilities. Currently, I focus on people with visual impairments, a large and understudied population that includes people who are blind and people with low vision. My students and I conduct studies to understand people's perceptual abilities, behaviors, and experiences, and design novel interactive systems that help people navigate, learn STEM concepts, and socialize with friends and co-workers. Here are some links to help you learn more about me and my work:
Shiri Azenkot is an Associate Professor of Information Science at the Jacobs Technion-Cornell Institute at Cornell Tech, Cornell University, where she directs the Enhancing Ability Lab. She is also an affiliate faculty member in the Computer Science Department at the Technion-Israel Institute of Technology. Her research interests are in accessibility and interaction on new platforms. In 2019, she co-founded XR Access, a community dedicated to making augmented and virtual reality accessible to people with disabilities. Shiri frequently publishes at top HCI and accessibility conferences, including CHI, ASSETS, UIST, and UbiComp. She is the recipient of the NSF CAREER and CRII awards, and multiple best paper awards and nominations. Currently, her research is funded by the NSF, AOL, Verizon, and Facebook. Before arriving at Cornell Tech, she was a PhD student in Computer Science & Engineering at the University of Washington, where she was advised by Richard Ladner and Jacob Wobbrock.
For more information about me, see my CV.
It's official: I am now an Associate Professor with tenure!
Two papers accepted to CHI 2020!
I received a Google Faculty Award to work on XR for people with low vision!
Here's a video of our UIST paper that Yuhang Zhao presented last week in New Orleans.
I gave a talk about Accessibility and Immersive Media at the Immersive Media in Medicine Symposium.
Our paper was accepted for publication at ACM TACCESS! Preprint available soon.
Had a great time at the Montreal AI Symposium, where I gave a keynote about my work on AI-powered accessibility.
XR Access is today! This is the symposium on making XR Tech accessible, hosted by myself and Larry Goldberg from Verizon Media. Follow us on Twitter: @xraccess and #xraccess.
Check out my blog post on #AIandA11y.
Looking for a postdoc to work on mixed reality accessibility. Details here.
Our paper at ASSETS 2018 received a best paper nomination!
I will be giving the keynote at the Tapia Celebration of Diversity in Computing in September. Details available here.
We will present interactive tactile maps at the Accessing Higher Ground Conference on accessible media, web, and technology in November!
For a brief overview of our work on low vision, see this article Yuhang and I wrote for the SIGACCESS newsletter.
Last week I gave a keynote at the New York State AER convention. Had a great time meeting and discussing our research with everyone.
Our paper in collaboration with Facebook was accepted to CSCW! Congrats to Yuhang for leading this work.
A couple of belated announcements: Two papers accepted to UIST (congrats, Lei and Danielle!) and one paper accepted to ASSETS (congrats again, Lei)! Stay tuned for preprint PDFs.