
Image description: headshot of Shiri Azenkot, a woman with shoulder-length reddish-brown hair, wearing red-rimmed glasses and a denim jacket.

AI-Powered Systems for Access and Equity

How can we leverage advances in artificial intelligence to better our society? I believe that technology should be used to enable rather than disable people. In particular, I strive to design enabling systems that promote equity and improve quality of life for marginalized populations.


I research intelligent interactive systems that enhance the perception and abilities of people with disabilities. Currently, I focus on people with visual impairments, a large and understudied population that includes both people who are blind and people with low vision. My students and I conduct studies to understand people's perceptual abilities, behaviors, and experiences, and we design novel interactive systems that help people navigate, learn STEM concepts, and socialize with friends and co-workers. Here are some links to help you learn more about me and my work:


  • An overview of my work on enhanced perception systems for people with low vision.

  • A talk I gave at the University of Washington computer science colloquium.

  • Videos of recent projects: Markit and Talkit, CueSee, and Livefonts.

  • My CV.


Shiri Azenkot is an Associate Professor of Information Science at the Jacobs Technion-Cornell Institute at Cornell Tech, Cornell University, where she directs the Enhancing Ability Lab. She is also an affiliate faculty member in the Computer Science Department at the Technion – Israel Institute of Technology. Her research interests are in accessibility and interaction on new platforms. In 2019, she co-founded XR Access, a community dedicated to making augmented and virtual reality accessible to people with disabilities. Shiri frequently publishes at top HCI and accessibility conferences, including CHI, ASSETS, UIST, and UbiComp. She is the recipient of the NSF CAREER and CRII awards, as well as multiple best paper awards and nominations. Currently, her research is funded by the NSF, AOL, Verizon, and Facebook. Before arriving at Cornell Tech, she was a PhD student in Computer Science & Engineering at the University of Washington, where she was advised by Richard Ladner and Jacob Wobbrock.


For more information about me, see my CV.


▶ 2.9.2023

The application is now open for the XR Access REU for summer 2023. Apply here.


▶ 11.14.2022

I'm looking to hire a postdoc to work on XR accessibility and help lead XR Access. More info and application here!

▶ 10.27.2022

Our paper examining the microaggressions disabled people experience online was highlighted by the Cornell Chronicle. Read about it here.


▶ 10.26.2022

I'm the General Chair for ASSETS'23. Stay tuned for more updates!

▶ 9.28.2022

Honored to receive the 10-year Impact award at MobileHCI with Shumin Zhai for our paper on Text Entry with Different Postures on Smartphones.

▶ 4.1.2021

Our REU Site on XR Accessibility is now accepting applications! The deadline is April 21.

▶ 2.25.2021

My former student Yuhang Zhao has graduated and joined the faculty of the University of Wisconsin–Madison as an Assistant Professor!

▶ 7.1.2020

It's official: I am now an Associate Professor with tenure! 

▶ 6.30.2020

The XR Access Symposium will be held (virtually) on July 21-22. We have great speakers and activities lined up! Register now.

▶ 1.16.2020

Two papers accepted to CHI 2020!


▶ 11.4.2019

I received a Google Faculty Award to work on XR for people with low vision!


▶ 10.30.2019

Here's a video of our UIST paper that Yuhang Zhao presented last week in New Orleans.


▶ 10.24.2019

I gave a talk about Accessibility and Immersive Media at the Immersive Media in Medicine Symposium.

▶ 10.2.2019

I presented our work on Interactive 3D Printed Models in a plenary talk at the Society for Imaging Science and Technology conference Printing for Fabrication.

▶ 9.14.2019

Our paper was accepted for publication at ACM TACCESS! Preprint available soon.

▶ 9.6.2019

Had a great time at the Montreal AI Symposium, where I gave a keynote about my work on AI-powered accessibility.

▶ 7.16.2019

XR Access is today! This is the symposium on making XR tech accessible, which I am hosting with Larry Goldberg from Verizon Media. Follow us on Twitter: @xraccess and #xraccess.


▶ 4.1.2019

Check out my blog post on #AIandA11y.


▶ 2.28.2019

Looking for a postdoc to work on mixed reality accessibility. Details here.


▶ 11.1.2018

Our paper at ASSETS 2018 received a best paper nomination!

▶ 7.23.2018

I will be giving the keynote at the Tapia Celebration for Diversity in Computing in September. Details available here.


▶ 7.15.2018

We will present interactive tactile maps at the Accessing Higher Ground Conference on accessible media, web, and technology in November!


▶ 7.8.2018

One paper accepted at ASSETS 2018 and one accepted at MobileHCI 2018. Preprints available soon! 


▶ 2.4.2018

For a brief overview of our work on low vision, see this article Yuhang and I wrote for the SIGACCESS newsletter. 


▶ 10.31.2017

Last week I gave a keynote at the New York State AER convention. Had a great time meeting and discussing our research with everyone.


▶ 9.3.2017

I just posted videos and PDFs for some of our latest publications! Check out the videos for Markit and Talkit, and for Livefonts.


▶ 8.8.2017

Our paper in collaboration with Facebook was accepted to CSCW! Congrats to Yuhang for leading this work.


▶ 7.22.2017

A couple of belated announcements: Two papers accepted to UIST (Congrats Lei and Danielle!) and one paper accepted to ASSETS (Congrats again to Lei)! Stay tuned for some preprint PDFs.
