3D Modeling for Blind Programmers

by Priyanka Patel

DALLAS, February 7, 2026 — For blind and low-vision programmers, the world of 3D modeling has historically been out of reach. A new prototype program, A11yShape, is changing that, allowing these creators to independently design, inspect, and refine 3D models without relying on sighted assistance.

A New Dimension in Accessibility

A11yShape aims to bridge the gap for visually impaired programmers, opening doors to careers in hardware design, robotics, and beyond.

  • Traditionally, 3D design software relies on visual interaction, creating a barrier for those with visual impairments.
  • A11yShape leverages code-based modeling and AI assistance to overcome this challenge.
  • The program integrates with OpenSCAD, a script-based 3D modeling editor, and OpenAI’s GPT-4o model for real-time support.
  • User testing has shown promising results, with participants reporting a new perspective on 3D modeling.

For visually impaired programmers, writing elegant code is often only half the battle. Without accessible modeling software, they have been unable to fully translate their ideas into tangible, verifiable designs, whether physical or virtual. That’s now beginning to shift.

Re-imagining Assistive 3D Design With OpenSCAD

A11yShape builds upon OpenSCAD, a 3D modeling editor that allows users to create designs entirely through text-based scripting, eliminating the need for visual dragging and clicking. The program enhances OpenSCAD by connecting modeling components across three user interface panels.
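Because OpenSCAD models are plain scripts, a screen reader can traverse them line by line, the same way it would any source file. A minimal illustrative example (not taken from the study) shows how a shape is described entirely in text:

```openscad
// A simple mug, defined entirely in text -- no mouse interaction needed.
difference() {
    cylinder(h = 40, r = 15);          // outer body
    translate([0, 0, 2])
        cylinder(h = 40, r = 13);      // carve out the inside
}
// Handle: a thin torus attached to the side of the body.
translate([15, 0, 20])
    rotate([90, 0, 0])
        rotate_extrude()
            translate([8, 0, 0])
                circle(r = 2);
```

Each operation is an ordinary statement with named parameters, so a blind programmer can read, edit, and version the model like any other code.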

The tool also introduces an AI Assistance Panel, enabling users to submit queries to GPT-4o to validate design decisions and debug OpenSCAD scripts. When a user selects a piece of code or a model component, A11yShape highlights the corresponding element in all three panels and updates the description, ensuring clarity for blind and low-vision users.
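The cross-panel behavior described above suggests a simple underlying idea: each model component is linked to the span of code that produced it, so a selection in any panel can be resolved to the same element everywhere. A hypothetical Python sketch of such a mapping (all names are illustrative; this is not A11yShape’s actual code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Component:
    """One element of the model, linked to the code that produced it."""
    name: str
    code_start: int   # first line of the OpenSCAD snippet for this part
    code_end: int     # last line (inclusive)
    description: str  # AI-generated text read aloud by the screen reader

# Illustrative model with two parts.
components = [
    Component("body", 1, 5, "A hollow cylinder 40 mm tall."),
    Component("handle", 6, 10, "A thin torus attached to the body's side."),
]

def select_by_code_line(line: int) -> Optional[Component]:
    """Resolve a cursor position in the code panel to a model component,
    so the same element can be highlighted in every panel."""
    for c in components:
        if c.code_start <= line <= c.code_end:
            return c
    return None
```

With a structure like this, placing the cursor on line 7 resolves to the handle, and its stored description can be announced while the handle is highlighted in the model-structure and rendering panels.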

A11yShape’s three panels synchronize code, AI descriptions, and model structure so blind programmers can discover how code changes affect designs independently.
Anhong Guo, Liang He, et al.

User Feedback Shapes Development

The research team, led by Liang He, assistant professor of computer science at the University of Texas at Dallas, recruited four participants with varying degrees of visual impairment and programming experience. Participants used A11yShape to design models, and their workflows were observed. One participant, new to 3D modeling, reported that the tool “provided [the blind and low-vision community] with a new perspective on 3D modeling, demonstrating that we can indeed create relatively simple structures.”

However, participants also noted that lengthy text descriptions can make it difficult to grasp complex shapes. Several expressed the need for tactile feedback, either through physical models or tactile displays, to fully visualize their designs. To assess the accuracy of the AI-generated descriptions, the team gathered feedback from 15 sighted participants, who rated them between 4.1 and 5 on a scale of 1–5 for geometric accuracy, clarity, and the absence of errors.

A failed all-at-once attempt to construct a 3D helicopter shows incorrect shapes and misplaced elements. In contrast, when the user completes each individual element before moving on, results improve significantly.
A11yShape, a new assistive program, helps blind and low-vision programmers verify the design of their models.
Source: Liang He, Anhong Guo, et al.

He plans to incorporate this feedback into future iterations, potentially integrating tactile displays, real-time 3D printing capabilities, and more concise AI-generated audio descriptions. He also emphasized the potential of A11yShape to lower the barrier to entry for aspiring blind and low-vision programmers.

“People like being able to express themselves in creative ways… using technology such as 3D printing to make things for utility or entertainment,” says Stephanie Ludi, director of DiscoverABILITY Lab and professor of computer science and engineering at the University of North Texas. “Persons who are blind and visually impaired share that interest, with A11yShape serving as a model to support accessibility in the maker community.”

The team presented A11yShape at the ASSETS conference in Denver in October.
