55 – She was blind and she taught AI how to really see. #ArtificialDecisions #MCC

She was blind. And she taught artificial intelligence how to really see.

Maya lost her sight as a child. But she never stopped reading the world: with her hands, her ears, her entire body. And with a sharp, precise mind. She studied computer science. Then, computer vision. Yes, vision. People thought she was crazy. But she knew that not seeing allowed her to notice what others missed.

Years later, she joined a team building an AI system to describe images for blind users. It worked, but poorly. It said things like “a man sitting” or “a woman with a purse.” Generic. Soulless. Sometimes even offensive. Biased labels. Bad assumptions. Maya quickly saw the issue. The AI “saw” but didn’t listen.

So she changed the method. She had human volunteers describe each image. Slowly. With nuance. With emotion. She listened to thousands of descriptions, turned them into structured data, and mapped what truly mattered to blind users: tone, relationships, context, intention.

She cleaned the original dataset. Expanded it. Removed labels like “normal” or “abnormal.” Introduced new ones that captured not just what was in the image, but why, who, and what could happen next.

She taught the AI not just to describe, but to interpret. To assist, not oversimplify. To become a way of reading the world, not a blind copy machine.

The model got better. More useful. More respectful. More accurate. Not because it saw more, but because it was trained by someone who had lost her sight and therefore listened more deeply.

Maya didn’t give artificial intelligence vision. She taught it to pay attention. Which is a whole different kind of seeing.

#ArtificialDecisions #MCC
