From Empathy to Innovation: Class 11 Student Creates Tech for the Visually Impaired

Seventeen-year-old innovator and accessibility advocate Ashwat Prasanna spoke to The Hans India about the journey behind EyeSight, an AI-powered, low-cost smart glasses solution designed to empower India's visually impaired. In this conversation, he shared how empathy, user collaboration, and cutting-edge technology came together to create an inclusive device that aims to reach over 20,000 users by 2026.
You’re just 17 and already impacting lives—how did your journey into social innovation begin so early?
I volunteer at the Premanjali Foundation for the visually impaired. During my time there, I became friends with Charan, who shared my love for math and logic, and we would spend a lot of time discussing puzzles and Olympiad questions. What moved me was when one of the teachers told me not to encourage him too much, since he would never have the opportunities I did. That unsettled me for days: in an age when we talk about robot housekeepers and driverless cars, there were still pockets where technology hadn't made its mark. Over the next few months, I researched all the available accessibility tech, what worked and what was missing, and realised that the best results come from a device designed specifically for the needs of the visually impaired: navigation, currency reading, scene description and so on. That's how the very first version of EyeSight was born, three years ago.
Can you share one user testing experience that deeply moved you or reshaped your thinking?
More than user testing, I would say the design was co-created with the users. From the outset, the design and features were shaped by users' needs and wants, and through the version iterations I received a lot of feedback about what worked and what missed the mark. One thing that hit home hard was affordability. It was easy to get carried away with the latest technology, but that would be pointless for most visually impaired users, because it would be priced beyond their reach. The challenge was to create the best possible version at the lowest cost.
How do you plan to make the device sustainable and scalable across India’s diverse regions and languages?
EyeSight uses the OpenAI API, which has incredible support for India's local languages and even dialects; that currently gives us great reach and localisation in these regions. In the future, we also plan to fine-tune or train LLMs and AI models to better suit these regions. Another major part of sustainability and scalability is affordability, which has been one of the device's most significant features: because it is 3D-printable and uses standard parts, it can be assembled by nearly anyone, for everyone.
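For readers curious about the mechanics, here is a minimal sketch of what such a cloud call could look like using the OpenAI Python SDK. The model name, prompt wording, and the describe_frame helper are illustrative assumptions, not EyeSight's actual code.

```python
# Minimal sketch (not EyeSight's actual code): ask an OpenAI model to
# describe a camera frame in the user's preferred language.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def describe_frame(jpeg_bytes: bytes, language: str = "Kannada") -> str:
    """Return a short scene description in the requested language."""
    image_b64 = base64.b64encode(jpeg_bytes).decode("ascii")
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice, not confirmed by the project
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": f"Describe this scene briefly in {language} "
                         "for a visually impaired listener."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content
```

Because the heavy model runs in the cloud, the on-device hardware can stay simple and cheap, which is part of what keeps the unit cost low.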
How does EyeSight’s offline functionality and ₹1500 pricing truly redefine affordability and accessibility in this space?
Devices with comparable features cost upwards of ₹10,000–15,000; EyeSight costs just ₹1,500, made possible by our choices of design and functionality. Why is this significant? For many institutions for the visually impaired, it is cost, more than features, that ultimately determines access. At this price, thousands more people could have access to transformative assistive technology.
With a target of 20,000 users by 2026, how do you plan to tackle scale while keeping personalisation and support intact?
So far, most prototypes have been deployed in small-scale testing, used individually rather than sold to customers. The pilot phase (set to begin in May) will include units loaned to an institution, by which time the pricing will be finalised.
We have received an IB grant of $3,000, which has been very useful in building these late-stage prototypes.
Going forward, our first step will be to conduct large-scale user testing and refine the product over the next few months. Based on the testing results, we plan to approach manufacturing units with the refined specs.
As far as reaching users is concerned, we are planning to collaborate with schools for the visually impaired in Karnataka. Samarthanam Trust, NAB Karnataka, Mitra Jyoti, and Premanjali Foundation have been of incredible help to us in our creation process. The students in these institutions will be our initial beneficiaries.
How did support from programs like IB Global Youth Action Fund and RunWay Incubation shape EyeSight’s development?
Building the technical product is one thing; taking it the last mile to reach the market is another. As a student, I needed all the help I could get to turn EyeSight into a product. RunWay Incubation is a division of PEARL Design Academy that incubates early-stage student ventures such as mine. There I learnt the fundamentals of creating a business plan and the tools of marketing and fundraising.
With this foundation, I was able to apply for and secure the IB Global Youth Action Fund grant of $3,000.
This fund, in turn, has helped me build low-fidelity prototypes and a testable prototype that I am now using for user testing.
How does EyeSight perform offline AI processing on a wearable device without needing constant cloud connectivity? What challenges did you face in optimising performance?
Currently, the device combines two paths: cloud models for richer, more detailed descriptions when online, and a lightweight on-device model for quick, essential scene inference offline. This means the basic features of identifying objects, hazards, and safety risks work regardless of an internet connection, and we are working to bring more features to the offline mode. This is especially significant because many of our users have told us that connectivity is often patchy in the areas where they typically use the product.
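As an illustration of that hybrid approach, here is a minimal sketch assuming a small open-source detector (Ultralytics YOLO) for the offline path; the cloud_describe stub, model choice, and function names are placeholders, not the project's actual code.

```python
# Illustrative sketch of the hybrid online/offline approach: try the richer
# cloud description first, then fall back to a compact on-device detector
# when the network is unavailable.
from ultralytics import YOLO

local_model = YOLO("yolov8n.pt")  # small detector that runs fully offline

def cloud_describe(frame) -> str:
    # Stand-in for the richer online path (e.g. an OpenAI vision call);
    # in this sketch it simply fails, as it would with no connectivity.
    raise ConnectionError("no internet connection")

def describe_scene(frame) -> str:
    try:
        return cloud_describe(frame)  # detailed description when online
    except (ConnectionError, OSError):
        # Offline fallback: name the essential objects and hazards in view.
        result = local_model(frame, verbose=False)[0]
        labels = {local_model.names[int(c)] for c in result.boxes.cls}
        return ("Ahead: " + ", ".join(sorted(labels))) if labels else "Nothing detected."
```

Ordering the paths this way means the user always gets the best description the connection allows, while the safety-critical basics never depend on the network.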
How do the glasses trigger emergency alerts? Are they gesture-activated or context-based through environmental detection?
A simple tap gesture on the device triggers the alert: the device informs the user and then calls emergency services. In future versions, emergencies could be identified automatically using computer vision.
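A minimal sketch of such a tap-to-alert flow follows, assuming a Raspberry Pi-class board with the gpiozero library; the pin number and the announce/dialing helpers are hypothetical placeholders rather than EyeSight's actual implementation.

```python
# Sketch of a tap-triggered emergency alert on a Raspberry Pi-style board.
from signal import pause
from gpiozero import Button

def announce(message: str) -> None:
    # Placeholder for the glasses' text-to-speech output.
    print(message)

def call_emergency_services() -> None:
    # Placeholder: the real device would dial out, e.g. via a paired phone.
    print("Calling emergency services...")

tap = Button(17)  # tap sensor wired to GPIO pin 17 (assumed wiring)

def on_tap() -> None:
    announce("Emergency alert triggered. Calling for help.")
    call_emergency_services()

tap.when_pressed = on_tap  # gpiozero fires the callback on each press
pause()                    # keep the script alive, waiting for taps
```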
What’s next for EyeSight after this prototype phase? Are there any new features or partnerships in the works?
- Our first priority is to expand user-level and field testing across multiple use cases, cooperating with NGOs and working with the students
- From a packaging standpoint, we need to improve the product's robustness and reduce the cost of various components; we have identified a hardware partner and will accelerate the product redesign
- We will apply for national and international grants and seek financial partners for scaling and a large-scale launch