The inspiration for the "Life Captions" project came from a deep empathy for individuals with hearing impairments who face daily communication challenges. "Man is a social animal," and conversing with others is fundamental to being human.
In a world where sign language is not universal, and hearing aid devices can be prohibitively expensive, many people with hearing difficulties experience social isolation, loneliness, and the weight of stigma.
A personal encounter with a hearing-impaired sales representative deeply affected me. My family and I hesitated to seek assistance and attempted to navigate the conversation on our own. Similarly, my grandmother's age-related hearing loss made her feel left out of family conversations. These experiences made me imagine the daily struggles of hearing-impaired individuals and spurred me to think, "There NEEDS to be a solution for this."
The concept was to convert live streaming audio to text and display it to users in real time. While many speech-to-text systems exist, most run on remote cloud servers, requiring an internet connection for transcription and raising concerns about data privacy and ethics.
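The on-device alternative described above can be sketched as a simple streaming pipeline: audio chunks flow into a local recognizer, which emits partial captions for low latency and a finalized caption at the end of an utterance. The sketch below is purely illustrative; the `OfflineRecognizer` stub, its method names, and the fake "one chunk per word" decoding are all assumptions standing in for a real offline ASR engine (such as Vosk or whisper.cpp), not the project's actual implementation.

```python
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class Caption:
    text: str
    final: bool  # partial (still updating) vs. finalized caption

class OfflineRecognizer:
    """Stub standing in for an on-device ASR engine; no network access needed."""
    def __init__(self) -> None:
        self._buffer: List[str] = []

    def accept_chunk(self, chunk: bytes) -> Caption:
        # A real engine would decode raw audio; here each chunk pretends to be a word.
        self._buffer.append(chunk.decode("utf-8"))
        return Caption(" ".join(self._buffer), final=False)

    def flush(self) -> Caption:
        # End of utterance: emit the finalized sentence and reset.
        text = " ".join(self._buffer)
        self._buffer.clear()
        return Caption(text, final=True)

def caption_stream(chunks: Iterator[bytes], rec: OfflineRecognizer) -> Iterator[Caption]:
    """Turn a stream of audio chunks into a stream of caption updates, entirely locally."""
    for chunk in chunks:
        yield rec.accept_chunk(chunk)  # partial results keep latency low
    yield rec.flush()                  # finalized caption once the utterance ends

# Example: simulated microphone chunks
if __name__ == "__main__":
    for cap in caption_stream(iter([b"hello", b"how", b"are", b"you"]), OfflineRecognizer()):
        print("FINAL" if cap.final else "partial", cap.text)
```

The key design point is that transcription state lives entirely in the local process, so captions keep working without connectivity and no audio ever leaves the device.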
"Life Captions" emerged as a solution to these challenges, offering real-time speech transcription without needing hand gestures, lip reading, an internet connection, or the constant distraction of looking at a phone screen.
Collaborations with the "I Can Hear Foundation" and Apollo Hospitals have been pivotal in developing the "Life Captions" project. When I initially conceived the idea, I contacted various organizations specializing in hearing loss to gain knowledge of the field and access to the people it affects. The "I Can Hear Foundation" responded positively and initiated a series of online meetings, followed by in-person interactions.