
Overview
Effective communication is crucial, especially in critical sectors like healthcare. However, not everyone can engage with existing digital communication tools in the same way. Deaf individuals, in particular, face significant challenges in scenarios where non-verbal communication is essential. "AIDA" is a conceptual solution designed to bridge this gap. By leveraging machine learning, my goal is to transform the healthcare experience for deaf individuals, empowering them to communicate effectively with healthcare professionals through video calls.
Objective
The primary objective is to enable seamless communication between deaf patients and doctors. By translating sign language into text in real-time, we can ensure that deaf individuals can accurately and effectively convey their health concerns, symptoms, and problems to medical professionals without the need for an intermediary. This direct line will enhance the quality of healthcare services provided to the deaf community, fostering a more inclusive environment in medical consultations.
The problem
The communication barrier between deaf individuals and healthcare providers is a significant challenge. According to the World Health Organization, over 5% of the world's population – or 466 million people – have disabling hearing loss, and this number is expected to rise to over 900 million by 2050.
This population often faces substantial obstacles in accessing healthcare services effectively due to communication barriers. In many cases, the lack of a common language between deaf patients and healthcare providers can lead to misdiagnoses, poor patient satisfaction, and inadequate healthcare.
The target users of AIDA are therefore deaf or hard-of-hearing individuals who use sign language as their primary mode of communication. This group requires a reliable and efficient way to communicate with healthcare providers, especially in situations where immediate or accurate communication is vital. Secondary users include healthcare professionals who interact with deaf patients. They need an intuitive and accurate system to understand and respond to the patients' needs effectively.
Research
In order to gain a clear understanding of the needs and preferences of the deaf community, extensive user research was carried out. This research was conducted in two distinct phases:
Literature Review and Analysis: A review of published materials was conducted to analyze and evaluate the use and effects of sign language. This included academic studies, surveys, and other publications on the use of sign language in healthcare settings.
Direct Engagement with the Deaf Community: I actively interacted with numerous members of the deaf community. These conversations shed light on the practicalities of using sign language and the drawbacks of existing communication methods. This ensured that the user research was grounded in actual user experiences and needs.
The research led to the development of detailed personas, each representing distinct segments of the deaf community with their unique needs and preferences.
Anna Middleton et al.'s study, "Preferences for communication in clinic from deaf people," surveyed 999 deaf and hard of hearing individuals in the UK to understand their communication preferences in clinical settings. Of the participants who used sign language, 50% expressed a preference for consultations through a sign language interpreter, while 43% preferred direct consultations with health professionals who are fluent in sign language. Only a small fraction, 7%, were open to spoken consultations, provided the health professional demonstrated strong deaf awareness, such as proficiency in lip-reading.
Complementary research conducted in the United States (Scheier, 2009) and the Netherlands (Smeijers & Pfau, 2009) highlighted the same issue of miscommunication between healthcare providers and their patients. The study in the United States found that miscommunication often led to misunderstandings in both diagnostic and therapeutic areas. In the Netherlands, only 13% of doctors and patients rated their patient-general practitioner (GP) communication as good, while 39% described it as moderate or poor. In a similar survey, U.S. physicians noted significantly greater difficulties in communicating with deaf patients compared to hearing patients (Ralston, Zazove, & Gorenflo, 1996). This communication barrier resulted in deaf patients being less likely to trust their physicians and to understand diagnoses and treatments. In a parallel finding from the United Kingdom, 44% of deaf patients reported difficulties in their last interaction with their GP or health center, a stark contrast to the 17% reported in a general population patient survey (NHS England, 2015; SignHealth, 2013).
Many of the mentioned challenges were also highlighted in the conversations I had with various members of the deaf community. Their personal experiences and insights reinforced the need for improved communication between healthcare providers and deaf patients.
Creating a solution
Following the research, I began developing a solution focused on addressing the identified communication challenges. The goal was to create an intuitive, user-friendly interface that facilitates seamless sign-language-to-text translation. The initial phase involved creating basic wireframes to outline the primary functions and layout of the application. This stage was important for establishing the foundation of the user interface.
Next, I developed a more detailed user interface, focusing on ease of use, accessibility, and visual appeal. The design process took into account color schemes, typography, and iconography that would be easily understandable for our target audience. The translation interface is designed to be real-time and interactive, displaying the translated text as the user signs.
An interactive prototype was then created to simulate the user experience. This prototype was important in visualizing the flow and functionality of the solution.
From the patient's perspective, the doctor's spoken language is translated in real time, providing an immediate and accessible understanding of the conversation. This real-time translation bridges the communication gap, allowing the patient to follow the consultation as it happens, just like a spoken conversation.
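Conceptually, the real-time display described above can be thought of as a rolling caption buffer: a recognition backend (speech-to-text on the doctor's side, sign recognition on the patient's side) emits words one at a time, and the interface appends each word to the visible caption as it arrives. The sketch below is a minimal illustration of that idea only; the class name, window size, and simulated word stream are my own assumptions, not part of the actual prototype.

```python
from collections import deque


class CaptionStream:
    """Rolling caption buffer: keeps the most recent words on screen
    as a recognition backend emits them one at a time."""

    def __init__(self, max_words: int = 12):
        # Oldest words scroll off automatically once the window is full.
        self.words = deque(maxlen=max_words)

    def push(self, word: str) -> str:
        """Append a newly recognized word and return the current caption."""
        self.words.append(word)
        return " ".join(self.words)


# Simulated output from a recognition backend (illustrative only):
stream = CaptionStream(max_words=5)
for word in "does the pain get worse at night".split():
    caption = stream.push(word)
print(caption)  # only the 5 most recent words remain visible
```

In a real interface the caption would update on every `push`, so the patient sees the sentence build up live rather than waiting for the doctor to finish speaking.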
Prototyping and results
Following a demonstration-focused prototyping session in which users observed the functionality of the solution without directly interacting with it, the following statistics were gathered to support the potential of the solution:
Challenge and complexity
Language variations
Sign languages are not universal. Each country, and sometimes each region within a country, has its own sign language, such as American Sign Language (ASL) or British Sign Language (BSL). This diversity poses a significant challenge in developing a universal translation tool. Just as with spoken languages, sign languages have dialects and idiomatic expressions that can vary widely, even within the same language. Ensuring accurate translation across these variations is a complex task. For the conceptual prototype I used ASL, as it aligns best with my native language.
Facial Expressions
Sign language is not limited to hand gestures; it also includes facial expressions and body language. Facial expressions can change the entire meaning of a sign and are essential for conveying tone and emotion. Capturing and correctly interpreting these subtle cues is incredibly challenging. I tried Google's facial landmark model, which is designed to recognize and interpret facial cues accurately, and it worked quite well, performing with a high degree of accuracy.
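To illustrate how landmark output can feed into sign interpretation: facial landmark models such as Google's return normalized (x, y) coordinates for facial points, from which simple expression features can be derived. For example, raised eyebrows often mark a yes/no question in ASL. The landmark names, coordinates, and feature below are hypothetical, chosen only to show the principle, not taken from any particular model's output format.

```python
def eyebrow_raise_score(landmarks: dict) -> float:
    """Relative eyebrow height: distance from eyebrow to eye, normalized
    by face height so the score is independent of how close the signer
    sits to the camera. Larger values suggest raised eyebrows."""
    _, brow_y = landmarks["left_eyebrow"]
    _, eye_y = landmarks["left_eye"]
    _, chin_y = landmarks["chin"]
    _, forehead_y = landmarks["forehead"]
    face_height = abs(chin_y - forehead_y)
    return abs(eye_y - brow_y) / face_height


# Illustrative coordinates in normalized image space (y grows downward):
neutral = {"forehead": (0.50, 0.20), "left_eyebrow": (0.42, 0.35),
           "left_eye": (0.42, 0.40), "chin": (0.50, 0.80)}
raised = {**neutral, "left_eyebrow": (0.42, 0.31)}

print(eyebrow_raise_score(neutral) < eyebrow_raise_score(raised))  # True
```

A real system would track many such features over time and combine them with the hand-gesture classifier, since the same hand sign can carry different meanings depending on the accompanying expression.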
Other use cases
In addition to its application in medical scenarios, I have also created a prototype for an innovative use of this tool as an accessibility feature in everyday communication. Envisioned as a parallel to the voice recording shortcut in messaging apps, this feature allows for the translation of sign language into text messages. With simple gestures, users can communicate in sign language, and the tool converts these signs into written text. This text can then easily be sent as a message to friends or contacts.
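The messaging shortcut described above can be sketched as a final rendering step: once the classifier has produced a sequence of sign labels, they are converted into a sendable text message. The function name, label format, and naive rendering below are illustrative assumptions; notably, ASL gloss omits many English function words, so a production system would need grammar-aware rendering rather than a simple join.

```python
def signs_to_message(sign_labels: list) -> str:
    """Turn a sequence of recognized sign labels (as a gesture
    classifier might emit them, typically in uppercase gloss) into a
    sendable text message. This naive version just lowercases, joins,
    and capitalizes; it does not restore omitted function words."""
    if not sign_labels:
        return ""
    sentence = " ".join(label.lower() for label in sign_labels)
    return sentence[0].upper() + sentence[1:] + "."


# Hypothetical classifier output for a short signed message:
print(signs_to_message(["I", "MEET", "YOU", "TOMORROW"]))  # I meet you tomorrow.
```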