BrailE: Reading Device for the Deaf and Blind in Real-Time Speech
Abstract
Braille is a vital means of communication for blind people: a system of touch reading and writing in which raised dots represent the letters of the alphabet. It is an essential tool for blind people to educate themselves, supporting not only educational advancement but, subsequently, better employment prospects. Blind people should be taught Braille so that they can become literate, which is a necessity in today's world. Braille is harder to learn than sign language, as the many possible combinations of the six raised dots are not easy to memorize. Visually impaired people must master the skills needed to communicate through Braille text, which is itself a time-consuming and cumbersome task.
In addition, other people need to learn the same set of skills to understand and respond to a visually impaired person. Devices exist that convert text to Braille, as well as devices that perform real-time Braille-to-speech conversion using a Raspberry Pi and its camera; other devices use FPGAs or Arduinos to convert speech to Braille. This paper surveys the different techniques that have been used for converting text to Braille and vice versa, and evaluates the accuracy of these methods.
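To make the Braille-to-speech pipeline mentioned above concrete, the following is a minimal sketch, not code from any of the surveyed papers: once a six-dot Braille cell has been detected in the camera image, it can be encoded as a 6-bit pattern, mapped to a letter, and the decoded text voiced through a text-to-speech engine. The 6-bit encoding, the partial letter map, and the use of the `espeak` command-line tool (a common choice on a Raspberry Pi) are assumptions made for illustration.

```python
# Minimal sketch of a braille-to-speech pipeline (illustrative only).
# Assumption: a braille cell is encoded as a 6-bit pattern where bit i
# set means dot i+1 is raised, and the `espeak` CLI is installed.
import subprocess

# Partial Grade-1 braille map (dots 1-3 left column, 4-6 right column).
BRAILLE_TO_CHAR = {
    0b000001: "a",  # dot 1
    0b000011: "b",  # dots 1, 2
    0b001001: "c",  # dots 1, 4
    0b011001: "d",  # dots 1, 4, 5
    0b010001: "e",  # dots 1, 5
}

def cells_to_text(cells):
    """Translate a sequence of 6-bit braille cells into plain text."""
    return "".join(BRAILLE_TO_CHAR.get(c, "?") for c in cells)

def speak(text):
    """Voice the decoded text through the espeak TTS engine."""
    subprocess.run(["espeak", text], check=True)

if __name__ == "__main__":
    decoded = cells_to_text([0b000011, 0b000001, 0b011001])  # "bad"
    print(decoded)
    speak(decoded)
```

In a complete device, the cell patterns would come from dot detection on camera frames (e.g., blob detection or a CNN classifier, as some surveyed systems use) rather than being hard-coded.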
Keywords:
Raspberry Pi, blind and deaf, FPGA, Speech Recognition, Arduino microcontroller, liquid crystal display, convolutional neural network
License
Copyright (c) 2023 International Journal on Emerging Research Areas

All published work in this journal is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Similar Articles
- Jyothis Joseph, Ajay K Baiju, Ganga Binukumar, Akshara Manoj, Sandra Elizabeth Rony, A Crowd Monitoring and Real-Time Tracking System using CNN, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Adona Shibu, Aarunya Retheep, Albin Joseph, Ali Jasim, Adona Shibu, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Amal P Varghese, Simy Mary Kurian, Advancements in ECG Heartbeat Classification: A Comprehensive Review of Deep Learning Approaches and Imbalanced Data Solutions, International Journal on Emerging Research Areas: Vol. 3 No. 2 (2023): IJERA
- Prayag Suresh, Sneha Susan Alex, Rojan Varghese, Thomas Zacharias, Shiney Thomas, Survey of Strabismus Detection Techniques, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Abhijith J, Athul Krishna S, Amarthyag P, Angela Rose Baby, Mekha Jose, Cataract Detection Using Digital Camera Images, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- P Sathya Narayan, Safad Ismail, Developing an Empathetic Interaction Model for Elderly in Pandemics, International Journal on Emerging Research Areas: Vol. 3 No. 1 (2023): IJERA
- Nihal Anil, Ms. Nighila Abhish, Jesila Joy, Noora Sajil, P R Vishnuraj, Augmented NEAT Algorithm for Enhanced Cognitive Interaction (NEAT-X), International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Amina Manaf, Ance Maria Joseph, Angel Joy, Anjaly Anilkumar, K S Rekha, Driver Drowsiness Detection Using Python, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Syam Gopi, Evelyn Susan Jacob, Joel John, Raynell Rajeev, Steve Alex, Survey on AI Malware Detection Methods and Cybersecurity Education, International Journal on Emerging Research Areas: Vol. 4 No. 2 (2024): IJERA
- Arun Robin, Tijo Thomas Titus, Ms. Minu Cherian, Improved Handwritten Digit Recognition Using Deep Learning Technique, International Journal on Emerging Research Areas: Vol. 3 No. 2 (2023): IJERA