Mr. Venkat Ghodake, Miss. Pragati Jain, Miss. Tanishqa Bhalekar, Mr. Yash Dhamdhere
Department of Electronics and Telecommunication Engineering,
AISSMS Institute of Information Technology, Pune 411001, India
Abstract:
As a state-of-the-art assistive technology solution, this work presents smart gloves that help blind individuals navigate their surroundings. The gloves use the Internet of Things (IoT) to provide environmental awareness and emergency assistance. There are almost 37 million blind people in the world, over 15 million of whom live in India. Braille is the script that blind or visually impaired people use for reading, writing, and correspondence, yet when exploring new places they frequently need outside help. For people who are blind, the ideal is to be able to investigate places thoroughly with minimal assistance beforehand, and they must be able to live safe, fulfilling lives in a free society. In this study, taking advantage of rapidly advancing technology, an ESP32-based smart glove for the visually impaired has been developed. Because the proposed smart glove can detect obstacles, it gives blind individuals more confidence when they walk in public. It carries an ultrasonic sensor that identifies obstacles and objects moving in front of the user, together with output devices (a buzzer, a vibration motor, and an LED indicator) that convey an object's presence and movement through sound and touch. The proposed smart glove is intended specifically for blind users, who recognise the movement of an object by the buzzer's continuous beep and the accompanying vibration.

Keywords:
Visually impaired assistance, Smart gloves, Internet of Things (IoT), Obstacle detection, Ultrasonic sensor, Vibration feedback, Navigation aid, ESP32.

Introduction:
Individuals who are visually impaired employ a range of techniques to navigate their environment, each with its own benefits and drawbacks. For instance, canes give a person a single point of contact with their surroundings and allow them to detect bumps or changes in ground level. However, canes convey very little about the wider surroundings, such as overhead barriers or approaching cars, which makes manoeuvring through complex spaces like crowded sidewalks or busy roads difficult.
Guide dogs, by contrast, are highly intelligent animals that can provide invaluable assistance in difficult circumstances: they can help with tasks such as crossing roads, recognise dangers, and steer their handlers in the right direction. However, training and caring for a guide dog requires considerable time and resources, which not everyone has, and not all environments, such as crowded public transport or busy offices, are suitable for guide dogs.
Ultimately, a person's independence and privacy may be compromised if they depend on sighted friends or family for support. Although sighted partners can be a great resource, they may not always be available or able to assist, which is particularly difficult for tasks that require help frequently.
This research proposes smart gloves designed especially for people with vision impairments as an innovative way to overcome these problems. The gloves use the Internet of Things (IoT) to provide emergency aid and environmental awareness, and this approach could change how blind individuals engage with their environment, promoting greater security, independence, and overall well-being.
The approach has the potential to benefit a wide range of domains:
Independent living: the project could empower visually impaired people to navigate their surroundings more freely and safely in their daily lives.
Mobility and transportation: the project could improve navigation in public spaces, the use of public transport, and the exploration of new environments.
Accessibility: the gloves could improve accessibility in buildings, public areas, and on sidewalks by providing obstacle detection and wayfinding assistance.
Enhanced versatility with audio playback: the smart gloves incorporate an audio storage and playback module, offering significant versatility in information access. This module allows pre-recorded messages to be played back on demand using a designated switch.
Literature Review:
This section reviews previously published studies selected according to the requirements of our project.

1. "IoT Based Assistive Glove for Visually Impaired People", IRJET.
This paper introduces an IoT-based glove designed to give visually impaired individuals improved navigation capabilities. It describes components such as ultrasonic sensors for obstacle detection, vibration motors for haptic feedback, and a GPS module, used for navigation purposes similar to those we envision. The paper also covers the integration of these components with an IoT platform, enabling features such as remote tracking and emergency assistance that can give caregivers and loved ones peace of mind.

2. "Smart Gloves for Visually Challenged", International Journal of Engineering Research & Technology.
This paper presents smart gloves for the blind that use ultrasonic waves to notify users about obstacles. It covers core components relevant to our project, including Arduino boards and GPS modules, and investigates the feasibility of keeping the technology affordable. This focus on affordability is crucial for making assistive technologies accessible to a wider range of users.

3. "Third Eye For Blind Ultrasonic Vibrator Glove with Bluetooth and GPS Tracking".
This project incorporates an ultrasonic vibration glove with Bluetooth and GPS tracking. While its focus is on vibration feedback, the use of GPS aligns with the goals of our project, particularly for features such as location announcement or waypoint guidance.

4. "iTouch" blind assistance smart glove (mentioned in the IRJET paper above).
This work explores a concept called "iTouch", a blind assistance smart glove with a wider range of functionalities. While not a full research paper, it offers an interesting perspective on features that could become future extensions of our project, such as object and people detection, which would significantly enhance a user's awareness of their surroundings. It also explores environmental recognition using camera modules, potentially enabling features such as currency identification or sign reading.

5. "Development of a Smart Glove for the Blind People Using Raspberry Pi".
The full text was not freely available, but the title indicates a smart glove concept built around a Raspberry Pi. Work that applies the Raspberry Pi to wearables and embedded systems provides useful guidance on the electronic components needed for a project of this kind.

Aim and Objectives:

Aim: To develop a "Third Eye", an ultrasonic vibrator glove that enhances sensory perception for the blind.

Objectives:
1. Design and Prototype Creation: To develop a functional prototype of an ultrasonic vibrator glove that integrates with a blind individual's sensory experience.
2. Ultrasonic Sensing Technology: To research and implement ultrasonic sensing that accurately detects obstacles and objects in the user's surroundings.
3. Vibration Patterns: To design a variety of vibration patterns that correspond to different distances, sizes, and types of objects, enhancing the user's ability to perceive the environment (a sketch of one possible mapping is given after this list).
4. User Interface and Control: To create an intuitive user interface that allows the blind user to easily control and customize the glove's settings, such as vibration intensity and sensitivity.
5. Real-time Feedback: To implement real-time feedback mechanisms that provide immediate and accurate information about the user's surroundings, enabling quicker decision-making and navigation.
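Objective 3 leaves the exact distance-to-vibration mapping open. The fragment below is only a minimal sketch of one such mapping, assuming an ESP32 Arduino environment, a vibration motor driven through a transistor on a hypothetical VIB_PIN, and illustrative distance thresholds that are not taken from the paper.

#define VIB_PIN 25                          // hypothetical vibration-motor pin

// Pulse the motor faster as the obstacle gets closer (thresholds are assumptions).
void pulseVibration(long distanceCm) {
  long gapMs;
  if (distanceCm <= 40) gapMs = 100;        // very close: rapid pulses
  else if (distanceCm <= 100) gapMs = 300;  // mid range: slower pulses
  else return;                              // far away: stay silent
  digitalWrite(VIB_PIN, HIGH);              // short buzz
  delay(80);
  digitalWrite(VIB_PIN, LOW);
  delay(gapMs);
}

void setup() {
  pinMode(VIB_PIN, OUTPUT);
}

void loop() {
  long distanceCm = 120;                    // placeholder; the real glove reads the ultrasonic sensor here
  pulseVibration(distanceCm);
}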
The proposed system is built from the following hardware components.

1. Microcontroller Unit (MCU) - ESP32
Specifications: Dual-core CPU, up to 240 MHz, with WiFi and Bluetooth capabilities.
Purpose: Acts as the central processing unit, managing inputs and outputs, handling wireless communication of GPS data, and possibly connecting to a mobile app for additional functionality.

2. GPS Module
Model: Neo-6M GPS module.
Specifications: 2.5 m accuracy, 5 Hz update rate.
Purpose: Provides real-time location tracking to guide the user along pre-determined or dynamic routes.

3. Ultrasonic Sensors
Model: HC-SR04 or similar.
Specifications: Range of 2 cm to 400 cm, with an accuracy of approximately 3 mm.
Purpose: Detects obstacles within a certain range by sending and receiving ultrasonic waves, from which the distance to the obstacle is calculated.

4. Audio Output Module
Specifications: Integrated with the MCU, or a standalone module such as the DFPlayer Mini.
Purpose: Provides audio feedback and instructions to the user, including navigation directions and obstacle warnings.

5. Vibration Motors
Specifications: Small DC motors or coin vibration motors.
Purpose: Provide tactile feedback for obstacle detection, enhancing the user's awareness of immediate changes in the environment.

6. Power Supply
Specifications: Rechargeable battery pack, typically Lithium-Polymer or Lithium-Ion, around 1000 mAh or higher.
Purpose: Powers the device, ensuring mobility and portability without frequent recharging.

7. Connectivity and Control Interfaces
Specifications: Buttons or tactile switches for user interaction.
Purpose: Allow the user to start navigation, toggle between modes, and replay audio instructions.

8. Housing and Mounting
Specifications: Lightweight, ergonomic design, possibly using materials such as ABS plastic or silicone.
Purpose: Encases all electronic components in a wearable format, such as a glove, belt, or harness, making the device easy and comfortable to wear.

Block Diagram:

Working:
Power Supply: This block provides power to the entire system.
ESP32: This is the microcontroller unit (MCU) that controls the entire system. It reads data from the ultrasonic sensor, makes decisions based on that data, and controls the output devices (e.g., speaker, vibration motor).
Ultrasonic Sensor: This sensor emits high-frequency sound waves and listens for their echoes. By measuring the time the sound waves take to travel to and from an object, the sensor determines the distance to the object (a minimal measurement sketch is given below).
Relay: A relay is an electrical switch controlled by an electrical signal. In this system, the ESP32 controls the relay, which in turn switches the speaker or buzzer.
Speaker/Buzzer: This device produces sound. The type of sound produced (beeping, continuous tone, etc.) is controlled by the ESP32 program.
Vibration Motor: This motor creates a vibration that can be felt by the user.
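The distance measurement described above is usually taken by timing the echo pulse of the HC-SR04. The fragment below is a minimal sketch of that reading on the ESP32 using the Arduino pulseIn() call; the pin numbers follow the #define values used in the Software Design section, and the 0.0343 cm/us constant is the speed of sound, halved for the round trip.

#define TRIG_PIN 12                    // trigger pin, as in the Software Design section
#define ECHO_PIN 13                    // echo pin, as in the Software Design section

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);         // ensure a clean trigger edge
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);        // a 10 us pulse starts one measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // echo time in microseconds (30 ms timeout)
  return duration * 0.0343 / 2;        // convert echo time to distance in centimetres
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  Serial.print("Distance: ");
  Serial.print(readDistanceCm());
  Serial.println(" cm");
  delay(200);
}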
Flowchart:

Explanation:
• Initialization: Upon power-up, the ESP32 initializes and verifies that all attached devices, including the GPS, are receiving a signal.
• User Input: Using a linked app or the device's pre-programmed buttons, the user enters the desired destination.
• Route Processing: Using GPS data, the system determines the optimum path while accounting for distance and known obstacles. The path is updated continuously in real time.
• Navigation and Obstacle Alerts: The system speaks voice directions and updates the user's position as they move. At the same time, the ultrasonic sensors actively scan for surrounding obstructions. When an obstruction is detected, the system warns the user by vibrating or by providing audio signals that indicate the direction of the obstruction.
• User Interaction: Throughout the trip, the user can interact with the system to make adjustments, request a recalculation of the route, or repeat directions. Voice commands or simple switches are used for this.
• Constant Feedback: The system keeps track of the user's location as well as any changes to the path or the surrounding area; this feedback is essential for keeping the user informed about their surroundings and progress.
• End of Navigation: On arriving at the destination, or if the user decides to stop, the system gives a final update and either shuts down or returns to an idle state.

Software Design:

#define BLYNK_TEMPLATE_ID "TMPL3hMESp5JM"
#define BLYNK_TEMPLATE_NAME "Smart Blind Stick"
#define BLYNK_AUTH_TOKEN "ZIQ3L4i_2_1ABRUqrZtYMvdFc1WfIJpg"
#define BLYNK_PRINT Serial

#include <WiFi.h>
#include <BlynkSimpleEsp32.h>
#include <TinyGPSPlus.h>

char auth[] = BLYNK_AUTH_TOKEN;
char ssid[] = "realme";       // Type your WiFi name
char pass[] = "12345678";     // Type your WiFi password

#define TRIG_PIN 12
#define ECHO_PIN 13
#define BUZZ_PIN 18
#define RELAY_PIN 5
#define SWITCH_PIN 19

TinyGPSPlus gps;

void setup() {
  Serial.begin(9600);
  Serial2.begin(9600);                 // GPS module on UART2
  Blynk.begin(auth, ssid, pass);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(SWITCH_PIN, INPUT);
  pinMode(RELAY_PIN, OUTPUT);
  pinMode(BUZZ_PIN, OUTPUT);

  digitalWrite(RELAY_PIN, HIGH);
  delay(3000);
}

void loop() {
  Blynk.run();

  // Trigger one ultrasonic measurement
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo reading (reconstructed; assumes the standard pulseIn() measurement for the HC-SR04)
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);
  long distance = duration * 0.0343 / 2;

  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.println(" cm");

  if (distance >= 2 && distance <= 40) {
    // Obstacle within range: drive the alert outputs
    // (reconstructed; assumes an active-low relay and active-high buzzer)
    digitalWrite(RELAY_PIN, LOW);
    digitalWrite(BUZZ_PIN, HIGH);
  } else {
    digitalWrite(RELAY_PIN, HIGH);
    digitalWrite(BUZZ_PIN, LOW);
  }

  // When the switch is pressed, decode GPS data and report the location
  if (digitalRead(SWITCH_PIN) == HIGH) {
    delay(500);
    while (Serial2.available() > 0) {
      if (gps.encode(Serial2.read())) {
        displayInfo();
      }
    }

    if (millis() > 5000 && gps.charsProcessed() < 10) {
      Serial.println(F("No GPS detected: check wiring."));
      while (true);
    }
  }

  delay(1000);
}

void displayInfo() {
  Serial.print(F("Location: "));

  char locationStr[32];                // holds "lat,lng" with sign and six decimals
  if (gps.location.isValid()) {
    sprintf(locationStr, "%.6f,%.6f", gps.location.lat(), gps.location.lng());
  } else {
    sprintf(locationStr, "NO_GPS");
  }

  Blynk.virtualWrite(V0, locationStr);
  //Blynk.virtualWrite(V0, "19.89225, 75.3466");
  Serial.print("Lat: ");
  Serial.print(gps.location.lat(), 6);
  Serial.print(F(", Lng: "));
  Serial.print(gps.location.lng(), 6);
  Serial.println();
}
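The listing above covers obstacle alerts and GPS reporting but not the audio playback path provided by the Audio Output Module. The fragment below is only a sketch of how pre-recorded messages could be triggered from a dedicated playback switch, assuming the DFRobotDFPlayerMini library, a UART connection on assumed pins 26/27, a hypothetical switch on GPIO 23, and a file stored as 0001.mp3 on the module's SD card; none of these details are specified in the paper.

#include <DFRobotDFPlayerMini.h>

#define PLAY_SWITCH_PIN 23                   // hypothetical playback switch pin

DFRobotDFPlayerMini player;

void setup() {
  Serial1.begin(9600, SERIAL_8N1, 26, 27);   // DFPlayer Mini at 9600 baud (RX/TX pins assumed)
  pinMode(PLAY_SWITCH_PIN, INPUT);
  player.begin(Serial1);                     // attach the driver to the serial port
  player.volume(20);                         // volume range is 0-30
}

void loop() {
  if (digitalRead(PLAY_SWITCH_PIN) == HIGH) {
    player.play(1);                          // play 0001.mp3 from the SD card
    delay(500);                              // crude debounce so one press plays once
  }
}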
Result:
The efficacy of our solution was evaluated in depth through comprehensive user testing and compared with other assistive technologies currently accessible to people with visual impairments. The evaluation produced the following results:
- Better navigation: the system performed better than traditional mobility aids, providing users with rapid and accurate directions.
- Better obstacle detection: the ultrasonic sensors enhanced user safety in challenging situations and enabled proactive navigation adjustments by effectively detecting obstacles in the user's route.
- High user satisfaction: user comments rated the system's overall performance, usability, and functionality highly, indicating that it has the potential to significantly enhance the navigation experience for blind or visually impaired individuals.
- The comparison highlighted the system's advantages over other assistive technologies, including its extensive navigation functions, real-time obstacle recognition, and adaptable user interface.
These results show how the technology can improve the lives of visually impaired people by enabling them to navigate their environment with dignity, freedom, and confidence, improving their quality of life and their inclusion in society.

Conclusion:
To sum up, our research has demonstrated an Internet of Things (IoT) smart glove system intended to give blind people more autonomy and improved safety when navigating. The system makes use of a number of interconnected technologies. Acting as the user's eyes, ultrasonic sensors identify obstacles and gauge their distance. Strategically positioned vibration motors relay this crucial information, giving the user real-time awareness of their surroundings. GPS integration provides continuous tracking of the user's location, an essential safety net in case of emergency. Combined with the informative audio messages played through the glove, the system's ability to send location data lets the user make well-informed navigational decisions. The integration of the Blynk app adds a further degree of protection: by enabling remote carers or loved ones to keep an eye on the person's whereabouts, it gives peace of mind to the user and their support network. With its comprehensive approach to obstacle identification, safety, and information delivery, the proposed smart glove system is a valuable assistive technology that encourages independent living among the visually impaired.
Throughout the project, we achieved several key milestones. We successfully integrated a suite of sensors, including ultrasonic sensors for obstacle detection and GPS for location tracking. This involved careful selection, calibration, and testing of the sensors to ensure they functioned reliably and delivered accurate data.

Model Images:
Future Scope:

• Environment recognition:
Integrate object-recognition cameras or LiDAR sensors. This would allow the gloves to describe objects and surroundings in real time, providing a more comprehensive understanding of the user's environment; for instance, the gloves could describe the location and type of objects around the user, such as a park bench, a fire hydrant, or a traffic signal.
Partner with image-recognition services such as Google Cloud Vision or Amazon Rekognition to identify objects from captured images. This approach could be particularly useful for recognizing objects that are not easily detectable with ultrasonic sensors, such as signs, paintings, or different types of clothing.

• Advanced navigation:
Develop indoor navigation using beacons or existing infrastructure signals. This would allow the gloves to guide users inside buildings such as malls, airports, or train stations, where GPS signals are often weak or unavailable.
Implement path-planning algorithms to suggest optimal routes. The gloves could take into account factors such as distance, terrain, and accessibility to create the most efficient and user-friendly navigation experience.

• Biometric integration:
Integrate heart-rate or blood-oxygen sensors to monitor the user's health. This data could be relayed to the user through audio alerts or transmitted to a caregiver or emergency services in critical situations.

• Augmented reality integration:
Pair the gloves with a smartwatch or AR headset to overlay navigational information or object-recognition data onto the user's view. This would provide a more intuitive experience for users who are already familiar with AR technology.

• Customizability:
Allow users to record their own voice instructions or preferred audio alerts. This personalization can improve user comfort and satisfaction with the device.
Integrate with open-source mapping applications to provide more user-friendly navigation options. Users familiar with specific mapping applications could leverage their existing knowledge to navigate more effectively.

• Communication:
Enable emergency contact features through cellular or Bluetooth connections. This would allow users to call for help or send SOS messages in emergencies.

References:
[1] World Health Organization, "Blindness and vision impairment," Oct. 2019. [Online]. Available: https://www.who.int/news-room/factsheets/detail/blindness-and-visual-impairment. [Accessed: 20-Jul-2020].
[2] R. R. Bourne, S. R. Flaxman, T. Braithwaite, M. V. Cicinelli, A. Das, J. B. Jonas and J. Keeffe, "Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: a systematic review and meta-analysis," The Lancet Global Health, vol. 5, no. 9, pp. 888-897, Aug. 2017.
[3] R. K. Katzschmann, B. Araki and D. Rus, "Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 26, no. 3, pp. 583-593, March 2018.
[4] H. Iyama, Y. Shigeno, E. Hirano, M. Kamoshita and N. Nagai, "QD laser eyewear as a visual field aid in a visual field defect model," Scientific Reports, vol. 9, no. 1010, January 2019.
[5] B. J. Nguyen, W. S. Chen, A. J. Chen, A. Utt, E. Hill, R. Apgar and D. L. Chao, "Large-scale assessment of needs in low vision individuals using the Aira assistive technology," Clinical Ophthalmology, vol. 13, pp. 1853-1868, 20 Sep. 2019.
[6] I. Khan, S. Khusro and I. Ullah, "Technology-assisted white cane: evaluation and future directions," PeerJ, vol. 6, 2018.
[7] D. Zhou, Y. Yang and H. Yan, "A Smart Virtual Eye Mobile System for the Visually Impaired," IEEE Potentials, vol. 35, no. 6, pp. 13-20, Dec. 2016.
[8] N. S. Ahmad, N. L. Boon and P. Goh, "Multi-Sensor Obstacle Detection System Via Model-Based State-Feedback Control in Smart Cane Design for the Visually Challenged," IEEE Access, vol. 6, pp. 64182-64192, 29 October 2018.