
Visual impairment refers to the partial or complete loss of one's ability to see. It is estimated that there are 1.3 billion people in the world with some form of vision loss. In this work, we present Viva, an Android-based virtual assistant that aims to help people with visual impairment. The application provides haptic and voice navigation assistance by detecting obstacles in the user's surroundings and calculating the potential risk. We present the architecture, as well as a proof-of-concept prototype intended to demonstrate a potential use-case for a commercial embedded product that can be integrated into a walking stick or any wearable gadget. The Android application offers features such as a navigation assistant, object detection, a voice-controlled UI, and an emergency assistant.
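The text above does not show how the haptic and voice channels are implemented, so the following is only a minimal sketch of how they could be wired up with the standard Android Vibrator and TextToSpeech APIs; the class name FeedbackChannel, the utterance id, and the pulse duration are illustrative assumptions, not taken from Viva.

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import java.util.Locale

// Illustrative helper (not Viva's actual code): delivers warnings over the two
// feedback channels the text describes, speech and vibration. Requires API 26+.
class FeedbackChannel(context: Context) : TextToSpeech.OnInitListener {

    private val tts = TextToSpeech(context, this)
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) tts.setLanguage(Locale.US)
    }

    /** Speak a short warning, e.g. "Obstacle two meters ahead". */
    fun speak(message: String) {
        tts.speak(message, TextToSpeech.QUEUE_FLUSH, null, "viva-warning")
    }

    /** Emit a short vibration pulse; longer pulses could signal higher risk. */
    fun buzz(durationMs: Long = 150) {
        vibrator.vibrate(
            VibrationEffect.createOneShot(durationMs, VibrationEffect.DEFAULT_AMPLITUDE)
        )
    }
}
```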
The navigation assistant analyzes the user's surroundings by detecting nearby objects and estimating their distance from the user.
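The text does not say how that distance is estimated. One common single-camera approach is a pinhole-camera approximation that scales an assumed real-world object height by the ratio of focal length to bounding-box height; the sketch below shows that approach, and the function name and example values are illustrative rather than taken from Viva.

```kotlin
/**
 * Monocular distance estimation under a pinhole-camera approximation:
 * distance = realObjectHeight * focalLengthInPixels / boundingBoxHeightInPixels.
 * All names and constants are illustrative, not from the Viva implementation.
 */
fun estimateDistanceMeters(
    realObjectHeightM: Double, // assumed physical height of the detected class
    focalLengthPx: Double,     // camera focal length expressed in pixels
    boxHeightPx: Double        // height of the detection bounding box in pixels
): Double {
    require(boxHeightPx > 0) { "Bounding box height must be positive" }
    return realObjectHeightM * focalLengthPx / boxHeightPx
}

fun main() {
    // Example: a person (~1.7 m tall) seen as a 400 px tall box with a 1000 px focal length.
    val d = estimateDistanceMeters(realObjectHeightM = 1.7, focalLengthPx = 1000.0, boxHeightPx = 400.0)
    println("Estimated distance: %.2f m".format(d)) // ~4.25 m
}
```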
The object recognition mode includes a pre-built object recognition model that can recognize over 100 different common objects. The data collected is processed by a risk-prediction algorithm to calculate the risk of collision, and feedback is provided to the user whenever a potential risk is observed. The UI of the virtual assistant is designed from the ground up to be intuitive and usable without any visual aids, through voice commands or single-point touch control in which the entire screen acts as a soft button. Viva operates in a low-power mode with the screen turned off to make efficient use of the limited battery resources on mobile phones. Viva is a prototype intended to demonstrate the potential use-cases of this idea; it can also be integrated into other IoT devices such as smart walking sticks or wearable gadgets.
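The risk-prediction algorithm itself is not spelled out above, so the sketch below only illustrates one plausible heuristic: combine the current estimated distance with the closing speed between frames, derive a time-to-collision, and map both onto a coarse risk level that could drive the haptic and voice feedback. The thresholds are invented for illustration.

```kotlin
enum class RiskLevel { NONE, LOW, HIGH }

/**
 * Hypothetical collision-risk heuristic (not the algorithm from the paper):
 * risk rises as the obstacle gets closer and as the closing speed increases.
 */
fun assessRisk(distanceM: Double, previousDistanceM: Double, frameIntervalS: Double): RiskLevel {
    // Closing speed in m/s; positive means the obstacle is getting closer.
    val closingSpeed = (previousDistanceM - distanceM) / frameIntervalS
    // Time until the distance would reach zero at the current closing speed.
    val timeToCollisionS = if (closingSpeed > 0) distanceM / closingSpeed else Double.POSITIVE_INFINITY

    return when {
        distanceM < 1.0 || timeToCollisionS < 1.5 -> RiskLevel.HIGH
        distanceM < 3.0 || timeToCollisionS < 4.0 -> RiskLevel.LOW
        else -> RiskLevel.NONE
    }
}

fun main() {
    // Obstacle now at 2.4 m, was at 3.0 m half a second ago: closing at 1.2 m/s, TTC = 2 s.
    println(assessRisk(distanceM = 2.4, previousDistanceM = 3.0, frameIntervalS = 0.5)) // LOW
}
```

In a design like this, feedback could be triggered only when the risk level changes, so the user is not vibrated or spoken to on every camera frame.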