A Lightweight Approach to Localization for Blind and Visually Impaired Travelers
Principal Investigator(s): James Coughlan, Smith-Kettlewell Eye Research Institute
Version: V1
Version Title: Sensors 2022 submission
| Name | File Type | Size | Last Modified |
|---|---|---|---|
| DeepLearning | | | 12/28/2022 07:28 PM |
| LoggedData | | | 12/28/2022 06:48 PM |
| PyNavigate | | | 12/28/2022 07:35 PM |
| maps | | | 12/28/2022 07:29 PM |
Project Citation:
Coughlan, James. A Lightweight Approach to Localization for Blind and Visually Impaired Travelers. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2022-12-28. https://doi.org/10.3886/E183714V1
Project Description
Summary:
We present a localization algorithm based on computer vision and inertial sensing. The algorithm is lightweight in that it requires only a 2D floor plan of the environment, annotated with the locations of visual landmarks and points of interest, rather than the detailed 3D model used by many computer vision localization algorithms, and it requires no new physical infrastructure (such as Bluetooth beacons). The algorithm can serve as the foundation for a wayfinding app that runs on a smartphone; crucially, the approach is fully accessible because it does not require the user to aim the camera at specific visual targets, which would be problematic for blind and visually impaired (BVI) users who may not be able to see these targets. In this work, we improve upon the existing algorithm to incorporate recognition of multiple classes of visual landmarks for effective localization, and we demonstrate empirically how localization performance improves as the number of these classes increases.
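To illustrate the kind of map-based localization the summary describes, here is a minimal particle-filter sketch: particles on a 2D floor plan are advanced by inertial step/heading estimates and reweighted by detections of annotated landmark classes. This is a hypothetical illustration, not the deposited PyNavigate code; the `LANDMARKS` map, landmark classes, and noise parameters are all assumptions.

```python
import math
import random

# Assumed map annotations: landmark class -> list of (x, y) positions on the floor plan.
LANDMARKS = {
    "exit_sign": [(2.0, 5.0)],
    "door": [(8.0, 1.0)],
}

def predict(particles, step_len, heading, noise=0.2):
    """Dead-reckoning motion update from inertial step/heading, with Gaussian noise."""
    return [(x + step_len * math.cos(heading) + random.gauss(0, noise),
             y + step_len * math.sin(heading) + random.gauss(0, noise))
            for x, y in particles]

def weight(particles, seen_class, seen_range, sigma=0.5):
    """Weight each particle by how well the distance to the nearest landmark
    of the detected class matches the observed range; weights are normalized."""
    ws = []
    for x, y in particles:
        d = min(math.hypot(x - lx, y - ly) for lx, ly in LANDMARKS[seen_class])
        ws.append(math.exp(-((d - seen_range) ** 2) / (2 * sigma ** 2)))
    total = sum(ws) or 1.0
    return [w / total for w in ws]

def resample(particles, ws):
    """Draw a new particle set in proportion to the weights."""
    return random.choices(particles, weights=ws, k=len(particles))

if __name__ == "__main__":
    random.seed(0)
    parts = [(random.uniform(0, 10), random.uniform(0, 6)) for _ in range(500)]
    parts = predict(parts, step_len=0.7, heading=0.0)    # one inertial step east
    ws = weight(parts, "exit_sign", seen_range=3.0)      # one landmark detection
    parts = resample(parts, ws)
    mx = sum(p[0] for p in parts) / len(parts)
    my = sum(p[1] for p in parts) / len(parts)
    print(f"estimate: ({mx:.1f}, {my:.1f})")
```

Adding more landmark classes to `LANDMARKS` gives the measurement update more discriminative power, which is one intuition behind the paper's finding that localization improves as the number of recognized classes grows.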
Funding Sources:
NEI/NIH (1R01EY029033);
NIDILRR (90RE5024);
NIDILRR (90REGE0018)
This material is distributed exactly as it arrived from the data depositor. ICPSR has not checked or processed this material. Users should consult the investigator(s) if further information is desired.