
Project

#296 Improving Mobility of Low Vision People with Super-Reality Glasses


Principal Investigator: Yang Cai
Status: Active
Start Date: July 1, 2019
End Date: June 30, 2020
Research Type: Advanced
Grant Type: Research
Grant Program: FAST Act - Mobility National (2016 - 2022)
Grant Cycle: 2019 Mobility21 UTC
Visibility: Public

Abstract

We propose a pilot study of assistive technology for people with vision disabilities of central field loss (CFL) and low contrast sensitivity (LCS). Our technology includes super-reality (SR) glasses that enhance image contrast by overlaying highlights of objects and signs onto the transparent lenses. The wearable technology will increase multimodal mobility, not only for driving but also for walking, navigating, and other activities of daily living. The deployment sites include Low Vision Rehabilitation Centers in PA, with partner Dr. Paul Freeman, OD, of AGH.
    
Description
According to the World Health Organization (WHO), there are about 10 million visually impaired people in the US [1]. This number is escalating rapidly as the population ages, implying a growing barrier to a wide range of mobility activities that require access to visual information, including following a driving lane; detecting obstacles, pedestrians, and bikes; recognizing signs and work zones; navigating with maps; and reading dynamic displays. Low vision is associated with two major types of visual impairment: central field loss (CFL) and low contrast sensitivity (LCS). Central field loss is often caused by age-related macular degeneration (AMD), which affects up to 8 million Americans [4]. Low contrast sensitivity is common in the aging population as visual signals become blurry.

Bioptic telescopic spectacles (BTS) can be used for driving by people whose visual acuity is not sufficient to qualify for an unrestricted driver’s license. Bioptic telescopic spectacles consist of either monocular or binocular telescopes mounted on a pair of spectacles. They are accepted for this purpose in about 40 states, including New York, Ohio, and West Virginia. The state of Pennsylvania has not yet accepted bioptic spectacles but is considering changing the regulation. The specific requirements for BTS drivers, however, differ from state to state; for example, required training times range from 20 hours up to 90 hours. Moreover, BTS alone does not fully improve visual contrast sensitivity.

We need to develop new technologies to improve the mobility of low vision people, not only drivers but also non-drivers in daily life. Modern transportation systems such as autonomous vehicles, ride-sharing services, and assisted vehicle services are designed to help people move from point A to point B, but they do not cover the full spectrum of mobile activities, such as indoor navigation within a subway, an airport, a hospital, or a campus. We need a wearable device that lets visually impaired people obtain information on demand, on location, and tailored to them through visual enhancement, tactile cues, and auditory cues driven by artificial intelligence.

Here we propose a pilot study of assistive technology for people with vision disabilities of central field loss (CFL) and low contrast sensitivity (LCS). Our technology includes a pair of super-reality (SR) glasses with enhanced image contrast, for example, highlighting objects, signs, and lanes. We call the technology super-reality because it provides more detail than the user can see unaided, for example, thermal imagery and contours of pedestrians. In contrast to prevailing Augmented Reality (AR) and Virtual Reality (VR) technologies, which project mixed-reality or virtual objects onto the glasses, Super Reality (SR) fuses real-time sensory information and enhances images of the real scene. SR glasses technology has two advantages. First, it is relatively fail-safe: if the battery dies or the processor crashes, the glasses still function because the lenses are transparent. Second, SR glasses can be transformed into a VR or AR simulator by overlaying virtual objects such as pedestrians or vehicles onto the glasses for simulation. For over two years, the PI’s lab has worked on prototypes of SR glasses for first responders in public safety missions such as search and rescue. In this project, we will further develop the technology for low vision users.
The real-time visual enhancement and alert information are overlaid on the transparent glasses. The visual enhancement module can be expanded to highlight details for people with macular degeneration and low contrast sensitivity. The assistive technology also includes a speech recognition interface, an indoor navigation interface, and a tactile feedback interface. The objective is to enable low vision users to perform normal driving, to navigate inside public transportation facilities, to interact with autonomous or ride-sharing vehicles, and to navigate to a destination and back. We believe the proposed assistive technology would increase mobility for visually impaired people using vehicle services, and might even allow some to retain their driver’s license after extended training and exams. The proposed technology can also improve the mobility of well-sighted people, for example, in Smart City and mobility systems where users get on-demand information about urban traffic, public transportation, and indoor localization and mapping, and for first responders in emergency situations such as fire and flood.
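As an illustration of the sensor fusion described above, the following is a minimal sketch in which contours of warm objects are extracted from a thermal frame and overlaid on the RGB view. It assumes OpenCV 4.x and thermal and RGB frames that are already co-registered; the function name, thresholds, and colors are hypothetical and are not the project’s implementation.

    # Minimal sketch of the super-reality fusion idea: trace the outlines of warm
    # regions (e.g., pedestrians) in a thermal frame and draw them on the RGB view.
    # Assumes OpenCV 4.x and pre-registered thermal/RGB frames; names, thresholds,
    # and colors are illustrative, not the project's code.
    import cv2
    import numpy as np

    def overlay_thermal_contours(rgb_frame, thermal_frame, hot_threshold=200):
        # Scale the thermal frame to 8-bit and keep only the hottest regions.
        thermal_8u = cv2.normalize(thermal_frame, None, 0, 255,
                                   cv2.NORM_MINMAX).astype(np.uint8)
        _, hot_mask = cv2.threshold(thermal_8u, hot_threshold, 255,
                                    cv2.THRESH_BINARY)

        # Trace the outlines of the hot regions (OpenCV 4.x return signature).
        contours, _ = cv2.findContours(hot_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        # Draw high-contrast contours on a copy of the RGB view for the display.
        fused = rgb_frame.copy()
        cv2.drawContours(fused, contours, -1, (0, 255, 255), thickness=2)
        return fused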

The pilot project includes four tasks. First, we will survey state-of-the-art low vision rehabilitation technologies and training procedures, design the interface between our super-reality glasses and bioptic telescopic spectacles (BTS) and other existing rehabilitation devices, and explore alternatives to BTS, including an onboard digital camera. Second, we will develop computer vision algorithms for enhancing contrast sensitivity, with detection of signs, lanes, pedestrians, and vehicles, using color-coded edge enhancement, warning symbols, and audio signals. Third, we will design and implement the holographic overlay algorithm that aligns the highlighted information with the actual objects on the screen. Finally, we will test the prototype at a low vision rehabilitation center with simulated mobility scenarios such as driving and walking.
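To make the color-coded edge enhancement of the second task concrete, here is a minimal sketch that boosts local contrast with CLAHE, detects edges with Canny, and repaints the edges in a single bright hue. It assumes OpenCV 4.x; the function name, thresholds, and colors are illustrative assumptions rather than the algorithm the project will deliver.

    # Minimal sketch of color-coded edge enhancement: CLAHE for local contrast,
    # Canny for edges, and a single bright color for the detected edges.
    # Thresholds and colors are illustrative, not the project's tuned parameters.
    import cv2

    def enhance_edges(frame_bgr, low_thresh=60, high_thresh=150,
                      edge_color=(0, 255, 0)):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

        # Contrast-limited adaptive histogram equalization for low-contrast scenes.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        boosted = clahe.apply(gray)

        # Detect edges and paint them in one bright color over the input frame.
        edges = cv2.Canny(boosted, low_thresh, high_thresh)
        highlighted = frame_bgr.copy()
        highlighted[edges > 0] = edge_color
        return highlighted

    # Usage example: run the enhancement on a webcam stream at interactive rates.
    if __name__ == "__main__":
        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            cv2.imshow("SR edge enhancement (sketch)", enhance_edges(frame))
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()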

References:

1. Fact Sheet: Blindness and Low Vision. National Federation of the Blind. [Online]. Available: https://nfb.org/fact-sheet-blindness-and-low-vision. [Accessed: 02-Jul-2017].
2. Pascolini D, Mariotti SP. Global estimates of visual impairment: 2010. Br J Ophthalmol. 2011 Dec.
3. Geruschat D, Dagnelie G. Low vision: types of vision loss and common effects on activities of daily life. In: Assistive Technology for Blindness and Low Vision. CRC Press; 2012.
4. Friedman DS, O’Colmain BJ, Tomany SC, et al. Prevalence of age-related macular degeneration in the United States. Arch Ophthalmol. 2004;122:564–572.
5. Bowers AR, et al. Evaluation of a paradigm to investigate detection of road hazards when using a bioptic telescope. Optom Vis Sci. 2018;95(9).
6. Bronstad PM, et al. Driving with central field loss III: vehicle control. Clin Exp Optom. 2016.
7. Wilkinson ME, McGehee DV. Auditory global positioning system and advanced driver assistance systems: a safer alternative to bioptic telescopes for drivers who are visually impaired? Optom Vis Sci. TBD.
8. Tadin D, Lappin JS, Sonsino J. Recognition speed using a bioptic telescope. Optom Vis Sci. 2008;85(12).
9. Owsley C. Driving with bioptic telescopes: organizing a research agenda. Optom Vis Sci. 2012;89(9).
10. Dougherty BE, et al. Vision, training hours, and road testing results in bioptic drivers. Optom Vis Sci. 2015;92(4).
    
Timeline
The proposed pilot project will last 12 months, from July 1, 2019 through June 30, 2020.
Month 1 – 3: Low vision rehabilitation technology and training survey and interface design for BTS.
Month 4 – 6: Augmented object contrast enhancement with computer vision algorithms and semi-annual report.
Month 7 – 9: Information overlay and alignment algorithm development.
Month 10 – 12: Prototype demonstration and final report.
    
Deployment Plan
The deployment will start with filing Innovation Disclosures at CMU CTTEC around halfway through the project. After review of the disclosures, we will proceed with provisional patent applications. We will then present our technical progress at the UTC-wide semi-annual presentation event. At the end of the project, we will demonstrate our super-reality glasses prototype at our partner’s three offices in PA:
Allegheny General Hospital, Department of Ophthalmology, 420 East North Avenue, Suite 116, Pittsburgh, PA 15212, (412) 359-6300.
Beaver County Association for the Blind, Low Vision Rehabilitation Services, 616 Fourth Street, Beaver Falls, PA 15010, (724) 843-1111.
Keystone Blind Association, Low Vision Rehabilitation Services, 3056 East State Street, Hermitage, PA 16148, (724) 347-5501.
    
Expected Accomplishments and Metrics
1. Super-Reality Glasses prototype: includes an RGB camera, OLED display, motion sensors, and a thermal camera.
2. Edge enhancement: a computer vision algorithm that enhances image contrast at 20 fps.
3. Sign detection: computer vision algorithms that detect STOP signs and work zone signs.
4. Pedestrian detection and highlighting: a computer vision algorithm that detects pedestrians and highlights them (a minimal sketch follows this list).
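As referenced in item 4, the sketch below uses OpenCV’s stock HOG-plus-linear-SVM people detector and draws bright bounding boxes around detections. It is an illustrative baseline under the assumption that OpenCV 4.x is available, not the detector the project will ultimately use.

    # Minimal sketch of pedestrian detection and highlighting: OpenCV's pre-trained
    # HOG + linear SVM people detector with bright bounding boxes. This is an
    # illustrative baseline, not the project's final detector.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def highlight_pedestrians(frame_bgr):
        # Detect people; winStride, padding, and scale trade accuracy for speed.
        rects, weights = hog.detectMultiScale(frame_bgr, winStride=(8, 8),
                                              padding=(8, 8), scale=1.05)
        for (x, y, w, h) in rects:
            cv2.rectangle(frame_bgr, (int(x), int(y)), (int(x + w), int(y + h)),
                          (0, 255, 255), 2)
        return frame_bgr

In practice, winStride and scale would likely need to be tuned against the 20 fps target on the glasses’ onboard processor.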
    

Individuals Involved

Email | Name | Affiliation | Role | Position
sbenicky@andrew.cmu.edu | Benicky, Sheryl | CIT ERA | Other | Other
ycai@cmu.edu | Cai, Yang | CyLab | PI | Faculty - Untenured, Tenure Track

Budget

Amount of UTC Funds Awarded
$25,000.00
Total Project Budget (from all funding sources)
$

Documents

Type | Name | Uploaded
Data Management Plan | Data_Management_Plan-V1.pdf | Jan. 6, 2019, 4:43 p.m.
Data Management Plan | Data_Management_Plan-V1_GUuAVze.pdf | March 15, 2019, 12:43 p.m.
Presentation | Cai-glasses-v1.pptx | March 15, 2019, 12:45 p.m.
Publication | Cai-glasses-v1_wm5gl65.pptx | Oct. 2, 2019, 7:13 a.m.
Progress Report | 296_Progress_Report_2019-09-30 | Oct. 2, 2019, 7:13 a.m.

Match Sources

No match sources!

Partners

Name | Type
AGH | 1