Abstract
We conducted a pilot study of assistive technology for people with vision disabilities, specifically central field loss (CFL) and low contrast sensitivity (LCS). Our technology includes super-reality (SR) glasses that enhance image contrast by overlaying highlights of objects and signs onto the transparent lenses. The wearable technology will increase multimodal mobility, not only for driving but also for walking, navigation, and other activities of daily living. The deployment sites include Low Vision Rehabilitation Centers in PA, with partner Dr. Paul Freeman, OD, of AGH.
Description
According to the World Health Organization (WHO), there are about 10 million visually impaired people in the US [1]. This number is escalating rapidly as the population ages, implying a growing barrier to a wide range of mobility activities that require access to visual information, including following a driving lane, detecting obstacles, pedestrians, and bikes, recognizing signs and work zones, navigating with maps, and reading dynamic displays. Low vision is associated with two major types of visual impairment: central field loss (CFL) and low contrast sensitivity (LCS). Central field loss is often caused by age-related macular degeneration (AMD), which affects up to 8 million Americans [4]. Low contrast sensitivity is common in the aging population, in which visual signals become blurry. Bioptic telescopic spectacles (BTS), which consist of monocular or binocular telescopes mounted on a pair of spectacles, can be used for driving by people whose visual acuity is not sufficient to qualify for an unrestricted driver’s license. About 40 states permit BTS driving, including New York, Ohio, and West Virginia. Pennsylvania has not yet approved BTS for driving but is considering changing its regulations. The specific requirements for BTS drivers, however, differ from state to state; for example, required training times range from 20 hours up to 90 hours. Moreover, BTS alone does not fully improve visual contrast sensitivity.
We need to develop new technologies that improve the mobility of people with low vision, not only drivers but also non-drivers in daily life. Modern transportation systems such as autonomous vehicles, ride-sharing services, and assisted vehicle services are designed to help people move from point A to point B, but they do not cover the full spectrum of mobility activities, such as indoor navigation within a subway station, an airport, a hospital, or a campus. We need a wearable device that lets visually impaired people obtain information on demand and on location, tailored to them through artificial-intelligence-driven visual enhancement, tactile cues, and auditory cues.
Here we propose a pilot study of assistive technology for people with vision disabilities of central field loss (CFL) and low contrast sensitivity (LCS). Our technology includes a pair of super-reality (SR) glasses with enhanced image contrast, for example, highlighting objects, signs, and lanes. We call the technology super-reality because it provides more detail than the user can see unaided, for example, thermal imagery and the contours of pedestrians. In contrast to prevailing Augmented Reality (AR) and Virtual Reality (VR) technologies, which project mixed-reality or virtual objects onto the glasses, Super Reality (SR) fuses real-time sensory information and enhances the image of reality. SR glasses have two advantages. First, they are relatively fail-safe: if the battery dies or the processor crashes, the glasses still function because the lenses are transparent. Second, SR glasses can be transformed into a VR or AR simulator by overlaying virtual objects such as pedestrians or vehicles onto the glasses for simulation. For over two years, the PI’s lab has worked on prototypes of SR glasses for first responders on public safety missions such as search and rescue. In this project, we will further develop the technology for low vision users.
The real-time visual enhancement and alert information are overlaid on the transparent glasses. The visual enhancement module can be expanded to highlight details for people with macular degeneration and low contrast sensitivity. The assistive technology also includes a speech recognition interface, an indoor navigation interface, and a tactile feedback interface. The objective is to enable users with poor vision to drive normally, navigate inside public transportation facilities, interact with autonomous or ride-sharing vehicles, and find their way to a destination and back. We believe that the proposed assistive technology would increase mobility for visually impaired people using vehicle services, and might even help them retain their driver’s licenses after extended training and exams. The proposed technology can also improve mobility for well-sighted people, for example, in Smart City and Mobility systems where users get on-demand information about urban traffic, emergency response to fires and floods, public transportation, and indoor localization and mapping.
The project comprises three tasks. First, we surveyed the state of the art in low vision rehabilitation technologies and training procedures, as well as the interface between our super-reality glasses and bioptic telescopic spectacles (BTS) and other existing rehabilitation devices; we also explored alternatives to BTS, including an on-board digital camera. Second, we developed computer vision algorithms that enhance contrast sensitivity by detecting signs, lanes, pedestrians, and vehicles, and by rendering color-coded edge enhancement, warning symbols, and audio signals. Third, we designed and implemented a holographic overlay algorithm to align the highlighted information with the actual objects on screen.
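To make the second task concrete, the following is a minimal sketch of color-coded edge enhancement for a live camera stream, assuming an OpenCV pipeline; the function name, edge color, and thresholds are illustrative assumptions, not the project's deployed implementation.

# Minimal sketch of color-coded edge enhancement for low-contrast scenes.
# Assumes OpenCV; parameter values are illustrative, not the deployed ones.
import cv2
import numpy as np

def enhance_edges(frame, edge_color=(0, 255, 255), low=50, high=150):
    """Overlay high-visibility colored edges on a camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Boost local contrast before edge detection (helps low-contrast input).
    gray = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(gray)
    edges = cv2.Canny(gray, low, high)
    # Thicken edges so they remain visible on a small see-through display.
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8), iterations=1)
    overlay = frame.copy()
    overlay[edges > 0] = edge_color
    return overlay

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # wearable camera stream
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("SR edge enhancement", enhance_edges(frame))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

In practice, the highlight color and thresholds would be tuned per user, since contrast sensitivity varies widely across the target population.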
1. Prototype One: Micro Video Heads-Up Display (HUD)
Our first-generation HUD is a micro video display system connected to an embedded computer. It can display live video with a zoom-in function, at half-VGA resolution and at least 25 fps. However, the HUD obscures the view, like many bioptic telescope products. In addition, we have not found any affordable OEM suppliers for the video HUD component on the market.
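For illustration, a minimal digital zoom sketch in Python/OpenCV follows; the half-VGA display dimensions and zoom factor are assumptions for the example, not measured specifications of the prototype.

# Illustrative digital zoom for a micro video HUD: crop the frame center and
# rescale it to the display resolution. Resolution values are assumptions.
import cv2

HUD_W, HUD_H = 480, 320   # assumed half-VGA display orientation

def digital_zoom(frame, zoom=2.0):
    """Return the center crop of a frame, magnified to the HUD resolution."""
    h, w = frame.shape[:2]
    cw, ch = int(w / zoom), int(h / zoom)
    x0, y0 = (w - cw) // 2, (h - ch) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (HUD_W, HUD_H), interpolation=cv2.INTER_LINEAR)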
2. Prototype Two: Holographic HUD
Our second prototype projects live video from an OLED display onto a concave beam-splitter lens, which forms an enlarged virtual image in front of the glasses. The advantage of the holographic design is that the enlarged image is overlaid on the lens without obscuring the view. In this prototype, we used a much more compact embedded computer and a smaller, lighter OLED display. Due to constraints on the length of the signal cables for the OLED and camera, we have not yet reached optimal alignment. We also need to redesign the optical components to achieve the desired telescopic results.
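One common way to align overlay graphics with the scene is a calibrated homography from camera pixel coordinates to display coordinates. The sketch below illustrates that idea under assumed, placeholder calibration points; it is not the project's actual alignment algorithm.

# Sketch of overlay-to-world alignment: map camera pixel coordinates of a
# detected object to OLED display coordinates via a calibrated homography.
# The calibration correspondences below are placeholders, not measured values.
import cv2
import numpy as np

# Where a known calibration target appears in the camera image vs. where its
# highlight must be drawn on the display so it lines up with the real object.
camera_pts = np.float32([[100, 80], [540, 80], [540, 400], [100, 400]])
display_pts = np.float32([[60, 40], [420, 40], [420, 280], [60, 280]])

H, _ = cv2.findHomography(camera_pts, display_pts)

def to_display(box):
    """Map an object bounding box (x, y, w, h) from camera to display space."""
    x, y, w, h = box
    corners = np.float32([[[x, y]], [[x + w, y]], [[x + w, y + h]], [[x, y + h]]])
    return cv2.perspectiveTransform(corners, H).reshape(-1, 2)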
Output:
We have developed a new method to fuse multi-sensor information for detecting the user's activities and pavement bumps. We will file an Invention Disclosure with the CMU Technology Transfer office and follow up with a provisional patent application. We participated in NIST's Haptic Interfaces Challenge and won the First Place Award in 2019. We are also participating in Phase III of NIST's AR Interface Challenge in 2020.
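As a purely hypothetical illustration of bump detection from inertial data (the disclosed fusion method is not described here), one could flag windows of elevated vertical-acceleration variance; the sampling rate, field names, and threshold below are assumptions.

# Hypothetical bump flagging from vertical accelerometer samples; thresholds
# and parameters are assumptions, not the method in the invention disclosure.
import numpy as np

def detect_bumps(accel_z, fs=100, window_s=0.5, g=9.81, thresh=3.0):
    """Return times (s) of windows whose vertical-acceleration spread is high."""
    win = int(fs * window_s)
    a = np.asarray(accel_z, dtype=float) - g        # remove gravity component
    n = len(a) // win
    dev = a[:n * win].reshape(n, win).std(axis=1)   # per-window std deviation
    return np.flatnonzero(dev > thresh) * window_s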
Outcomes (Publications):
1. Yang Cai, Learn on the Fly, Proceedings of AHFE Conference, Springer, 2020
2. Yang Cai, Florian Alber, Sean Hackett, Indoor Localization on Helmet, Proceedings of AHFE Conference, Springer, 2020
3. Yang Cai, Sean Hackett and Florian Alber, Path Markup Language for Indoor Navigation, Proceedings of ICCS Conference, Springer, 2020
4. Yang Cai, et al. Heads-Up LiDAR on Helmet. Accepted to appear in Proceedings of the Imaging Science and Technology Conference, Jan. 2020
5. Sean Hackett, Florian Alber, and Yang Cai. A Hyper-Reality Helmet for Virtual and Live Emergency Response, Proceedings of HCII Conference, Springer, 2020
6. Yang Cai, Angelo Genovese, Mel Siegel, et al. IoT-based Architectures for Sensing and Local Data Processing in Ambient Intelligence: Research and Industrial Trends, Proceedings of IEEE I2TMC, 2019
7. S. Hackett, Y. Cai and M. Siegel. Sensor Fusion-Based Activity Recognition from Fireman’s Helmet, ICSP-BMEI Conference, Huajiao, Oct. 2019
References:
1. Fact Sheet Blindness and Low Vision | National Federation of the Blind. [Accessed: 02-Jul-2017]; [Online]. Available: https://nfb.org/fact-sheet-blindness-and-low-vision.
2. Pascolini D, Mariotti SP. Global estimates of visual impairment: 2010. Br. J. Ophthalmol. 2011 Dec.
3. Geruschat D, Dagnelie D. Assistive Technology for Blindness and Low Vision. CRC Press; 2012. Low Vision: Types of Vision Loss and Common Effects on Activities of Daily Life.
4. Friedman DS, O’Colmain BJ, Tomany SC et al. Prevalence of age-related macular degeneration in the United States. Arch Ophthalmol 2004; 122: 564–572.
5. Bowers AR, et al. Evaluation of a paradigm to investigate detection of road hazards when using a bioptic telescope. Optom. Vis. Sci. 2018, vol. 95(9).
6. Bronstad PM, et al. Driving with central field loss III: vehicle control. Clinical and Experimental Optometry, 2016.
7. Wilkinson ME and McGehee DV. Auditory global positioning system and advanced driver assistance systems: a safer alternative to bioptic telescopes for drivers who are visually impaired? Optom. Vis. Sci. TBD.
8. Tadin D, Lappin JS, and Sonsino J. Recognition speed using a bioptic telescope. Optometry and Vision Science, vol. 85, no. 12, December 2008
9. Owsley C. Driving with bioptic telescopes: organizing a research agenda. Optom. Vis. Sci. vol. 89, no.9 September 2012.
10. Dougherty BE, et al. Vision, training hours, and road testing results in bioptic drivers. Optom. Vis. Sci. vol. 92, no. 4, April 2015
Timeline
The proposed pilot project will run for 12 months, from July 1, 2019, through June 30, 2020.
Month 1 – 3: Low vision rehabilitation technology and training survey and interface design for BTS.
Month 4 – 6: Augmented object contrast enhancement with computer vision algorithms and semi-annual report.
Month 7 – 9: Information overlay and alignment algorithm development.
Month 10 – 12: Prototype demonstration and final report.
Strategic Description / RD&T
Deployment Plan
The deployment will start with filing an Invention Disclosure with CMU CTTEC around the halfway point of the project. After review of the disclosure, we will proceed with a provisional patent application. We will then present our technical progress at the UTC-wide semi-annual presentation event. At the end of the project, we will demonstrate our super-reality glasses prototype at our partner's three offices in PA: Allegheny General Hospital, Department of Ophthalmology, 420 East North Avenue, Suite 116, Pittsburgh, PA 15212, (412) 359-6300; Beaver County Association for the Blind, Low Vision Rehabilitation Services, 616 Fourth Street, Beaver Falls, PA 15010, (724) 843-1111; and Keystone Blind Association, Low Vision Rehabilitation Services, 3056 East State Street, Hermitage, PA 16148, (724) 347-5501.
Expected Outcomes/Impacts
1. Super-Reality Glasses Prototype: including an RGB camera, an OLED display, motion sensors, and a thermal camera.
2. Edge enhancement: a computer vision algorithm that enhances image contrast at 20 fps.
3. Sign detection: computer vision algorithms that detect STOP signs and work zone signs.
4. Pedestrian detection and highlighting: a computer vision algorithm that detects pedestrians and highlights them (a minimal sketch follows this list).
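The following is a minimal sketch of pedestrian detection and highlighting using OpenCV's built-in HOG person detector; it stands in for the project's own detector, which is not specified here, and the detector parameters are assumptions.

# Minimal pedestrian detection and highlighting sketch using OpenCV's
# default HOG person detector; parameters are illustrative assumptions.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def highlight_pedestrians(frame, color=(0, 255, 0)):
    """Draw high-contrast boxes around detected pedestrians."""
    rects, _ = hog.detectMultiScale(frame, winStride=(8, 8),
                                    padding=(8, 8), scale=1.05)
    for (x, y, w, h) in rects:
        cv2.rectangle(frame, (x, y), (x + w, y + h), color, 3)
    return frame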
Expected Outputs
TRID
Individuals Involved
Email | Name | Affiliation | Role | Position
sbenicky@andrew.cmu.edu | Benicky, Sheryl | CIT ERA | Other | Other
ycai@cmu.edu | Cai, Yang | CyLab | PI | Faculty - Untenured, Tenure Track
Budget
Amount of UTC Funds Awarded
$25000.00
Total Project Budget (from all funding sources)
$25000.00
Documents
Type | Name | Uploaded
Data Management Plan | Data_Management_Plan-V1.pdf | Jan. 6, 2019, 4:43 p.m.
Data Management Plan | Data_Management_Plan-V1_GUuAVze.pdf | March 15, 2019, 12:43 p.m.
Presentation | Cai-glasses-v1.pptx | March 29, 2020, 1:04 p.m.
Publication | Cai-glasses-v1_wm5gl65.pptx | Oct. 2, 2019, 7:13 a.m.
Progress Report | 296_Progress_Report_2019-09-30 | Oct. 2, 2019, 7:13 a.m.
Publication | ICCS-2020-Indoor-Nav-V1.pdf | March 29, 2020, 1:04 p.m.
Progress Report | 296_Progress_Report_2020-03-30 | March 29, 2020, 1:05 p.m.
Publication | AHFE-2020-Indoor-Localization-1407-V2.pdf | June 28, 2020, 6:52 p.m.
Publication | AHFE2020-1408-Learn-On-The-Fly-Cai-V7.pdf | June 28, 2020, 6:52 p.m.
Final Report | Final_Report_-_296.pdf | July 9, 2020, 4:31 a.m.
Match Sources
No match sources!
Partners
Name | Type
AGH | Deployment Partner