#308 Smart Glasses for Improving Mobility of Low Vision People (Phase Two)


Principal Investigator
Yang Cai
Status
Completed
Start Date
July 1, 2020
End Date
June 30, 2021
Project Type
Research Applied
Grant Program
FAST Act - Mobility National (2016 - 2022)
Grant Cycle
2020 Mobility21 UTC
Visibility
Public

Abstract

This proposal aims to continue the work toward a prototype of smart glasses with a digital bioptic lens. In Phase Two, we will implement the digital bioptic lens on augmented reality glasses. Bioptic glasses are eyeglasses with a small telescope mounted on the lens. They help people with low vision see details when the central retinal area has degenerated due to aging or diseases such as glaucoma. Currently, WV, NY, OH, and NJ allow people to drive with bioptic glasses after completing certain training; for example, OH requires 40 hours of training and WV requires 80 hours. PennDOT is currently working on state legislation regarding drivers with bioptic glasses. Existing mechanical bioptic lenses are heavy and passive: they need manual focus adjustment with two hands and also block part of the user’s vision. The proposed work is to replace the mechanical lens with a digital bioptic heads-up display projected onto regular glasses. It will be lighter, auto-focusing, and digitally zoomable. This will improve the mobility of low-vision users in many applications, including driving, sign recognition, navigation, situation awareness, and training. The scope of work for Phase Two includes three tasks: 1) miniature digital camera and optic design, 2) miniature OLED heads-up display design, and 3) computer vision algorithms for bioptic display and visual enhancement.
    
Description
This proposal aims to continue the work toward a prototype of smart glasses with a digital bioptic lens. In Phase One, we invented a sensor-fusion interface for bumper/step detection. We use a lidar sensor to measure distance, and a gyroscope and accelerometer to track head movements. Applying a Kalman filter to the accelerometer and gyroscope data improves accuracy by removing the bias due to gravity; double integration of the corrected acceleration then yields the displacement of the glasses.
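The bias-removal and double-integration step can be sketched as follows. This is a minimal one-dimensional illustration under stated assumptions: the function name, noise parameters, and sample interval are hypothetical, not the project's actual implementation.

```python
import numpy as np

def displacement_from_accel(accel, dt=0.01, q=1e-4, r=1e-2):
    """Estimate 1-D displacement from raw accelerometer samples.

    A scalar Kalman filter tracks the slowly varying gravity/bias
    component; subtracting it leaves linear acceleration, which is
    double-integrated (rectangle rule) to obtain displacement.
    """
    bias, p = accel[0], 1.0              # initial bias estimate and variance
    linear = np.empty(len(accel))
    for i, a in enumerate(accel):
        p += q                           # predict: bias drifts slowly
        k = p / (p + r)                  # Kalman gain
        bias += k * (a - bias)           # update bias toward measurement
        p *= (1 - k)
        linear[i] = a - bias             # gravity/bias-free acceleration
    velocity = np.cumsum(linear) * dt    # first integration
    displacement = np.cumsum(velocity) * dt  # second integration
    return displacement
```

With a stationary head, the input is pure gravity, the filter absorbs it as bias, and the estimated displacement stays at zero, which is the behavior the gravity-removal step is meant to guarantee.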

A vision-based step-detection system can easily locate a step up from its shadow, but has much more difficulty with a step down, which is harder to identify visually. The lidar-and-IMU sensor-fusion solution does not suffer from this problem and is equally effective for detecting steps up and steps down. It also requires much simpler computation than a camera-based system, which would need more intensive image processing. Our paper was accepted at the IS&T society conference in 2020.
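A simple threshold rule over successive downward lidar range readings illustrates why the fusion approach handles both directions symmetrically; the function name and threshold value here are illustrative assumptions, not the project's detector.

```python
def detect_steps(ranges, threshold=0.10):
    """Flag step edges in a sequence of downward lidar range readings (meters).

    A sudden decrease in range means the ground came closer (step up);
    a sudden increase means it dropped away (step down). Both cases use
    the same magnitude test, unlike shadow-based visual detection.
    """
    events = []
    for i in range(1, len(ranges)):
        delta = ranges[i] - ranges[i - 1]
        if delta <= -threshold:
            events.append((i, "step_up"))
        elif delta >= threshold:
            events.append((i, "step_down"))
    return events
```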

In Phase Two, we will implement the digital bioptic lens on augmented reality glasses. Bioptic glasses are eyeglasses with a small telescope mounted on the lens. They help people with low vision see details when the central retinal area has degenerated due to aging or diseases such as glaucoma.

Currently, WV, NY, OH, and NJ allow people to drive with bioptic glasses after completing certain training; for example, OH requires 40 hours of training and WV requires 80 hours. Dr. Freeman is on the board of PennDOT and is working on state legislation regarding drivers with bioptic glasses. Existing mechanical bioptic lenses are heavy and passive: they need manual focus adjustment with two hands and also block part of the user’s vision.

The proposed work is to replace the mechanical lens with a digital bioptic heads-up display projected onto regular glasses. It will be lighter, auto-focusing, and digitally zoomable. This will improve the mobility of low-vision users in many applications, including driving, sign recognition, navigation, situation awareness, and training.

The scope of work for Phase Two includes three tasks: 1) miniature digital camera and optic design, 2) miniature OLED heads-up display design, and 3) computer vision algorithms for bioptic display and visual enhancement. A miniature digital camera will be selected, along with an optic design for the telescopic view; its data will be connected to the DSI bus of the onboard processor. A miniature OLED heads-up display will be used for holographic projection onto the glasses, with an HDMI video interface and a near-eye display lens. Our main challenge will be the development of computer vision algorithms for bioptic vision, including digital zooming, digital auto-focusing, contrast enhancement, and edge highlighting. The vision processing has to run in real time at a minimum of 30 fps. We will develop a rapid prototype of the digital bioptic glasses and test it in the lab with simulated driving and walking scenarios.
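Two of the enhancement operations named above, digital zooming and edge highlighting, can be sketched in a few lines of NumPy on a grayscale frame. This is an illustrative approximation, not the project's actual pipeline; the function names, zoom method (center crop plus nearest-neighbour upscale), and gradient-based edge emphasis are assumptions.

```python
import numpy as np

def digital_zoom(img, factor=2):
    """Center-crop and nearest-neighbour upscale: a telescopic view."""
    h, w = img.shape
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = img[top:top + ch, left:left + cw]
    return np.repeat(np.repeat(crop, factor, axis=0), factor, axis=1)

def highlight_edges(img, gain=1.0):
    """Add gradient magnitude back onto the image to emphasize edges."""
    f = img.astype(float)
    gx = np.abs(np.diff(f, axis=1, prepend=f[:, :1]))  # horizontal gradient
    gy = np.abs(np.diff(f, axis=0, prepend=f[:1, :]))  # vertical gradient
    return np.clip(f + gain * (gx + gy), 0, 255)       # stay in 8-bit range
```

At 30 fps these whole-frame array operations are cheap, which is consistent with the real-time constraint stated above.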

Timeline
1. Digital camera and optic design:  Summer 2020
2. Heads-up display design: Fall 2020
3. Bioptic vision algorithms and prototype: Spring 2021
Strategic Description / RD&T

    
Deployment Plan
The scope of work for Phase Two includes three tasks: 1) miniature digital camera and optic design, 2) miniature OLED heads-up display design, and 3) computer vision algorithms for bioptic display and visual enhancement. A miniature digital camera will be selected, along with an optic design for the telescopic view; its data will be connected to the DSI bus of the onboard processor. A miniature OLED heads-up display will be used for holographic projection onto the glasses, with an HDMI video interface and a near-eye display lens. Our main challenge will be the development of computer vision algorithms for bioptic vision, including digital zooming, digital auto-focusing, contrast enhancement, and edge highlighting. The vision processing has to run in real time at a minimum of 30 fps. We will develop a rapid prototype of the digital bioptic glasses and test it in the lab with simulated driving and walking scenarios.
Expected Outcomes/Impacts
At the end of the proposed Phase Two work, we will demonstrate a prototype of the bioptic glasses with basic functions for low-vision users. It will be lighter than the mechanical bioptic lens and will operate in real time in both outdoor and indoor environments.
Expected Outputs

    
TRID


    

Individuals Involved

Email Name Affiliation Role Position
ycai@cmu.edu Cai, Yang Cylab and BME PI Faculty - Research/Systems

Budget

Amount of UTC Funds Awarded
$25,000.00
Total Project Budget (from all funding sources)
$100,000.00

Documents

Type Name Uploaded
Data Management Plan Data_Management_Plan-V1_IXy4KLi.pdf Nov. 30, 2019, 5:22 p.m.
Publication Safety Analysis for AI Systems Sept. 30, 2020, 4:30 p.m.
Presentation Hyper-Reality Helmet in Virtual and Live Environment Sept. 30, 2020, 4:30 p.m.
Progress Report 308_Progress_Report_2020-09-30 Sept. 30, 2020, 4:30 p.m.
Publication Head-Mounted Lidar Imaging with Sensor Fusion March 29, 2021, 5:18 p.m.
Progress Report 308_Progress_Report_2021-03-31 March 29, 2021, 5:18 p.m.
Final Report Final_Report_-_308.pdf July 15, 2021, 11:01 a.m.
Publication Human-Machine Learning with Mental Map May 2, 2022, 9:56 a.m.
Publication Improving Mobility of Low Vision People with Super-Reality Glasses May 2, 2022, 9:56 a.m.
Publication Improving Mobility of Low Vision People with Super-Reality Glasses May 3, 2022, 5:03 p.m.

Match Sources

No match sources!

Partners

Name Type
AGH Deployment Partner
Three Rivers Optical Deployment Partner