Abstract
The Traffic Management Center (TMC) plays essential roles in daily mobility and Emergency Support Functions (ESF). TMCs are centralized and handle a massive amount of live data. Here we propose a decentralized TMC with Augmented Reality (AR) that integrates geographically accurate 3D terrain and building models with live mobility IoT data from open sources. The AR TMC system is portable and can run on a laptop, an AR/VR device, or a stereo projector. We will test our prototype with our partners in EMS and the Fire Academy.
Description
INTRODUCTION
The Traffic Management Center (TMC) is the hub of transportation management systems, where information about transportation networks is collected and combined with other operational and control data to manage mobility and produce traveler information. It is the focal point for communicating transportation-related information to the media and the motoring public, and a place where agencies can coordinate their responses to transportation situations and conditions. The TMC also links various elements of Intelligent Transportation Systems, such as variable message signs, closed-circuit video equipment, and roadside count stations, enabling decision-makers to identify and react to an incident in a timely manner based on real-time data.
TMCs encounter incidents on a daily basis, and operators are often overwhelmed by the flood of real-time data. For example, the TMC in PennDOT’s District 11 in Bridgeville, PA has over 200 CCTV cameras streamed to 50 monitors at any time. During disastrous events, such as snowstorms, power outages, floods, major accidents, and tornadoes, the TMC provides Emergency Support Functions (ESF) to first responders and the public. In many cases, mobile field command posts are necessary for emergency responders.
Many existing TMCs are housed in fixed facilities with massive cable runs from sensors and a wall of monitors. The operators are overwhelmed by massive, isolated 2D data without a holistic view of the 3D reality. Many urban traffic problems are three-dimensional (3D), for example at the Fort Pitt Bridge, Pittsburgh International Airport, and the subway systems in NYC and DC. Emergency medical services (EMS) often operate in 3D space as well: medical helicopters need to navigate through 3D structures, find landing zones, and avoid obstacles such as power lines and utility poles. Furthermore, future mobility will include multiple modalities, such as drones and autonomous vehicles that depend on 3D lidar data to navigate the city.
Augmented Reality (AR) has emerged as a next-generation intelligent interface that integrates cutting-edge sensors, vision, and data communication. AR devices overlay important information onto the surrounding environment, merging the virtual world with live data and interaction, and they are portable for field use. In the wake of the pandemic, an AR device can turn an operator’s living room into a TMC, including the video wall and live data, through a high-speed Internet connection. For the last four years, the PI’s lab has developed multiple AR systems. The team won the First Place Award in the NIST Haptic Interfaces Challenge in 2019, won Phases I through III of the NIST AR Challenge in 2020, and submitted the final prototype for Phase IV. See web links:
https://techxplore.com/news/2020-01-haptic-helmet-firefighters.html
https://www.nist.gov/ctl/pscr/open-innovation-prize-challenges/current-and-upcoming-prize-challenges/2020-chariot
Here we propose a new kind of Traffic Management Center (TMC) built on Augmented Reality (AR) technology that integrates geographically accurate 3D terrain and building models with live mobility IoT data from open sources such as highway CCTV, Google Maps, USGS, DHS, and NOAA. The AR TMC system is portable and can run on a laptop or a heads-up-display (HUD) device for AR or VR, such as the HoloLens 2 and Magic Leap ML1. It can also be projected onto a holographic screen for stereo vision using a pair of affordable polarized glasses. We will study mobility scenarios in normal and extreme conditions, including a normal city view from a traffic management center, a MedEvac Dispatch Center, and a flood in downtown Pittsburgh. We will deploy our prototype with our partners in UPMC MedEvac EMT, the City of Pittsburgh Fire Bureau, and the Allegheny Fire Academy. See the related demo video:
https://www.youtube.com/watch?v=aEzewo6xDMU
PROPOSED APPROACH
Task 1. AR City Models.
In this task, we will create geographically accurate city models that can be shared with TMC applications and GIS communities such as OpenStreetMap. Although there are 3D maps from Apple Maps, Google Earth, and Google Maps, they are not accurate enough for close-up views, especially spaghetti-like bridges and warped streets, when viewed from a driver’s or pedestrian’s point of view. Developing 3D models of a city is a time-consuming, expensive, and technical process. In many cases, the extracted 3D models are not geo-referenced and are noisy.
In this project, we will first set the geographical boundaries for the map and extract the 3D models of a city from open sources such as Google Earth for infrastructure and buildings and USGS for elevation models. Then we will develop a middleware to register the textures, GPS coordinates, and elevation models into an integrated AR model. This middleware is critical for building an accurate physical 3D model, capturing, for example, the details under a bridge, inside a tunnel, and the street views of buildings from the drivers’ and pedestrians’ viewpoints. To incorporate water transportation or flood models, the middleware enables merging the river elevation models with the 3D models from Google Earth, because the latter does not contain river profiles. Google Earth and USGS models also use different resolutions. The middleware will fuse these multiresolution, multimodal models so that additional physical data can be integrated, e.g., flood water levels and thermal imaging data. The libraries of 3D articulated models will be exported in standard formats such as GLB, OBJ, STL, DXF, PLY, and XYZ that are readable by web browsers, Unity, and many AR/VR/MR devices.
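The geo-registration step above can be sketched in a few lines. This is a minimal illustration (not the project's actual middleware): it converts a GPS coordinate to local east/north meters relative to a map origin using an equirectangular approximation, then samples a USGS-style elevation grid so a model from one source can be placed on terrain from another. All names and coordinates below are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in meters

def gps_to_local(lat: float, lon: float, origin_lat: float, origin_lon: float):
    """Equirectangular approximation: adequate at city scale."""
    east = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return east, north

def sample_elevation(grid, cell_size_m: float, east: float, north: float) -> float:
    """Nearest-neighbor lookup in a gridded elevation model (rows increase northward)."""
    col = int(round(east / cell_size_m))
    row = int(round(north / cell_size_m))
    return grid[row][col]

# Toy elevation grid (meters above sea level) with 10 m cells
grid = [[200.0, 201.0], [202.0, 203.0]]

# A point roughly 10 m east and 10 m north of a hypothetical origin
e, n = gps_to_local(40.44069, -79.99988, 40.44060, -80.00000)
z = sample_elevation(grid, 10.0, e, n)  # terrain height at that point
```

A production middleware would use a proper map projection (e.g., UTM) and bilinear interpolation, but the coordinate-frame alignment problem is the same.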
Task 2. Real-Time Data Overlay.
The live streaming IoT (Internet of Things) data will be registered onto the AR city model. The real-time IoT data will come from open sources such as the CCTV video feeds from PennDOT traffic camera web sites, live traffic data from Google Maps, and air quality data from the Allegheny County Health Department (ACHD). For the CCTV video feeds, we will use RTMP (Real-Time Messaging Protocol) for transmission. For live sensor data, we will use the MQTT (Message Queuing Telemetry Transport) protocol. The live CCTV videos can be registered to the 3D city model based on their locations and orientations. They also provide important mobility IoT data to the AR TMC, and we will develop an algorithm to extract that IoT data from the live videos. Many of the live video feeds have low frame rates, which makes it challenging to estimate vehicle speed. We will combine data from multiple cameras to obtain a more accurate estimate.
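One simple way to combine multiple camera observations, as described above, is a weighted average that trusts higher-frame-rate feeds more. The sketch below is an illustrative assumption, not the project's actual fusion algorithm; the data values are made up.

```python
def fuse_speed_estimates(observations):
    """Fuse per-camera speed estimates, weighting each by its frame rate.

    observations: list of (speed_mph, frames_per_second) tuples.
    A feed with more frames per second gives a less noisy speed estimate,
    so it receives proportionally more weight.
    """
    total_weight = sum(fps for _, fps in observations)
    if total_weight == 0:
        raise ValueError("no usable observations")
    return sum(speed * fps for speed, fps in observations) / total_weight

# Three cameras observe the same vehicle; the two 1 fps feeds are noisier.
cams = [(52.0, 1.0), (58.0, 1.0), (55.0, 5.0)]
fused = fuse_speed_estimates(cams)  # pulled toward the 5 fps camera's estimate
```

In practice the weights could instead come from a per-camera error model (distance to the vehicle, viewing angle), but the fusion structure stays the same.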
Our innovations in the system include:
Geographically accurate 3D terrain and building models registered with live IoT data
An adjustable spotlight for the hotspot in each scenario
Compression of IoT data to minimize cognitive overload, e.g., hiding values of non-essential sensors
Text labels that always rotate toward the viewer
A surrounding soundscape for an immersive experience
User-switchable hologram and HUD views
Vertical stacking of co-located data to avoid visual congestion
Demo video: https://www.youtube.com/watch?v=aEzewo6xDMU
Task 3. Mobility Scenarios in Normal and Extreme Conditions
The AR TMC system is portable and can run on a laptop or a heads-up-display (HUD) device for AR or VR, such as the HoloLens 2 and Magic Leap ML1. It can also be projected onto a holographic screen (with a sprayed-on coating) for stereo vision using a pair of affordable polarized glasses. In this task, we will study several scenarios. First, the Pittsburgh model with real-time traffic data from Oakland to the Pittsburgh International Airport under normal day-to-day conditions: we will create a virtual traffic management center with live traffic data from open sources such as PennDOT CCTVs and Google Maps. Second, we will simulate the Emergency Management System, a MedEvac Dispatch Center with live data from helicopters and hospitals, weather services, and traffic data services. Our model will help coordinate mobility in 3D space: inside the hospital, on the ground, and in the air. The mobile AR TMC enables dispatchers to work in extreme conditions, such as in a disaster field or from home. Finally, we will superimpose extreme conditions onto the AR city model, specifically the downtown Pittsburgh area, simulating extreme scenes such as floods and marathon events. These scenarios can be used to train emergency responders and traffic management operators, including firefighters, emergency medical doctors, police, and SWAT teams. Furthermore, the system can be used to coordinate multiple response teams in disaster rescue and humanitarian assistance, for example, a massive pileup accident on I-80 in a snowstorm. The mobile AR (MAR) TMC can be deployed in the field and connected via cellular networks or satellite Internet to helicopters, hospitals, and existing TMCs. The system is able to record the user’s responses and data logs for training purposes.
Furthermore, the AR city models can be used to create immersive driving experiences, for example, driving over one of the most dangerous spots in Pittsburgh, the Fort Pitt Bridge, or on an icy, hilly road during a snowy day. The system may also help train autonomous vehicle AI systems and drone pilots with immersive, live data.
Timeline
The project contains three tasks for three semesters (12 months):
1. AR City Models will be prototyped in summer starting from July 1, 2021
2. Real-Time Data Overlay will be implemented in the Fall semester
3. Mobility Scenarios in Normal and Extreme Conditions will be developed in the Spring semester, including a normal city view from a traffic management center, a MedEvac Dispatch Center, and a flood in downtown Pittsburgh.
Strategic Description / RD&T
Deployment Plan
We will deploy our prototype with our partners in UPMC MedEvac EMT, the City of Pittsburgh Fire Bureau, and the Allegheny Fire Academy. The AR TMC system is portable and can run on a laptop or a heads-up-display (HUD) device for AR or VR, such as the HoloLens 2 and Magic Leap ML1. It can also be projected onto a holographic screen for stereo vision using a pair of affordable polarized glasses. We will study mobility scenarios in normal and extreme conditions, including a normal city view from a traffic management center, a MedEvac Dispatch Center, and a flood in downtown Pittsburgh.
Our field-test partners include Dr. Lenny Weiss of the UPMC EM Center, the MedEvac team, and SWAT; he will test the Emergency Medical Center traffic management scenarios for 17 helicopters. Ronald V. Romano (Subject Matter Expert in Emergency Medical Services) is Chief of the City EMS. He will facilitate the collaborative relationship between the Department of Emergency Medicine and the Carnegie Mellon University research team and provide input during project working sessions and formal demonstrations. The Chief’s support letter is included in this proposal.
Brian Kikkola, EFO, CTO (Subject Matter Expert in Fire and Public Safety). Chief Kikkola is Assistant Chief of the Bureau of Fire, City of Pittsburgh, Department of Public Safety. He will support the development of emergency response training scenarios and the testing and evaluation of the AR prototype, collaborating on Tasks 1-6, and may adopt the AR technology for training courses at the Allegheny County Fire and Police Academy.
Daniel E. Stack (Subject Matter Expert in Public Safety). Mr. Stack is Fire Marshal of McCandless Township, PA. He will provide knowledge about fire inspection and emergency response training scenarios and help test the AR interfaces at the Allegheny County Fire and Police Academy.
Expected Outcomes/Impacts
We will develop a middleware that can extract geographically accurate 3D terrain and building models registered with live IoT data from open sources such as highway CCTV, Google Maps, USGS, DHS, and NOAA. The AR TMC system is portable and can run on a laptop or a heads-up-display (HUD) device for AR or VR, such as the HoloLens 2 and Magic Leap ML1. It can also be projected onto a holographic screen for stereo vision using a pair of affordable polarized glasses. We will study mobility scenarios in normal and extreme conditions, including a normal city view from a traffic management center, a MedEvac Dispatch Center, and a flood in downtown Pittsburgh.
The AR TMC city models will cover the entire downtown area. The middleware will extract the 3D city model with point clouds and texture maps. The IoT video data will update at a rate of at least 1 fps, and the sensor data at least once per minute. The AR frame rate will be at least 60 fps. The hand controller should be accurate enough to operate the virtual keyboard and text at font sizes down to 10 points.
Expected Outputs
TRID
Individuals Involved
Email | Name | Affiliation | Role | Position
ycai@cmu.edu | Cai, Yang | CyLab | PI | Faculty - Research/Systems
David.Martinelli@mail.wvu.edu | Martinelli, David | West Virginia | Co-PI | Faculty - Tenured
mws@cmu.edu | Siegel, Mel | Robotics | Co-PI | Faculty - Research/Systems
Budget
Amount of UTC Funds Awarded
$100000.00
Total Project Budget (from all funding sources)
$100000.00
Documents
Type | Name | Uploaded
Data Management Plan | DMP-Cai.pdf | Dec. 18, 2020, 12:40 a.m.
Project Brief | MAR-TMC-Cai-March-2021.pptx | March 19, 2021, 8:55 p.m.
Presentation | CHARIoT AR Challenge Final Presentation | Sept. 28, 2021, 12:52 p.m.
Progress Report | 362_Progress_Report_2021-09-30 | Sept. 28, 2021, 12:52 p.m.
Progress Report | 362_Progress_Report_2022-03-30 | April 5, 2022, 7:44 a.m.
Publication | IoT-based architectures for sensing and local data processing in ambient intelligence: research and industrial trends | May 2, 2022, 9:14 a.m.
Publication | Heads-up LiDAR imaging with sensor fusion | May 2, 2022, 9:15 a.m.
Publication | Interactive Floor Mapping with Depth Sensors | May 2, 2022, 9:16 a.m.
Final Report | 362-_Final_Report.pdf | Aug. 9, 2022, 11:47 a.m.
Publication | Publication | Sept. 8, 2022, 4:33 a.m.
Progress Report | 362_Progress_Report_2022-09-30 | Oct. 7, 2022, 7:23 a.m.
Match Sources
No match sources!
Partners
Name | Type
UPMC | Deployment Partner
City of Pittsburgh | Deployment Partner
McCandless Township | Deployment Partner
City EMS | Deployment Partner
HHS | Deployment Partner