
#21 The Intelligent Mobility Meter - Portable Fine-Grained Data Collection and Analysis of Pedestrian, Cyclist, and Motor Vehicle Traffic


Principal Investigator
Bernardo Pires
Status
Completed
Start Date
July 1, 2017
End Date
June 30, 2018
Project Type
Research Advanced
Grant Program
FAST Act - Mobility National (2016 - 2022)
Grant Cycle
2017 Mobility21 UTC
Visibility
Public

Abstract

The Intelligent Mobility Meter (IMM) is a portable data collection and analysis platform that will collect fine-grained statistics on pedestrian, cyclist, and vehicular traffic. The objective of the IMM project is to provide accurate and actionable data to government officials and transit advocates. The meter is an expansion of the UTC T-SET funded project “Automatic Counting of Pedestrians and Cyclists”, which researched, developed, and deployed a robust pedestrian and bike counting system. The deployed system was employed, in collaboration with the City of Pittsburgh, to collect usage data on Pittsburgh’s bike lanes. However, the current system’s capabilities are limited to counting pedestrians and bikes. To create a true mobility meter, we intend to expand the analysis to count motor vehicles and to determine traffic flows, including the speeds of all participants and wait times at intersections.
Description
Proposed Work

Usage statistics can inform policy makers and transportation advocates on the best design for the infrastructure of the future. They can also help traffic engineers identify and resolve infrastructure problems. However, it is not feasible to place data collection devices everywhere. The Intelligent Mobility Meter (IMM) is a portable device that will collect fine-grained statistics on the behavior of all road participants in any key area.

The IMM grew from a need to obtain statistics on bike lane usage, but it has since grown beyond that niche to collect data for all levels of government (a possible sister project with PennDOT is currently under discussion). Statistics about pedestrians and motor and non-motor vehicles provide important information for government officials to build safe infrastructure for walking, biking, and driving. In addition to pedestrian and bicyclist data, information about the number of motor vehicles can give key insights into the trade-offs when road space is re-assigned from motor vehicle to bike usage. Funding of this project will allow the City of Pittsburgh to make use of our prototype and data collection capabilities at no cost, enabling the collection of a significant amount of key decision-making data, guided by the needs of city officials.
About the Previously Developed System – Technical Merit 

A robust vision-based pedestrian and cyclist counting system has been researched, developed, and deployed (Figure 1). The system consists of a data collection hardware prototype and an accurate computer vision-based counting method that works under different lighting and weather conditions. Approximately 50 hours of data were collected at different locations around the CMU campus using the prototype. To label the pedestrians and bikes in the recorded data, a web-based object labeling tool was implemented. Unlike existing image labeling tools, our tool includes novel features that reduce labeling time by incorporating spatial and geometric constraints between video frames. We labeled 10 hours of the recorded dataset (541 pedestrians and 111 cyclists) with this tool to train and test our counting method.

Figure-1: Developed hardware prototype.

Our pedestrian and cyclist counting system comprises four consecutive tasks: 1) detecting and localizing all people in a given image, 2) distinguishing pedestrians from bicyclists, 3) tracking the pedestrians and bicyclists separately over time, and 4) counting them when they cross a virtual line in the image. Steps 1 and 2 together form our cascaded object detector. It is built on the state-of-the-art pedestrian detection method of [1], improved by exploiting the geometry and constraints of the application: foreground detection, geometric prior information, and temporal moving direction (optical flow) are used as additional inputs to [1]. Our improved detector outperforms the state-of-the-art method by eliminating most false alarms (falsely detected pedestrians) at the same true detection rate. Because it utilizes geometric prior information about the scene, it also outputs better-fitted detection windows, which benefit the tracking stage of the method.
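The final counting step can be sketched as follows. This is a minimal illustration, not the deployed implementation: given a tracked trajectory (a sequence of centroid positions) and a virtual line defined by two points, a count is registered each time the trajectory changes sides of the line. The function names and the example coordinates are illustrative assumptions.

```python
def side(p, a, b):
    """Sign of point p relative to line a->b (+1, -1, or 0), via cross product."""
    cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return (cross > 0) - (cross < 0)

def count_crossings(track, a, b):
    """Count how many times a trajectory crosses the virtual line a->b."""
    crossings = 0
    prev = 0  # last non-zero side seen
    for p in track:
        s = side(p, a, b)
        if s != 0:
            if prev != 0 and s != prev:
                crossings += 1
            prev = s
    return crossings

# Example: a track moving left-to-right across a vertical line at x = 5.
track = [(0, 0), (2, 1), (4, 1), (6, 2), (8, 2)]
print(count_crossings(track, (5, -10), (5, 10)))  # 1
```

In practice the per-class counts come from applying this test to each pedestrian or bicyclist track produced by step 3.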

In our first test, we evaluated our detector on a publicly available, challenging dataset [2] suited to our purpose. This dataset, called TownCentre, consists of a 5-minute video of a busy town street and contains 298 pedestrians walking in two different directions. Our proposed counting method counts 283 of them, achieving ~95% accuracy. We also conducted a second experiment to analyze the performance of the tracker, which comprises the first three steps of our method. We compared the results with a state-of-the-art tracking algorithm [3]. As suggested in [3], the trajectories of two pedestrians were analyzed, and the mean average distance between the ground-truth trajectories and each tracker's outputs was calculated. Our method outperformed [3] by producing more accurate trajectories. A paper describing our methodology was accepted for publication at WACV, a premier conference on applications of computer vision [4].
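The ~95% figure follows directly from the reported counts on the TownCentre sequence:

```python
# Counting accuracy on TownCentre [2]: 283 counted out of 298 ground-truth pedestrians.
ground_truth = 298
counted = 283
accuracy = counted / ground_truth
print(f"{accuracy:.1%}")  # 95.0%
```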

References

[1] R. Benenson, M. Omran, J. Hosang, and B. Schiele, “Ten Years of Pedestrian Detection, What Have We Learned?”, Computer Vision – ECCV 2014 Workshops, 2014.

[2] B. Benfold and I. Reid, “Stable Multi-Target Tracking in Real-Time Surveillance Video”, IEEE CVPR, 2011, pp. 3457–3464.

[3] C. R. Jung, “Combining Patch Matching and Detection for Robust Pedestrian Tracking in Monocular Calibrated Cameras”, Pattern Recognition Letters (Special Issue), 2013.

[4] M. K. Kocamaz, J. Gong, and B. R. Pires, “Vision-Based Counting of Pedestrians and Cyclists”, IEEE Winter Conference on Applications of Computer Vision (WACV'16), March 7–9, 2016, Lake Placid, NY, USA.
Timeline
July 1, 2017 to October 31, 2017: Determining the locations in the city to record training dataset. 

November 1, 2017 to December 31, 2017: Collection of the training dataset to train the vehicle detector. 

January 1, 2018 to February 28, 2018: Manual analysis and labeling of the training dataset. 

March 1, 2018 to March 31, 2018: Training the vehicle detector and analyzing its performance. 

April 1, 2018 to June 30, 2018: Extending the pedestrian and bicyclist tracking and counting methods to vehicles. Producing vehicle counts on the training dataset.
Strategic Description / RD&T

    
Deployment Plan
Initial deployment for collection of the training dataset will be guided by technical needs and will include Frew Street locations close to the CMU campus. Further deployments, at the guidance of City of Pittsburgh officials, will include the Penn Avenue and Greenfield protected bike lanes.
Expected Outcomes/Impacts
- Creation of a vision-based combined pedestrian, bicyclist, and vehicle counting and analyzing system
- Collection of key decision-making data for city officials, advocacy groups and other stakeholders
- Multiple deployments at different Pittsburgh locations, including multiple bike lanes
Expected Outputs

    
TRID


    

Individuals Involved

Email Name Affiliation Role Position
bpires@cmu.edu Pires, Bernardo Carnegie Mellon University PI Other

Budget

Amount of UTC Funds Awarded
$100,000.00
Total Project Budget (from all funding sources)
$100,000.00

Documents

Type Name Uploaded
Presentation AV’s Blindspot: Detecting Pedestrians and Bicyclists March 30, 2018, 11:21 p.m.
Progress Report 21_Progress_Report_2018-03-30 March 30, 2018, 11:22 p.m.
Final Report 21_-Pires_IMM_FinalReport_abm5lQT.pdf April 29, 2019, 4:12 a.m.

Match Sources

No match sources!

Partners

No partners!