Abstract
In this project we will use cameras mounted on the outside and inside of a transit bus to observe passenger behavior. We are interested in determining the full trip of each passenger, from where they enter the bus to where they exit. We also want to develop algorithms that automatically detect unusual behavior. Of special interest are the trips of people with disabilities: we want to determine which parts of the trip, or which interactions with other people, create barriers to access or equitable ridership. This information will help transit agencies make trips more efficient and equitable.
Description
At the beginning of this project we will have two FreedomTransit buses equipped with cameras, GPS, communication, and computers. One of the buses will have cameras on both the outside and the inside, and a second onboard computer with a graphics processor capable of efficiently running deep networks. The first bus was equipped during a previous UTC project; the system for the second bus will be paid for by an NSF project. Additionally, TheBus transit agency in Hawaii (www.thebus.org) wants to share its video data with us.
According to our interviews with transit agencies, video data of passengers is very valuable to them. They need an accurate count of the number of passengers in order to receive subsidies, and those counts can be extracted from video. Another use case is a real or alleged incident, where video can show what actually happened. Video can also reveal the whole trip of each passenger, where they enter and where they exit; that information can be used to optimize bus routes. For all of these use cases, however, extracting the information from video is very tedious: there are currently no automated tools, so a person has to watch the footage and take notes. In the first part of this project, led by Christoph Mertz, we will develop and assemble software tools to analyze the videos automatically. The first step will be to use a standard people detector to detect the passengers. A tracking algorithm will follow them while they are visible in the video. When a passenger transitions from outside to inside the bus, or vice versa, their ID has to be handed off properly from one camera to the other. Passengers will have to be tracked throughout their trip, or re-identified between entering and exiting the bus, so that we can associate the beginning and end of each passenger trip. Finally, we want to detect unusual behavior by first characterizing normal behavior and then finding cases where behavior deviates from it.
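As an illustration of this pipeline, the following is a minimal sketch in Python, assuming PyTorch/torchvision; the COCO-pretrained detector, the greedy IoU tracker, and the thresholds are illustrative placeholders, not the system we will ultimately deploy.

```python
# Sketch: detect passengers per frame and associate them into tracks with a
# greedy IoU matcher. Assumes a recent torchvision with COCO-pretrained weights.
import torch
import torchvision

PERSON_CLASS = 1  # COCO label id for "person"

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

def detect_people(frame, score_thresh=0.7):
    """Return person boxes (x1, y1, x2, y2) for one RGB float tensor [3, H, W] in [0, 1]."""
    with torch.no_grad():
        out = detector([frame])[0]
    keep = (out["labels"] == PERSON_CLASS) & (out["scores"] > score_thresh)
    return out["boxes"][keep]

def iou(a, b):
    """Intersection-over-union of two boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    return inter / (area(a) + area(b) - inter + 1e-6)

def update_tracks(tracks, boxes, next_id, iou_thresh=0.3):
    """Greedily associate new detections with existing tracks by IoU."""
    unmatched = list(range(len(boxes)))
    for tid, prev_box in list(tracks.items()):
        best = max(unmatched, key=lambda i: iou(prev_box, boxes[i]), default=None)
        if best is not None and iou(prev_box, boxes[best]) > iou_thresh:
            tracks[tid] = boxes[best]
            unmatched.remove(best)
        else:
            del tracks[tid]  # track lost; the real system would re-identify instead
    for i in unmatched:
        tracks[next_id] = boxes[i]
        next_id += 1
    return tracks, next_id
```

In the full system, a lost track would be handed to a re-identification module (for example, appearance embeddings compared across the inside and outside cameras) rather than simply deleted, and deviations of track statistics from typical patterns would be flagged as candidate unusual behavior.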
In the second part of the project, led by Patrick Carrington, we want to observe passengers with disabilities and passengers with situational challenges, such as pushing a stroller with a baby. Very few studies investigate the complex difficulties these passengers face when interacting with the bus, the driver, other passengers, and their own belongings such as smartphones. We will first find relevant events by training a detector for wheelchairs and strollers and applying it to our video data. We will then analyze the selected events.
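One plausible way to build the wheelchair/stroller detector is sketched below, assuming we again start from a COCO-pretrained Faster R-CNN in torchvision and fine-tune it on a small annotated set of bus frames; the data loader and hyperparameters are illustrative, not final choices.

```python
# Sketch: adapt a COCO-pretrained Faster R-CNN to two new classes
# (wheelchair, stroller) plus background, following torchvision's
# standard fine-tuning recipe.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background, wheelchair, stroller

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

def train_one_epoch(model, loader):
    """`loader` is any iterable of (image_list, target_list) pairs in
    torchvision's detection format; annotated bus frames are assumed."""
    model.train()
    for images, targets in loader:
        loss_dict = model(images, targets)  # Faster R-CNN returns a loss dict in train mode
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

At inference time the fine-tuned model would be applied to sampled frames, and any frame with a confident wheelchair or stroller detection would mark a candidate event for the qualitative analysis.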
This project will partially support two students and we will leverage funding from other sources (NSF).
The video tools we develop will help transit agencies save money and optimize their operations. Observing the challenges that passengers with disabilities or situational impairments experience on the bus will help the agencies address those challenges. All of this will make transit more efficient and more equitable.
The long-term goal of BusEdge is to enable the bus company or other users to query live or recorded bus video data for any relevant purpose, much as we can query the text of web pages today with a search engine such as Google. The query in the current project concerns passengers, and especially passengers with physical or situational challenges. Prior work on wheelchair users and public transit has focused on aspects of the journey such as access to particular bus routes, kneeling-bus advances, and securement of the wheelchair using the provided safety tie-down and clip-in systems. The emphasis on these areas is understandable, especially as these are the primary safety concerns for a wheelchair user and a transit system. Our approach in this study will allow us to begin to characterize actions and events that occur within the context of both safety equipment and ridership. For instance, by observing the interactions between wheelchair users and bus operators we can begin to understand the prevalence of challenges during the entire process of riding the bus, including paying the fare, finding and using an available seating area, interacting with safety equipment (with or without assistance), and interacting with other passengers and bus systems to arrive at a destination. We will also be able to begin characterizing the impact that accessible ridership has on the utilization of space around a wheelchair-accessible seat: for instance, what happens when a wheelchair user needs that space and no other priority seating is available for the displaced passengers? What is the prevalence and impact of these types of issues? What rule-based or common-courtesy strategies might mitigate the resulting challenges? These are questions that have not been addressed in previous research.
Timeline
July 2022: Initial data collection
August - September 2022: Develop module to detect and track passengers; develop module to detect wheelchairs and strollers
October - November 2022: Develop module to hand off passenger IDs from one camera to another; apply wheelchair/stroller detector to create initial event dataset.
December 2022 - January 2023: Integrate detection/tracking/hand-off to observe full trips; analyze initial event dataset.
February 2023: Present initial video analysis system to transit agency for feedback; present initial findings of difficulties faced by passengers with disabilities or situational challenges to transit agency and disability community for feedback.
March - April 2023: Implement improvements to video analysis system; improve wheelchair/stroller detector and create larger event dataset.
May 2023: Final analysis
June 2023: Final report and presentations to transit agency, disability community, and potential commercialization partners
Strategic Description / RD&T
Deployment Plan
The research questions we want to address in this project came from our conversations with transit agencies. Automatically counting people and determining the complete trips of passengers will be very useful tools for them. From the very beginning of the project we will use data collected with a commercially available safety system (from SafetyVision) on a transit bus during normal operations. Two of these systems, owned by us, will be installed on buses of a local transit agency (FreedomTransit). We can run analysis programs on the bus computer and retrieve data through a cellular connection, WiFi, or by physically swapping hard drives. SafetyVision plans to install its system on several hundred buses of TheBus, a transit agency in Hawaii, which is willing to share the collected video data with us. We will share our results with the transit agencies so that they can make direct use of them, and we will release our software tools as open source. Our goal is for the system to be commercialization-ready by the end of the project.
Expected Outcomes/Impacts
Accomplishment 1: System that can automatically count passengers and determine whole trips.
Metrics 1: We will do a quantitative comparison between the system's counts and a manual count, determining the missed, false, and correct counts for at least one whole bus route (a minimal scoring sketch appears after the metrics below).
Accomplishment 2: Detection of unusual behavior.
Metrics 2: We will do a qualitative comparison between what the system considers unusual behavior and what a human observer considers unusual behavior. What constitutes unusual behavior is subjective, such behavior occurs infrequently, and the spectrum of possible unusual behaviors is very broad. We therefore expect our system to find only a subset of unusual behaviors.
Accomplishment 3: Observation of difficulties faced by passengers with disabilities or situational challenges.
Metric 3: This accomplishment is successful if we can gain insight into specific difficulties that are present during a trip on a transit bus.
General accomplishment for all tasks: Potential for more efficient transit operations for all participants.
General Metric: Feedback from the transit company and other participants. We will present our findings to the transit agency to get their feedback on whether the systems we developed make it easier for them to make use of their video data. We will ask our contacts in the disability community for feedback on our observations of the difficulties faced by passengers with disabilities or situational challenges during a bus trip.
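To make Metric 1 concrete, the following is a minimal scoring sketch in Python; the per-stop counts are hypothetical example data, and in practice the comparison would cover at least one whole route.

```python
# Sketch: score automatic per-stop boarding counts against a manual ground-truth count.
def score_counts(system_counts, manual_counts):
    """Return (correct, missed, false) count totals over all stops on a route."""
    correct = missed = false = 0
    for stop in manual_counts:
        auto = system_counts.get(stop, 0)
        truth = manual_counts[stop]
        correct += min(auto, truth)      # passengers counted by both
        missed += max(0, truth - auto)   # passengers the system did not count
        false += max(0, auto - truth)    # spurious counts by the system
    return correct, missed, false

# Hypothetical example: three stops on one route.
system = {"stop_A": 4, "stop_B": 2, "stop_C": 1}
manual = {"stop_A": 5, "stop_B": 2, "stop_C": 0}
print(score_counts(system, manual))  # (6, 1, 1)
```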
Expected Outputs
TRID
Individuals Involved
Email | Name | Affiliation | Role | Position
pcarrington@cmu.edu | Carrington, Patrick | Carnegie Mellon University | Co-PI | Faculty - Untenured, Tenure Track
anuraggh@andrew.cmu.edu | Ghosh, Anurag | Carnegie Mellon University | Other | Student - Masters
bklaas@cmu.edu | Klaas, Rebecca | Carnegie Mellon University | Other | Staff - Business Manager
cmertz@andrew.cmu.edu | Mertz, Christoph | Carnegie Mellon University | PI | Other
Budget
Amount of UTC Funds Awarded
$100000.00
Total Project Budget (from all funding sources)
$99935.00
Documents
Type | Name | Uploaded
Data Management Plan | Bus_on_the_edge__Passengers_DMP.pdf | Nov. 18, 2021, 3:52 p.m.
Progress Report | 385_Progress_Report_2022-09-30 | Sept. 23, 2022, 7:13 a.m.
Publication | geometric_zoom_cvpr23.pdf | March 27, 2023, 9:37 p.m.
Presentation | bus_edge_poster_consortium_meeting_2022.pdf | March 28, 2023, 7:13 a.m.
Progress Report | 385_Progress_Report_2023-03-30 | March 28, 2023, 7:13 a.m.
Publication | Multimodal object detection via probabilistic ensembling | March 30, 2023, 5:55 a.m.
Publication | Leveraging Structure from Motion to Localize Inaccessible Bus Stops | March 30, 2023, 5:55 a.m.
Publication | Bus on the Edge: Applications | March 30, 2023, 5:56 a.m.
Publication | Self-Calibration of Multiple LiDARs for Autonomous Vehicles | March 30, 2023, 5:56 a.m.
Publication | Carla simulated data for rare road object detection | March 30, 2023, 5:57 a.m.
Publication | Bus on the edge: Continuous monitoring of traffic and infrastructure | March 30, 2023, 5:58 a.m.
Publication | Improving rush hour traffic flow by computer-vision-based parking detection and regulations | March 30, 2023, 5:58 a.m.
Publication | Multimodal object detection via bayesian fusion | March 30, 2023, 5:59 a.m.
Publication | Creating and Integrating Solutions to Enable the ‘Complete Trip’ | April 10, 2023, 8:35 p.m.
Publication | “I Should Feel Like I’m In Control”: Understanding Expectations, Concerns, and Motivations for the Use of Autonomous Navigation on Wheelchairs | April 10, 2023, 8:36 p.m.
Publication | Designing an Inclusive Mobile App for People with Disabilities to Independently Use Autonomous Vehicles | April 10, 2023, 8:37 p.m.
Publication | Designing the Future of Transit Work | April 10, 2023, 8:37 p.m.
Publication | Accessibility and The Crowded Sidewalk: Micromobility's Impact on Public Space | April 10, 2023, 8:38 p.m.
Final Report | Final_Report_-_385.pdf | Oct. 3, 2023, 2:22 p.m.
Publication | Anurag_MSR_Thesis.pdf | Oct. 16, 2023, 10:24 a.m.
Publication | RISS_Final_Paper-2.pdf | Oct. 16, 2023, 10:31 a.m.
Publication | Philip_Neugebauer_Masterthesis.pdf | Oct. 16, 2023, 10:31 a.m.
Progress Report | 385_Progress_Report_2023-09-30 | Oct. 16, 2023, 10:33 a.m.
Match Sources
No match sources!
Partners
Name | Type
FreedomTransit | Deployment Partner
RoadBotics | Deployment Partner
TheBus | Deployment Partner