#197 Perception for Transportation Service Robots


Principal Investigator
Aaron Steinfeld
Status
Completed
Start Date
Jan. 1, 2018
End Date
Dec. 31, 2019
Project Type
Research Applied
Grant Program
FAST Act - Mobility National (2016 - 2022)
Grant Cycle
2018 Traffic21
Visibility
Public

Abstract

Transportation hubs, both large and small, serve as critical points in the travel chain. Due to their role as multi-modal nexus points, mobility breakdowns at hubs can impact large numbers of people across a wide range of disabilities. Hub-based service robots have the potential to assist people with and without disabilities through these complex and confusing facilities. This vision of the future is the focus of the Disability Rehabilitation Research Project on Robotics and Automation for Inclusive Transportation, which is part of the Accessible Transportation Technologies Research Initiative (ATTRI).

A key building block to support transportation hub assistance robots is the ability to perceive human torso orientation. This allows projections of where a person is walking, where they intend to move, and the regions of space they are attending to. The team has made initial progress on using low-cost stereo camera sensing to rapidly extract the torso plane of humans in 3D space. We seek to refine this capability to support use in future robots and deployments. Part of this effort will include the collection and preparation of naturalistic transportation hub data for the development and evaluation of service robot perception.
Description
In coordination with the USDOT’s Accessible Transportation Technologies Research Initiative (ATTRI), the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR) awarded the Disability Rehabilitation Research Project on Robotics and Automation for Inclusive Transportation (aka ATTRI DRRP) to Carnegie Mellon in 2017. The mission of this five-year ATTRI DRRP is to research and develop seamless transportation assistance from cloud-based autonomy and shared robots located in and around transportation hubs.

Transportation hubs, both large and small, serve as critical points in the travel chain. Due to their role as multi-modal nexus points, mobility breakdowns at hubs can impact large numbers of people across a wide range of disabilities. Hub-based service robots have the potential to assist people with and without disabilities through these complex and confusing facilities. Future ATTRI DRRP efforts will include a publicly deployed, hub-based, mobile robot for on-demand assistance. Example functionality in this deployment includes navigation assistance, robot guidance through a station, information retrieval (e.g., “is the elevator working?”), and rendezvous with services (e.g., station staff) and other unmanned systems (e.g., other robots).

A key building block for such functionality is the ability to perceive human torso orientation. This allows projections of where a person is walking, where they intend to move, and the regions of space they are attending to. The team has made initial progress on using low-cost stereo camera sensing to rapidly extract the torso plane of humans in 3D space. We seek to refine this capability to support use in future ATTRI robots and deployments.
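
For concreteness, the following is a minimal, hypothetical sketch (not project code) of how a perceived torso plane could be turned into the projections described above: a ground-plane heading for the direction a person is facing, and a simple angular test for whether a location falls within the region they are attending to. The function names, the camera-coordinate convention (y down, z forward), and the attention cone half-angle are illustrative assumptions, not details taken from the project.

# Hypothetical sketch: turning a torso plane normal into heading and attention
# estimates. Assumes camera coordinates with y down and z forward, and a
# normal oriented out of the person's chest.
import numpy as np

def torso_heading(normal):
    """Project the torso normal onto the ground plane (x-z) and return the
    facing direction as a yaw angle in radians."""
    facing = np.array([normal[0], normal[2]], dtype=float)
    facing /= np.linalg.norm(facing)
    return np.arctan2(facing[1], facing[0])

def is_attended(normal, person_pos, target_pos, half_angle_deg=60.0):
    """Return True if target_pos lies within a cone of half_angle_deg around
    the person's facing direction (positions in camera coordinates)."""
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(person_pos, dtype=float)
    to_target = np.array([to_target[0], to_target[2]])
    to_target /= np.linalg.norm(to_target)
    facing = np.array([normal[0], normal[2]], dtype=float)
    facing /= np.linalg.norm(facing)
    return float(np.dot(facing, to_target)) >= np.cos(np.deg2rad(half_angle_deg))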

Our initial method merges the popular OpenPose human pose estimation algorithm with depth data to support rapid perception of torso body elements, which are then used to estimate the torso plane. In initial lab-based comparisons, our simple forecasting algorithm outperforms more complex recurrent neural network methods on a torso plane forecasting task while being approximately 45 times faster.
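
The fusion step can be illustrated with a short, hypothetical sketch (a simplification under stated assumptions, not the released implementation): OpenPose-style torso keypoints (shoulders and hips) are back-projected into 3D using a registered depth image and pinhole intrinsics (fx, fy, cx, cy), a plane is fit to the resulting points with a least-squares (SVD) fit, and the plane's yaw is extrapolated with a constant-velocity forecast standing in for the simple forecasting baseline mentioned above. Keypoint confidence filtering and invalid-depth handling are omitted for brevity.

# Hypothetical sketch: torso plane estimation from 2D keypoints plus depth.
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) into camera coordinates using the depth
    image (meters) and pinhole intrinsics."""
    z = depth[int(v), int(u)]
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def torso_plane(torso_keypoints, depth, fx, fy, cx, cy):
    """Fit a plane to back-projected torso keypoints.

    torso_keypoints: (N, 2) pixel coordinates of torso joints, e.g. the
    left/right shoulders and hips from OpenPose. Returns the centroid and
    unit normal of the least-squares plane."""
    pts = np.array([backproject(u, v, depth, fx, fy, cx, cy)
                    for u, v in torso_keypoints])
    centroid = pts.mean(axis=0)
    # The singular vector with the smallest singular value of the centered
    # points is the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    if normal[2] > 0:          # keep a consistent sign (toward the camera)
        normal = -normal
    return centroid, normal

def forecast_yaw(yaw_history, dt, horizon):
    """Constant-velocity extrapolation of torso yaw; a simple baseline
    (angle wrap-around is ignored for brevity)."""
    if len(yaw_history) < 2:
        return yaw_history[-1]
    rate = (yaw_history[-1] - yaw_history[-2]) / dt
    return yaw_history[-1] + rate * horizon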

Under Traffic21 funds, we seek to extend this to a more full-fledged perception system and apply it to service robots and tasks such as socially appropriate navigation. This will require multi-person perception and improved robustness on more naturalistic data. The team currently has access to data collected from a static sensor in the Steel Plaza light rail station, but we believe additional data collected from the perspective of a moving mobile robot is needed to support more realistic evaluation. Parts of this effort will include the collection and preparation of such data for the development and evaluation of service robot perception.
Timeline
Start - July 2018: Refine and enhance torso plane perception algorithm

August - December 2018: Collect natural transportation hub data and prepare for analysis

January - June 2019: Evaluate torso plane perception algorithm on naturalistic data
Strategic Description / RD&T

Deployment Plan
Future ATTRI DRRP efforts will include a publicly deployed, hub-based, mobile robot for on-demand assistance. Technology developed under this project will be integrated into this robot during deployment. Local government agencies have made commitments to negotiate with the team on the specifics of where and when this deployment will occur. 

This same technology may also be relevant in other deployments. For example, torso plane perception is also useful for other robots, autonomous cars, and other domains where inferring human intent and attention is useful.
Expected Outcomes/Impacts
Open source torso plane perception software.

Improvements over the state of the art on torso plane estimation, both in speed and accuracy.

We may generate naturalistic human motion data sets, pending IRB approval and negotiations with local sites.
Expected Outputs

TRID

Individuals Involved

Email | Name | Affiliation | Role | Position
henny@cmu.edu | Admoni, Henny | Robotics Institute | Co-PI | Faculty - Untenured, Tenure Track
abhijatb@andrew.cmu.edu | Biswas, Abhijat | Robotics Institute | Other | Student - Masters
steinfeld@cmu.edu | Steinfeld, Aaron | Robotics Institute | PI | Faculty - Research/Systems

Budget

Amount of UTC Funds Awarded
$29,847.00
Total Project Budget (from all funding sources)
$57,894.00

Documents

Type | Name | Uploaded
Data Management Plan | PerceptioServiceRobots18_DMP.pdf | June 4, 2018, 8:18 a.m.
Project Brief | Aaron.pdf | June 21, 2018, 10:34 a.m.
Presentation | Human-Robot Interaction | Sept. 24, 2018, 8:45 a.m.
Presentation | ML and Robotics to Enable People With Disabilities to Go Where They Want | Sept. 24, 2018, 8:45 a.m.
Progress Report | 197_Progress_Report_2018-09-30 | Sept. 24, 2018, 8:45 a.m.
Publication | mmpc_rss2018_biswas.pdf | March 29, 2019, 6:19 a.m.
Presentation | Self-Driving Cars | March 29, 2019, 6:19 a.m.
Presentation | Start-up Special Session: Status and Development Direction of Korean Robot Industry | Sept. 27, 2019, 6:38 a.m.
Progress Report | 197_Progress_Report_2019-03-30 | March 29, 2019, 6:20 a.m.
Presentation | Robotics & AI to Empower People with Disabilities to Go Where They Want | Sept. 27, 2019, 6:38 a.m.
Presentation | ML and Robotics to Enable People with Disabilities to Go Where They Want | Sept. 27, 2019, 6:38 a.m.
Presentation | ITS and Beyond to Enable Accessible Transportation for All | Sept. 27, 2019, 6:38 a.m.
Progress Report | 197_Progress_Report_2019-09-30 | Sept. 27, 2019, 6:38 a.m.
Final Report | 197_-_Final_Report.pdf | Jan. 3, 2020, 4:22 a.m.
Progress Report | 197_Progress_Report_2020-03-30 | March 30, 2020, 6:03 a.m.
Publication | Accessible Transportation Technologies Research Initiative: State of the Practice Scan | Oct. 22, 2020, 12:08 p.m.
Publication | Accessible Public Transportation: Designing Service for Riders with Disabilities. | Oct. 24, 2020, 7 p.m.
Publication | Automated Vehicles (AVs) for People with Disabilities | Oct. 24, 2020, 7:06 p.m.

Match Sources

No match sources!

Partners

Name | Type
Port Authority of Allegheny County | Deployment Partner