#24 Sensors Know When to, What to, and How to Interact With Human in Vehicles


Principal Investigator
Anind Dey
Status
Completed
Start Date
Jan. 1, 2017
End Date
Aug. 31, 2018
Project Type
Research Applied
Grant Program
MAP-21 TSET National (2013 - 2018)
Grant Cycle
2017 TSET UTC
Visibility
Public

Abstract

From laptops to smartphones, ubiquitous computing devices equipped with sensors generate information about our daily routines on the go. This information enables computers to estimate our in-situ states and proactively provide information services to users. Consequently, users can interact with cyber information anywhere, at any time, including in vehicles while driving. However, arbitrary or inopportune prompts to interact with cyber information interfere with safe vehicle operation. Because human attention is finite, the ubiquity of HCI opportunities comes at a cost, especially while driving.
To address this problem, this project explores the issues of when to intervene (i.e., optimal timing), how to intervene (i.e., presentation methods), and what to intervene with (i.e., types of HCI demands or information). We aim to enable sensor-equipped computers to handle these issues and design human-centered intelligent interventions. This project extends our prior UTC work to create enabling technologies that support seamless interaction between humans and ubiquitous computing spaces.
In this project, we collect large sensor data streams from a minimally intrusive set of wearable or Internet-of-Things sensors worn by vehicle users and/or embedded in vehicles, including everyday smart devices. Through a set of human-subject experiments in naturalistic field driving situations, we will investigate how drivers interact with proactive adjustments of information initiated by system intelligence rather than user demand. We also consider presentation methods and types of interaction schemes across human visual, auditory, and haptic sensory channels. The near-term goal is to create a smarter, contextually intelligent cyber-physical system that supports intelligibility of system behavior. These experiments will yield a set of sensor-based real-time models of drivers' cognitive load, user interruptibility, and user experience of proactive information services. Ultimately, these technologies will help drivers safely interact with proactive cyber-information interventions.
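To make the "when to intervene" idea concrete, the following is a minimal illustrative sketch, not the project's actual pipeline: it summarizes a window of raw sensor samples into workload proxies and applies a simple rule-based stand-in for a learned interruptibility model. All feature names, sensors, and thresholds here are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch of windowed interruptibility estimation.
# Features, sensors, and thresholds are illustrative assumptions,
# not the project's actual models.
from statistics import mean, pstdev

def window_features(steering_angles, heart_rates):
    """Summarize one time window of raw sensor samples."""
    return {
        "steering_var": pstdev(steering_angles),  # high variance ~ active maneuvering
        "hr_mean": mean(heart_rates),             # elevated heart rate ~ higher load
    }

def interruptible(features, steering_thresh=2.0, hr_thresh=90.0):
    """Rule-based stand-in for a learned model: treat the driver as
    interruptible only when both workload proxies are low."""
    return (features["steering_var"] < steering_thresh
            and features["hr_mean"] < hr_thresh)

# Steady highway cruising vs. an active maneuver with elevated heart rate.
calm = window_features([0.1, 0.2, 0.1, 0.15], [72, 74, 73, 71])
busy = window_features([-5.0, 6.0, -4.5, 5.5], [95, 99, 101, 97])
print(interruptible(calm))  # True  -> an opportune moment to prompt
print(interruptible(busy))  # False -> defer the intervention
```

In the project as described, such hand-set thresholds would be replaced by real-time models trained on the collected naturalistic driving data.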
Description

    
Timeline

    
Strategic Description / RD&T

    
Deployment Plan

    
Expected Outcomes/Impacts

    
Expected Outputs

    
TRID


    

Individuals Involved

Email Name Affiliation Role Position
anind@cs.cmu.edu Dey, Anind HCII PI Faculty - Research/Systems
noemail1@noemail.com Kim, SeungJun HCII Co-PI Faculty - Research/Systems

Budget

Amount of UTC Funds Awarded
$127,500.00
Total Project Budget (from all funding sources)
$127,500.00

Documents

Type Name Uploaded
Publication Leveraging Human Routine Models to Detect and Generate Human Behaviors April 24, 2017, 7:13 p.m.
Publication Making Machine Learning Applications for Time-Series Sensor Data Graphical and Interactive. April 24, 2017, 7:13 p.m.
Publication Usability evaluation of smart watch UI/UX using experience sampling method April 24, 2017, 7:13 p.m.
Publication Multi-modal Interruptions in a Context Based Framework April 24, 2017, 7:13 p.m.
Presentation Improving User Experience in Ubiquitous HCI Situations - Sensors Know When to, How to, and What to Interact with Human April 24, 2017, 7:13 p.m.
Presentation Improving User Experience in Ubiquitous HCI Situations - Sensors Know When to, How to, and What to Interact with Human April 24, 2017, 7:13 p.m.
Progress Report 24_Progress_Report_2017-09-30 Oct. 2, 2017, 11:28 a.m.
Publication Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback March 21, 2021, 4:47 p.m.

Match Sources

No match sources!

Partners

No partners!