Funding source: US DOT
Contract number: DTRT13-G-UTC57
Funding amount: $99,998
Performance period: 9/1/2017 to 8/31/2018
Project description
Traffic congestion has become inescapable across the United States, especially in urban areas. Yet public support is lacking for taxes to fund expansion of the existing road network, so it is imperative to find novel ways to improve the efficiency of existing infrastructure. A major obstacle is the inability to enforce socially optimal routes among commuters. We propose to improve routing efficiency by leveraging heterogeneity in commuter preferences: we learn individual driver preferences over route characteristics and use these preferences to recommend socially optimal routes that drivers are likely to follow. The combined effect of socially optimal routing and personalization helps bridge the gap between the system-optimal (utopian) solution and the user-optimal equilibrium.

We take the perspective of a recommendation system serving a large user base in a highly congested network, with no ability to enforce routes. We (a) develop a framework for learning individual driver preferences over time, and (b) devise a mathematical model for computing personalized socially optimal routes given (potentially partial) information on driver preferences. We evaluated our approach on data collected from Amazon Mechanical Turk; compared against a logistic regression baseline, our model improves prediction accuracy by over 12%.
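The project description does not specify the learning or recommendation method, so the following is only a minimal sketch of one plausible setup: a per-driver logistic preference model over route features, updated online from observed accept/reject decisions, whose compliance estimates are then traded off against a route's social cost when choosing a recommendation. All class names, features, and the scoring rule are illustrative assumptions, not the project's actual model.

```python
# Hypothetical sketch only -- illustrates learning per-driver preferences over
# route features and using them to pick among socially desirable candidate routes.
import numpy as np


class DriverPreferenceModel:
    """Per-driver logistic model over route features (illustrative)."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)  # preference weights, learned over time
        self.lr = lr

    def accept_probability(self, route_features):
        # Estimated probability the driver follows a route with these features.
        return 1.0 / (1.0 + np.exp(-self.w @ route_features))

    def update(self, route_features, followed):
        # One stochastic-gradient step on the logistic loss;
        # followed = 1 if the driver took the recommended route, else 0.
        error = followed - self.accept_probability(route_features)
        self.w += self.lr * error * route_features


def recommend(model, candidate_routes):
    """Among candidate routes, each given as (features, social_cost), pick the one
    that trades off the driver's likely compliance against the route's social cost.
    The scoring rule here is an assumption made for illustration."""
    def score(route):
        features, social_cost = route
        return model.accept_probability(features) / (1.0 + social_cost)
    return max(candidate_routes, key=score)


# Toy usage: features = [travel_time_min, toll_dollars, highway_fraction]
model = DriverPreferenceModel(n_features=3)
model.update(np.array([25.0, 0.0, 0.2]), followed=1)  # took a slower, toll-free route
model.update(np.array([15.0, 3.5, 0.9]), followed=0)  # rejected a faster tolled route
candidates = [
    (np.array([20.0, 0.0, 0.5]), 1.2),  # moderate social cost, no toll
    (np.array([14.0, 4.0, 0.9]), 0.6),  # low social cost, but tolled
]
best_features, best_cost = recommend(model, candidates)
```

In this sketch the per-driver model plays the role of the preference-learning component, and the scoring step stands in for the personalized socially optimal routing model described above; the actual project formulation may differ substantially.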