Abstract

Bluetooth-based traffic detection is an emerging travel-time collection technique; however, its use on arterials has been limited by several challenges. In particular, data missing not at random (MNAR) is a common problem in such data sets, caused by network failures or sensor malfunctions. Solving the MNAR problem requires travel-time decomposition (TTD) using complete travel times that span successive links. Previous work has focused on TTD methodologies that use probe vehicle data, but these approaches may be unsuitable for Bluetooth-based data. Therefore, this study proposes a machine learning-based approach to decomposing Bluetooth-based travel times. A modified hidden Markov model was developed to model travel-time distributions and traffic-state transitions, and a genetic algorithm (GA) was applied to find the numerically optimal decomposition under a maximum-likelihood criterion. Two real-world travel-time data sets were used to validate the approach. The proposed hidden Markov model with GA (HMMGA) approach and a Gaussian mixture model with GA (GMMGA) were compared with a benchmark approach using distance-based allocation. The results showed that the HMMGA significantly outperformed both the GMMGA and the benchmark approach. Using the HMMGA, the average mean absolute percentage error was up to 72% lower than with the benchmark approach.
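To illustrate the general idea of likelihood-based travel-time decomposition summarized above, the following is a minimal sketch, not the authors' implementation. It assumes Gaussian per-state emission densities for each link (as a stand-in for the modified hidden Markov model) and uses a simple genetic algorithm to search for link travel times that sum to an observed corridor travel time while maximizing the summed log-likelihood; all parameter values, data, and function names are hypothetical.

```python
# Illustrative sketch only: maximum-likelihood travel-time decomposition
# with a simple genetic algorithm. Link-level emission parameters and the
# observed corridor travel time are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-link emission parameters: (weight, mean, std) per traffic
# state, e.g. estimated beforehand from historical single-link matches.
LINK_STATES = [
    [(0.7, 40.0, 5.0), (0.3, 70.0, 12.0)],   # link 1: free-flow vs congested
    [(0.6, 55.0, 6.0), (0.4, 95.0, 15.0)],   # link 2
    [(0.8, 30.0, 4.0), (0.2, 60.0, 10.0)],   # link 3
]

def link_loglik(t, states):
    """Log-likelihood of a single-link travel time under a state mixture."""
    dens = sum(w * np.exp(-0.5 * ((t - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
               for w, m, s in states)
    return np.log(dens + 1e-12)

def total_loglik(split):
    """Summed log-likelihood of one candidate decomposition (one entry per link)."""
    return sum(link_loglik(t, st) for t, st in zip(split, LINK_STATES))

def decompose(corridor_tt, pop=200, gens=300, mut=0.1):
    """GA search for link travel times that sum to corridor_tt and maximize likelihood."""
    k = len(LINK_STATES)
    # Initial population: random positive splits rescaled to the observed total.
    P = rng.random((pop, k))
    P = P / P.sum(axis=1, keepdims=True) * corridor_tt
    for _ in range(gens):
        fit = np.array([total_loglik(ind) for ind in P])
        elite = P[np.argsort(fit)[::-1][: pop // 2]]                    # selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        alpha = rng.random((pop, 1))
        child = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]     # crossover
        child *= 1 + mut * rng.standard_normal(child.shape)             # mutation
        child = np.clip(child, 1e-3, None)
        P = child / child.sum(axis=1, keepdims=True) * corridor_tt      # keep sum constraint
    fit = np.array([total_loglik(ind) for ind in P])
    return P[np.argmax(fit)]

if __name__ == "__main__":
    # Example: split a 170 s corridor travel time across the three links.
    print(decompose(170.0))
```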