Temporal Tracking of Urban Areas Using Google Street View

Tracking the evolution of built environments is a challenging problem in computer vision due to the intrinsic complexity of urban scenes, as well as the dearth of temporal visual information about urban areas. Emerging technologies such as street view cars provide massive amounts of high-quality, street-level imagery of urban environments (e.g., sidewalks, buildings, and the aesthetics of streets). Such datasets are consistent with respect to space and time; hence, they are a potential source for exploring the temporal changes transpiring in built environments. However, using street view images to detect temporal changes in urban scenes introduces new challenges, such as variation in illumination, changes in camera pose, and the appearance or disappearance of objects.

In this thesis, we leverage Google Street View’s “time machine” feature to track and label temporal changes in built environments, focusing on accessibility features (e.g., the existence of curb ramps and the condition of sidewalks). The main contributions of this thesis are: (i) an initial proof-of-concept automated method for tracking accessibility features through panorama images across time, (ii) a framework for processing and analyzing time-series panoramas at scale, and (iii) a geo-temporal dataset containing different types of accessibility features for the detection task.
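The core temporal comparison can be illustrated with a toy sketch. The functions below are illustrative assumptions, not the thesis pipeline: they score change between two co-located, already-aligned grayscale patches (nested lists of intensities in 0–255) by mean absolute pixel difference, with a hand-picked threshold.

```python
def change_score(patch_t0, patch_t1):
    """Mean absolute pixel difference between two equal-sized grayscale
    patches, each given as a nested list of intensities in [0, 255]."""
    total, count = 0, 0
    for row_a, row_b in zip(patch_t0, patch_t1):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count

def has_changed(patch_t0, patch_t1, threshold=20.0):
    """Flag a patch pair as 'changed' when the mean difference exceeds a
    threshold (an illustrative choice, not the method used in the thesis)."""
    return change_score(patch_t0, patch_t1) > threshold
```

In practice, the challenges the abstract names (illumination variation and camera-pose differences between capture dates) mean panoramas must be registered and photometrically normalized before any such differencing is meaningful.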