Large Scale View Planning

Abstract

View planning for visual coverage is a fundamental robotics problem. Coverage of small objects (e.g., for inspection) or small-scale indoor scenes has been studied extensively. However, view planning to cover a large-scale urban environment remains challenging: algorithms that scale to environments of this size while providing performance guarantees are missing.

In this paper, we model urban environments as a set of surfaces with $k$ distinct surface normals whose viewing cones must be visited by a robot. We formulate the resulting coverage problem as a novel variant of the Traveling Salesman Problem with Neighborhoods (TSPN), in which the neighborhoods are cones that constrain the coverage quality of the path. We present a polynomial-time algorithm which admits an approximation factor of $O\left(\frac{k}{\tan(\alpha)}\max\{L_B, W_B, H_B\}\right)$, where $\alpha$ is the maximum viewing angle, and $L_B$, $W_B$, and $H_B$ are, respectively, the length, width, and height of a minimum enclosing box of the city scene $B$. In addition to the analytical upper bounds, we show in simulations that our method outperforms three baseline methods in both trajectory length and run-time. We also demonstrate our method on a city containing more than 70 buildings in photo-realistic rendering software and evaluate the resulting coverage quality.
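To illustrate how the bound scales with the scene parameters, the following minimal Python sketch evaluates the expression inside the $O(\cdot)$ term. The function name and the numerical values are hypothetical and serve only to show the dependence on $k$, $\alpha$, and the enclosing-box dimensions; they are not part of the algorithm itself.

```python
import math

def approx_factor_bound(k, alpha, box_dims):
    """Evaluate k / tan(alpha) * max(L_B, W_B, H_B), the quantity inside the O(.) bound.

    k        : number of distinct surface normals in the scene model
    alpha    : maximum viewing angle in radians, assumed in (0, pi/2)
    box_dims : (L_B, W_B, H_B), dimensions of the minimum enclosing box of the scene
    """
    return k / math.tan(alpha) * max(box_dims)

# Illustrative values only: 4 dominant facade orientations, a 30-degree
# maximum viewing angle, and a 500 m x 400 m x 120 m enclosing box.
print(approx_factor_bound(4, math.radians(30), (500.0, 400.0, 120.0)))
```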