Networked Drone Cameras
A network of drone cameras can be deployed to cover live events, such as high-action sports games played on a large field, but managing networked drone cameras in real time is challenging. Distributed approaches yield suboptimal solutions due to the lack of coordination, while coordination through a centralized controller incurs round-trip latencies of several hundred milliseconds over a wireless channel. We propose a fog-networking-based system architecture that automatically coordinates a network of camera-equipped drones to capture and broadcast the dynamically changing scenes of interest in a sports game. We design both optimal and practical algorithms to balance the tradeoff between two metrics: coverage of the most important scenes and streamed video bitrate. To compensate for network round-trip latencies, the centralized controller predicts which locations the drones should cover next. The controller maximizes video bitrate by associating each drone with an optimally matched server and dynamically re-assigns drones as relay nodes to boost throughput in low-throughput scenarios. This dynamic assignment at the centralized controller occurs at the slower time scale permitted by round-trip latencies, while the predictive approach and the drones' local decisions ensure that the system works in real time.
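To make the drone-to-server association concrete, the following is a minimal sketch (not the paper's actual algorithm) of an optimal matching step: given estimated per-link throughputs, it exhaustively assigns each drone to a distinct server so that total uplink throughput is maximized. The function name, the throughput matrix, and the brute-force search are all illustrative assumptions; a real controller would use a polynomial-time assignment solver and refresh the estimates at the slower control time scale.

```python
from itertools import permutations

def best_assignment(throughput):
    """Match each drone to a distinct server to maximize total uplink
    throughput. Exhaustive search: optimal, but only practical for
    small fleets (hypothetical sketch, not the paper's algorithm).

    throughput[i][j] = estimated Mbps from drone i to server j.
    Returns (best_total, mapping) where mapping[i] is drone i's server.
    """
    n = len(throughput)
    best_total, best_map = -1.0, None
    for perm in permutations(range(n)):
        total = sum(throughput[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_map = total, list(perm)
    return best_total, best_map

# Hypothetical throughput estimates (Mbps) for 3 drones x 3 servers.
tp = [[20, 5, 9],
      [4, 18, 7],
      [10, 6, 15]]
total, assign = best_assignment(tp)
# assign -> [0, 1, 2], total -> 53 Mbps
```

A drone whose best achievable link falls below a threshold could, under the same estimates, be pulled out of the coverage set and re-assigned as a relay for a neighbor, which is the dynamic re-assignment the controller performs.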
Further, we leverage adaptive video streaming to compensate for aerial wireless channel variations and to avoid video jitter. Consumer UAVs today use fixed-bitrate video streaming, where users configure the resolution (4K or 1080p). However, real-time streaming applications that deploy UAVs in the wild will require adaptive video streaming to tackle uncertain wireless link capacities and meet their video quality requirements. To this end, we design adaptive video streaming algorithms that provide significant gains for UAV streaming. Our system, SkyEyes, leverages two novel techniques to aid UAV streaming: content-based compression and video rate adaptation based on location sensors and client buffer status.
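The rate-adaptation idea can be sketched as a simple hybrid rule that scales a throughput estimate by the client's buffer health and then picks a rung from the encoding ladder. Everything here (the function name, the ladder, the buffer thresholds, and the scaling factors) is an illustrative assumption, not SkyEyes' actual policy; in the full system the throughput estimate itself would additionally be refined using the drone's location sensors to anticipate channel changes.

```python
def select_bitrate(buffer_s, est_throughput_mbps,
                   ladder=(1.0, 2.5, 5.0, 8.0, 20.0),
                   low_buf=2.0, high_buf=8.0):
    """Pick the next segment bitrate (Mbps) from the encoding ladder.

    Hypothetical buffer-aware rule: stream conservatively when the
    client buffer is low, trust (and slightly probe above) the
    throughput estimate when the buffer has headroom.
    """
    if buffer_s < low_buf:
        safe = est_throughput_mbps * 0.5   # rebuffer risk: back off
    elif buffer_s > high_buf:
        safe = est_throughput_mbps * 1.1   # headroom: probe upward
    else:
        safe = est_throughput_mbps * 0.8
    # Highest rung not exceeding the safe rate; fall back to the lowest.
    feasible = [r for r in ladder if r <= safe]
    return max(feasible) if feasible else ladder[0]

# 15 Mbps estimate with a healthy 10 s buffer -> 8.0 Mbps rung;
# the same estimate with a 1 s buffer -> 5.0 Mbps rung.
```

The asymmetry between the low- and high-buffer branches is what damps jitter: quality drops quickly when a stall is imminent but climbs back only gradually.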
Experimental results over tens of field flights show that our system performs well: for example, by optimizing between coverage and throughput, 8 drones achieve 94% coverage while supporting (on average) 2K video at 20 Mbps. By dynamically allocating drones to cover the game or act as relays, our system also demonstrates a 2x gain over systems that maximize static coverage alone and achieve only 9 Mbps video throughput.