32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), California, United States, June 16-20, 2019, pp. 563-572
Object tracking remains a fundamental problem in computer vision, as it becomes difficult under realistic conditions such as fast camera movement, occlusion, and the similarity of other objects to the tracked target. As a real-world application, tracking objects using cameras mounted on unmanned aerial vehicles (UAVs) has become very popular. With the increasing availability of small single-board computers with strong parallel processing capabilities, real-time object tracking on computers carried onboard UAVs has become feasible. Although these onboard computers allow a wide variety of computer vision methods to be executed on a UAV, these methods still need to be optimized for running time and power consumption. In this paper, we propose a hybrid method for a UAV to detect and track other UAVs efficiently. To detect the target UAV at the beginning of the video, and to re-detect it whenever the tracked UAV is lost, we use the deep learning-based YOLOv3 and YOLOv3-Tiny models, which offer one of the best trade-offs between speed and accuracy in the literature. To track the detected UAVs in real time, a kernelized correlation filter is used. Combining these two methods provides high accuracy and speed even on onboard computers. To train the neural networks and test our method, we have collected a new dataset composed of videos of various UAVs in flight, captured from another UAV. The performance of the proposed method has been compared with other state-of-the-art methods in the literature on this dataset. Additionally, we also tested the proposed trackers on aerial videos captured from UAVs. Experimental results show that the proposed hybrid trackers achieve state-of-the-art performance on all tested datasets. The code is available at https://github.com/bdrhn9/hybrid-tracker.
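The hybrid scheme described above (a slow detector that initializes a fast tracker and is re-invoked only when tracking fails) can be sketched as a simple control loop. The sketch below is an illustration of that loop only, not the paper's implementation: `StubDetector` and `StubTracker` are hypothetical stand-ins for YOLOv3/YOLOv3-Tiny and the kernelized correlation filter, and frames are represented as plain dictionaries rather than images.

```python
# Minimal sketch of the hybrid detect-then-track loop, assuming stand-in
# components: a "detector" (slow, run only on the first frame or after a
# tracking failure) and a "tracker" (fast, run on every other frame).
from dataclasses import dataclass
from typing import Optional, Tuple

Box = Tuple[int, int, int, int]  # bounding box as (x, y, w, h)


@dataclass
class StubDetector:
    """Stand-in for YOLOv3/YOLOv3-Tiny: can locate the target anywhere."""

    def detect(self, frame: dict) -> Optional[Box]:
        # For illustration, the frame dict directly carries the target box.
        return frame.get("uav")


@dataclass
class StubTracker:
    """Stand-in for the kernelized correlation filter: fast, but may fail."""

    box: Optional[Box] = None

    def init(self, frame: dict, box: Box) -> None:
        self.box = box

    def update(self, frame: dict) -> Optional[Box]:
        # A real KCF signals failure via a low filter response peak; here we
        # simply fail whenever the target is absent from the frame.
        self.box = frame.get("uav")
        return self.box


def hybrid_track(frames, detector, tracker):
    """Yield one box (or None) per frame, re-detecting on tracker failure."""
    tracking = False
    for frame in frames:
        box = tracker.update(frame) if tracking else None
        if box is None:  # first frame, or the tracker lost the target
            box = detector.detect(frame)
            if box is not None:
                tracker.init(frame, box)
            tracking = box is not None
        yield box
```

The design point this illustrates is that the expensive detector runs only on a small fraction of frames, so the per-frame cost is dominated by the lightweight correlation-filter update, which is what makes onboard real-time operation feasible.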