
Robust Object Tracking with Online Selection of Salient Sub-regions

DOI: 10.13195/j.kzyjc.2013.1038, pp. 1788-1792

Keywords: robust object tracking, local model, salient region, particle filter


Abstract:

To address robust visual object tracking in complex environments, a tracking method that selects salient target sub-regions online is proposed, inspired by the human visual system's ability to attend selectively to salient regions. Discriminative sub-regions are extracted according to their center-surround differences and their differences relative to the background; the temporal consistency of each sub-region is then analyzed through its tracking error, so that only stable salient sub-regions are retained; finally, the target position is estimated from the spatial relations between the local sub-regions and the target as a whole. Experimental results show that dynamically selecting salient target sub-regions improves robustness to partial occlusion and to interference from similar-looking background.
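The abstract outlines a three-step pipeline: saliency scoring of candidate sub-regions, temporal-consistency filtering based on tracking error, and target-position estimation from local-to-global spatial relations. The Python sketch below only illustrates that pipeline as described; every function name, data structure, and threshold (saliency_score, select_stable_subregions, estimate_target_center, max_error, top_k) is a hypothetical choice made here, not the authors' implementation.

    # Minimal sketch of the pipeline described in the abstract.
    # All names and thresholds are hypothetical, not from the paper.
    import numpy as np

    def saliency_score(patch, surround, background):
        # Score a sub-region by its center-surround contrast plus its
        # contrast against the background; patches are grayscale arrays.
        center_surround = abs(patch.mean() - surround.mean())
        vs_background = abs(patch.mean() - background.mean())
        return center_surround + vs_background

    def select_stable_subregions(subregions, tracking_errors, max_error=5.0, top_k=4):
        # Keep sub-regions that are both salient and temporally consistent.
        # 'subregions' is a list of dicts with keys 'score' (from
        # saliency_score) and 'offset' (displacement from the target center);
        # 'tracking_errors' holds each sub-region's drift over recent frames.
        stable = [r for r, e in zip(subregions, tracking_errors) if e < max_error]
        stable.sort(key=lambda r: r["score"], reverse=True)
        return stable[:top_k]

    def estimate_target_center(stable_subregions, matched_positions):
        # Each selected sub-region votes for the target center using its
        # stored offset and its matched position in the current frame.
        votes = [np.asarray(pos) - np.asarray(r["offset"])
                 for r, pos in zip(stable_subregions, matched_positions)]
        return np.mean(votes, axis=0)

In the paper's framework, the keywords suggest the estimated position would feed a particle-filter stage, for example as the observation used to weight candidate particles; the sketch stops at the voting step and leaves that integration open.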

