IEEE Transactions on Circuits and Systems for Video Technology
- Vol: 22 Issue: 1
- Vol: 22 Issue: 2
- Vol: 22 Issue: 3
- Vol: 22 Issue: 4
- Vol: 22 Issue: 5
- Vol: 22 Issue: 6
- Vol: 22 Issue: 7
- Vol: 22 Issue: 8
- Vol: 22 Issue: 9
- Vol: 22 Issue: 10
- Vol: 22 Issue: 11
- Vol: 22 Issue: 12
- Vol: 21 Issue: 1
- Vol: 21 Issue: 2
- Vol: 21 Issue: 3
- Vol: 21 Issue: 4
- Vol: 21 Issue: 5
- Vol: 21 Issue: 6
- Vol: 21 Issue: 7
- Vol: 21 Issue: 8
- Vol: 21 Issue: 9
- Vol: 21 Issue: 10
- Vol: 21 Issue: 11
- Vol: 21 Issue: 12
- Vol: 20 Issue: 1
- Vol: 20 Issue: 2
- Vol: 20 Issue: 3
- Vol: 20 Issue: 4
- Vol: 20 Issue: 5
- Vol: 20 Issue: 6
- Vol: 20 Issue: 7
- Vol: 20 Issue: 8
- Vol: 20 Issue: 9
- Vol: 20 Issue: 10
- Vol: 20 Issue: 11
- Vol: 20 Issue: 12
- Vol: 19 Issue: 1
- Vol: 19 Issue: 2
- Vol: 19 Issue: 3
- Vol: 19 Issue: 4
- Vol: 19 Issue: 5
- Vol: 19 Issue: 6
- Vol: 19 Issue: 7
- Vol: 19 Issue: 8
- Vol: 19 Issue: 9
- Vol: 19 Issue: 10
- Vol: 19 Issue: 11
- Vol: 19 Issue: 12
- Vol: 8 Issue: 1
- Vol: 8 Issue: 2
- Vol: 8 Issue: 3
- Vol: 8 Issue: 4
- Vol: 8 Issue: 5
- Vol: 8 Issue: 6
- Vol: 8 Issue: 7
- Vol: 8 Issue: 8
- Vol: 18 Issue: 1
- Vol: 18 Issue: 2
- Vol: 18 Issue: 3
- Vol: 18 Issue: 4
- Vol: 18 Issue: 5
- Vol: 18 Issue: 6
- Vol: 18 Issue: 7
- Vol: 18 Issue: 8
- Vol: 18 Issue: 9
- Vol: 18 Issue: 10
- Vol: 18 Issue: 11
- Vol: 18 Issue: 12
- Vol: 17 Issue: 1
- Vol: 17 Issue: 2
- Vol: 17 Issue: 3
- Vol: 17 Issue: 4
- Vol: 17 Issue: 5
- Vol: 17 Issue: 6
- Vol: 17 Issue: 7
- Vol: 17 Issue: 8
- Vol: 17 Issue: 9
- Vol: 17 Issue: 10
- Vol: 17 Issue: 11
- Vol: 17 Issue: 12
- Vol: 16 Issue: 1
- Vol: 16 Issue: 2
- Vol: 16 Issue: 3
- Vol: 16 Issue: 4
- Vol: 16 Issue: 5
- Vol: 16 Issue: 6
- Vol: 16 Issue: 7
- Vol: 16 Issue: 8
- Vol: 16 Issue: 9
- Vol: 16 Issue: 10
- Vol: 16 Issue: 11
- Vol: 16 Issue: 12
- Vol: 15 Issue: 1
- Vol: 15 Issue: 2
- Vol: 15 Issue: 3
- Vol: 15 Issue: 4
- Vol: 15 Issue: 5
- Vol: 15 Issue: 6
- Vol: 15 Issue: 7
- Vol: 15 Issue: 8
- Vol: 15 Issue: 9
- Vol: 15 Issue: 10
- Vol: 15 Issue: 11
- Vol: 15 Issue: 12
- Vol: 14 Issue: 1
- Vol: 14 Issue: 2
- Vol: 14 Issue: 3
- Vol: 14 Issue: 4
- Vol: 14 Issue: 5
- Vol: 14 Issue: 6
- Vol: 14 Issue: 7
- Vol: 14 Issue: 8
- Vol: 14 Issue: 9
- Vol: 14 Issue: 10
- Vol: 14 Issue: 11
- Vol: 14 Issue: 12
- Vol: 13 Issue: 1
- Vol: 13 Issue: 2
- Vol: 13 Issue: 3
- Vol: 13 Issue: 4
- Vol: 13 Issue: 5
- Vol: 13 Issue: 6
- Vol: 13 Issue: 7
- Vol: 13 Issue: 8
- Vol: 13 Issue: 9
- Vol: 13 Issue: 10
- Vol: 13 Issue: 11
- Vol: 13 Issue: 12
- Vol: 12 Issue: 1
- Vol: 12 Issue: 2
- Vol: 12 Issue: 3
- Vol: 12 Issue: 4
- Vol: 12 Issue: 5
- Vol: 12 Issue: 6
- Vol: 12 Issue: 7
- Vol: 12 Issue: 8
- Vol: 12 Issue: 9
- Vol: 12 Issue: 10
- Vol: 12 Issue: 11
- Vol: 12 Issue: 12
- Vol: 9 Issue: 1
- Vol: 9 Issue: 2
- Vol: 9 Issue: 3
- Vol: 9 Issue: 4
- Vol: 9 Issue: 5
- Vol: 9 Issue: 6
- Vol: 9 Issue: 7
- Vol: 9 Issue: 8
- Vol: 11 Issue: 1
- Vol: 11 Issue: 2
- Vol: 11 Issue: 3
- Vol: 11 Issue: 4
- Vol: 11 Issue: 5
- Vol: 11 Issue: 6
- Vol: 11 Issue: 7
- Vol: 11 Issue: 8
- Vol: 11 Issue: 9
- Vol: 11 Issue: 10
- Vol: 11 Issue: 11
- Vol: 11 Issue: 12
- Vol: 10 Issue: 1
- Vol: 10 Issue: 2
- Vol: 10 Issue: 3
- Vol: 10 Issue: 4
- Vol: 10 Issue: 5
- Vol: 10 Issue: 6
- Vol: 10 Issue: 7
- Vol: 10 Issue: 8
- Vol: 28 Issue: 1
- Vol: 28 Issue: 2
- Vol: 28 Issue: 3
- Vol: 28 Issue: 4
- Vol: 28 Issue: 5
- Vol: 28 Issue: 6
- Vol: 28 Issue: 7
- Vol: 28 Issue: 8
- Vol: 28 Issue: 9
- Vol: 28 Issue: 10
- Vol: 28 Issue: 11
- Vol: 28 Issue: 12
- Vol: 27 Issue: 1
- Vol: 27 Issue: 2
- Vol: 27 Issue: 3
- Vol: 27 Issue: 4
- Vol: 27 Issue: 5
- Vol: 27 Issue: 6
- Vol: 27 Issue: 7
- Vol: 27 Issue: 8
- Vol: 27 Issue: 9
- Vol: 27 Issue: 10
- Vol: 27 Issue: 11
- Vol: 27 Issue: 12
- Vol: 26 Issue: 1
- Vol: 26 Issue: 2
- Vol: 26 Issue: 3
- Vol: 26 Issue: 4
- Vol: 26 Issue: 5
- Vol: 26 Issue: 6
- Vol: 26 Issue: 7
- Vol: 26 Issue: 8
- Vol: 26 Issue: 9
- Vol: 26 Issue: 10
- Vol: 26 Issue: 11
- Vol: 26 Issue: 12
- Vol: 25 Issue: 1
- Vol: 25 Issue: 2
- Vol: 25 Issue: 3
- Vol: 25 Issue: 4
- Vol: 25 Issue: 5
- Vol: 25 Issue: 6
- Vol: 25 Issue: 7
- Vol: 25 Issue: 8
- Vol: 25 Issue: 9
- Vol: 25 Issue: 10
- Vol: 25 Issue: 11
- Vol: 25 Issue: 12
Volume 26 Issue 12 • Dec. 2016
- IEEE Transactions on Circuits and Systems for Video Technology publication information
  PDF (91 KB)
- Adaptive and Robust Sparse Coding for Laser Range Data Denoising and Inpainting
  Publication Year: 2016, Page(s): 2165-2175. Cited by: Papers (4)
  Sparse coding (SC) is making a significant impact in the computer vision and signal processing communities, achieving state-of-the-art performance in a variety of image applications, e.g., denoising, restoration, and synthesis. We propose an adaptive and robust SC algorithm exploiting the characteristics of typical laser range data and the availability of both range and reflectance data ...
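The abstract above builds on the standard sparse-coding formulation min_a ½‖y − Da‖² + λ‖a‖₁. As a rough, generic illustration (not the authors' adaptive and robust algorithm), the sketch below solves that problem with plain ISTA over a hypothetical random dictionary; `D @ a` then serves as the denoised signal.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_sparse_code(D, y, lam=0.1, n_iter=200):
    """Solve min_a 0.5*||y - D a||^2 + lam*||a||_1 by ISTA.

    D: (m, k) dictionary with unit-norm columns; y: (m,) signal.
    Returns the sparse code a; D @ a is the denoised reconstruction.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)           # gradient of the quadratic term
        a = soft_threshold(a - grad / L, lam / L)
    return a
```

ISTA monotonically decreases the objective, so even this minimal version yields a sparse code whose reconstruction fits the signal better than the zero code.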
- Effective Strip Noise Removal for Low-Textured Infrared Images Based on 1-D Guided Filtering
  Publication Year: 2016, Page(s): 2176-2188. Cited by: Papers (9)
  Infrared images typically contain obvious strip noise. It is a challenging task to eliminate such noise without blurring fine image details in low-textured infrared images. In this paper, we introduce an effective single-image-based algorithm to accurately remove strip-type noise present in infrared images without causing blurring effects. First, a 1-D row guided filter is applied to perform edge-...
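In the spirit of the 1-D row guided filtering described above (this is a generic sketch, not the paper's full pipeline), one can run a self-guided 1-D guided filter along each row and average the residual down the columns to estimate the per-column strip offsets:

```python
import numpy as np

def box1d(x, r):
    """1-D mean filter of radius r along the last axis (edge-padded)."""
    k = 2 * r + 1
    xp = np.pad(x, [(0, 0)] * (x.ndim - 1) + [(r, r)], mode='edge')
    c = np.cumsum(xp, axis=-1)
    c = np.concatenate([np.zeros_like(c[..., :1]), c], axis=-1)
    return (c[..., k:] - c[..., :-k]) / k

def guided_filter_1d(I, p, r=8, eps=0.1):
    """1-D guided filter along image rows (guide I, input p)."""
    mean_I, mean_p = box1d(I, r), box1d(p, r)
    cov = box1d(I * p, r) - mean_I * mean_p
    var = box1d(I * I, r) - mean_I ** 2
    a = cov / (var + eps)                  # small a where variance ~ noise
    b = mean_p - a * mean_I
    return box1d(a, r) * I + box1d(b, r)

def remove_strip_noise(img, r=8, eps=0.1):
    """Estimate column-wise strip offsets as the row-mean of the
    guided-filter residual, then subtract them from the image."""
    smooth = guided_filter_1d(img, img, r, eps)
    residual = img - smooth                # mostly strip noise + fine detail
    strip = residual.mean(axis=0)          # strips are constant down columns
    return img - strip[None, :]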
- Multilevel Modified Finite Radon Transform Network for Image Upsampling
  Publication Year: 2016, Page(s): 2189-2199. Cited by: Papers (5)
  Local line-like features are the most important discriminative information in the image upsampling scenario. In recent example-based upsampling methods, grayscale and gradient features are often adopted to describe the local patches, but these simple features cannot accurately characterize complex patches. In this paper, we present a feature representation of local edges by means of a multilevel fil...
- Real-Time Pose Detection and Tracking of Hundreds of Objects
  Publication Year: 2016, Page(s): 2200-2214. Cited by: Papers (5)
  We propose a novel model-based method for tracking the 6-DOF pose of a very large number of rigid objects in real time. By combining dense motion and depth cues with sparse keypoint correspondences, and by feeding back information from the modeled scene to the cue extraction process, the method is both highly accurate and robust to noise and occlusions. A tight integration of the graphical and com...
- Using Discriminative Motion Context for Online Visual Object Tracking
  Publication Year: 2016, Page(s): 2215-2225. Cited by: Papers (6)
  In this paper, we propose an algorithm for online, real-time tracking of arbitrary objects in videos from unconstrained environments. The method is based on a particle filter framework using different visual features and motion prediction models. We effectively integrate a discriminative online learning classifier into the model and propose a new method to collect negative training examples for up...
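The particle filter framework underlying this tracker can be illustrated with a minimal bootstrap filter for a 1-D target (a textbook sketch with a constant-velocity motion model and Gaussian likelihood; the paper's tracker uses far richer features and prediction models):

```python
import numpy as np

def particle_filter(measurements, n=500, q=0.5, r_obs=1.0, seed=0):
    """Minimal bootstrap particle filter tracking a 1-D target.

    State per particle: [position, velocity]; q is the process-noise std,
    r_obs the measurement-noise std. Returns posterior-mean positions.
    """
    z_all = np.asarray(measurements, dtype=float)
    rng = np.random.default_rng(seed)
    parts = np.zeros((n, 2))
    parts[:, 0] = z_all[0] + rng.normal(0.0, r_obs, n)  # init near first obs
    est = []
    for z in z_all:
        # predict: constant-velocity motion with Gaussian jitter
        parts[:, 0] += parts[:, 1] + rng.normal(0.0, q, n)
        parts[:, 1] += rng.normal(0.0, q, n)
        # update: weight particles by the Gaussian measurement likelihood
        w = np.exp(-0.5 * ((z - parts[:, 0]) / r_obs) ** 2) + 1e-300
        w /= w.sum()
        est.append(float(parts[:, 0] @ w))
        # systematic resampling to avoid weight degeneracy
        u = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(w), u), n - 1)
        parts = parts[idx]
    return np.array(est)
```

Real trackers replace the Gaussian likelihood with scores from visual features (and, as in this paper, a discriminative classifier), but the predict/weight/resample loop is the same.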
- Multiperson Tracking by Online Learned Grouping Model With Nonlinear Motion Context
  Publication Year: 2016, Page(s): 2226-2239. Cited by: Papers (3)
  An online approach to learn elementary groups containing only two targets, i.e., pedestrians, for inferring high-level context is introduced to improve multiperson tracking. In most existing data association-based tracking approaches, only low-level information (e.g., time, appearance, and motion) is used to build the affinity model, and each target is considered as an independent agent. Unlike th...
- A Consensus Model for Motion Segmentation in Dynamic Scenes
  Publication Year: 2016, Page(s): 2240-2249
  The study of phenomena segmentation in natural scenes has attracted growing attention and is a popular research topic. While there are many studies detailing algorithms for motion segmentation in dynamic scenes, an important question arising from these studies is how to combine these algorithms. How can the label correspondence problem be resolved? Answering this question is difficult, because the...
- Action Recognition by Time Series of Retinotopic Appearance and Motion Features
  Publication Year: 2016, Page(s): 2250-2263. Cited by: Papers (9)
  We present a method for recognizing and localizing actions in video by the sequence of changing appearance and motion of the participants. Appearance is modeled by histogram of oriented gradients object detectors, while motion is modeled by optical-flow motion-pattern detectors. Sequencing is modeled by a hidden Markov model (HMM) whose output models are these appearance and motion detectors. The ...
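The HMM sequencing step described above can be illustrated with the standard log-domain forward algorithm (a textbook technique; the per-state observation scores would come from the paper's appearance and motion detectors, here they are just an input array):

```python
import numpy as np

def hmm_forward(log_A, log_pi, log_obs):
    """Log-domain forward algorithm: returns log P(observation sequence).

    log_A: (S, S) log transition matrix, log_pi: (S,) log initial probs,
    log_obs: (T, S) per-frame log-likelihoods under each state's model.
    """
    def lse(x):  # numerically stable logsumexp along the last axis
        m = x.max(axis=-1, keepdims=True)
        return (m + np.log(np.exp(x - m).sum(axis=-1, keepdims=True))).squeeze(-1)

    alpha = log_pi + log_obs[0]                      # alpha_0(j)
    for t in range(1, len(log_obs)):
        # alpha_t(j) = logsum_i(alpha_{t-1}(i) + log_A[i, j]) + log_obs[t, j]
        alpha = lse((alpha[:, None] + log_A).T) + log_obs[t]
    return lse(alpha)
```

Working in the log domain avoids the underflow that plagues long sequences when probabilities are multiplied directly.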
- Visual Focus of Attention Estimation With Unsupervised Incremental Learning
  Publication Year: 2016, Page(s): 2264-2272. Cited by: Papers (5)
  In this paper, we propose a new method for estimating the visual focus of attention (VFOA) in a video stream captured by a single distant camera and showing several persons sitting around a table, as in formal meeting or video conferencing settings. The visual targets for a given person are automatically extracted online using an unsupervised algorithm that incrementally learns the different app...
- Multi-loss Regularized Deep Neural Network
  Publication Year: 2016, Page(s): 2273-2283. Cited by: Papers (17)
  A proper strategy to alleviate overfitting is critical to a deep neural network (DNN). In this paper, we introduce cross-loss-function regularization for boosting the generalization capability of the DNN, which results in the multi-loss regularized DNN (ML-DNN) framework. For a particular learning task, e.g., image classification, only a single loss function is used in all previous DNNs, and ...
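The core idea, combining more than one loss function on the same network outputs, can be sketched as a weighted sum of softmax cross-entropy and a multiclass hinge loss (an illustrative pairing and weighting, not necessarily the authors' exact configuration):

```python
import numpy as np

def softmax_xent(logits, y):
    """Mean softmax cross-entropy over a batch. logits: (B, C), y: (B,) ints."""
    z = logits - logits.max(axis=1, keepdims=True)   # for numerical stability
    logp = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

def multiclass_hinge(logits, y, margin=1.0):
    """Crammer-Singer-style multiclass hinge loss on the same logits."""
    correct = logits[np.arange(len(y)), y]
    margins = np.maximum(0.0, margin + logits - correct[:, None])
    margins[np.arange(len(y)), y] = 0.0              # ignore the true class
    return margins.max(axis=1).mean()

def multi_loss(logits, y, weights=(1.0, 0.5)):
    """Weighted sum of two loss functions acting on one set of outputs,
    in the spirit of cross-loss-function regularization."""
    return weights[0] * softmax_xent(logits, y) + weights[1] * multiclass_hinge(logits, y)
```

Because the two losses penalize errors differently (probabilistic fit vs. margin violation), their gradients do not coincide, which is what gives the combination its regularizing effect.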
- Improving QoE and Fairness in HTTP Adaptive Streaming Over LTE Network
  Sergio Cicalò; Nesrine Changuel; Velio Tralli; Bessem Sayadi; Frédéric Faucheux; Sylvaine Kerboeuf
  Publication Year: 2016, Page(s): 2284-2298. Cited by: Papers (13)
  HTTP adaptive streaming (HAS) has emerged as the main technology for video streaming applications. Multiple HAS video clients sharing the same wireless channel may experience different video qualities as well as different play-out buffer levels, as a result of both different video content complexities and different channel conditions. This causes unfairness in the end-user quality of experience. I...
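For context, the client-side adaptation that such work builds on can be as simple as a buffer-based bitrate selector (a generic heuristic with an illustrative bitrate ladder and thresholds; the paper itself addresses the harder multi-client QoE-fairness problem over LTE):

```python
def select_bitrate(buffer_s, ladder=(300, 750, 1500, 3000, 6000),
                   reservoir=5.0, cushion=20.0):
    """Buffer-based HAS rate selection (kbit/s): below the reservoir pick
    the lowest rung, above reservoir + cushion the highest, and step
    linearly through the ladder in between."""
    if buffer_s <= reservoir:
        return ladder[0]
    if buffer_s >= reservoir + cushion:
        return ladder[-1]
    frac = (buffer_s - reservoir) / cushion          # position in the cushion
    return ladder[min(int(frac * len(ladder)), len(ladder) - 1)]
```

Selecting rate from buffer occupancy rather than estimated throughput sidesteps noisy bandwidth measurements, but, as the abstract notes, independent clients adapting this way over a shared channel can still converge to unfair quality allocations.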
- Cast2Face: Assigning Character Names Onto Faces in Movie With Actor-Character Correspondence
  Publication Year: 2016, Page(s): 2299-2312. Cited by: Papers (2)
  Automatically identifying characters in movies has attracted researchers' interest and led to several significant and interesting applications. However, due to the vast variation in character appearance as well as the weakness and ambiguity of available annotation, it is still a challenging problem. In this paper, we investigate this problem with the supervision of actor-character name corresponde...
- Retrieval in Long-Surveillance Videos Using User-Described Motion and Object Attributes
  Publication Year: 2016, Page(s): 2313-2327. Cited by: Papers (2)
  We present a content-based retrieval method for long-surveillance videos in wide-area (airborne) and near-field [closed-circuit television (CCTV)] imagery. Our goal is to retrieve video segments, with a focus on detecting objects moving on routes, that match user-defined events of interest. The sheer size and remote locations where surveillance videos are acquired necessitate highly compressed re...
- A Vision Processor With a Unified Interest-Point Detection and Matching Hardware for Accelerating a Stereo-Matching Algorithm
  Publication Year: 2016, Page(s): 2328-2343
  In this paper, a unified interest-point detection and matching hardware with an optimized memory architecture is proposed for a real-time stereo-matching system. In order to support a stereo-matching algorithm, the unified datapath in the hardware performs not only interest-point detection and matching algorithms such as features from the accelerated segment test (FAST) and binary robust independe...
- 2016 Index, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 26
  PDF (184 KB)
- IEEE Transactions on Circuits and Systems for Video Technology information for authors
  PDF (267 KB)
Aims & Scope
IEEE Transactions on Circuits and Systems for Video Technology (TCSVT) covers the circuits and systems aspects of all video technologies. General, theoretical, and application-oriented papers with a circuits and systems perspective are encouraged for publication in TCSVT on or related to image/video acquisition, representation, presentation and display; processing, filtering and transforms; analysis and synthesis; learning and understanding; compression, transmission, communication and networking; storage, retrieval, indexing and search; and/or hardware and software design and implementation.
Meet Our Editors
Editor-in-Chief
Shipeng Li
iFLYTEK Co. Ltd.
No. 666 West Wangjiang Road
Hi-Tech Zone, Hefei, China 230088
Peer Review Support Services
Desiree Noel
IEEE Publishing Operations
d.noel@ieee.org
732-562-2644