# IEEE Transactions on Pattern Analysis and Machine Intelligence


• ### [Front Cover]

Publication Year: 2015, Page(s): C1
• ### IEEE Transactions on Pattern Analysis and Machine Intelligence Editorial Board

Publication Year: 2015, Page(s): C2
• ### Guest Editors’ Introduction to the Special Issue on Bayesian Nonparametrics

Publication Year: 2015, Page(s):209 - 211
• ### Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?

Publication Year: 2015, Page(s):212 - 229
Cited by:  Papers (13)

Discrete random probability measures and the exchangeable random partitions they induce are key tools for addressing a variety of estimation and prediction problems in Bayesian inference. Here we focus on the family of Gibbs-type priors, an elegant recent generalization of the Dirichlet and Pitman-Yor process priors. These random probability measures share properties that are appe...
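
The exchangeable random partitions these priors induce have a simple sequential description. Below is a minimal sketch of the two-parameter (Pitman-Yor) Chinese restaurant process, a canonical Gibbs-type example; it is not code from the paper, and the function name and default parameters are our own (setting `d=0` recovers the Dirichlet process):

```python
import random

def pitman_yor_crp(n, theta=1.0, d=0.5, seed=0):
    """Sample a random partition of n items from the two-parameter
    (Pitman-Yor) Chinese restaurant process. Requires 0 <= d < 1 and
    theta > -d; d = 0 gives the Dirichlet process partition."""
    rng = random.Random(seed)
    counts = []  # counts[k] = number of items in block k
    for i in range(n):
        # existing block k has weight counts[k] - d;
        # a new block has weight theta + d * (number of blocks)
        weights = [c - d for c in counts] + [theta + d * len(counts)]
        r = rng.random() * sum(weights)
        acc, k = 0.0, 0
        for k, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if k == len(counts):
            counts.append(1)   # open a new block
        else:
            counts[k] += 1
    return counts

partition = pitman_yor_crp(100)
```

With `d > 0` the number of blocks grows like a power law in `n`, one of the properties that distinguishes Pitman-Yor (and Gibbs-type) priors from the Dirichlet process.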

• ### Differential Topic Models

Publication Year: 2015, Page(s):230 - 242
Cited by:  Papers (1)

In many applications we want to compare different document collections: they may have shared content but also aspects unique to particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For thi...

• ### The Supervised Hierarchical Dirichlet Process

Publication Year: 2015, Page(s):243 - 255
Cited by:  Papers (6)

We propose the supervised hierarchical Dirichlet process (sHDP), a nonparametric generative model for the joint distribution of a group of observations and a response variable directly associated with that whole group. We compare the sHDP with another leading method for regression on grouped data, the supervised latent Dirichlet allocation (sLDA) model. We evaluate our method on two real-world cla...
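
A building block of any HDP-based model is a draw from a Dirichlet process, which can be sketched via the truncated Sethuraman stick-breaking construction. This is an illustrative approximation, not code from the paper; the truncation handling is our own:

```python
import numpy as np

def stick_breaking(alpha, truncation, seed=0):
    """Truncated stick-breaking weights of a Dirichlet process draw:
    beta_k ~ Beta(1, alpha), pi_k = beta_k * prod_{j<k} (1 - beta_j)."""
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=truncation)
    # remaining[k] = length of the stick left before break k
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    pi = betas * remaining
    pi[-1] = 1.0 - pi[:-1].sum()  # fold the leftover stick into the last atom
    return pi

pi = stick_breaking(alpha=2.0, truncation=200)
```

In an HDP, each group's DP shares a discrete base measure that is itself a DP draw, which is what lets groups share atoms (topics).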

• ### Nested Hierarchical Dirichlet Processes

Publication Year: 2015, Page(s):256 - 270
Cited by:  Papers (17)

We develop a nested hierarchical Dirichlet process (nHDP) for hierarchical topic modeling. The nHDP generalizes the nested Chinese restaurant process (nCRP) to allow each word to follow its own path to a topic node according to a per-document distribution over the paths on a shared tree. This alleviates the rigid, single-path formulation assumed by the nCRP, allowing documents to easily express co...

• ### Pitman Yor Diffusion Trees for Bayesian Hierarchical Clustering

Publication Year: 2015, Page(s):271 - 289
Cited by:  Papers (1)

In this paper we introduce the Pitman Yor Diffusion Tree (PYDT), a Bayesian non-parametric prior over tree structures which generalises the Dirichlet Diffusion Tree [30] and removes the restriction to binary branching structure. The generative process is described and shown to result in an exchangeable distribution over data points. We prove some theoretica...

• ### Combinatorial Clustering and the Beta Negative Binomial Process

Publication Year: 2015, Page(s):290 - 306
Cited by:  Papers (6)

We develop a Bayesian nonparametric approach to a general family of latent class problems in which individuals can belong simultaneously to multiple classes and where each class can be exhibited multiple times by an individual. We introduce a combinatorial stochastic process known as the negative binomial process (${\rm NBP}$)...

• ### Negative Binomial Process Count and Mixture Modeling

Publication Year: 2015, Page(s):307 - 320
Cited by:  Papers (23)

The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed ...
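
The gamma-Poisson mixture underlying the NB marginal can be checked numerically. This is a hedged sketch with arbitrary parameter choices, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
r, p = 5.0, 0.3  # NB shape and success probability

# Gamma-mixed Poisson: lambda ~ Gamma(r, scale = p/(1-p)),
# n | lambda ~ Poisson(lambda). Marginally, n ~ NegBin(r, p).
lam = rng.gamma(shape=r, scale=p / (1.0 - p), size=200_000)
n = rng.poisson(lam)

mean_theory = r * p / (1.0 - p)  # NB mean under this parameterization
```

The sample mean of `n` should match `mean_theory` closely, illustrating why marginalizing the gamma rate yields negative binomial counts.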

• ### Latent IBP Compound Dirichlet Allocation

Publication Year: 2015, Page(s):321 - 333
Cited by:  Papers (1)

We introduce the four-parameter IBP compound Dirichlet process (ICDP), a stochastic process that generates sparse non-negative vectors with potentially an unbounded number of entries. If we repeatedly sample from the ICDP we can generate sparse matrices with an infinite number of columns and power-law characteristics. We apply the four-parameter ICDP to sparse nonparametric topic modelling to acco...

• ### Distance Dependent Infinite Latent Feature Models

Publication Year: 2015, Page(s):334 - 345
Cited by:  Papers (2)

Latent feature models are widely used to decompose data into a small number of components. Bayesian nonparametric variants of these models, which use the Indian buffet process (IBP) as a prior over latent features, allow the number of features to be determined from the data. We present a generalization of the IBP, the distance dependent Indian buffet process (dd-IBP), for modeling...
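
The standard (exchangeable) Indian buffet process that the dd-IBP generalizes can be sampled with the usual culinary scheme. A minimal sketch; the function name and parameters are our own:

```python
import numpy as np

def indian_buffet(n, alpha, seed=0):
    """Sample a binary feature matrix Z from the IBP: customer i takes
    each existing dish k with probability m_k / i (m_k = prior takers),
    then tries Poisson(alpha / i) brand-new dishes."""
    rng = np.random.default_rng(seed)
    counts = []  # counts[k] = customers who have taken dish k so far
    rows = []
    for i in range(1, n + 1):
        row = [1 if rng.random() < c / i else 0 for c in counts]
        for k, z in enumerate(row):
            counts[k] += z
        new = rng.poisson(alpha / i)       # brand-new dishes for customer i
        row.extend([1] * new)
        counts.extend([1] * new)
        rows.append(row)
    Z = np.zeros((n, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

Z = indian_buffet(50, 3.0)
```

The expected number of columns grows like `alpha * H_n` (the harmonic sum), so the feature count adapts to the data size.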

• ### A Bayesian Nonparametric Approach to Image Super-Resolution

Publication Year: 2015, Page(s):346 - 358
Cited by:  Papers (20)

Super-resolution methods form high-resolution images from low-resolution images. In this paper, we develop a new Bayesian nonparametric model for super-resolution. Our method uses a beta-Bernoulli process to learn a set of recurring visual patterns, called dictionary elements, from the data. Because it is nonparametric, the number of elements found is also determined from the data. We test the res...
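
The beta-Bernoulli prior over which dictionary elements each data point uses is often handled through a finite approximation. A hedged sketch (truncation level, mass parameter, and variable names are our own, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, alpha = 50, 200, 5.0  # truncation level, data points, mass parameter

# Finite approximation to the beta process: pi_k ~ Beta(alpha/K, 1).
# As K -> infinity this converges to the beta process, and each point
# uses about alpha dictionary elements in expectation.
pi = rng.beta(alpha / K, 1.0, size=K)

# Z[n, k] = True if data point n uses dictionary element k
Z = rng.random((N, K)) < pi
```

Elements with tiny `pi_k` are effectively pruned, which is how the model lets the data determine the number of active dictionary elements.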

• ### A Survey of Non-Exchangeable Priors for Bayesian Nonparametric Models

Publication Year: 2015, Page(s):359 - 371
Cited by:  Papers (4)

Dependent nonparametric processes extend distributions over measures, such as the Dirichlet process and the beta process, to give distributions over collections of measures, typically indexed by values in some covariate space. Such models are appropriate priors when exchangeability assumptions do not hold, and instead we want our model to vary fluidly with some set of covariates. Since the concept...

• ### Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model

Publication Year: 2015, Page(s):372 - 382
Cited by:  Papers (1)

We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prio...

• ### Fast Nonparametric Clustering of Structured Time-Series

Publication Year: 2015, Page(s):383 - 393
Cited by:  Papers (7)

In this publication, we combine two Bayesian nonparametric models: the Gaussian Process (GP) and the Dirichlet Process (DP). Our innovation in the GP model is to introduce a variation on the GP prior which enables us to model structured time-series data, i.e., data containing groups where we wish to model inter- and intra-group variability. Our innovation in the DP model is an imp...
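
The GP component rests on the textbook posterior-mean formula for GP regression. A generic sketch with an RBF kernel (not the paper's structured variant; hyperparameters are arbitrary):

```python
import numpy as np

def gp_posterior_mean(X, y, Xstar, lengthscale=0.2, noise=0.1):
    """Posterior mean of GP regression with an RBF kernel:
    m(X*) = K(X*, X) [K(X, X) + noise^2 I]^{-1} y."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)
    K = rbf(X, X) + noise ** 2 * np.eye(len(X))
    return rbf(Xstar, X) @ np.linalg.solve(K, y)

X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0])
m = gp_posterior_mean(X, y, X)  # should track the noiseless sine closely
```

Clustering the series under a DP then amounts to mixing over such GP regressions, with the DP deciding how many clusters the data support.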

• ### Bayesian Nonparametric Methods for Partially-Observable Reinforcement Learning

Publication Year: 2015, Page(s):394 - 407
Cited by:  Papers (3)

Making intelligent decisions from incomplete information is critical in many applications: for example, robots must choose actions based on imperfect sensors, and speech-based interfaces must infer a user’s needs from noisy microphone inputs. What makes these tasks hard is that often we do not have a natural representation with which to model the domain and use for choosing actions; we must...

• ### Gaussian Processes for Data-Efficient Learning in Robotics and Control

Publication Year: 2015, Page(s):408 - 423
Cited by:  Papers (43)

Autonomous learning has been a promising direction in control and robotics for more than a decade, since data-driven learning makes it possible to reduce the amount of engineering knowledge otherwise required. However, autonomous reinforcement learning (RL) approaches typically require many interactions with the system to learn controllers, which is a practical limitation in real systems, such as ro...

• ### Scaling Multidimensional Inference for Structured Gaussian Processes

Publication Year: 2015, Page(s):424 - 436
Cited by:  Papers (5)

Exact Gaussian process (GP) regression has $\mathcal{O}(N^{3})$ runtime for data size $N$, making it intractable for large...
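
One standard way structure tames this cost is a Kronecker-factored kernel matrix on a grid, whose matrix-vector product never forms the full matrix. An illustrative identity check (not necessarily the paper's exact method):

```python
import numpy as np

def kron_matvec(K1, K2, v):
    """Compute (K1 kron K2) @ v without forming the (n1*n2) x (n1*n2)
    Kronecker product, using (K1 kron K2) vec(V) = vec(K1 V K2^T)
    for row-major vec."""
    n1, n2 = K1.shape[0], K2.shape[0]
    V = v.reshape(n1, n2)
    return (K1 @ V @ K2.T).reshape(-1)

rng = np.random.default_rng(0)
K1 = rng.standard_normal((4, 4))
K2 = rng.standard_normal((5, 5))
v = rng.standard_normal(20)

fast = kron_matvec(K1, K2, v)     # two small matrix products
slow = np.kron(K1, K2) @ v        # explicit Kronecker product, for comparison
```

For a full grid of `N = n1 * n2` points this turns one $\mathcal{O}(N^{3})$-flavored operation into work on the much smaller per-dimension factors.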

• ### Bayesian Models of Graphs, Arrays and Other Exchangeable Random Structures

Publication Year: 2015, Page(s):437 - 461
Cited by:  Papers (14)

The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti’s theorem provides the theoretical foundation. Dirichlet process clustering, Gaussian process regression, and many other parametric and nonparametric Bayesian models fall within the remit of this framework; many problems arising in modern data analysis do not. Thi...
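
For exchangeable graphs, the analogue of de Finetti's theorem is the Aldous-Hoover (graphon) representation: draw a uniform variable per node, then connect pairs independently given those variables. A concrete sketch (the particular graphon chosen here is arbitrary):

```python
import numpy as np

def sample_graphon(n, W, seed=0):
    """Sample an exchangeable random graph: U_i ~ Uniform(0, 1),
    edge (i, j) ~ Bernoulli(W(U_i, U_j)), symmetric, no self-loops."""
    rng = np.random.default_rng(seed)
    U = rng.random(n)
    P = W(U[:, None], U[None, :])           # edge probability matrix
    A = (rng.random((n, n)) < P).astype(int)
    A = np.triu(A, 1)                       # keep upper triangle only
    return A + A.T

A = sample_graphon(100, lambda u, v: 0.5 * u * v)
```

Stochastic blockmodels and many latent-feature network models arise as special cases, corresponding to piecewise-constant or feature-structured choices of `W`.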

• ### Relational Learning and Network Modelling Using Infinite Latent Attribute Models

Publication Year: 2015, Page(s):462 - 474

Latent variable models for network data extract a summary of the relational structure underlying an observed network. The simplest possible models subdivide nodes of the network into clusters; the probability of a link between any two nodes then depends only on their cluster assignment. Currently available models can be classified by whether clusters are disjoint or are allowed to overlap. These m...

• ### Bayesian Nonparametric Models for Multiway Data Analysis

Publication Year: 2015, Page(s):475 - 487
Cited by:  Papers (12)

Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches—such as the Tucker decomposition and CANDECOMP/PARAFAC (CP)—amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g., missing data and binary data), and (iii) noisy obse...
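
The multi-linear CP factorization mentioned above reconstructs a tensor as a sum of rank-one terms, `X[i, j, k] = sum_r A[i, r] B[j, r] C[k, r]`. A generic sketch (dimensions and rank are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 6, 5, 4, 3  # tensor dimensions and CP rank

# One factor matrix per mode, R rank-one components
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
```

The Bayesian nonparametric approaches in the paper go beyond this purely multi-linear form, e.g. by placing nonlinear (GP-based) priors over the interactions.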

• ### RockStars3D

Publication Year: 2015, Page(s): 488
• ### IEEE Transactions on Pattern Analysis and Machine Intelligence Information for Authors

Publication Year: 2015, Page(s): C3
• ### IEEE Computer Society

Publication Year: 2015, Page(s): C4

## Aims & Scope

The IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) is published monthly. Its editorial board strives to present the most important research results in areas within TPAMI's scope.


## Meet Our Editors

Editor-in-Chief
Sven Dickinson
University of Toronto
e-mail: sven@cs.toronto.edu