TSP Volume 67 Issue 1

Online and Stable Learning of Analysis Operators

In this paper, four iterative algorithms for learning analysis operators are presented. They are built upon the same optimization principle underlying both Analysis K-SVD and Analysis SimCO. The forward and sequential analysis operator learning (AOL) algorithms are based on projected gradient descent with an optimally chosen step size. The implicit AOL algorithm is inspired by the implicit Euler scheme for solving ordinary differential equations and does not require choosing a step size.
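The projected-gradient structure mentioned in the abstract can be illustrated with a minimal sketch. The objective below is a hypothetical smooth surrogate (0.5·||ΩX||_F²), not the paper's actual AOL objective, and the fixed step size stands in for the optimally chosen one; the row-normalization projection is the typical constraint set for analysis operators.

```python
import numpy as np

def project_rows(Omega):
    # Project each row of the operator onto the unit sphere,
    # the usual constraint set in analysis operator learning.
    norms = np.linalg.norm(Omega, axis=1, keepdims=True)
    return Omega / np.maximum(norms, 1e-12)

def aol_pgd_step(Omega, X, step):
    # One projected-gradient step on a simple smooth surrogate
    # f(Omega) = 0.5 * ||Omega X||_F^2 -- a stand-in for the AOL
    # objective, which additionally promotes cosparsity.
    grad = (Omega @ X) @ X.T
    return project_rows(Omega - step * grad)

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 50))                    # training signals
Omega = project_rows(rng.standard_normal((12, 8)))  # initial operator
Omega = aol_pgd_step(Omega, X, step=1e-3)
print(np.allclose(np.linalg.norm(Omega, axis=1), 1.0))  # rows stay unit norm
```

The projection after each gradient step is what distinguishes this from plain gradient descent: the iterate never leaves the feasible set of unit-norm rows.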

Read more

Learning Tensors From Partial Binary Measurements

We generalize the 1-bit matrix completion problem to higher order tensors. Consider a rank-r, order-d tensor T in R^{N×⋯×N} with bounded entries. We show that when r = O(1), such a tensor can be estimated efficiently from only m = O_r(Nd) binary measurements. This shows that the sample complexity of recovering a low-rank tensor from 1-bit measurements of a subset of its entries is roughly the same as recovering it from unquantized measurements—a result that had been known only in the matrix case, i.e., when d = 2.
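The measurement model can be sketched as follows. The CP construction of the low-rank tensor and the logistic link are common modeling choices, not details taken from the paper: each observed entry is quantized to ±1 with a probability depending on the underlying value.

```python
import numpy as np

rng = np.random.default_rng(1)
N, r = 6, 2  # a small rank-2, order-3 tensor (d = 3)

# Build a low-rank tensor as a sum of r rank-one terms (CP form).
A, B, C = (rng.standard_normal((N, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
T /= np.max(np.abs(T))  # bounded entries, as assumed in the abstract

# 1-bit observations: each entry is mapped to +1 with probability
# sigma(T_ijk); here sigma is the logistic link (one common choice).
sigma = 1.0 / (1.0 + np.exp(-T))
Y = np.where(rng.random(T.shape) < sigma, 1, -1)

# Only a random subset of entries is actually observed.
mask = rng.random(T.shape) < 0.5
print(Y.shape, int(mask.sum()))
```

Recovery then amounts to maximum-likelihood estimation of T from the observed signs under a low-rank constraint; the abstract's claim is that roughly O_r(Nd) such bits suffice.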

Learning of Tree-Structured Gaussian Graphical Models on Distributed Data Under Communication Constraints

In this paper, learning of tree-structured Gaussian graphical models from distributed data is addressed. In our model, samples are stored in a set of distributed machines, where each machine has access to only a subset of features. A central machine is then responsible for learning the structure based on messages received from the other nodes. We present a set of communication-efficient strategies that provably convey sufficient information for reliable learning of the structure.
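In the centralized setting, the classical approach to this task is the Chow-Liu algorithm: for Gaussian variables, pairwise mutual information is a monotone function of |correlation|, so the tree is a maximum-weight spanning tree over the mutual-information graph. The sketch below shows that centralized baseline (it does not reproduce the paper's distributed, communication-constrained strategies):

```python
import numpy as np

def chow_liu_tree(corr):
    # Chow-Liu structure learning for a Gaussian graphical model:
    # pairwise mutual information I(i,j) = -0.5*log(1 - rho_ij^2)
    # is monotone in |rho_ij|, so a maximum-weight spanning tree
    # over these weights recovers the tree structure.
    p = corr.shape[0]
    mi = -0.5 * np.log(np.clip(1.0 - corr ** 2, 1e-12, 1.0))
    np.fill_diagonal(mi, -np.inf)
    # Prim's algorithm for the maximum spanning tree.
    in_tree = {0}
    edges = []
    while len(in_tree) < p:
        i, j, _ = max(
            ((i, j, mi[i, j]) for i in in_tree
             for j in range(p) if j not in in_tree),
            key=lambda e: e[2],
        )
        edges.append((i, j))
        in_tree.add(j)
    return edges

# A Markov chain 0-1-2-3: correlations decay as 0.8^|i-j|.
corr = 0.8 ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
print(sorted(tuple(sorted(e)) for e in chow_liu_tree(corr)))
# → [(0, 1), (1, 2), (2, 3)], the chain structure
```

The distributed problem studied in the paper is harder because no single machine sees all features, so the pairwise statistics needed for these edge weights must be communicated efficiently to the central machine.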
