
CCN Seminar with Brett Larsen (Stanford University)


You are cordially invited to a CCN Seminar with Brett Larsen


Title: Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition

Abstract: [1] The low-rank canonical polyadic tensor decomposition is useful in data analysis and can be computed by solving a sequence of overdetermined least squares subproblems. Motivated by sparse tensors, we propose sketching each subproblem using leverage score upper bounds to select a subset of the rows, with probabilistic guarantees on the solution accuracy. Crucially, the number of rows required in our sketched system is independent of both the number of nonzeros in the full tensor and the number of rows in the full system. Numerical results on real-world large-scale tensors show that the method is significantly faster than deterministic methods at nearly the same level of accuracy.

[2] A variety of recent works, spanning pruning, lottery tickets, and training within random subspaces, have shown that deep neural networks can be trained using far fewer degrees of freedom than the total number of parameters. We analyze this phenomenon for random subspaces both at a random initialization and early in training, finding a sharp phase transition in the success probability of hitting a given loss sublevel set as the training dimension surpasses a threshold. We then theoretically characterize this threshold and its dependence on the initialization in terms of properties of the high-dimensional geometry of the loss landscape.
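The core idea in [1] can be illustrated on a single overdetermined least squares problem. The sketch below is a minimal numpy illustration, not the paper's algorithm: it uses exact leverage scores (squared row norms of Q from a thin QR), whereas the paper samples from cheap leverage score upper bounds, and the matrix here is a generic random matrix rather than the Khatri-Rao product that arises in CP decomposition. All dimensions and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy overdetermined system A x ~ b (in the CP setting, A would be a
# Khatri-Rao product of factor matrices and n would be very large).
n, d = 2000, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Exact leverage scores: squared row norms of Q from a thin QR of A.
# (The paper uses inexpensive upper bounds instead of computing these.)
Q, _ = np.linalg.qr(A)
lev = np.sum(Q**2, axis=1)   # nonnegative, sums to d
p = lev / lev.sum()          # row-sampling distribution

# Sketch: sample s rows with probability p_i, rescale by 1/sqrt(s * p_i),
# and solve the much smaller s x d least squares problem.
s = 200
idx = rng.choice(n, size=s, replace=True, p=p)
w = 1.0 / np.sqrt(s * p[idx])
x_sketch, *_ = np.linalg.lstsq(w[:, None] * A[idx], w * b[idx], rcond=None)

# Compare with the full solve: the sketched solution is close with
# high probability, at a fraction of the cost when n is large.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_sketch - x_full))
```

The rescaling by 1/sqrt(s * p_i) keeps the sketched normal equations an unbiased estimate of the full ones; the number of sampled rows s needed for a given accuracy depends on d, not on n.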

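The subspace-training phenomenon in [2] can be seen even in a toy quadratic landscape. The following is a hypothetical stand-in, not the paper's neural network experiments: the "loss" is a quadratic bowl in R^n, and training is restricted to a random d-dimensional affine subspace through the initialization, so the best achievable loss is an exact least squares computation rather than gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Quadratic stand-in for a loss landscape: L(theta) = ||theta - theta_star||^2.
n = 100
theta_star = np.zeros(n)
theta0 = np.ones(n)  # "initialization"; initial loss is n

def best_loss_in_subspace(d):
    """Minimum loss reachable within theta = theta0 + P v for random P."""
    P = rng.standard_normal((n, d)) / np.sqrt(n)  # random d-dim basis
    # Minimizing ||theta0 + P v - theta_star||^2 over v is least squares.
    v, *_ = np.linalg.lstsq(P, theta_star - theta0, rcond=None)
    r = theta0 + P @ v - theta_star
    return r @ r

# For a quadratic bowl the reachable loss shrinks steadily with d
# (roughly n - d here); reaching a *fixed* sublevel set therefore
# succeeds only once d crosses a threshold, the phase transition
# that [2] characterizes for real loss landscapes.
for d in (5, 25, 50, 75, 95):
    print(d, best_loss_in_subspace(d))
```

For genuinely nonconvex landscapes the threshold depends on the local geometry around the initialization, which is the quantity the paper analyzes.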
[1] Brett W. Larsen and Tamara G. Kolda. "Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition." arXiv:2006.16438.
[2] Brett W. Larsen, Stanislav Fort, Nic Becker, and Surya Ganguli. "How many degrees of freedom do we need to train deep networks: a loss landscape perspective." ICLR 2022. arXiv:2107.05802.