Landscape Analysis for Overcomplete Tensor and Neural Collapse, Qing Qu

Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021
Workshop IV: Efficient Tensor Representations for Learning and Computational Complexity

"Landscape Analysis for Overcomplete Tensor and Neural Collapse"
Qing Qu - University of Michigan, Center for Data Science

Abstract: In this talk, we provide the first global landscape analysis for overcomplete tensor decomposition and neural collapse. For both problems, we show that the landscapes share similar benign global geometric structures. First, overcomplete tensor decomposition is related to many applications in representation learning, such as overcomplete dictionary learning and convolutional dictionary learning. Under a tight-frame assumption on the overcomplete components, we show that the nonconvex loss over the sphere has no spurious local minimizers. Second, recent seminal work by Donoho et al. identified a prevalent phenomenon during the terminal phase of network training, known as neural collapse. By studying the op
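To make the first part of the abstract concrete, the sketch below illustrates the kind of sphere-constrained nonconvex objective commonly used for overcomplete fourth-order tensor decomposition: minimizing f(x) = -Σ_k ⟨a_k, x⟩⁴ over the unit sphere with projected gradient descent. This is a minimal sketch under illustrative assumptions, not the talk's exact formulation; the component matrix A, the step size, and the iteration count are placeholders, and the random unit-norm Gaussian frame only approximates the tight-frame condition mentioned in the abstract.

```python
# Minimal sketch (illustrative, not the talk's exact loss): for a fourth-order
# tensor T = sum_k a_k^{(x)4}, minimize f(x) = -sum_k <a_k, x>^4 on the unit
# sphere; benign local minimizers lie near the components a_k.
import numpy as np

rng = np.random.default_rng(0)
n, m = 30, 60                      # dimension n, number of components m > n (overcomplete)
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)     # unit-norm components a_1, ..., a_m (columns)

def loss_and_grad(x):
    """f(x) = -sum_k <a_k, x>^4 and its Euclidean gradient."""
    c = A.T @ x                    # correlations <a_k, x>
    f = -np.sum(c ** 4)
    g = -4.0 * A @ (c ** 3)
    return f, g

# Projected (Riemannian) gradient descent on the unit sphere.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
step = 0.05
for _ in range(2000):
    f, g = loss_and_grad(x)
    g_tan = g - (g @ x) * x        # project the gradient onto the sphere's tangent space
    x = x - step * g_tan
    x /= np.linalg.norm(x)         # retract back onto the sphere

# The recovered direction should typically align closely with one component (up to sign).
print(f"max |<a_k, x>| after descent: {np.max(np.abs(A.T @ x)):.3f}")
```

For the second part, "neural collapse" refers, in the standard formulation from the literature, to the centered last-layer class means converging to a simplex equiangular tight frame (ETF) aligned with the classifier. A brief statement of that structure, with K the number of classes and P a partial orthogonal embedding (notation chosen here for illustration), is:

```latex
% Simplex ETF with K classes: equal-norm vectors with pairwise inner
% product -1/(K-1), embedded in feature space by a partial orthogonal P.
\[
  M^{\star} \;=\; \sqrt{\tfrac{K}{K-1}}\, P \Bigl( I_K - \tfrac{1}{K}\,\mathbf{1}_K \mathbf{1}_K^{\top} \Bigr),
  \qquad P^{\top} P = I_K .
\]
```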