
3620 South Vermont Avenue, Los Angeles, CA 90089



Zhichao Wang, UC Berkeley


Title: When Do Conjugate Kernels Learn XOR? Eigen-Spike Phase Transitions via Quadratic Equivalents


Abstract: Recent work in random matrix theory (RMT) has developed the notion of linear (deterministic) equivalents: linear surrogate models that approximate the spectral behavior of large nonlinear random matrices, such as the nonlinear feature maps of neural networks (NNs). These equivalents make theoretical predictions tractable by reducing the analysis to a simpler model amenable to classical RMT tools. However, this leaves open the question of whether the idealized linear equivalence remains meaningful for high-dimensional data that are not linearly separable. Motivated by this, in this talk I will consider the conjugate kernel (CK), the nonlinear feature map of a feedforward NN, on a canonical nonlinearly separable dataset: the XOR problem. I will analyze informative outlier eigenvalues of the CK, using the asymptotic alignment of their eigenvectors with the XOR labels as a proxy for nonlinear learnability. I will then present a quadratic equivalent to the spiked CK that enables a precise analysis of emergent informative spikes as one varies knobs common in ML practice: sample complexity, signal-to-noise ratio (SNR), choice of nonlinear activation, and pretrained features. In each scenario, we derive a precise BBP-type phase transition beyond which linear classification via the CK eigenvectors becomes possible. This talk is based on joint work with Collin Cranston, Todd Kemp, and Michael Mahoney.
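For intuition, the BBP-type phase transition mentioned above can be seen in its classical setting, the spiked sample covariance model (this NumPy sketch is an illustration of the generic BBP phenomenon, not of the talk's CK analysis; the dimensions, seed, and spike strengths are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 400, 1600              # dimension p, sample size n
gamma = p / n                 # aspect ratio; BBP threshold is sqrt(gamma) = 0.5 here

def top_eig(theta):
    # Draw n samples from N(0, I + theta * e1 e1^T): isotropic noise
    # plus a rank-one spike of strength theta along the first coordinate.
    X = rng.standard_normal((n, p))
    X[:, 0] *= np.sqrt(1.0 + theta)
    S = X.T @ X / n           # sample covariance matrix
    return np.linalg.eigvalsh(S)[-1]

# Right edge of the Marchenko-Pastur bulk: where the top eigenvalue sits
# when the spike is too weak to produce an outlier.
bulk_edge = (1 + np.sqrt(gamma)) ** 2

for theta in (0.2, 2.0):      # one value below, one above the threshold
    lam = top_eig(theta)
    if theta > np.sqrt(gamma):
        predicted = (1 + theta) * (1 + gamma / theta)  # outlier location
    else:
        predicted = bulk_edge                          # spike is absorbed
    print(f"theta={theta}: top eigenvalue {lam:.3f}, prediction {predicted:.3f}")
```

Below the threshold the top eigenvalue sticks to the bulk edge and carries no signal; above it, an outlier detaches and its eigenvector correlates with the planted direction, which is the sense in which the talk's "eigen-spike" transitions make spectral learning possible.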

This program is open to all eligible individuals. USC operates all of its programs and activities consistent with the university’s Notice of Non-Discrimination. Eligibility is not determined based on race, sex, ethnicity, sexual orientation or any other prohibited factor.
