2019 13th International Conference on Sampling Theory and Applications (SampTA)
Optimally Sample-Efficient Phase Retrieval with Deep Generative Models
Abstract
We consider the phase retrieval problem, which asks to recover a structured n-dimensional signal from m quadratic measurements. In many imaging contexts, it is beneficial to enforce a sparsity prior on the signal to reduce the number of measurements necessary for recovery. However, the best known methodologies for sparse phase retrieval have a sub-optimal quadratic dependency on the sparsity level of the signal at hand. In this work, we instead model signals as living in the range of a deep generative neural network G : ℝᵏ → ℝⁿ. We show that under the model of a d-layer feedforward neural network with Gaussian weights, m = O(kd² log n) generic measurements suffice for the ℓ2 empirical risk minimization problem to have favorable geometry. In particular, we exhibit a descent direction for all points outside of two arbitrarily small neighborhoods of the true k-dimensional latent code and a negative reflection of it. Our proof is based on showing the sufficiency of two deterministic conditions on the generator and measurement matrices, which are satisfied with high probability under random Gaussian ensembles. We corroborate these results with numerical experiments showing that enforcing a generative prior via empirical risk minimization outperforms sparse phase retrieval methods.
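To make the ℓ2 empirical risk objective concrete, the following is a minimal numerical sketch, not the paper's algorithm or sample-size regime: a two-layer ReLU generator with i.i.d. Gaussian weights, a Gaussian measurement matrix, and descent on f(z) = ½ ‖ |A G(z)| − |A G(z*)| ‖². The dimensions, the finite-difference subgradient surrogate, and the backtracking loop are all illustrative choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the paper's regime):
# latent k, hidden width h, signal n, phaseless measurements m.
k, h, n, m = 3, 20, 50, 40

# Two-layer (d = 2) feedforward ReLU generator with i.i.d. Gaussian weights.
W1 = rng.standard_normal((h, k)) / np.sqrt(k)
W2 = rng.standard_normal((n, h)) / np.sqrt(h)

def G(z):
    return np.maximum(W2 @ np.maximum(W1 @ z, 0.0), 0.0)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
z_true = rng.standard_normal(k)
b = np.abs(A @ G(z_true))                      # phaseless measurements |A G(z*)|

def risk(z):
    """l2 empirical risk: f(z) = 0.5 * || |A G(z)| - b ||_2^2."""
    return 0.5 * np.sum((np.abs(A @ G(z)) - b) ** 2)

def num_grad(f, z, eps=1e-6):
    """Central-difference surrogate for a subgradient (illustration only)."""
    g = np.zeros_like(z)
    for i in range(z.size):
        e = np.zeros_like(z)
        e[i] = eps
        g[i] = (f(z + e) - f(z - e)) / (2 * eps)
    return g

z = rng.standard_normal(k)                     # random initialization
r0 = risk(z)
for _ in range(300):
    g = num_grad(risk, z)
    s = 0.1
    while s > 1e-10:                           # accept only descent steps
        if risk(z - s * g) < risk(z):
            z = z - s * g
            break
        s /= 2.0
```

Because the objective is nonsmooth and nonconvex, a run like this may also converge toward the negative reflection of the true latent code mentioned in the abstract; the sketch only illustrates that the empirical risk can be driven down by local descent.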