TY - GEN
T1 - Support Recovery of Sparse Signals from a Mixture of Linear Measurements
AU - Gandikota, Venkata
AU - Mazumdar, Arya
AU - Pal, Soumyabrata
N1 - Publisher Copyright:
© 2021 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2021
Y1 - 2021
N2 - Recovery of the support of a sparse vector from simple measurements is a widely studied problem, considered under the frameworks of compressed sensing, 1-bit compressed sensing, and more general single index models. We consider generalizations of this problem: mixtures of linear regressions and mixtures of linear classifiers, where the goal is to recover the supports of multiple sparse vectors using only a small number of possibly noisy linear and 1-bit measurements, respectively. The key challenge is that the measurements from different vectors are randomly mixed. Both of these problems have received attention recently. In mixtures of linear classifiers, an observation corresponds to the side of the queried hyperplane on which a random unknown vector lies, whereas in mixtures of linear regressions we observe the projection of a random unknown vector onto the queried hyperplane. The primary step in recovering the unknown vectors from the mixture is to first identify the supports of all the individual component vectors. In this work we study the number of measurements sufficient for recovering the supports of all the component vectors in a mixture in both of these models. We provide algorithms that use a number of measurements polynomial in k and log n, and quasi-polynomial in ℓ, to recover the supports of all ℓ unknown vectors in the mixture with high probability when each individual component is a k-sparse n-dimensional vector.
AB - Recovery of the support of a sparse vector from simple measurements is a widely studied problem, considered under the frameworks of compressed sensing, 1-bit compressed sensing, and more general single index models. We consider generalizations of this problem: mixtures of linear regressions and mixtures of linear classifiers, where the goal is to recover the supports of multiple sparse vectors using only a small number of possibly noisy linear and 1-bit measurements, respectively. The key challenge is that the measurements from different vectors are randomly mixed. Both of these problems have received attention recently. In mixtures of linear classifiers, an observation corresponds to the side of the queried hyperplane on which a random unknown vector lies, whereas in mixtures of linear regressions we observe the projection of a random unknown vector onto the queried hyperplane. The primary step in recovering the unknown vectors from the mixture is to first identify the supports of all the individual component vectors. In this work we study the number of measurements sufficient for recovering the supports of all the component vectors in a mixture in both of these models. We provide algorithms that use a number of measurements polynomial in k and log n, and quasi-polynomial in ℓ, to recover the supports of all ℓ unknown vectors in the mixture with high probability when each individual component is a k-sparse n-dimensional vector.
UR - http://www.scopus.com/inward/record.url?scp=85131856947&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131856947&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85131856947
T3 - Advances in Neural Information Processing Systems
SP - 19082
EP - 19094
BT - Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
A2 - Ranzato, Marc'Aurelio
A2 - Beygelzimer, Alina
A2 - Dauphin, Yann
A2 - Liang, Percy S.
A2 - Wortman Vaughan, Jenn
PB - Neural Information Processing Systems Foundation
T2 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
Y2 - 6 December 2021 through 14 December 2021
ER -