Fundamental Limits of Over-the-Air Optimization: Are Analog Schemes Optimal?
We consider over-the-air convex optimization on a $d$-dimensional space where coded gradients are sent over an additive Gaussian noise channel with variance $\sigma ^{2}$ . The codewords satisfy an average power constraint $P$ , resulting in a signal-to-noise ratio (SNR) of $P/\sigma ^{2}$ . We derive bounds on the convergence rate for over-the-air optimization. Our first result is a lower bound on the convergence rate showing that any code must slow down the convergence rate by a factor of roughly $\sqrt {d/\log (1+ \mathtt {SNR})}$ .
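The scaling of this lower bound can be illustrated numerically. The following is a minimal sketch (constants and lower-order terms are ignored; the function name and example values are illustrative, not from the paper) of how the slowdown factor $\sqrt{d/\log(1+\mathtt{SNR})}$ behaves:

```python
import math

def slowdown_factor(d: int, snr: float) -> float:
    """Order-of-magnitude slowdown ~ sqrt(d / log(1 + SNR)),
    ignoring constants; snr is the linear (not dB) ratio P / sigma^2."""
    return math.sqrt(d / math.log(1.0 + snr))

# Illustrative values: the slowdown grows as sqrt(d) for fixed SNR,
# but shrinks only logarithmically as SNR increases.
print(slowdown_factor(1000, 10.0))    # d = 1000, SNR = 10
print(slowdown_factor(4000, 10.0))    # 4x the dimension => 2x the slowdown
print(slowdown_factor(1000, 1000.0))  # much higher SNR helps only mildly
```

This reflects the qualitative message of the bound: increasing transmit power yields only logarithmic gains, while the dimension penalty is polynomial.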