High-SNR Capacity of AWGN Channels With Generic Alphabet Constraints
Abstract
We present a generalized notion of entropy taken with respect to a measure in a coordinate-independent manner and prove several novel entropy convergence theorems. A particular focus is the entropy of random variables on smooth submanifolds of R^N.
We apply these results to computing the information capacity of an AWGN channel whose alphabet is constrained to an n-dimensional smooth submanifold of R^N. Such submanifolds are shown to arise naturally when coding alphabets in R^N are subjected to a set of smooth constraint functions. The asymptotic capacity in the high-SNR limit is computed for such AWGN channels with manifold constraints in two variants: a compact alphabet manifold, and a non-compact scale-invariant alphabet manifold with an additional average power constraint on the input distribution. The high-SNR capacity expression resembles Shannon's famous Gaussian channel capacity formula, with an additional constant term determined by the geometry of the alphabet constraint manifold: namely, a volume derived from the manifold.
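As a rough illustration of the shape of such a result (the precise expression and notation are given in the thesis itself; the symbols n, SNR, and vol(M) below are assumptions for this sketch, not quoted from it), a high-SNR capacity of this type might take the form

```latex
% Illustrative sketch only; exact constants and normalization are as derived in the thesis.
C(\mathrm{SNR}) \;=\; \frac{n}{2}\log\bigl(\mathrm{SNR}\bigr) \;+\; \log \mathrm{vol}(M) \;+\; o(1),
\qquad \mathrm{SNR} \to \infty,
```

where the first term mirrors Shannon's Gaussian capacity formula in the manifold dimension n, and the constant term carries the geometric (volume) contribution of the alphabet constraint manifold M.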
We apply the above theory in a study of the channel capacity of radar pulse waveforms. In our model, each radar pulse also constitutes a code letter for transmission of information. It is desirable in this context to constrain the alphabet of waveforms to those particularly suited to efficient and effective radar signal processing, giving rise to a channel described by the above work. We numerically compute the volume component of our asymptotic capacity expression for a plausible range of performance characteristics of the radar signal processing. We plot curves that show the inherent trade-off for our radar between signal processing performance and channel capacity.
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:37945006
- FAS Theses and Dissertations