Information Theoretic Bounds for Sparse Reconstruction in Random Noise

Compressive sensing (CS) plays a pivotal role in signal processing, and this paper addresses one of its central issues: the information-theoretic analysis of CS under random noise.
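
For context, a minimal sketch of the standard noisy CS measurement model (our notation, not necessarily the paper's) is

\[
\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{n}, \qquad \mathbf{A} \in \mathbb{R}^{m \times n},\ m < n,
\]

where \(\mathbf{x}\) is the sparse source signal, \(\mathbf{A}\) is the sensing matrix, \(\mathbf{n}\) is the random noise vector, and \(\mathbf{y}\) collects the compressed noisy measurements.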

To distinguish this work from the existing literature, the goal is exact reconstruction of the source signal. From the analysis of the recovery performance, lower and upper bounds on the probability of reconstruction error for CS are derived. More specifically, further discussion is provided for the case where both the source and the noise follow Gaussian distributions.
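
As a rough illustration of how such a lower bound on the error probability typically arises (a sketch based on a standard Fano-type argument; the paper's exact statement and constants may differ), suppose the sparse support of \(\mathbf{x}\) is drawn uniformly from \(M\) candidates, e.g. \(M = \binom{n}{k}\) for \(k\)-sparse signals. Then

\[
P_e \;\ge\; 1 - \frac{I(\mathbf{x};\mathbf{y}) + 1}{\log_2 M},
\]

so the reconstruction error probability cannot be small unless the mutual information \(I(\mathbf{x};\mathbf{y})\) carried by the noisy measurements is comparable to \(\log_2 M\) bits.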

It is proved that perfect reconstruction of the signal vector is impossible when the corresponding conditions are not satisfied, which serves as a theoretical reference for noisy CS. The necessary proofs leverage results from information theory and estimation theory. Compression of real underwater acoustic sensor network (UWASN) data is used to verify the theoretical bounds derived in the paper.
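
To make the setup concrete, below is a minimal, hypothetical Python sketch (not the authors' code) that simulates the noisy CS model with a Gaussian source, Gaussian sensing matrix, and Gaussian noise, and estimates an empirical probability of support-recovery error. The recovery rule here is simple correlation thresholding, used only as a stand-in for a real sparse solver (OMP, LASSO, etc.); all parameter values are illustrative.

import numpy as np

def simulate_error_rate(n=256, m=100, k=5, snr_db=20, trials=200, seed=0):
    """Empirical probability of support-recovery error for noisy CS.

    Correlation thresholding is used as a stand-in recovery rule;
    parameter values are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(trials):
        # k-sparse Gaussian source on a random support
        support = rng.choice(n, size=k, replace=False)
        x = np.zeros(n)
        x[support] = rng.standard_normal(k)

        # Gaussian sensing matrix and Gaussian measurement noise
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        signal = A @ x
        noise_std = np.linalg.norm(signal) / np.sqrt(m) * 10 ** (-snr_db / 20)
        y = signal + noise_std * rng.standard_normal(m)

        # Stand-in recovery: keep the k columns most correlated with y
        est_support = np.argsort(np.abs(A.T @ y))[-k:]
        if set(est_support) != set(support):
            errors += 1
    return errors / trials

if __name__ == "__main__":
    print("empirical P_e:", simulate_error_rate())

An empirical error curve produced this way, swept over SNR or sparsity level, is the kind of quantity one would compare against the analytical lower and upper bounds, as the paper does with the UWASN data.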
