Reed-Muller codes polarize

Min Ye, Princeton University
Fine Hall 214

Reed-Muller (RM) codes were introduced in 1954 and have long been conjectured to achieve Shannon's capacity on symmetric channels. Activity on this conjecture has recently been revived by the emergence of polar codes. RM codes and polar codes are generated by the same matrix $G_m = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}^{\otimes m}$, but they use different subsets of its rows. RM codes simply select the rows of largest Hamming weight. Polar codes instead select the rows of largest conditional mutual information, proceeding from top to bottom in $G_m$; while this is a more elaborate and channel-dependent rule, the top-to-bottom ordering allowed Arikan to show that the conditional mutual information polarizes, which directly yields a capacity-achieving code on any symmetric channel. RM codes have yet to be proved to have such a property, despite the recent success for the erasure channel.
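
To make the construction concrete, here is a minimal sketch (assuming NumPy; the function names are illustrative and not from the paper) that builds $G_m$ as the m-fold Kronecker power of the 2x2 matrix and selects the RM(r, m) rows by Hamming weight:

```python
import numpy as np

def kron_power(m):
    """Build G_m = [[1,0],[1,1]]^{otimes m} over GF(2)."""
    G = np.array([[1]], dtype=int)
    base = np.array([[1, 0], [1, 1]], dtype=int)
    for _ in range(m):
        G = np.kron(G, base) % 2
    return G

def rm_row_indices(r, m):
    """RM(r, m) keeps the rows of G_m whose Hamming weight is at least 2^(m-r)."""
    G = kron_power(m)
    weights = G.sum(axis=1)
    return [i for i, w in enumerate(weights) if w >= 2 ** (m - r)]

# Example: RM(1, 3), the [8, 4] first-order RM code, keeps the rows of weight >= 4.
print(rm_row_indices(1, 3))   # -> [3, 5, 6, 7]
```

The row of index i has weight 2 raised to the number of ones in the binary expansion of i, so the weight-based selection rule is channel-independent, in contrast with the polar-code rule described above.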

In this talk, we connect RM codes to polarization theory. We show that, proceeding in the RM code order, i.e., not from top to bottom but from the lightest to the heaviest rows of $G_m$, the conditional mutual information again polarizes. We further demonstrate that it polarizes faster than it does for polar codes. This implies that $G_m$ contains another code, different from the polar code and called here the twin-RM code, that is provably capacity-achieving on any symmetric channel. This gives, in particular, a necessary condition for RM codes to achieve capacity on symmetric channels. It further gives a sufficient condition if the rows with largest conditional mutual information coincide with the heaviest rows, i.e., if the twin-RM code is the RM code itself. We demonstrate here that the two codes are at least similar and give further evidence that they are indeed the same.
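
As a toy illustration of what "the conditional mutual information polarizes" means, the sketch below runs the classical polar-code recursion for the binary erasure channel, where the erasure probabilities of the synthesized channels evolve as z -> 2z - z^2 and z -> z^2 in the standard top-to-bottom order. This is Arikan's polarization, shown only to illustrate the phenomenon; it is not the RM-order analysis of the paper, and the function name is illustrative.

```python
import numpy as np

def bec_polarize(eps, m):
    """Erasure probabilities of the 2^m synthesized channels for BEC(eps),
    using the recursion z -> 2z - z^2 ('minus') and z -> z^2 ('plus')."""
    z = np.array([eps])
    for _ in range(m):
        z = np.concatenate([2 * z - z ** 2, z ** 2])
    return z

# The fraction of channels that are neither almost noiseless nor almost
# useless shrinks as m grows -- this is polarization.
for m in (4, 8, 12, 16):
    z = bec_polarize(0.5, m)
    middling = np.mean((z > 0.01) & (z < 0.99))
    print(m, round(float(middling), 3))
```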

This talk is based on joint work with Emmanuel Abbe. The paper will appear at FOCS this year.

Min Ye received his B.S. in Electrical Engineering from Peking University, Beijing, China, in 2012, and his Ph.D. from the Department of Electrical and Computer Engineering at the University of Maryland, College Park, in 2017. He is currently a postdoctoral researcher at Princeton University. He is a recipient of the IEEE Data Storage Best Paper Award. His research interests include coding theory, information theory, differential privacy, and machine learning.