
All-In-One Data Decoders Using GRAND
In 1948, Shannon showed that the best error-correction performance comes at longer code lengths. In 1978, Berlekamp, McEliece, and van Tilborg established that optimally accurate decoding of general linear codes is NP-complete in code length, so no computationally efficient, optimally accurate universal decoder can be expected at long code lengths. Forward error-correction decoding has therefore traditionally been a code-specific endeavor: because the design of a conventional decoder is tightly coupled to the structure of its code, each code needs a distinct implementation. This standard co-design paradigm leads either to significantly increased hardware complexity and silicon area when multiple codes must be decoded, or to restrictive code standardization that limits the hardware footprint. An innovative recent alternative is noise-centric Guessing Random Additive Noise Decoding (GRAND). This approach uses modern developments in the analysis of guesswork to create a universal algorithm in which the effect of the noise, rather than the transmitted codeword, is guessed according to statistical knowledge of the noise behavior or through phenomenological observation. Because GRAND is universal, a single hardware instantiation can efficiently decode a variety of codes and rates. An interesting facet of GRAND is that it opens up the use of codes that were heretofore undecodable in practice, e.g., Random Linear Codes (RLCs). This talk will introduce universal hard-detection and soft-detection decoders based on GRAND, which enable low-latency, energy-efficient, secure wireless communications in a manner that is future-proof, since they accommodate any type of code.
This work is joint with Muriel Medard (MIT) and Ken Duffy (Northeastern University).
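To illustrate the guessing idea described above, the following is a minimal Python sketch of a hard-detection GRAND loop, not the speakers' hardware design: the function name grand_hard_decode, the query budget, and the weight-ordered enumeration of noise patterns are illustrative assumptions (weight order is the maximum-likelihood order for a memoryless binary symmetric channel with crossover probability below one half; practical GRAND variants order guesses using measured noise statistics or soft information).

import itertools
import numpy as np

def grand_hard_decode(y, H, max_queries=10**5):
    # Guess putative noise patterns from most to least likely (here,
    # increasing Hamming weight), strip each guess from the received word,
    # and return the first candidate that satisfies every parity check,
    # i.e. H @ c = 0 (mod 2).
    n = len(y)
    queries = 0
    for w in range(n + 1):
        for flips in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(flips)] = 1
            c = y ^ e                      # remove the guessed noise effect
            queries += 1
            if not ((H @ c) % 2).any():    # zero syndrome -> codeword found
                return c, queries
            if queries >= max_queries:
                return None, queries       # abandon guessing (erasure)
    return None, queries

# Example with the [7,4] Hamming code: a single-bit error is corrected
# after only a handful of queries, because light noise patterns are tried first.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
y = np.array([0, 0, 1, 0, 0, 0, 0])        # all-zero codeword with bit 2 flipped
decoded, queries = grand_hard_decode(y, H)
print(decoded, queries)                     # -> [0 0 0 0 0 0 0], 4 queries

The same loop works unchanged for any code whose membership can be tested, which is the sense in which the decoder is universal: swapping in a different parity-check matrix, including that of a Random Linear Code, requires no change to the guessing machinery.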