Communication in the Presence of Noise and
Algorithms for Error-Correction
The increasing reliance on information technology in our day-to-day lives has led to an explosion in the amount of data stored on digital media and transmitted over the web. Associated with this growth are increased expectations: one hopes to transmit more information, faster, over communication channels, and to store more information, cheaply and for longer periods of time, on storage media. However, all channels introduce noise over time and corrupt the stored or transmitted information. The task of coping with errors in the new information-technology era leads to new challenges and a resurgence of some classical ones. This research project investigates foundational questions and designs solutions for some of these challenges.
The task of dealing with noise in the storage and transmission of information is a classical one and has been studied extensively under the titles of information theory and coding theory. The new challenges arise from the changing backdrop to these questions:
Increased amounts of data on the net.
Increased expectations about the length of time this data is expected to survive.
Reduced effort that users are willing to put into the task of preserving this data.
Changing nature of interaction between the data and the noise-introducing environment.
Variations in computational power at the hands of the user.
Some of the fundamental research challenges that are highlighted by this explosion of data include:
Finding error-correcting codes achieving capacity of given channels.
Designing efficient error-correcting algorithms to decode up to capacity.
Designing alternate models to capture the varying forms of interaction with the environment.
Examining tradeoffs between computational power and decoding capacity.
Some of these challenges are classical (e.g., the first two above), while others are more novel (e.g., the last two above).
The research component of this project investigates the challenges above with special emphasis on the novel ones. Three major components underlying this project are:
A systematic study of "list-decoding" algorithms as an approach to explore some of the classical challenges.
A worst-case approach to information theory to capture more general models of error.
A probabilistic approach to decoding to explore the computation-redundancy tradeoffs at very low levels of computational capability.
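To make the list-decoding notion concrete, the following is a minimal illustrative sketch, not part of the proposal itself: unlike a unique decoder, which outputs the single nearest codeword, a list decoder for radius e outputs every codeword within Hamming distance e of the received word. The toy code, the function names, and the parameters below are all invented for illustration.

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(x != y for x, y in zip(a, b))

def list_decode(code, received, e):
    """Return all codewords within Hamming distance e of the received word."""
    return [c for c in code if hamming(c, received) <= e]

# A toy code: every length-6 word obtained by repeating a 2-bit message
# three times. Its minimum distance is 3, so unique decoding is only
# guaranteed up to radius 1.
code = [tuple(m) * 3 for m in product((0, 1), repeat=2)]

received = (0, 1, 1, 1, 0, 1)

# At radius 1 the decoder's list has a single entry; at radius 2,
# beyond the unique-decoding radius, it returns a list of two candidates.
print(list_decode(code, received, 1))  # [(0, 1, 0, 1, 0, 1)]
print(list_decode(code, received, 2))  # [(0, 1, 0, 1, 0, 1), (1, 1, 1, 1, 1, 1)]
```

The example shows why list decoding is a natural relaxation: past the unique-decoding radius a single answer cannot always be determined, but a short list of candidates still can be.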