06EC65 - INFORMATION THEORY AND CODING
PART – A
UNIT – I
INFORMATION THEORY: Introduction, Measure of information,
Average information content of symbols in long independent sequences,
Average information content of symbols in long dependent sequences, Markov
statistical model for information sources, Entropy and information rate of
Markov sources.
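The quantities in this unit can be illustrated numerically. A minimal Python sketch; the two-state transition probabilities in the Markov example are illustrative values, not taken from the syllabus:

```python
from math import log2

def entropy(probs):
    """Average information content H(S) = -sum p*log2(p), in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A binary source is most uncertain at p = 0.5: H = 1 bit/symbol
h_fair = entropy([0.5, 0.5])   # 1.0
h_skew = entropy([0.9, 0.1])   # ~0.469

# Entropy rate of a two-state Markov source (states A, B), assuming
# self-transition probabilities P(A->A)=0.8 and P(B->B)=0.6:
pAA, pBB = 0.8, 0.6
pi_A = (1 - pBB) / ((1 - pAA) + (1 - pBB))   # stationary probability of A
pi_B = 1 - pi_A
# Weighted sum of per-state entropies gives the source's information rate
H_rate = pi_A * entropy([pAA, 1 - pAA]) + pi_B * entropy([pBB, 1 - pBB])
```

The dependent-sequence rate `H_rate` is always at most the entropy of the stationary symbol distribution, which is the point of the Markov model.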
UNIT – II
SOURCE CODING: Encoding of the source output, Shannon’s encoding
algorithm. Communication channels, Discrete communication channels,
Continuous channels.
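One common textbook formulation of Shannon’s encoding algorithm assigns symbol i a codeword of length ⌈log2(1/p_i)⌉ taken from the binary expansion of the cumulative probability. A sketch under that assumption (the probability values are illustrative):

```python
from math import ceil, log2

def shannon_code(probs):
    """Shannon's encoding: sort symbols by decreasing probability; symbol i
    gets the first ceil(log2(1/p_i)) bits of the binary expansion of the
    cumulative probability F_i. The result is a prefix-free code."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, F = {}, 0.0
    for sym, p in items:
        length = ceil(-log2(p))
        bits, frac = "", F
        for _ in range(length):        # binary expansion of F, bit by bit
            frac *= 2
            bits += str(int(frac >= 1))
            frac -= int(frac)
        code[sym] = bits
        F += p
    return code

code = shannon_code({"a": 0.5, "b": 0.25, "c": 0.25})
```

For dyadic probabilities, as here, the code lengths meet the entropy bound exactly.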
UNIT – III
FUNDAMENTAL LIMITS ON PERFORMANCE: Source coding
theorem, Huffman coding, Discrete memoryless channels, Mutual
information, Channel capacity.
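Huffman coding is the standard constructive companion to the source coding theorem. A minimal sketch (the symbol probabilities are illustrative):

```python
import heapq
from itertools import count

def huffman(freqs):
    """Build a prefix-free code (symbol -> bitstring) by repeatedly merging
    the two least probable subtrees, per Huffman's algorithm."""
    tie = count()   # tiebreaker so heapq never compares the dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

code = huffman({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1})
# Average codeword length sum(p_i * l_i) lies within 1 bit of the entropy H(S)
```

Here the average length is 1.9 bits/symbol against an entropy of about 1.85 bits/symbol, illustrating the source coding bound.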
UNIT – IV
Channel coding theorem, Differential entropy and mutual information for
continuous ensembles, Channel capacity theorem.
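For the band-limited Gaussian channel, the channel capacity theorem reduces to the Shannon–Hartley formula C = B·log2(1 + S/N). A one-function sketch; the 3 kHz / 30 dB figures are illustrative:

```python
from math import log2

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/second."""
    return bandwidth_hz * log2(1 + snr_linear)

# A 3 kHz telephone channel at 30 dB SNR (S/N = 1000):
C = capacity(3000, 1000)   # roughly 29,900 bits/s
```

Note the SNR must be linear, not in dB; convert with S/N = 10^(dB/10).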
PART – B
UNIT – V
INTRODUCTION TO ERROR CONTROL CODING: Introduction,
Types of errors, examples, Types of codes. Linear Block Codes: Matrix
description, Error detection and correction, Standard arrays and table look-up
for decoding.
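The matrix description and syndrome-based detection can be shown with the (7,4) Hamming code, an illustrative choice of linear block code (the syllabus does not mandate a particular one):

```python
import numpy as np

# Systematic generator G = [I | P] and parity-check H = [P^T | I],
# so H @ G^T = 0 (mod 2). This P makes H's columns all distinct and
# nonzero, i.e. the (7,4) Hamming code.
P = np.array([[1, 1, 0], [0, 1, 1], [1, 1, 1], [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def encode(msg):
    return (np.array(msg) @ G) % 2

def syndrome(received):
    return (H @ np.array(received)) % 2

c = encode([1, 0, 1, 1])
assert not syndrome(c).any()     # a valid codeword has zero syndrome
r = c.copy()
r[2] ^= 1                        # flip one bit in transmission
s = syndrome(r)                  # nonzero; equals column 2 of H
```

The syndrome equals the H-column at the error position, which is exactly what the standard-array / table look-up decoder exploits.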
UNIT – VI
Binary Cyclic Codes: Algebraic structure of cyclic codes, Encoding using an
(n-k)-bit shift register, Syndrome calculation, BCH codes.
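Systematic cyclic encoding is polynomial division by the generator g(x) over GF(2), which is what the (n-k)-bit shift register computes in hardware. A software sketch; the (7,4) code with g(x) = x³ + x + 1 is an illustrative example:

```python
def cyclic_encode(msg_bits, gen_bits):
    """Systematic cyclic encoding: divide x^(n-k) * m(x) by g(x) over GF(2)
    and append the remainder as the (n-k) parity bits.
    Bit lists are MSB first; gen_bits has n-k+1 coefficients."""
    r = len(gen_bits) - 1               # n - k parity bits
    reg = list(msg_bits) + [0] * r      # coefficients of x^(n-k) * m(x)
    for i in range(len(msg_bits)):      # polynomial long division
        if reg[i]:
            for j, g in enumerate(gen_bits):
                reg[i + j] ^= g
    return list(msg_bits) + reg[-r:]    # message followed by parity

# (7,4) cyclic code, g(x) = x^3 + x + 1 -> coefficients [1, 0, 1, 1]
codeword = cyclic_encode([1, 0, 0, 0], [1, 0, 1, 1])
```

Syndrome calculation at the receiver is the same division applied to the received word: a zero remainder means no detected error.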
UNIT – VII
RS codes, Golay codes, Shortened cyclic codes, Burst error correcting codes,
Burst and random error correcting codes.
UNIT – VIII
Convolutional Codes: Time domain approach, Transform domain approach.
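In the time-domain approach, each output stream is the input sequence convolved (mod 2) with a generator sequence. A sketch of a rate-1/2, constraint-length-3 encoder; the generators 111 and 101 are a common textbook pair, assumed here rather than taken from the syllabus:

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder, constraint length 3: each input bit
    produces two output bits, the mod-2 inner products of the current
    register contents with the generator sequences g1 and g2."""
    state = [0, 0]                      # two-stage shift register
    out = []
    for b in bits:
        window = [b] + state            # current bit plus past two bits
        out.append(sum(x * g for x, g in zip(window, g1)) % 2)
        out.append(sum(x * g for x, g in zip(window, g2)) % 2)
        state = [b, state[0]]           # shift the register
    return out

encoded = conv_encode([1, 0, 1, 1])
```

The transform-domain view describes the same encoder by the generator polynomials g1(D) = 1 + D + D² and g2(D) = 1 + D².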
TEXT BOOKS:
1. Digital and Analog Communication Systems, K. Sam Shanmugam,
John Wiley, 1996.
2. Digital Communication, Simon Haykin, John Wiley, 2003.
REFERENCE BOOKS:
1. ITC and Cryptography, Ranjan Bose, TMH, 2nd edition, 2007.
2. Digital Communications, Glover and Grant, Pearson Education,
2nd edition, 2008.