(i) Information theory is the branch of mathematics concerned with quantifying, storing, and communicating information. It was founded by Claude Shannon in 1948, and its central concept is entropy, which measures the uncertainty (equivalently, the average information content) of a random source.
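As a minimal sketch, Shannon entropy of a discrete distribution can be computed directly from its definition, H = -Σ p·log2(p); the function name `entropy` here is just an illustrative choice:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less uncertainty (about 0.47 bits).
print(entropy([0.9, 0.1]))
```

The `if p > 0` guard follows the usual convention that 0·log2(0) = 0, so impossible outcomes contribute no uncertainty.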
(ii) Error-coding schemes enable reliable communication over unreliable (noisy) channels by detecting and, in some cases, correcting transmission errors. Common schemes include repetition codes, parity bits, checksums, and cyclic redundancy checks (CRCs).
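The simplest of these, a single even-parity bit, can be sketched as follows (the helper names `add_parity` and `check_parity` are illustrative, not from any standard library):

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the word passes the even-parity check."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(check_parity(word))   # True: word arrived intact

word[2] ^= 1                # a single bit flips in transit
print(check_parity(word))   # False: the error is detected
```

Note that a lone parity bit can only detect an odd number of flipped bits, and it cannot tell the receiver which bit is wrong; that is why stronger schemes such as CRCs and correcting codes exist.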
(iii) Forward error correction (FEC) adds redundant bits that let the receiver correct errors on its own, without requesting retransmission. Error coding is applied in internet protocols, deep-space communication, satellite broadcasting, data storage, and computer memory.
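The three-fold repetition code is the simplest forward error-correcting code: each bit is sent three times, and the receiver takes a majority vote, so any single flip within a group of three is corrected with no retransmission. A minimal sketch (function names are illustrative):

```python
def encode_rep3(bits):
    """Repetition code: transmit each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_rep3(received):
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

sent = encode_rep3([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
sent[4] ^= 1                    # the channel flips one bit
print(decode_rep3(sent))        # [1, 0, 1] — error corrected
```

Repetition coding triples the bandwidth cost; practical systems instead use denser codes (Hamming, Reed-Solomon, LDPC) that achieve correction with far less redundancy.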