Information Combining is an introduction to the principles of information combining. The concept is described, the bounds for repetition codes and for single parity-check codes are proved, and some applications are provided. As the focus is on the basic principles, the book considers a binary symmetric source, binary linear channel codes, and binary-input symmetric memoryless channels. Information Combining first introduces the concept of mutual information profiles and revisits the well-known Jensen's inequality. Using these tools, the bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated in four examples. Information Combining provides an excellent tutorial on this important subject for students, researchers, and professionals working in communications and information theory.
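To give a flavor of the topic, the sketch below illustrates information combining in the special case of binary erasure channels (BECs), where the combined mutual information can be computed exactly: for a repetition code (variable node), two observations of the same bit fail only if both are erased, while for a single parity-check code (check node), a bit is recoverable only if the other observation is not erased. The function names are illustrative, not from the book.

```python
def combine_repetition(i1: float, i2: float) -> float:
    """Combined mutual information of two BEC observations of the
    same bit (repetition code / variable node): an erasure remains
    only if both channels erase, so I = 1 - (1 - I1)(1 - I2)."""
    return 1.0 - (1.0 - i1) * (1.0 - i2)

def combine_spc(i1: float, i2: float) -> float:
    """Combined mutual information for a single parity-check
    constraint over two BECs (check node): the parity is known
    only if neither observation is erased, so I = I1 * I2."""
    return i1 * i2

# Example: two BECs each carrying 0.5 bit of mutual information.
print(combine_repetition(0.5, 0.5))  # 0.75
print(combine_spc(0.5, 0.5))         # 0.25
```

For general binary-input symmetric memoryless channels with the same per-channel mutual informations, the combined information lies between bounds of this kind; characterizing those bounds is the subject of the book.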