# What is information theory?

Information theory, also known as the mathematical theory of communication, is an approach that studies how data is processed and measured in the transmission of information. The communication process proposed by its creators describes the flow of a message between a sender and a receiver through a given channel.

Information theory also deals with measuring and representing information, as well as with the capacity of communication systems to transmit that information. It is a branch of mathematical probability theory.

## How did information theory come about?

The mathematical theory of communication was proposed by mathematician and engineer Claude Shannon in 1948 and expanded the following year in a book co-authored with scientist Warren Weaver. However, it was the result of research begun almost thirty years earlier by scientists such as Andrei Markov and Ralph Hartley; the latter is known as one of the early pioneers of binary language.

The contribution of Alan Turing, who designed a machine capable of processing data by manipulating symbols, was the final precedent for the development and culmination of what would come to be called the Mathematical Theory of Communication.

All the studies of that time shared the same goal: finding efficient ways to use communication channels to send information without degrading the quality of the message that was received.

## What are the elements of information theory?

1. Source of information or sender: the party capable of issuing a message. In information theory, the main sources are:
    1. Random: when the message cannot be predicted.
    2. Structured: when there is a certain level of redundancy and order.
    3. Unstructured: when all messages are random, unrelated, and meaningless, so part of the message is lost.
2. Message: a set of data transported through a channel.
3. Code: a set of elements that follow a set of rules to be combined so they can be interpreted.
4. Channel: the medium by which the message is transmitted so that it reaches the receiver.
5. Information: what the sender seeks to convey through a message. From the point of view of mathematical probability, a theoretical framework of information theory, information should be proportional to the number of bits needed to recognize the message.
6. Recipient: the party that receives the message. Being able to assimilate the content of the message that originates from the source or sender is essential.
7. Noise: different causes that prevent the message from arriving normally in the information flow process, impeding the receiver from being able to understand it fully.
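The idea in item 5, that information is proportional to the number of bits needed to recognize a message, can be made concrete with Shannon's entropy formula. Below is a minimal sketch in Python (the function name `shannon_entropy` is my own, not from the source): a highly redundant, structured message carries fewer bits of information per symbol than one whose symbols are all equally likely.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fully predictable (redundant) message carries no information per symbol;
# eight distinct, equally frequent symbols carry log2(8) = 3 bits each.
print(shannon_entropy("aaaaaaaa"))  # 0.0
print(shannon_entropy("abcdefgh"))  # 3.0
```

This mirrors the source/message distinction above: the more predictable (structured) the source, the fewer bits are needed to recognize each message.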

## What are the general approaches of information theory?

• It makes it possible to study the communication process, including the available channels and the interpretation of the data sent.
• It determines the simplest, most effective way to send a message without any alterations in the process.
• It identifies sources of distortion or interference so that a message can reach the recipient optimally.
• It establishes that both sender and receiver should be able to encode and decode the messages.
• It analyzes the speed with which the messages are transmitted.
• It allows for the possibility that a message has multiple meanings; the recipient assigns it meaning, provided they share the same code as the sender.
• If the choice of message is one of only two options, the value of the information is equal to one unit, called a bit. For the information to equal exactly one bit, the alternatives in the process must be equally likely. More generally, the more equally likely alternatives there are, the more information a choice conveys.
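The last point above can be sketched in a few lines of Python (the helper name `bits_of_information` is my own, chosen for illustration): a choice among n equally likely alternatives carries log2(n) bits, so a binary choice is exactly one bit.

```python
import math

def bits_of_information(n_alternatives: int) -> float:
    """Bits conveyed by choosing among n equally likely alternatives."""
    return math.log2(n_alternatives)

print(bits_of_information(2))  # 1.0 -- a binary choice is exactly one bit
print(bits_of_information(8))  # 3.0 -- more alternatives, more information
```

Note that this logarithmic measure only applies when the alternatives are equally likely; unequal probabilities require the full entropy formula.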

## What applications does information theory have?

Information theory represents one of the most important branches of applied mathematics. Some of its various uses include:

• Computer science, such as cryptography and data compression.
• Electrical engineering, such as communication theory and coding theory.
• Statistics.
• Biology, in the study of DNA sequences and the genetic code.
• Payments, electronic transactions, and authentication processes.
• Encrypting messages or steganography.