For this purpose, information entropy was developed as a way to estimate the information content in a message: a measure of the uncertainty reduced by the message. So the primary measure in information theory is entropy. The English meaning of the word entropy is a state of disorder, confusion, and disorganization. But first things first: what is this information? What ‘information’ am I referring to? Let’s look at this concept in depth. In simple words, information is facts learned about something or someone. Notionally, information is something that can be stored in, transferred, or passed on as variables, which can in turn take different values. In other words, a variable is a unit of storage. We get information from a variable by seeing its value, in the same way that we get information from a message or letter by reading its content. Entropy measures the “amount of information” present in a variable. This amount depends not only on the number of different values the variable can take but also on the amount of surprise each value holds.
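To make the last point concrete, here is a minimal sketch (not from the original article) of Shannon entropy, H(X) = −Σᵢ pᵢ log₂ pᵢ, for a discrete variable. The helper name `entropy` is my own; it weights each value's probability p by its "surprise" log₂(1/p), so rare values contribute more information:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of a sequence of observed values.

    Each distinct value contributes p * log2(1/p): its probability p
    weighted by its 'surprise' log2(1/p), so rarer values carry
    more information per occurrence.
    """
    counts = Counter(values)
    n = len(values)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A fair coin is maximally uncertain: 1 bit of entropy.
print(entropy(["H", "T"]))            # 1.0
# A constant variable holds no surprise: 0 bits.
print(entropy(["H", "H", "H", "H"]))  # 0.0
```

Note how the two extremes bracket the concept: maximum entropy when all values are equally likely, zero entropy when the outcome is certain.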
“Information theory is a mathematical approach to the study of the coding of information, along with the quantification, storage, and communication of information.” In his paper, Shannon set out to mathematically measure the statistical nature of “lost information” in phone-line signals. The work was aimed at the problem of how best to encode the information a sender wants to transmit.
This article was published as a part of the Data Science Blogathon.

## Introduction

Entropy is one of the key concepts in Machine Learning. It is a must-know for anyone who wants to make a mark in Machine Learning, and yet it perplexes many of us. The focus of this article is to understand how entropy works by exploring the underlying concepts of probability theory, how the formula works, its significance, and why it is important for the Decision Tree algorithm. The term entropy was first coined by the German physicist and mathematician Rudolf Clausius and was used in the field of thermodynamics. Claude Shannon, a mathematician and electrical engineer, published the paper “A Mathematical Theory of Communication”, in which he addressed the measurement of information, choice, and uncertainty. Shannon is known as the ‘father of information theory’, as he invented the field.