Money-Medicine-Entropy-Information



By Ramana Annamraju

                           

A petty thief in the state of Michigan botched a robbery, leaving one shoe behind at the scene while trying to flee. Authorities ran a DNA profile on the shoe, but the biological sample was not sufficient to identify anyone in the database. The authorities then sent the 'extracted data', without any biological sample, to a unique company in New Zealand that specializes in mathematical algorithms based on the law of Entropy.

The entropy-based algorithm extracted relevant data from the chaotic data in a way that otherwise seemed impossible. The culprit was identified and convicted on the extracted data alone, without any biological sample. The robber decried it as a conviction by computer, not by law. He did not realize his conviction stemmed from a universal law more powerful and more convincing than any law from this planet: the second law of thermodynamics. Even the Universe cannot escape this law. The court upheld his conviction and sentenced him.

Which one do you pick? The counted million-dollar pack on the left, or the scrambled cash on the right? How do you know for sure there is less money in the scrambled pile? The answer is embedded in the law of Entropy. If you pick the uncounted cash, there is additional work involved. You have to hire and pay workers to count it. Before that, you have to run background checks on those employees, which costs money. Then you have to install security cameras, which cost money too; still, the workers may steal. These are the additional costs incurred with scrambled cash. Overall, you take home less. There is a random chance the uncounted pile holds more cash, but randomness, by default, creates uncertainty, and uncertainty lowers the value of money. The law of Entropy says that the less information you know about a system, the higher its Entropy will be. When the Entropy value keeps going up, it eventually results in a market crash.

Entropy is the measuring tool of the second law of thermodynamics. It may look strange, but there is a deep connection between Entropy and Information. That connection was realized by a brilliant American engineer and MIT professor, Claude Elwood Shannon (1916 - 2001).



With all its mistakes, you could still read the sentence "dootcr has aimttded the pintaet". That is not because you are a genius, as social media quizzes claim, but because of the Entropy of the English language. In 1951, Claude Shannon published a paper, "Prediction and Entropy of Printed English", showing upper and lower bounds on that Entropy. English may not be an efficient language, but it furnishes abundant Information through punctuation, commas, and embedded silent letters. Therefore, you can easily comprehend it without asking further questions. On the other hand, my native language, Telugu, from India, does not have periods or commas. If a single letter is missing from a word, it is hard to understand even for people who know the language.

It means you have to ask more questions to clarify what it is. Fewer questions mean the Information has lower Entropy; more questions mean it has higher Entropy. Using word-frequency statistics, Claude Shannon calculated the Entropy of printed English at approximately 2.62 bits per letter. Shannon was truly ingenious in calculating and quantifying abstract Information.
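Shannon's measure can be sketched in a few lines. The function below is a minimal illustration, not Shannon's full method: it computes the first-order entropy of a text from its letter frequencies alone, whereas Shannon also used word statistics and human prediction experiments, so the numbers this sketch produces for English run higher than his 2.62-bit figure.

```python
from collections import Counter
from math import log2

def shannon_entropy(text):
    """First-order Shannon entropy of a text, in bits per letter."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair coin toss carries exactly one bit of information:
print(shannon_entropy("htht"))  # → 1.0
```

The fewer questions you need to ask (the more predictable each letter is), the lower this number falls; a fully predictable text, such as "aaaa", scores zero bits.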




Whether you are the president of a country designing economic policies, a software engineer developing an eCommerce platform, or the CEO of a hospital concerned about healthcare access and equity, you have to understand the concept of Entropy. Once you have a basic understanding of it, you will be amazed at what a powerful tool it is in your everyday work.



The law of Entropy bounds Energy, Money, and Information. In a free-market society, price is by far the most important Information: it is where money "wants" to go. Some economists argue that Bitcoin will be the future of money because of its low Entropy value compared to fiat money. Fiat money is not backed by any commodity such as gold or silver and is typically declared legal tender by government decree. High-Entropy money like fiat money allows corruption and other chaotic conditions to arise because the Information available is opaque to the public.

Software makers are under the impression that growth in eCommerce is the panacea for all economic woes. Rising stock values and increased salaries, who does not like that? But in reality, the more 'orders' you generate, the more consumption and production follow. Energy consumption and demand for resources keep increasing. According to the law of Entropy, this creates ever-increasing 'disorder', and that disorder manifests as chaos.

                 


Robert Ayres, a renowned economist and physicist and an ardent supporter of the law of Entropy, puts the argument this way: "We burn oil. We mine ores and smelt them into metals, which we fashion into goods. But we do so at the expense of the depletion of those irreplaceable natural resources." The unprecedented energy consumption by humans creates natural calamities never seen before: uncontrollable fires and ever-increasing floods. Whether we believe in God or not, we have to trust the Second Law of Thermodynamics, for ourselves and for our children.

If you are a healthcare leader, it is important to notice that Entropy sits at the center of healthcare equity and access. Shannon Entropy has been used successfully in several countries to measure unevenness in healthcare delivery; it may be surprising, but the nation of Iran is one of them. There is no reason we cannot apply it here in America. Hospitals in rural America may need more physicians than the physician-to-population ratio suggests. The burden of multiple chronic conditions specific to ethnic populations may call for additional resources. Varied topography, like the Appalachian mountain region, plays a role. Wherever there is complexity and randomness in healthcare needs, Entropy-based mathematics sheds light, putting crystal-clear information right in front of you. For a CEO, access to such Information is more precious than diamonds.
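One simple way such an analysis can work is to treat the spread of a resource across regions as a probability distribution and score its evenness with Shannon entropy. The sketch below is a hypothetical illustration of that idea (the region counts are invented, and real studies use richer models): it returns 1.0 when physicians are spread perfectly evenly and falls toward 0 as they concentrate in one place.

```python
from math import log2

def distribution_evenness(counts):
    """Shannon-entropy evenness of a resource spread across regions.
    1.0 = perfectly even spread; near 0 = concentrated in one region."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    # Actual entropy of the distribution, normalized by the maximum
    # possible entropy, log2 of the number of regions.
    h = -sum(p * log2(p) for p in probs)
    return h / log2(len(counts))

# Hypothetical physician counts in four regions of a state:
print(distribution_evenness([25, 25, 25, 25]))  # → 1.0 (perfectly even)
print(distribution_evenness([97, 1, 1, 1]))     # far lower: access is unequal
```

A leader could track this single number over time: a falling score flags growing inequity in access before it shows up in outcomes.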

We cannot separate Entropy and Information. For that, we have to thank a remarkable American of the 20th century, Claude Elwood Shannon. It is a well-deserved honor to call him the "Father of Information Theory."

*********

PS: On a personal note, I owe an apology to my friend Laura Turner, daughter of CNN founder Ted Turner. In a friendly debate at a Silicon Valley festival, I argued that higher consumption increased living standards in countries like India and China. I was wrong!


Laura is far more far-sighted than I am. As a leader of the Captain Planet Foundation, she foresaw that an insatiable appetite for consumption results in environmental disaster. Laura Turner has the rock-solid Second Law of Thermodynamics behind her cause.

________________________________



Additional Read :

(The law of Entropy bounds Energy, Money, and Information. The equation in the picture demonstrates that relationship: TE = thermodynamic Entropy, ME = monetary Entropy, IE = Information Entropy, and X = the exponential order of scaling of technological innovation.)




