Money-Medicine-Entropy-Information
By Ramana Annamraju
A petty thief in the state of Michigan botched a robbery, leaving one shoe behind at the scene as he fled. Authorities ran a DNA profile on the shoe, but the biological sample was not sufficient to identify anyone in the database. The authorities then sent the extracted data, without any biological sample, to a unique company in New Zealand that specializes in mathematical algorithms based on the law of Entropy.
The entropy-based algorithm pulled relevant data out of the chaotic data in a way that had otherwise seemed impossible. With that extracted data, and without any biological sample, the culprit was identified and convicted. The robber decried it as a computer conviction, not a conviction under the law. He did not realize the conviction stemmed from a universal law more powerful and more convincing than any law on this planet: the second law of thermodynamics. Even the Universe cannot escape it. The court upheld his conviction and sentenced him.
Which one do you pick? The million-dollar pack on the left or the scrambled cash on the right? How do you know for sure there is less money in the scrambled pile? The answer is embedded in the law of Entropy. If you pick the uncounted cash, there is additional work involved. You have to hire and pay workers to count it. Before that, you have to run background checks on those employees, which costs money. Then you have to install security cameras, which cost money too; still, the workers may steal some of it. These are the additional costs incurred with scrambled cash, so overall you take home less. There is a random chance the uncounted pile holds more cash, but randomness, by its very nature, creates uncertainty, and uncertainty lowers the value of money. The law of Entropy says the less information you know about a system, the higher its Entropy will be. When the Entropy value keeps going up, it eventually results in a market crash.
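To make the counted-versus-uncounted comparison concrete, here is a minimal sketch in Python (the probabilities are hypothetical, chosen only for illustration). A counted pile has one known outcome and zero entropy; an uncounted pile spreads probability across many possible totals and so carries higher entropy, that is, more uncertainty.

import math

def shannon_entropy(probabilities):
    # Shannon entropy H = -sum(p * log2(p)), measured in bits.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Counted pile: the total is known exactly, so there is a single
# outcome with probability 1 and therefore zero uncertainty.
counted = [1.0]

# Uncounted pile: suppose the total could be any of 16 equally likely
# amounts (a hypothetical spread, for illustration only).
uncounted = [1 / 16] * 16

print(f"Counted pile entropy:   {shannon_entropy(counted):.2f} bits")    # 0.00
print(f"Uncounted pile entropy: {shannon_entropy(uncounted):.2f} bits")  # 4.00

The higher number for the uncounted pile quantifies exactly the extra uncertainty that, as argued above, translates into extra cost and a lower effective value.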
Entropy is the measuring tool of the second law of thermodynamics. It may look strange, but there is a deep connection between Entropy and Information. That connection was realized by a brilliant American engineer and MIT professor, Claude Elwood Shannon (1916-2001).
Robert Ayers, a renowned economist and physicist and an ardent supporter of the law of Entropy, puts the argument this way: "We burn oil. We mine ores and smelt them into metals, which we fashion into goods. But we do so at the expense of the depletion of those irreplaceable natural resources." The unprecedented energy consumption by humans creates natural calamities never seen before: uncontrollable fires and ever-increasing floods. Whether we believe in God or not, we have to trust the Second Law of Thermodynamics, for ourselves and for our children.
If you are a healthcare leader, it is important to notice that Entropy is at the center of healthcare equity and access. Shannon Entropy has been used successfully in several countries to measure unevenness in healthcare delivery; it may be surprising, but the nation of Iran is one of them. There is no reason we cannot apply it here in America. Hospitals in rural America may need more physicians than the physician-to-population ratio suggests. The burden of multiple chronic conditions specific to ethnic populations may require additional resources. Varied topography, like the Appalachian mountain region, also plays a role. Wherever and whenever there is complexity and randomness in healthcare needs, Entropy-based mathematics sheds light, putting crystal-clear information right in front of you. For a CEO, access to that information is more precious than diamonds.
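As a rough illustration of how Shannon Entropy can flag uneven healthcare delivery, the sketch below (the region names and physician counts are hypothetical) compares the entropy of a physician distribution to the maximum possible entropy. A ratio near 1 means physicians are spread evenly across regions; a low ratio signals concentration and potential access gaps.

import math

def shannon_entropy(shares):
    # Shannon entropy of a probability distribution, in bits.
    return -sum(s * math.log2(s) for s in shares if s > 0)

def evenness(counts):
    # Observed entropy divided by the maximum entropy (log2 of the
    # number of regions). 1.0 means a perfectly even spread.
    total = sum(counts)
    shares = [c / total for c in counts]
    return shannon_entropy(shares) / math.log2(len(counts))

# Hypothetical physicians per region (illustrative numbers only).
even_spread   = {"Urban A": 250, "Urban B": 240, "Rural C": 260, "Appalachia D": 250}
skewed_spread = {"Urban A": 700, "Urban B": 250, "Rural C": 40,  "Appalachia D": 10}

print(f"Even spread evenness:   {evenness(list(even_spread.values())):.3f}")   # ~1.000
print(f"Skewed spread evenness: {evenness(list(skewed_spread.values())):.3f}") # ~0.556

A fuller analysis would weight by population (physician-to-population ratios) rather than raw head counts; this sketch only shows the mechanics of the measure.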
We cannot separate Entropy and Information. For this, we have to thank a remarkable American of the 20th century, Claude Elwood Shannon. It is a well-deserved honor to call him the "Father of Information Theory."
PS: On a personal note, I apologize to my friend Laura Turner, daughter of CNN founder Ted Turner. In a friendly debate at the Silicon Valley festival, I argued that higher consumption raised living standards in countries like India and China. I was wrong!
________________________________
Additional Reading:
(The law of Entropy bounds Energy, Money, and Information. The equation in the picture demonstrates that relationship, where TE = thermodynamic Entropy, ME = monetary Entropy, IE = information Entropy, and X = technological innovation's exponential order of scaling.)