Sat, 29 Apr 2017

Last updated: Fri, 28 Apr 2017, 10pm

It’s Official. Cooling To Absolute Zero Is Mathematically Impossible

THERMODYNAMICS

The laws of thermodynamics, a cornerstone of modern physics, describe how physical quantities such as temperature, energy, and entropy behave under various conditions. These laws have mostly gone unchallenged, except for the third law, which was developed by the chemist Walther Nernst between 1906 and 1912. Scientists have long questioned the validity of this law, which states:

The entropy of any pure substance in thermodynamic equilibrium approaches zero as the temperature approaches zero (Kelvin), or conversely, the temperature (Kelvin) of any pure substance in thermodynamic equilibrium approaches zero when the entropy approaches zero.
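The Nernst statement above can be written compactly as a limit. This is a standard textbook form of the third law, not a formula taken from the UCL paper:

```latex
% Third law of thermodynamics (Nernst form):
% the entropy S of a pure substance in thermodynamic
% equilibrium vanishes as the absolute temperature T does.
\lim_{T \to 0} S(T) = 0
```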

In a recent development, researchers from University College London (UCL) published a paper in Nature Communications proving that absolute zero cannot be reached in a finite number of steps in a system whose entropy cannot reach zero. According to one member of the research team, Lluis Masanes, “We show that you can’t actually cool a system to absolute zero with a finite amount of resources and we went a step further…We then conclude that it is impossible to cool a system to absolute zero in a finite time, and we established a relation between time and the lowest possible temperature. It’s the speed of cooling.”
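The “relation between time and the lowest possible temperature” that Masanes describes can be sketched schematically. The constant $C$ and exponent $\alpha$ below are illustrative placeholders standing in for protocol-dependent quantities; this is the general shape of such a bound, not the paper’s actual result:

```latex
% Schematic unattainability bound: after cooling for a time t,
% the reachable temperature is bounded below by an inverse
% power law. C and alpha > 0 are placeholders that would depend
% on the cooling protocol and available resources.
T_{\min}(t) \;\gtrsim\; \frac{C}{t^{\alpha}}
```

The key qualitative point is that $T_{\min}(t) \to 0$ only as $t \to \infty$, so reaching absolute zero would take infinite time or infinite resources.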

THE COOL IMPLICATIONS

These and other scientists hope this research will prompt the field to take the third law of thermodynamics more seriously, and they believe the result could have significant practical implications. Understanding cooling, and the rate at which a system can actually be cooled, could help to advance current quantum computing research.

Quantum computing has advanced considerably in recent years. Some companies, such as IBM, have even set their sights on commercializing quantum computing (something many have seen as more of a sci-fi fantasy than a practical reality). Understanding the third law of thermodynamics, and the rate at which cooling occurs, could push this quantum research even further, since keeping hardware cold enough is a major concern in such high-level computing.

-Futurism