Physicists have measured the entropy a clock generates each time it ticks, using a clock that can be operated at various levels of precision. The findings, reported in Physical Review X, suggest that the more precise a clock is, the more entropy it emits.
Like any machine that emits heat and increases the entropy of its surroundings, every clock, even an ordinary battery-powered one, generates entropy when it ticks, according to Science News. In other words, each tick and tock carries a fundamental entropy cost.
Theorists had previously shown that a clock's maximum possible tick precision is directly tied to the total entropy it releases. However, that result applied only to tiny quantum clocks, as larger clocks are too complicated for such calculations, so it wasn't clear whether the rule held for other types of clocks as well.
'Measuring the thermodynamic cost of timekeeping'
To see how much entropy a simple clock releases when it ticks, the physicists built one from a thin membrane, tens of nanometers thick and 1.5 millimeters long, suspended between two posts.
When an electrical signal was sent into the clock, the membrane flexed up and down, and this regular, repeating motion served as the clock's ticks.
The stronger the electrical signal, the more precisely the clock ticked. And as the clock's precision improved, the entropy it generated rose in step, suggesting that the theoretical relationship derived for quantum clocks extends to other kinds of timepieces.
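The trend described above, entropy rising in step with precision, can be sketched with a toy linear model. The proportionality constant `k` and the function below are illustrative assumptions, not values or methods from the paper:

```python
# Toy model of the precision-entropy trade-off described in the article.
# Assumption (not from the paper): entropy generated per tick grows
# linearly with precision, S = k * N, where N is a dimensionless measure
# of tick precision and k is an arbitrary illustrative constant.

def entropy_per_tick(precision: float, k: float = 1.0) -> float:
    """Entropy cost of one tick under the assumed linear model."""
    return k * precision

# Under this model, doubling precision doubles the entropy cost per tick.
low = entropy_per_tick(100.0)
high = entropy_per_tick(200.0)
print(high / low)  # → 2.0
```

The point of the sketch is only that a more precise clock cannot come for free: under any such monotonic relation, improving precision must be paid for with more entropy dumped into the surroundings.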
That said, the physicists studied only one type of clock, so it remains unclear whether the relationship between precision and entropy holds for clocks in general, though some scientists believe it does.
According to quantum physicist Ralph Silva of ETH Zurich, who was not involved in the study, the new result points in that direction. "It’s a data point in favor that it’s probably the case for all clocks. But that’s not been proven," he said.