Dońda’s Law

In recent years, papers by Melvin M. Vopson et al. (doi: 10.1063/5.0100358, doi: 10.1063/1.5123794; AIP Advances) from the University of Portsmouth have appeared, devoted to so-called information dynamics as an analogue of thermodynamics, linked by the concept of entropy treated on equal grounds with the information entropy introduced, e.g., by Shannon. This even led to the suggestion of a mass-energy-information equivalence principle, which repeats the humorous idea of the Polish science-fiction author Stanisław Lem in his 1976 story ‘Professor A. Dońda’
(https://en.wikipedia.org/wiki/Professor_A._Do%C5%84da).
The concept seems rather naive, but it still prompts a sad observation about the contemporary scientific publishing (and peer review) system.
The principal idea leads to the conclusion that a large amount of information would acquire mass.
The definition of information entropy originated in telegraph communication and referred to a set of separable signals of known meaning.
“Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.” C. E. Shannon, ‘A Mathematical Theory of Communication’, The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, 1948.
In the physical world, all data perceived by the senses can be information, but we cannot quantify it without imposing meaning on it. Shannon’s definition of information entropy relates to understandable, meaningful information that can be quantified and transmitted. For a real physical system, the amount of all information carried by it cannot be defined; in the quantum realm it would have to be expressed in non-measurable qubits. Estimating probabilities depends on the definition of a probability space, i.e. of the set of all equally likely elementary events. But in physics, probability depends on energy, which systems tend to minimize, and various particles or quasi-particles (states) follow different statistics: Boltzmann, Fermi-Dirac, or Bose-Einstein.

Coding of information may consist in changing the state of fragments of a system (which remain in a metastable state) so that the change can be detected (read) while conserving that state. The fragments may interact slightly with one another, changing the overall energy of the system, so different ‘words’ may have slightly different energies. Why should they have different informational entropy? They do when they encode the letters of a language; different sequences also have different probabilities in the case of genes or the nucleic bases of the human genome. But a code can be somewhat arbitrary, as in the coding of numbers (though not of processor commands), and it is hard to see why different sequences of bits should have different probabilities in the case of numbers in computer memory.

The context of meaning is of fundamental importance here, and information entropy cannot be compared or treated on equal grounds with thermodynamic entropy. The former is context-dependent and cannot be a property of a physical system.
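This context dependence can be illustrated with a minimal sketch (the bit string and the two probability models below are hypothetical, chosen only for illustration): the Shannon entropy H = -Σ p_i log2(p_i) assigned to one and the same stored bit pattern changes when the assumed alphabet changes.

import math
from collections import Counter

def shannon_entropy(probabilities):
    # Shannon entropy H = -sum(p * log2(p)), in bits per symbol
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

data = "1011010010110100"  # one and the same physical bit pattern

# Model 1: the alphabet is {0, 1}; probabilities estimated from frequencies.
p_bits = [n / len(data) for n in Counter(data).values()]

# Model 2: the alphabet is 4-bit words over the same string.
words = [data[i:i + 4] for i in range(0, len(data), 4)]
p_words = [n / len(words) for n in Counter(words).values()]

print(shannon_entropy(p_bits))   # 1.0 bit per bit-symbol
print(shannon_entropy(p_words))  # 1.0 bit per 4-bit word, i.e. 0.25 bit per bit

The resulting numbers are properties of the chosen model (the alphabet and the probability space), not of the bit pattern itself, which is exactly why the information entropy of a physical system is not well defined without an imposed meaning.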
Taking mass-energy equivalence into account, any attempt to measure an information-related rise in mass is futile because, in practice, ordinary changes in energy will dominate the hypothetical entropy-related change.
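A back-of-the-envelope estimate makes the point. This is a sketch under stated assumptions: the Landauer bound k_B·T·ln 2 per bit at T = 300 K, conversion through E = mc², and an approximate specific heat of silicon of about 0.71 J/(g·K); the comparison scenario itself is hypothetical.

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
c = 2.99792458e8    # speed of light, m/s
T = 300.0           # assumed room temperature, K

E_bit = k_B * T * math.log(2)  # Landauer energy per bit, ~2.87e-21 J
m_bit = E_bit / c**2           # mass equivalent per bit, ~3.2e-38 kg
m_TB = m_bit * 8e12            # a full terabyte of bits, ~2.6e-25 kg

# For comparison: heating 1 g of silicon by a mere 1 mK shifts its
# mass-energy by roughly four orders of magnitude more than that.
c_Si = 0.71                        # approx. specific heat of Si, J/(g*K)
m_heat = c_Si * 1.0 * 1e-3 / c**2  # ~7.9e-21 kg

print(f"{m_bit:.2e} kg/bit, {m_TB:.2e} kg/TB, {m_heat:.2e} kg per mK in 1 g Si")

Even under these generous assumptions, an uncontrolled millikelvin drift in a gram of storage medium swamps the hypothetical information mass of an entire terabyte, so the ordinary energy balance of the measurement dominates.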