Entropy and Enthalpy:
Reality or Grandiose Mistake?

Kent Mayhew and Hugo Hernandez
[email protected]
[email protected]

ForsChem Research Reports, 8, 2023-06
doi: 10.13140/RG.2.2.24085.19683

Cite as: Mayhew, K. & Hernandez, H. (2023). Entropy and Enthalpy: Reality or Grandiose Mistake? ForsChem Research Reports, 8, 2023-06, 1-19. Publication Date: 24/03/2023.

Abstract

Thermodynamics embraces both the concepts of entropy and enthalpy. Strangely, both concepts express work done by an expanding system in terms of W = PdV, without providing the necessary clarity concerning what PdV actually represents. In this report, we shall discuss that such work is external to the expanding system. This means that one should question the validity of accepted notions in thermodynamics, including those concerning both entropy and enthalpy.

1. Introduction
Of course, all thermodynamic concepts were born from the necessity to explain different
phenomena observed in Nature. Perhaps Heat and Temperature are the most tangible notions
we may experience, thanks to the thermal receptors in our skin. Like all our other senses,
those thermal feelings are just the interpretation in our brains of natural phenomena. In this
case, our perception of heat and temperature is related to the frequency and intensity of
molecular collisions with our thermal receptors, creating the feelings of ‘hot’ and ‘cold’ in our
brains. However, neither Energy nor Heat is a physical entity (like matter); they cannot be directly measured, only calculated.
While Energy is a familiar notion to most of us, other concepts such as Entropy and Enthalpy
are more difficult to assimilate. And this is in part due to the vagueness involved in their
definitions.
In classical thermodynamics one often considers the change of a system's internal energy ($\Delta U$) in terms of the energy transferred into that system as heat ($Q_{in}$), and the work that a closed expanding system does ($W_{ext}$). As the subscript indicates, such work done is external to the system. In which case, one can write the First Law of Thermodynamics [4] as follows:

$\Delta U = Q_{in} - W_{ext}$  (2.1)
In the mid-19th century Clausius considered a new parameter, calling it entropy ($S$), with the notion that an isothermal entropy change ($\Delta S$) could be envisioned in the following manner [5]:

$\Delta S = \frac{Q_{in}}{T}$  (2.2)

where $T$ is the temperature of the system. In turn, the work done by an expanding system is traditionally expressed as:

$W = P \, dV$  (2.3)

where $W$ is the work done by the system onto the surrounding environment.
Mayhew [6,7] has pointed out that, as written in Eq. (2.3), $W$ lacks clarity. First, one needs to know if the pressure refers to the internal pressure ($P_{int}$) of the system or to the external pressure ($P_{ext}$). Secondly, one needs to know if the volume refers to the volume of the system ($V_{sys}$), or to the volume of the surroundings ($V_{surr}$).
Eq. (2.3) can be written with more clarity in terms of the closed expanding system as follows:

$W_{ext} = P_{ext} \, \Delta V_{sys}$  (2.4)
Notice that the work done by the system depends on the pressure opposing motion [8]. In the
case of a closed expanding system, the pressure opposing motion is the external pressure.
For a system expanding against the surrounding atmosphere, the external pressure is the atmospheric pressure ($P_{atm}$), so that:

$W_{ext} = P_{atm} \, \Delta V_{sys} = P_{atm} \, (V_2 - V_1)$  (2.5)
If one prefers, the two systems can be separated. In which case, one would say that the expanding system lifts the overlying surrounding atmosphere. Either way, the upward lifting of the surrounding/overlying atmosphere's mass signifies a potential energy increase of that atmosphere's mass ($\Delta PE_{atm}$). This can be expressed as follows:

$W_{atm} = m_{atm} \, g \, \Delta h$  (2.6)

where $m_{atm}$, $g$, and $\Delta h$, respectively, are the mass of the overlying atmosphere that was lifted, the acceleration due to Earth's gravity (9.82 m/s²), and the height that the atmosphere was lifted. Therefore, Eq. (2.6) can be envisioned in terms of a potential energy increase of the surrounding atmosphere. In which case, one may choose to rewrite Eq. (2.6) as follows:

$W_{atm} = \Delta PE_{atm} = m_{atm} \, g \, \Delta h$  (2.7)
To some, a certain mental simplicity may be obtained by contemplating in terms of the heating of the expanding system. If $Q_{in}$ is again considered to be the heat in, then Eq. (2.7) becomes:

$Q_{in} = \Delta U + \Delta PE_{atm}$  (2.8)

This can be rewritten as:

$Q_{in} = \Delta U + m_{atm} \, g \, \Delta h$  (2.9)

which in the simplest terms imaginable can be rewritten in terms of the work done onto the atmosphere ($W_{atm}$). That being a first law style equation:

$\Delta U = Q_{in} - W_{atm}$  (2.10)
A circular path has just been walked. It drives home the point that lucidity can be obtained if
one accepts that the atmosphere has mass, and that all expanding systems on Earth’s surface
must lift the mass of the overlying/surrounding atmosphere. Again, in the simplest of terms,
this is work done onto the atmosphere by the expanding systems, as defined by Eq. (2.6).
Importantly, such work results in a potential energy increase of the overlying/surrounding
atmosphere.
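The equivalence between the $P \Delta V$ work and the atmosphere's potential-energy gain can be checked numerically. The following sketch is our own illustration (not taken from the paper), with a hypothetical piston area and displacement; it rests on the fact that the weight per unit area of the overlying air column equals the atmospheric pressure:

```python
# Illustrative sketch: for a piston of cross-section A displaced by dh against
# the atmosphere, the work P_atm*dV equals the potential-energy gain m*g*dh of
# the overlying air column, because that column's weight per unit area is
# exactly P_atm (i.e., m_column*g = P_atm*A). All numbers are hypothetical.
P_atm = 101325.0   # Pa, standard atmospheric pressure
g = 9.81           # m/s^2, acceleration due to gravity (assumed value)
A = 0.01           # m^2, piston cross-sectional area (hypothetical)
dh = 0.05          # m, piston displacement (hypothetical)

m_column = P_atm * A / g        # mass of the overlying air column, kg
dV = A * dh                     # volume swept by the piston, m^3

W_pdv = P_atm * dV              # work done onto the atmosphere, J
dPE = m_column * g * dh         # potential-energy increase of the lifted column, J

print(W_pdv, dPE)               # both are identical (~50.66 J here)
assert abs(W_pdv - dPE) < 1e-9
```

The identity holds for any choice of A and dh, since the factor A·dh cancels on both sides.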
One must emphasize that this atmospheric potential energy increase occurs regardless of whether the expanding system expands upwards, downwards or sideways. In all cases, the atmosphere can only move upwards; it certainly cannot move downwards into the Earth. And if the system expands sideways, then one may deal with a pressure increase, which over time (seconds at the most) will result in the lifting of the atmosphere's mass as the surrounding atmosphere readjusts back to atmospheric pressure.
To ensure that there is no misunderstanding, Figure 1A) shows a horizontally expanding piston-cylinder which can be described by Eq. (2.10). One could further envision, as shown in Figure 1B), a massless piston in a vertical piston-cylinder expanding with a mass ($m$) on top of the piston. In which case, Eq. (2.10) becomes§:

$\Delta U = Q_{in} - m_{atm} \, g \, \Delta h - m \, g \, \Delta h$  (2.11)
Let us now add some complication to Eq. (2.10) and write:

$Q_{in} = \Delta U + P_{atm} \, \Delta V_{sys} = \Delta U + m_{atm} \, g \, \Delta h$  (2.12)

Mayhew [6,7,9,10,11] has further realized that the work done onto the atmosphere ($W_{atm}$) can be associated to the magnitude of lost work ($W_{lost}$). Thus, in terms of lost work, Eq. (2.12) can be rewritten as:

$Q_{in} = \Delta U + W_{lost}$  (2.13)
Why is the work done onto the atmosphere ($W_{atm}$) considered as being lost work ($W_{lost}$)? The
reason is that the heated expanding system will never be able to recover the work it did onto
the surrounding atmosphere. Hence, that work is forever lost by that expanded system.
In terms of potential energy, if that expanded system starts to contract then the atmosphere’s
potential energy increase (as defined by Eq. 2.6) becomes kinetic energy within the
surrounding atmosphere. In other words, that energy is transformed from atmospheric potential energy into atmospheric kinetic energy, resulting in an infinitesimal heating of our atmosphere. Or, if one prefers, an infinitesimal temperature increase of the surrounding atmosphere.
If the energy of the system remains constant during the expansion process (isothermal process), then $\Delta U = 0$, and Eq. (2.13) becomes:

$\Delta S = \frac{W_{lost}}{T} = \frac{m_{atm} \, g \, \Delta h}{T}$  (2.14)
Eq. (2.14) seems ridiculous because it equates the entropy change of the expanding system to the work lost onto the surrounding atmosphere (which depends on the state of the atmosphere). This alone should lend itself to the questioning of entropy as a state function, let alone as the foundation of thermodynamics.
§ When a liquid turns into a gas, there is a potential energy increase due to the change in position of the vaporizing molecules. However, the mass of the vaporizing molecules is very small when compared to the mass of the overlying/surrounding atmosphere, so such potential energy contribution can be neglected.
If $\Delta U = 0$, then one might ponder: can the system's temperature increase? Since the entropy change is contemplated in terms of an isothermal system, and $W_{lost}$ is in terms of the atmosphere's energy increase, expressing what is witnessed in terms of Eq. (2.14) seems nonsensical. In reality, as the expanding system is heated, it must undergo an infinitesimal temperature increase that results in an infinitesimal pressure increase. It is this pressure increase that forces the system to expand (since $P_{int} > P_{atm}$), and hence, to perform work onto the surrounding atmosphere (lost work). Seemingly, infinitesimal arguments help hide the inconsistencies inherent to isothermal entropy increases.
Are isothermal entropy increases really needed? One could rewrite Eq. (2.14) in terms of the heat into the expanding system, i.e.:

$Q_{in} = W_{lost} = m_{atm} \, g \, \Delta h$  (2.15)
Certainly, realistic expanding systems such as those of a car’s piston-cylinder do not adhere to
infinitesimal arguments, and hence do not adhere to the notions of entropy. Consider that the
closed piston-cylinder (system) is quasi-static and is not insulated. As that system expands it
does work onto the surrounding atmosphere, but the system’s temperature remains constant.
Since the gas’ expansion is isothermal, then in terms of the kinematics of the gas molecules,
there is no system energy change. However, the volume between the gas molecules expands
and this volume is occupied by radiation (generally blackbody). Thus, there must be an increase
in the amount of radiation in any expanding isothermal gaseous system. This radiation energy
increase must come from a combination of the system walls and the heat into the system.
For low and moderate temperature systems, the radiation's energy is negligible when compared to the energy associated with the gas molecules' kinematics. Therefore, for any expanding gas the notion of $\Delta U = 0$ is an approximation based upon the assumption that the energy of the gas' kinematics is much greater than the energy of the surrounding radiation.
The above would not be true for a high temperature expanding gas. Consider a blast furnace.
There is no arguing that radiative heat emanating from the blast furnace is a significant portion
of the witnessed thermal energy. That thermal energy is an undeniable combination of
kinematic energy and radiative energy. While kinematic energy is associated with the gas
molecules’ translational, rotational and vibrational energies, radiative energy is dispersed
between those gas molecules.
Consider the isothermal expansion of a compressed gas. In order to remain isothermal that gas
must expand in a quasi-static manner. In so doing, it allows for thermal energy (heat) to pass
from the walls into the expanding gas. The walls would cool unless they obtain that thermal
energy from the surrounding atmosphere. In other words, the atmosphere unwittingly passes
a combination of kinematic and radiative heat back into the system’s walls, which then conduct energy back into the expanding gas. One realizes the need for quasi-static conditions
without the need for entropy.
Are the complications that entropy instills necessary? Reconsider Eq. (2.12) and/or (2.13). One may choose to rewrite Eq. (2.12) in terms of isothermal entropy change as follows:

$T \, \Delta S = \Delta U + P_{atm} \, \Delta V_{sys} = \Delta U + m_{atm} \, g \, \Delta h$  (2.16)

However, there are none of Entropy's required over-complications, if one simply writes:

$Q_{in} = \Delta U + m_{atm} \, g \, \Delta h$  (2.17)

If the expanding system undergoes a temperature change (with no phase change), then in terms of that system's molar isometric heat capacity ($C_v$) and its number of moles ($n$), one obtains:

$Q_{in} = n \, C_v \, \Delta T + m_{atm} \, g \, \Delta h = n \, C_p \, \Delta T$  (2.18)

where $C_p$ is the isobaric heat capacity.
There is nothing wrong with Eq. (2.17) or Eq. (2.18). If a system is heated quasi-statically, so that changes within are infinitesimal, then one could consider the process as isothermally heated. In which case one returns to Eq. (2.15).
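Eq. (2.18) is easy to verify numerically for an ideal gas, where $C_p = C_v + R$, so that the lost-work term $P \Delta V = n R \Delta T$ exactly accounts for the difference between the two heat capacities. The following is a minimal sketch of ours, assuming a monatomic ideal gas and hypothetical values of $n$ and $\Delta T$:

```python
# Illustrative check (assuming an ideal monatomic gas): heating n moles at
# constant external pressure, Q_in = n*Cv*dT + P*dV, and also Q_in = n*Cp*dT,
# since for an ideal gas P*dV = n*R*dT and Cp = Cv + R.
R = 8.314          # J/(mol K), gas constant
Cv = 1.5 * R       # molar isometric heat capacity, monatomic ideal gas
Cp = Cv + R        # molar isobaric heat capacity
n = 2.0            # moles (hypothetical)
dT = 10.0          # K, temperature increase (hypothetical)

W_lost = n * R * dT            # work done onto the atmosphere: P*dV = n*R*dT
Q_in = n * Cv * dT + W_lost    # heat in, following Eq. (2.18)

# The same heat computed directly from the isobaric heat capacity:
assert abs(Q_in - n * Cp * dT) < 1e-9
```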
3. Logarithmic Functionality
Is entropy nothing more than an ill-conceived mathematical contrivance [12,13]? How does one
explain all that necessary logarithmic functionality?
Clausius’ Eq. (2.2) expresses isothermal entropy change in a manner that enables one to attain a natural logarithmic functionality when the heat transferred to the system is exclusively used for temperature change (with no phase change). The resulting differential equation:

$dS = \frac{\delta Q_{in}}{T} = \frac{n \, C_v \, dT}{T}$  (3.1)

integrates to:

$\Delta S = n \, C_v \, \ln\!\left(\frac{T_2}{T_1}\right)$  (3.2)
The natural logarithm functionality emerges frequently throughout both thermodynamics and
chemistry, giving Entropy a sense of correctness. However, let us remark that natural logarithm
terms are present whenever we have a differential equation of the general form:

$dy = a \, \frac{dx}{x}$  (3.3)

where $x$ and $y$ are arbitrary variables, and $a$ is an arbitrary constant. The corresponding solution of Eq. (3.3) is:

$\Delta y = \int_{y_1}^{y_2} dy = a \int_{x_1}^{x_2} \frac{dx}{x} = a \, \ln\!\left(\frac{x_2}{x_1}\right)$  (3.4)
Such is the case, for example, of isothermal compression work done on an ideal gas, where**:

$W = -\int_{V_1}^{V_2} P \, dV = -\int_{V_1}^{V_2} \frac{N \, k_B \, T}{V} \, dV = N \, k_B \, T \, \ln\!\left(\frac{V_1}{V_2}\right)$  (3.5)

as well as first order kinetic equations with the general form:

$\frac{dC}{dt} = -k \, C$  (3.6)

** $N$ is the number of molecules in the ideal gas, and $k_B$ represents Boltzmann's constant.
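The logarithmic form of Eq. (3.5) can be checked against a direct numerical integration of $P\,dV$. This sketch is our own illustration, with hypothetical values (one mole at 300 K compressed 2:1):

```python
import math

# Illustrative sketch: isothermal compression work on an ideal gas,
# W = N*kB*T*ln(V1/V2), checked against a direct numerical integration of
# P dV with P = N*kB*T/V. All numbers are hypothetical.
kB = 1.380649e-23      # J/K, Boltzmann's constant
N = 6.022e23           # number of molecules (about one mole, hypothetical)
T = 300.0              # K
V1, V2 = 0.02, 0.01    # m^3, initial and final volumes (2:1 compression)

W_analytic = N * kB * T * math.log(V1 / V2)

# Midpoint-rule integration of W = -∫ P dV from V1 to V2
steps = 100000
dV = (V2 - V1) / steps
W_numeric = 0.0
for i in range(steps):
    V_mid = V1 + (i + 0.5) * dV
    W_numeric -= (N * kB * T / V_mid) * dV   # work done ON the gas

# The two results agree to high precision (~1729 J for these values)
assert abs(W_numeric - W_analytic) / W_analytic < 1e-6
```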
Since natural logarithms represent the inverse of the exponential function, logarithms can also
be found whenever an exponential term is present. Now, many phenomena involve the
exponential term since it is closely related to the normal distribution, which is widely found in
Nature, particularly when molecules are involved. Thus, exponential (and logarithmic) terms
are quite commonly found in molecular processes. This is the case, for example, of the
exponential term in Arrhenius-type reaction rate expressions [14].
As we can see, the emergence of logarithm functionalities in Nature is not related in any way to
the existence of the notion of Entropy.
Let us now reconsider Eq. (2.4). Quasi-static isothermal expansion often starts and finishes at atmospheric pressure, hence $P_{ext} = P_{atm}$. Therefore, one could write:

$W_{lost} = P_{atm} \, \Delta V_{sys} \approx N \, k_B \, T \, \ln\!\left(\frac{V_2}{V_1}\right)$  (3.7)

It must be stressed that Eq. (3.7) is only an approximation for quasi-static processes where changes are infinitesimal, rather than for realistic processes. One could divide through by temperature and hence rewrite Eq. (3.7) as:

$\Delta S = \frac{P_{atm} \, \Delta V_{sys}}{T} \approx N \, k_B \, \ln\!\left(\frac{V_2}{V_1}\right)$  (3.8)
Accepting Eq. (3.8) requires the complete understanding of what a system’s entropy change
actually is; an understanding beyond Clausius’ notion described by Eq. (2.2). Without this
clarity, it becomes an unnecessary over-complication of reality.
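The approximate character of Eq. (3.7) can be illustrated numerically: $P_{atm} \Delta V$ and $N k_B T \ln(V_2/V_1)$ agree only when the relative volume change is small. The sketch below is our own example under assumed conditions (about one mole at 300 K and 1 atm):

```python
import math

# Illustrative sketch of the approximate nature of Eq. (3.7): at
# P_sys ≈ P_atm, P_atm*ΔV and N*kB*T*ln(V2/V1) agree only for small
# relative volume changes. Hypothetical one-mole system at 300 K, 1 atm.
kB = 1.380649e-23
N = 6.022e23
T = 300.0
P_atm = 101325.0
V1 = N * kB * T / P_atm          # initial volume at atmospheric pressure

for expansion in (1.001, 1.01, 1.5):     # V2/V1 ratios
    V2 = expansion * V1
    W_pdv = P_atm * (V2 - V1)            # work as P_atm * ΔV
    W_log = N * kB * T * math.log(V2 / V1)   # logarithmic expression
    print(expansion, W_pdv / W_log)      # ratio → 1 only as expansion → 1

# For a 0.1% expansion the two expressions agree to ~0.05%;
# for a 50% expansion they differ by ~23%.
```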
Some years after the notion of Entropy was introduced by Clausius, Ludwig Boltzmann
provided a novel molecular interpretation of Entropy based on the indistinguishability of
individual molecules in a compound [15]. According to Boltzmann, Entropy is related to the
number of possible indistinguishable systems (microstates) sharing the same macroscopic
properties. Therefore, equilibrium will be determined by the system configuration with most
indistinguishable microstates, also representing the maximum possible entropy.
While Boltzmann’s and Clausius’ definitions of Entropy are completely different, they share a certain mathematical similarity under very specific conditions (e.g., ideal monatomic gas systems) [16], which has led to the assumption that both notions are equivalent. But they are not. Thus, it is highly questionable whether or not the interpretation of Entropy provided by statistical thermodynamics can simply replace everywhere the original concept described by Eq. (2.2).
For example, if we have a pure system (all molecules are identical and have the exact same mass), and ideally assume that all collisions are elastic, then the kinetic energy of the colliding molecules will be completely interchanged [2,17], and therefore, the distribution of molecular kinetic energies will remain invariant, independently of the original distribution considered, and no other distribution is feasible. In this case, any original distribution would become the equilibrium distribution, and it should have the highest possible Entropy, which is clearly a paradoxical result for statistical thermodynamics. Of course, the assumption of elastic collisions is only ideal because molecular collisions are basically always inelastic††; however, this does not change the fact that possible microstates are not equiprobable, and that their probability strongly depends on the current microstate of the system. It is also because of these constraints that the most probable molecular kinetic energy distribution in a system is the $\chi^2$ distribution and not the exponential Boltzmann distribution [18].
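The claim that elastic collisions between identical molecules merely interchange kinetic energies can be illustrated with the standard one-dimensional elastic-collision formulas. The sketch below is our own example with hypothetical velocities:

```python
# Illustrative sketch: in a head-on elastic collision between two identical
# masses, the velocities are simply exchanged, so the set of kinetic energies
# in the system is unchanged — whatever distribution of energies one starts
# with is preserved. Values are hypothetical.
def elastic_collision_1d(m1, v1, m2, v2):
    """Post-collision velocities for a 1D elastic collision (momentum and
    kinetic energy both conserved)."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

m = 4.65e-26             # kg, roughly the mass of an N2 molecule
v1, v2 = 500.0, -300.0   # m/s, incoming velocities (hypothetical)

u1, u2 = elastic_collision_1d(m, v1, m, v2)
# Equal masses: velocities (and hence kinetic energies) are swapped.
assert abs(u1 - v2) < 1e-9 and abs(u2 - v1) < 1e-9
```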
On the other hand, the so-called Second Law of Thermodynamics can be obtained from the
analysis of molecular collisions [19]. It can be shown that for large molecular systems, the
probability of obtaining a negative total change in system Entropy asymptotically approaches
zero, but is not necessarily zero. In fact, for small systems, such probability is not negligible, resulting in “apparent” violations of the second law. The problem here is that the “second law” is not a “law” in the strict sense of the word, as it is not universally valid, but only valid for large, macroscopic systems. In addition, the analysis of energy exchange during molecular collisions allows us to relate the net flow of energy to the temperature difference, for large macroscopic systems [17], explaining why heat flows from a hot body to a cold body without introducing the notion of Entropy.

†† Inelastic collisions occur when the total kinetic energy of the colliding particles is high, resulting in the emission of submolecular and/or subatomic particles, including any kind of radiation.
Thus, choosing to use the cumbersome Entropy-based Second Law to explain a system’s
energy degradation is an over-complication. Certainly, while a system’s capability to transfer
thermal energy, or do work, depends on the particular molecular configuration of the system
(individual molecular positions and velocities) and its surroundings, those phenomena can be
explained in simpler macroscopic terms, allowing for the dismissal of both Clausius’ and
Boltzmann’s notions of Entropy.
On a final note: There remain the associations of volume increases, randomness, and entropy.
Randomness is a highly subjective notion. Arguably, what is more random or more ordered lies in the eye of the beholder, as pointed out by Ben-Naim [20]. Even so, as a system expands, it
generally does appear more random. This has led to the second law being used to explain
inefficiencies involving expanding systems. It becomes of interest that the notion of lost work
allows one to understand the inherent inefficiencies associated with the work done by
expanding systems.
To the above inefficiency, one may add the inherent inefficiencies associated with using gases to perform work. This occurs because not all of a gas' internal energy can be converted into work [7]. Added to the inefficiencies associated with friction, one may ponder the need to employ the second law to explain why macroscopic perpetual motion remains a false narrative. Notice that molecular motion is a very nice example of perpetual motion at a microscopic scale, and it is precisely molecular motion that explains inefficiencies at the macroscopic scale.
Entropy remains a mathematical contrivance. Its ambiguity is hidden by the mistake of not
clarifying that work done by an expanding system is always external to that system. This was in
part hidden by the consideration of quasi-static processes. What about enthalpy?
The same principles apply. For enthalpy change one traditionally writes:

$\Delta H = \Delta U + \Delta(PV)$  (5.1)

which, for a closed system expanding against the atmosphere, becomes:

$\Delta H = \Delta U + P_{atm} \, \Delta V_{sys} = \Delta U + m_{atm} \, g \, \Delta h$  (5.2)
One has to be careful here. Eq. (5.2) concerns the enthalpy of closed expanding systems such as
those described by the enthalpy of vaporization.
Let us say that one is describing a non-expanding or collapsing/condensing system. What is the enthalpy of that process (a.k.a. the enthalpy of condensation)? Since no work is done onto the atmosphere ($W_{atm} = 0$), then Eq. (5.2) becomes:

$\Delta H = \Delta U$  (5.3)

For more clarity let us rewrite Eq. (5.2) and (5.3):

$\Delta H_{vap} = \Delta U_{vap} + P_{atm} \, \Delta V_{sys} = \Delta U_{vap} + m_{atm} \, g \, \Delta h$  (5.4)

$\Delta H_{cond} = \Delta U_{cond}$  (5.5)
One of the grandiose mistakes made in thermodynamics is the claim that enthalpy of
condensation is the same magnitude as enthalpy of vaporization (at least for closed expanding
systems).
Why the grandiose mistake? One can readily measure the enthalpy of vaporization using an
isobaric calorimeter. The same cannot be done for enthalpy of condensation. So rather than
realizing that the magnitude of enthalpy is not the same for condensation as it is for
vaporization, scientists simply incorrectly claimed it was.
During vaporization, a molecule originally present close to the interface abandons the liquid phase when its kinetic energy overcomes the potential attraction of its neighboring liquid molecules. The vaporizing molecule (red sphere with red arrow) then enters into the vapor phase, eventually colliding with the movable wall, which exerts an opposing pressure due to the presence of air molecules. Thus, part of the energy of the liquid is used to push the wall and the air molecules (lost work), making room for the vaporizing molecules.
On the other hand, during condensation, a molecule originally in the vapor phase loses energy and can no longer overcome the potential attraction of its neighbors, resulting in its incorporation into the liquid phase. However, the moving wall is not necessarily responsible for the condensation (unless the external pressure is deliberately increased, which is not the case for atmospheric conditions). Since the remaining vaporized molecules cannot exert the same pressure as before, the system is compressed. However, there is no work done by the atmosphere on the system during condensation.
Even in the absence of phase change, in realistic processes, heating is closer to rapid than infinitesimal. Hence, the system's temperature does increase, as represented by Eq. (2.18). Such expression can be rearranged to yield:

$W_{lost} = m_{atm} \, g \, \Delta h = n \, (C_p - C_v) \, \Delta T$  (5.6)

indicating that lost work is simply related to the difference between isometric and isobaric heat capacities, as has been previously described [7].
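Eq. (5.6) can be made concrete for an ideal gas, where $C_p - C_v = R$, so the lost work per mole per kelvin is simply $R$, and the lost fraction of the input heat is $R/C_p$ regardless of $n$ and $\Delta T$. A short sketch of ours, assuming a diatomic ideal gas such as air:

```python
# Illustrative sketch (assuming an ideal diatomic gas such as air):
# Eq. (5.6) gives the lost work as n*(Cp - Cv)*dT. For an ideal gas
# Cp - Cv = R, so the fraction of the input heat lost to the atmosphere
# is R/Cp, independent of n and dT.
R = 8.314            # J/(mol K), gas constant
Cv = 2.5 * R         # molar isometric heat capacity, diatomic ideal gas
Cp = Cv + R          # molar isobaric heat capacity
n, dT = 1.0, 1.0     # one mole heated by one kelvin (hypothetical)

W_lost = n * (Cp - Cv) * dT      # lost work: 8.314 J per mole per kelvin
Q_in = n * Cp * dT               # total heat in at constant pressure

print(W_lost / Q_in)             # ≈ 0.286: about 28.6% of the heat is "lost"
```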
Finally, let us consider that there is a change to the bonding potential energy, such as often occurs in a chemical reaction ($\Delta E_B$). As a generality one could consider that $\Delta U = \Delta E_B$. In which case Eq. (2.15) can be rewritten as follows:

$T \, \Delta S = \Delta E_B + W_{lost} = \Delta E_B + m_{atm} \, g \, \Delta h$  (6.1)

Writing Eq. (6.1) has the same limitations as writing Eq. (2.16). Why even make the association with entropy? Simplicity is once more attained by writing:

$Q_{in} = \Delta E_B + m_{atm} \, g \, \Delta h$  (6.2)
Again Eq. (6.2) only applies to closed expanding systems, such as experimental gaseous
systems.
If the only change to the system's energy is the change to the bonding energy due to phase change, then Eq. (5.4) and (5.5) are similar to Eq. (6.2), except that enthalpy replaces the heat in. In terms of enthalpy and bonding potential change, one could respectively write for vaporization and condensation:

$\Delta H_{vap} = \Delta E_B + m_{atm} \, g \, \Delta h$  (6.3)

$\Delta H_{cond} = \Delta E_B$  (6.4)

One should ask: is enthalpy really necessary? As discussed in [6,7], based upon Eq. (6.3), the latent heat of vaporization can be expressed as:

$L_{lg} = Q_{in} = \Delta E_{B(lg)} + W_{lost} = \Delta E_{B(lg)} + m_{atm} \, g \, \Delta h$  (6.5)

where $L$ signifies the latent heat, the subscript ($lg$) indicates that the change is from liquid to gas (i.e., vaporization), and $\Delta E_B$ is the change in bonding energy. Similarly, for condensation:

$L_{gl} = \Delta E_{B(gl)}$  (6.6)

where the subscript ($gl$) signifies condensation.
Let us now consider the more general case observed in real chemical processes, which is that of a multicomponent system where some of the components may experience phase change. Eq. (2.1) can be expressed as follows:

$\Delta U = \sum_{i=1}^{N_c} \Delta U_i$  (6.7)

where $N_c$ is the number of components in the system, and $\Delta U_i$ is the change in internal energy for the $i$-th component.

If the system's temperature is changing without a phase change we may write (from Eq. 2.18):

$\Delta U_i = n_i \, \tilde{C}_{v,i} \, \Delta T$  (6.8)

where $n_i$ is the number of moles from the $i$-th component, and $\tilde{C}_{v,i}$ its corresponding molar isometric or isochoric heat capacity. If the $i$-th component is also partially changing from a phase $\alpha$ to a phase $\beta$, then:

$\Delta U_i = n_i \left( x_{i\alpha} \, \tilde{C}_{v,i\alpha} + x_{i\beta} \, \tilde{C}_{v,i\beta} \right) \Delta T + n_i \, \Delta x_i \, \Delta\tilde{E}_{B,i}$  (6.9)

so that the overall energy balance becomes:

$Q_{in} + W_{in} = \sum_{i=1}^{N_c} \left( n_i \left( x_{i\alpha} \, \tilde{C}_{v,i\alpha} + x_{i\beta} \, \tilde{C}_{v,i\beta} \right) \Delta T + n_i \, \Delta x_i \, \Delta\tilde{E}_{B,i} \right)$  (6.11)

Now, considering a certain amount of energy added to the system either by heat transfer or external work, given by $\Delta E = Q_{in} + W_{in}$, how do you determine the system temperature change ($\Delta T$) and the molar phase fraction change ($\Delta x_i$) for each component? In this (relatively simple) case we have an underdetermined system with a single equation but $N_c + 1$ unknowns.
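The underdetermination is easy to demonstrate: even for a single component with one possible phase change (two unknowns, $\Delta T$ and $\Delta x$), the first-law balance alone admits infinitely many solutions. The numbers below are purely hypothetical:

```python
# Illustrative sketch (hypothetical single-component, two-phase system):
# the first law alone gives ONE equation linking the temperature change dT
# and the phase fraction change dx, so infinitely many (dT, dx) pairs are
# consistent with the same energy input.
n = 1.0          # mol
Cv = 75.0        # J/(mol K), molar isometric heat capacity (liquid-like)
dE_B = 40000.0   # J/mol, bonding (phase-change) energy (hypothetical)
E_in = 500.0     # J, total energy added to the system

# First-law constraint: E_in = n*Cv*dT + n*dx*dE_B
# Any dx in a feasible range yields a valid dT:
for dx in (0.0, 0.005, 0.0125):
    dT = (E_in - n * dx * dE_B) / (n * Cv)
    residual = n * Cv * dT + n * dx * dE_B - E_in
    print(dx, dT)                 # several distinct (dx, dT) solutions
    assert abs(residual) < 1e-9   # all satisfy the same single equation
```

The equation fixes only the combination of the two changes, not each one individually; determining how the added energy splits between heating and phase change requires additional (molecular-level) information.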
Furthermore, if Eq. (2.2) is used, Eq. (6.11) results in the more complicated expression:

$\sum_{i=1}^{N_c} \left( n_i \left( x_{i\alpha} \, \tilde{C}_{v,i\alpha} + x_{i\beta} \, \tilde{C}_{v,i\beta} \right) \Delta T + n_i \, \Delta x_i \, \Delta\tilde{E}_{B,i} \right) = T \sum_{i=1}^{N_c} \Delta S_i = T \sum_{i=1}^{N_c} n_i \, \Delta x_i \left( \tilde{S}_{i\beta} - \tilde{S}_{i\alpha} \right)$  (6.12)

where $\tilde{S}_{i\alpha}$ and $\tilde{S}_{i\beta}$ represent the molar entropy of component $i$ for phases $\alpha$ and $\beta$, respectively. For each individual component, this would imply:

$n_i \left( x_{i\alpha} \, \tilde{C}_{v,i\alpha} + x_{i\beta} \, \tilde{C}_{v,i\beta} \right) \Delta T + n_i \, \Delta x_i \, \Delta\tilde{E}_{B,i} = T \, n_i \, \Delta x_i \left( \tilde{S}_{i\beta} - \tilde{S}_{i\alpha} \right)$  (6.13)
Of course, Eq. (6.13) might satisfy the first law, but it does not mean that it is necessarily
correct. Furthermore, in the presence of work, Eq. (6.13) is no longer valid.
This multicomponent, multiphase example clearly shows that the so-called laws of
Thermodynamics are insufficient to solve more realistic situations. What we are missing in this
case are expressions relating the distribution of power added to the system into rates of
temperature change and/or phase change. The required expressions can be found by
considering the behavior of the system at the molecular level, and imply a dynamic behavior,
which is, unfortunately, missing in conventional “equilibrium thermodynamics”.
7. Conclusion
Many different concepts have emerged over the years in the field of Thermodynamics to
describe the macroscopic behavior of matter, including: Energy, Heat, Work, Enthalpy, Entropy,
and many others. While these concepts were introduced to facilitate the mathematical
representation of natural phenomena at the macroscopic scale, some of them have also
contributed to over-complicate the science, hindering the correct understanding of those
phenomena. Such is the case of Entropy and Enthalpy.
The first issue with these two concepts is that they have been mistakenly considered as true physical entities, while they are only abstract mathematical constructs. In the second place, they are commonly ill-defined, because both are considered state functions even though they actually involve external conditions (particularly the external pressure). Only at mechanical equilibrium can they be considered state functions, but this is a highly unrealistic assumption for most processes of interest.
A new Thermodynamics is possible which does not require the introduction of ambiguous
notions. This new Thermodynamics can describe macroscopic phenomena, while remaining
consistent at the microscopic and molecular scales. The new Thermodynamics corrects
grandiose mistakes unwittingly incurred in the past, which emerged probably due to a limited
understanding of Nature.
This report provides data, information and conclusions obtained by the author(s) as a result of original
scientific research, based on the best scientific knowledge available to the author(s). The main purpose
of this publication is the open sharing of scientific knowledge. Any mistake, omission, error or inaccuracy
published, if any, is completely unintentional.
This research did not receive any specific grant from funding agencies in the public, commercial, or not-
for-profit sectors.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC
4.0). Anyone is free to share (copy and redistribute the material in any medium or format) or adapt
(remix, transform, and build upon the material) this work under the following terms:
Attribution: Appropriate credit must be given, providing a link to the license, and indicating if
changes were made. This can be done in any reasonable manner, but not in any way that
suggests endorsement by the licensor.
NonCommercial: This material may not be used for commercial purposes.
References
[1] Thomson, W. (1851). On the dynamical theory of heat. On the quantities of mechanical energy
contained in different states, as to temperature and density. Transactions of the RS of Edinburgh
(1853), 20, 475-482. doi: 10.1080/14786445508641913.
[2] Hernandez, H. (2017). A Mathematical Reflection on the Origin of the Laws of Conservation of
Energy and Momentum. ForsChem Research Reports, 2, 2017-1, 1-17. doi:
10.13140/RG.2.2.28312.60167.
[3] Smith, C., & Wise, M. N. (1989). Energy and empire: A biographical study of Lord Kelvin. Cambridge
University Press, Cambridge. p. 336. ISBN: 9780521261739.
[4] Fermi, E. (1936). Thermodynamics. Dover Publications, Inc., New York. Chapter II. The First Law of
Thermodynamics. pp. 11-28. https://www.worldcat.org/en/title/2379038.
[5] Clausius, R. (1867). On Several Convenient Forms of the Fundamental Equations of the Mechanical
Theory of Heat [1865]. In: Clausius, R. The Mechanical Theory of Heat with its Applications to the
Steam-Engine and to the Physical Properties of Bodies. John van Voorst, London. pp. 326-365.
https://www.worldcat.org/es/title/mechanical-theory-of-heat-with-its-applications-to-the-steam-
engine-and-to-the-physical-properties-of-bodies/oclc/1547121.
[6] Mayhew, K.W. (2022). New Thermodynamics: The Second Law Buried by Illusions. Hadronic Journal
45 (1), 97-116. http://hadronicpress.com/docs/HJ-45-1D.pdf.
[7] Mayhew, K.W. (2021). New Thermodynamics: Inelastic Collisions, Lost Work, and Gaseous
Inefficiency. Hadronic Journal, 44 (1), 67-96. http://hadronicpress.com/docs/HJ-44-1E.pdf.
[8] Hernandez, H. (2020). Thermo-mechanical Dynamics of a Piston Engine: Carnot Cycle, Entropy and
the Second Law of Thermodynamics. ForsChem Research Reports, 5, 2020-11, 1-50. doi:
10.13140/RG.2.2.35995.28960.
[9] Mayhew, K.W. (2020). New Thermodynamics: Reversibility and Free Energy. Hadronic Journal, 43
(1), 51-59. http://hadronicpress.com/docs/HJ-43-1B.pdf.
[10] Mayhew, K.W. (2015). Second law and lost work. Physics Essays, 28 (1), 152-155. doi: 10.4006/0836-
1398-28.1.152.
[11] Mayhew, K.W. (2020). New Thermodynamics: Inefficiency of a Piston-Cylinder. European Journal of
Engineering Research and Science, 5 (2), 187-191. doi: 10.24018/ejeng.2020.5.2.1765.
[12] Mayhew, K. W. (2015). Entropy: An ill-conceived mathematical contrivance? Physics Essays, 28 (3),
352-357. doi: 10.4006/0836-1398-28.3.352.
[13] Aguirre, J. and Hernandez, H. (2020). Entropy: A Physical Entity or a Mathematical Construct?
ForsChem Research Reports, 5, 2020-10, 1-30. doi: 10.13140/RG.2.2.34938.93124.
[14] Hernandez, H. (2019). Collision Energy between Maxwell-Boltzmann Molecules: An Alternative
Derivation of Arrhenius Equation. ForsChem Research Reports, 4, 2019-13, 1-27. doi:
10.13140/RG.2.2.21596.33926.
[15] Boltzmann, L. (& Brush, S. G., translator) (1964). Lectures on Gas Theory. Dover Publications, Inc.,
New York. Section §5. Proof that Maxwell’s velocity distribution is the only possible one. pp. 25-29.
https://archive.org/details/lectures-on-gas-theory-ludwig-boltzmann.
[16] Hernandez, H. (2022). Clausius’ vs. Boltzmann’s Entropy. ForsChem Research Reports, 7, 2022-20, 1 -
11. doi: 10.13140/RG.2.2.33844.73608.
[17] Hernandez, H. (2020). Expected Momentum and Energy Changes during Elastic Molecular Collisions.
ForsChem Research Reports, 5, 2020-13, 1-15. doi: 10.13140/RG.2.2.26182.09288.