Isentropic Flow
Isentropic flow (Wikipedia)
In thermodynamics, an isentropic process or isoentropic process (ισον = "equal" (Greek); εντροπία entropy = "disorder" (Greek)) is one in which, for purposes of engineering analysis and calculation, one may assume that the process takes place from initiation to completion without an increase or decrease in the entropy of the system, i.e., the entropy of the system remains constant.[1][2] It can be proven that any reversible adiabatic process is an isentropic process. A simpler, more common definition of isentropic would be "no change in entropy".
An isentropic flow is a flow that is both adiabatic and reversible. That is, no heat is
added to the flow, and no energy transformations occur due to friction or dissipative
effects. For an isentropic flow of a perfect gas, several relations can be derived to define
the pressure, density and temperature along a streamline.
Note that energy can be exchanged with the flow in an isentropic transformation, as long
as it doesn't happen as heat exchange. An example of such an exchange would be an
isentropic expansion or compression that entails work done on or by the flow.
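A minimal sketch of the relations referred to above, assuming a calorically perfect gas with constant ratio of specific heats \gamma (a symbol not introduced in the excerpts themselves), between any two states 1 and 2 on the same streamline:

\frac{p_2}{p_1} = \left(\frac{\rho_2}{\rho_1}\right)^{\gamma} = \left(\frac{T_2}{T_1}\right)^{\gamma/(\gamma-1)}

Equivalently, p/\rho^{\gamma} is constant along the streamline; the temperature and density forms follow from the perfect-gas law p = \rho R T.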
The Free Dictionary by Farlex (encyclopedia2.thefreedictionary.com)
Compressible flow in which entropy remains constant throughout the flowfield. A slight distinction is sometimes made, especially in Europe, as follows. If the entropy of a fluid element moving along a streamline in a flow remains constant, the flow is isentropic along a streamline. However, the value of the entropy may be different along different streamlines, thus allowing entropy changes normal to the streamlines. An example is the flowfield behind a curved shock wave; here, streamlines that pass through different locations along the curved shock wave experience different increases in entropy. Hence, downstream from this shock, the entropy can be constant along a given streamline but differs from one streamline to another. This type of flow, with entropy constant along streamlines, is sometimes defined as isentropic. Flow with entropy constant everywhere is then called homentropic. See Compressible flow, Entropy, Isentropic process.
Because of the second law of thermodynamics, an isentropic flow does not strictly exist. From the definition of entropy, an isentropic flow is both adiabatic and reversible. However, all real flows experience to some extent the irreversible phenomena of friction, thermal conduction, and diffusion. Any nonequilibrium, chemically reacting flow is also irreversible.
However, there are a large number of gas dynamic problems with entropy increase negligibly slight, which for the purpose of analysis are assumed to be isentropic. Examples are flow through subsonic and supersonic nozzles, as in wind tunnels and rocket engines; and shock-free flow over a wing, fuselage, or other aerodynamic shape. For these flows, except for the thin boundary-layer region adjacent to the surface where friction and thermal conduction effects can be strong, the outer inviscid flow can be considered isentropic. If shock waves exist in the flow, the entropy increase across these shocks destroys the assumption of isentropic flow, although the flow along streamlines between shocks may be isentropic. See Adiabatic process, Boundary-layer flow, Shock wave, Thermodynamic principles, Thermodynamic processes.
1st law of thermodynamics (Wikipedia)
The first law of thermodynamics is a version of the law of conservation of energy,
specialized for thermodynamical systems. It is usually formulated by stating that the
change in the internal energy of a closed system is equal to the amount of heat supplied to
the system, minus the amount of work done by the system on its surroundings. The law of
conservation of energy can be stated: The energy of an isolated system is constant.
The first explicit statement of the first law of thermodynamics, by Rudolf Clausius in
1850, referred to cyclic thermodynamic processes.
"In
all
cases
in
which
work
is
produced
by
the
agency
of
heat,
a
quantity
of
heat
is
consumed
which
is
proportional
to
the
work
done;
and
conversely,
by
the
expenditure
of
an
equal
quantity
of
work
an
equal
quantity
of
heat
is
produced."[1]
Clausius stated the law also in another form, this time referring to the existence of a
function of state of the system called the internal energy, and expressing himself in terms
of a differential equation for the increments of a thermodynamic process. This equation
may be translated into words as follows:
In a thermodynamic process of a closed system, the increment in the internal energy is equal to the difference between the increment of heat accumulated by the system and the increment of work done by it.[2]
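Written as an equation (a standard rendering of this statement; the excerpt gives no symbols, so here U is the internal energy, \delta Q the increment of heat supplied to the system, and \delta W the increment of work done by the system):

dU = \delta Q - \delta W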
2nd law of thermodynamics (Wikipedia)
The second law of thermodynamics states that the entropy of an isolated system never
decreases, because isolated systems spontaneously evolve towards thermodynamic
equilibrium—the state of maximum entropy. Equivalently, perpetual motion machines of
the second kind are impossible.
The second law is a postulate of thermodynamics, but it can be understood and proven
using the underlying quantum statistical mechanics. It is an expression of the fact that
over time, differences in temperature, pressure, and chemical potential decrease in an
isolated physical system, leading eventually to a state of thermodynamic equilibrium. In
the language of statistical mechanics, entropy is a measure of the number of microscopic
configurations corresponding to a macroscopic state. Because equilibrium corresponds to
a vastly greater number of microscopic configurations than any non-equilibrium state, it
has the maximum entropy, and the second law follows because random chance alone
almost guarantees that the system will evolve towards equilibrium.
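Stated compactly (a standard formulation, not quoted in the excerpt above): for any process in an isolated system the entropy change satisfies

\Delta S \ge 0,

with equality only in the idealized limit of a reversible process.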
Entropy (Wikipedia)
Entropy was originally defined by Rudolf Clausius in 1865 as an extensive thermodynamic function of state that is the measure of a system's thermal energy per unit temperature that is unavailable for doing mechanical work. Unlike most other functions of state, entropy cannot be directly measured but must be calculated. The need for an entropy function emerges from the fundamental thermodynamic relation. Entropy has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J/K) in the International System of Units.
The term entropy was coined by Clausius based on the Greek εντροπία [entropía], a turning toward, from εν- [en-] (in) and τροπή [tropē] (turn, conversion).[2][note 2]
The infinitesimal change in the entropy (dS) of a system is the infinitesimal transfer of heat energy (δQ) to a closed system driving a reversible process, divided by the equilibrium temperature (T) of the system.
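As an equation, the sentence above reads (the subscript \mathrm{rev} is added here only to flag that the heat transfer drives a reversible process):

dS = \frac{\delta Q_{\mathrm{rev}}}{T}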
Enthalpy (Wikipedia)
Enthalpy is a measure of the total energy of a thermodynamic system. It includes the
internal energy, which is the energy required to create a system, and the amount of
energy required to make room for it by displacing its environment and establishing its
volume and pressure.
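In symbols, this description corresponds to the usual definition (with U the internal energy, p the pressure, and V the volume of the system):

H = U + pV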
The enthalpy is the preferred expression of system energy changes in many chemical,
biological, and physical measurements, because it simplifies certain descriptions of
energy transfer. This is because a change in enthalpy takes account of energy transferred
to the environment through the expansion of the system under study.
The total enthalpy, H, of a system cannot be measured directly. Thus, change in enthalpy,
ΔH, is a more useful quantity than its absolute value. The change ΔH is positive in
endothermic reactions, and negative in heat-releasing exothermic processes. ΔH of a
system is equal to the sum of non-mechanical work done on it and the heat supplied to it.
For processes under constant pressure, ΔH is equal to the change in the internal energy of
the system, plus the work that the system has done on its surroundings.[1] This means that
the change in enthalpy under such conditions is the heat absorbed (or released) by a
chemical reaction. Enthalpies for chemical substances at constant pressure assume
standard state: most commonly 1 bar pressure. Standard state does not, strictly speaking,
specify a temperature (see standard state), but expressions for enthalpy generally
reference the standard heat of formation at 25 °C.
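For the constant-pressure case described above, and assuming the only work done on the surroundings is the expansion work p\,\Delta V (an assumption the excerpt makes implicitly), this reads

\Delta H = \Delta U + p\,\Delta V = Q_p,

where Q_p is the heat absorbed by the system at constant pressure.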
The word enthalpy is based on the Greek word enthalpos (ἔνθαλπος), which means to put
heat into. It comes from the Classical Greek prefix ἐν-, en-, meaning to put into, and the
verb θάλπειν, thalpein, meaning "to heat". The word enthalpy is often incorrectly
attributed to Benoit Paul Émile Clapeyron and Rudolf Clausius through the
1850 publication of their Clausius-Clapeyron relation. This misconception was
popularized by the 1927 publication of The Mollier Steam Tables and Diagrams.
However, neither the concept, the word, nor the symbol for enthalpy existed until well
after Clapeyron's death.
The earliest writings to contain the concept of enthalpy did not appear until 1875,[2] when
Josiah Willard Gibbs introduced "a heat function for constant pressure". However, Gibbs
did not use the word "enthalpy" in his writings.[note 1] Instead, the word "enthalpy" first
appears in the scientific literature in a 1909 publication by J. P. Dalton. According to that
publication, Heike Kamerlingh Onnes (1853-1926) actually coined the word.[3]
Over the years, many different symbols were used to denote enthalpy. It was not until
1922 that Alfred W. Porter proposed the symbol "H" as the accepted standard,[4] thus
finalizing the terminology still in use today.
Streamline
Fluid flow is characterized by a velocity vector field in three-dimensional space, within
the framework of continuum mechanics. Streamlines, streaklines and pathlines are
field lines resulting from this vector field description of the flow. They differ only when
the flow changes with time: that is, when the flow is not steady.[1] [2]
• Streamlines are a family of curves that are instantaneously tangent to the velocity vector of the flow. These show the direction a fluid element will travel in at any point in time (the defining relation is sketched below).
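A compact statement of this tangency condition (a standard form, not quoted from the source; \vec{x}_s(s) parameterizes a streamline at a fixed instant and \vec{u} = (u, v, w) is the velocity field):

\frac{d\vec{x}_s}{ds} \times \vec{u}\left(\vec{x}_s\right) = 0, \qquad \text{or equivalently} \qquad \frac{dx}{u} = \frac{dy}{v} = \frac{dz}{w}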
By definition, different streamlines at the same instant in a flow do not intersect, because
a fluid particle cannot have two different velocities at the same point. Similarly,
streaklines cannot intersect themselves or other streaklines, because two particles cannot
be present at the same location at the same instant of time; unless the origin point of one
of the streaklines also belongs to the streakline of the other origin point. However,
pathlines are allowed to intersect themselves or other pathlines (except the starting and
end points of the different pathlines, which need to be distinct).