Sofar. Entropy.

Quoted text: "If mutations are accumulated within a genome, they would either be neutral or beneficial. Negative mutations would be lethal or they would be weeded out by selection. There is no genetic entropy."

A measure of the disorder of a system.

A measure of the amount of energy in a system that is unavailable for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity.

The differential of heat, dQ, is not an exact differential and therefore cannot be integrated in a path-independent way. We therefore introduce an integrating factor (1/T) such that dQ/T can be integrated. This dQ/T is called entropy: dS = dQ/T.
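The path dependence of the integral of dQ, versus the path independence of the integral of dQ/T, can be checked numerically for a monatomic ideal gas taken between the same two states along two different paths (a minimal sketch; the gas constant, heat capacity, and state values are standard textbook choices, not from these notes):

```python
import math

R = 8.314               # gas constant, J/(mol K)
Cv = 1.5 * R            # molar heat capacity of a monatomic ideal gas
T1, T2 = 300.0, 600.0   # initial and final temperatures, K
V1, V2 = 1.0, 2.0       # initial and final volumes (arbitrary units)

# Path A: heat at constant volume V1 (T1 -> T2), then expand isothermally at T2.
Q_A = Cv * (T2 - T1) + R * T2 * math.log(V2 / V1)
S_A = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)   # integral of dQ/T

# Path B: expand isothermally at T1, then heat at constant volume V2 (T1 -> T2).
Q_B = R * T1 * math.log(V2 / V1) + Cv * (T2 - T1)
S_B = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)   # integral of dQ/T

print(Q_A, Q_B)   # different: the heat dQ alone is path-dependent
print(S_A, S_B)   # equal: dQ/T integrates to the same entropy change
```

Both paths connect the same pair of states, yet the total heat differs while the integral of dQ/T does not, which is exactly why 1/T works as an integrating factor.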

So even though log W = 0, the amount of energy expended could differ.

Establish the link between statistical entropy and physical entropy.

Without additional information about the die, the most unbiased distribution is such that all outcomes are equally probable.

P(X = 1) = P(X = 2) = ... = P(X = 6) = 1/6

Shannon's Measure of Uncertainty

Shannon [1948] suggested the following measure of uncertainty, which is commonly known as the statistical entropy:

H = -(p1 ln p1 + p2 ln p2 + ... + pn ln pn)

1. H is a non-negative function of p1, p2, ..., pn.

2. H = 0 if one outcome has probability 1.

3. H is maximum when the outcomes are equally likely.

In the case of the die, you will find the maximum entropy to be

H = ln6
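This can be checked in a few lines (a sketch; the helper name shannon_entropy and the biased distribution are our own illustration, not from the notes):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * ln p), skipping zero-probability outcomes."""
    return -sum(p * math.log(p) for p in probs if p > 0)

fair_die = [1 / 6] * 6
biased_die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]

print(shannon_entropy(fair_die))    # equals ln 6, about 1.7918
print(math.log(6))
print(shannon_entropy(biased_die))  # strictly less than ln 6
```

Any departure from the uniform distribution lowers H, consistent with property 3 above.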

Stirling approximation: ln(N!) ≈ N ln N − N, for very large N
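The quality of the approximation improves as N grows, which can be verified against the exact ln(N!) computed with math.lgamma (a quick numerical sketch, not part of the original notes):

```python
import math

def stirling(n):
    """Stirling approximation: ln(n!) ~ n ln n - n."""
    return n * math.log(n) - n

for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)          # exact ln(n!)
    rel_err = abs(exact - stirling(n)) / exact
    print(n, rel_err)                   # relative error shrinks as n grows
```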

Conclusion:

ln Ω = N·H

ln Ω is linearly proportional to H. Therefore, maximizing the total number of possible outcomes is equivalent to maximizing Shannon's statistical entropy.

Statistical entropy: H = const × ln Ω, where Ω is the number of possible outcomes.
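The conclusion ln Ω = N·H can be illustrated numerically: for a sequence of N symbols with fixed frequencies, the log of the multinomial count of distinct arrangements, divided by N, approaches H (a sketch; the counts and helper names are our own illustrative choices):

```python
import math

def ln_multinomial(counts):
    """ln of N! / (n1! n2! ... nk!): the number of distinct arrangements."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

N = 100_000
counts = [30_000, 70_000]              # a 30% / 70% two-symbol sequence
H = shannon_entropy([0.3, 0.7])
print(ln_multinomial(counts) / N, H)   # nearly equal: ln(Omega) ~ N * H
```

The small residual difference is the sub-leading Stirling correction, which vanishes relative to N·H as N grows.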

Entropy in Statistical Physics

Definition of physical entropy:

S = const × ln Ω, where Ω = # of possible microstates of a closed system.

A microstate is the detailed state of a physical system.

Example: In an ideal gas, a microstate consists of the position and velocity of every molecule in the system. So the number of microstates is just what Feynman said: the number of different ways the inside of the system can be changed without changing the outside.

Principle of maximum entropy (The second law of thermodynamics)

If a closed system is not in a state of statistical equilibrium, its macroscopic state will vary in time, until ultimately the system reaches a state of maximum entropy.

Example:

S = const × ln(# of velocity states × # of position states)

The # of velocity states does not change (the total energy is fixed).

The # of position states does change: doubling the volume doubles the number of position states available to each of the N molecules.

ΔS = S2 − S1 = const × [ln (2V)^N − ln V^N] = const × N ln 2
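For one mole of gas, the entropy increase on doubling the volume works out to R ln 2 (a numerical sketch using standard constant values; the choice of one mole is ours):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro's number

# Doubling the volume doubles the position states per particle, so
# Omega2 / Omega1 = 2**N and  Delta S = k_B * N * ln 2.
N = N_A                 # one mole of gas
delta_S = N * k_B * math.log(2)
print(delta_S)          # about 5.76 J/K, i.e. R * ln 2
```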

Moreover, at equilibrium, all microstates are equally probable.

Temperature:

Temperature T is defined as

1/T = dS/dE. The temperatures of bodies in equilibrium with one another are equal.

Since T is measured at a fixed number of particles N and volume V, a more stringent definition is

T = (dE/dS)_{N,V}.
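The definition 1/T = dS/dE can be checked by finite differences for a monatomic ideal gas, whose entropy depends on energy as (3/2) N kB ln E plus E-independent terms (a sketch; the particle number and target temperature are arbitrary illustrative values):

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K

def S(E, N):
    """Energy-dependent part of the monatomic ideal-gas entropy."""
    return 1.5 * N * k_B * math.log(E)

N = 1e22
T = 300.0                        # target temperature, K
E = 1.5 * N * k_B * T            # equipartition: E = (3/2) N k_B T

dE = E * 1e-6                    # small energy step
inv_T = (S(E + dE, N) - S(E - dE, N)) / (2 * dE)   # dS/dE at fixed N, V
print(1.0 / inv_T)               # recovers ~300 K
```

Differentiating S with respect to E and inverting indeed returns the temperature used to set the energy, as the definition requires.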

Thus far, S is defined to be

S = const × ln Ω.

If S is a dimensionless quantity, T has the dimensions of energy (e.g., in joules (J)).

But the joule is too large a unit for this purpose. Example:

Room temperature = 404.34 × 10^-23 J !
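That figure is just kB × T for T ≈ 293 K, which a one-liner confirms (using the rounded kB = 1.38 × 10^-23 J/K from these notes):

```python
k_B = 1.38e-23          # Boltzmann constant, J/K (rounded, as in the notes)
T_room = 293.0          # K, about 20 degrees C
print(k_B * T_room)     # 4.0434e-21 J, i.e. 404.34 x 10^-23 J
```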

What is the physical unit of T?

It is more convenient to measure T in kelvin (K). The conversion factor between energy and degrees is Boltzmann's constant, kB = 1.38 × 10^-23 J/K. Hence we redefine S and T by incorporating the conversion factor:

S = kB ln Ω, and T (in K) = T (in J) / kB.

Using the Boltzmann factor:

Same change in entropy, but more energy is given away by the system initially at higher T. Hence temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings.
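Since dE = T dS, equal entropy changes correspond to larger energy transfers at higher temperature, which is the sense in which hotter bodies give up energy more readily (a trivial numeric sketch with arbitrary values):

```python
dS = 0.01                 # an assumed small entropy change, J/K
for T in (300.0, 600.0):  # a cooler and a hotter body
    print(T, T * dS)      # dE = T * dS: the hotter body exchanges more energy
```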

cont.

Last updated Jun 24, 2016