Should evolution be taught in high school?

Feb 24, 2008 | Full story: www.scientificblogging.com | 176,162 comments

Microbiologist Carl Woese is well known as an iconoclast. At 79 years of age, Woese is still shaking things up. Most recently, he stated in an interview with Wired that...

"My feeling is that evolution shouldn't be taught at the lower grades. You don't teach quantum mechanics in the grade schools. One has to be quite educated to work with these concepts; what they pass on as evolution in high schools is nothing but repetitious tripe that teachers don't understand." Full Story

“Don't get me started”

Level 1

Since: Jul 09

Minneapolis

#155850 Oct 2, 2013
Chimney1 wrote:
<quoted text>
I thought you were starting to tell a joke.
Yes, I was looking for the punch line too. I hope someone eventually comes up with a good one-liner for it.

“I am Sisyphus”

Since: Nov 07

Location hidden

#155851 Oct 2, 2013
Aura Mytha wrote:
<quoted text>
Interesting. Nice dinner conversation topic. But what relationship to the chimera topic?

“Think&Care”

Since: Oct 07

Location hidden

#155852 Oct 2, 2013
Aura Mytha wrote:
<quoted text>
You just got to love the 'Belly Button Biodiversity Project'!

“I am Sisyphus”

Since: Nov 07

Location hidden

#155853 Oct 2, 2013
polymath257 wrote:
<quoted text>
It has inspired me to go back and look at the texts I have.
One *huge* issue that we have barely touched is that there is no good general theory of non-equilibrium phenomena. The BBGKY hierarchy goes a certain distance, but it simply doesn't allow detailed higher order descriptions of highly non-equilibrium situations. And even that is only a classical formulation. Quantum versions of BBGKY are very much matters of current research.

To put this in layman's terms, to test my understanding: essentially, even if you look at a mountain from a number of different perspectives, you still don't get a comprehensive view (understanding) of the mountain. And since most of nature consists of mountains (in this metaphor; literally, non-equilibrium systems), we are left with a limited understanding of how these systems work in real life.

So, applying a single formula, as UC is attempting to do, is really trying to fit a hypercube "peg" into an n-sphere "hole".

4th-dimensionally speaking, of course.

Level 6

Since: Aug 07

North Miami Beach, FL

#155854 Oct 2, 2013
Chimney1 wrote:
<quoted text>
No. Firstly, if we are changing the definition of entropy then we cannot use the laws of thermodynamics to prove anything. Entropy in thermodynamics is measured in units of J/K, even in Boltzmann's version. Creager is using Boltzmann. J/K.
Therefore applied energy can never reduce entropy, no matter how "ordered" it is. Only the emission of energy can. End of story. Creager is talking rubbish. I read his paper and watched his YouTube video.
He makes the same daft mistake over and over. You tidy a room and the entropy decreases, according to Creager. Really? What energy is applied? The glucose burned by your muscles as you move things around. What happens when you burn glucose? You warm up. Where does it go? Into the air of the room. And the heat added to the room - thermodynamically - is an addition of entropy. He ignores that. Pretty silly. In total, when energy is transformed, if ALL of it, perfectly, is transformed into non-heat energy such as kinetic or gravitational potential or chemical bonds, the total entropy increase would be zero. Not negative, ever. Zero is the best case scenario for applied energy. And because we know that tidying a room generates some heat, we know the energy transformation is not perfect but WILL result in some increase in total entropy. End of story.
Tidying the room increased total entropy. We still do it because we value the localised increase in order created, even though the total disorder increases. Who cares about warming up the air a degree or two? We don't. But thermodynamics DOES.
No Chimney. Your math is wrong. And you are using the wrong application of entropy. The formula does not consider any specific quantity of heat in its calculation; Boltzmann's entropy is a function of the constant K (Do you know what a constant is? It's a fixed value used in a formula.), the log, and the number of equivalent microstates. We are calculating the change in entropy, not K or its dimensions. Boltzmann's entropy formula DOES NOT calculate J/K anything! It calculates S, entropy! This is Boltzmann's entropy. Again,

"This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of its constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic."

The number, Boltzmann's constant, is used in Boltzmann's formula, but it doesn't have any dimensional use - it is just a constant, just a number to be plugged in.

I think you know this. This is just your attempt to create confusion and doubt so nobody will believe that entropy affects everything in nature, including evolution!
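
For scale, here is a minimal numeric sketch in Python of the room-tidying bookkeeping quoted above; every number in it is an illustrative assumption, not a figure from the thread:

import math

# Illustrative assumptions: ~100 kJ of food energy is burned tidying the room,
# essentially all of it ending up as heat in room air at ~295 K.
T_room = 295.0                 # room temperature, K
Q_heat = 100e3                 # heat released into the room, J

dS_heat = Q_heat / T_room      # Clausius entropy added to the room's air
print(dS_heat, "J/K")          # ~339 J/K gained by the room

# Compare with the configurational entropy "removed" by putting, say, 100
# loose objects into one specific arrangement out of 100! possibilities:
k_B = 1.380649e-23             # Boltzmann's constant, J/K
dS_order = k_B * math.log(math.factorial(100))
print(dS_order, "J/K")         # ~5e-21 J/K: negligible next to the heat term

Under these assumed numbers, the heat term dominates the "ordering" term by more than twenty orders of magnitude, which is the sense in which the quoted argument says tidying raises total entropy.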

“I am Sisyphus”

Since: Nov 07

Location hidden

#155855 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
No Chimney. Your math is wrong. And you are using the wrong application of entropy. The formula does not consider any specific quantity of heat in its calculation; Boltzmann's entropy is a function of the constant K (Do you know what a constant is? It's a fixed value used in a formula.), the log, and the number of equivalent microstates. We are calculating the change in entropy, not K or its dimensions. Boltzmann's entropy formula DOES NOT calculate J/K anything! It calculates S, entropy! This is Boltzmann's entropy. Again,
"This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of its constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic."
The number, Boltzmann's constant, is used in Boltzmann's formula, but it doesn't have any dimensional use - it is just a constant, just a number to be plugged in.
I think you know this. This is just your attempt to create confusion and doubt so nobody will believe that entropy affects everything in nature, including evolution!

http://books.google.com/books...

"The general struggle for existence of animate beings is not a struggle for raw materials these, for organisms, are air, water and soil, all abundantly available nor for energy which exists in plenty in any body in the form of heat, but a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth." Ludwig Boltzmann

Gibbs Free energy rules.

“Think&Care”

Since: Oct 07

Location hidden

#155856 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
No Chimney. Your math is wrong. And you are using the wrong application of entropy.
No, he is actually correct here. You are the one that is applying the equations incorrectly.
The formula does not consider any specific quantity of heat in its calculation; Boltzmann's entropy is a function of the constant K (Do you know what a constant is? It's a fixed value used in a formula.), the log, and the number of equivalent microstates. We are calculating the change in entropy, not K or its dimensions. Boltzmann's entropy formula DOES NOT calculate J/K anything! It calculates S, entropy! This is Boltzmann's entropy.
*sigh* The Boltzmann constant has units J/K, so the entropy also has such units.

Next, if you want to use the micro-canonical ensemble, which is appropriate for the entropy representation, you need your main variables to be the total energy U, the volume V, and the number of particles N. Also, these need to stay constant during your calculations.

In this representation, the temperature T is given by 1/T = dS/dU. In other words, an increase of internal energy by dU produces an increase of entropy of dU/T.
Again,
"This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of its constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic."
The number, Boltzmann's constant, is used in Boltzmann's formula, but it doesn't have any dimensional use - it is just a constant, just a number to be plugged in.
And it is a dimensional constant with dimensions J/K. So the entropy will have units J/K. This is good because it gives the link between the statistical mechanics variable W (or M, or Omega) and quantities like temperature, pressure, and energy that we can actually measure.
I think you know this. This is just your attempt to create confusion and doubt so nobody will believe that entropy affects everything in nature, including evolution!
The confusion isn't Chimney's. It is yours.

So, now, how exactly do you calculate the number of micro-states to get your conclusions?
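
A rough sketch of one way to do that counting, using an Einstein-solid toy model (the model choice and all parameter values here are assumptions made for illustration, not anything Creager or the posters used). It counts microstates W, takes S = k*ln(W), and recovers the microcanonical temperature from 1/T = dS/dU by a finite difference:

import math

k_B = 1.380649e-23    # Boltzmann's constant, J/K
eps = 1.0e-21         # assumed size of one energy quantum, J (illustrative)
N   = 50              # assumed number of oscillators (illustrative)

def microstates(N, q):
    """Ways of distributing q indistinguishable quanta among N oscillators."""
    return math.comb(q + N - 1, q)

def entropy(N, q):
    """Boltzmann entropy S = k_B * ln(W), in J/K."""
    return k_B * math.log(microstates(N, q))

for q in (50, 100, 200):
    dS = entropy(N, q + 1) - entropy(N, q)   # entropy gained by adding one quantum (dU = eps)
    T  = eps / dS                            # 1/T = dS/dU  =>  T = dU/dS
    print(f"q={q:4d}  W={microstates(N, q):.3e}  S={entropy(N, q):.3e} J/K  T={T:.1f} K")

Adding energy (raising q) always increases S in this toy system, consistent with dS = dU/T being positive whenever dU is.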

“Think&Care”

Since: Oct 07

Location hidden

#155857 Oct 2, 2013
Dogen wrote:
<quoted text>
To put this in layman's terms, to test my understanding: essentially, even if you look at a mountain from a number of different perspectives, you still don't get a comprehensive view (understanding) of the mountain. And since most of nature consists of mountains (in this metaphor; literally, non-equilibrium systems), we are left with a limited understanding of how these systems work in real life.
This analogy works to some extent, yes.
So, applying a single formula, as UC is attempting to do, is really trying to fit a hypercube "peg" into an n-sphere "hole".
4th-dimensionally speaking, of course.
Well, it is even worse, because UC doesn't even apply that formula correctly. He starts out fine, but then goes rapidly off the deep end.

Level 6

Since: Aug 07

North Miami Beach, FL

#155858 Oct 2, 2013
polymath257 wrote:
<quoted text>
Where exactly did you compute the number of available micro-states for either situation?
Also, did you actually read what Chimney said? He said that the *total* change in entropy (house+environment) was more in the case of the construction crew than for the bomb. You neglected to consider the entropy change in the environment.
It's not an actual calculation of each atom in a house. If something is arranged in a very specific order, it is going to have fewer equivalent microstates than something that is just arranged at random. I suppose you need a much simpler example to put actual numbers on it. But the author, Creager, gives several examples with heating and cooling and the simplest case, and also the construction materials. It works with just about anything you can think of. And you guys still have not shown where any of the math is wrong, and you give no examples where it fails. You know what this means, right? The initial energy applied to genomes is causing the entropy of the genome to increase, confirming Sanford. But we already knew that.

Level 6

Since: Mar 12

Dubai, UAE

#155859 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
No Chimney. Your math is wrong. And you are using the wrong application of entropy. The formula does not consider any specific quantity of heat in its calculation; Boltzmann's entropy is a function of the constant K (Do you know what a constant is? It's a fixed value used in a formula.), the log, and the number of equivalent microstates. We are calculating the change in entropy, not K or its dimensions. Boltzmann's entropy formula DOES NOT calculate J/K anything! It calculates S, entropy! This is Boltzmann's entropy. Again,
"This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of its constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic."
The number, Boltzmann's constant, is used in Boltzmann's formula, but it doesn't have any dimensional use - it is just a constant, just a number to be plugged in.
I think you know this. This is just your attempt to create confusion and doubt so nobody will believe that entropy affects everything in nature, including evolution!
Heheh. K is not "a constant". It's a variable, temperature, HELD constant in isothermal and equilibrium examples. And in the case of Boltzmann's formula, where the system is at EQUILIBRIUM, then of course temperature is not changing. But it's a variable sitting at a particular level, e.g. 298 K, NOT a physical constant, which is a different beast entirely.

Boltzmann's constant, on the other hand, IS an actual physical constant, always, expressed in the UNITS J/K, and it was added to provide the link between the number of microstates (a pure number) and ENTROPY, which is a quantity based on J/K. That is WHY Boltzmann added it! It's a conversion factor turning "number of microstates" into "quantity of entropy", expressed as J/K.

S (entropy) IS J/K, and it's explained statistically by the fact that a high entropy state is a probabilistically more common state than a low entropy one. Why? Because there are far more possible microstates of the system that make no difference to the macrostate in high entropy than in low entropy.

But after all that, it's STILL true that adding energy to a system - no matter how "ordered" the energy is - cannot reduce the system's entropy.

Go ahead, hold K (temperature) constant.

That can ONLY happen:

1. if the energy transformation is perfect (no gain or loss of entropy), OR

2. if the heat energy created is being carried off as fast as it's generated (heat is created but siphoned off out of the system).

Now we get to the crunch. You can hold entropy constant by siphoning off the waste heat out of the system. You can also LOWER entropy by doing the same thing. And ONLY by doing that. To reduce entropy, you have to suck some energy out of the system.

Creager does not even consider the energy leaving the system. He talks about the "entropy of the energy applied" which is nonsense, both in the mangling of units and in the fact that a net gain in energy cannot ever lower total entropy.

Boltzmann or Clausius, it makes no difference in this respect.
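
To make the "conversion factor" point concrete, here is a small Python sketch of a toy two-chamber gas with an assumed particle count (all values are illustrative): the microstate count W is a pure number, and multiplying ln(W) by Boltzmann's constant is exactly what attaches the J/K.

import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

N = 100              # assumed number of gas particles (toy size, for illustration)

def S(W):
    """Boltzmann entropy S = k_B * ln(W); the J/K comes entirely from k_B."""
    return k_B * math.log(W)

W_all_left = 1                     # every particle crammed into the left half: one arrangement
W_spread   = math.comb(N, N // 2)  # evenly spread: choose which 50 of the 100 sit on the left

print(W_spread / W_all_left)              # ~1.0e29 times more microstates when spread out
print(S(W_spread) - S(W_all_left), "J/K") # the corresponding entropy difference, in J/K

The spread-out macrostate wins simply because vastly more microstates realise it, which is the statistical reading of high entropy described in the post above.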

“ The Lord of delirious minds.”

Level 8

Since: Dec 10

Location hidden

#155860 Oct 2, 2013
Dogen wrote:
<quoted text>
Interesting. Nice dinner conversation topic. But what relationship to the chimera topic?
Not much, really; it's just what I said, though: that we really are a collection of organisms in symbiotic and parasitic relationships, though considered a single organism.
I was wrong, however, in saying a chimera is two organisms; it is actually a single organism with the parts of more than one merged together in a symbiotic relationship. But the fact that humans and all animal life are really collections of different organisms, in symbiosis, is also a truth. We can't live without a collection of bacteria and enzymes, etc. But we also have parasitic creatures we don't really need, and they don't go away.

http://www.ncbi.nlm.nih.gov/pubmed/9049991

Level 6

Since: Aug 07

North Miami Beach, FL

#155861 Oct 2, 2013
polymath257 wrote:
<quoted text>
No, he is actually correct here. You are the one that is applying the equations incorrectly.
<quoted text>
*sigh* The Boltzmann constant has units J/K, so the entropy also has such units.
Next, if you want to use the micro-canonical ensemble, which is appropriate for the entropy representation, you need your main variables to be the total energy U, the volume V, and the number of particles N. Also, these need to stay constant during your calculations.
In this representation, the temperature T is given by 1/T = dS/dU. In other words, an increase of internal energy by dU produces an increase of entropy of dU/T.
<quoted text>
And it is a dimensional constant with dimensions J/K. So the entropy will have units J/K. This is good because it gives the link between the statistical mechanics variable W (or M, or Omega) and quantities like temperature, pressure, and energy that we can actually measure.
<quoted text>
The confusion isn't Chimney's. It is yours.
So, now, how exactly do you calculate the number of micro-states to get your conclusions?
No. That's why it is called Boltzmann's principle. We can apply it to statistics such as the tossing of dice. If we calculated the W of the dice toss, would it make sense to you to give the probability in J/K? Come on man!

“I am Sisyphus”

Since: Nov 07

Location hidden

#155862 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
It's not an actual calculation of each atom in a house. If something is arranged in a very specific order, it is going to have fewer equivalent microstates than something that is just arranged at random. I suppose you need a much simpler example to put actual numbers on it. But the author, Creager, gives several examples with heating and cooling and the simplest case, and also the construction materials. It works with just about anything you can think of. And you guys still have not shown where any of the math is wrong, and you give no examples where it fails. You know what this means, right? The initial energy applied to genomes is causing the entropy of the genome to increase, confirming Sanford. But we already knew that.

As the genome has been evolving and doing just fine for probably 2 billion years this issue seems comical.

And you do know that an exploding house is in no way comparable to genomic evolution.

And Sanford was already refuted before his book even hit the shelves. Modern science is very efficient.

“ The Lord of delirious minds.”

Level 8

Since: Dec 10

Location hidden

#155863 Oct 2, 2013
polymath257 wrote:
<quoted text>
You just got to love the 'Belly Button Biodiversity Project'!
LOL grossology

Level 6

Since: Mar 12

Dubai, UAE

#155864 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
It's not an actual calculation of each atom in a house. If something is arranged in a very specific order, it is going to have fewer equivalent microstates than something that is just arranged at random. I suppose you need a much simpler example to put actual numbers on it. But the author, Creager, gives several examples with heating and cooling and the simplest case, and also the construction materials. It works with just about anything you can think of. And you guys still have not shown where any of the math is wrong, and you give no examples where it fails. You know what this means, right? The initial energy applied to genomes is causing the entropy of the genome to increase, confirming Sanford. But we already knew that.
Sorry, but the concept of "entropy of energy" used in thermodynamics is a basic violation of unit consistency, and all that follows in Creager's paper is pure nonsense because of that. That IS the math. In physics, equations always relate to units of something, not just pure numbers. A "unit inequality" is just as wrong as a numeric one.

For example a brick cannot weigh five seconds and the distance to the sun cannot be 93 million tonnes per yen. You cannot stuff up with the units. You cannot have an entropy of energy because energy does not have a temperature and J/K requires a temperature (K!).

And you have to understand that the most perfect possible application (addition) of energy cannot ever LOWER entropy, because there are no "negative Joules" and therefore J/K going down can only signal Joules of energy exiting the system, never entering it.

Perhaps if I say it five more times this basic stuff will start to sink in. Then you may see why Creager is bunkum.
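
The unit bookkeeping can be checked mechanically. Below is a minimal sketch using the third-party pint library (a tool chosen for illustration, not anything from the thread, and the numbers are assumed): Clausius's Q/T comes out in J/K by itself, and combining quantities with mismatched dimensions is rejected, which is the sense in which a "unit inequality" is as wrong as a numeric one.

import pint

ureg = pint.UnitRegistry()

Q = 500.0 * ureg.joule     # assumed heat dumped into the surroundings (illustrative)
T = 298.0 * ureg.kelvin    # assumed temperature at which it is dumped

dS = Q / T                          # entropy change carries J/K automatically
print(dS.to("joule / kelvin"))      # ~1.68 joule / kelvin

try:
    bad = Q + T                     # dimensionally inconsistent, like "a brick weighing five seconds"
except pint.DimensionalityError as err:
    print("unit error:", err)       # pint refuses, just as dimensional analysis does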

Level 6

Since: Aug 07

North Miami Beach, FL

#155865 Oct 2, 2013
Chimney1 wrote:
<quoted text>
Sorry, but the concept of "entropy of energy" used in thermodynamics is a basic violation of unit consistency, and all that follows in Creager's paper is pure nonsense because of that. That IS the math. In physics, equations always relate to units of something, not just pure numbers. A "unit inequality" is just as wrong as a numeric one.
For example a brick cannot weigh five seconds and the distance to the sun cannot be 93 million tonnes per yen. You cannot stuff up with the units. You cannot have an entropy of energy because energy does not have a temperature and J/K requires a temperature (K!).
And you have to understand that the most perfect possible application (addition) of energy cannot ever LOWER entropy, because there are no "negative Joules" and therefore J/K going down can only signal Joules of energy exiting the system, never entering it.
Perhaps if I say it five more times this basic stuff will start to sink in. Then you may see why Creager is bunkum.
Again, he is using Boltzmann's principle as applied to statistics. When will you realize this? Never I suppose.

Level 6

Since: Aug 07

North Miami Beach, FL

#155866 Oct 2, 2013
Chimney1 wrote:
<quoted text>
Sorry, but the concept of "entropy of energy" used in thermodynamics is a basic violation of unit consistency, and all that follows in Creager's paper is pure nonsense because of that. That IS the math. In physics, equations always relate to units of something, not just pure numbers. A "unit inequality" is just as wrong as a numeric one.
For example a brick cannot weigh five seconds and the distance to the sun cannot be 93 million tonnes per yen. You cannot stuff up with the units. You cannot have an entropy of energy because energy does not have a temperature and J/K requires a temperature (K!).
And you have to understand that the most perfect possible application (addition) of energy cannot ever LOWER entropy, because there are no "negative Joules" and therefore J/K going down can only signal Joules of energy exiting the system, never entering it.
Perhaps if I say it five more times this basic stuff will start to sink in. Then you may see why Creager is bunkum.
Do you get an answer in J/K when you calculate the probability of rolling a six-sided die?

Level 9

Since: Sep 08

Everett, WA

#155867 Oct 2, 2013
Chimney1 wrote:
<quoted text>
Sorry, but the concept of "entropy of energy" used in thermodynamics is a basic violation of unit consistency, and all that follows in Creager's paper is pure nonsense because of that. That IS the math. In physics, equations always relate to units of something, not just pure numbers. A "unit inequality" is just as wrong as a numeric one.
For example a brick cannot weigh five seconds and the distance to the sun cannot be 93 million tonnes per yen. You cannot stuff up with the units. You cannot have an entropy of energy because energy does not have a temperature and J/K requires a temperature (K!).
And you have to understand that the most perfect possible application (addition) of energy cannot ever LOWER entropy, because there are no "negative Joules" and therefore J/K going down can only signal Joules of energy exiting the system, never entering it.
Perhaps if I say it five more times this basic stuff will start to sink in. Then you may see why Creager is bunkum.
Creationists quite often make this sort of mistake. They get outside of their specialty and they make a starting argument that is pure nonsense.

For example the many "odds" arguments can almost always be debunked by showing that the presuppositions they are based upon are nonsense. At that point no more math is necessary.

“Think&Care”

Since: Oct 07

Location hidden

#155868 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
No. That's why it is called Boltzmann's principle. We can apply it to statistics such as the tossing of dice. If we calculated the W of the dice toss, would it make sense to you to give the probability in J/K? Come on man!
If you want to apply the SLoT, then you need the thermodynamic entropy, which has Boltzmann's constant in it. And the correspondence between this and Carnot's definition is Boltzmann's principle.

As you say, come on man!

“Think&Care”

Since: Oct 07

Location hidden

#155869 Oct 2, 2013
Urban Cowboy wrote:
<quoted text>
Again, he is using Boltzmann's principle as applied to statistics. When will you realize this? Never I suppose.
Boltzmann's principle is a principle in statistical mechanics. And that is part of physics. In particular, it relates the number of available micro-states (W) to the thermodynamic entropy (S) by S = k*ln(W), where k is Boltzmann's constant. Because k has units of J/K, so does S, which is required for it to be the thermodynamic entropy. And if you want to apply the SLoT, then you need to be using the thermodynamic entropy.
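
A small Python check of that correspondence for one textbook case, the isothermal doubling of an ideal gas's volume (one mole at an assumed 298 K; the values are chosen only for illustration): counting microstates with S = k*ln(W) and using Clausius's Q/T along a reversible path give the same number of J/K.

import math

k_B = 1.380649e-23     # Boltzmann's constant, J/K
N_A = 6.02214076e23    # Avogadro's number
N   = N_A              # one mole of gas particles (illustrative choice)
T   = 298.0            # temperature, K, held fixed along the reversible path

# Statistical side: doubling the volume doubles each particle's positional options,
# so W2/W1 = 2**N and dS = k_B * ln(2**N) = N * k_B * ln(2).
dS_boltzmann = N * k_B * math.log(2)

# Thermodynamic side: for a reversible isothermal doubling, Q_rev = N * k_B * T * ln(2),
# and Clausius gives dS = Q_rev / T.
Q_rev = N * k_B * T * math.log(2)
dS_clausius = Q_rev / T

print(dS_boltzmann, "J/K")   # ~5.76 J/K
print(dS_clausius,  "J/K")   # same value: the two definitions of entropy agree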
