
Should evolution be taught in high school?

There are 179505 comments on the www.scientificblogging.com story from Feb 24, 2008, titled Should evolution be taught in high school?. In it, www.scientificblogging.com reports that:

Microbiologist Carl Woese is well known as an iconoclast. At 79 years of age, Woese is still shaking things up. Most recently, he stated in an interview with Wired that...

"My feeling is that evolution shouldn't be taught at the lower grades. You don't teach quantum mechanics in the grade schools. One has to be quite educated to work with these concepts; what they pass on as evolution in high schools is nothing but repetitious tripe that teachers don't understand."

Join the discussion below, or Read more at www.scientificblogging.com.

Level 6

Since: Aug 07

Arlington, VA

#156337 Oct 9, 2013
polymath257 wrote:
<quoted text>
I understand what you say. I also know that you are wrong. If you want a change in entropy, you find the entropy at two different times and subtract.
Easy. Except, I guess, for you.
Not two different times. Two different microstates. It's the difference between them. There's no time difference. They are separate analyses done at the exact same time. In the original example there were raw materials subjected to the two energies. We are not taking the raw materials' microstate. We are comparing the end product of applying either a construction crew or a bomb to the same raw materials.

“Think&Care”

Since: Oct 07

Location hidden

#156340 Oct 9, 2013
Urban Cowboy wrote:
<quoted text>
Not two different times. Two different microstates.
This alone shows you don't know what a micro-state is and how the entropy is computed.

No, you are NOT looking at the difference between two micro-states. You are looking at the difference in entropy of the system between two times. The entropy is defined by the *number* of micro-states. You *never* look at only two different micro-states in these matters.
It's the difference between them. There's no time difference. They are separate analyses done at the exact same time.
Completely and utterly wrong.
In the original example there were raw materials subjected to the two energies. We are not taking the raw materials' microstate. We are comparing the end product of applying either a construction crew or a bomb to the same raw materials.
Wow. Just, wow. So, you have two scenarios: raw materials-->house, and raw materials-->bomb. Those raw materials are not present at the same time as the house or the aftermath of the bomb.

And, of course, once again you forget to include the energy released into the environment. And when you include that, the total entropy change is more for the construction crew than it is for the bomb.

Level 6

Since: Mar 12

Location hidden

#156341 Oct 9, 2013
Urban Cowboy wrote:
<quoted text>
Not two different times. Two different microstates. It's the difference between them. There's no time difference. They are separate analyses done at the exact same time. In the original example there were raw materials subjected to the two energies. We are not taking the raw materials' microstate. We are comparing the end product of applying either a construction crew or a bomb to the same raw materials.
And if you want to see the entropy picture of the end products, you have to look at ALL the end products, which includes the amount of entropy added to the heat sink in both processes. See, you don't just have a house and a bomb site. In each case you have what is on site PLUS whatever has been added to the surroundings. You have to account for all the energy applied and where it has ended up in each process. And whichever process has generated the most heat has generated the most entropy, as I have been saying right from the start.

For all your digressions, that simple fact is still true whether you use the Clausius or the Boltzmann approach to solving it.
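A minimal sketch of the bookkeeping described above, with made-up numbers: the total entropy change is whatever happens on site plus the heat dumped into the surroundings divided by their temperature (dS_surroundings is roughly Q/T for a large heat sink).

```python
def total_entropy_change(dS_onsite, heat_to_surroundings, T_surroundings):
    """What is on site PLUS what the released heat adds to the surroundings.
    dS_surroundings is approximated as Q/T for a large heat sink at fixed T."""
    return dS_onsite + heat_to_surroundings / T_surroundings

# Purely hypothetical numbers (J/K, J, K) to show the structure of the accounting;
# whichever process dumps more heat into the sink dominates the total.
house = total_entropy_change(dS_onsite=-50.0, heat_to_surroundings=2.0e5, T_surroundings=300.0)
bomb  = total_entropy_change(dS_onsite=+50.0, heat_to_surroundings=1.0e4, T_surroundings=300.0)

print(house, bomb)
```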

“See how you are?”

Level 5

Since: Jul 12

Earth

#156342 Oct 9, 2013
one way or another wrote:
I wish one of you had a brain, really I do.
You mean one to spare, of course.

“No such thing as ABIODARWINISM”

Level 9

Since: Jan 11

No ABIODARWINISTS either!

#156343 Oct 9, 2013
polymath257 wrote:
<quoted text>
That is correct.
Well that is good to know that you guys have completely lost me.

Been enjoying the discussion. Very educational.

Level 6

Since: Aug 07

Arlington, VA

#156344 Oct 9, 2013
OK, Chimney and Poly, this is my final comment on the matter. There is no sense in continuing this conversation since you two refuse to cooperate with me on how statistical entropy is used to evaluate entropy as various energies are applied to various systems. I have supported the formulas and forms with university papers, lessons, texts, wiki pages, and youtube videos. Here it is again. It works. Its predictions ARE devastating to both abiogenesis and forward/vertical evolution.

Entropy is defined as follows:

S = k log W

where:

S = entropy
k = Boltzmann constant = 1.380662 x 10^-23 J/K
W = the number of equivalent micro states (possible arrangements) of a system.

Also note that W is a natural number so that S is either zero or positive. This formula comes from the Wiki page on Entropy (statistical thermodynamics).

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

The change in S can be defined as:

dS = S2 - S1

Plugging in gives:

dS = k log W2 - k log W1

or

dS = k log W2/W1
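A minimal numeric sketch of these two equivalent forms, using the natural logarithm (the standard choice for Boltzmann's formula) and hypothetical microstate counts W1 and W2:

```python
import math

K_B = 1.380662e-23  # Boltzmann constant, J/K (value quoted above)

def entropy(W):
    """S = k ln W for a system with W equally probable microstates."""
    return K_B * math.log(W)

# Hypothetical microstate counts before and after some process.
W1, W2 = 1.0e20, 4.0e20

dS_difference = entropy(W2) - entropy(W1)   # dS = k log W2 - k log W1
dS_ratio = K_B * math.log(W2 / W1)          # dS = k log (W2/W1)

print(dS_difference, dS_ratio)  # identical; positive here only because W2 > W1
```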

This explains why adding more randomness to a system makes it more random and adding more order to a system makes it more organized. So applying energy to a system in a manner more random than that system will increase the entropy of that system and applying energy to a system in a manner less random than that system will decrease the entropy of that system.

We want to know what happens to the entropy of a system when energy is applied to it. We have two microstates; one for the energy and one for the system. A microstate is a specific configuration of a system that the system may occupy with a certain probability in the course of its fluctuations.

We want a relationship that explains what happens when energy is applied to all the microstates. So each energy microstate is related to each system microstate. When evaluating whether or not an energy microstate affects a system microstate, there are only two possible outcomes:

1. The component of the system gets energy applied:

W(2n)= W(en)

or

2. The component does not get energy applied:
W(2n)= W(1n)

This gives the Entropy Change Formula (dS) formula:
dS = k log We/Ws

Which means applying energy to a system will change the entropy of that system. This formula predicts that energy applied to a system in a manner more random than that system will increase the entropy of that system; and energy applied to a system in a manner less random than that system will decrease the entropy of that system. To illustrate the various combinations that are possible, the following schema is developed:

A = High W energy; B = Low W energy; C = High W system; and D = Low W system.

CASE I:
If A --> C, then S decreases (little)
Examples: Surgery, genetic engineering, cloning, robotics, etc.

CASE II:
If A --> D, then S decreases (much)
Examples: Construction, books, music, ID, etc.

CASE III:
If B --> C, then S increases (much)
Examples: Bomb, rust, skin cancer, mutations, etc.

CASE IV:
If B --> D, then S increases (little)
Examples: melting, evaporation, expansion, etc.

The entropy change formula shows the difference between construction work and a bomb. Construction work has a We smaller than the Ws of the raw material, while in a bomb a We is larger than the Ws of the raw material.

“No such thing as ABIODARWINISM”

Level 9

Since: Jan 11

No ABIODARWINISTS either!

#156345 Oct 9, 2013
DanFromSmithville wrote:
<quoted text>Well that is good to know that you guys have completely lost me.
Been enjoying the discussion. Very educational.
Darn it. Should be "haven't completely lost me".

“Don't get me started”

Level 1

Since: Jul 09

Minneapolis

#156346 Oct 9, 2013
Urban Cowboy wrote:
<quoted text>
I already did that and much more. Did very well actually. You might be able to pass the basic introductory courses but you have great difficulty in applying those principles to new concepts. I still don't think you understand the role of Boltzmann's constant in the statistical entropy formula. And I'm pretty sure you don't know what a microstate is. And I have no idea how you can confuse information theory into this mix. Until you understand these things you are going to keep reaching for answers in the wrong places. You need to study more on statistical entropy, microstates, the role of Boltzmann's constant, and re-read Creager's paper a few more times. You will see that his work agrees in principle with the common understanding of these areas.
I do admit that the math in this conversation is way past my understanding. So of course I have to look at other factors to see if there is anything that I can get a handle on.

There is one thing that should stand way, way out to anyone who has any interest in science. That point is: If Creager is properly using statistical entropy to show that evolution violates the SLoT, wouldn't his work make him a contender for the Nobel Prize? This should be right there with the Higgs Boson, right?

So why are all the classic physicists ignoring him?

“Nihil curo de ista tua stulta ”

Since: May 08

Orlando

#156347 Oct 9, 2013
Fundies say there's no practical use for the ToE.

http://phys.org/news/2013-10-crop-productivit...

A step towards increasing crop productivity
October 9th, 2013 in Biology / Biotechnology

(Phys.org) A breakthrough in understanding the evolutionary pathways along which some crops have become significantly more productive than others may help scientists boost yields of some staple foodstuffs.

Research carried out at Cambridge and Oxford Universities, and published last week in the journal eLIFE, makes an important contribution to worldwide efforts to develop high-yielding crops by mimicking the natural processes of evolution that have led some plants to be more productive than others.

Crops can be divided into two broad categories in terms of the way in which they use photosynthesis to convert sunlight and water into carbohydrates. In C3 plants carbon dioxide is first fixed into a compound containing three carbon atoms, while in C4 plants carbon dioxide is initially fixed into a compound containing four carbon atoms.

This seemingly minor variation in photosynthesis makes an important difference: C4 plants are around 50% more efficient than C3 plants, and despite accounting for just 3% of plant species, C4 plants contribute 30% to terrestrial productivity.

The world faces pressure from a growing population, and productive land is increasingly at a premium. One way to improve yields without cultivating more land is to engineer crops to use C4 photosynthesis. To do this, scientists must understand the evolutionary steps that lead from C3 to C4 photosynthesis.


All C4 plants evolved from C3 plants. Scientists think that this process took place over many millions of years. No one knows exactly what causes the sequence of changes that makes it possible for plants to learn this trick, and although the C4 pathway is considered highly complex, this system has evolved independently in many groups of plants.

A collaboration bringing together plant sciences and mathematics initiated by Drs Ben Williams and Iain Johnston has revealed the series of events that allowed plants using the C4 pathway to evolve from C3 plants.

Their work on evolutionary pathways may help scientists to engineer current C3 crops to use the more efficient C4 pathway and, because of the increased productivity, improve world food security. In doing so, scientists will be mimicking and speeding up the natural variations that have taken place in wild species.

Williams assessed the presence or absence of 16 traits known to be important for the C4 pathway in 73 different plants, some using C4 photosynthesis, some using the C3 pathway, and others that seem to use a blend of both C3 and C4. Johnston then developed Bayesian modelling techniques to produce a model that predicts the steps associated with this highly complex evolutionary process. The model was underpinned by data occupying a 16-dimensional space with 65,536 nodes.
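The "16-dimensional space with 65,536 nodes" is simply every presence/absence combination of the 16 traits, since 2^16 = 65,536; a quick sketch (the trait identities themselves are not listed in this excerpt):

```python
from itertools import product

N_TRAITS = 16  # presence (1) or absence (0) of each C4-related trait

# Every node in the state space is one combination of the 16 traits.
nodes = list(product((0, 1), repeat=N_TRAITS))

print(len(nodes))            # 65536 == 2**16
print(nodes[0], nodes[-1])   # the all-absent and all-present corners of the space
```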

Dr Hibberd said: "What their work reveals provides incredible new insight into a complex evolutionary process and furthermore is essentially positive news for those of us interested in engineering more productive staple food stuffs such as rice. This is because the work shows that there is significant flexibility in the evolutionary paths that plants have used to get from C3 to C4 photosynthesis.

"This finding therefore implies that the engineering effort is not constrained to only one route. This should help scientists to develop crops with significantly improved yields to feed the world. Like the proverbial roads that all lead to Rome, Ben and Iain have shown that there are many routes taken by plants in the evolutionary process towards C4 photosynthesis."

<truncated to fit Topix limitations. More at link above>
Mugwump

London, UK

#156348 Oct 9, 2013
appleboy wrote:
<quoted text>
I do admit that the math in this conversation is way past my understanding. So of course I have to look at other factors to see if there is anything that I can get a handle on.
There is one thing that should stand way, way out to anyone who has any interest in science. That point is: If Creager is properly using statistical entropy to show that evolution violates the SLoT, wouldn't his work make him a contender for the Nobel Prize? This should be right there with the Higgs Boson, right?
So why are all the classic physicists ignoring him?
An excellent question that no doubt will get a dumbass response.

Level 6

Since: Mar 12

Location hidden

#156349 Oct 9, 2013
Urban Cowboy wrote:
.
Entropy is defined as follows:
S = k log W
where:
S = entropy
k = Boltzmann constant = 1.380662 x 10^-23 J/K
W = the number of equivalent micro states (possible arrangements) of a system.
(Note the statistical definition includes J/K!)
The change in S can be defined as:
dS = S2 - S1
Plugging in gives:
dS = k log W2 - k log W1
or
dS = k log W2/W1
(Note that a "change" refers to before and after)
This explains why adding more randomness to a system makes it more random and adding more order to a system makes it more organized.
And yet again, this statement simply does not follow from the maths you presented. THIS simple point, right from the start, is what you have been completely unable to explain because it's incorrect. Or at least, it does NOT follow from what went before.

Creager inexplicably tries the same trick in another way too.

He says "It is known that heating a system produces a DS > 0, while cooling a system produces a DS < 0" TRUE.

Then he says...

"When a system is cooled the electromagnetic forces between molecules are better able to guide the molecular motion such that the energy of these electromagnetic forces are applied to the system as a whole..." DREAMING! What utter nonsense.

Quite simply, when a system is COOLED, some of the random kinetic energy of the molecules is REMOVED, reducing the number of available microstates of the system. This is the simple, fundamental point that you and he keep trying to avoid, the ACTUAL thing that reduces the entropy. NOT the "way the energy was applied", but the fact that it was REMOVED, that reduced the entropy.
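A minimal sketch of that point, using the standard constant-volume result dS = n*Cv*ln(T2/T1) for an ideal gas (assumed monatomic; the numbers are illustrative): removing energy (T2 < T1) gives a negative entropy change, adding it gives a positive one.

```python
import math

R = 8.314        # gas constant, J/(mol K)
Cv = 1.5 * R     # molar heat capacity at constant volume, monatomic ideal gas

def dS_constant_volume(n_mol, T1, T2):
    """dS = n * Cv * ln(T2/T1): negative when kinetic energy is removed (cooling)."""
    return n_mol * Cv * math.log(T2 / T1)

print(dS_constant_volume(1.0, 300.0, 250.0))  # cooling: dS < 0
print(dS_constant_volume(1.0, 250.0, 300.0))  # heating: dS > 0
```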

You have gone on a merry-go-round through everything you could think of to avoid this simple and well-established fact of physics.

You know as well as Creager does, by now, that the only reason he does not talk about the critical function of removing energy is that it shoots his "entropy of applied energy" into the waste basket.

Where it belonged anyway, because like it or not, energy cannot = energy/temperature, AND "statistical entropy" is just another way of describing thermodynamic entropy, as ANY physics graduate (outside of Bob Jones University) will be able to tell you.

I can even go through it, step by step as Clausius and Boltzmann did, and explain exactly why. But if you were really interested in learning, rather than just backing any old sham attack on evolution, you would have done that yourself by now.

Creager's paper is a joke - on you.

“See how you are?”

Level 5

Since: Jul 12

Earth

#156350 Oct 9, 2013
Chimney1 wrote:
<quoted text>
And if you want to see the entropy picture of the end products, you have to look at ALL the end products, which includes the amount of entropy added to the heat sink in both processes. See, you don't just have a house and a bomb site. In each case you have what is on site PLUS whatever has been added to the surroundings. You have to account for all the energy applied and where it has ended up in each process. And whichever process has generated the most heat has generated the most entropy, as I have been saying right from the start.
For all your digressions, that simple fact is still true whether you use the Clausius or the Boltzmann approach to solving it.
Whether the same amount of energy is applied to a system in a random burst or with deliberation and slowness makes no difference to the product unless you factor in other variables, in which case you are measuring >those variables< by the difference in product. 5 = 5, and 1+1+1+1+1 = 5. It took a fraction of a second longer to validate the latter string of numbers, but 5 remains equal to 5 no matter how many times you circle the roundabout. At the end of the day, hammer strikes versus bomb blast, or RG machine versus pile of junk, is completely irrelevant to the entropy of the system when the total microstates are equivalent.
Urb insists that an equal amount of wood and steel has greater entropy if it has been sawed into 2x4's and mulch and assembled in just-such-a-way than if it is a dead tree and a heap of nails. Anyone else would realize that the form doesn't matter - they are in every aspect except shape and distribution exactly the same thing. His entire argument revolves around an arbitrarily specific and preferential interpretation of "order" (aka "better", aka portable goalpost) that >must< include artificial manipulation and be beneficial to the agenda. He would be the art critic who fawns over accidentally spilled paint as a masterwork, until he finds it was not spilled by Jackson Pollock.

“Think&Care”

Since: Oct 07

Location hidden

#156351 Oct 9, 2013
Urban Cowboy wrote:
OK, Chimney and Poly, this is my final comment on the matter. There is no sense in continuing this conversation since you two refuse to cooperate with me on how statistical entropy is used to evaluate entropy as various energies are applied to various systems. I have supported the formulas and forms with university papers, lessons, texts, wiki pages, and youtube videos. Here it is again. It works. Its predictions ARE devastating to both abiogenesis and forward/vertical evolution.
Entropy is defined as follows:
S = k log W
where:
S = entropy
k = Boltzmann constant = 1.380662 x 10^-23 J/K
W = the number of equivalent micro states (possible arrangements) of a system.
Also note that W is a natural number so that S is either zero or positive. This formula comes from the Wiki page on Entropy (statistical thermodynamics).
http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
The change in S can be defined as:
dS = S2 - S1
Plugging in gives:
dS = k log W2 - k log W1
or
dS = k log W2/W1
Everything up to this is correct. Notice that entropy has the factor k out in front, which has units J/K. Also, note that we are looking at the change in entropy, which means the difference in the entropy at two times.
This explains why adding more randomness to a system makes it more random and adding more order to a system makes it more organized. So applying energy to a system in a manner more random than that system will increase the entropy of that system and applying energy to a system in a manner less random than that system will decrease the entropy of that system.
And this is where you fall off the horse. This does NOT follow from the above (correct) equations.
We want to know what happens to the entropy of a system when energy is applied to it. We have two microstates; one for the energy and one for the system. A microstate is a specific configuration of a system that the system may occupy with a certain probability in the course of its fluctuations.
And here again we have the problem that we are definitely *not* simply looking at two micro-states. We are comparing the *number* of micro-states. In fact, according to the (correct) equation above, we need to look at the *ratio* between the number of micro-states on either side of the change.
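A textbook illustration of comparing *numbers* of microstates rather than two individual microstates: in a free expansion that doubles the volume available to N ideal-gas particles, each particle has twice as many places to be, so W2/W1 = 2^N and dS = N*k*ln 2 (the N below is illustrative).

```python
import math

K_B = 1.380662e-23  # Boltzmann constant, J/K

def dS_free_expansion(N_particles, volume_ratio):
    """dS = k * ln(W2/W1) with W2/W1 = volume_ratio**N_particles for an ideal gas."""
    return N_particles * K_B * math.log(volume_ratio)

print(dS_free_expansion(6.022e23, 2.0))  # ~5.76 J/K for one mole doubling its volume
```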
We want a relationship that explains what happens when energy is applied to all the microstates. So each energy microstate is related to each system microstate.
And again, this is false and shows a lack of understanding of the concept of a micro-state.
When evaluating whether or not an energy microstate affects a system microstate, there are only two possible outcomes:
1. The component of the system gets energy applied:
W(2n)= W(en)
Why would we expect the *number* of micro-states to be the same? That is what your formula says.
or
2. The component does not get energy applied:
W(2n)= W(1n)
Again, wrong. There is no reason for the *number* of micro-states to be the same.
This gives the Entropy Change Formula (dS) formula:
dS = k log We/Ws
Which, again, is another leap. It does not follow from the previous (wrong) material and definitely doesn't follow from the (correct) material you started with.
Which means applying energy to a system will change the entropy of that system. This formula predicts that energy applied to a system in a manner more random than that system will increase the entropy of that system; and energy applied to a system in a manner less random than that system will decrease the entropy of that system.
And specific examples have shown this conclusion is false in practice.
So,
1) Your derivation is flawed. You start with correct material and apply it incorrectly.
2) The conclusion itself is wrong. Specific examples (hammer on a RG machine, laser shining on black paper vs a mirror) show it to be wrong in *reality*.
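A rough sketch of the laser example: the same beam (the same "applied energy") produces very different entropy changes depending on how much of it the target actually absorbs, approximating dS as Q_absorbed/T for a target that stays near temperature T. The power, time, and absorptivities below are made-up illustrative values.

```python
def dS_target(power_watts, seconds, absorptivity, T_kelvin):
    """Approximate entropy generated in the target: dS ~ Q_absorbed / T,
    valid when absorption barely changes the target's temperature."""
    Q_absorbed = power_watts * seconds * absorptivity
    return Q_absorbed / T_kelvin

# Identical beam, different targets (illustrative numbers):
print(dS_target(1.0, 10.0, absorptivity=0.95, T_kelvin=300.0))  # black paper: absorbs most
print(dS_target(1.0, 10.0, absorptivity=0.02, T_kelvin=300.0))  # mirror: absorbs almost none
```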

“I am Sisyphus”

Since: Nov 07

Location hidden

#156352 Oct 9, 2013
polymath257 wrote:
<quoted text>
Yes, count the micro-states at a particular time. That will give the entropy.
Now, to find the *change* in entropy, you need to find the entropy at two times and subtract. This is so basic I find it impossible you can't grasp this.
In other words, you are faking your lack of understanding, which simply makes you a troll.

He seems to think energy is poofed away magically instantaneously.

“I am Sisyphus”

Since: Nov 07

Location hidden

#156353 Oct 9, 2013
Urban Cowboy wrote:
<quoted text>
Not two different times. Two different microstates. It's the difference between them. There's no time difference.

BWHAHAHAHAHAHAHAHA!!!!!!

Define microstate
Define ordered.

"The second law of thermodynamics describes how the entropy of an isolated system changes in time."

http://en.wikipedia.org/wiki/Microstate_%28st...

DUH!


“I am Sisyphus”

Since: Nov 07

Location hidden

#156354 Oct 9, 2013
one way or another wrote:
So now the evo morons are learning to be childish, stupid and calm. Lmao
Way ta go morons.

You are our patron 'aint.

“I am Sisyphus”

Since: Nov 07

Location hidden

#156355 Oct 9, 2013
Urban Cowboy wrote:
OK, Chimney and Poly, this is my final comment on the matter. There is no sense in continuing this conversation since you two refuse to cooperate with me on how statistical entropy is used to evaluate entropy as various energies are applied to various systems. I have supported the formulas and forms with university papers, lessons, texts, wiki pages, and youtube videos.

No, you have supported poly and chimney's positions with those links. You have simply failed to understand what they said.
It certainly is not worth continuing this discussion with you. All we are doing at this point is demonstrating you don't get the basic concepts and cannot define the basic terms.
Urban Cowboy wrote:
Here it is again. It works. Its predictions ARE devastating to both abiogenesis and forward/vertical evolution.

Hysterical.
Urban Cowboy wrote:

Entropy is defined as follows:
S = k log W
where:
S = entropy
k = Boltzmann constant = 1.380662 x 10^-23 J/K
W = the number of equivalent micro states (possible arrangements) of a system.
Also note that W is a natural number so that S is either zero or positive. This formula comes from the Wiki page on Entropy (statistical thermodynamics).
http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
The change in S can be defined as:
dS = S2 - S1
Plugging in gives:
dS = k log W2 - k log W1
or
dS = k log W2/W1
This explains why adding more randomness to a system makes it more random and adding more order to a system makes it more organized.

This is right so far. But now you are going to go off the deep end because you do not understand what "random" and "ordered" mean in the context of entropy. You are using common definitions of the words and not considering how they are used in entropy.

“I am Sisyphus”

Since: Nov 07

Location hidden

#156356 Oct 9, 2013
appleboy wrote:
<quoted text>
I do admit that the math in this conversation is way past my understanding. So of course I have to look at other factors to see if there is anything that I can get a handle on.
There is one thing that should stand way, way out to anyone who has any interest in science. That point is: If Creager is properly using statistical entropy to show that evolution violates the SLoT, wouldn't his work make him a contender for the Nobel Prize? This should be right there with the Higgs Boson, right?
So why are all the classic physicists ignoring him?

Because he is a loon.

Oh, your question was rhetorical. Sorry. I may have missed that.

“Seventh son”

Level 8

Since: Dec 10

Will Prevail

#156357 Oct 9, 2013
appleboy wrote:
<quoted text>
I do admit that the math in this conversation is way past my understanding. So of course I have to look at other factors to see if there is anything that I can get a handle on.
There is one thing that should stand way, way out to anyone who has any interest in science. That point is: If Creager is properly using statistical entropy to show that evolution violates the SLoT, wouldn't his work make him a contender for the Nobel Prize? This should be right there with the Higgs Boson, right?
So why are all the classic physicists ignoring him?
One reason he is wrong that evolution violates the SLoT is that one of the criteria used when searching for extraterrestrial life is to look for systems that show signs of an entropy reduction.
The biosphere is a system that reduces entropy, because organisms on Earth scavenge some of the waste products of others.
Microbiologists say DNA has the ability to correct its code, but natural selection also weeds out part of the negative entropy.
Then there is the fact that if DNA becomes too disordered it would not produce a living organism. So there are many examples as to why evolution and life itself are somewhat immune to gathering and passing on negative entropy.

Level 6

Since: Aug 07

Tarentum, PA

#156358 Oct 9, 2013
appleboy wrote:
<quoted text>
I do admit that the math in this conversation is way past my understanding. So of course I have to look at other factors to see if there is anything that I can get a handle on.
There is one thing that should stand way, way out to anyone who has any interest in science. That point is: If Creager is properly using statistical entropy to show that evolution violates the SLoT, wouldn't his work make him a contender for the Nobel Prize? This should be right there with the Higgs Boson, right?
So why are all the classic physicists ignoring him?
No, the Nobel Prize is reserved for the extreme left-wingnuts like Al Gore for spreading liberal anti-science propaganda and other pro-evolution, pro-Big bang garbage. Creager would probably gag at the thought of it.
