No. The environment is irrelevant. The isothermal system at measurement is in equilibrium. That is stipulated. You are either confabulating or intentionally obstructing by creating distractions.
Yes, you do have to consider the effects on the environment. And yes, that means the whole universe. Now, that is usually fairly simple once you consider the energy crossing the boundaries, etc, but it *does* have to be taken into account.
So no, it is NOT just the location of the particles of the raw material that needs to be considered: it is the *total* effect.
Should evolution be taught in high school?
There are 180,392 comments on the www.scientificblogging.com story from Feb 24, 2008, titled "Should evolution be taught in high school?" In it, www.scientificblogging.com reports that:
Microbiologist Carl Woese is well known as an iconoclast. At 79 years of age, Woese is still shaking things up. Most recently, he stated in an interview with Wired that...
"My feeling is that evolution shouldn't be taught at the lower grades. You don't teach quantum mechanics in the grade schools. One has to be quite educated to work with these concepts; what they pass on as evolution in high schools is nothing but repetitious tripe that teachers don't understand." Read more at www.scientificblogging.com.
Since: Aug 07 12,216 
#156943
Oct 16, 2013

Since: Aug 07 12,216 
#156944
Oct 16, 2013
I never claimed any such thing about hats. And I have asked you several times who you were and what you did, and you said you could not say. So I have no idea who you are. You seem to have done some reading, but I don't know if you even went to college. I told you I had a career in the National Weather Service and had to take all the engineering physics, calculus, thermodynamics, etc. Look at any university curriculum for meteorology. So I doubt you are going to "blow me away" with your secret education/career that you can't disclose. I'll tell you what I think you are: probably a junior high school teacher. Probably teach English or history. Those are the ones that match your level of snarky arrogance. Big mouth and very little substance.
Since: Aug 07 12,216 
#156945
Oct 16, 2013
But here you go again. You don't discuss the science or anything relevant or any specifics. Instead, all you can do is trash talk.
“Don't get me started” Since: Jul 09 4,907 Minneapolis 
#156946
Oct 16, 2013
But entropy IS about before and after. 
Since: Aug 07 12,216 
#156947
Oct 16, 2013
You are still discussing this as if it were thermodynamic entropy. This is a different subject: statistical entropy. It deals in probabilities. We are taking statistics on microstates. There are no temperature differences because it is stipulated that the system is at equilibrium when we count the microstates. When we count the microstates, it is at a single instant in time. If you want to discuss Creager's, would you please at least stay on the same subject? This is as if we have to walk up 5 flights of stairs to reach our destination and you can't even take the first step.
“Don't get me started” Since: Jul 09 4,907 Minneapolis 
#156948
Oct 16, 2013
I still do not understand the purpose of this conceptual problem that cannot apply to the universe that we live in. 
Since: Aug 07 12,216 
#156949
Oct 16, 2013
Not necessarily. The change can be calculated, but the entropy can also be measured instantaneously. The formula for that is S = k log W, and that is not the change. The change is ΔS = k log(W2/W1). There are many, many forms that it can take.
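As a sketch of how the two formulas in that post relate numerically (not from the thread itself; the function names and microstate counts below are illustrative assumptions):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """Absolute entropy at equilibrium: S = k ln W."""
    return K_B * math.log(W)

def entropy_change(W1, W2):
    """Change in entropy between two equilibrium states: dS = k ln(W2 / W1)."""
    return K_B * math.log(W2 / W1)

# Doubling the number of accessible microstates adds k ln 2 to the entropy,
# regardless of how large W was to begin with. This is just the difference
# of two absolute entropies, since ln(W2/W1) = ln W2 - ln W1.
dS = entropy_change(1e20, 2e20)
```

The ratio form is why the change formula is convenient: the (astronomically large) absolute counts cancel out.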
Since: Aug 07 12,216 
#156950
Oct 16, 2013
Did you listen to what the professor said? Statistical mechanics has many applications. They use it in science, medicine, and even in finance, in information, in many areas. Listen to him in Lesson 1: He said "statistical mechanics is a mathematical structure which has many applications". "In a nutshell, statistical mechanics is simply probability theory, which can be applied to many different circumstances." 
“ad victoriam” Since: Dec 10 43,201 arte et marte 
#156951
Oct 17, 2013
No, this is where you're wrong. Statistical mechanics uses probability and mathematical equations to define what is going on at the particle/molecular level of a larger system. To answer questions like what happens to the atoms in H2O as it is heated to a boil, or cooled to freezing. http://en.wikipedia.org/wiki/Statistical_mech... 
“ad victoriam” Since: Dec 10 43,201 arte et marte 
#156952
Oct 17, 2013
The formulas can be used to calculate other things that use very large numbers. But.... We have a winner! Probability theory is um.... drum roll ..........Tah Dah....Probability theory 
Birkenhead, UK 
#156953
Oct 17, 2013
Aw crap! It got a transfer. 
Since: Aug 07 12,216 
#156954
Oct 17, 2013
That's not what he said smartass. He said statistical mechanics is probability theory, which of course includes Boltzmann's formula. You guys hate all this stuff (real science) because it so obviously means the end of the ideological "common descent" evolution. 
“Think&Care” Since: Oct 07 24,462 Location hidden 
#156955
Oct 17, 2013
But if you are interested in the conservation of energy, you need to compare the energy at two different times. Why would that be confusing?
But you would NOT simply look at how much weight their livers gained or lost. You would have to consider the whole system.
Good question. How do you go about counting them?
Are you sure? Why do you think that is relevant? Is the robot part of the system? Is the bomb?
Please provide details for this claim.
Next, and once again, you have ignored the effect of this application of energy on the environment. How did the number of microstates in the environment change? And yes, that is a crucial aspect of all of this.
And I have shown specific examples where this is wrong. I have shown that the 'ordered' application of energy can increase entropy. There are also examples where the disordered application can decrease entropy.
“Think&Care” Since: Oct 07 24,462 Location hidden 
#156956
Oct 17, 2013
Suppose we took two huge boxes, neither of which lets out heat or matter. We put the raw materials into each box along with a bomb and a robot, so the materials in each box are identical. Then, in one box we set off the bomb, and in the other box we let the robot build a house. We then wait until the contents of both boxes are at equilibrium. How does the entropy of one box compare to the entropy of the other box? The answer? They are the *same*.
“Think&Care” Since: Oct 07 24,462 Location hidden 
#156957
Oct 17, 2013
The basic ideas of probability theory can be applied to information theory and to statistical mechanics. The formulas are even very similar, so the calculations are very close. There are *four* types of entropy that can be considered:
1. Information theory. We have a probability distribution with probabilities p_i. The entropy of the distribution is then given by S = -sum p_i log(p_i). Typically, the logarithm used is to the base 2, but that is not a central requirement.
2. Nonequilibrium statistical mechanics. Here, we look at the probability distribution associated with a particular macrostate. The entropy is now S = -k sum p_i ln(p_i). There are two differences from the information theory version of entropy. First, the logarithm is always the natural logarithm (base e). Second, we always multiply the result by Boltzmann's constant.
3. Equilibrium statistical mechanics. Here, we count the number of microstates (meaning the distinct quantum states), W. Then the entropy is given by S = k ln(W). Again we use Boltzmann's constant. This is consistent with the second version of entropy because in equilibrium all microstates are equally probable, so each p_i = 1/W.
4. Thermodynamic entropy. Here, we define the entropy of a pure crystal at zero kelvin to be 0. And then we define the change in entropy through dS = dQ/T, where dQ is the reversible heat change in the system. The second law of thermodynamics is about this version of entropy. It says that the entropy of an isolated system will increase over time.
Now, it turns out that the thermodynamic entropy and the equilibrium statistical mechanics entropy are *exactly* the same. That is part of the whole point of statistical mechanics as a branch of physics. So any conclusions you can get from the thermodynamic entropy will *also* hold for the equilibrium statistical mechanics entropy. The information-theoretic version of entropy is clearly related, but different from the others.
All of the others have dimension of energy/temperature. 
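A minimal sketch of how the first three versions above line up, assuming illustrative Python function names and a small microstate count (real systems have astronomically many microstates):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_entropy(p):
    """Version 1, information theory: S = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    """Version 2, nonequilibrium stat mech: S = -k sum p_i ln p_i, in J/K."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

def boltzmann_entropy(W):
    """Version 3, equilibrium stat mech: S = k ln W, in J/K."""
    return K_B * math.log(W)

# At equilibrium, each of the W microstates has probability p_i = 1/W,
# so version 2 collapses to version 3:
#   -k * W * (1/W) * ln(1/W) = k ln W
W = 1000
uniform = [1.0 / W] * W
```

The same uniform distribution fed to `shannon_entropy` gives log2(W) bits, which differs from the J/K result only by the base of the logarithm and the factor of Boltzmann's constant.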
Since: Aug 07 12,216 
#156958
Oct 17, 2013
That is wrong. If you count the equivalent microstates of each, the debris will have a very high number and the house will have a very low number. Obviously the house will have much lower entropy than the debris. 
Since: Aug 07 12,216 
#156959
Oct 17, 2013
This totally contradicts the lessons that you just provided and contradicts what the lecturer said. (And contradicts every other textbook on the subject and all of science.) Did you even watch your own videos?
Since: Aug 07 12,216 
#156960
Oct 17, 2013
The environment is irrelevant because the fundamental postulate of statistical mechanics requires the system to be in equilibrium. I've only pointed this out about 2 dozen times now.
“See how you are?” Since: Jul 12 16,104 Earth 
#156961
Oct 17, 2013
Creager doesn't stay on the same subject. At 2:14 he diverts completely away from the SLoT to LaLa land. 
“Think&Care” Since: Oct 07 24,462 Location hidden 
#156962
Oct 17, 2013
And you are wrong each time. The environment *is* relevant because energy goes into the environment. When discussing changes in entropy, the number of microstates in the environment needs to be dealt with. You continually make the following mistakes:
1. Claiming the environment is irrelevant. Any energy or matter going out into the environment needs to be accounted for. The fact that the system is isothermal doesn't mean the environment can be ignored.
2. Not acknowledging that thermodynamic entropy and the entropy of statistical mechanics are the same thing. This is a large part of what Boltzmann showed and is crucial for the understanding of both subjects.
3. Not understanding that the entropy of statistical mechanics has the units of J/K. That is one way it differs from the entropy of information theory. But it is also why it gives the same answers as the thermodynamic entropy.
4. Not understanding that a microstate is a quantum state. The number of quantum states in your debris field is almost identical to the number of quantum states in your house.
5. Not acknowledging the counterexamples to your predictions. I have shown several which you continue to ignore.
 