
The Second Law: In Force Everywhere But Nowhere?


I hope our materialist friends will help us with this one.

As I understand their argument, entropy is not an obstacle to blind watchmaker evolution, because entropy applies absolutely only in a “closed system,” and the earth is not a closed system because it receives electromagnetic radiation from space.

Fair enough. But it seems to me that under that definition of “closed system” only the universe as a whole is a closed system, because every particular place in the universe receives energy of some kind from some other place. And if that is so, it seems the materialists have painted themselves into a corner in which they must, to remain logically consistent, assert that entropy applies everywhere but no place in particular, which is absurd.

Now this seems like an obvious objection, and if it were valid the “closed system/open system” argument would have never gained any traction to begin with. So I hope someone will clue me in as to what I am missing.

Comments
Mung: If someone is determined to reject a conclusion, he will always find some premise to object to. The issue is that this implies other commitments or conclusions that may be much more assailable. Sufficient has long since been said to ground an informational view of thermodynamics, and to show why it then becomes quite intelligible why generating relevant FSCO/I-rich systems, such as for OOL, is maximally implausible on the relevant considerations. KF kairosfocus
Salvador:
I criticize OOL, but not because of the 2nd law. I prefer to argue rote probability, just like 500 fair coins all heads. It's clearer and unassailable.
What's the difference?
The second law of thermodynamics is all about probability. - Granville Sewell
Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL any more than Newton's 2nd law of motion. This was stated by a recognized pioneer of Intelligent Design, Walter Bradley...
: The Mystery of Life's Origin : Reassessing Current Theories : Charles B. Thaxton, Walter L. Bradley, Roger L. Olsen : Lewis and Stanley : 1984 : Chapter 7 : Thermodynamics and the Origin of Life ...a mechanism of coupling the energy flow to the organizing work is unknown for prebiological systems.
Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL…
: Biological Information : New Perspectives : 1.3.2. : Information and Thermodynamics in Living Systems : Andy C. McIntosh ...it is the thesis of this paper that the message/information itself is sitting on biochemical molecular bonds which are in a significantly raised free energy state. Understanding the thermodynamics of this machinery shows that it is thermodynamically impossible both to form such machinery (abiogenesis) without intelligence, and that the laws of thermodynamics prohibit any formation of new machinery which is not there already or latently coded for in the DNA (evolutionary development). - p. 183 At the molecular level, the principles of thermodynamics do not permit the formation of new machinery beyond that which is already set up or coded for in a particular function performed by the cells of living organisms. There is in fact an 'uphill' gradient in the formation of any of the molecular bonds in the nucleotides and most of the proteins, since they want to pull apart. Consequently there is no natural chemical pathway to form these; rather, there is a move away from their formation to equilibrium. - p. 186 Mung
Mung
: Biological Information : New Perspectives : 1.3.1. : Entropy, Evolution and Open Systems : Granville Sewell The second law of thermodynamics is all about probability; it uses probability at the microscopic level to predict macroscopic change. [4] - p. 173 fn 4. In Classical and Modern Physics, Kenneth Ford writes: "There are a variety of ways in which the second law of thermodynamics can be stated, and we have encountered two of them so far: (1) For an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability. (2) ..." Mung
kf:
Mung: I have been busy elsewhere.
That's quite all right sir! As you can probably tell, I have more than enough material to go over to keep me busy. :) Mung
PPPS: There is of course an alternative: FSCO/I can come about, quite plausibly, by directed, designed, organised configuring work, much as we build a building or manufacture a PC, etc. Venter et al. have already pointed to ways in which molecular nanotech processes can be brought to bear on the manufacturing challenge. kairosfocus
PPS: On Darwin's pond, note a molecular scale of up to 10^-9 m is reasonable. A 1 mm cube is [10^6]^3 = 10^18 such, the size of a grain of sand. We have LARGE numbers, and in a liquid, a molecular shuffling process that is random, as reflected in Brownian motion. Chemical interactions are complex, with activation energies, configs, handedness and whatnot. Now, let's set up an ensemble of 1 m^3 Darwin's warm pond vats with any cluster of relevant -- but racemic, monomers in any reasonable concentrations. Add sunlight, O2 concs that are reasonable, UV, lightning, vibrations, winter and summer etc. The distribution of states will be a cross section of the accessible phase space. Use the 10^80 atoms and 10^17 s of the available cosmos that we observe, with all the C, O, N, H etc you want based on cosmological abundance. Why, even, make a sub group look like earths all you want. The result is predictable, there will be nothing relevant to life, by overwhelming impact of the config space to the available materials, regardless of openness and inflows and outflows. The FSCO/I -- islands of function -- config space challenge is that strong. kairosfocus
PS: While there are disputes, the principle of equiprobability of accessible states in a system of contingent possibilities has always been a useful first approximation, one justified by enormous success. One may then modify based on further info, but should always respect what captures a key aspect of reality, in a case as simple as why we hold that tossing a common, presumed-fair die gives 1/6 probability for any given face. It's all a matter of finding a plausible, reasonable, empirically adequate model. And in physics as well as economics, a model that captures a pivotal aspect, though not everything, has a lot to teach us. (Right now, on the day job so to speak, modified Solow production functions have much to say about long term tech-driven growth waves, i.e. I'se got Kondratieff AND long term product life cycle diffusion of innovation on the mind. Which is of course a stochastic process with a memory, a constrained hill climb with survival of the fittest . . . and with micro and macro level design everywhere.) kairosfocus
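The equiprobability point is easy to check numerically. A minimal Python sketch (the 1/6 figure is the standard fair-die assignment; the convergence behaviour is ordinary frequentist sampling, nothing specific to this thread):

import random

# Monte Carlo check of the equiprobability first approximation: with no
# further information, each face of a fair die is assigned probability 1/6,
# and the observed frequencies converge toward that value.
def face_frequencies(tosses):
    counts = [0] * 6
    for _ in range(tosses):
        counts[random.randrange(6)] += 1
    return [c / tosses for c in counts]

for n in (100, 10_000, 1_000_000):
    print(n, [round(f, 4) for f in face_frequencies(n)])  # each entry -> ~0.1667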
Mung: I have been busy elsewhere. While the considerations involved can be dressed up with the usual apparatus, and walks across/trajectories in a phase space or population of possibilities are important [as are issues on whether probabilities are stable, or whether processes depend on memory of past states or forget them . . . in wending across a sea of non-function to find an isolated shore of function, there is no basis for cumulative paths; hill climbing presumes already being on an island of function . . . ], the matter in the end is essentially simple. We observe at lab scale, and can characterise system state satisfactorily for various purposes at that level. Consistent with that, there is a large number of what Brillouin more accurately termed ultramicroscopic possibilities, the microstates. The greater the degree of freedom at that level, the less constrained the state, and the greater the number of ways to get into or be in it. Hence, time's arrow. Entropy, first identified at macro-level, turns out to be characterisable in terms of the gap between macro and specific micro state, in info terms. And, there are two key ways to get there: probability calcs and summations with log probabilities duly weighted, and going through what a bit is. That is, as a bit answers to state in a Y/N contingency, the string length [on average] to specify microstate starting from macrostate gives effectively the same result. The relevance to ID is that life is observable and functionally highly constrained relative to a pond full of molecular precursors. To the point where spontaneous transition from Darwin's pond to living form without programmed, complex configuring work is effectively utterly implausible on the scope of the observed cosmos. And confident assertions that boil down to "anything can happen" when you can inject energy and matter into a system and allow it to exhaust energy and matter, frankly, are a dodge. If you add energy and molecules to a pond, you add exponentially to the string length to specify its microstate, which bumps up the odds of moving to bulk, dominating clusters of states, which will be non-functional relative to the desired gated, encapsulated, metabolising automaton with coded von Neumann self replicating facility. The only empirically, analytically plausible source for FSCO/I is design, period. That's why, open, closed or isolated system, it is a reliable sign of design. But then, at this stage one is not expecting to persuade those who are willing to throw away first principles of reasoning to hold on to their ideology. By that act, they have removed themselves from the table of rational, evidence and logic informed discourse. One hopes some out there will wake up and realise what they have done. KF kairosfocus
I think I love statistical thermodynamics:
As stated in Section 1.1, the aim of statistical thermodynamics is to calculate thermodynamic quantities of macroscopic systems composed of a very large number of microscopic particles. We have started with an isolated system, characterized by a fixed energy E, volume V, and number of particles N (or N if there are c-components). Such a system is of no interest to us. Any measurement or any observation would necessarily violate the condition of isolation of the system. Nevertheless, such a system, or the corresponding ensemble, is the cornerstone of statistical thermodynamics. The reason is perhaps paradoxical: we do not know anything about the distribution of the states of such a system. Therefore, the best guess we can make is that all the states of such a system are equally probable. This was one of our postulates. - Statistical Thermodynamics: With Applications to the Life Sciences
That last sounds suspiciously like the Principle of Insufficient Reason: http://mathworld.wolfram.com/PrincipleofInsufficientReason.html See also "A Rehabilitation of the Principle of Insufficient Reason" at Rationally Speaking, a blog maintained by Prof. Massimo Pigliucci: http://rationallyspeaking.blogspot.com/2005/10/principle-of-insufficient-reason.html Mung
A new scientific theory has been born during the last few years, the theory of information. This theory is based on probability considerations. Once stated in a precise way, it can be used for many fundamental scientific discussions. It enables one to solve the problem of Maxwell's demon and to show a very direct connection between information and entropy. The thermodynamical entropy measures the lack of information about a certain physical system. - Leon Brillouin, 1956 It's ok Salvador, you're only six decades behind. Mung
Piotr @ 17:
Eric Anderson: Did anyone say that the origin of life or its evolution was explained by the second law of thermodynamics? Who said so? Where?
Well let's see.
Some people even go a step further, not only answering the question of "How do living organisms avoid decay?" but claiming that the Second Law can explain life. In his book The Second Law, Atkins (1984) writes:
"In Chapter 8 we see also how the Second Law accounts for the emergence of the intricately ordered forms characteristic of life."
Furthermore, in a more recent book, Atkins (2007) says:
The Second Law is one of the all-time great laws of science, for it illuminates why anything, from the cooling of hot matter to the formulation of a thought, happens at all.
and later in the book, he adds:
The Second Law is of central importance...because it provides a foundation for understanding why any change occurs...the acts of literary, artistic, and musical creativity that enhance our culture.
Finally, in both of the above mentioned books, Atkins writes on the Second Law:
"...no other scientific law has contributed more to the liberation of the human spirit than the Second Law of thermodynamics."
All these quotations are very impressively stated, but are totally irrelevant to the Second Law. The Second Law does not provide an explanation for any change, not for the simple expansion of a gas, and certainly not for any change associated with life. - Arieh Ben-Naim. Entropy and the Second Law. p 230
Mung
Recently came across this book by Manfred Eigen: From Strange Simplicity to Complex Familiarity: A Treatise on Matter, Information, Life and Thought
This book presents a vivid argument for the almost lost idea of a unity of all natural sciences. It starts with the "strange" physics of matter, including particle physics, atomic physics and quantum mechanics, cosmology, relativity and their consequences (Chapter I), and it continues by describing the properties of material systems that are best understood by statistical and phase-space concepts (Chapter II). These lead to entropy and to the classical picture of quantitative information, initially devoid of value and meaning (Chapter III). Finally, "information space" and dynamics within it are introduced as a basis for semantics (Chapter IV), leading to an exploration of life and thought as new problems in physics (Chapter V).
Some quotes from the Prolegomenon: "The focal point of this first volume is the concept of information, which is meant to include both the quantitative and the qualitative (or semantic) aspects of information, thereby providing a link between physics and biology." "The problem I wish to address is a different one: it is that of the design of the unimaginably complex blueprints of the living state that have now been continuously in existence for some 4000 million years. How did this information originate, and how could it eventually bring about structures as complex as the human brain? This question is a special version of an unsolved mathematical problem: how can problems of exponential complexity be solved within polynomial time?" "In this book we are going to see what properties are required in order to endow these systems with information-generating behaviour." "The existence of life, then, is dependent on the existence of conditions for self-organization of information-gathering systems." "...the internal feedback mechanism of selection must include some relationship that brings the idea of purpose onto the physical level of dynamics. The magic word that does this is "information", used here in the sense both of absolute quantity, representing "entropic complexity", and of semantic quality, representing "specified complexity". The latter turns out to be uniquely linked to reproduction. If it is to offer any advantage, specified information must be capable of (1) conservation, (2) proliferation, (3) variation and (4) selection. The common factor that links all these four requirements is error-prone replication. Information - more precisely, semantic information - represents a particular choice from among a tremendous variety of alternative structures with finite lifetime." Design. Specified Complexity. What's next, Irreducible Complexity? Mung
How do thermodynamically open systems arise within a system that is thermodynamically closed or isolated? Mung
Salvador:
The Earth is an open system, but we wouldn’t want the entropy of the Earth to escape and go to zero, we’d want the sun giving us entropy as we dump entropy out into cold space.
This makes no sense to me. Are you using the word entropy when you mean heat or energy, or something else? Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL
Then what is? What could be? Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL…
Yet:
Like every other material system, living organisms are subject to the laws of physics and chemistry, and in particular to the laws of thermodynamics. - Information and the Origin of Life. p. 131
D. you don’t know what you’re talking about Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL…
Yet:
According to the laws of physical chemistry, a mixture of chemical substances - even when it contains the most complex proteins or nucleic acids - will ultimately follow the second law of thermodynamics and revert to the most probable state of chemical equilibrium ... - Ludwig von Bertalanffy (quoted in Information and the Origin of Life, p. 109)
D. you don’t know what you’re talking about Mung
Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL…
Biological Information: New Perspectives 1.3.2 Information and Thermodynamics in Living Systems 1.3.2.2 Biological information storage and retrieval - thermodynamic issues
There are major thermodynamic hurdles in arguing that the emergence of DNA could come about by a random gathering of the sugar phosphates and nucleotides. - p. 184
Mung
More recommended reading for Salvador: Statistical Thermodynamics With Applications to the Life Sciences Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL...
Please reconsider. Mung
Salvador:
So spell it out. What probability are you talking about: A. probability of an object being alive B. probability of finding an object in a particular energy microstate C. probability of something else D. you don't know what probability you're talking about If your answer is "C", then specify the probability you are talking about.
Salvador, allow me to rephrase your post a couple times: S2:
So spell it out. What entropy are you talking about: A. entropy of an object being alive B. entropy of finding an object in a particular energy microstate C. entropy of something else D. you don't know what entropy you're talking about If your answer is "C", then specify the entropy you are talking about.
S3:
So spell it out. What information are you talking about: A. information of an object being alive B. information of finding an object in a particular energy microstate C. information of something else D. you don't know what information you're talking about If your answer is "C", then specify the information you are talking about.
Once again, I invite you to quote me. What was it that I said that got you all bent out of shape? And why? Mung
Salvador:
Why are you avoiding answering a simple question?
Because I am trying to teach you something. *gasp* Try asking questions that only require a "yes or no" answer. Mung
kf:
Once zones of interest, even if numerous, are rare in the config spaces of relevant scale, blind chance and/or mechanical necessity are hopeless empirical approaches.
Hi kf, If you have Yockey's Information Theory, Evolution, and the Origin of Life check out Chapter 2, Section 4.1., especially his observations about expr 4.1 and expr 4.7. I'm in particular interested in whether you see any connexion between the Shannon-McMillan-Breiman Theorem and your observations in @133. Mung
kairosfocus:
I have already outlined to you that the entropy metric for an entity reducible to a config space is a case of the entropy being a measure of the chain of un-answered y/n questions to specify microstate given lab-observable macrostate and state characterising variables. This gives us a result in effect as a measure of missing info or equivalently degrees of microscopic freedom.
: Information Theory : Robert B. Ash : Dover Reprint, 1990 : 1.3. Three interpretations of the uncertainty function An inspection of the form of the uncertainty function reveals...a weighted average of the numbers...where the weights are the probabilities of the various values of X. - p. 12 The essential content of the "noiseless coding theorem," to be proved in Chapter 2, is that the average number of "yes or no" questions needed to specify the value of X can never be less than the uncertainty of X. - p. 13 Thus we may state the second interpretation of the uncertainty measure: H(X) = the minimum average number of "yes or no" questions required to determine the result of one observation of X. - p. 14 Mung
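Ash's "yes/no questions" reading of the uncertainty function can be checked directly. A minimal Python sketch (the two example distributions are hypothetical, chosen only for illustration):

import math

def shannon_entropy(probs):
    # H(X) = -sum p*log2(p), in bits: the average number of yes/no questions.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 8 outcomes: H = 3 bits, matching the 3 yes/no questions a
# binary search over 8 equiprobable items requires.
print(shannon_entropy([1/8] * 8))                  # 3.0

# A biased source is less uncertain, so fewer questions are needed on average.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75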
SalC: Actually, the demand for probability calculations and numbers is significantly misplaced. Instead, we should go back to the underlying phase space thinking, which we can simplify to configuration spaces. WLOG, take 500 ordinary H/T coins. The space of possible configs is 2^500 ~ 3.27*10^150. For 1,000, that would be 1.07 * 10^301. For the former, give each of the 10^57 atoms of our solar system a tray of 500 coins, and allow a toss and inspect every 10^-14 s, a rate linked to the fastest chemical interactions. At that rate, across a reasonable solar system lifespan, 10^17 s, we see that we can sample the equivalent of 1 straw to a cubical haystack 1,000 LY on the side. Even were such a stack superposed on our galactic neighbourhood, with all the thousands of stars implied, a sample of that scope would with all but certainty pick the overwhelming bulk: straw. Too much haystack, too few needles, too little search opportunity. For 1,000 bits, the resources of the cosmos would be utterly overwhelmed and the haystack equivalent would utterly dwarf the 90 bn LY wide observed cosmos. Now, this is WLOG, as any nodes-and-arcs functional pattern can be reduced to a set of coded strings, cf. AutoCAD etc. A search of a config space can be reduced to a picking from the power set, which brings in why lucky searches are in an even bigger space. Once zones of interest, even if numerous, are rare in the config spaces of relevant scale, blind chance and/or mechanical necessity are hopeless empirical approaches. It is only by judicious management of searches in well behaved neighbourhoods of optima -- islands of function -- that evolutionary algorithms make progress. I have already outlined to you that the entropy metric for an entity reducible to a config space is a case of the entropy being a measure of the chain of un-answered y/n questions to specify microstate given lab-observable macrostate and state characterising variables. This gives us a result in effect as a measure of missing info or equivalently degrees of microscopic freedom. In a Darwin's pond or the like, molecular clusters face a config space challenge, and to arrive at a gated, encapsulated metabolic entity with code based von Neumann self replicator facility is a worse needle in haystack search than those listed. The logic behind 2LOT is highly relevant to both OOL and origin of body plans by blind mechanisms. And so also is information theory, by the logic just given. Where, chaining Y/N Q's and specifying bit chains thereby is materially similar to part of the way Shannon reasoned about information. This does not directly yield a probability metric, but it is related. So, I have for years highlighted config spaces, search challenge and the link to info in light of the very obvious point that when a complex functionally specific entity requires a relatively small and isolated cluster of configs to work, that is an island of function. Also, as you will easily see from the Clausius formulation, when d'q is transferred from A to B because of temp difference, B's entropy rises precisely as the number of ways of arranging mass and energy at micro level has risen. Thence, we observe that a Darwin pond would be energy importing. Without coupling of energy to mechanisms that do shaft work, the most credible result is more disorder through increased randomness. Yes, hurricanes etc. are spontaneously forming heat engines, but we are discussing FSCO/I rich entities more akin to a fanjet engine than a cyclone.
The living cell points to design based on FSCO/I, and 2LOT-linked issues are implicitly and inextricably involved in why. I therefore invite us to return focus to the main point. KF kairosfocus
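The haystack arithmetic for the 500-coin case can be reproduced directly. A minimal Python sketch using kairosfocus's own stated figures (10^57 atoms, one inspection per 10^-14 s, a 10^17 s window; the numbers are his, the arithmetic is elementary):

# Configuration space of 500 fair coins vs. the sampling resources cited.
states_500 = 2 ** 500             # ~3.27e150 possible configurations
atoms = 10 ** 57                  # atoms in the solar system (kf's figure)
rate = 10 ** 14                   # inspections per second, per atom
seconds = 10 ** 17                # roughly a solar-system lifespan

samples = atoms * rate * seconds  # 1e88 total inspections
fraction = samples / states_500
print(f"{states_500:.3e} states, {samples:.3e} samples")
print(f"fraction of space sampled: {fraction:.3e}")  # ~3.1e-63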
So spell it out. What probability are you talking about: A. probability of an object being alive B. probability of finding an object in a particular energy microstate C. probability of something else D. you don't know what probability you're talking about If your answer is "C", then specify the probability you are talking about. Why are you avoiding answering a simple question? A, B, C, or D? scordova
Salvador, have you been reading my posts in this thread? Are you ignoring the posts by kairosfocus? Mung
Salvador:
If you say “thermodynamic probability”, you better define what you think that means, because it should mean the probability of some “thing” being true, and what is that “thing”.
See my post @ 124. Mung
Salvador:
I think no one is understanding how to represent what you said.
Oh, so now it was all just a misunderstanding. ok, I'm willing to forgive and forget. After all, it's the Catholic thing to do. Salvador:
So what probability are you talking about when you wrote this:
Have you been reading my posts in this thread? I followed up my initial statement with numerous additional posts. I would have thought there was no room for misunderstanding. Would you like me to list them all for you? Salvador:
You’re only being asked to clarify your own statements, because as your statement stands, it’s meaningless and irrelevant to the probability of life.
Sal, even the casual reader can see that you weren't asking me to clarify anything. Let's not pretend, shall we? We're both adults. If what I said was irrelevant, why didn't you make that claim/argument initially? And why the hypocrisy? I have repeatedly asked you to clarify comments you've made, and your response is to delete my posts. Salvador:
If you're talking about the probability of a thermodynamic energy microstate, that is irrelevant to the probability of life.
The attentive reader will notice the self-contradiction. Salvador @8:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL.
By which you must mean that the Second Law of Thermodynamics is irrelevant to the Origin of Life. Now which of us needs to "clarify" our position? Mung
Mung, So what probability are you talking about? Are you talking about the thermodynamic probability of a thermodynamic microstate or are you talking about the probability of life? What probability do you think your quotes are talking about? Probability of an energy microstate, or of life, or something else, or can't you tell the difference between the two? :roll: So spell it out. What probability are you talking about: A. probability of an object being alive B. probability of finding an object in a particular energy microstate C. probability of something else D. you don't know what probability you're talking about If your answer is "C", then specify the probability you are talking about. Why are you avoiding answering a simple question? If you say "thermodynamic probability", you better define what you think that means, because it should mean the probability of some "thing" being true, and what is that "thing". HINT: thermodynamic probability usually means the probability that a particular microstate is found. But if that is the case, it's generally insufficient to say anything about something being alive. scordova
Re-posting here, given Salvador's propensity for deleting my posts in any thread he has control over:
The laws of information had already solved the paradoxes of thermodynamics; in fact, information theory consumed thermodynamics. The problems in thermodynamics can be solved by recognizing that thermodynamics is, in truth, a special case of information theory. - Decoding the Universe, p. 87
Mung
Re-posting here, given Salvador's propensity for deleting my posts in any thread he has control over: Gordon Davisson:
Either way, you'll get the same result you would've from Boltzmann's formula...Either way, it doesn't affect the physics at all -- they're just different ways of looking at the same old familiar entropy.
Mung
there is no difference between the entropy argument and the information argument and the probability argument.
So what probability are you talking about? Are you talking about the thermodynamic probability of a thermodynamic microstate or are you talking about the probability of life? If you're talking about the probability of a thermodynamic energy microstate, that is irrelevant to the probability of life. Your statement is meaningless at best, irrelevant to the question of life at worst. scordova
Salvador:
For the reader's benefit. The 2nd law can be stated in terms of entropy. As Gordon pointed out, an alternative form of the 2nd law is:
For the reader's benefit:
The term entropy and its denotation by the letter S were introduced by Rudolf Julius Emanuel Clausius (1822-1888) in 1864 in his work on the foundations of classical thermodynamics, where he formulated the second law of thermodynamics in the form that the entropy of a closed system cannot decrease. Later, the concept of thermodynamic entropy was clarified and grounded by Ludwig Eduard Boltzmann (1844-1906), the famous formula for thermodynamic entropy is: S = k * ln W (3.2.1) where W is the thermodynamic probability that the system is in the state with the entropy S. - Theory Of Information: Fundamentality, Diversity and Unification
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Mung
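The Boltzmann relation quoted above is easy to work with numerically. A minimal Python sketch inverting S = k ln W for the standard molar entropy of liquid water (the 69.9 J/K per mole figure Salvador cites later in the thread), showing how many microstates stand behind an everyday entropy value:

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K

# S = k * ln W  =>  ln W = S / k. For one mole of liquid water, W is far
# too large to print directly, so report its base-10 logarithm instead.
S = 69.9                           # J/K, standard molar entropy of water
ln_W = S / k_B                     # ~5.06e24
log10_W = ln_W / math.log(10)      # ~2.2e24
print(f"W ~ 10^({log10_W:.2e})")   # about 10^(2.2e24) microstates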
Mung, I think no one is understanding how to represent what you said. So what probability are you talking about when you wrote this:
there is no difference between the entropy argument and the information argument and the probability argument.
You're only being asked to clarify your own statements, because as your statement stands, it's meaningless and irrelevant to the probability of life. scordova
I opened another of my books on information theory and found the following quote:
It is impossible to defeat an ignorant man by argument. - William McAdoo
Mung
Salvador, pardon, but there's that little matter of your misrepresentations of what I claimed that needs to be cleared up. Mung
SalC: Pardon, cf. 79 above. KF kairosfocus
Mung, What probability are you talking about when you wrote this:
there is no difference between the entropy argument and the information argument and the probability argument.
Is it the probability of winning a lottery, the probability of Rubio being the next president, the probability of a coin being heads, or the probability of life? Your statement is meaningless without specifying the probability you were talking about. scordova
Salvador, Since you claim to be able to calculate the entropy of the origin of life, please do so for the benefit of our interested readers. Since you claim to be able to calculate the information of the origin of life, please do so for the benefit of our interested readers. Once you have done this, I will gladly calculate the probability of the origin of life. Salvador:
We were talking about the probability of life, weren't we?
Were you? Where? Salvador:
The 2nd law of Thermodynamics is not an obstacle to mindless OOL any more than Newton's 2nd law of motion.
It follows then that entropy is not an obstacle to mindless OOL. Salvador:
For the reader’s benefit. The 2nd law can be stated in terms of entropy.
Yes, most of us already knew this. In fact, I made the connection early in this thread. But thanks anyways. If the 2LOT is no obstacle, then neither is entropy. Right? Mung:
There is no difference between the entropy argument and the information argument and the probability argument.
It also follows that, according to you, the probability argument and the information argument are likewise no obstacle to OOL. So in a single post you manage to undermine all of ID, and you don't even know it. Yet you claim to be an ID'ist. This is a consistent pattern you've displayed for years. And for pointing it out I've been "banned" from your threads. But you can't ban me from this thread, so bring it on Salvador. Show us what you have when you can't control the debate by deleting dissenting opinion. Mung
Mung wrote: there is no difference between the entropy argument and the information argument and the probability argument.
We were talking about the probability of life, weren't we? You can bail out by saying: "No, I wasn't talking about the probability of life." If that's the way you backtrack, then what probability are you talking about? scordova
Salvador:
How about you show the readers that one can derive a probability number for life based on thermodynamic entropy. It would be a service to our readers if you did.
How about you show our readers where I claimed that I or anyone else could do so. Salvador:
You claim you can relate the thermodynamic entropy and the probability of the system being alive.
I ask again, where did I make that claim? Mung
For the reader's benefit. The 2nd law can be stated in terms of entropy. As Gordon pointed out, an alternative form of the 2nd law is:
* The entropy of an isolated system cannot decrease.
Here is a followup discussion of entropy and Shannon information: https://uncommondescent.com/physics/shannon-information-entropy-uncertainty-in-thermodynamics-and-id/ scordova
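That isolated-system formulation can be illustrated with the Clausius bookkeeping kf appeals to above: when heat q passes from a hot body to a cold one, the hot body loses q/T_hot of entropy while the cold body gains the larger amount q/T_cold, so the total for the isolated pair rises. A minimal Python sketch with illustrative numbers (temperatures assumed roughly constant for a small transfer):

# Entropy bookkeeping for heat flow inside an isolated two-body system.
q = 100.0        # joules transferred (illustrative value)
T_hot = 400.0    # K
T_cold = 300.0   # K

dS_hot = -q / T_hot          # -0.2500 J/K: the hot body loses entropy
dS_cold = q / T_cold         # +0.3333 J/K: the cold body gains more
print(f"total dS = {dS_hot + dS_cold:+.4f} J/K")  # positive, per the 2nd law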
Mung, How about you show the readers that one can derive a probability number for life based on thermodynamic entropy. It would be a service to our readers if you did. If such a probability can't be derived based on thermodynamic entropy, then say so. State what you believe. scordova
Salvador:
Suppose the thermodynamic entropy of an object is 50 bits (Shannon). Tell the readers what the probability is the object is alive.
What's really sad about this is that you seem to think this is a serious question that ought to be taken seriously and that other readers of this thread ought to agree with you. Or perhaps it's just another of your red herrings, like this one:
Think you are up to the calculation? Most college science students who study thermodynamic entropy would be able to answer that question. If you can’t, why are you giving people reading suggestions about thermodynamic entropy since you can’t even do a basic entropy calculation yourself?
Let's say, for the sake of argument, that I cannot perform even the most simple entropy calculation. So what? Your argument is that if I cannot perform a simple entropy calculation then none of the sources that I cite can either, and they can therefore be ignored as irrelevant, which is just absurd. Absurd, Salvador.
You claim you can relate the thermodynamic entropy and the probability of the system being alive.
Where did I make that claim? Mung
Mung, Suppose the thermodynamic entropy of an object is 50 bits (Shannon). Tell the readers what the probability is that the object is alive. You claim you can relate the thermodynamic entropy and the probability of the system being alive. Go ahead. Demonstrate it. Do the calculation and defend your claim. This question is relevant to the discussion of the 2nd law and life. So how about it Mung? If you can't relate a simply stated thermodynamic entropy expressed in Shannon bits to the probability that the system is a living system or not, then how do you expect readers to believe you? Can you do the calculation to prove your point? If you can't, say so. Are you capable of even doing the calculation? A simple yes or no will suffice. scordova
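For scale, Salvador's hypothetical figure can at least be converted to conventional units: one Shannon bit of thermodynamic entropy corresponds to k ln 2 joules per kelvin. A minimal Python sketch of that conversion (the 50-bit value is his hypothetical, not a measurement):

import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
bits = 50                     # Salvador's hypothetical figure

S = bits * k_B * math.log(2)  # one bit = k*ln2 J/K in thermodynamic units
print(f"{S:.3e} J/K")         # ~4.785e-22 J/K -- a vanishingly small entropy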
Salvador @ 107:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
Mung:
I thought I made it clear that there was no difference between the three.
Salvador:
You did no such thing.
So this exchange between the two of us was precipitated by an exchange you claim never took place. How do you explain the following? Mung @ 35:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Mung
Salvador:
I related Shannon, Dembski, Boltzmann and Clausius notions of entropy
At first I took your word for it, but now I am beginning to doubt this. Salvador:
... why are you giving people reading suggestions about thermodynamic entropy since you can’t even do a basic entropy calculation yourself?
Which of the reading suggestions that I have offered do you disagree with, and why? Salvador:
...since you can’t even do a basic entropy calculation yourself...
Hilarious. You consistently delete my posts from any thread you author (continuing to the present day) and in the past have modified the content of my posts in threads you've authored to make it appear as if I had written something which I did not in fact write. It probably takes no stretch to imagine that you've just ignored any post in which I actually performed an entropy calculation. So perhaps you're not lying, but rather just willfully ignorant. In any event, this is just a red herring. You took umbrage at the following:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
It's still a mystery why you took offense. I have cited numerous sources in support of my claim. So far all you have in rebuttal is the assertion that I can't do a simple entropy calculation, therefore I must be wrong. Mung
Yet more reading for Salvador, since he still doesn't get it: A Farewell To Entropy: Statistical Thermodynamics Based on Information
...the aim of this book is to show that statistical thermodynamics will benefit from the replacement of the concept of entropy by "information" ...
From the book description on Amazon:
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the "driving force" of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy. It has been 140 years since Clausius coined the term "entropy"; almost 50 years since Shannon developed the mathematical theory of "information" -- subsequently renamed "entropy". In this book, the author advocates replacing "entropy" by "information", a term that has become widely used in many branches of science. The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, held until now by the term "entropy". The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes the "driving force" for which is analyzed in terms of information.
Mung
Mung, Liquid water has a standard molar entropy of 69.9 J/K per mole. You have a warm little pond of 1,000 gallons of water. Estimate the entropy amount in that warm little pond. Think you are up to the calculation? Most college science students who study thermodynamic entropy would be able to answer that question. If you can't, why are you giving people reading suggestions about thermodynamic entropy since you can't even do a basic entropy calculation yourself? But I expect you might find the answer on the internet. Suppose now you find that answer, translate your answer for that thermodynamic entropy into a probability for OOL in that warm little pond. :-) You claimed you should be able to, since you said there is no difference between thermodynamic entropy and OOL probability. :roll: Well, how about it Mung? scordova
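For readers who want to check the challenge, the arithmetic is short. A minimal Python sketch assuming US gallons and a water density of 1 g/mL (its result also matches the ~10^7 J/K order of magnitude Salvador mentions in the comment below):

# Entropy of 1,000 US gallons of liquid water from its standard molar entropy.
GALLON_L = 3.78541      # liters per US gallon
MOLAR_MASS = 18.015     # g/mol for H2O
S_MOLAR = 69.9          # J/(K*mol), standard molar entropy of liquid water

grams = 1000 * GALLON_L * 1000.0  # ~3.79e6 g at ~1 g/mL
moles = grams / MOLAR_MASS        # ~2.1e5 mol
print(f"S ~ {moles * S_MOLAR:.3e} J/K")  # ~1.469e7 J/K, i.e. order 10^7 J/K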
I asked:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
Mung responded:
I thought I made it clear that there was no difference between the three.
You did no such thing. Suppose an object has thermodynamic entropy on the order of 10^7 J/K. Show for the reader how this amount of entropy translates into the probability of OOL. :roll: HINT: you can't, because they are generally unrelated; you cannot assert that the probability of OOL is the same as the probability related to thermodynamic entropy. Thermodynamic entropy deals with the probability of finding a system in a particular energy or position/momentum microstate; this can be equivalently described as the Shannon entropy for energy or position/momentum microstates. But not all possible Shannon entropies are necessarily thermodynamic entropies. You don't seem to understand that! The probability of seeing a particular energy or position/momentum microstate is not the same as the probability of OOL.
More reading suggestions for Salvador:...
Quit offering reading suggestions as if you really understand what you're talking about, because you don't. Take some grad-level statistical mechanics and thermodynamics before you start giving reading suggestions to those who have actually studied such material at the graduate level. :roll: You think:
there was no difference between the three (probability of OOL, thermodynamic entropy, information).
suggests you're the one who needs to read and comprehend. Take a given thermodynamic entropy level (not just 10^7 J/K) for an object and explain to the readers how it translates into a probability of OOL. If you can't make the derivation, then that shows you're just bloviating about stuff you have no clue about. So how about it, Mung: if you're given the amount of thermodynamic entropy for a warm little pond, do you think you can state, based on the thermodynamic entropy alone, whether there is something alive in it? C'mon, Mung, let's settle this once and for all. :-) scordova
More reading suggestions for Salvador: Science and Information Theory
A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics.
Scientific Uncertainty, and Information Mung
BA77, perhaps you should take a lesson from Salvador, who while constantly boasting in his four degrees still can't stand up to kf. Mung
kairosfocus @102:
BA77, your privilege. I just invite you to ponder that Q-Mech is C20, but Stat Mech, per Gibbs (and Boltzmann) is C19. One of the simplest roads in is to think of boxes of bouncing marbles — quite classical, cf my App 1, the always linked. KF
That's one thing I love about the Amnon Katz text.
Professor Katz first sets up a general theory of statistical mechanics and then applies it to problems in and out of equilibrium. He carries out classical and quantum mechanical treatments in parallel and stresses the analogy between them.
Mung
kf, no disrespect, but your reply 'C20 C19 bouncing marbles etc' makes no sense whatsoever to me as a coherent explanation of the quantum Zeno effect, nor of any of the other quantum effects presented to you. Moreover, I can easily ask you, on the other hand, to ponder the irrationality that denying your 'mind' entails. If you refuse to budge from your position on this matter we are at a genuine impasse, since I find your position (i.e. no causal effect for consciousness) to be genuinely incoherent. But so be it. I have too much respect for your tireless effort and work to gripe too much. bornagain77
BA77, your privilege. I just invite you to ponder that Q-Mech is C20, but Stat Mech, per Gibbs (and Boltzmann) is C19. One of the simplest roads in is to think of boxes of bouncing marbles -- quite classical, cf my App 1, the always linked. KF kairosfocus
of related note as to tying up some 'philosophical loose ends' Christianity and Panentheism - (conflict or concordance?) - video https://www.youtube.com/watch?v=_xki03G_TO4&list=UU5qDet6sa6rODi7t6wfpg8g bornagain77
Related notes on ‘interaction free’ measurement:
The Mental Universe – Richard Conn Henry – Professor of Physics John Hopkins University Excerpt: The only reality is mind and observations, but observations are not of things. To see the Universe as it really is, we must abandon our tendency to conceptualize observations as things.,,, Physicists shy away from the truth because the truth is so alien to everyday physics. A common way to evade the mental universe is to invoke “decoherence” – the notion that “the physical environment” is sufficient to create reality, independent of the human mind. Yet the idea that any irreversible act of amplification is necessary to collapse the wave function is known to be wrong: in “Renninger-type” experiments, the wave function is collapsed simply by your human mind seeing nothing. The universe is entirely mental,,,, The Universe is immaterial — mental and spiritual. Live, and enjoy. http://henry.pha.jhu.edu/The.mental.universe.pdf The Renninger Negative Result Experiment – video http://www.youtube.com/watch?v=C3uzSlh_CV0 Elitzur–Vaidman bomb tester Excerpt: In 1994, Anton Zeilinger, Paul Kwiat, Harald Weinfurter, and Thomas Herzog actually performed an equivalent of the above experiment, proving interaction-free measurements are indeed possible.[2] In 1996, Kwiat et al. devised a method, using a sequence of polarising devices, that efficiently increases the yield rate to a level arbitrarily close to one. http://en.wikipedia.org/wiki/Elitzur%E2%80%93Vaidman_bomb-testing_problem#Experiments Experimental Realization of Interaction-Free Measurement – Paul G. Kwiat; H. Weinfurter, T. Herzog, A. Zeilinger, and M. Kasevich – 1994 http://www.univie.ac.at/qfp/publications3/pdffiles/1994-08.pdf Interaction-Free Measurement – 1995 http://archive.is/AjexE Realization of an interaction-free measurement – 1996 http://bg.bilkent.edu.tr/jc/topics/Interaction%20free%20measurements/papers/realization%20of%20an%20interaction%20free%20measurement.pdf
Verse and Music:
Colossians 1:17 And he is before all things, and by him all things consist. Brooke Fraser- “C S Lewis Song” http://www.godtube.com/watch/?v=DL6LPLNX
Supplemental Notes:
The Galileo Affair and Life/Consciousness as the true “Center of the Universe” https://docs.google.com/document/d/1BHAcvrc913SgnPcDohwkPnN4kMJ9EDX-JJSkjc4AXmA/edit
Two very different eternities revealed by physics:
General Relativity, Special Relativity, Heaven and Hell https://docs.google.com/document/d/1_4cQ7MXq8bLkoFLYW0kq3Xq-Hkc3c7r-gTk0DYJQFSg/edit
bornagain77
The reason why I am very impressed with the Quantum Zeno effect as to establishing consciousness’s primacy in quantum mechanics is, for one thing, that Entropy is, by a wide margin, the most finely tuned of initial conditions of the Big Bang:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the "source" of the Second Law (Entropy)." How special was the big bang? – Roger Penrose Excerpt: This now tells us how precise the Creator's aim must have been: namely to an accuracy of one part in 10^10^123. (from the Emperor's New Mind, Penrose, pp 339-345 – 1989)
For another thing, it is interesting to note just how foundational entropy is in its explanatory power for actions within the space-time of the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
In fact, entropy is also the primary reason why our physical, temporal, bodies grow old and die,,,
Ageing Process – 85 years in 40 seconds – video http://www.youtube.com/watch?v=A91Fwf_sMhk *3 new mutations every time a cell divides in your body * Average cell of 15 year old has up to 6000 mutations *Average cell of 60 year old has 40,000 mutations Reproductive cells are ‘designed’ so that, early on in development, they are ‘set aside’ and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations still we find that,,, *60-175 mutations are passed on to each new generation. Per John Sanford Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
And yet, to repeat wikipedia,,,
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. per wiki
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than the 1 in 10^10^123 entropy is? Putting all the lines of evidence together, the argument for God from consciousness can now be framed like this:
1. Consciousness either preceded all of material reality or is an 'epi-phenomenon' of material reality. 2. If consciousness is an 'epi-phenomenon' of material reality then consciousness will be found to have no special position within material reality. Whereas conversely, if consciousness precedes material reality then consciousness will be found to have a special position within material reality. 3. Consciousness is found to have a special, even central, position within material reality. 4. Therefore, consciousness is found to precede material reality. Four intersecting lines of experimental evidence from quantum mechanics that show that consciousness precedes material reality (Wigner's Quantum Symmetries, Wheeler's Delayed Choice, Leggett's Inequalities, Quantum Zeno effect): https://docs.google.com/document/d/1G_Fi50ljF5w_XyJHfmSIZsOcPFhgoAZ3PRc_ktY8cFo/edit
bornagain77
Then, a little bit later, I learned that the delayed choice experiment had been extended:
The Experiment That Debunked Materialism – video – (delayed choice quantum eraser) http://www.youtube.com/watch?v=6xKUass7G8w (Double Slit) A Delayed Choice Quantum Eraser – updated 2007 Excerpt: Upon accessing the information gathered by the Coincidence Circuit, we the observer are shocked to learn that the pattern shown by the positions registered at D0 (Detector Zero) at Time 2 depends entirely on the information gathered later at Time 4 and available to us at the conclusion of the experiment. http://www.bottomlayer.com/bottom/kim-scully/kim-scully-web.htm
And then I learned the delayed choice experiment was refined yet again:
“If we attempt to attribute an objective meaning to the quantum state of a single system, curious paradoxes appear: quantum effects mimic not only instantaneous action-at-a-distance but also, as seen here, influence of future actions on past events, even after these events have been irrevocably recorded.” Asher Peres, Delayed choice for entanglement swapping. J. Mod. Opt. 47, 139-143 (2000). Quantum physics mimics spooky action into the past – April 23, 2012 Excerpt: The authors experimentally realized a “Gedankenexperiment” called “delayed-choice entanglement swapping”, formulated by Asher Peres in the year 2000. Two pairs of entangled photons are produced, and one photon from each pair is sent to a party called Victor. Of the two remaining photons, one photon is sent to the party Alice and one is sent to the party Bob. Victor can now choose between two kinds of measurements. If he decides to measure his two photons in a way such that they are forced to be in an entangled state, then also Alice’s and Bob’s photon pair becomes entangled. If Victor chooses to measure his particles individually, Alice’s and Bob’s photon pair ends up in a separable state. Modern quantum optics technology allowed the team to delay Victor’s choice and measurement with respect to the measurements which Alice and Bob perform on their photons. “We found that whether Alice’s and Bob’s photons are entangled and show quantum correlations or are separable and show classical correlations can be decided after they have been measured”, explains Xiao-song Ma, lead author of the study. According to the famous words of Albert Einstein, the effects of quantum entanglement appear as “spooky action at a distance”. The recent experiment has gone one remarkable step further. “Within a naïve classical world view, quantum mechanics can even mimic an influence of future actions on past events”, says Anton Zeilinger. http://phys.org/news/2012-04-quantum-physics-mimics-spooky-action.html
i.e. The preceding experiment clearly shows, and removes any doubt whatsoever, that the ‘material’ detector recording information in the double slit is secondary to the experiment and that a conscious observer being able to consciously know the ‘which path’ information of a photon with local certainty, is of primary importance in the experiment. You can see a more complete explanation of the startling results of the experiment at the 9:11 minute mark of the following video:
Delayed Choice Quantum Eraser Experiment Explained – 2014 video http://www.youtube.com/watch?v=H6HLjpj4Nt4
And then, after the delayed choice experiments, I learned about something called Leggett’s Inequality. Leggett’s Inequality was, as far as I can tell, a mathematical proof designed by Nobelist Anthony Leggett to try to prove ‘realism’. Realism is the belief that an objective reality exists independently of a conscious observer looking at it. And, as is usual with challenging the predictions of Quantum Mechanics, his proof was violated by a stunning 80 orders of magnitude, thus once again, in an over the top fashion, highlighting the central importance of the conscious observer to Quantum Experiments:
A team of physicists in Vienna has devised experiments that may answer one of the enduring riddles of science: Do we create the world just by looking at it? – 2008 Excerpt: In mid-2007 Fedrizzi found that the new realism model was violated by 80 orders of magnitude; the group was even more assured that quantum mechanics was correct. Leggett agrees with Zeilinger that realism is wrong in quantum mechanics, but when I asked him whether he now believes in the theory, he answered only “no” before demurring, “I’m in a small minority with that point of view and I wouldn’t stake my life on it.” For Leggett there are still enough loopholes to disbelieve. I asked him what could finally change his mind about quantum mechanics. Without hesitation, he said sending humans into space as detectors to test the theory.,,, (to which Anton Zeilinger responded) When I mentioned this to Prof. Zeilinger he said, “That will happen someday. There is no doubt in my mind. It is just a question of technology.” Alessandro Fedrizzi had already shown me a prototype of a realism experiment he is hoping to send up in a satellite. It’s a heavy, metallic slab the size of a dinner plate. http://seedmagazine.com/content/article/the_reality_tests/P3/ Alain Aspect and Anton Zeilinger by Richard Conn Henry – Physics Professor – John Hopkins University Excerpt: Why do people cling with such ferocity to belief in a mind-independent reality? It is surely because if there is no such reality, then ultimately (as far as we can know) mind alone exists. And if mind is not a product of real matter, but rather is the creator of the “illusion” of material reality (which has, in fact, despite the materialists, been known to be the case, since the discovery of quantum mechanics in 1925), then a theistic view of our existence becomes the only rational alternative to solipsism (solipsism is the philosophical idea that only one’s own mind is sure to exist). (Dr. Henry’s referenced experiment and paper – “An experimental test of non-local realism” by S. Gröblacher et. al., Nature 446, 871, April 2007 – “To be or not to be local” by Alain Aspect, Nature 446, 866, April 2007 (Leggett’s Inequality: Verified to 80 orders of magnitude) http://henry.pha.jhu.edu/aspect.html
The following video gets the general, and dramatic, point across of what ‘giving up realism’ actually means:
Quantum Physics – (material reality does not exist until we look at it) – Dr. Quantum video http://www.youtube.com/watch?v=D1ezNvpFcJU
Further work confirmed the initial results for Leggett’s Inequality:
Macrorealism Emerging from Quantum Physics – Brukner, Caslav; Kofler, Johannes American Physical Society, APS March Meeting, – March 5-9, 2007 Excerpt: for unrestricted measurement accuracy a violation of macrorealism (i.e., a violation of the Leggett-Garg inequalities) is possible for arbitrary large systems.,, http://adsabs.harvard.edu/abs/2007APS..MARB33005B
But, as if all that was not enough to demonstrate consciousness’s centrality in quantum mechanics, I then learned about something called the ‘Quantum Zeno Effect’:
Quantum Zeno Effect Excerpt: The quantum Zeno effect is ... [a situation in which] an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect “It has been experimentally confirmed ... that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics.
bornagain77
kf, as far as quantum mechanics is concerned, conscious observation is not so easily dismissed as having an effect on atoms and molecules. And it can only be ignored on pain of an infinite number of ‘many worlds’ kairosfocuses all agreeing and disagreeing with me, i.e. on pain of hyper-irrationality! That consciousness is integral to quantum mechanics is fairly obvious to the unbiased observer (no pun intended). At first, much like everybody else, I was shocked to learn that the observer could have any effect whatsoever in the double slit experiment:
Quantum Mechanics – Double Slit and Delayed Choice Experiments – video https://vimeo.com/87175892 Dr. Quantum – Double Slit Experiment – video https://www.youtube.com/watch?v=Q1YqgPAtzho Double Slit Experiment – Explained By Prof Anton Zeilinger – video http://www.metacafe.com/watch/6101627/ Quantum Mechanics – Double Slit Experiment. Is anything real? (Prof. Anton Zeilinger) – video http://www.youtube.com/watch?v=ayvbKafw2g0
Prof. Zeilinger makes this rather startling statement in the preceding video:
“The path taken by the photon is not an element of reality. We are not allowed to talk about the photon passing through this or this slit. Neither are we allowed to say the photon passes through both slits. All this kind of language is not applicable.” Anton Zeilinger
I think Feynman, who had high regard for the double slit experiment, sums the ‘paradox’ up extremely well,
…the “paradox” is only a conflict between reality and your feeling of what reality “ought to be.” Richard Feynman, in The Feynman Lectures on Physics, vol III, p. 18-9 (1965)
Dean Radin, who spent years at Princeton testing different aspects of consciousness (with positive results towards consciousness having causality), recently performed experiments testing the possible role of consciousness in the double slit. His results were, not so surprisingly, very supportive of consciousness’s central role in the experiment:
Consciousness and the double-slit interference pattern: six experiments – Radin – 2012 Abstract: A double-slit optical system was used to test the possible role of consciousness in the collapse of the quantum wavefunction. The ratio of the interference pattern’s double-slit spectral power to its single-slit spectral power was predicted to decrease when attention was focused toward the double slit as compared to away from it. Each test session consisted of 40 counterbalanced attention-toward and attention-away epochs, where each epoch lasted between 15 and 30 s (seconds). Data contributed by 137 people in six experiments, involving a total of 250 test sessions, indicate that on average the spectral ratio decreased as predicted (z = −4.36, p = 6·10^−6). Another 250 control sessions conducted without observers present tested hardware, software, and analytical procedures for potential artifacts; none were identified (z = 0.43, p = 0.67). Variables including temperature, vibration, and signal drift were also tested, and no spurious influences were identified. By contrast, factors associated with consciousness, such as meditation experience, electrocortical markers of focused attention, and psychological factors including openness and absorption, significantly correlated in predicted ways with perturbations in the double-slit interference pattern. The results appear to be consistent with a consciousness-related interpretation of the quantum measurement problem. http://www.deanradin.com/papers/Physics%20Essays%20Radin%20final.pdf
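(A quick side check on the reported statistics, for readers who want to verify them: if the headline result is read as a one-tailed test on a standard normal z, and the control as two-tailed, the z and p values are mutually consistent. A minimal Python sketch; the tailedness assumptions are mine, not the paper's:)

import math

def one_tailed_p(z):
    # P(Z <= -|z|) for a standard normal variable
    return 0.5 * math.erfc(abs(z) / math.sqrt(2))

def two_tailed_p(z):
    # P(|Z| >= |z|) for a standard normal variable
    return math.erfc(abs(z) / math.sqrt(2))

print(f"{one_tailed_p(4.36):.1e}")   # ~6.5e-06, matching the reported p = 6*10^-6
print(f"{two_tailed_p(0.43):.2f}")   # ~0.67, matching the reported control p = 0.67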
Of course, atheists/materialists were, and are, in complete denial as to the obvious implications of mind in the double slit (invoking infinite parallel universes and suchlike absurdities to try to get around the obvious implications of ‘Mind’). But personally, not being imprisoned in the materialist’s self-imposed box, my curiosity was aroused and I’ve been sort of poking around, finding out a little more here and there about quantum mechanics, and how the observer is central to it. One of the first interesting experiments in quantum mechanics I found after the double slit, one that highlighted the centrality of the observer, was Wigner’s Quantum Symmetries. Here is Wigner commenting on the key experiment that led him to his Nobel Prize-winning work on quantum symmetries:
Eugene Wigner Excerpt: When I returned to Berlin, the excellent crystallographer Weissenberg asked me to study: why is it that in a crystal the atoms like to sit in a symmetry plane or symmetry axis. After a short time of thinking I understood: ... To express this basic experience in a more direct way: the world does not have a privileged center, there is no absolute rest, preferred direction, unique origin of calendar time, even left and right seem to be rather symmetric. The interference of electrons, photons, neutrons has indicated that the state of a particle can be described by a vector possessing a certain number of components. As the observer is replaced by another observer (working elsewhere, looking at a different direction, using another clock, perhaps being left-handed), the state of the very same particle is described by another vector, obtained from the previous vector by multiplying it with a matrix. This matrix transfers from one observer to another. http://www.reak.bme.hu/Wigner_Course/WignerBio/wb1.htm
Wigner went on to make these rather dramatic comments in regards to his work:
“It was not possible to formulate the laws (of quantum theory) in a fully consistent way without reference to consciousness.” Eugene Wigner (1902 -1995) from his collection of essays “Symmetries and Reflections – Scientific Essays”; Eugene Wigner laid the foundation for the theory of symmetries in quantum mechanics, for which he received the Nobel Prize in Physics in 1963. “It will remain remarkable, in whatever way our future concepts may develop, that the very study of the external world led to the scientific conclusion that the content of the consciousness is the ultimate universal reality” - Eugene Wigner – (Remarks on the Mind-Body Question, Eugene Wigner, in Wheeler and Zurek, p.169) 1961
Also of note:
Von Neumann–Wigner – interpretation Excerpt: The von Neumann–Wigner interpretation, also described as “consciousness causes collapse [of the wave function]“, is an interpretation of quantum mechanics in which consciousness is postulated to be necessary for the completion of the process of quantum measurement. http://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Wigner_interpretation#The_interpretation “I think von Neumann’s orthodox QM gives a good way to understand the nature of the universe: it is tightly tied to the practical test and uses of our basic physical theory, while also accounting for the details of the mind-brain connection in a way that is rationally concordant with both our conscious experiences, and experience of control, and the neuroscience data.” Henry Stapp
Then after I had learned about Wigner’s Quantum Symmetries, I stumbled across Wheeler’s Delayed choice experiments in which this finding shocked me as to the central importance of the observer’s free will choice in quantum experiments:
Alain Aspect speaks on John Wheeler’s Delayed Choice Experiment – video http://vimeo.com/38508798 “Thus one decides the photon shall have come by one route or by both routes after it has already done its travel” John A. Wheeler Wheeler’s Classic Delayed Choice Experiment: Excerpt: Now, for many billions of years the photon is in transit in region 3. Yet we can choose (many billions of years later) which experimental set up to employ – the single wide-focus, or the two narrowly focused instruments. We have chosen whether to know which side of the galaxy the photon passed by (by choosing whether to use the two-telescope set up or not, which are the instruments that would give us the information about which side of the galaxy the photon passed). We have delayed this choice until a time long after the particles “have passed by one side of the galaxy, or the other side of the galaxy, or both sides of the galaxy,” so to speak. Yet, it seems paradoxically that our later choice of whether to obtain this information determines which side of the galaxy the light passed, so to speak, billions of years ago. So it seems that time has nothing to do with effects of quantum mechanics. And, indeed, the original thought experiment was not based on any analysis of how particles evolve and behave over time – it was based on the mathematics. This is what the mathematics predicted for a result, and this is exactly the result obtained in the laboratory. http://www.bottomlayer.com/bottom/basic_delayed_choice.htm Genesis, Quantum Physics and Reality Excerpt: Simply put, an experiment on Earth can be made in such a way that it determines if one photon comes along either on the right or the left side or if it comes (as a wave) along both sides of the gravitational lens (of the galaxy) at the same time. However, how could the photons have known billions of years ago that someday there would be an earth with inhabitants on it, making just this experiment? ,,, This is big trouble for the multi-universe theory and for the “hidden-variables” approach. - per Greer “It begins to look as if we ourselves, by our last minute decision, have an influence on what a photon will do when it has already accomplished most of its doing… we have to say that we ourselves have an undeniable part in what we have always called the past. The past is not really the past until it has been registered. Or to put it another way, the past has no meaning or existence unless it exists as a record in the present.” – John Wheeler – The Ghost In The Atom – Page 66-68
bornagain77
kf, I profoundly disagree with your opinion. bornagain77
BA77: It's not conscious observation that does the work here; the point is that if something is in a constrained macrostate it has less entropy than in a less constrained one, i.e. fewer ways for mass and energy to be arranged consistent with the state. E.g. consider free expansion: Case A: || ***** | || Remove the internal barrier for the gas: Case B: || * * * * * || In B there is a rise in entropy, as the micro-particles have greater freedom consistent with the macro constraints (which are observable). The constraint comes first; the lack of info on the specific microstate stems from that. Our observation of the macrostate is just the usual way we do science at lab scale. In this case we observe two volumes, one much more confining. There is no necessary profound entanglement of the observer and the observation with the system state that makes a material difference. (This is not a quantum concept, though it can be extended to such; Gibbs was pre-quantum.) KF kairosfocus
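(To put a number on KF's free-expansion illustration: on the Boltzmann view S = k ln W, removing the barrier doubles each particle's accessible volume, so the positional microstate count per particle doubles and the entropy rises by N*k*ln 2. A minimal Python sketch; the mole-sized particle count is an assumed example, and an ideal non-interacting gas is assumed throughout:)

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Case A -> Case B: the accessible volume doubles, so each of the N
# non-interacting particles has twice as many positional microstates.
N = 6.022e23                       # assume one mole of gas particles
delta_S = N * k_B * math.log(2)    # S = k ln W gives dS = N k ln 2
print(f"Entropy rise on free doubling: {delta_S:.2f} J/K")   # ~5.76 J/K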
kairosfocus, but why should conscious observation put a freeze on entropic decay, from a mathematical point of view? That is, since, as far as I can tell, math cannot possibly causally explain this effect from an entropic point of view, why does the quantum Zeno effect even happen, unless mind/consciousness precedes both the entropy of the universe and the math that describes that entropy? bornagain77
F/N: I want to note that this is the 45th anniversary of the first Moon landing, to the day and date. KF kairosfocus
BA77: When, based on macro-level state, we have a tightly constrained distribution of mass and energy at micro level, that means that there is low freedom to be in various states, hence low entropy. A living cell is in a highly constrained state, and is a macro-level observable in this sense. Prick it and allow its contents to diffuse via Brownian motion etc., and there is a much lower constraint. There is effectively zero likelihood of such a pricked cell spontaneously reassembling, on the gamut of time and resources of the observable cosmos across its lifespan. And BTW, this is also a strong argument against spontaneous OOL. KF kairosfocus
Notice, how this work tries to get info for free for life:
One theme seems to characterize life: It is able to convert the thermodynamic information contained in food or in sunlight into complex and statistically unlikely configurations of matter. A flood of information-containing free-energy reaches the earth's biosphere in the form of sunlight. Passing through the metabolic pathways of living organisms, this information keeps the organisms far away from thermodynamic equilibrium (which is death). As the thermodynamic information flows through the biosphere, much of it is degraded into heat, but part is converted into cybernetic information and preserved in the intricate structures which are characteristic of life. The principle of natural selection ensures that as this happens, the configurations of matter in living organisms constantly increase in complexity, refinement and statistical improbability. This is the process which we call evolution, or in the case of human society, progress. [Information Theory and Evolution, 2nd Edn. John Scales Avery (World Scientific, Singapore, 2012). p.97]
If we look carefully, we will see that the error here is to imagine that organisation to process energy and information can be acquired for free. The OOL threshold is vast, and the threshold to originate onward body plans is vaster yet. Absent organised coupling and configuring mechanisms that provide the required highly functionally specific, complex work to make the structures and arrangements of cell based life, input energy will just go into randomising, aka heating up. In Maxwell's analysis of his demon, the demon got the info and the weightless door to gate passage of molecules free of energy and information cost. But in the physical world, there will need to be specific, coupled mechanisms for the demon to do the work, and that will exact its own requisite of energy and information to design and build it, and to operate it. The same obtains in the wider case of OOL and origin of body plans. But there is a widespread blindness to that, thus the sort of verbal 'hey presto, bang, it's there' we just saw. KF kairosfocus
F/N: This may make useful reading also. KF kairosfocus
kairosfocus per 79:
Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox). Boiling down, and with a dash of this clip from Jaynes courtesy HSR: “. . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly ‘objective’ quantity . . . it is a function of [those variables] and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.” . . . that is, the entropy of a body is the suitably adjusted metric of the average missing info to specify its microstate, given only its macroscopic state definition on temp, pressure etc. https://uncommondescent.com/intelligent-design/the-second-law-in-force-everywhere-but-nowhere/#comment-507913
As to being measured in the laboratory, this has now been accomplished:
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html Demonic device converts information to energy - 2010 Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform "It is CSI that enables Maxwell's demon to outsmart a thermodynamic system tending toward thermal equilibrium" William Dembski - Intelligent Design, pg. 159
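(For scale: the Landauer bound discussed in this thread, and probed in spirit by the Sano experiment above, sets the minimum energy associated with one bit at E = k_B*T*ln 2. A minimal sketch of that figure at room temperature; nothing here is taken from the papers beyond the standard formula:)

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def landauer_limit(T):
    # Minimum energy (J) that must be dissipated to erase one bit at temperature T (K)
    return k_B * T * math.log(2)

print(f"{landauer_limit(300):.2e} J per bit at 300 K")   # ~2.87e-21 J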
I also like how the 'conscious observer' is brought in, in your quote, kairosfocus:
The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities
But my question for you, kairosfocus, is: "what is the mathematical explanation for why the entropic decay of an unstable particle 'freezes' when a person has 'knowledge of its microstate'?"
Quantum Zeno effect Excerpt: The quantum Zeno effect is ... [a situation in which] an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect “It has been experimentally confirmed ... that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics.
kairosfocus, for me, being a novice, the implications for 'consciousness as fundamental' (Planck) seem fairly obvious, but I would be much interested in your opinion on the matter, if it is not too much trouble. bornagain77
Salvador:
The probabilities likewise involved in OOL are not the same probabilities associated with Shannon entropy nor thermodynamic entropy which only deals with energy microstates or position and momentum microstates, not things like the functional state of a system!
Sure. We'll just take your word for it. You being an authority on the probabilities involved in OOL, and an authority on the probabilities associated with Shannon entropy, and an authority on the probabilities associated with thermodynamic entropy. Are you admitting that these are all probabilistic? Salvador:
... not things like the functional state of a system!
You just don't get it, do you? Is it that functional states are probabilistically unlikely? Mung
Salvador:
Shannon information is not the same as the information most important to OOL. There is for example prescriptive information, algorithmic information, specified information, etc.
You're just pontificating. These are just assertions, not an argument. Even if we grant the existence of prescriptive information, algorithmic information, specified information, etc., it does not follow that these are not subsumed under Shannon Information or that they are "more important" to OOL than Shannon Information. Salvador:
Shannon information is not the same as the information most important to OOL... So the information argument isn’t the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you.
Even if we grant that Shannon Information is not the same as the information most important to OOL, it does not follow that the information argument isn’t the same as the thermodynamic argument. That's just a non-sequitur. Also, I don't recall saying anything about the thermodynamic argument. Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Salvador:
So the information argument isn’t the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you.
Oh, but the information argument is the same as the thermodynamic argument if one isn't discussing forms of information beyond Shannon. But that subtlety seems to have escaped you. Mung
Salvador:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
I thought I made it clear that there was no difference between the three. Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
You disagree. Please say why. Mung
Salvador:
Would you care to correct my analysis of the relevant differential equations? Explain for the readers where I went materially wrong in my math?

Talk about false insinuations. I never claimed or even insinuated that your math was wrong, so why this challenge to show how your math was wrong? You're confused.
Mung
Salvador:
I derived these relations and posted them here and elsewhere. Don’t make false insinuations about me of not understanding the relationship of Shannon information and thermodynamic entropy.
You've got to be kidding. What false insinuation? I plainly stated that you understood the relationship; I even quoted you:
When I [Salvador] related Shannon, Dembski, Boltzmann and Clausius notions of entropy, I thought I demonstrated that entropy must necessarily increase for CSI to increase.
So my question to you is, what is your problem? Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
You objected. Why? You acknowledge the relationships. Mung
kairosfocus:
So, while I doubt Mung has done a serious stat mech course, his reading and intuition on solid state and vacuum physics tied to electronics and to info theory will be in the right ballpark.
Man, you've got that right! I've no idea why the US Navy thought I'd be a good fit for their nuclear power program, lol! As it turns out, due to a propensity to steal rather than purchase (at age 17/18) I was ejected from that program and ended up in electronics instead, and thus communications. And here I am decades later still in the communications field. The point is that entropy is a statistical/probabilistic concept, and thus uniquely suited to analysis according to information theory. Mung
Additional reading material for Salvador: Principles of Statistical Mechanics: The Information Theory Approach. From the dustjacket:
In this book the author approaches statistical mechanics in the uniquely consistent and unified way made possible by information theory ... Information theory ... has proved to be very powerful in clarifying the concepts of statistical mechanics and in uniting its various branches to one another and to the rest of physics.
From the Preface:
The material presented here reflects the work of many authors over a long period of time. ... (I should, however, mention E.T. Jaynes, who has pioneered the information theory approach to statistical mechanics.)
H.T. to kairosfocus who also brought attention to Jaynes here at UD. Mung
Ps 2: Further following on, and drawing on Robertson: ______________ >>Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more details, (pp. 3 - 6, 7, 36, cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jaynes's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
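(Relations (1) and (2) in the excerpt are easy to exercise numerically: the maximum-entropy assignment is pi = e^(-beta*yi)/Z, with Z the partition function. A minimal Python sketch for an assumed three-level toy system, energies in units of kT:)

import math

def boltzmann_weights(energies, kT):
    # pi = exp(-Ei/kT) / Z, per relations (1)-(2); Z is the partition function
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

print(boltzmann_weights([0.0, 1.0, 2.0], kT=1.0))   # ~[0.665, 0.245, 0.090]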
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale. By many orders of magnitude, we don't get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics, so let us now return to that focus; in particular addressing information in its functional sense, as the third step in this preliminary analysis. As the third major step, we now turn to information technology, communication systems and computers, which provides a vital clarifying side-light from another view on how complex, specified information functions in information processing systems: [In the context of computers] information is data -- i.e. digital representations of raw events, facts, numbers and letters, values of variables, etc. -- that have been put together in ways suitable for storing in special data structures [strings of characters, lists, tables, "trees" etc], and for processing and output in ways that are useful [i.e. functional]. . . . Information is distinguished from [a] data: raw events, signals, states etc represented digitally, and [b] knowledge: information that has been so verified that we can reasonably be warranted, in believing it to be true. [GEM, UWI FD12A Sci Med and Tech in Society Tutorial Note 7a, Nov 2005.] That is, we have now made a step beyond mere capacity to carry or convey information, to the function fulfilled by meaningful -- intelligible, difference making -- strings of symbols. In effect, we here introduce into the concept, "information," the meaningfulness, functionality (and indeed, perhaps even purposefulness) of messages -- the fact that they make a difference to the operation and/or structure of systems using such messages, thus to outcomes; thence, to relative or absolute success or failure of information-using systems in given environments. And, such outcome-affecting functionality is of course the underlying reason/explanation for the use of information in systems. [Cf. the recent peer-reviewed, scientific discussions here, and here by Abel and Trevors, in the context of the molecular nanotechnology of life.] Let us note as well that since in general analogue signals can be digitised [i.e. 
by some form of analogue-digital conversion], the discussion thus far is quite general in force. So, taking these three main points together, we can now see how information is conceptually and quantitatively defined, how it can be measured in bits, and how it is used in information processing systems; i.e., how it becomes functional. In short, we can now understand that: Functionally Specific, Complex Information [FSCI] is a characteristic of complicated messages that function in systems to help them practically solve problems faced by the systems in their environments. Also, in cases where we directly and independently know the source of such FSCI (and its accompanying functional organisation) it is, as a general rule, created by purposeful, organising intelligent agents. So, on empirical observation based induction, FSCI is a reliable sign of such design, e.g. the text of this web page, and billions of others all across the Internet. (Those who object to this, therefore face the burden of showing empirically that such FSCI does in fact -- on observation -- arise from blind chance and/or mechanical necessity without intelligent direction, selection, intervention or purpose.) Indeed, this FSCI perspective lies at the foundation of information theory: (i) recognising signals as intentionally constructed messages transmitted in the face of the possibility of noise, (ii) where also, intelligently constructed signals have characteristics of purposeful specificity, controlled complexity and system- relevant functionality based on meaningful rules that distinguish them from meaningless noise; (iii) further noticing that signals exist in functioning generation- transfer and/or storage- destination systems that (iv) embrace co-ordinated transmitters, channels, receivers, sources and sinks. That this is broadly recognised as true, can be seen from a surprising source, Dawkins, who is reported to have said in his The Blind Watchmaker (1987), p. 8: Hitting upon the lucky number that opens the bank's safe [NB: cf. here the case in Brown's The Da Vinci Code] is the equivalent, in our analogy, of hurling scrap metal around at random and happening to assemble a Boeing 747. [NB: originally, this imagery is due to Sir Fred Hoyle, who used it to argue that life on earth bears characteristics that strongly suggest design. His suggestion: panspermia -- i.e. life drifted here, or else was planted here.] Of all the millions of unique and, with hindsight equally improbable, positions of the combination lock, only one opens the lock. Similarly, of all the millions of unique and, with hindsight equally improbable, arrangements of a heap of junk, only one (or very few) will fly. The uniqueness of the arrangement that flies, or that opens the safe, has nothing to do with hindsight. It is specified in advance. [Emphases and parenthetical note added, in tribute to the late Sir Fred Hoyle. (NB: This case also shows that we need not see boxes labelled "encoders/decoders" or "transmitters/receivers" and "channels" etc. for the model in Fig. 1 above to be applicable; i.e. the model is abstract rather than concrete: the critical issue is functional, complex information, not electronics.)] Here, we see how the significance of FSCI naturally appears in the context of considering the physically and logically possible but vastly improbable creation of a jumbo jet by chance. 
Instantly, we see that mere random chance acting in a context of blind natural forces is a most unlikely explanation, even though the statistical behaviour of matter under random forces cannot rule it strictly out. But it is so plainly vastly improbable, that, having seen the message -- a flyable jumbo jet -- we then make a fairly easy and highly confident inference to its most likely origin: i.e. it is an intelligently designed artifact. For, the a posteriori probability of its having originated by chance is obviously minimal -- which we can intuitively recognise, and can in principle quantify. FSCI is also an observable, measurable quantity; contrary to what is imagined, implied or asserted by many objectors. This may be most easily seen by using a quantity we are familiar with: functionally specific bits [FS bits], such as those that define the information on the screen you are most likely using to read this note . . . >> _________________ In short, the interconnexions are there. kairosfocus
PS 1: I clip from my always linked note, Sec A: _______________ >> The second major step is to refine our thoughts, through discussing the communication theory definition of and its approach to measuring information. A good place to begin this is with British Communication theory expert F. R Connor, who gives us an excellent "definition by discussion" of what information is:
From a human point of view the word 'communication' conveys the idea of one person talking or writing to another in words or messages . . . through the use of words derived from an alphabet [NB: he here means, a "vocabulary" of possible signals]. Not all words are used all the time and this implies that there is a minimum number which could enable communication to be possible. In order to communicate, it is necessary to transfer information to another person, or more objectively, between men or machines. This naturally leads to the definition of the word 'information', and from a communication point of view it does not have its usual everyday meaning. Information is not what is actually in a message but what could constitute a message. The word could implies a statistical definition in that it involves some selection of the various possible messages. The important quantity is not the actual information content of the message but rather its possible information content. This is the quantitative definition of information and so it is measured in terms of the number of selections that could be made. Hartley was the first to suggest a logarithmic unit . . . and this is given in terms of a message probability. [p. 79, Signals, Edward Arnold. 1972. Bold emphasis added. Apart from the justly classical status of Connor's series, his classic work dating from before the ID controversy arose is deliberately cited, to give us an indisputably objective benchmark.]
To quantify the above definition of what is perhaps best descriptively termed information-carrying capacity, but has long been simply termed information (in the "Shannon sense" - never mind his disclaimers . . .), let us consider a source that emits symbols from a vocabulary: s1, s2, s3, . . . sn, with probabilities p1, p2, p3, . . . pn. That is, in a "typical" long string of symbols, of size M [say this web page], the average number that are some sj, J, will be such that the ratio J/M --> pj, and in the limit attains equality. We term pj the a priori -- before the fact -- probability of symbol sj. Then, when a receiver detects sj, the question arises as to whether this was sent. [That is, the mixing in of noise means that received messages are prone to misidentification.] If on average, sj will be detected correctly a fraction, dj of the time, the a posteriori -- after the fact -- probability of sj is by a similar calculation, dj. So, we now define the information content of symbol sj as, in effect how much it surprises us on average when it shows up in our receiver: I = log [dj/pj], in bits [if the log is base 2, log2] . . . Eqn 1 This immediately means that the question of receiving information arises AFTER an apparent symbol sj has been detected and decoded. That is, the issue of information inherently implies an inference to having received an intentional signal in the face of the possibility that noise could be present. Second, logs are used in the definition of I, as they give an additive property: for, the amount of information in independent signals, si + sj, using the above definition, is such that: I total = Ii + Ij . . . Eqn 2 For example, assume that dj for the moment is 1, i.e. we have a noiseless channel so what is transmitted is just what is received. Then, the information in sj is: I = log [1/pj] = - log pj . . . Eqn 3 This case illustrates the additive property as well, assuming that symbols si and sj are independent. That means that the probability of receiving both messages is the product of the probability of the individual messages (pi *pj); so: Itot = log 1/(pi *pj) = [-log pi] + [-log pj] = Ii + Ij . . . Eqn 4 So if there are two symbols, say 1 and 0, and each has probability 0.5, then for each, I is - log [1/2], on a base of 2, which is 1 bit. (If the symbols were not equiprobable, the less probable binary digit-state would convey more than, and the more probable, less than, one bit of information. Moving over to English text, we can easily see that E is as a rule far more probable than X, and that Q is most often followed by U. So, X conveys more information than E, and U conveys very little, though it is useful as redundancy, which gives us a chance to catch errors and fix them: if we see "wueen" it is most likely to have been "queen.") Further to this, we may average the information per symbol in the communication system thusly (giving in terms of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.]
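(Eqns 3 and 5 can be checked in a few lines, including the E-versus-X point; the letter frequencies below are rough textbook figures assumed for illustration:)

import math

def self_information(p):
    # Eqn 3: I = -log2(p), in bits, for a noiseless channel
    return -math.log2(p)

def source_entropy(probs):
    # Eqn 5: H = -SUM pi*log2(pi), average bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_E, p_X = 0.127, 0.0015   # assumed approximate English letter frequencies
print(f"I(E) = {self_information(p_E):.2f} bits")    # ~2.98 bits: common, so less surprising
print(f"I(X) = {self_information(p_X):.2f} bits")    # ~9.38 bits: rare, so more informative
print(f"H = {source_entropy([0.5, 0.5]):.2f} bit")   # fair binary source: exactly 1 bit/symbol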
Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) . . . >> _______________ This gets us to an information expression that is closely tied to the thermodynamics situation. kairosfocus
SalC (attn Mung): I suggest you should take a look at Harry S Robertson's Statistical Thermophysics, esp Ch 1. (L K Nash's classic Elements of Statistical Thermodynamics is a very good intro, too. I strongly suggest this Chemist's look at the area for newbies. I have never seen a better introductory presentation done by a Physicist. I wish I had gone over to the Chemistry section of the uni bookshop or library in my undergrad days . . . Mandl was and is a pig, and Sears et al improves on the reading with the years, similar to Milman-Halkias' Integrated Electronics. There are 1st time thru books and there are nth time through books. My u/grad rule of thumb was, expect to do three readings to get the point in most physics books of that era. I think that's because late occurring points are really needed to understand earlier ones. H'mm, maybe that's why I came to favour spiral curricula that start from a compressed overview and guided tour of key themes . . . esp. based on a key case study, then elaborate major aspect by aspect until a capstone integrative at a higher level can be done. Thus, too, a view on my foundationalist - coherentist -- elegant balance approach generally: what is the hard core that things are built up from? In the case of the design inference that is abductive inference on induction and analysis to explain the causal origin of functionally specific, complex information. The only empirically grounded, analytically reasonable such, is design. But when you are up against entrenched ideology and indoctrination, that is going to be an uphill charge into the teeth of MG fire. You need tanks closely backed by infantry to take such on.) I think we would all find the following from Wiki [clipped Apr 2011 from the Informational Entropy (i.e. Shannon Info . . . avg info per symbol in a message and message context) art] helpful (understanding that the informational approach to statistical thermodynamics has been controversial since Brillouin and Jaynes but has latterly been increasingly seen as having a serious point):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Boiling down, and with a dash of this clip from Jaynes courtesy HSR:
". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory."
. . . that is, the entropy of a body is the suitably adjusted metric of the average missing info to specify its microstate, given only its macroscopic state definition on temp, pressure etc. Where of course a 1 mm cube easily has over 10^18 1 nm scale particles [except if it is a gas, which might be like 10^15 or so]. As a consequence of this connexion, we must understand that info is connected to log of probability measures on the one hand and to entropy in the thermodynamic sense on the other, once relevant multiplicative factors and units such as the Boltzmann constant in J/K are factored in. They are inextricably intertwined, in short. So, while I doubt Mung has done a serious stat mech course, his reading and intuition on solid state and vacuum physics tied to electronics and to info theory will be in the right ballpark. All that stuff about Fermi levels and thermionic emission and photo effect does tend to tie in together over time. That said, in an info era probably the easiest break-in point is going to be info, starting with the idea of a 2-state switch storing one bit. Thence, a 4-state string position in a D/RNA molecule stores 2 bits raw, adjusted for relative frequency as relevant across GCAT/U. The info is in effect stored in prong height, much like with a classic Yale type lock's key. (And BTW, that prong-height approach is what von Neumann put up for his self-replicating machine thought exercise, in 1948.) We then point to the way proteins are synthesised, using a simplified model, drawings and vids, such as Vuk Nikolic's wonderful simulation. That can be correlated to the adapted form of the Shannon Comms model: source > encoder > transmission unit --> CHANNEL --> receiver unit > decoder > sink. This helps us see that a lot of co-ordinated components are needed for a comms framework to work. Onward, proteins including enzymes are the workhorse molecules of the cell. These rest on chained AA strings controlled by D/RNA codes and algorithms. To work right they have to fold to 3-d shapes partly controlled by the AA string sequences, and often partly assisted by chaperoning molecular machines. There are prions after all, misfolded more stable but destructive structures. Thousands of proteins are routinely involved in the operation of the cell, and they form a coherent whole based on the right parts in the right place at the right time. There is even a cellular post office and transport network using walking tractors. The integrated functionally specific complexity of this easily overwhelms the blind chance and mechanical necessity resources of our observed cosmos, being far, far beyond 1,000 bits. The only empirically warranted, analytically plausible explanation of the required FSCO/I is design. This puts design at the table of discussion for origins science, right from the root of the tree of life. There is therefore no reason to exclude it from discussion there or thereafter across the branches pointing to major body plans. Ideological a prioris rooted in evolutionary materialism and imposed on science and science education do not count as reasons. A similar point can be made regarding the fine tuning of physics to build a cosmos that supports the kind of C-Chemistry, aqueous medium, gated metabolism, molecular nanomachine, code and algorithm using cell based life we see and experience. Ideological game over.
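(To see why the 1,000-bit threshold is treated as safe in the argument above, compare the configuration space of 1,000 bits with a deliberately generous upper bound on the number of events in the observable cosmos. The round numbers below are the commonly assumed bounds in this line of argument, not measurements:)

atoms   = 1e80     # assumed atom count of the observable universe
rate    = 1e45     # assumed max state changes per atom per second (~Planck rate)
seconds = 1e25     # assumed generous cosmic duration

events  = atoms * rate * seconds   # ~1e150 total events
configs = 2.0 ** 1000              # ~1.07e301 configurations of 1,000 bits
print(f"Fraction of the space searchable: {events / configs:.1e}")   # ~9.3e-152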
Thing is, this is riddled with information and thermodynamics issues tied onwards to likelihoods rooted in blind chance and necessity challenged to search vast configuration spaces. While thermodynamics and info issues are not there at the surface, they are just beneath it, and come up as objector talking points are trotted out. (Other than well poisoning attacks.) So, I suggest the best answer is to acknowledge the formal connexions and the reason behind them, but to emphasise what is most easy to see. KF kairosfocus
Mung,
We shall see that the concept of entropy, central to information theory as treated here, has at least a formal equivalence with the entropy of thermodynamics. (Brillouin, 1956. Jaynes, 1959).
I derived these relations and posted them here and elsewhere. Don't make false insinuations that I don't understand the relationship of Shannon information and thermodynamic entropy. For example, I provided a derivation on April 2, 2014 relating information measures to thermodynamic entropy: Clausius, Boltzmann, Dembski. All you could do is quote, but I actually could do the derivation. Do you think you could do the same? Would you care to correct my analysis of the relevant differential equations? Explain for the readers where I went materially wrong in my math? :-)
So what’s so controversial about that statement?
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL? Shannon information is not the same as the information most important to OOL. There is, for example, prescriptive information, algorithmic information, specified information, etc. So the information argument isn't the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you. The probabilities likewise involved in OOL are not the same probabilities associated with Shannon entropy nor thermodynamic entropy, which deal only with energy microstates or position and momentum microstates, not things like the functional state of a system! Feel free to do a derivation that shows thermodynamic entropy (as would be derived by a college student in physics, engineering, or chemistry) will necessarily indicate whether an object is functional. Say one object has on the order of 10^7 J/K of entropy, while another object has only 10^1 J/K. Explain for the readers which object is functional and why. So what's the probability the object with 10^7 J/K entropy is functional versus the object with 10^1 J/K entropy? You after all claimed:
there is no difference between the entropy argument and the information argument and the probability argument.
So what's the probability that the object with 10^7 J/K entropy is functional or not functional, based on the entropy score alone? I'm now giving you a chance to defend your confused claim. You won't be able to, because you're wrong.
there is no difference between the entropy argument and the information argument and the probability argument.
So what's the probability something with approximately 10^7 J/K entropy is functional? Given your claim, you should be able to take 10^7 J/K and convert it to a probability of whether it is functional or not, whether it is living or not. Go ahead, Mung. :roll: I could convert that number into Shannon bits if I wanted. I could convert it into a probability. Think you can do the same? :-) HINT: I did a comparable derivation in the link! Do you think that once you've converted that number into a probability it is the same as the probability of functionality? HINT: NO! Why? The thermodynamic probabilities are associated with the energy microstates, which say little or nothing about functionality. But since you insist:
there is no difference between the entropy argument and the information argument and the probability argument.
By all means, show the reader you can actually put into practice what you claim. :roll: scordova
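For what it is worth, the unit conversion alluded to above is a one-liner: divide an entropy in J/K by k ln 2 to re-express it in bits. A minimal sketch (the 10^7 and 10^1 J/K figures are just the example numbers from the comment above):

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(s):
    # From S = k ln W: log2(W) = S / (k ln 2), i.e. the entropy
    # re-expressed as bits of missing microstate information
    return s / (K_B * math.log(2))

print(f"{entropy_to_bits(1e7):.2e} bits")  # ~1.04e+30 bits for 10^7 J/K
print(f"{entropy_to_bits(1e1):.2e} bits")  # ~1.04e+24 bits for 10^1 J/K

As the comment notes, neither number says anything, by itself, about whether the object in question is functional.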
More reading material for Salvador: Information Theory and Coding
We shall see that the concept of entropy, central to information theory as treated here, has at least a formal equivalence with the entropy of thermodynamics. (Brillouin, 1956. Jaynes, 1959).
Mung
Macroscopic thermodynamics is here reexamined from the perspective of atomic-molecular theory. The thermodynamic concepts of entropy and equilibrium are thus invested with new meaning and implication... - Leonard K. Nash
Chapter 1 introduces and develops a statistical analysis of the concept of distribution - culminating in a very simple derivation of the Boltzmann distribution law and a demonstration of the relation of statistical and thermodynamic entropies. - Leonard K. Nash
Compared with the first edition, the present second edition offers in its opening chapter an analysis that is both much simpler and more decisive. This chapter also provides a brief but convincing demonstration of one crucial point merely announced in the first edition, namely: the enormous odds by which some distributions are favored over others. - Leonard K. Nash
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
So what’s so controversial about that statement? Mung
Reading material for Salvador:
Mathematical Foundations of Information Theory -- the first section of this book consists of The Entropy Concept in Probability Theory.
Mathematical Foundations of Statistical Mechanics -- Chapter 4 in this book is Reduction to the Problem of the Theory of Probability.
Elements of Statistical Thermodynamics -- don't miss the section starting on page 33, The Concept of Entropy.
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
So what's so controversial about that statement? Mung
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Salvador:
Your assertion is confused. What entropy are you talking about? What probability are you talking about? Until you specify these parameters, your statement is only valid for special cases and not true in general.
My assertion is not confused. You admitted in your post @8 that it is correct and that you agree with it. Here's what you wrote:
When I related Shannon, Dembski, Boltzmann and Clausius notions of entropy, I thought I demonstrated that entropy must necessarily increase for CSI to increase.
How did you manage to relate Shannon, Dembski, Boltzmann and Clausius notions of entropy?
What entropy are you talking about? What probability are you talking about? Until you specify these parameters, your statement is only valid for special cases and not true in general.
Now you're contradicting yourself and showing that you are the one that is confused. My argument is that all three of the above are generalizable, so asking me to explain which entropy or which probability is just missing the point.
Until you specify these parameters, your statement is only valid for special cases and not true in general.
And that's just absurd. Weaver:
The quantity which uniquely meets the natural requirements that one sets up for "information" turns out to be exactly that which is known in thermodynamics as entropy.
Mung
Henry, you don't realize how an incubator works. It starts pre-loaded. Trust me, unless you seed it with life, life won't come. revelator
F/N: And, Mapou should recognise that we are dealing with conceptual systems in math and so will have to take the infinite seriously and positively. KF kairosfocus
SalC: I suggest, as can be seen, that there are fundamental connexions between information and thermodynamics concerns in the context of FSCO/I. While -- as with mathematics -- it is not a simple matter to explore such connexions, we do need to understand them in the face of a three- or four-way debate. Further to such, I suggest that we should accept that there is a difference between what are ABC basics for those getting a first exposure and what are foundational but sometimes abstruse matters. However, the simplified picture of the connexions -- and visuals help -- is also important: on a matter that has been the subject of misrepresentations, exploitation of equivocation, polarisation, wedge tactics and the like, there is an unhealthy emphasis among objectors on posing difficulties as attack points meant to discredit. The recent tactic of twisting the meaning and context of squares and circles, in order to suggest that it is not a reasonably plain point that a square circle is an impossible being, is a case in point. To address that, I reluctantly had to go back to the empty-set cascade to get the Reals [which, recall, involves that place-value numbers are infinite series . . . in order to get the continuum], inject complex numbers to specify an orthogonal axis and thus a plane, then, on the basis of that definite plane, define squares and circles functionally, and thence show from this the fundamental incompatibility. All along, simple common sense and respect for context would have been enough. And of course a year ago there was a related issue on whether it is self-evidently axiomatic that parallel lines . . . which implies a plane . . . never meet. That one required using the plane to specify equations of form y = mx + c, with m the same and the c values different. The point is, ABC basics can be approached on common good sense where there is good will, but when toxic hyperskeptical agendas break in, fundamental analysis may need to be deployed, even at 101 level. KF kairosfocus
PPS: Looks like some redefinitions of the SI system are in prospect, here is a clip from the draft brochure for SI:
Since the establishment of the SI in 1960, extraordinary advances have been made in relating SI units to truly invariant quantities such as the fundamental constants of physics and the properties of atoms. Recognising the importance of linking SI units to such invariant quantities, the XXth CGPM, in 20XX, adopted new definitions of the kilogram, ampere, kelvin, and mole in terms of fixed numerical values of the Planck constant h, elementary charge e, Boltzmann constant k, and Avogadro constant N_A, respectively.
Somehow, I preferred it the other way around. KF kairosfocus
PS: Wiki on Boltzmann's expression, here. (I recall an incident when this first came up in a discussion, where an objector to design theory at another site assumed I did not know anything about what the expression meant, which speaks volumes about the assumptions of too many objectors, along the lines of Dawkins' prejudice-laced taunt that you must be ignorant, stupid, insane or wicked.) kairosfocus
Mung, yes. Relative statistical weight of clusters of accessible microstates is viewed -- and often termed -- as thermodynamic probability, and high entropy corresponds to high probability when something is not otherwise constraining the system from going there. That is the message of Boltzmann's famous S = k ln W expression, W being a count of the number of ways energy and mass can be arranged at micro levels consistent with the overall macroscopic state, and k being in effect a constant that converts from the log of a count to an energy-related measure, in Joules per Kelvin in SI units . . . 1.38 * 10^-23 J/K. A log-of-ways metric is obviously rather close to a log-of-probability metric, i.e. an info metric. That is the line of thought developed elsewhere, though the formulation used is based on Gibbs' more general work, S = -k * SUM(p_i ln p_i), which very directly corresponds to the Shannon entropy metric, which is average info per symbol. In this case, the entropy is in effect measuring the average missing info to specify the microstate (the specific energy and mass distribution) under a given macrostate, which is often presented in terms of "degrees of freedom." Freedom to vary under a condition rather patently corresponds to degree of uncertainty. High entropy corresponds to high uncertainty about the specific microstate, and a tightly constrained state will be of low entropy. Cf. my longstanding discussion in my always linked through my handle, sect. A here, and app. 1 here. KF kairosfocus
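The reduction of the Gibbs form to Boltzmann's expression in the equiprobable case is easy to verify numerically. A minimal Python sketch (W is an arbitrary illustrative count of microstates):

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann(w):
    # S = k ln W, for W equiprobable microstates
    return K_B * math.log(w)

def gibbs(probs):
    # S = -k * sum(p_i ln p_i); reduces to k ln W when every p_i = 1/W
    return -K_B * sum(p * math.log(p) for p in probs)

W = 10**6
print(boltzmann(W))        # k ln W
print(gibbs([1 / W] * W))  # the same value: uniform case
print(gibbs([0.9] + [0.1 / (W - 1)] * (W - 1)))  # tightly constrained state: much lower entropy

The third line illustrates the closing point above: concentrating probability on one microstate (low uncertainty) drives the entropy well below k ln W.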
Speaking of Thaxton et al.
There is another way to view entropy. The entropy of a system is a measure of the probability of a given arrangement of mass and energy within it. A statistical thermodynamic approach can be used to further quantify the system entropy. High entropy corresponds to high probability (p. 116)
Mung
Will thermodynamic entropy ever be described in log base 2 Shannon units instead of natural log based Joules/Kelvin?
It's already been done. Thus my comment @35. Mung
KF asked: Have you ever read old books that talk in terms of Normals not Moles, for example? Electron volts, Angstroms, older radiation units such as Curies and Rads, etc? KF
HA! Anything beyond SI units causes panic for me. Even Thaxton was using (gasp) ergs! Electron volts are still used; they will probably always be used when dealing with electrons or subatomic particles... When taking aerospace engineering classes, so much of the aerospace culture was steeped in English units -- feet, lbs, foot-pounds, Fahrenheit, etc. I gagged having to deal with all the conversions. Some non-SI persistence is likely. In the USA, temperature will continue to be measured in Fahrenheit. As far as aviation goes, altitude will probably continue to be given in feet, not meters! Speed will continue to be measured in knots! HA! Will thermodynamic entropy ever be described in log base 2 Shannon units instead of natural-log-based Joules/Kelvin? :-) scordova
I've posted a follow-up discussion on a related topic. I've been ambivalent about the LCI (Law of Conservation of Information) based on some of the ideas I've restated in this new thread. It is an issue I raised with Bill Dembski on the old ISCID forum almost 10 years ago. The examples I put forward have still not been resolved in the ID community. Even supposing the LCI is true, the examples of increasing Rube Goldberg complexity in a thermodynamically closed system illustrate the difficulty (if not intractability) of applying the LCI in practice in certain situations. https://uncommondescent.com/computer-science/rube-goldberg-complexity-increase-in-thermodynamically-closed-systems/ I occasionally offer caution about using certain ID arguments which go beyond the simple. When I teach ID, I emphasize KISS (Keep It Simple, Soldier): thermodynamics, the LCI and information theory are very advanced topics -- tread with caution. As I said, I prefer basic probability and Humpty Dumpty illustrations. They get the point across forcefully. scordova
SalC: you have stated one convention -- as it happens, the one I prefer. There is another, which tends to be used by engineers, that uses "closed" much as we mean by "isolated." Remember, there is even another convention for work, such that the equation for the 1st law changes its sign. Thirty years ago, the older terminology was still common in textbooks etc. TMLO reflects that older convention -- those folks would likely have studied in the 1960's, when even the SI system was not firmly established. Have you ever read old books that talk in terms of Normals, not Moles, for example? Electron volts, Angstroms, older radiation units such as Curies and Rads, etc.? KF kairosfocus
Piotr:
Incidentally, there’s no way to circumvent thermodynamics even if an intelligence takes part in the process. Intelligent beings are not exempted from the laws of physics. Intelligence can’t simply “decrease entropy” just like that without paying the thermodynamic bill.
Read #44. Eric Anderson
From http://www.bluffton.edu/~bergerd/nsc_111/thermo2.html
Open systems: Open systems can exchange both matter and energy with an outside system. They are portions of larger systems and in intimate contact with the larger system. Your body is an open system.
Closed systems: Closed systems exchange energy but not matter with an outside system. Though they are typically portions of larger systems, they are not in complete contact. The Earth is essentially a closed system; it obtains lots of energy from the Sun but the exchange of matter with the outside is almost zero.
Isolated systems: Isolated systems can exchange neither energy nor matter with an outside system. While they may be portions of larger systems, they do not communicate with the outside in any way. The physical universe is an isolated system; a closed thermos bottle is essentially an isolated system (though its insulation is not perfect).
So the Earth is a closed system (approximately). The real question is whether it is at equilibrium, and the answer is "NO" -- not least because we have intelligent agents living on the Earth who can take the closed system's energy flow and convert it into the construction of new intelligently designed structures like cities and airplanes. Again, it is questionable that materialists should argue for open systems in the first place, since a system being closed does not in and of itself preclude change! Is it possible for such a closed system not at equilibrium to increase the amount of physical design even without a conscious intelligence? I'd say "yes" if it is front-loaded (a form of ID). I gave an example of intelligently designed, front-loaded evolution in a thermodynamically closed system here: Paradox in Calculating CSI numbers for 2000 coins. The analogy is extensible to robots making more integratively complex robots than themselves (this can be done; just try it with software to see why!). Is it possible for a non-front-loaded (not intelligently designed) closed system to spontaneously increase the amount of complex physical design? I'd say "NO," for reasons similar to the OOL problem. scordova
Piotr @50: Seriously? You know exactly what Sewell is saying. "In the absence of X" is a common type of phrasing in the English language. Let's address the real issues instead of nitpicking someone's phrase -- a phrase which anyone who isn't trying to be a hyperskeptic would easily understand. Eric Anderson
Mung wrote: Salvador:
When the term closed system is used in discussion of OOL, it means closed and near equilibrium.
That’s just silly. Eric, you’re too kind.
What I wrote was in reference to the section in Thaxton, Bradley and Olsen's book Mystery of Life's Origin starting at page 118. That book, along with Denton's, is a founding book of the modern ID movement. You're wrong, Mung. scordova
'Coincidentally,' TED just posted this video yesterday: David Chalmers: How do you explain consciousness? - video https://www.youtube.com/watch?v=uhRhtFFhNzQ "There are no coincidences in God's world." Anonymous bornagain77
Piotr, when you say that,,,
Intelligent beings are not exempted from the laws of physics.
I suppose what you REALLY want to say is that 'intelligence' is an emergent property of a material basis. But you are sorely vexed to prove that intelligence, and consciousness in particular, (even with a brain that is shown to have more switches than all the computers and routers and Internet connections on Earth), can arise from a material basis. You simply have no evidence that it is possible for matter to give rise to self-awareness (i.e. consciousness). It is called the 'hard problem' of consciousness.
The Hard Problem (Of Consciousness) - video http://www.youtube.com/watch?v=VRG1fA_DQ9s 'But the hard problem of consciousness is so hard that I can't even imagine what kind of empirical findings would satisfactorily solve it. In fact, I don't even know what kind of discovery would get us to first base, not to mention a home run.' David Barash - Materialist/Atheist Darwinian Psychologist David Chalmers on Consciousness - (Philosophical zombies and the hard problem of consciousness) - video https://www.youtube.com/watch?v=NK1Yo6VbRoo
But the Theist is not vexed in such a way as the materialist is in giving an explanation for the 'hard problem', because the Theist presupposes consciousness to be primary and matter to be derivative:
“No, I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness.” Max Planck (1858–1947), the originator of quantum theory, The Observer, London, January 25, 1931
And Planck has been empirically vindicated in spades by advances in quantum mechanics. Due to advances in quantum mechanics, the argument for God from consciousness can now be framed like this:
1. Consciousness either preceded all of material reality or is an 'epiphenomenon' of material reality. 2. If consciousness is an 'epiphenomenon' of material reality, then consciousness will be found to have no special position within material reality. Whereas conversely, if consciousness precedes material reality, then consciousness will be found to have a special position within material reality. 3. Consciousness is found to have a special, even central, position within material reality. 4. Therefore, consciousness is found to precede material reality. Four intersecting lines of experimental evidence from quantum mechanics that show that consciousness precedes material reality (Wigner's Quantum Symmetries, Wheeler's Delayed Choice, Leggett's Inequalities, Quantum Zeno effect): https://docs.google.com/document/d/1G_Fi50ljF5w_XyJHfmSIZsOcPFhgoAZ3PRc_ktY8cFo/edit Colossians 1:17 And he is before all things, and by him all things consist.
Of related interest to this discussion of entropy, and to 'resurrection' from death: it is very interesting to note that Special Relativity and General Relativity have two very different 'qualities of entropy' associated with them. In particular, Black Holes are found to be 'timeless' singularities of destruction and disorder rather than singularities of creation and order, such as the extreme (1 in 10^10^123) order we see at the creation event of the Big Bang.
Entropy of the Universe – Hugh Ross – May 2010 Excerpt: Egan and Lineweaver found that supermassive black holes are the largest contributor to the observable universe’s entropy. They showed that these supermassive black holes contribute about 30 times more entropy than what the previous research teams estimated. http://www.reasons.org/entropy-universe Roger Penrose – How Special Was The Big Bang? “But why was the big bang so precisely organized, whereas the big crunch (or the singularities in black holes) would be expected to be totally chaotic? It would appear that this question can be phrased in terms of the behaviour of the WEYL part of the space-time curvature at space-time singularities. What we appear to find is that there is a constraint WEYL = 0 (or something very like this) at initial space-time singularities-but not at final singularities-and this seems to be what confines the Creator’s choice to this very tiny region of phase space.” “Einstein’s equation predicts that, as the astronaut reaches the singularity (of the black-hole), the tidal forces grow infinitely strong, and their chaotic oscillations become infinitely rapid. The astronaut dies and the atoms which his body is made become infinitely and chaotically distorted and mixed-and then, at the moment when everything becomes infinite (the tidal strengths, the oscillation frequencies, the distortions, and the mixing), spacetime ceases to exist.” Kip S. Thorne – “Black Holes and Time Warps: Einstein’s Outrageous Legacy” pg. 476
Needless to say, the implications of this ‘eternity of destruction’ should be fairly disturbing for those of us who are of the ‘spiritually minded’ persuasion! In light of this dilemma that the two very different 'entropies' present to us spiritually minded people, and the fact that Gravity is, in so far as we can tell, completely incompatible with Quantum Mechanics,,,
A Capella Science – Bohemian Gravity! – video https://www.youtube.com/watch?v=2rjbtsX7twc
,,,in light of this dilemma, it is interesting to point out a subtle nuance on the Shroud of Turin. Namely that Gravity was overcome in the resurrection event of Christ:
Particle Radiation from the Body – July 2012 – M. Antonacci, A. C. Lind Excerpt: The Shroud’s frontal and dorsal body images are encoded with the same amount of intensity, independent of any pressure or weight from the body. The bottom part of the cloth (containing the dorsal image) would have born all the weight of the man’s supine body, yet the dorsal image is not encoded with a greater amount of intensity than the frontal image. Radiation coming from the body would not only explain this feature, but also the left/right and light/dark reversals found on the cloth’s frontal and dorsal body images. https://docs.google.com/document/d/19tGkwrdg6cu5mH-RmlKxHv5KPMOL49qEU8MLGL6ojHU/edit A Quantum Hologram of Christ’s Resurrection? by Chuck Missler Excerpt: “You can read the science of the Shroud, such as total lack of gravity, lack of entropy (without gravitational collapse), no time, no space—it conforms to no known law of physics.” The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. Dame Piczek created a one-fourth size sculpture of the man in the Shroud. When viewed from the side, it appears as if the man is suspended in mid air (see graphic, below), indicating that the image defies previously accepted science. The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. - pre khouse THE EVENT HORIZON (Space-Time Singularity) OF THE SHROUD OF TURIN. – Isabel Piczek – Particle Physicist Excerpt: We have stated before that the images on the Shroud firmly indicate the total absence of Gravity. Yet they also firmly indicate the presence of the Event Horizon. These two seemingly contradict each other and they necessitate the past presence of something more powerful than Gravity that had the capacity to solve the above paradox. http://shroud3d.com/findings/isabel-piczek-image-formation
Moreover, as would be expected if General Relativity and Quantum Mechanics-Special Relativity (QED) were truly unified in the resurrection of Christ from death, the image on the shroud is found to be formed by a quantum process. The image was not formed by a ‘classical’ process:
The absorbed energy in the Shroud body image formation appears as contributed by discrete values – Giovanni Fazio, Giuseppe Mandaglio – 2008 Excerpt: This result means that the optical density distribution,, can not be attributed at the absorbed energy described in the framework of the classical physics model. It is, in fact, necessary to hypothesize a absorption by discrete values of the energy where the ‘quantum’ is equal to the one necessary to yellow one fibril. http://cab.unime.it/journals/index.php/AAPP/article/view/C1A0802004/271 “It is not a continuum or spherical-front radiation that made the image, as visible or UV light. It is not the X-ray radiation that obeys the one over R squared law that we are so accustomed to in medicine. It is more unique. It is suggested that the image was formed when a high-energy particle struck the fiber and released radiation within the fiber at a speed greater that the local speed of light. Since the fiber acts as a light pipe, this energy moved out through the fiber until it encountered an optical discontinuity, then it slowed to the local speed of light and dispersed. The fact that the pixels don’t fluoresce suggests that the conversion to their now brittle dehydrated state occurred instantly and completely so no partial products remain to be activated by the ultraviolet light. This suggests a quantum event where a finite amount of energy transferred abruptly. The fact that there are images front and back suggests the radiating particles were released along the gravity vector. The radiation pressure may also help explain why the blood was “lifted cleanly” from the body as it transformed to a resurrected state.” Kevin Moran – optical engineer Scientists say Turin Shroud is supernatural – December 2011 Excerpt: After years of work trying to replicate the colouring on the shroud, a similar image has been created by the scientists. However, they only managed the effect by scorching equivalent linen material with high-intensity ultra violet lasers, undermining the arguments of other research, they say, which claims the Turin Shroud is a medieval hoax. Such technology, say researchers from the National Agency for New Technologies, Energy and Sustainable Economic Development (Enea), was far beyond the capability of medieval forgers, whom most experts have credited with making the famous relic. “The results show that a short and intense burst of UV directional radiation can colour a linen cloth so as to reproduce many of the peculiar characteristics of the body image on the Shroud of Turin,” they said. And in case there was any doubt about the preternatural degree of energy needed to make such distinct marks, the Enea report spells it out: “This degree of power cannot be reproduced by any normal UV source built to date.” http://www.independent.co.uk/news/science/scientists-say-turin-shroud-is-supernatural-6279512.html
I consider the preceding ‘quantum’ nuance on the Shroud of Turin to be a subtle, but powerful, evidence substantiating Christ’s primary claim as to being our Savior from sin, death, and hell:
John 8:23-24 But he continued, “You are from below; I am from above. You are of this world; I am not of this world. I told you that you would die in your sins; if you do not believe that I am he, you will indeed die in your sins. Matthew 10:28 “Do not fear those who kill the body but are unable to kill the soul; but rather fear Him who is able to destroy both soul and body in hell.
bornagain77
BA77: Nope, you can't outsmart thermodynamics. The formulation "to convert information into energy" is misleading if it makes you think that energy is somehow "created" out of nothing by non-material information. The experimenters used the measurement process itself to transport energy, and to gain an excess of free energy in comparison with the conventional limitations of the second law, but since information is not "bodiless" or cost-free, the total system violated no known laws. Here is a quotation from the original article by Masaki Sano and his team (Nature Physics 6, 2010, p. 990):
As the energy converted from information is compensated for by the demon's energy cost to manipulate information [2-4], the second law of thermodynamics is not violated when the total system including both the particle and demon is considered. In our system, the demon consists of macroscopic devices such as computers; the microscopic device gains energy at the expense of the energy consumption of a macroscopic device.
Piotr
Piotr: Have you ever seen something refined from the dust of the earth, then made into components, then assembled into an FSCO/I-rich entity? Observe, again, the vat thought exercise above. And no one anywhere said or implied that in our world work is got thermodynamically "free." Heat engines and energy conversion devices invariably carefully couple an energy source to a conversion system and by one means or another exhaust wastes. In the course of the wider process a heat reservoir will be involved -- start with good old Sol at 5700 K -- and at the other end heat will be dissipated to a sink, typically the ambient environment at about 300 K. Even a windmill is like that. The inflowing wind comes in the end from atmospheric convection cells, and those from solar radiation from Sol. The Betz limit -- at most 59.3% of the airflow's kinetic energy can be extracted -- is set by the need to exhaust fluid to keep the rotor turning. Steam turbines get around that by using condensation, but that is an obviously thermal process. If you have ever bathed in the hot spot created by a power plant's heat exhaust into a water body, you know where that is going. Heat engines, patently, lie behind Boeing's assembly lines, and the heat flows add to net cosmic entropy. Boeing's factories are open systems, but heat flows as such don't explain the 747; there is organised, design-based work involved. The basic problem is again shown by the vat challenge: getting FSCO/I for free from molecular motion faces the scale of config spaces and the isolation of zones of function. Notice the obvious difference between what molecular motion and planned work can do. As for the idea of scattered molecular etc. parts formerly part of a living organism coming back to form one again, the hypothetical raised by Sewell contrasts Time's arrow with the story commonly presented on OOL and OO body plans by alleged spontaneous processes. Just as with the computer, the life form from a single-celled one on up is FSCO/I-rich. We need to get to a metabolic automaton with gating, integrating a self-replication facility. Both aspects of a living organism overflow with the same sort of FSCO/I as in a computer. Including coded strings, execution machines and algorithms. So, with a rhetorical flourish, Sewell presents the challenge of reversing time's arrow in two paralleled cases, in intentionally repetitive phrasing. Phrasing backed by an outline of the underlying statistical reasoning that grounds the second law. Now too, there is another, philosophical side to your remark. Unless you know ahead of time that something like the creation of a cosmos and the organisation of planets populated by life by any means is impossible by intelligence, you have no proper right to assert:
there’s no way to circumvent thermodynamics even if an intelligence takes part in the process. Intelligent beings are not exempted from the laws of physics. Intelligence can’t simply “decrease entropy” just like that without paying the thermodynamic bill.
FYI, the Maxwell's demon model shows how an intelligence aware of the actual state at micro level could in principle extract work thermodynamically for free. The usual way around it is to look for separate processes that go with info acquisition and expression, leading to coupled compensating processes that yield higher net entropy. But that is based on an entity that is so constrained. If an entity -- say, a creative mind of a different order -- is not so constrained, then a different result would obtain. If such a mind is ontologically possible, then that is a candidate to explain the origin of a contingent, entropy-bound cosmos. Where, a serious philosophical possibility is that our cosmos is the product of a mind, a necessary being beyond it -- much as that fundy Bible-thumping nutter . . . NOT . . . Plato inferred, 2,350 years ago in The Laws Bk X. Based on the organisation of the cosmos, as we may see from the exchange between the Athenian Stranger and Clenias:
Ath. Then, by Heaven, we have discovered the source of this vain opinion of all those physical investigators; and I would have you examine their arguments with the utmost care, for their impiety is a very serious matter; they not only make a bad and mistaken use of argument, but they lead away the minds of others: that is my opinion of them. Cle. You are right; but I should like to know how this happens. Ath. I fear that the argument may seem singular. Cle. Do not hesitate, Stranger; I see that you are afraid of such a discussion carrying you beyond the limits of legislation. But if there be no other way of showing our agreement in the belief that there are Gods, of whom the law is said now to approve, let us take this way, my good sir. Ath. Then I suppose that I must repeat the singular argument of those who manufacture the soul according to their own impious notions; they affirm that which is the first cause of the generation and destruction of all things, to be not first, but last, and that which is last to be first, and hence they have fallen into error about the true nature of the Gods. Cle. Still I do not understand you. Ath. Nearly all of them, my friends, seem to be ignorant of the nature and power of the soul [[ = psuche], especially in what relates to her origin: they do not know that she is among the first of things, and before all bodies, and is the chief author of their changes and transpositions. And if this is true, and if the soul is older than the body, must not the things which are of the soul's kindred be of necessity prior to those which appertain to the body? Cle. Certainly. Ath. Then thought and attention and mind and art and law will be prior to that which is hard and soft and heavy and light; and the great and primitive works and actions will be works of art; they will be the first, and after them will come nature and works of nature, which however is a wrong term for men to apply to them; these will follow, and will be under the government of art and mind. Cle. But why is the word "nature" wrong? Ath. Because those who use the term mean to say that nature is the first creative power; but if the soul turn out to be the primeval element, and not fire or air, then in the truest sense and beyond other things the soul may be said to exist by nature; and this would be true if you proved that the soul is older than the body, but not otherwise. [[ . . . .] Ath. . . . when one thing changes another, and that another, of such will there be any primary changing element? How can a thing which is moved by another ever be the beginning of change? Impossible. But when the self-moved changes other, and that again other, and thus thousands upon tens of thousands of bodies are set in motion, must not the beginning of all this motion be the change of the self-moving principle? . . . . self-motion being the origin of all motions, and the first which arises among things at rest as well as among things in motion, is the eldest and mightiest principle of change, and that which is changed by another and yet moves other is second. [[ . . . .] Ath. If we were to see this power existing in any earthy, watery, or fiery substance, simple or compound-how should we describe it? Cle. You mean to ask whether we should call such a self-moving power life? Ath. I do. Cle. Certainly we should. Ath. And when we see soul in anything, must we not do the same-must we not admit that this is life? [[ . . . . ] Cle. 
You mean to say that the essence which is defined as the self-moved is the same with that which has the name soul? Ath. Yes; and if this is true, do we still maintain that there is anything wanting in the proof that the soul is the first origin and moving power of all that is, or has become, or will be, and their contraries, when she has been clearly shown to be the source of change and motion in all things? Cle. Certainly not; the soul as being the source of motion, has been most satisfactorily shown to be the oldest of all things. Ath. And is not that motion which is produced in another, by reason of another, but never has any self-moving power at all, being in truth the change of an inanimate body, to be reckoned second, or by any lower number which you may prefer? Cle. Exactly. Ath. Then we are right, and speak the most perfect and absolute truth, when we say that the soul is prior to the body, and that the body is second and comes afterwards, and is born to obey the soul, which is the ruler? [[ . . . . ] Ath. If, my friend, we say that the whole path and movement of heaven, and of all that is therein, is by nature akin to the movement and revolution and calculation of mind, and proceeds by kindred laws, then, as is plain, we must say that the best soul takes care of the world and guides it along the good path. [[Plato here explicitly sets up an inference to design (by a good soul) from the intelligible order of the cosmos.]
We ought not to beg big metaphysical questions in our physics. That is one reason why I learned to deeply respect my old Sears thermodynamics text, as it carefully pointed out that it is disputable whether the observed cosmos is an isolated system. KF kairosfocus
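As a footnote to the Sol-at-5700 K, sink-at-300 K picture in the comment above, the ideal ceiling on converting such a heat flow to work is the Carnot efficiency; a quick sketch using those figures (real conversion chains fall far short of this ideal):

def carnot_ceiling(t_hot, t_cold):
    # Ideal heat-engine efficiency limit, temperatures in kelvin: eta = 1 - Tc/Th
    return 1.0 - t_cold / t_hot

print(f"{carnot_ceiling(5700, 300):.1%}")  # ~94.7%

Even that generous ceiling only bounds how much work is available; it says nothing about how the work gets organised into a 747 or a cell.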
Piotr you state
No, intelligence doesn’t reverse death and subsequent decay, as far as I’m aware.,,, Intelligent beings are not exempted from the laws of physics.
Funny that every time you write a single sentence of functional information on this blog you are 'intelligently' doing something that the laws of physics are grossly incapable of doing, i.e. creating information. But more to the point that is being discussed, it is shown that,,,
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html Demonic device converts information to energy - 2010 Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform "It is CSI (Complex Specified Information) that enables Maxwell's demon to outsmart a thermodynamic system tending toward thermal equilibrium" William Dembski Intelligent Design, pg. 159
bornagain77
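The per-bit figure behind experiments like Sano's is, in the simplest (Szilard/Landauer) analysis, k*T*ln(2) joules per bit of information; actual experiments extract less than this ideal. A quick check at room temperature:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def work_per_bit(temp_kelvin):
    # Szilard/Landauer figure: at most k*T*ln(2) joules of work per bit
    return K_B * temp_kelvin * math.log(2)

print(f"{work_per_bit(300):.2e} J/bit")  # ~2.87e-21 J at 300 K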
KF:
he points out that death and subsequent decay are thermodynamically expected, and it would take intelligence to reverse this in light of the balance of possibilities.
No, intelligence doesn't reverse death and subsequent decay, as far as I'm aware. So why does Sewell say "in the absence of intelligence" when it would be more accurate to say "even with the help of intelligence"? I suppose he wants to insinuate that death and subsequent decay could be reversed if an intelligence were involved, i.e. that the resurrection of a living organism from "simple organic and inorganic compounds" is technically possible for an intelligent agent. Does Sewell suggest how this could be done? Incidentally, there's no way to circumvent thermodynamics even if an intelligence takes part in the process. Intelligent beings are not exempted from the laws of physics. Intelligence can't simply "decrease entropy" just like that without paying the thermodynamic bill. Piotr
Piotr: he points out that death and subsequent decay are thermodynamically expected, and that it would take intelligence to reverse this, in light of the balance of possibilities. Observe his context, again -- which, on your recent track record of equivocation-based side-tracking, is a serious challenge:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid [--> i.e. diffuses] is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.
KF kairosfocus
PPS: The B/W balls example is of course directly related to the spontaneous formation of text strings, as it is a 1-bit-per-ball case. If you will, let us relieve a constraint and pass over to a string of 20 coins, with B/W for H/T. Then we can look at the odds of getting to text. The FSCO/I threshold comes in at 500 - 1,000 such coins, to be conservative. We can set out a config space for 500 coins, simply lined up from BBBBB . . . to WWWWW . . . If we do so, we will see we are now looking at 3.27*10^150 possibilities. If we were to give each of the 10^57 atoms of our solar system a tray of 500 such coins, and to toss and examine somehow every 10^-14 s [about as fast as chem rxns get, and certainly generous for organic rxns], then toss for 10^17 s, we would explore only a small fraction of the space of possibilities. One comparable to picking at random a one-straw-sized sample from a cubical haystack 1,000 LY across. Such a sparse, blind sample would be maximally unlikely to pick up rare clusters of special sites -- needles, if you will. Or, to make this more vivid, superimpose the stack on our galactic neighbourhood, and then pick. By overwhelming odds, the straw-sized sample will reliably pick up straw and nothing else. Too much stack and too few needles. That is an excellent reason why FSCO/I is so hard to find spontaneously by blind search. And if you think in terms of, oh, we have a lucky search somehow: the search for a search is based on the fact that searches are samples of subsets of a space. So we see the S4S search being from a space of the scale of the power set of the 3.27*10^150 possibilities above. That is, 2^[3.27*10^150]. A blind search in such a space for a good enough search is far, far worse than a blind search in the original space. So, if you imagine that we spontaneously happen to be in a cosmos so configured that the physics has life written in by happenstance, you are looking at a lone-fly-on-the-wall-section-swatted-by-a-bullet problem. No matter if elsewhere there are carpeted sections of wall with flies solidly piled up, the local isolation is enough, as Leslie pointed out. That is, as the parameters of our physics give us reason to believe we are at a deeply isolated operating point that enables a life-friendly cosmos, we need to find a reasonable explanation for that. So far the only good one -- unpalatable though it obviously is for an establishment orthodoxy wedded to a priori materialism -- is design. kairosfocus
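The sparseness of the sample just described is simple arithmetic; a sketch using the figures given above (10^57 atoms, an observation every 10^-14 s, for 10^17 s, against 2^500 configurations):

atoms = 10**57          # atoms in the solar system (order of magnitude used above)
rate = 10**14           # coin-tray observations per second per atom
duration = 10**17       # seconds of observation
samples = atoms * rate * duration  # 10^88 blind samples

space = 2**500          # ~3.27 * 10^150 configurations of 500 coins
print(f"{samples / space:.2e}")    # ~3.06e-63, the fraction of the space sampled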
KF (@48), quoting Sewell:
and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.
Does he mean that in the presence of intelligence decaying corpses turn back into living animals? Piotr
PS: Note what Thaxton et al noted in the 1984 TMLO ch 7 before going on to Ch's 8 - 9 which brought the spontaneous formation of proteins and D/RNA into serious question on thermodynamics concerns:
While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The “evolution” from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? [--> notice, therefore a system opened up to energy flows] In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme, namely, the formation of protein and DNA from their precursors. It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered . . .
Their answer was that the reasonably expected concentration of such molecules forming would be far below one per planet. kairosfocus
GD (attn SalC): In regards to your 43 above [and SalC I note 8 & 10 above], kindly cf. 20 above. Also cf. this note of almost exactly a year ago based on my longstanding note linked through my handle. Let me clip a key comment by Sewell, which is quite apt but liable to be distorted or misunderstood given the grip the open system argument seems to have on thinking about the matter:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid [--> i.e. diffuses] is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.
We know from direct observation that design is possible, that organised work creating complex, functionally specific and otherwise utterly implausible objects is possible, and that the only empirically observed source of FSCO/I is design. Where, the search space challenge I recently summarised here [note the info-graphic] shows why that is credibly so. Given what Venter et al have been doing, and given the demonstration that molecular nanotech engineering is possible, it is quite plain that at minimum, if an advanced molecular nanotech lab were available, it could engineer OOL and the origin of body plans. Thermodynamics does not block that from having happened, any more than it blocks Venter et al from doing what they have been doing. What is really at stake, then, is not the capability of designers, but whether we are reasonably entitled to infer from the manifestation of FSCO/I to design as the most reasonable explanation of the origin of life and body plans. Arrayed against this has been a 150-year-old theory of evolution, and suggestions, speculations and models since the 1920's that build on Darwin's little note on possibilities for a warm little pond. Cell-based life is chock-full of FSCO/I, and body plans are likewise chock-full of even more. Just on genome length to make the relevant molecular bricks, we are looking at 100 - 1,000 kbits of genetic info for first life based on cells, and at 10 - 100+ mn bits each for major body plans. And the various mechanisms boil down to blind chance and mechanical necessity processes being held to incrementally account for the emergence of life and body plans through spontaneous processes. That is, or directly implies, a claim that FSCO/I can and does emerge by blind chance and mechanical necessity. But as my second link above -- note the info-graphic -- points out, there is a major search space challenge to all such claims. One backed up by the bald fact of observation, that the ONLY source of FSCO/I we have actually -- and routinely [think, posts in this thread] -- seen is design. When we look at the search space picture, we see that this is in fact close to the configuration/phase space type picture used in formulating the statistical undergirding of thermodynamics, and in particular the second law. The reason that there is a strong tendency for entropy to increase is that, on examining clusters of accessible microstates consistent with given large-scale conditions, there is an overwhelming statistical weight of states that move us towards higher disorganisation, etc. So, if for instance in a toy example of a string of black and white marbles, 10 each on two sides of a partition: || BBBBBBBBBB | WWWWWWWWWW || . . . we were to now allow interactions such that the balls can spontaneously jump sides -- that's an energy barrier to be surmounted -- and rearrange the patterns, we would easily see that there is one way to be 10B + 10W, but there is a much larger cluster of ways . . . peak . . . to be 5B5W on both sides, or near to that, 4B6W or 6B4W . . . near-peak . . . on the sides; based on the numbers of ways to arrange 10 things, 5 each identical, etc. The natural state of such a system is to tend towards the clusters with higher numbers of possibilities, as could be seen by considering a system spending about the same time in each of the possible states while circulating at random. This lends some insight into the notion of increasing disorder etc. 
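The statistical weights in this toy example can be counted directly with binomial coefficients; a minimal Python sketch, illustrative only:

from math import comb

# 10 black + 10 white balls, 10 slots per side of the partition.
# Ways to have k black balls on the left: C(10,k) placements on the left
# times C(10,10-k) = C(10,k) placements of the remaining blacks on the right.
for k in range(11):
    print(k, comb(10, k) ** 2)

# k = 10 (all B left, all W right): 1 way
# k = 5  (5B5W on both sides):      252**2 = 63504 ways, the peak
# Summed over all k this gives comb(20, 10) = 184756 arrangements in total.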
So, now, when we consider that for FSCO/I to form, particular components in particular orientations must be arranged in a particular nodes-arcs pattern and must be so arranged as to work to fulfill a specific function, we can see why such clusters of states will be rare in the space of possibilities of even a warm little pond. A cubic metre of water would have 10^18 1-micron cells, and the number of ways, say, 1 mn parts of that size could be distributed throughout is astonishingly large. For such parts to clump and arrange themselves into, say, a flyable micro-jet by blind forces of molecular agitation -- I deliberately chose a scale at which Brownian motion becomes relevant -- would be far beyond astronomically remote. In short, s_dispersed >> s_clumped >> s_functional. Shining sunlight on such a pond, or lighting a fire under it, or blowing breezes on it, or shaking it, or the like, would make but little difference to the basic search problem. Nor would decanting in, say, the parts of a micro-submarine etc. What would make a drastic difference would be to send in an army of nanobots with search and organisation programming. Such would find the parts, arrange them in order, then assemble them to build the micro-jet, much as is done at macro-scale by Boeing etc. That is, intelligent design. But, but, but, cell-based life reproduces and so can undergo Darwinian Evolution! First and foremost, not at the origin of life: there, the required von Neumann self-replicating facility is a big part of what is to be explained. And it is code-based, FSCO/I-rich and irreducibly complex. Yes, when such is coupled to a gated, encapsulated, metabolising automaton, it demonstrably can self-replicate, but that rather begs the first question: how we get there in the face of the spontaneous generation challenge posed by the configurational possibilities for scattered and clumped states. Recall, a flyable microjet is an OBSERVABLY different state at macro-level; the issue is to get to it. A suggestion often made is that there is a simple first replicator that then somehow incrementally tacks on the relevant bits and pieces and functions until we get to cell-based life. Too often, presented as practically certain fact. Problem is, first, this has never been observed; it is mere speculation backed up by lab-coat-clad evolutionary materialism. Second, as anyone who has built things of any complexity will tell you, complex function depends on the specific arrangement of well-matched parts. Given the space of possibilities, that is not to be presumed upon, and it is therefore maximally implausible that such a smooth incremental pathway exists. Those who propose such need to demonstrate it, if they wish to be taken seriously. We did not and cannot observe first life forming, so we need to be prudent and explain on known, observed, seen-to-be-adequate causal factors. Secondly, between the rarity of functional incremental improvements, the isolation of islands of function in the space of configs, the credible need to cross intervening zones of non-function, the pop genetics of fixing mutations etc., the burden of accumulating deleterious changes etc., there is no good reason to hold as practically certain fact that blind chance and mechanical necessity can and did account for body-plan-level biodiversity and beyond. All of this is connected to the underlying statistical reasoning as outlined. What then accounts for the confidence and firm position that science has given assurance that spontaneous OOL and OO body plans have happened? 
Locking out the only actually observed source of FSCO/I, deriding it as not scientific, an unwanted religious intrusion, god-of-the-gaps reasoning, etc. That is ideological imposition, not cogent inductive reasoning. And thermodynamic considerations are closely connected to what the principles underlying the second law have to say about the spontaneous formation of FSCO/I through blind chance and mechanical necessity. KF kairosfocus
Sal @42:
The only thing their “open” system is closed to is the possibility of design.
Well said, and priceless. I may steal that one. :) Eric Anderson
Gordon Davisson @43:
And that means that if your argument against evolution depends on claiming that the second law forbids such a decrease… you need to get a new argument.
Quite true. Fortunately none of us is making such an argument. :) There might be some kook out there who is making such an argument, but I am not aware of any prominent ID proponent doing so. What I do think sometimes happens is that some evolution critics make arguments with less-than-clear precision of terminology, which then allows their opponents to knock down a straw man. There are some sound and rational arguments, or at least doubt-inducing questions, about how alleged evolutionary mechanisms could produce the systems we see, given thermodynamic considerations. Unfortunately, these questions are often deflected with a broad-brush dismissal by people who think the issues are simply related to energy transfer into or out of some arbitrarily-defined "open system." Eric Anderson
Eric,
I understand what you’re saying about the translation aspect. But I’m not sure I would say that thermodynamics are irrelevant.
I accept your correction. The 2nd Law is irrelevant (has no determining role) in the establishment of specific effects from the translation of genetic information. Upright BiPed
UB @41:
Any system that translates the arrangement of an informational medium into a physical effect must as a matter of physical necessity – in order to function – preserve the natural discontinuity between the arrangement of that medium and its post translation effect. This means that the output of translation is not derived from the input via inexorable law; it is only derivable from the organization of the organic systems that translate information.
Exactly. Indeed, law-like processes are anathema to the creation of information-rich systems. Such natural processes on their own cannot, by definition, produce such systems.
The systems that create life and evolution do not violate the 2nd Law; the 2nd Law is irrelevant to those systems.
I understand what you're saying about the translation aspect. But I'm not sure I would say that thermodynamics are irrelevant. Thermodynamics always hold everywhere and do what they do, including driving toward equilibrium. As a result, systems have to be engineered in a way to temporarily maintain a far-from-equilibrium state.

It is kind of like gravity -- when we build an airplane that can lift thousands of feet off of the surface of the Earth, it is not that gravity has become irrelevant, rather that we have used engineering principles and aerodynamics to temporarily counterbalance the effect of gravity. Gravity is still very much operable. Indeed, it is taken into account in the engineering of the airplane.

In a similar way, any designed system needs to take into account thermodynamics. A far-from-equilibrium system is still beholden to thermodynamics, but if properly engineered the system can maintain that far-from-equilibrium state for a long time. There are many far-from-equilibrium systems in living organisms, including the information processing systems you mentioned. Thermodynamics, for the most part, are working against such systems. So it is only through carefully coordinated and specified systems and engineering principles that the ordinary effects of thermodynamics are temporarily countered or held at bay in order to create and maintain life.

I haven't fully parsed arguments like those put forward by Sewell, but from what I gather, the above is really the point he is trying to make, even if expressed in a way that might confuse some people. Unfortunately, many critics misunderstand the point entirely and then make silly statements about "Earth is an open system" and so forth that don't even address the issues in question.

You are right, of course, that living systems don't violate the 2nd law. Nothing does. But they do have to contend with it and, often, have to implement ingenious ways to temporarily counteract its otherwise inexorable march toward lifeless equilibrium. Eric Anderson
Let me try to go back to Barry Arrington's original question, and answer it as directly as possible:

* The second law of thermodynamics applies (as far as we know) to every real physical system.
* The isolated-system formulation of the second law does not apply directly to any real physical system.

The parts in bold make all the difference. Let me take them in reverse order:

Applying the isolated-system formulation indirectly: Even though there are no truly isolated systems, we can consider what would happen if a real process/system/etc were to be embedded in an isolated system, and work out what the consequences would be. For example, suppose someone claims to have invented a gadget that transfers heat from a warmer object to a cooler object, and has no other effects (no waste heat, no batteries running down, etc). If we were to take this system, attach it to two large rocks (or other heat reservoirs) at appropriate temperatures, and isolate those (the gadget and the two rocks), we can calculate that the entropy of that isolated system would decrease. Thus, the existence of such a gadget would imply that violations of this formulation of the second law are possible -- even if it isn't directly violating it right at the moment -- and hence that the second law is wrong. Now, we know (or at least have really good evidence) that the second law isn't wrong, so such a gadget is impossible and may be said to indirectly violate this formulation of the second law.

Other formulations of the second law: There are many different formulations of the second law, with widely varying applicability. Here are a few examples:

* The entropy of an isolated system cannot decrease.
* The Helmholtz free energy (defined as a system's internal energy minus its temperature times its entropy) of a system at constant temperature that only interacts with its surroundings by exchanging heat (at that constant temp) cannot increase. (Note that it's entirely normal for such a system's entropy to decrease.)
* The Gibbs free energy (defined as its Helmholtz free energy plus its pressure times its volume) of a system at constant temperature and pressure that only interacts with its surroundings by exchanging heat (at that constant temp) and/or expanding/contracting cannot increase. (Note that it's entirely normal for such a system's entropy to decrease and/or its Helmholtz free energy to decrease.)

...hmm, not particularly relevant to Earth or evolution, are they? There's another, far more general, formulation that'll allow us to apply it much more directly:

* In (almost) any system, the entropy change will be greater than or equal to its entropy flux.

Essentially, if you think of entropy as a "thing" like water (it isn't, but it mostly acts like one), you can state the second law as: "entropy can be produced, and can move around, but cannot be destroyed". The only way a system's entropy can decrease is if there's more entropy leaving it than entering (and in that case, the decrease is limited to the difference between efflux and influx). Note that this doesn't say that a system's entropy will decrease if there's more entropy leaving than entering, just that the second law doesn't forbid it. In order to know if it'll actually decrease, you'd have to know how fast it's being produced inside the system; and the second law doesn't say anything about that.
Ok, so let's try applying this to the Earth: As I calculated in this previous comment (see near the end), the entropy flux from sunlight reaching Earth is about 3.83e13 J/K per second, and the entropy flux of the thermal radiation leaving Earth is at least 3.7e14 J/K per second. A number of people have pointed out that adding energy to a system (e.g. the sunlight reaching Earth) actually increases the system's entropy. This is correct. But you also have to take into account that the energy leaving the system decreases its entropy, and in the case of Earth there's at least 10 times as much entropy leaving as entering.

Again, this doesn't mean that the Earth's entropy will actually decrease, or that the decrease (if any) will involve evolution, or that evolution is possible. It just means that the second law doesn't forbid entropy decreases on Earth. And that means that if your argument against evolution depends on claiming that the second law forbids such a decrease... you need to get a new argument. Gordon Davisson
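For readers who want to check the orders of magnitude, here is a minimal back-of-the-envelope sketch. The inputs are standard textbook values (solar constant, albedo 0.3, blackbody effective temperatures) assumed for illustration, not Gordon's exact figures, but they land in the same ballpark:

```python
import math

S0      = 1361.0        # solar constant, W/m^2
R_earth = 6.371e6       # Earth radius, m
albedo  = 0.3           # fraction of sunlight reflected unused
T_sun   = 5778.0        # effective temperature of solar radiation, K
T_earth = 255.0         # Earth's effective radiating temperature, K

P = S0 * math.pi * R_earth**2 * (1 - albedo)   # absorbed solar power, W

# The entropy flux of blackbody radiation is (4/3) * power / temperature
flux_in  = (4.0 / 3.0) * P / T_sun     # carried in by sunlight
flux_out = (4.0 / 3.0) * P / T_earth   # carried out by thermal radiation

print(f"Entropy in : {flux_in:.2e} J/K per second")    # ~2.8e13
print(f"Entropy out: {flux_out:.2e} J/K per second")   # ~6.4e14
print(f"Ratio out/in: {flux_out / flux_in:.0f}x")      # ~23x
```

The ratio is just T_sun/T_earth, which is why Earth can export far more entropy than it imports regardless of the exact power figure used.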
My point is that “open system” seems to be a very amorphous concept.
It is correct to say the universe on the whole is a closed system, but the last resort is to say it is not yet at final equilibrium (aka it is still changing and evolving in some places), so in that sense certain pockets can be modeled as "open". Once a system is truly closed and changeless, it has stopped evolving, and if there is no life in the system, there never will be. This has been universally acknowledged even by materialists. Their last hope is some sort of "open" system (aka one that can still change), or an open number of multiverses. The only thing their "open" system is closed to is the possibility of design. scordova
Hi Eric, There is a fundamental issue that our critics habitually overlook: the complexity of life is the direct result of translated information.

Any system that translates the arrangement of an informational medium into a physical effect must as a matter of physical necessity - in order to function - preserve the natural discontinuity between the arrangement of that medium and its post-translation effect. This means that the output of translation is not derived from the input via inexorable law; it is only derivable from the organization of the organic systems that translate information.

In other words, the very thing (i.e. the translation of information) that organizes the living cell, and allows heritable evolution, requires a local independence from inexorable law. This includes the 2nd Law of Thermodynamics. The systems that create life and evolution do not violate the 2nd Law; the 2nd Law is irrelevant to those systems. Upright BiPed
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Your assertion is confused. What entropy are you talking about? What probability are you talking about? Until you specify these parameters, your statement is only valid for special cases and not true in general.

Hint: thermodynamic entropy can be expressed as a Shannon entropy, but that doesn't mean all possible Shannon entropies can be expressed as thermodynamic entropies. A good example is the Shannon entropy associated with the information content of DNA -- it has NOTHING to do with thermodynamic entropy. You're totally misunderstanding the relation of Shannon entropy to thermodynamic entropy, and thus your statement, except in special cases, is meaningless.
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
What probability argument are you talking about: the probability of OOL, of something being alive, or of finding a given thermodynamic energy microstate? They are not the same thing, and without specifying what you mean, your statement is meaningless at best and wrong at worst. scordova
Eric, Thanks for your comments. I suspected that the whole thing was a rhetorical game. As the OP, I hope, demonstrates, there is no limiting principle to the assertion. And when there is no limiting principle, watch out for mischief. Sal, Thanks for your comments as well, but you’ve misunderstood the point of the OP. I am not qualified to address the “entropy is/is not an obstacle to blind watchmaker evolution” question, and the OP does not take a position on the issue. My point is that “open system” seems to be a very amorphous concept. Barry Arrington
Salvador:
It is pretty much agreed, a closed system near equilibrium won’t yield new life.
So? Mung
Salvador:
When the term closed system is used in discussion of OOL, it means closed and near equilibrium.
That's just silly. Eric, you're too kind. Mung
Sal @32: You raise an interesting point. Let me see if I can parse it a bit.
When the term closed system is used in discussion of OOL, it means closed and near equilibrium.
I'm not sure that is what most people mean. The evolutionary retort from most people, certainly those folks Sewell has been arguing with, does not really have anything to do with equilibrium. Indeed, the Earth is most definitely not an at-equilibrium system, even without the Sun. There are lots of options for energy fluctuations within the system (volcanic vents, radioactive decay, hydrothermal vents, etc.). Eventually, in perhaps a few billion years, the Earth might reach equilibrium (ignoring the Sun). But that has never been the case on Earth and is not assumed for purposes of OOL.

Of course we could just redefine our "system" to be a particular pool of water or a particular mud globule, sans the nearby energy, and then proclaim that our tiny system is an "open system" because it is receiving energy from the nearby volcanic vent or the geyser or the lightning strike. So, yes, there has to be energy to perform the work, but whether that energy is already within the system or is coming into the system from outside is purely a rhetorical definitional game.

The only reason the people arguing with Sewell, for example, bring up the "Earth is an open system" refrain is not because they understand some nuance about systems at equilibrium, but because they completely misunderstand Sewell's argument about the non-thermal requirements to get life off the ground.
Thus the openness of the system is important, but not specifically to reduce entropy, but to allow an external organizing influence. But being open is only a necessary condition, it is not a sufficient condition.
I understand what you're saying, and you make a good point about the organizing influence (in the sense of specified-complexity organization). But again the open-v-closed terminology is a definitional game, nothing more. That is why we must keep our eye on the ball. The open/closed terminology is an utter red herring. Any time we hear someone talking about an open-v-closed system as though it teaches us something about how life could arise, we can immediately know -- as a matter of principle -- that they are on the wrong track. It is best to avoid the red herring altogether.

When I press people on OOL I sometimes find it is helpful to grant them all the energy they want, in any form they want, for as long as they want, just so they don't get hung up on the silly open/closed game, and then pose the question: "You've got all the energy you want. Now what? How does life arise?" When we do that we quickly see, as you say, that energy is necessary but not sufficient. Not even close to sufficient. Not even in the ballpark. Not even addressing the primary issues that have to be resolved. Eric Anderson
Salvador, there is no difference between the entropy argument and the information argument and the probability argument. Mung
Joe:
Adding heat to a frying pan helps cook my pancakes, eggs and home-fries.
And the pancakes, eggs and home-fries helps keep you alive. Ergo, adding heat is all that's required for OOL. Mung
Barry:
...every particular place in the universe receives energy of some kind from some other place. And if that is so, it seems the materialists have painted themselves into a corner in which they must, to remain logically consistent, assert that entropy applies everywhere but no place in particular, which is absurd.
Sort of like gravity then? Gravity applies everywhere but nowhere in particular, which is absurd. Mung
When the term closed system is used in discussion of OOL, it means closed and near equilibrium. If we put a frog in a blender and then pour the contents into a sealed jar, then after its remains achieve equilibrium -- when supposedly even all the microbes feasting on its remains are dead -- this closed system will presumably not spontaneously yield new sustainable life, since it is in a state of changeless equilibrium. Hence the last hope is to make the system open and not at equilibrium.

The universe may be said to be closed, but there are pockets that are not at equilibrium, thus parts can be somewhat modeled as open. Thus the openness of the system is important, but not specifically to reduce entropy; rather, to allow an external organizing influence. But being open is only a necessary condition, it is not a sufficient condition. It is pretty much agreed, a closed system near equilibrium won't yield new life. scordova
Joe @30: I like it! Eric Anderson
Eric- The Earth's design allows it to harness energy and release it as lightning. In turn the lightning makes nitrates out of the abundant nitrogen in the atmosphere which then rains down as fertilizer for the plants to utilize. Talk about design details... Joe
revelator, Henry Crun, Joe: Yep, the whole issue has nothing to do with whether a system is open or closed. It has to do with whether there is an ability to harness the available energy to do useful, ongoing work. Doesn't make one whit of difference whether the energy is already in the system or whether it is coming into the system. Eric Anderson
If someone asked, "if not the 2nd law, then what law would you use to argue ID?" I've said, "the law of large numbers". See: Fundamental Law of ID and The Paradox of Almost Definite Knowledge in the face of Maximum Uncertainty. And regarding design identification: To recognize design is to recognize products of a like-minded process.

These ideas are simple and straightforward. Even kids can develop an intuition about them. Next is the Humpty Dumpty/Dead Dog principle: Do Dead Dogs Stay Dead Dogs? See the follow-on: Vernal Equinox sees Outbreak of DDS.

The 2nd Law and Boltzmann/Clausius entropy are non-trivial, full of math, and of marginal (occasionally damaging) relevance to ID. I've advocated arguing basic probability (starting with coin/homochirality illustrations). Then argue Humpty Dumpty illustrations, etc. Simple and succinct is better than incomprehensible and verbose. High-powered math doesn't necessarily make a more convincing case than simple math and Humpty Dumpty illustrations. KISS (Keep It Simple, Soldier!) scordova
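As a quick illustration of the law-of-large-numbers intuition invoked above, here is a minimal simulation sketch (the sample sizes are arbitrary choices): the observed fraction of heads locks onto 0.5 as the number of fair flips grows, which is why a large specified deviation, like 500 heads in a row, is a reliable sign that something other than chance is at work.

```python
import random

# Law of large numbers: the fraction of heads in n fair coin flips
# converges to 0.5 as n grows.
random.seed(1)  # fixed seed so the run is repeatable

for n in (10, 1_000, 100_000, 1_000_000):
    heads = sum(random.randint(0, 1) for _ in range(n))
    print(f"n = {n:>9,}: fraction of heads = {heads / n:.5f}")
```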
Adding heat to a frying pan helps cook my pancakes, eggs and home-fries. Adding energy to a fridge helps it cool. IOW if you take a closed system, ie an isolated freezer/ fridge, and add energy to it the thing starts working. And if you take a person, put him/ her in a closed system, ie a fully isolated room, that contains a lighter and flammable materials, I bet it would be quite easy for that person to add heat to that closed system. Just sayin'... Joe
Revelator @ 24, Damn - and there are all these people using incubators who don't realise they don't work. And by the way, if you can add heat to the system, it's not closed. Henry Crun
BTW, the Coursera "Emergence of Life" course from the University of Illinois starts today. If I can stay on top of it in between work/vacation the next 2 months, I'll let you know if this "open system" business comes up in the class. Eric Anderson
Adding "energy" (heat) to a "closed-system" usually destroys. It never creates. Set a library on fire and see how much more information you get with that added energy. (Zilch.) The heat of the sun DESTROYS and PREVENTS any opportunity for OOL, unless directed by an intelligent entity. See here for more information regarding thermodynamics. revelator
Piotr @17:
Did anyone say that the origin of life or its evolution was explained by the second law of thermodynamics? Who said so? Where? But it’s those on the ID/creationist side who argue again and again that life would somehow violate the 2LOT if it were not for an intelligent designer’s interventions, because the processes that support life require entropy to decrease, which is allegedly impossible, or because the 2LOT prohibits the rise of “order out of chaos”, or… (insert more examples of how thermodynamics and the concept of entropy are misunderstood).
Piotr, you are correct that evolutionists bringing up the "open system" business is typically in response to someone arguing that evolution (speaking broadly) would violate the 2nd law. *However,* the actual debate situation is more nuanced than that. Specifically:

1. When this "open system" stuff is brought up, it betrays a profound misunderstanding of what is required to get life going or new systems to come into play. The whole business about "entropy can decrease here because it is increasing somewhere else in the universe" is nonsense, probably generally, but certainly in the context of thinking about, say, OOL.

2. There are many systems in biology that are in fact thermodynamically unstable and that cannot or would not arise on their own if you were to stick the constituents in a tube and let them mix. Saying that the thermodynamic situation in that location can somehow be changed because of a change in the thermodynamics of a system in some other location betrays a deep misunderstanding of the issues. This is most certainly not a creationist talking point. There are a number of papers from committed evolutionists that seek to understand how it is possible for such far-from-equilibrium systems to come about naturally. (Nick even sent us down a rabbit hole a while back with one set of researchers who came to the wholly laughable conclusion that such systems, contrary to appearances and all calculations, must be thermodynamically preferred after all because, hey, such systems exist.) The issue of far-from-equilibrium systems is a perfectly legitimate issue that far too many evolutionists just do not appreciate.

3. Perhaps most relevant for this specific thread, the terms "thermodynamics" and "entropy" are not reserved just for thermal calculations, whatever their initial history. Dr. Sewell is probably the most well known on these pages for making an argument in this regard. Personally, I am not sure whether Dr. Sewell's argument is convincing (I still need to spend more time with it), but I do know that Dr. Sewell did not invent the idea of thermodynamics/entropy being applied to information and/or the organizational aspect of systems. There is an interesting and valuable discussion to be had on this last point. Sal loves to proclaim himself a lone wolf and a voice crying in the wilderness, but the situation is much more nuanced than that.

Whatever Dr. Sewell's use of terminology (which, again, he did not make up), the responses he has gotten from evolutionists have been sadly instructive. Typically: (i) they completely misunderstand the point that there are certain engineering and informational problems that can be addressed in much the same way that we think about thermodynamic systems, (ii) they assume he is talking about simple heat/energy flow, then (iii) they throw out the old "Earth is an open system" nonsense (based, I might add, on evolutionist talking points they read somewhere) as though it were some kind of an explanation, when not only is it (A) irrelevant to Dr. Sewell's point about information, but also (B) unhelpful in explaining the classic heat/energy situation found in many living systems. Then they go off patting themselves on the back and laughing, "Gee, those creationists are so dumb. They don't realize the Earth is an open system and gets energy from the Sun!" When, sadly, they have only demonstrated themselves to be utterly clueless about the issues being discussed.
----- I personally would be interested in a more detailed discussion of thermodynamic principles as they relate to living systems in #2 above and as they relate to information as in #3 above. Unfortunately, there is a great deal of baggage surrounding the specific terminology, which makes it hard to proceed in a rational fashion with any large group. Eric Anderson
PPS: See for yourself, here. kairosfocus
PS: Of course heat engines etc. COUPLE energy from a source to produce work -- organised or orderly forced motion -- and exhaust waste heat etc. Something like a hurricane is a spontaneously formed heat engine, but something like a jet engine is based on FSCO/I and is maximally implausible as the spontaneous product of something like a tornado passing through a junkyard. The machinery and coded stored info and info processing in living cells dwarf the jet engine in terms of FSCO/I. kairosfocus
F/N: A few points:

1 --> While I prefer the isolated > closed > open system terminology physicists prefer, it seems engineers often use a different terminology in which closed systems mean isolated ones in the physics sense (no boundary-crossing energy or matter).

2 --> That the observed universe is isolated is a debatable point, as Sears et al pointed out in their classic textbook many years ago.

3 --> If you look at the inner structure of Clausius' formulation of the 2nd law, you will see he considers energy-interchanging sub-systems in his analysis: A at Th --> d'Q --> B at Tc. From this, the rise in S for B overbalances the loss for A. Where dS >/= d'Q/T, as Th > Tc.

4 --> Thence, we see that if a subsystem absorbs energy, it tends to increase entropy. That is, the number of ways at micro level that mass and energy may be arranged consistent with macro-level thermodynamic parameters such as P, V, T etc. defining the macro-level state, rises. (A very small amount of matter in human-scale terms easily has in it 10^18 or more molecule-scale particles ~ 10^-10 - 10^-9 or so m across. A cube 1 mm across -- 10^-3 cc -- has in it [10^6]^3 = 10^18 1 nm cubes. That's a tiny speck like a grain of sand.)

5 --> Switching to micro level, and a statistical/informational view . . . cf. Jaynes et al down to Harry S Robertson etc . . . the entropy of a body can be seen as a measure of the average missing information to specify its microstate, given the macrostate information. This is already in log terms and implies a probability metric. It also gives additivity and bridges to information theory, explaining the average-info-per-symbol metric.

6 --> We see emerging a phase space on position and momentum of particles, with degrees of freedom linked to the number of particles and the ways energy can be distributed across different forms of motion or potential energy. Thus, temperature is a measure of the average energy per relevant degree of freedom per particle.

7 --> Knocking off the motion part, as this may not be particularly relevant for purposes in hand, we arrive at configuration spaces, with huge numbers of possibilities.

8 --> We are very close now to the FSCO/I issue, of isolated islands of function in very large config spaces, where for a space with W possibilities, a search is a selection from its power set, of cardinality 2^W.

9 --> That's where the search-for-search exponentiation of challenges comes in. For 500 bits, there are 2^500 possible configs, about 3.27 * 10^150. The space of possible searches is 2^[3.27 * 10^150] . . . a number that would exhaust the observable universe's atomic resources and still could not be fully written out as a decimal. (See the short calculation following this comment.)

10 --> Now, the search challenge on strings is WLOG, as arbitrary configs can be represented by coded strings describing a nodes-arcs pattern. That's what AutoCAD etc. do.

11 --> This then leads to the real problem brought out by the logic of the second law. Sewell aptly summarises it as: the mere opening up of a system will not make a vastly improbable configuration of matter and energy become more likely, unless something specific is happening that makes it not unlikely. (That is, organisation can be injected, and can even be arranged in a von Neumann self-replicator so that it will propagate from generation to generation.)
12 --> Where, given the search challenge, it is maximally implausible that blind chance plus mechanical necessity will or could reasonably produce the FSCO/I found in life forms within the resources of our observable cosmos. But FSCO/I is a routine product of design, to the point where FSCO/I -- functionally specific, complex organisation and associated information -- is an empirically highly reliable sign of design.

__________

The problem with this is not its logic; it is that it cuts across the a priori materialist ideology that dominates most schools of thought on origins of life and of body plans. KF kairosfocus
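For concreteness, a minimal sketch of the arithmetic behind points 4 and 9 above, using Python as a calculator:

```python
import math

# Point 4: a 1 mm cube contains (10^6)^3 = 10^18 cubes of 1 nm on a side
print(f"1 nm cells in a 1 mm cube: {(10**6)**3:.1e}")   # 1.0e+18

# Point 9: the configuration space for 500 bits
W = 2**500
print(f"2^500 = {float(W):.2e}")                        # ~3.27e150

# The space of possible searches has cardinality 2^W; counting its
# decimal digits shows why it could never be written out in full
digits = float(W) * math.log10(2)
print(f"2^(2^500) has about 10^{math.log10(digits):.0f} decimal digits")
# ~10^150 digits, versus only ~10^80 atoms in the observable universe
```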
it’s those on the ID/creationist side who argue again and again that life would somehow violate the 2LOT if it were not for an intelligent designer’s interventions
Agreed, and I'm in the dissenting minority of ID/creationists who says to avoid 2LOT arguments regarding OOL. For the reader's benefit, I've suggested to first pick which definition of the 2nd law and entropy one is working from, and then proceed with an analysis. Let's start with the conceptually easiest form for entropy -- Shannon-Dembski entropy for a system of fair coins.
Q. What is the Shannon-Dembski entropy for a symbolic system consisting of 1 fair coin? A. 1 bit
Q. What is the Shannon-Dembski entropy for a symbolic system consisting of 1 billion fair coins? A. 1 billion bits, or 1 gigabit
Q. in light of the above, does a system of high CSI require more Shannon-Dembski entropy or less? A. MORE!
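For concreteness, a minimal sketch of the coin arithmetic above (independent fair flips are assumed, and the helper function name is my own):

```python
import math

def shannon_entropy_bits(p):
    """Shannon entropy, in bits, of one binary symbol with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0   # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(shannon_entropy_bits(0.5))           # 1.0 bit for one fair coin
print(10**9 * shannon_entropy_bits(0.5))   # 1e9 bits for a billion fair coins
print(shannon_entropy_bits(0.9))           # ~0.47: a biased coin carries less
```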
Now how about traditional thermodynamic entropy (Boltzmann, Clausius, Kelvin, Planck, etc.)?
Q. Under standard textbook methods for calculating entropy as would be expected of students of physics, chemistry, and mechanical engineering: does a living adult human have more or less entropy than a dead cell? A. MORE! by a factor of 100 TRILLION. In general, more of the same kind of particles imply more entropy.
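The "more particles, more entropy" rule of thumb behind that answer can be sketched with the Sackur-Tetrode equation for a monoatomic ideal gas. Helium at room temperature and pressure is an illustrative assumption here, not anything specific to the comparison above; the point is only that entropy scales linearly with particle count at fixed per-particle conditions:

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(N, V, T, m):
    """Entropy (J/K) of N particles of a monoatomic ideal gas of mass m
    in volume V at temperature T (Sackur-Tetrode equation)."""
    lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength
    return N * k * (math.log(V / (N * lam**3)) + 2.5)

m_he = 4.0026 * 1.66054e-27   # mass of a helium atom, kg
T = 300.0                     # K
n_density = 2.446e25          # particles per m^3 at 300 K and 1 atm (p/kT)

for N in (1e20, 1e21, 1e22):
    V = N / n_density          # fix the number density, scale volume with N
    print(f"N = {N:.0e}:  S = {sackur_tetrode(N, V, T, m_he):.3e} J/K")
# S grows in direct proportion to N: ten times the particles,
# ten times the thermodynamic entropy
```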
In light of this, I've wondered:

1. Why ID/creationists have such an anti-entropy bias, since more entropy is not necessarily a bad thing in the creation of life. God had to infuse our bodies with entropy. Otherwise we'd be dead!

2. Why materialists like Bill Nye even bother appealing to open systems, since more entropy is not necessarily bad, and the amount of entropy is not just about the structure of the solar system but involves basic considerations like how the system boundaries are drawn. scordova
scordova:
Most definitely, universal entropy cannot increase without decrease in entropy in some locations (just like the hot bricks example).
It's the other way round. Entropy can't decrease locally unless thermal energy is exported to the environment (which serves as a heat reservoir), increasing overall entropy. As long as there are energy transfers in the Universe, some of the flowing energy can be (but doesn't have to be) diverted from the thermal pathway to do some work. Piotr
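Piotr's bookkeeping in numbers, as a minimal sketch; the 100 J and the two temperatures are arbitrary illustrative choices:

```python
# A system at 300 K exports 100 J of heat to a 280 K reservoir. The
# system's entropy drops, the reservoir's rises by more, and the total
# change is positive, as the second law requires.
Q        = 100.0   # heat exported, J (illustrative value)
T_system = 300.0   # K
T_env    = 280.0   # K

dS_system = -Q / T_system   # -0.333 J/K: a genuine local decrease
dS_env    = +Q / T_env      # +0.357 J/K
print(f"system: {dS_system:+.3f} J/K, environment: {dS_env:+.3f} J/K, "
      f"total: {dS_system + dS_env:+.3f} J/K")   # total is positive
```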
Eric Anderson: Did anyone say that the origin of life or its evolution was explained by the second law of thermodynamics? Who said so? Where? But it's those on the ID/creationist side who argue again and again that life would somehow violate the 2LOT if it were not for an intelligent designer's interventions, because the processes that support life require entropy to decrease, which is allegedly impossible, or because the 2LOT prohibits the rise of "order out of chaos", or... (insert more examples of how thermodynamics and the concept of entropy are misunderstood). Piotr
Barry, you are not missing anything. The old "Earth is an open system" idea to support evolution is utter bunk.

First, as you hinted at, in thermodynamics the "system" under consideration is whatever we want it to be. It is an arbitrary construct that simply helps us think through various thermodynamic exchanges. So if some genius (cough) asserts that the Earth is an open system because it receives energy from the Sun and, therefore, OOL is possible/likely, we can simply say: "Great. Let's consider the Earth and Sun together as a closed system. Now explain to me again, how is this naturalistic OOL possible/likely?" The entire open/closed system nonsense is an utter, complete red herring. It is a rhetorical shell game, nothing more.

Second, it makes not one whit of difference whether we are dealing with an open or closed system. The question is how those energy inputs could possibly lead to the formation of living systems. Simply having energy inputs does not help in that regard, and may well hurt in many cases. Eric Anderson
supplemental note on 'pouring raw energy' onto the Earth: Visible light is incredibly fine-tuned for life to exist. Though visible light is only a tiny fraction of the total electromagnetic spectrum coming from the sun, it happens to be the "most permitted" portion of the sun's spectrum allowed to filter through our atmosphere. All the other bands of electromagnetic radiation, directly surrounding visible light, happen to be harmful to organic molecules, and are almost completely absorbed by the atmosphere. The tiny amount of harmful UV radiation, which is not visible light, allowed to filter through the atmosphere is needed to keep various populations of single-cell bacteria from over-populating the world (Ross; reasons.org). The size of light's wavelengths and the constraints on the size allowable for the protein molecules of organic life also seem to be tailor-made for each other. This "tailor-made fit" allows photosynthesis, the miracle of sight, and many other things that are necessary for human life. These specific frequencies of light (that enable plants to manufacture food and astronomers to observe the cosmos) represent less than 1 trillionth of a trillionth (10^-24) of the universe's entire range of electromagnetic emissions. (Gonzalez; Privileged Planet, Denton; Nature's Destiny).
Extreme Fine Tuning of Light for Life and Scientific Discovery - video http://www.metacafe.com/w/7715887 Fine Tuning Of Universal Constants, Particularly Light - Walter Bradley - video http://www.metacafe.com/watch/4491552 Fine Tuning Of Light to the Atmosphere, to Biological Life, and to Water - graphs http://docs.google.com/Doc?docid=0AYmaSrBPNEmGZGM4ejY3d3pfMTljaGh4MmdnOQ Michael Denton: Remarkable Coincidences in Photosynthesis - podcast http://www.idthefuture.com/2012/09/michael_denton_remarkable_coin.html
bornagain77
Open systems obey the second law when energy flows across the system boundary are accounted for. A good example is a car or truck, considered as a thermal system. Fuel flows across the boundary into a fuel tank or battery. Exhaust byproducts and heat flow out of the system. Kinetic energy is produced, with efficiency losses governed by the second law. The entropy (unavailable energy) of the universe increases each time the vehicle is operated. jabo
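To make jabo's vehicle example concrete, here is a minimal sketch treating the engine as a crude heat engine; the power, temperatures, and efficiency are illustrative guesses, not measured values:

```python
# Entropy bookkeeping for a car treated as a heat engine running between
# hot combustion gases and the ambient air.
P_fuel = 100e3     # chemical power released by burning fuel, W (assumed)
T_hot  = 1500.0    # combustion-gas temperature, K (assumed)
T_cold = 300.0     # ambient temperature, K
eta    = 0.25      # realistic efficiency; the Carnot limit is 1 - Tc/Th = 0.8

P_work  = eta * P_fuel            # useful kinetic power, 25 kW
P_waste = P_fuel - P_work         # heat rejected to the surroundings, 75 kW

# Net entropy production rate: entropy dumped at T_cold minus entropy
# drawn from the hot source
dS_per_s = P_waste / T_cold - P_fuel / T_hot
print(f"Entropy of the universe rises by {dS_per_s:.0f} J/K each second")  # ~183
```

The production rate is positive exactly because the actual efficiency sits below the Carnot bound, which is jabo's point about second-law losses.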
Further comments on ATP synthase:
ATP: The Perfect Energy Currency for the Cell - Jerry Bergman, Ph.D. Excerpt: In manufacturing terms, the ATP (Synthase) molecule is a machine with a level of organization on the order of a research microscope or a standard television (Darnell, Lodish, and Baltimore, 1996). http://www.trueorigin.org/atp.asp ATP Synthase, an Energy-Generating Rotary Motor Engine - Jonathan M. May 15, 2013 Excerpt: ATP synthase has been described as "a splendid molecular machine," and "one of the most beautiful" of "all enzymes" .,, "bona fide rotary dynamo machine",,, If such a unique and brilliantly engineered nanomachine bears such a strong resemblance to the engineering of manmade hydroelectric generators, and yet so impressively outperforms the best human technology in terms of speed and efficiency, one is led unsurprisingly to the conclusion that such a machine itself is best explained by intelligent design. http://www.evolutionnews.org/2013/05/atp_synthase_an_1072101.html Miniature Molecular Power Plant: ATP Synthase - January 2013 - video http://www.youtube.com/watch?v=XI8m6o0gXDY Thermodynamic efficiency and mechanochemical coupling of F1-ATPase - 2011 Excerpt: F1-ATPase is a nanosized biological energy transducer working as part of FoF1-ATP synthase. Its rotary machinery transduces energy between chemical free energy and mechanical work and plays a central role in the cellular energy transduction by synthesizing most ATP in virtually all organisms.,, Our results suggested a 100% free-energy transduction efficiency and a tight mechanochemical coupling of F1-ATPase. http://www.pnas.org/content/early/2011/10/12/1106787108.short?rss=1 See also: Davies et al., “Macromolecular organization of ATP synthase and complex I in whole mitochondria,” Proceedings of the National Academy of Sciences and: Tamás Beke-Somfai, Per Lincoln, and Bengt Nordén, “Double-lock ratchet mechanism revealing the role of [alpha]SER-344 in F0F1 ATP synthase,” Proceedings of the National Academy of Sciences
There is a profound 'chicken and egg' dilemma with ATP surrounding the Origin Of Life problem for evolutionists:
Evolutionist Has Another Honest Moment as “Thorny Questions Remain” - Cornelius Hunter - July 2012 Excerpt: It's a chicken and egg question. Scientists are in disagreement over what came first -- replication, or metabolism. But there is a third part to the equation -- and that is energy. … You need enzymes to make ATP and you need ATP to make enzymes. The question is: where did energy come from before either of these two things existed? http://darwins-god.blogspot.com/2012/07/evolutionist-has-another-honest-moment.html
It is also interesting to note that entropy is the primary reason why our physical bodies grow old and die,,,
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220 *3 new mutations every time a cell divides in your body * Average cell of 15 year old has up to 6000 mutations *Average cell of 60 year old has 40,000 mutations Reproductive cells are 'designed' so that, early on in development, they are 'set aside' and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations still we find that,,, *60-175 mutations are passed on to each new generation. per John Sanford Phd, - Geneticist- author of 'Genetic Entropy and The Mystery of The Genome' (inventor of the 'Gene Gun')
This following video brings the point personally home to us about the effects of genetic entropy:
Ageing Process - 85 years in 40 seconds - video http://www.youtube.com/watch?v=A91Fwf_sMhk
Verse and Music:
Romans 8:18-21 I consider that our present sufferings are not worth comparing with the glory that will be revealed in us. The creation waits in eager expectation for the sons of God to be revealed. For the creation was subjected to frustration, not by its own choice, but by the will of the one who subjected it, in hope that the creation itself will be liberated from its bondage to decay and brought into the glorious freedom of the children of God. Evanescence – The Other Side (Lyric Video) http://www.vevo.com/watch/evanescence/the-other-side-lyric-video/USWV41200024?source=instantsearch
Supplemental note:
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics.
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than entropy is? And seeing as to how entropy is VERY foundational to reality, I think the implications of all this are fairly obvious, i.e.
"For the creation was subjected to frustration, not by its own choice, but by the will of the one who subjected it,"
bornagain77
Here is a video based on Granville Sewell's 2013 Biocomplexity paper
Evolution and Entropy - Granville Sewell - video http://www.youtube.com/watch?v=pMHzFoOcdFA
It should also be noted that just pouring raw energy into a system (as with the sun pouring energy onto the earth) actually increases the disorder of the system,,,
Thermodynamic Arguments for Creation - Thomas Kindell (46:39 minute mark) - video https://www.youtube.com/watch?v=I1yto0-z2bQ&feature=player_detailpage#t=2799
,,, and that the raw energy coming from the sun must be harnessed in a very precise way in order for it to be useful for life and not detrimental to it.
Scientists unlock some key secrets of photosynthesis - July 2, 2012 Excerpt: "The photosynthetic system of plants is nature's most elaborate nanoscale biological machine," said Lakshmi. "It converts light energy at unrivaled efficiency of more than 95 percent compared to 10 to 15 percent in the current man-made solar technologies.,, "Photosystem II is the engine of life," Lakshmi said. "It performs one of the most energetically demanding reactions known to mankind, splitting water, with remarkable ease and efficiency.",,, "Water is a very stable molecule and it takes four photons of light to split water," she said. "This is a challenge for chemists and physicists around the world (to imitate) as the four-photon reaction has very stringent requirements." http://phys.org/news/2012-07-scientists-key-secrets-photosynthesis.html
Moreover, 'non-local', beyond space and time, quantum mechanical principles are utilized to accomplish the initial steps of photosynthesis. At the 21:00 minute mark of the following video, Dr Suarez explains why photosynthesis needs a 'non-local', beyond space and time, cause to explain its effect:
Nonlocality of Photosynthesis - Antoine Suarez - video - 2012 http://www.youtube.com/watch?v=dhMrrmlTXl4&feature=player_detailpage#t=1268s
Also of note:
Quantum Mechanics Explains Efficiency of Photosynthesis - Jan. 9, 2014 Excerpt: Previous experiments suggest that energy is transferred in a wave-like manner, exploiting quantum phenomena, but crucially, a non-classical explanation could not be conclusively proved as the phenomena identified could equally be described using classical physics.,,, Now, a team at UCL have attempted to identify features in these biological systems which can only be predicted by quantum physics, and for which no classical analogues exist. ,,,said Alexandra Olaya-Castro (UCL Physics & Astronomy), supervisor and co-author of the research. "We found that the properties of some of the chromophore vibrations that assist energy transfer during photosynthesis can never be described with classical laws, and moreover, this non-classical behaviour enhances the efficiency of the energy transfer.",,, Other biomolecular processes such as the transfer of electrons within macromolecules (like in reaction centres in photosynthetic systems), the structural change of a chromophore upon absorption of photons (like in vision processes) or the recognition of a molecule by another (as in olfaction processes), are influenced by specific vibrational motions. The results of this research therefore suggest that a closer examination of the vibrational dynamics involved in these processes could provide other biological prototypes exploiting truly non-classical phenomena,, http://www.sciencedaily.com/releases/2014/01/140109092008.htm
Photosynthesis and ATP synthase are very intensive, integrated, processes:
The 10 Step Glycolysis Pathway In ATP Production: An Overview - video http://www.youtube.com/watch?v=8Kn6BVGqKd8
At the 14:00 minute mark of the following video, Chris Ashcraft, PhD – molecular biology, gives us an overview of the Citric Acid Cycle, which is, after the 10 step Glycolysis Pathway, also involved in ATP production:
Evolution vs ATP Synthase – Chris Ashcraft - video - citric acid cycle at 14:00 minute mark https://www.youtube.com/watch?feature=player_detailpage&v=rUV4CSs0HzI#t=746 The Citric Acid Cycle: An Overview - video http://www.youtube.com/watch?v=F6vQKrRjQcQ
bornagain77
As to 'entropy applying everywhere', first it is good to note how broad entropy is in its explanatory power for physical events in the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/ Evolution is a Fact, Just Like Gravity is a Fact! UhOh! - January 2010 Excerpt: The results of this paper suggest gravity arises as an entropic force, once space and time themselves have emerged. https://uncommondescent.com/intelligent-design/evolution-is-a-fact-just-like-gravity-is-a-fact-uhoh/
In fact Black Holes are found to be the greatest contributors to the entropy of the universe:
Entropy of the Universe – Hugh Ross – May 2010 Excerpt: Egan and Lineweaver found that supermassive black holes are the largest contributor to the observable universe’s entropy. They showed that these supermassive black holes contribute about 30 times more entropy than what the previous research teams estimated. http://www.reasons.org/entropy-universe Roger Penrose – How Special Was The Big Bang? “But why was the big bang so precisely organized, whereas the big crunch (or the singularities in black holes) would be expected to be totally chaotic? It would appear that this question can be phrased in terms of the behaviour of the WEYL part of the space-time curvature at space-time singularities. What we appear to find is that there is a constraint WEYL = 0 (or something very like this) at initial space-time singularities-but not at final singularities-and this seems to be what confines the Creator’s choice to this very tiny region of phase space." "Einstein's equation predicts that, as the astronaut reaches the singularity (of the black-hole), the tidal forces grow infinitely strong, and their chaotic oscillations become infinitely rapid. The astronaut dies and the atoms which his body is made become infinitely and chaotically distorted and mixed-and then, at the moment when everything becomes infinite (the tidal strengths, the oscillation frequencies, the distortions, and the mixing), spacetime ceases to exist." Kip S. Thorne - "Black Holes and Time Warps: Einstein's Outrageous Legacy" pg. 476
In fact it was in large measure by working from entropic concerns of black-holes that Penrose was able to deduce the initial entropy of the universe:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: "The time-asymmetry is fundamentally connected to with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the "source" of the Second Law (Entropy)." http://www.pul.it/irafs/CD%20IRAFS%2702/texts/Penrose.pdf Roger Penrose discusses initial entropy of the universe. - video http://www.youtube.com/watch?v=WhGdVMBk6Zo "This now tells us how precise the Creator's aim must have been: namely to an accuracy of one part in 10^10^123." (from the Emperor’s New Mind, Penrose, pp 339-345 - 1989) "The 'accuracy of the Creator's aim' would have had to be in 10^10^123" Hawking, S. and Penrose, R., The Nature of Space and Time, Princeton, Princeton University Press (1996), 34, 35.
This number is gargantuan and blows all the other constants, in terms of fine-tuning, out of the water. If this number were written out in its entirety, 1 with 10^123 zeros to the right, it could not be written on a piece of paper the size of the entire visible universe, even if a number were written down on each sub-atomic particle in the entire universe, since the universe only has 10^80 sub-atomic particles in it! Besides disorder, entropy is also related to thermodynamic equilibrium. In fact, a 'flat universe', which is actually another very surprising finely-tuned 'coincidence' of the universe, means this universe, left to its own present course of accelerating expansion due to Dark Energy, will continue to expand forever, thus fulfilling the thermodynamic equilibrium of the second law to its fullest extent (entropic 'Heat Death' of the universe).
The Future of the Universe Excerpt: After all the black holes have evaporated, (and after all the ordinary matter made of protons has disintegrated, if protons are unstable), the universe will be nearly empty. Photons, neutrinos, electrons and positrons will fly from place to place, hardly ever encountering each other. It will be cold, and dark, and there is no known process which will ever change things. --- Not a happy ending. http://spiff.rit.edu/classes/phys240/lectures/future/future.html Big Rip Excerpt: The Big Rip is a cosmological hypothesis first published in 2003, about the ultimate fate of the universe, in which the matter of universe, from stars and galaxies to atoms and subatomic particles, are progressively torn apart by the expansion of the universe at a certain time in the future. Theoretically, the scale factor of the universe becomes infinite at a finite time in the future. http://en.wikipedia.org/wiki/Big_Rip "We have the sober scientific certainty that the heavens and earth shall ‘wax old as doth a garment’.... Dark indeed would be the prospects of the human race if unilluminated by that light which reveals ‘new heavens and a new earth.’" Lord Kelvin (originator of the second law) Psalm 102:25-27 Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.
Personally, although I'm not qualified in the mathematics to know for sure, it seems very ironic to me that entropy would be a measure of both the extraordinary disorder of a black hole and of the apparent order that would be represented by the thermodynamic equilibrium inherent in the heat death of the universe. (Note that the initial measure of the entropy of the universe is extreme order, the second measure of the entropy is extreme disorder, i.e. black holes, and the final measure of entropy is order once again, i.e. thermodynamic equilibrium, heat death. Strange!) And, again although I'm not qualified to know for sure, it seems that the tension between Sal and Dr. Sheldon and Dr. Sewell (in which Dr. Sewell felt personally attacked by Sal because of the way Sal presented his opinion) may have to do with this apparent schizophrenia in the measure of order and disorder in the entropy of the universe.,,,
Physicist Rob Sheldon offers some thoughts on Sal Cordova vs. Granville Sewell on 2nd Law Thermo - July 2012 Excerpt: The Equivalence: Boltzmann’s famous equation (and engraved on his tombstone) S = k ln W, merely is an exchange rate conversion. If W is lira, and S is dollars, then k ln() is the conversion of the one to the other, which is empirically determined. Boltzmann’s constant “k” is a semi-empirical conversion number that made Gibbs “stat mech” definition work with the earlier “thermo” definition of Lord Kelvin and co. Despite this being something as simple as a conversion factor, you must realize how important it was to connect these two. When Einstein connected mass to energy with E = (c2) m, we can now talk about mass-energy conservation, atom bombs and baby universes, whereas before Einstein they were totally different quantities. Likewise, by connecting the two things, thermodynamics and statistical mechanics, then the hard rules derived from thermo can now be applied to statistics of counting permutations. This is where Granville derives the potency of his argument, since a living organism certainly shows unusual permutations of the atoms, and thus has stat mech entropy that via Boltzmann, must obey the 2nd law. If life violates this, then it must not be lawfully possible for evolution to happen (without an input of work or information.) The one remaining problem, is how to calculate it precisely (how to calculate the entropy precisely). note: (And because it is extremely difficult to calculate entropy precisely for living cells, this is exactly where Darwinists try to claim evolution does not violate the second law. Yet regardless of the games Darwinists play because of this lack of mathematical precision, for all intents and purposes as far as we can ascertain, for evolution to occur would indeed violate the 'iron clad' second law of thermodynamics!) https://uncommondescent.com/intelligent-design/physicist-rob-sheldon-offers-some-thoughts-on-sal-cordova-vs-granville-sewell-on-2nd-law-thermo/ Why Tornados Running Backward do not Violate the Second Law - Granville Sewell - May 2012 - article with video Excerpt: So, how does the spontaneous rearrangement of matter on a rocky, barren, planet into human brains and spaceships and jet airplanes and nuclear power plants and libraries full of science texts and novels, and supercomputers running partial differential equation solving software , represent a less obvious or less spectacular violation of the second law—or at least of the fundamental natural principle behind this law—than tornados turning rubble into houses and cars? Can anyone even imagine a more spectacular violation? https://uncommondescent.com/intelligent-design/why-tornados-running-backward-do-not-violate-the-second-law/
bornagain77
As an addendum: Consider the total entropy of a closed system consisting of a cold brick and a hot brick in contact with each other, where each brick is an "open" system allowing heat flow out of each brick. The total entropy of both bricks combined will increase as the temperature of the cold brick gets higher and the temperature of the hot brick gets lower. The entropy of the hot brick actually gets LOWER as the total entropy goes up, because it is cooling off! The entropy of such bricks is calculated in 4. Entropy changes in the "hot brick problem": http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node41.html

Thus, even though the 2nd law is applicable to the universe, and even though universal entropy is always increasing, it is inevitable that the entropy of some objects will decrease, for the simple reason that they are getting colder, and this will most definitely be true when stars start to burn out. A lot of this is counterintuitive, but the numbers will bear this out. Most definitely, universal entropy cannot increase without a decrease in entropy in some locations (just like the hot bricks example).

And again, to repeat, if the Earth had been a frozen low-entropy ice ball we'd all be dead. Even materialists should be arguing that the increase of the Earth's entropy from a low-entropy state would actually be something desirable. The Earth is an open system, but we wouldn't want the entropy of the Earth to escape and go to zero; we'd want the sun giving us entropy as we dump entropy out into cold space. Amazingly, materialists and Darwinists don't give this more direct analysis, and even fumble with basic physics as well. It's not a matter of lowering entropy, it's a matter of having just the right amounts! To paraphrase Goldilocks, "the porridge doesn't have too much entropy, not too little, just right". The hot bricks problem should be studied carefully to really grasp the issues. Entropy is not an easy concept. That's why I far prefer basic probability arguments for OOL. scordova
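A minimal sketch of the hot brick calculation, using dS = m c ln(Tf/Ti); the masses, heat capacity, and starting temperatures are illustrative values rather than the MIT note's exact numbers:

```python
import math

# Two identical bricks, one hot and one cold, equilibrate by heat flow.
m, c   = 1.0, 900.0     # kg and J/(kg*K), roughly brick-like (assumed)
T_hot  = 400.0          # K
T_cold = 200.0          # K
T_f    = (T_hot + T_cold) / 2   # final common temperature for equal bricks

dS_hot  = m * c * math.log(T_f / T_hot)    # negative: the hot brick's entropy FALLS
dS_cold = m * c * math.log(T_f / T_cold)   # positive, and larger in magnitude

print(f"hot brick : {dS_hot:+.1f} J/K")    # ~ -259 J/K
print(f"cold brick: {dS_cold:+.1f} J/K")   # ~ +365 J/K
print(f"total     : {dS_hot + dS_cold:+.1f} J/K (second law: > 0)")
```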
The 2nd law of thermodynamics allows the entropy of a system to decrease if the system radiates heat (waste energy) to the environment. Actually, the Solar System is not isolated: it is surrounded by cold outer space into which heat is radiated. If instead of receiving energy from the Sun -- a small spot in the sky -- we were surrounded by energy arriving uniformly from every direction, all of it would only heat up the Earth, and no fraction of it could be diverted for any useful purpose. Luckily, thermal energy can escape into the night sky, so the large temperature difference between the Sun and the Earth is maintained in the long run. Living things get rid of heat, which allows them to decrease their entropy without violating the second law. Piotr
As I understand their argument, entropy is not an obstacle to blind watchmaker evolution, because entropy applies absolutely only in a “closed system,” and the earth is not a closed system because it receives electromagnetic radiation from space.
The 2nd law of Thermodynamics is not an obstacle to mindless OOL any more than Newton's 2nd law of motion is. This was stated by a recognized pioneer of Intelligent Design, Walter Bradley:
“Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life.” Walter Bradley, Thermodynamics and the Origin of Life
But supposing for the sake of argument the 2nd law were a barrier to the origin of life in a closed system, the universe being a closed system does not preclude pockets of low entropy, especially in very cold locations. For example, when the universe burns out and the Earth is near absolute zero, its entropy will be correspondingly low. If I put a living rat in a bath of liquid helium (near absolute zero), I'll be removing most of its thermodynamic entropy. It will have far less entropy than a living warm rat. Textbook thermodynamics says removing entropy can be lethal! Furthermore, for proteins to have stability, they must maximize their entropy by distributing their energy. The barrier to OOL is NOT reducing entropy, but having just the right fine-tuned amounts, just like fine-tuned temperatures. This subtlety seems lost on both sides of the debate. I'm appalled materialists (like Bill Nye the science guy) didn't understand this either!

I've encouraged people who really want to delve into the 2nd law to review my analysis in the link below. I only made one substantial error, in that I didn't specify a monoatomic inert gas like xenon but mistakenly used water; the basic ideas, however, are correct: Entropy Examples connecting Clausius, Boltzmann, Dembski. From the math, one will note that in order to have an increase of CSI one needs to increase entropy, not reduce it! And in the case of thermodynamics, a dying person in the cold needs to have his entropy raised (by having his temperature raised) so that he can live. His entropy must be raised, not lowered!

There are many barriers to OOL; the 2nd law, imho, isn't one of them. Others disagree, but the way I stated it is the way I'd expect chemistry, physics and engineering students who actually do entropy calculations in their homework and exams to analyze these questions. When I related the Shannon, Dembski, Boltzmann and Clausius notions of entropy, I thought I demonstrated that entropy must necessarily increase for CSI to increase. Others are welcome to disagree, but those are my computations, and I put them on the table for others to refute if my calculations are materially wrong (save my error of not using monoatomic elements, but even then, that is correctable: just use the appropriate constants for the material under consideration).

The question for OOL is not lowering entropy, but having the right amounts. In that sense, if universal entropy is increasing, then there will be a finite window of time in which life can emerge. That is a real challenge, but not as insurmountable as, say, evolving the DNA triplet code from a random soup of chemicals, or evolving homochirality. I criticize OOL, but not because of the 2nd law. I prefer to argue rote probability, just like 500 fair coins all heads. It's clearer and unassailable. scordova
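The "500 fair coins" figure preferred above is easy to reproduce; a minimal sketch:

```python
# The chance that 500 independent fair coins all land heads
p_all_heads = 2.0 ** -500
print(f"P(500 heads) = {p_all_heads:.2e}")       # ~3.05e-151

# Compare: the number of equally likely 500-coin configurations (~3.3e150)
# dwarfs the ~10^80 atoms usually quoted for the observable universe
print(f"configurations: {float(2**500):.2e}")
```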
Entropy is defined by mathematical equations. The local reduction in entropy is mathematically modeled by the theory of heat engines. The physicists are not at all confused by this. But sure, when you oversimplify, that can lead to confusion. Neil Rickert
And what explains the Sun? The Sun must, sensibly, be more organised (less entropic) than the energy it blasts at the earth... How did it get that way, when entropy works only in the other direction? Where did the "free energy" come from that organised the stars? Not only into the massive sources of energy they are, but into the intricate and enormous (hundreds of millions of light-years) structures in our universe? And how did the universe achieve this large-scale organisation and energy concentration (i.e. enormous negative entropy) when natural processes all work in the other direction? By definition, the universe is a closed system. The energy to organise it cannot naturally come from within, nor can it come from without.

As for this energy that is constantly being blasted at the earth by the Sun, the word we use for a massive injection of uncontrolled energy is "explosion". Such events, in all our actual experience (you know, that primitive superstition known as observation), are without exception destructive of complex, delicate organisation. They do not produce more complex organisations from less; they reduce complex organisations to less complex arrangements. Life is the most delicate, fragile, complex arrangement of matter of which we are aware. But explosions, big bangs of uncontrolled energy, seem to be the answer to everything. Instead of the answer the atheist disfavors - "God did it" - is an answer he favors - "Explosions did it". Excuse me if I find this not particularly consistent, let alone convincing... ScuzzaMan
Natural evolution does not defy the 2nd law because the sun pumps a huge amount of energy into the system.
Or so you've been told. The same people who told you that would also insist that setting a cup of tea out into the sun and coming back in 5 minutes to find that all the tea had gone back into the teabag would not violate the 2nd law, for the same reason. cantor
Cantor, the operative phrase is "comes close". Natural evolution does not defy the 2nd law because the sun pumps a huge amount of energy into the system. Acartia_bogart
the entire solar system comes close to being an isolated system
It's not isolated from cosmic rays cantor
the earth is not a closed system because it receives electromagnetic radiation from space
... and cosmic rays cantor
"As I understand their argument, entropy is not an obstacle to blind watchmaker evolution, because entropy applies absolutely only in a “closed system,” and the earth is not a closed system because it receives electromagnetic radiation from space." It is more accurate to say an "isolated system" rather than a closed system. The earth cannot be seen as an isolated system because of that fusion reactor that is only 8 light minutes away. But the entire solar system comes close to being an isolated system. Acartia_bogart
