Fluctuation theorem has been listed as a level-5 vital article in Science, Physics. If you can improve it, please do. This article has been rated as C-Class.
WikiProject Physics (Rated C-class, High-importance)


I tagged this page "too technical" because, even though I don't have a physics degree, it should still be able to inform me as to why Loschmidt's paradox (which I could understand) isn't a problem. ~~ N (t/c) 04:21, 6 November 2005 (UTC)
I hardly understand anything either. How is entropy defined in this article? What does an average over entropy production mean, given that entropy is already defined by summing over the whole distribution of states? It seems that this article somehow associates an entropy with every microstate, or something like that. ThorinMuglindir 19:54, 6 November 2005 (UTC)
The list of assumptions required to prove the Fluctuation Theorem is very interesting (Part of the "Summary" section toward the end of the article). There was one thing that was unclear to me. Quoting the article, "In regard to the [assumption of time reversal symmetry], all the equations of motion for either classical or quantum dynamics are in fact time reversible." If I remember my physics correctly, to reverse the trajectory of a charged particle in a magnetic field, not just its velocity but also its charge must be reversed. Does that mean that a collection of charged particles in an external magnetic field will not obey the Fluctuation Theorem (and by extension the Second Law of Thermodynamics) unless they are capable of charge reversal? Or, is the assumption of time reversal symmetry somehow independent of (lack of) charge reversal? Compbiowes 00:38, 4 October 2006 (UTC)
About the magnetic field
Well, I think if you treat the magnetic field as one generated by current flow or moving charges, then time reversal will also imply reversing the direction of current flow, and thus reversing the magnetic field too.
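A one-equation way to see this (my sketch of standard textbook physics, not taken from the article): the Lorentz-force equation of motion

```latex
m\,\ddot{\mathbf{x}} = q\left(\mathbf{E} + \dot{\mathbf{x}} \times \mathbf{B}\right)
```

is invariant under $t \to -t$ (so $\dot{\mathbf{x}} \to -\dot{\mathbf{x}}$ while $\ddot{\mathbf{x}}$ is unchanged) provided $\mathbf{B} \to -\mathbf{B}$, and that sign flip happens automatically when the currents sourcing $\mathbf{B}$ are themselves time-reversed. So no charge reversal is needed if the field sources are included in the time-reversed system; only an external field held fixed by fiat breaks the symmetry.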
I would think it's just equal to deltaS (change in entropy) divided by deltat (the time interval), but I'd like some confirmation from an expert. If this is true, then it might help make the article slightly more accessible if this were mentioned somewhere. This would also mean that the fluctuation theorem could be restated in terms of changes in entropy, so that for any given time interval, the ratio between (probability that entropy change is +deltaS) and (probability that entropy change is -deltaS) would be e^(deltaS). Hypnosifl 18:04, 20 October 2006 (UTC)
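The ratio stated above can be checked in a toy model. The sketch below is my own illustration, not from the article: a Gaussian distribution for the entropy change deltaS with variance equal to twice its mean satisfies the fluctuation-theorem ratio P(+A)/P(-A) = e^A exactly, for any mean.

```python
import math

# Toy model (assumption, not from the article): deltaS is Gaussian with
# mean mu and variance 2*mu. Then p(+a)/p(-a) = exp(a) for every a,
# which is the fluctuation-theorem ratio discussed above.

def gaussian_pdf(x, mu, var):
    """Density of a normal distribution with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def ft_ratio(a, mu):
    """Ratio P(deltaS = +a) / P(deltaS = -a) in the Gaussian toy model."""
    var = 2.0 * mu
    return gaussian_pdf(a, mu, var) / gaussian_pdf(-a, mu, var)

# The ratio equals exp(a) independently of mu:
for mu in (0.5, 1.0, 3.0):
    for a in (0.2, 1.0, 2.5):
        assert abs(ft_ratio(a, mu) - math.exp(a)) < 1e-9
```

The algebra behind the check: the normalizations cancel, and the exponents combine to 4*mu*a / (2*var) = a when var = 2*mu.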
Hi. I've drawn this graphic. I'd like to know your comments about the idea it describes, thank you very much.
Faustnh (talk) 00:07, 29 March 2009 (UTC)
(Also posted at Entropy and information talk page). Faustnh (talk) 18:14, 29 March 2009 (UTC)
I was trying to get background on another article, http://www.popflock.com/learn?s=Law_of_Maximum_Entropy_Production for which I continue to solicit input, but is this FT the same as Crooks' or Gallavotti-Cohen's? Nerdseeksblonde (talk) 12:39, 26 September 2009 (UTC)
The article states that biological machines such as molecular motors occasionally can run in reverse mode. A reader (like me) might then think that backstepping and a forward step run in reverse are the same. However, I fear this simple view is wrong: it is not at all clear that a motor produces an ATP during a backward step. This is probably true for F1-ATPase, but not for kinesin. It would be good to have a clarifying sentence on this in the article.
Some literature:
Nishiyama et al., Nat. Cell Biol. 2002, "Chemomechanical coupling of the forward and backward steps of single kinesin molecules"
Taniguchi et al., Nat. Chem. Biol. 2005, "Entropy rectifies the Brownian steps of kinesin"
Carter et al., Nature 2005, "Mechanics of the kinesin step"
I have seen many statistical mechanical definitions of entropy which are not all equivalent, or have subtle distinctions in meaning. All definitions are obviously designed to respect thermodynamics in the thermodynamic limit, however they significantly differ for microscopic systems. I would imagine that the precise definition of the quantity "entropy" is extremely important for the fluctuation theorem, however it's not included anywhere in this article. Nanite (talk) 15:10, 14 September 2013 (UTC)
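For concreteness, two of the standard but inequivalent definitions alluded to above (my addition, textbook material): the Gibbs entropy of a probability distribution $p_i$ over microstates, and the Boltzmann entropy of a macrostate $M$ containing $\Omega(M)$ microstates,

```latex
S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i,
\qquad
S_{\mathrm{Boltzmann}} = k_B \ln \Omega(M).
```

These agree in the thermodynamic limit but differ for small systems, and fluctuation theorems are usually stated for a trajectory-level dissipation or entropy-production functional rather than for either equilibrium entropy directly, which is presumably the distinction the article should spell out.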
What is the "averaged entropy" in this article? Since entropy is already an average, I agree with other editors that a very precise definition of entropy is needed to make this meaningful.
Here is an attempt.
The microstates (10^10^10) are grouped into collections called states (10^10), characterized by thermodynamic variables such as p, V, T, S, U. A microstate follows an exact physical trajectory a(t), and a state obeys macroscopic (thermodynamic) laws that prescribe a trajectory M(t). (Or that just restrict the allowable trajectories M(t).)
Then: for any fixed s, t, all but a tiny fraction of microstates "obey thermodynamics", that is, if a(s) ∈ M(s) then a(t) ∈ M(t).
A tiny number of microstate trajectories "jump state", that is, they shift from one macroscopic trajectory M(.) to another one M'(.) during the time interval [s,t]. The macroscopic variables of a(.) change in a way that is impossible according to the macroscopic laws. So the macroscopic laws get broken. An example is all the gas going into one corner of the room. 89.217.26.52 (talk) 22:34, 2 February 2015 (UTC)
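The gas-in-one-corner example can be made quantitative with a toy model of my own (not from the discussion above): N particles, each independently in the left or right half of a box, where the "macrostate" is just the left-half count and the "microstate" is the full assignment.

```python
import math

# Toy model (assumption): N independent particles, each in the left or
# right half of a box. Macrostate = number on the left; microstate =
# the full left/right assignment of every particle.

def n_microstates(n_particles, n_left):
    """Number of microstates realizing the macrostate 'n_left on the left'."""
    return math.comb(n_particles, n_left)

def prob_all_in_one_half(n_particles):
    """Probability of the extreme macrostate: every particle on one side
    (two such microstates: all-left and all-right)."""
    return 2.0 / 2 ** n_particles

# Even for a tiny "gas" of 100 particles, the law-breaking macrostate
# is astronomically rare:
print(prob_all_in_one_half(100))  # ~1.6e-30
```

This is the sense in which "a tiny number of microstate trajectories jump state": such trajectories exist, but their measure shrinks exponentially in particle number.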
It is important to understand what the Second Law Inequality does not imply. It does not imply that the ensemble-averaged entropy production is nonnegative at all times. This is untrue, as consideration of the entropy production in a viscoelastic fluid subject to a sinusoidal, time-dependent shear rate shows.[clarification needed] In this example the ensemble average of the time integral of the entropy production is, however, nonnegative, as expected from the Second Law Inequality.
This seems to contradict the immediately preceding paragraph, which states that the averaged entropy is nondecreasing.
Indeed, the viscoelastic fluid example seems like a red herring. It misinterprets the claim of the article: the claim is that the ensemble-averaged, instantaneous rate of entropy production is nonnegative for all times, not just when integrated over user-selected time windows.
If the thesis in this article is wrong, it needs an attack in principle (from published sources, of course). It won't be disproved or "qualified" by a fly-by example with no explanation OR citation. In any case I suspect the example is not a closed system! It sounds like a driven system. But like most any reader, I can't tell from the amount of detail given. The example should simply be removed for now. 89.217.26.52 (talk) 22:34, 2 February 2015 (UTC)
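The instantaneous-versus-integrated distinction disputed above can at least be illustrated mechanically. The sketch below is my own toy model, not a resolution of the dispute: a single Maxwell viscoelastic element under a sinusoidal shear rate, where the instantaneous stress power sigma*gammadot goes transiently negative (elastic recoil releases stored energy), while its average over a full cycle is strictly positive. Whether that stress power is the right proxy for "entropy production" here is exactly what the disputants disagree about.

```python
import math

# Toy model (assumption): steady-state Maxwell element,
#   tau * dsigma/dt + sigma = eta * gammadot,
# driven by gammadot(t) = A*cos(omega*t). Parameters are arbitrary
# illustrative values, not from any cited source.
eta, tau, omega, A = 1.0, 1.0, 5.0, 1.0

def stress(t):
    """Closed-form steady-state Maxwell stress for the sinusoidal drive."""
    wt = omega * tau
    return A * eta / (1 + wt * wt) * (math.cos(omega * t) + wt * math.sin(omega * t))

def power(t):
    """Instantaneous stress power sigma(t) * gammadot(t)."""
    return stress(t) * A * math.cos(omega * t)

period = 2 * math.pi / omega
samples = [power(i * period / 1000) for i in range(1000)]

print(min(samples) < 0)                 # True: transiently negative
print(sum(samples) / len(samples) > 0)  # True: positive over the cycle
```

The cycle average is positive because the cross term sin*cos averages to zero, leaving only the dissipative cos^2 contribution; the negative excursions come entirely from the elastic part of the response.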