MS and ALS are strongly connected to inorganic mercury poisoning. Even more so is motor neuron disease (MND). Mercury preferentially attaches to neuromelanin in the SN and locus coeruleus. Mercury also prefers the upper motor neurons, the corticomotor neurons. The LC is most strongly implicated in PD, in particular in the sleep, anxiety, balance, and other problems related to PD and to my symptoms, but mercury in the LC was not found in 3 PD patients like it was in MND. An excess of the norepinephrine (noradrenaline) it produces causes constipation and excess saliva, and maybe anxiety. So I do not know why damage to the LC appears to increase, rather than reduce, the neurotransmitter that it produces.
A Singapore study showed 1 cup of black tea per day slashed PD cases by 71%. But it's also in Singapore where PD was 20x more likely to be associated with mercury. Maybe in Singapore they eat a lot of fish with methylmercury in it, and it is known that black tea prevents the mercury in fish from being absorbed. So although I use 4 tea bags in a cup of black tea per day, it may not be doing me any good. Likewise for the 1/4 cup of coffee grinds in my cup of coffee: it may only be helping if I've got certain genes.
A large Danish study of dental workers was the only one that seems to have looked carefully at mercury and PD. Its findings were negative.
"The locus coeruleus may figure in clinical depression, panic disorder, Parkinson's disease, Alzheimer's disease[9] and anxiety. Some medications including norepinephrine reuptake inhibitors (reboxetine, atomoxetine), serotonin-norepinephrine reuptake inhibitors(venlafaxine, duloxetine), and norepinephrine-dopamine reuptake inhibitors (bupropion) are believed to show efficacy by acting upon neurons in this area."
Research continues to reveal that norepinephrine (NE) is a critical regulator of numerous activities from stress response, the formation of memory to attention and arousal. Many neuropsychiatric disorders precipitate from alterations to NE modulated neurocircuitry: disorders of affect, anxiety disorders, PTSD, ADHD and Alzheimer’s disease. Alterations in the locus coeruleus (LC) accompany dysregulation of NE function and likely play a key role in the pathophysiology of these neuropsychiatric disorders.[10]
Monday, November 30, 2015
Parkinson's: 4 hormones affected
1) dopamine => motivation, action
2) norepinephrine => excitatory
3) melatonin => sleep
4) serotonin => confidence
Tremors, lack of balance, and lack of response caused by lack of 1).
Lack of sleep caused by lack of receptors for 3).
Depression may be caused by lack of 4) or its receptors.
Constipation and excess saliva can be caused by an excess of 2), not a decrease. Anxiety should also be caused by an excess of it. So somehow damage to the LC seems to show up as an increase in 2), even though the LC is what produces 2). I mean, if the LC is damaged, you would think there would be less of 2), but PD and other conditions that "affect" the LC look as if they increase 2).
A 5th important neurotransmitter is oxytocin, the "trust" or "love" hormone. It works with dopamine and is also sometimes thought of as a pleasure hormone.
Notice each of the 5 hormones can help increase "pleasure".
OXYTOCIN research:
"The OXT-immunoreactive cell number in the PVN of the [6] PD patients was 22% lower"
" Lower oxytocin neuron numbers are found in Prader-Willi syndrome, AIDS and Parkinson's disease."
"administration of rats with oxytocin significantly lessened the neuronal death. These findings suggest that injury of dopaminergic neurons triggers exaggerated neuronal oscillations in the striatum and oxytocin may have some inhibitory effects on neuronal activity in PD."
" By incubation of stable isotope-labeled oxytocin with tissue preparations, it was also confirmed that oxytocin at least partially contributed to the production of MIF-1 in the hypothalamus by action of peptidases. ...MIF-1 has potent therapeutic effects in depression and Parkinson's disease,"
"oxytocin (OT) plays a role in neuropsychiatric disorders characterized by social dysfunction" (I could fall into this category)
"sexual dysfunction (decrease in libido and erection) in PD, via altered dopamine-oxytocin pathways, which normally promote libido and erection"
"OT improves memory consolidation and extinction, but only if given at a low dose immediately after the acquisition phase.... OT plays a role in elementary forms of behavioral flexibility and adaptive responses and support its therapeutic potential in neuropsychiatric disorders characterized by cognitive inflexibility and/or impairment (autism, schizophrenia, Alzheimer's disease, Parkinson disease, stroke, posttraumatic stress disorder)."
"mice treated with [lipid-form OXT] displayed more recovery than those given OXT. The results suggest that LOXT has a functional advantage in recovery of social behavioral impairment, such as those caused by neurodegenerative diseases, autism spectrum disorders, and schizophrenia."
Tuesday, November 24, 2015
entropy: relation between physical and information
== The relation between S and H ==
H is intensive S
Shannon entropy H appears to be more like bits per symbol of a message (see examples below for proof), which has a better parallel with '''intensive''' entropy (S per volume or S per mole), NOT the usual '''extensive''' entropy S. This allows "n" distinct symbols like a, b, c in a Shannon message of length N to correlate with "n" energy levels in N particles or energy modes.
H as possible extensive S? Not really
However, a larger block of matter will have more longer-wavelength phonons, so more Shannon symbols will be needed to represent the larger number of energy levels; maybe it can't be taken as intensive S on all occasions. But the same error will apply if you try to use intensive S. So intensive S seems to be exactly equal to Shannon entropy, as long as the same reference-size block is used. This problem does not seem to apply to gases. But does a double-sized volume of gas with double the number of molecules have double the entropy, as I would like if using Shannon entropy? Yes, according to the equation I found (at the bottom) for S of a gas: S = N*( a*ln(U/N) + b*ln(V/N) + c ). If N, U, and V all double, then entropy merely doubles.
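A quick numerical sketch of this doubling claim (the constants a, b, c and the N, U, V values here are illustrative placeholders, not physical values):

```python
import math

def gas_entropy(N, U, V, a=1.5, b=1.0, c=2.5):
    # S = N*(a*ln(U/N) + b*ln(V/N) + c), the form quoted above;
    # a, b, c are placeholder constants chosen only for illustration
    return N * (a * math.log(U / N) + b * math.log(V / N) + c)

S1 = gas_entropy(100, 1e4, 1e4)
S2 = gas_entropy(200, 2e4, 2e4)  # double N, U, and V together
print(S2 / S1)  # exactly 2: U/N and V/N are unchanged, so S scales with N
```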
Why Shannon entropy H is not like the usual extensive entropy S
Examples of Shannon entropy (using the online Shannon entropy calculator): http://www.shannonentropy.netmark.pl/
01, 0011, and 00101101 all have Shannon entropy H = log2(2) = 1
abc, aabbcc, abcabcabc, and abccba all have Shannon entropy H = log2(3) = 1.58
The important point is to notice it does not depend on the message length, whose analogy is number of particles. Clearly physical extensive entropy increases with number of particles.
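A minimal script (the helper name shannon_H is my own) reproduces the calculator results above and shows H's independence from message length:

```python
from collections import Counter
from math import log2

def shannon_H(msg):
    # H = sum over distinct symbols of (count/N) * log2(N/count), in bits/symbol
    N = len(msg)
    return sum(c / N * log2(N / c) for c in Counter(msg).values())

print([shannon_H(m) for m in ("01", "0011", "00101101")])
# each is 1.0: H ignores message length, like intensive entropy
print(round(shannon_H("abcabcabc"), 2))  # 1.58 = log2(3), same as "abc"
```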
Suggested conversion between H and regular extensive S
To convert Shannon entropy H to extensive entropy S or vice versa use
'''S=N*kb*ln(2)*H'''
where kb is Boltzmann's constant, ln(2) is the conversion from bits to nats, and N is the number of particles (or other oscillators of energy) to which you've assigned a "Shannon" symbol representing the distinct energy level (if volume is constant) or microstate (a combination of volume and energy) of each particle. A big deal should not be made out of kb (and therefore S) having units of J/K, because temperature is a precise measure of a kinetic energy distribution, so it is unitless in a deep sense. It is merely a slope that allows zero heat energy and zero temperature to meet at the same point, both of which are Joules per particle.
Physical entropy of an Einstein solid derivable from information theory?
Consider an Einstein solid (http://hyperphysics.phy-astr.gsu.edu/hbase/therm/einsol.html) with 3 possible energy states (0, 1, 2) in each of N oscillators. Let the average energy per oscillator be 1, which means the total energy q=N. The physical extensive entropy S is (by the reference above) very close to S = kb*ln(2)*2N = 1.38*N*kb for large N. You can verify this by plugging in example N's, or by Stirling's approximation and some math.
Shannon entropy for this system is as follows: each oscillator's energy is represented by 1 of 3 symbols of equal probability (i.e. think of a, b, c substituted for energies 0, 1, 2). Then H = log2(3) = 1.585, which does not depend on N (see my Shannon entropy examples above). Using the H-to-S conversion equation above gives S = N*kb*ln(2)*log2(3) = ln(3)*N*kb = 1.10*N*kb. This is lower than the physical result above, and lower still compared to the Debye model. I think this means there would need to be 4 possible states of even probability per oscillator to match the physical entropy, giving S = k*N*ln(4).
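The kb*ln(2)*2N claim for q=N can be checked numerically from the multiplicity of an Einstein solid, Ω = C(q+N-1, q); a sketch (the function name is my own):

```python
import math

def S_per_oscillator(N, q):
    # S/(kb*N) = ln(multiplicity)/N, with multiplicity C(q+N-1, q)
    return math.log(math.comb(q + N - 1, q)) / N

# With total energy q = N, S/(kb*N) approaches 2*ln(2) ~ 1.386 for large N,
# versus the naive per-oscillator Shannon value ln(3) ~ 1.099 computed above.
for N in (10, 100, 1000):
    print(N, S_per_oscillator(N, N))
print(2 * math.log(2), math.log(3))
```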
log2(3) = ln(3)/ln(2)
Note that the physical S is 25% more than the S derived from information theory, 2*ln(2) versus ln(3). The reason is that the physical entropy is not restricted to independently random oscillator energy levels which average 1. It only requires that the total energy be q=N.
You might say this proves Shannon entropy H is not exactly like intensive entropy S. But I wonder if this idealized multiplicity is a physical reality. Or maybe kb contains a "hidden" 25% reduction factor in which case I would need to correct my H to S conversion above by a factor of 0.75 if used in multiplicity examples instead of classical thermodynamics. Another possibility is that my approach in assigning a symbol to an energy level and a symbol location to a particle is wrong. But it's hard to imagine the usefulness of looking at it any other way.
Going further towards Debye model of solids
However, I might be doing it wrong. The oscillators might also be all 1's, or half 2's and half 0's (all 1's has log2(1)=0). This would constitute 3 different types of signal sources as the possible source of the total energy, with entropy of maybe H = log2(3)+log2(2)+log2(1) = 2.58. Times ln(2) gives 1.77, which is higher than the 1.38, but close to the Debye model of solids, which predicts temps 1/0.806 higher than Einstein's for a given total energy, which is S/kb = 1.72 (because U/N in both models ~ T*S). The model itself is supposed to be the problem with the Einstein solid, so H should not come out closer based on it, unless I'm cheating or re-interpreting in a way that makes it closer to the Debye model. To complete this, I need to convert the Debye model to a Shannon signal source, assigning a symbol to each energy level. Each symbol would occur in the signal source (the solid) based on how many modes are allowing it. So N would not be particles but the number of phonons, which determine Planck radiation even at low temps, and heat capacity. It is strange that energy level quantity does not change the entropy, just the amount of variation in the possibilities. This probably is the way the physical entropy is calculated, so it's guaranteed to be correct.
Debye temp ~ s*sqrt(N/V) where s = sonic velocity of the solid. "Einstein temp" = 0.806 * Debye temp. This means my treatment
C = a*T^3 (heat capacity)
S = C/3 = (a/3)*T^3
Entropy of a single harmonic oscillator (giving rise to phonons?) at "high" temp is
for solids S = k*ln(kT/hf + 1) where f is the frequency of the oscillations, which is higher for stronger bonds. So if Earth acquires stronger bonds per oscillator and maintains a constant temperature, S decreases. For independent oscillators in 1D I think it might be S = kN*(ln(kT/hf/N) + 1). In terms of S = N*H entropy: S = X*[-N/X*[ln(N/X)-1]] where X = kT/hf. For 3D, X^3. See "Vibrational Thermodynamics of Materials", the Fultz paper.
Example that includes volume
The entropy of an ideal gas according to the internal energy page:
S = k*N*[ c*ln(U/N) + R*ln(V/N) ]
by using log rules applied to Wikipedia's internal energy page, where c is the heat capacity. You could use two sets of "Shannon" symbols, one for the internal energy states U and one set for V, or use one set to represent the microstates determined by U and V together, to give S = constant*N*ln(2)*H. H using distinct symbols for microstates is thereby the Shannon entropy of an ideal gas.
ideal gas not near absolute zero, more accurately:
S = k*N*[ ln( (V/N)*(4*pi*m*U/(3*N*h^2))^(3/2) ) + 5/2 ]
Sackur–Tetrode equation
S ~ N*( ln( (V/N)*(m*U/N)^(3/2) + a ) + b )
S ~ N*( ln(V/N) + a*ln(m*U/N) + b )
The separate ln()'s are because there are 2 different sets of symbols that need to be on the same absolute base, but have "something", maybe different "symbol lengths" as a result of momentum being the underlying quantity and it affects pressure per particle as v whereas it affects U as v^2. In PV=nRT, T is v^2 for a given mass, and P is v. P1/P2 = sqrt(m2/m1) when T1 and T2 are the same for a given N/V.
==================
edit to Wikipedia: (entropy article)
The closest connection between entropy in information theory and physical entropy can be seen by assigning a symbol to each distinct way a quantum energy level (microstate) can occur per mole, kilogram, volume, or particle of homogeneous substance. The symbols will thereby "occur" in the unit substance with different probability corresponding to the probability of each microstate. Physical entropy on a "per quantity" basis is called "[[Intensive_and_extensive_properties|intensive]]" entropy as opposed to total entropy which is called "extensive" entropy. When there are N moles, kilograms, volumes, or particles of the substance, the relationship between this assigned Shannon entropy H in bits and physical extensive entropy in nats is:
:S = k_\mathrm{B} \ln(2) N H
where ln(2) is the conversion factor from base 2 of Shannon entropy to the natural base e of physical entropy. [[Landauer's principle]] demonstrates the reality of this connection: the minimum energy E required and therefore heat Q generated by an ideally efficient memory change or logic operation from irreversibly erasing or merging N*H bits of information will be S times the temperature,
:E = Q = T k_\mathrm{B} \ln(2) N H,
where H is in information bits and E and Q are in physical Joules. This has been experimentally confirmed.{{Citation |author1=Antoine Bérut |author2=Artak Arakelyan |author3=Artyom Petrosyan |author4=Sergio Ciliberto |author5=Raoul Dillenschneider |author6=Eric Lutz |doi=10.1038/nature10872 |title=Experimental verification of Landauer’s principle linking information and thermodynamics |journal=Nature |volume=483 |issue=7388 |pages=187–190 |date=8 March 2012 |url=http://www.physik.uni-kl.de/eggert/papers/raoul.pdf|bibcode = 2012Natur.483..187B }}
(scott note: H = sum( count/N * log2(N/count) ) )
BEST wiki article
When a file or message is viewed as all the data a source will ever generate, the Shannon entropy H in '''bits per symbol''' is
H = - \sum_{i=1}^{n} p_i \log_2 (p_i) = \sum_{i=1}^{n} \frac{count_i}{N} \log_2 \left( \frac{N}{count_i} \right)
where i indexes each distinct symbol, p_i is the probability of symbol i being received, and count_i is the number of times symbol i occurs in the message that is N symbols long. This equation is in "bits" per symbol because it uses a logarithm of base 2. It is "entropy per symbol" or "normalized entropy", ranging from 0 to 1, when the logarithm base is instead set equal to the number n of distinct symbols. That can be calculated by multiplying H in bits/symbol by ln(2)/ln(n). The "bits per symbol" of data that has only 2 symbols is therefore also its "entropy per symbol".
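A short sketch of this base-n normalization (function names are my own):

```python
from collections import Counter
from math import log2, log

def H_bits(msg):
    # Shannon entropy in bits per symbol (log base 2)
    N = len(msg)
    return sum(c / N * log2(N / c) for c in Counter(msg).values())

def H_normalized(msg):
    # "normalized entropy": log base set to the n distinct symbols, range 0..1,
    # obtained by multiplying H in bits/symbol by ln(2)/ln(n)
    n = len(set(msg))
    return H_bits(msg) * log(2) / log(n) if n > 1 else 0.0

print(H_bits("aaaaBBBB2222"))        # log2(3) ~ 1.585 bits per symbol
print(H_normalized("aaaaBBBB2222"))  # 1.0: maximally random for 3 symbols
```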
The "entropy" of a message, file, source, or other data is S = N*H, in keeping with Boltzmann's [[H-theorem]] for physical entropy, which Claude Shannon's 1948 paper cited as analogous to his information entropy. To clarify why Shannon's H is often called "entropy" instead of "entropy per symbol" or "bits per symbol": section 1.6 of Shannon's paper refers to H as the "entropy of the set of probabilities", which is not the same as the entropy of N symbols. H is analogous to physics' [[Intensive and extensive properties|intensive]] entropy So, which is on a per-mole or per-kg basis, as distinct from the more common extensive entropy S. In section 1.7 Shannon more explicitly says H is "entropy per symbol" or "bits per symbol".
It is instructive to see H values in bits per symbol (log2) for several examples of short messages by using an online entropy (bits per symbol) calculator: http://www.shannonentropy.netmark.pl/ or http://planetcalc.com/2476/
* H=0 for "A", "AAAAA", and "11111"
* H=1 for "AB", "ABAB", "0011", and "0110101001011001" (eight 1's and eight 0's)
* H=1.58 for "ABC", "ABCABCABCABC", and "aaaaBBBB2222"
* H=2 for "ABCD" and "AABBCCDD"
The entropy in each of these short messages is H*N.
Claude Shannon's 1948 paper was the first to define an "entropy" for use in information theory. His H function (formally defined below) is named after Boltzmann's [[H-theorem]], which was used to define the physical entropy of an ideal gas by S=kb*N*H. Boltzmann's H is entropy in [[nats_(unit)|nats]] per particle, so each symbol in a message has an analogy to each particle in an ideal gas, and the probability of a specific symbol is analogous to the probability of a particle's microstate. The "per particle" basis does not work in solids because their particles are interacting, but the information entropy maintains mathematical equivalency with bulk [[Intensive and extensive properties|intensive]] physical entropy So, which is on a per-mole or per-kilogram basis (molar entropy and specific entropy). H*N is mathematically analogous to total [[Intensive and extensive properties|extensive]] physical entropy S. If the probability of a symbol in a message represents the probability of a microstate per mole or kg, and each symbol represents a specific mole or kg, then S=kb*ln(2)*N*Hbits. Temperature is a measure of the average kinetic energy per particle in an ideal gas (Kelvins = Joules*2/3/kb), so the Joules/Kelvins units of kb are fundamentally unitless (Joules/Joules), and S is fundamentally an entropy in the same sense as H.
[[Landauer's principle]] shows a change in information entropy is equal to a change in physical entropy S when the information system is perfectly efficient. In other words, a device can't irreversibly change its information entropy content without causing an equal or greater increase in physical entropy. Sphysical=k*ln(2)*(H*N)bits where k, being greater than Boltzmann's constant kb, represents the system's inefficiency at erasing data and thereby losing energy previously stored in bits as heat dQ=T*dS.
==============
on the information and entropy article: (my best version)
The Shannon entropy H in information theory has units of bits per symbol (Chapter 1, section 7). For example, the messages "AB" and "AAABBBABAB" both have Shannon entropy H = log2(2) = 1 bit per symbol because the 2 symbols A and B occur with equal probability. So when comparing it to physical entropy, the physical entropy should be on a "per quantity" basis, which is called "[[Intensive_and_extensive_properties|intensive]]" entropy, instead of the usual total entropy, which is called "extensive" entropy. The "shannons" of a message are its total "extensive" information entropy, which is H times the number of symbols in the message.
A direct and physically real relationship between H and S can be found by assigning a symbol to each microstate that occurs per mole, kilogram, volume, or particle of a homogeneous substance, then calculating the H of these symbols. By theory or by observation, the symbols (microstates) will occur with different probabilities and this will determine H. If there are N moles, kilograms, volumes, or particles of the unit substance, the relationship between H (in bits per unit substance) and physical extensive entropy in nats is:
:S = k_\mathrm{B} \ln(2) N H
where ln(2) is the conversion factor from base 2 of Shannon entropy to the natural base e of physical entropy. N*H is the amount of information in bits needed to describe the state of a physical system with entropy S. [[Landauer's principle]] demonstrates the reality of this by stating the minimum energy E required (and therefore heat Q generated) by an ideally efficient memory change or logic operation by irreversibly erasing or merging N*H bits of information will be S times the temperature which is
:E = Q = T k_\mathrm{B} \ln(2) N H
where H is in informational bits and E and Q are in physical Joules. This has been experimentally confirmed.{{Citation |author1=Antoine Bérut |author2=Artak Arakelyan |author3=Artyom Petrosyan |author4=Sergio Ciliberto |author5=Raoul Dillenschneider |author6=Eric Lutz |doi=10.1038/nature10872 |title=Experimental verification of Landauer’s principle linking information and thermodynamics |journal=Nature |volume=483 |issue=7388 |pages=187–190 |date=8 March 2012 |url=http://www.physik.uni-kl.de/eggert/papers/raoul.pdf|bibcode = 2012Natur.483..187B }}
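As a concrete sketch of the magnitudes the Landauer bound implies (only the formula above is used; the helper name is my own):

```python
from math import log

KB = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_heat(bits, T=300.0):
    # E = Q = T * kb * ln(2) * (N*H): minimum heat to irreversibly
    # erase `bits` bits of information at temperature T
    return T * KB * log(2) * bits

print(landauer_heat(1))    # ~2.87e-21 J: cost of erasing one bit at 300 K
print(landauer_heat(8e9))  # erasing a gigabyte (8e9 bits) is still ~2.3e-11 J
```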
Temperature is a measure of the average kinetic energy per particle in an ideal gas (Kelvins = 2/3*Joules/kb), so the J/K units of kb are fundamentally unitless (Joules/Joules). kb is the conversion factor from energy in 3/2*Kelvins to Joules for an ideal gas. If kinetic energy measurements per particle of an ideal gas were expressed as Joules instead of Kelvins, kb in the above equations would be replaced by 3/2. This shows S is a true statistical measure of microstates that does not have a fundamental physical unit other than "nats", which is just a statement of which logarithm base was chosen by convention.
===========
Example:
8 molecules of gas, with 2 equally probable internal energy states and 2 equally probable positions in a box. If at first they are on the same side of the box with half in 1 energy state and the other half in the other energy state, then the state of the system could be written ABABABAB. Now if they are allowed to distribute evenly in the box keeping their previous energy levels it is written ABCDABCD. For an ideal gas: S ~ N*ln(U*V) where U and V are averages per particle.
An ideal gas uses internal energy and volume. Solids seem to not include volume, and depend on phonons, which have bond-stretching energies in place of rotational energies. But it seems like solids should follow S ~ N*ln(U) = k*ln(2)*N*H, where H would get into the quantum probabilities of each energy state. The problem is that phonon waves across the solid are occurring, so instead of U, an H like this might need to be calculated from phase space (momentum and position of the atoms and their electrons?).
=============
comment Wikipedia
Leegrc, "bits" are the invented name for when log base 2 is used. There is, like you say, no "thing" in the DATA itself you can point to. Pointing to the equation itself to declare a unit is, as you suspect, suspicious. But physical entropy itself is in "nats" for the same reason: "natural units" come from using base "e". The only way to take out this "arbitrary unit" is to make the base of the logarithm equal to the number of distinct symbols; the base would be just another variable to plug a number into. Then the range of the H function would stay between 0 and 1, and it would be a true measure of the randomness of the message per symbol. But by sticking with base 2, I can look at any set of symbols and know how many bits (in my computing system that can only talk in bits) would be required to convey the same amount of information. If I see a long file of 26 letters having equal probability, then I need H = log2(26) = 4.7 bits to re-code each letter in 1's and 0's. There are H = 4.7 bits per letter.
PAR, as far as I know, H should be used blind without knowledge of prior symbol probabilities, especially if looking for a definition of entropy. You are talking about watching a transmitter for a long time to determine probabilities, then looking at a short message and using the H function with the prior probabilities.
Let me give an example of why a blind and simple H can be extremely useful. Let's say there is a file that has 8 bytes in it. One moment it says AAAABBBB and the next moment it says ABCDABCD. I apply H blindly, not knowing what the symbols represent. H=1 in the first case and H=2 in the second, so H*N went from 8 to 16. Now someone reveals the bytes were representing the microstates of 8 gas particles. I know nothing else: not the size of the box they were in, not whether the temperature had been raised, not whether a partition had been lifted, and not even whether these were the only possible microstates (symbols). But there was a physical entropy change everyone agrees upon, from S1=kb*ln(2)*8 to S2=kb*ln(2)*16. So I believe entropy H*N as I've described it is as fundamental in information theory as it is in physics. Prior probabilities and such are useful, but how they are used needs to be defined. H on a per-message basis will be the fundamental input to those other ideas, not to be brushed aside or detracted from.
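The numbers in this example are easy to verify with a blind N*H calculation (a sketch; NH is my own helper name):

```python
from collections import Counter
from math import log2

def NH(msg):
    # extensive information entropy N*H in bits ("shannons")
    N = len(msg)
    return N * sum(c / N * log2(N / c) for c in Counter(msg).values())

print(NH("AAAABBBB"))  # 8.0  -> S1 = kb*ln(2)*8
print(NH("ABCDABCD"))  # 16.0 -> S2 = kb*ln(2)*16
```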
=============
I agree you can shorten the H equation by entering the p's directly, by theory or by experience. But you're doing the same thing as me when I calculate H for large N; I just do not make any assumption about the symbol probabilities. You and I will get the same entropy H and "extensive" entropy N*H for a SOURCE. Your N*H extensive entropy is N*sum[ p*log(1/p) ]. The online entropy calculators and I use N*H = N*sum[ count/N*log(N/count) ] (they usually give H without the N). These are equal for large N if the source and channel do not change. "My" H can immediately detect if a source has deviated from its historical average. "My" H will fluctuate around the historical or theoretical average H for small N. You should see this method is more objective and more general than your declaration that it can't be applied to a file or message without knowing prior p's. For example, let a partition be removed to allow particles in a box to get to the other side. You would immediately calculate the N*H entropy for this box from theory. "My" N*H will increase until it reaches your N*H as the particles reach maximum entropy. This is how thermodynamic entropy is calculated and measured. A message or file can have an H entropy that deviates from the expected H value of the source.
The distinct symbols A, B, C, D are distinct microstates at the lowest level. The "byte" POSITION determines WHICH particle (or microvolume if you want) has that microstate: that is the level to which this applies. The entropy of any one of them is "0" by the H function, or "meaningless" as you stated. A sequence of these "bytes" tells the EXACT state of each particle and of the system, not a particular microstate (because a microstate does not care about the order unless it is relevant to its probability). A single MACROstate would be a combination of these distinct states. One example macrostate is when the gas might be in any one of these 6 distinct states: AABB, ABAB, BBAA, BABA, ABBA, or BAAB. You can "go to a higher level" than using A and B as microstates, and claim AA, BB, AB, and BA are individual microstates with certain probabilities, but the H*N entropy will come out the same. There was not an error in my AAAABBBB example, and I did not make an assumption. It was observed data that "just happened" to have equally likely probabilities (so that my math was simple). I just blindly calculated the standard N*H entropy, and showed how it gives the same result physics gets when a partition is removed: the macrostate H*N entropy went from 8 to 16 as the volume of the box doubled. The normal S increases by S2-S1 = kb*N*ln(2), as it always should when a mid-box partition is removed.
I can derive your entropy from the way the online calculators and I use Shannon's entropy, but you can't go the opposite way.
Now is the time to think carefully, check the math, and realize I am correct. There are a lot of problems in the article because it does not distinguish between intensive Shannon entropy H in bits/symbol and extensive entropy N*H in bits (or "shannons" to be more precise, to distinguish it from the "bit" count of a file, which may not have 1's and 0's of equal probability).
BTW, the entropy of an ideal gas is S~N*log2(u*v) where u and v are internal energy and volume per particle. u*v gives the number of microstates per particle. Quantum mechanics determines that u can take on a very large number of values and v is the requirement that the particles are not occupying the same spot, roughly 1000 different places per particle at standard conditions. The energy levels will have different probabilities. By merely observing large N and counting, H will automatically include the probabilities.
In summary, there are only 3 simple equations I am putting forward. They precisely lay the foundation of all further information-entropy considerations. These equations should replace 70% of the existing article. They are not new equations, but definitions of them and of how to use them are hard to come across, since there is so much clutter and confusion regarding entropy as a result of people not understanding these statements.
1) Shannon's entropy is "intensive" bits/symbol = H = sum[ count/N*log2(N/count) ] where N is the length of the message and count is for each distinct symbol.
2) Absolute ("extensive") information entropy is in units of bits or shannons = N*H.
3) S = kb*ln(2)*N*H, where each of the N particles (or other units) has a distinct microstate represented by a symbol, and H is calculated directly from these symbols for all N. This works from the macro level down to the quantum level.
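The three equations fit in one short function (a sketch, with kb hard-coded and the function name my own):

```python
from collections import Counter
from math import log2, log

KB = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_summary(symbols):
    # symbols: a message where each symbol is one particle's microstate
    N = len(symbols)
    H = sum(c / N * log2(N / c) for c in Counter(symbols).values())  # (1) bits/symbol
    NH = N * H                                                        # (2) bits (shannons)
    S = KB * log(2) * NH                                              # (3) physical S, J/K
    return H, NH, S

H, NH, S = entropy_summary("ABCDABCD")
print(H, NH, S)  # 2.0 bits/symbol, 16.0 bits, ~1.53e-22 J/K
```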
==========
Example: 3 interacting particles with a total energy of 2 and possible individual energies 0, 1, 2 may have possible energy distributions 011, 110, 101, 200, 020, or 002. I believe the order is not relevant to what is called a microstate, so you have only 2 symbols for 2 microstates, and the probability for each is 50-50. Maybe there is usually something that skews this towards low energies. I would simply call each one of the 6 "sub-micro states" a microstate and let the count be included in H. Assuming equal p's again, the first case gives log2(2)=1 and the 2nd log2(6)=2.58. I believe the first one is the physically correct entropy (the approach, that is, not the exact number I gave). If I had let 0, 1, 2 be the symbols, then it would have been 3*1.46 = 4.38, which is wrong.
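The counting in this example can be checked by brute-force enumeration (a sketch):

```python
from itertools import product
from collections import Counter

# all ways 3 particles with individual energies 0, 1, or 2 can total 2
states = [s for s in product(range(3), repeat=3) if sum(s) == 2]
print(len(states))  # 6 ordered distributions, as listed above

# grouping by unordered energy content gives the 2 "microstates"
groups = Counter(tuple(sorted(s)) for s in states)
print(groups)  # 3 ways each for {0,0,2} and {0,1,1}: a 50-50 split
```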
Physically, because of the above, saying S=k*ln(2)*N*H requires that you look at the specific entropy So and set it = k*ln(2)*H, so you'll have the correct H. This back-calculates the correct H. This assumes you are like me and can't derive Boltzmann's thermodynamic H from first (quantum or not) principles; I may be able to do it for an ideal gas. I tried to apply H to Einstein's oscillators for solids (he was not aware of Shannon's entropy at the time), and I was 25% lower than his multiplicity, which is 25% lower than the more accurate Debye model. So a VERY simplistic approach to entropy with information theory was only 40% lower than experiment and good theory, for the one set of conditions I tried. I assumed the oscillators had only 3 energy states and got S = 1.1*N*kb where Debye via Einstein said S = 1.7*N*kb.
My point is this: looking at a source of data and choosing how we group the data into symbols can result in different values for H and NH [edit: if not independent]. Using no grouping on the original data is no compression, and it is the only choice that does not use an algorithm plus a lookup table. Higher grouping on independent data means more memory is required with no benefit to understanding (better understanding = lower NH). People with bad memories are forced to develop better compression methods (lower NH), which is why smart people can sometimes be so clueless about the big picture: reading too much with high NH in their brains and thinking too little, never needing to reduce the NH because they are so smart. Looking for a lower NH by grouping the symbols is the simplest compression algorithm. The next step up is run-length encoding, a variable symbol length. All compression and pattern recognition create some sort of "lookup table" (symbols = weighting factors) to run through an algorithm that may combine symbols to create on-the-fly higher-order symbols, in order to find the lowest NH that explains the higher original NH. The natural, default non-compressed starting point should be to take the data as it is and apply the H and NH statistics, letting each symbol be a microstate. Perfect compression for generalized data is not a solvable problem, so we can't start from the other direction with an obvious standard.
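A minimal sketch of how grouping (the simplest compression algorithm described above) can lower NH when the data has dependencies; the data and function name are mine:

```python
import math
from collections import Counter

def NH(symbols):
    """Extensive entropy N*H in bits for any sequence of symbols."""
    N = len(symbols)
    return sum(c * math.log2(N / c) for c in Counter(symbols).values())

# Per-bit statistics see a fair coin and report the maximum 64 bits, but
# regrouping into 2-bit symbols exposes the repetition and NH drops to 0.
bits = "01" * 32
NH_bits = NH(bits)                                       # 64.0
NH_pairs = NH([bits[i:i + 2] for i in range(0, 64, 2)])  # 0.0
```

On independent (noise-like) data the regrouping would leave NH unchanged, which is the "no benefit to understanding" case above.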
This lowering of NH is important because compression is 1 of 3 requirements for intelligence. Intelligence is the ability to acquire the highest profit divided by noise*log(memory*computation) in the largest number of environments. Memory on a computing device has a potential energy cost and computation has a kinetic energy cost. The combination is internal energy U. Specifically, for devices with a fixed volume, in both production machines and computational machines, profit = Work output/[k*Temp*N*ln(U/N)] = Work/(kTNH). This is Carnot efficiency W/Q, except the work output includes acquisition of energy from the environment, so the ratio can be larger than 1. The thinking machine must power itself from its own work production, so I should write (W-Q)/Q instead. W-Q feeds back to change Q to improve the ratio. The denominator represents a thinking machine plus its body (environment manipulator) that moves particles, ions (in brains), or electrons (in computers) to model much larger objects in the external world, to try different scenarios before deciding where to invest W-Q. An "efficient body" means trying to lower k for a given NH. NH is the thinking machine's algorithmic efficiency for a given k. NH has physical structure with U losses, but that should be a conversion factor moved out to be part of the kT, so that NH can be a theoretical information construct. The ultimate body is bringing kT down to kb at 0 C. The goal of life, and a more complete definition of intelligence, is to feed Work back to supply the internal energy U and to build physical structures that hold more and more N operating at lower and lower k*T. A Buddhist might say we only need to stop being greedy and stop trying to raise N (copies of physical self, kT, aka the number of computations) and U, and we could leave k alone. This assumes constant volume; otherwise replace U/N with V/N*(U/N)^(3/2) (for an ideal gas, anyway; maybe U*V/N^2 is OK for solids).
Including volume means we want to make it lower, to lower kTNH: a denser thinking machine. The universe itself increases V/N (Hubble expansion), but it cancels in determining Q because it causes U/N to decrease at possibly the same rate. This keeps entropy and energy CONSTANT on a universal COMOVING basis (ref: Weinberg's famous 1977 book "The First Three Minutes"), which causes entropy to be emitted (not universally "increased", as the laymen's books still say) from gravitational systems like Earth and galaxies. The least action principle (the most general form of Newton's law, better than the Hamiltonian & Lagrangian for developing new theories; see Feynman's red books) appears to me to have an inherent bias against entropy, preferring PE over KE over all time scales, and thereby tries to lower temp and raise the P.E. part of U for each N on Earth. This appears to be the source of evolution and why machines are replacing biology, killing off species 50,000 times faster than the historical rate. The legal requirement of all public companies is to dis-employ workers because they are expensive and to extract as much wealth from society as possible so that the machine can grow. Technology is even replacing the need for shareholders and skill (2 guys each started MS, Apple, Google, YouTube, Facebook, and Snapchat, and you can see the trend of decreasing intelligence and age and increasing random luck needed to get your first $billion). Silicon, carbon-carbon, and metal bonds are higher-energy bonds (which least action prefers over kinetic energy), enabling lower N/U and k, and even capturing 20 times more work energy per m^2 than photosynthesis. The ions that brains use to model objects still weigh 100,000 times more than the electrons computers use.
In the case of the balance and 13 balls, we applied the balance like asking a question, organizing things to get the most data out of each test. We may seek more NH in answers from people or nature than we give in order to profit, but in producing W, we want to spend as little NH as possible.
[edit: I originally backtracked on dependency but corrected it, and I made a lot of errors with my ratios from not letting k be positive for the ln().] Ywaz (talk) 23:09, 8 December 2015 (UTC)
== Rubik's cube ==
Let the number of quarter turns from a specific disordered state to the ordered state be a microstate, with a probability based on the number of turns. Longer routes to the solution are more likely. The shortest route uses the least energy, which is indicative of intelligence; that's why we prize short solutions. The N in N*H = N*sum(count/N*log2(N/count)) is the number of turns. Count is the number of different routes with that N. Speaking in terms Carnot understood, and reversing time, the fall from low entropy to high entropy should be fast in order to minimize work production (work absorption when forward in time). The problem is always determining whether a single particular step is the most efficient one towards the goal. There's no incremental measure of profit increase in the problems we can't solve. Solving problems is a decrease in entropy. Is working backwards, generating the most mixed-up state in as few turns as possible, a beginning point?
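One possible reading of this bookkeeping, as a sketch only: the route counts below are entirely made up for illustration, since real counts per scramble are not given in the text.

```python
import math

# Hypothetical tally for one scrambled position: route length in quarter
# turns -> number of distinct routes of that length (made-up numbers).
routes = {10: 1, 12: 4, 14: 16}

total = sum(routes.values())
# Entropy of the route-length distribution; longer routes are more
# numerous and hence more likely, as the text suggests.
H = sum(c / total * math.log2(total / c) for c in routes.values())
# Expected number of turns, one possible reading of the N in N*H.
avg_turns = sum(n * c for n, c in routes.items()) / total
NH = avg_turns * H
```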
== Normalized entropy ==
There may not be a fundamental cost to a large memory, i.e., a large number of symbols aka classifications. That is potential energy, an investment in infrastructure. Maybe there is a "comparison", "lookup", or even transmission cost (more bits are still required), but maybe those are reversible. Maybe you can take a large entropy, classify it the same as with micro-bits, and the before and after entropy cost is the same; but maybe computation is not reversible (indeed, dispersion calls it into question) while memory access is, so a larger memory set for classification means less energy for computation. If there are no obvious dependencies for the H function, the larger number of symbols absorbing more bits/symbol in the message will give the same NH. If H is not equal to 1 for binary, then this may be an unusual situation. Let i=1 to n = number of unique symbols in a message N symbols long; then H = sum(count_i/N * log(N/count_i)). So
H' = H*ln(2)/ln(n) where "2" should be replaced if the H was not calculated with log base 2.
= normalized entropy, where each "bit" position can take on n states, i.e., an access to a memory of symbols has n choices, the number of symbols, aka the number of distinct microstates. H' varies from 0 to 1, which Shannon did not point out, but he let "entropy" vary in definition based on the log base. So if there is no cost to accessing a large memory bank and it speeds up computation or discussion, then even if NH is the same, NH' will be lower for the larger memory bank (the division makes it smaller). But again, NH is only the same between the two if symbol frequencies are equal, and this occurs only if you see no big patterns at all and it looks like noise in both cases, or if you have really chosen the optimum set of symbols for the data (physics equations seem to be seeking the smallest NH, not the smallest NH'). It seems like a larger number of available symbols will almost always detect more patterns, with no problems of narrow-mindedness if the training data set is large enough. The true "disorder" of a physical object relative to itself (per particle or per symbol) rather than to its mass or size (So) or information is H' = S/ln(number of microstates). If one is trying to compare the total entropy of objects or data whose sets of microstates (symbols) are completely different, the measure N*H' is still completely objective.
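A minimal sketch of H' = H*ln(2)/ln(n) (function name mine), showing the 0-to-1 range and comparability across alphabet sizes:

```python
import math
from collections import Counter

def H_normalized(symbols):
    """H' = H * ln(2)/ln(n): entropy per symbol re-based to log base n,
    where n is the number of distinct symbols; ranges from 0 to 1."""
    N = len(symbols)
    counts = Counter(symbols)
    n = len(counts)
    if n < 2:
        return 0.0
    H_bits = sum(c / N * math.log2(N / c) for c in counts.values())
    return H_bits * math.log(2) / math.log(n)

# Equal-frequency symbols score H' = 1 regardless of alphabet size, so
# binary and 4-symbol data become directly comparable.
h2 = H_normalized("0101")        # 1.0
h4 = H_normalized("ABCDABCD")    # 1.0
h_skew = H_normalized("AAAB")    # < 1: less disorder per symbol
```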
With varying-length symbols, NH' loses all connection with physical entropy and energy at the per-bit level; it has to be strictly per symbol. Consider the following: n times symbol bit length no longer represents the bit memory space required, because bit length is varying. N divided by symbol bit length is also not the number of symbols in a message. There are more memory locations addressed, with the total memory required being n*average symbol length. It seems like it would be strictly "symbols only", especially if the log base is chosen based on n. The log base loses the proportional connection to "energy" (if energy were still connected to the number of bits instead of symbols).
For varying-length symbols there is also a problem when trying to relate computations to energy and memory even if I keep base 2. To keep varying symbol lengths on a bit basis for the entropy in regards to computation energy required per information entropy, I need the following: For varying bit-length symbols where symbol i has bit length Li, I get
H=1/N*sum ( count_i * Li * log2(N/(count_i*Li) ) )
This is the average bit variation per bit communicated (a real entropy, since N is in bits).
where N has to be in bits instead of symbols and the log base has to be in bits. Inside the log is the ability of the count to reduce the number of bits needed to be transmitted. Remember that for positive H it is p*log(1/p), so this represents a higher probability of "count" occurring for a particular bit if L is higher. I don't know how to convert the log to a standard base where it varies from 0 to 1; because the symbols are varying length, it kind of loses relevance. Maybe the frequency of each symbol occurring times its length, normalized by the average symbol length: sum( count_i/N * Li/(sum(L)/n) ).
Log base two on varying symbol bit-lengths loses its connection to bits required to re-code, so the above has to be used.
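A sketch of the proposed per-bit formula for varying-length symbols (this is the talk page's own construction, not a standard textbook quantity; the example code lengths are mine):

```python
import math

def H_varlen(counts_and_lengths):
    """Proposed per-bit entropy for varying-length symbols:
    H = (1/N) * sum_i count_i * L_i * log2(N / (count_i * L_i)),
    with N the message length in bits and L_i the bit length of symbol i."""
    N = sum(c * L for c, L in counts_and_lengths)  # total bits, not symbols
    return sum(c * L * math.log2(N / (c * L))
               for c, L in counts_and_lengths) / N

# Hypothetical code: a 1-bit symbol used 8 times and a 2-bit symbol used
# 4 times, so each accounts for 8 of the 16 transmitted bits.
H = H_varlen([(8, 1), (4, 2)])
```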
So in order to keep information connected to physics, you need to stick with log base 2 so you can count the changes in energy accurately. The symbols can vary in bit length as long as the accounting stays in bits.
So H' is a disorder measurement per symbol, comparable across all types of systems with varying numbers of n and N symbols. It varies from 0 to 1. It is "per energy" only if there is a fixed energy cost per symbol transmission and storage, without regard to symbol length. If someone has remembered a lot of symbols of varying lengths, they might know a sequence of bits immediately as 2 symbols and get a good NH' on the data at hand, but not a better NH than if they had remembered all sequences of that length. If they know the whole message as 1 symbol, they already know the message: the entropy is 0 by both measures, the best in both. Being able to utilize the symbol for profit is another matter (a low k factor, as above). A huge memory will have a cost that raises k. Knowing lots of facts without knowing how to use them is a high k (no profit). Facts without profit are meaningless: you might as well read data and just store it.
Maybe "patterns" or "symbols" or "microstates" to be recognized are the ones that lead to profit. You run scenarios and look at the output. That requires patterns as constraints and capabilities. Or look at profit patterns and reverse-invent scenarios of possible patterns under the constraints.
Someone who knows many symbols of a Rubik's cube is one who recognizes the patterns that lead to the quickest profit. The current state plus the procedure to take is a pattern that leads to profit. The goal is speed. Energy losses for computation and turn cost are not relevant. They will have the lowest NH', so H' is relevant.
================
Thanks for the link to indistinguishable particles. The clearest explanation seems to be here: [[Gibbs_paradox#The_mixing_paradox|the mixing paradox]]. The idea is this: if we need to know the kTNH energy required (think NH for a given kT) to return to the initial state at the level of 010, 100, 001 with the correct sequence from a certain final sequence, then we need to count the microstates at that low level. Going the other way, "my" method should be mathematically the same as "yours" if it is required NOT to specify the exact initial and final sequences, since those were implicitly not measured. Measuring the initial-state sequences without the final-state sequences would be like changing the base of the logarithm mid-stream. H is in units of true entropy per symbol when the base of the logarithm is equal to the number of distinct symbols. In this way H always varies from 0 to 1 for all objects and data packets, giving a true disorder (entropy) per symbol (particle). You multiply by ln(2)/ln(n) to change base from 2 to n symbols. Therefore the ultimate objective entropy (disorder or information) in all systems, physical or informational, when applied to data that accurately represents the source, should be
Entropy = N*H = \sum_i count_i \log_n (N/count_i)
Entropy = sum( count * logn(N/count) )
where i=1 to n distinct symbols in data N symbols long. Shannon did not specify which base H must use, so this is a valid H. To convert it to nats of normal physical entropy or to entropy in bits, multiply by ln(n) or log2(n). The count/N is inverted inside the log to keep H positive. In this equation, with the ln(2) conversion factor, this entropy of "data" is physically the same as the entropy of "physics" if the symbols are indistinguishable, and we use energy E=kT*NH to change the state of our system, where our computer system has a k larger than kb due to inefficiency. Notice that changes in entropy will be the same without regard to k, which seems to explain why ultimately-distinguishable states get away with using higher-level microstate definitions that differ in absolute entropy. For thermo, fixing kb is what appears to have allowed not caring about the deeper states that were ultimately distinguishable.
In the equation above, there is a penalty if you chose a larger symbol set. Maybe that accounts for the extra memory space required to define symbols.
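A minimal sketch of this base-n entropy (function name mine), showing that maximally mixed data of any alphabet size lands on the same 0..N scale:

```python
import math
from collections import Counter

def entropy_base_n(symbols):
    """The proposed objective entropy N*H' = sum_i count_i * log_n(N/count_i),
    with the log base n equal to the number of distinct symbols."""
    N = len(symbols)
    counts = Counter(symbols)
    n = len(counts)
    if n < 2:
        return 0.0
    return sum(c * math.log(N / c, n) for c in counts.values())

# Both score N = 8 despite completely different symbol sets, so the
# measure stays comparable across systems.
e_binary = entropy_base_n("01010101")
e_quad = entropy_base_n("ABCDABCD")
```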
The best wiki articles are going to be like this: you derive what is the simplest but perfectly accurate view, then find the sources using that conclusion to justify its inclusion.
So if particles (symbols) are distinguishable and we use that level of distinguishability, the count at the 010 level has to be used. Knowing the sequence means knowing EACH particle's energy. The "byte-position" in a sequence of bits represents WHICH particle. This is not mere symbolism because the byte positions on a computer have a physical location in a volume, so that memory core and CPU entropy changes are exactly the physical entropy changes if they are at 100% efficiency ([[Landauer's principle]]). (BTW the isotope method won't work better than different molecules because it has more mass. This does not affect temperature, but it affects pressure, which means the count has to be different so that pressure is the same. So if you do not do anything that changes P/n in PV=nRT, using different gases will have no effect to your measured CHANGE in entropy, and you will not know if they mixed or not. ) [[User:Ywaz|Ywaz]] ([[User talk:Ywaz|talk]]) 11:48, 10 December 2015 (UTC)
By using indistinguishable states, physics seems to be using a non-fundamental set of symbols, which allows it to define states that work in terms of energy and volume as long as kb is used. The ultimate, as far as physicists might know, might be phase space (momentum and position) as well as spin, charge, potential energy and whatever else. Momentum and position per particle are 9 more variables because unlike energy momentum is a 6D vector (including angular), and a precise description of the "state" of a system would mean which particle has the quantities matters, not just the total. Thermo gets away with just assigning states based on internal energy and volume, each per particle. I do not see kb in the ultimate quantum description of entropy unless they are trying to bring it back out in terms of thermo. If charge, spin, and particles are made up of even smaller distinguishable things, it might be turtles all the way down, in which case, defining physical entropy as well as information entropy in the base of the number of symbols used (our available knowledge) might be best.
I didn't make it up. It's normally called normalized entropy, although they normally refer to this H with logn "as normalized entropy" when according to Shannon they should say "per symbol" and use NH to call it an entropy. I'm saying there's a serious objectivity to it that I did not realize until reading about indistinguishable states.
I hope you agree "entropy/symbol" is a number that should describe a certain variation in a probability distribution, and that if a set of n symbols were made of continuous p's, then a set of m symbols should have the same continuous distribution. But you can't do that (get the same entropy number) for the exact same "extrapolated" probability distributions if they use a differing number of symbols. You have to let the log base equal the number of symbols. I'll get back to the issue of more symbols having a "higher resolution". The point is that any set of symbols can have the same H and have the same continuous distribution if extrapolated.
If you pick a base like 2, you are throwing in an arbitrary element, and then have to call it (by Shannon's own words) "bits/symbol" instead of "entropy/symbol". Normalized entropy makes sense because of the following
entropy in bits/symbol = log2(2^("avg" entropy variation/symbol))
entropy per symbol = logn(n^("avg" entropy variation/symbol))
The equation I gave above is the normalized entropy that gives this 2nd result.
Previously we showed, for a message of N bits, that NH=MH' if the bits are converted to bytes and you calculate H' based on the byte symbols using the same log base as the bits, provided the bits are independent. M = number of byte symbols = N/8. This is fine for digital systems that have to use a certain amount of energy per bit. But what if energy is per symbol? We would want NH = M/8*H' because the byte system used 8 times fewer symbols.
By using log base n, H=H' for any set of symbols with the same probability distribution, and N*H=M/8*H.
Bytes can take on an infinite number of different p distributions for the same H value, whereas bits are restricted to a certain pair of values p0 and p1 (or p1 and p0) for a certain H, since p1=1-p0. So bytes have more specificity, which could allow for higher compression or for describing things like 6-vector momentum instead of just a single scalar for energy, using the same number of SYMBOLS. The normalized entropy allows them to have the same H, to get the same kTNH energy, without going through contortions. So for N particles, let's say bits are being used to describe each one's energy with entropy/particle H, and bytes are used to describe their momentums with entropy/particle H'. Momentums uniquely describe the energy (but not vice versa). NH=NH'. And our independence property does not appear to be needed: H' can take on specific values of p's that satisfy H=H', not some sort of average over those sets. Our previous method of NH=MH' is not as nice, violating Occam's razor.
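A quick sketch of the claim that H=H' in each alphabet's own log base (function name mine): a fair coin and a uniform 256-symbol "byte" source both give H = 1, with no factor-of-8 bookkeeping.

```python
import math

def H_base_n(probs):
    """Entropy per symbol computed in log base n, with n = len(probs)."""
    n = len(probs)
    return sum(p * math.log(1 / p, n) for p in probs if p > 0)

H_bit = H_base_n([0.5, 0.5])        # 1.0 in base 2
H_byte = H_base_n([1 / 256] * 256)  # 1.0 in base 256
```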
entropy and kolmogorov complexity
== Shannon entropy of a universal measure of Kolmogorov complexity ==
What is the relation of Kolmogorov complexity to [[Information theory]]? It seems very close to Shannon entropy, in that both are maximised by randomness, although one seems to deal more with the message than the context. [[User:Cesiumfrog|Cesiumfrog]] ([[User talk:Cesiumfrog|talk]]) 00:03, 14 October 2010 (UTC)
:Shannon entropy is a statistical measure applied to data which shows how efficiently the symbols are transferring bits based solely on how many times the symbols occur relative to each other. It's like the dumbest, simplest way of looking for patterns in data, which is ironically why it is useful. Its biggest most basic use is for taking data with a large number of different symbols and stating how many bits are needed to say the same thing without doing any compression on the data.
:Kolmogorov complexity is the absolute extreme in the direction where Shannon entropy is step 0. It is the highest possible level of intelligent compression of a data set. It is not computable in most cases; it's just a theoretical idea. But Shannon entropy is always computable and blind. By "compression" I mean an algorithm has been added to the data, and the data reduced to a much smaller set that is the input to the algorithm to regenerate the original data. Kolmogorov complexity is the combined size in bits of the program plus the smaller data set.
:I would use Shannon entropy to help determine the "real" length in bits of a program that is proposed to be better than others at getting close to the idealized Kolmogorov complexity. Competing programs might be using different or larger sets of functions, to each of which I would assign a symbol. Programs that use a larger set of functions would be penalized when the Shannon entropy measure is applied. There are a lot of problems with this starting point that I'll get to, but I would assign a symbol to each "function", like a=next line, b=*, e=OR, f=space between arguments, and so on. Numbers would remain number symbols 1,2,3, because they are already efficiently encoded. Then the Shannon entropy (in bits), and therefore the "reduced" attempted Kolmogorov complexity (Shannon entropy in bits), is
: N*H = - N \sum_i f_i \log_2 (f_i) = \sum_i count_i \log_2 (N/count_i)
:where i=1 to n is for each of the n distinct symbols, f_i is the frequency of symbol i occurring in the program, and count_i is the number of times symbol i occurs in the program that is N symbols long. This is the Shannon bits (the unit is called "shannons") in the program. The reason this is a better measure of K is because if
:normalized H = \sum_i count_i/N \log_n (N/count_i)
:is < 1, then there is an inefficiency in the way the symbols themselves were used (which has no relevance to the efficiency of the logic of the algorithm) that is more efficient when expressed as bits. Notice the log base is "n" in this 2nd equation. This H is normalized and is the truest entropy per symbol. To calculate things in this base use logn(x) = ln(x)/ln(n).
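:A minimal sketch of both measures applied to the NAND-wiring example string that appears below (function names are mine; one character per symbol):

```python
import math
from collections import Counter

def program_NH_bits(program):
    """N*H in bits (shannons) over a program's function symbols: a crude
    length measure for comparing candidate near-Kolmogorov programs."""
    N = len(program)
    return sum(c * math.log2(N / c) for c in Counter(program).values())

def program_H_normalized(program):
    """Normalized H (log base n); values below 1 flag symbol-usage
    inefficiency that is unrelated to the algorithm's logic."""
    N = len(program)
    counts = Counter(program)
    n = len(counts)
    if n < 2:
        return 0.0
    return sum(c / N * math.log(N / c, n) for c in counts.values())

prog = "ABCCDA&CEFA&G-A"
bits = program_NH_bits(prog)       # its attempted-K length in shannons
eff = program_H_normalized(prog)   # < 1 here: symbol frequencies are skewed
```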
:But there is a big problem. Suppose you want to define a function to be equal to a longer program and just assign a symbol to it, so the program length is reduced to nearly "1". Or maybe a certain function in a certain language is more efficient for certain problems. So there needs to be a reference "language" (set of allowable functions) to compare one K program to the next. All standard functions have a known optimal expression in Boolean logic: AND, NOT, and OR. So by choosing those 3 as the only functions, any program in any higher-level language can be reduced back to a standard. Going even further, these can be reduced to a transistor count or to a single universal logic gate like NAND or XOR, or a single universal reversible logic gate like Toffoli or Fredkin. Transistors are just performing a function too, so I guess they are universal too, but I am not sure. The universal gates can be wired to perform all logic operations and are Turing complete. So any program in any language can be re-expressed with a single fundamental function. The function itself will not even need a symbol in the program to identify it (as I'll show), so that the only code in the program is describing the wiring between "addresses". So the complexity describes the complexity of the physical connections, the path the electrons or photons in the program take, without regard to the distance between them. Example: the sequence of symbols ABCCDA&CEFA&G-A would mean "perform A NAND B and send output to C, perform C NAND D and send output to A and C, perform E NAND F and send output to A and F, go to A". To define input and output addresses, I would use something like ABD>(program)>G. The "goto" symbol "-" is not used if the program is expanded to express Levin's modified K complexity, which penalizes loops by expanding them. It thereby also takes into account computational time (and therefore a crude measure of energy), as well as a different type of program length.
:The program length by itself could or should be a measure of the computational infrastructure required (measured as the energy required to create the transistors), which is another reason it should be in terms of something like a single gate: so its cost to implement can be measured. What's the cost to build an OR versus a Fourier transform? Answer: count the required transistors or the NAND gates (which are made up of a distinct number of transistors). All basic functions have already had their Boolean logic and transistors reduced to a minimal level, so it's not arbitrary or difficult.
:I think this is how you can get a comparable K for all programs in all languages: express them in a single function with distinct symbols representing the input and output addresses. Then calculate Shannon's N*H to get its K or Levin complexity in absolute physical terms. Using [[Landauer's principle]], counting the number of times a bit address (at the input or output of a NAND gate) changes state will give the total energy in joules that the program used, Joules=k*T*ln(2)*N, where k is larger than Boltzmann's constant kb as a measure of the inefficiency of the particular physical implementation of the NAND gates. If a theoretically reversible gate is used, there is theoretically no computational energy loss except for the input and output changes. [[User:Ywaz|Ywaz]] ([[User talk:Ywaz|talk]]) 22:20, 12 December 2015 (UTC)
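A minimal sketch of the Landauer energy accounting above (function name mine; pass k > kb to model an inefficient gate, as described):

```python
import math

KB = 1.380649e-23  # Boltzmann's constant kb, J/K

def landauer_joules(bit_changes, T=300.0, k=KB):
    """Landauer bound on the energy of N bit-state changes:
    E = k * T * ln(2) * N."""
    return k * T * math.log(2) * bit_changes

E = landauer_joules(1e9)  # a billion gate-output state changes at room temp
```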
========
Simplifying the S-T equation for an ideal gas and looking at it from an entropy view.
== Derivation from uncertainty principle and relation to information entropy ==
Volume and energy are not the direct source of the states that generate entropy, so I wanted to express it in terms of x*p/h' number of states for each N. Someone above asked for a derivation from the uncertainty principle ("the U.P.") and he says it's pretty easy. S-T pre-dates U.P., so it may be only for historical reasons that a more efficient derivation is not often seen.
The U.P. says x'p' > h/4pi, where x' and p' are the standard deviations of x and p. The x and p below are the full range, not the 34.1% of the standard deviation, so I multiplied x and p each by 0.341. It is 2p because p could be + or - in x, y, and z. By plugging the variables in and solving, it comes very close to the S-T equation. For Ω/N=1000 this was even more accurate than S-T, 0.4% lower.
Sackur-Tetrode equation: S = k*N*(ln(Ω/N) + 5/2)
where Ω = (2xp/h')^3, p = sqrt(2m*b*U/N), h' = ħ/(0.341)^2, and V = x^3.
Stirling's approximation, N! ≈ (N/e)^N, is used in two places, resulting in a 1/N^(5/2) and an e^(5/2), which is where the 5/2 factor comes from. The molecules' internal energy U is kinetic energy for the monoatomic-gas case to which the S-T applies. b=1 for monoatomic gases, and it may simply be changed for non-monoatomic gases that have a different K.E./U ratio. The equation for p is the only difficult part of getting from the U.P. to the S-T equation, and it is difficult only because the thermodynamic measurements T (kinetic energy per atom) and V are an energy and a distance, whereas the U.P. needs x*p or t*E. This strangeness is where the 3/2 and 5/2 factors come from. The 2m is to get 2*m*(1/2*m*v^2) = p^2. Boltzmann's entropy assumes it is a max for the given T, V, and P, which I believe means the N's are evenly distributed in x^3 and all carry the same magnitude of momentum p.
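Since the derivation leans twice on Stirling's approximation, a quick numeric check of its accuracy in log form for a modest particle count (variable names mine):

```python
import math

N = 100
ln_factorial = math.lgamma(N + 1)  # ln(100!)
ln_stirling = N * math.log(N) - N  # ln((N/e)^N)
# Relative error in the log is under 1% already at N = 100, and shrinks
# further at thermodynamic particle counts.
rel_err = (ln_factorial - ln_stirling) / ln_factorial
```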
To show how this can come directly from information theory, first remember that Shannon's H function is valid only for a random variable. In this physical case, there are only 2 possible values each phase-space cell can have, with an atom in it or not, so it is like a binary file. But unlike normal information entropy, some or many of the atoms may have zero momentum while the others carry more: the total energy just has to be the same. So physical entropy can use anywhere from N to 1 symbols (atoms) to carry the same message (the energy), whereas information entropy is stuck with N. Where information entropy has a log(1/N^N) = -N*log(N) factor, physical entropy has log(1/N!) = -N*log(N)+N, which is a higher entropy. My correction to Shannon's H shown below is known as the sum of the surprisals and is equal to the information (entropy) content in bits. The left sum is the information contribution from the empty phase-space slots and the right sum is from those where an atom occurs. The left sum is about 0.7 without regard to N (1 if ln() had been used) and the right side is about 17*N for a gas at room temperature (for Ω/N ~ 100,000 states/atom):
Physical entropy S then comes directly from this Shannon entropy: S = kb*ln(2)*N*H.
A similar procedure can be applied to phonons in solids to go from information theory to physical entropy. For reference here is a 1D oscillator in solids:
[[User:Ywaz|Ywaz]] ([[User talk:Ywaz|talk]]) 02:37, 15 January 2016 (UTC)
=============
Email to Jon Bain, NYU, 1/17/2016 (this contains errors; see above for the best stuff).
You have a nice entropy presentation, but I wanted to show you that Shannon's H is an "intensive" entropy, i.e., entropy per symbol of a message, as he states explicitly in section 1.7 of his paper. This has much more of a direct analogy to Boltzmann's H-theorem, which Shannon says was his inspiration.
Specific entropy in physics is S/kg or S/mole which is what both the H's are. Regular extensive entropy S for both Shannon and Boltzmann is:
S=N*H
So conceptually it seems the only difference between Shannon/Boltzmann and any other physical entropy is that Shannon/Boltzmann make use of independent (non-interacting) symbols (particles) when they are available. It seems like they could be forced to generalize towards Gibbs or QM entropy as needed, mostly by just changing definitions of the variables (semantics).
Boltzmann's N particles
Let O{N!} = "O!" = number of QM-possible microstates for N at E
Only a portion of N may carry E (and therefore p) and each Ni may not occupy less than (xp/h')^3 phase space:
O{N!} = (2xp/h')^3N = [2x*sqrt(2m*E/N!)/h']^3N , N!=(N/e)^N
2xp because p could be + or -.
ln [ (O{N!})^3 / N! ] = ln [O^3N*(1/e)^(-3/2*N) / (N/e)^N ] = 5/2*N*[ln(O/N)+1]
O= 2xP/N/h' = 2xp/h'
In order to count O possible states I have to consider that some of the Ni's did not have the minimum x*p'/h', which is why S-T has error for small N: they can't be zero by QM.
S=k*ln(O{N!}/N!)=5/2*k*N*[ln(O/N) + 1]
results in a constant in the S-T equation: S=kN(ln(O/N) + 5/2)
S= k*ln(Gamma) = k*ln("O!"/N!) = k*5/2*N*(ln(O/N)+1] )
S=k*N*(-1*Oi/N*sum[ln(N/Oi) -const]
S=k*N*(sum[Oi/N*const - Oi/N*ln(N/Oi)]
Shannon: N=number of symbols in the data, O^O=distinct symbols,
S=N*(-sum(O/N*(log(O/N))
So Shannon entropy appears to me to be Boltzmann's entropy, except that particles can be used only once while symbols can be used many times. It seems to me Shannon entropy math could be applied to Boltzmann's physics (or vice versa) to get the right answer with only a few changes to semantics.
If I sent messages with uniquely-colored marbles from a bag with gaps in the possible time slots, I am not sure there is a difference between Shannon and Boltzmann entropy.
Can the partition function be expressed as a type of N!/O! ?
The partition function's macrostates seems to work backwards to get to Shannon's H because measurable macrostates are the starting point.
The S-T equation for a mono-atomic gas is S=kN*(ln(O/N)+5/2) where O is (2px/h')^3=(kT/h'f)^3 in the V=x^3 volume, where h' is an adjusted h-bar to account for p*x in the uncertainty principle being only 1 standard deviation (34.1%) instead of 100% of the possibilities, so h' = h-bar/(0.341)^2. There are no other constants or pi's in the S-T equation except for the 5/2, which comes from dividing the state count [(2xp/h')^3N] by N! ~ (N/e)^N with p ~ sqrt(2m*E/N): the Stirling factor contributes 1 and the 3/2 power of energy contributes the other 3/2.
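For reference, the standard form of the Sackur-Tetrode equation (the same expression the Perl check at the end of these notes computes as $ST), with U the total kinetic energy:

```latex
S = k_B N \left[ \ln\!\left( \frac{V}{N} \left( \frac{4 \pi m U}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right],
\qquad U = \tfrac{3}{2} N k_B T
```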
k is just a conversion factor from the kinetic energy per N as measured by temperature in order to get it into joules when you use dQ=dS*T. If we measure the kinetic energy of the N's in Joules instead of temp, then k is absent. Of course ln(2) is the other man-made historical difference.
If the logarithm base is made N, then it is a normalized entropy that ranges from 0 to 1 for all systems, giving a universal measure of how close the system is to maximum or minimum entropy.
I think your equations should keep "const" inside the parenthesis like this because it depends on N and I believe it is a simple ratio. I think the const is merely the result of "sampling without replacement" factorials.
=========
They say Boltzmann's entropy applies when the system has reached thermal equilibrium, so that S=N*H works, and that Gibbs entropy is more general: the system's measured macrostate may have a lower entropy than Boltzmann's calculation. I think this is like a message source where you have calculated the p's, but it sends you something different. Or you calculate S=N*H and the source's H turns out to be different in the future.
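A quick numeric sketch of that last point (the probabilities here are made up): if you budget bits using assumed p's but the source's actual frequencies turn out different, N*H from the assumed p's no longer matches what was actually sent.

```python
import math

# Hypothetical numbers: a source modeled as fair binary (p = 0.5 each)
# that actually emits '0' 90% of the time.
assumed = {"0": 0.5, "1": 0.5}   # the p's you calculated in advance
actual  = {"0": 0.9, "1": 0.1}   # what the source turned out to do

H = lambda dist: sum(p * math.log2(1 / p) for p in dist.values())

N = 1000  # symbols
print(N * H(assumed))  # bits you budgeted: 1000.0
print(N * H(actual))   # bits the source actually needed: ~469
```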
=======
post to rosettacode.org on "entropy" where they give code for H in many languages.
== This is not exactly "entropy". H is in bits/symbol. ==
This article is confusing Shannon entropy with information entropy and incorrectly states Shannon entropy H has units of bits.
There are many problems in applying H= -1*sum(p*log(p)) to a string and calling it the entropy of that string. H is called entropy but its units are bits/symbol, or entropy/symbol if the correct log base is chosen. For example, H of 01 and 011100101010101000011110 are exactly the same "entropy", H=1 bit/symbol, even though the 2nd one obviously carries more information entropy than the 1st. Another problem is that if you simply re-express the same data in hexadecimal, H gives a different answer for the same information entropy. The best and real information entropy of a string is 4) below. Applying 4) to binary data gives the entropy in "bits", but since the data was binary, its units are also a true "entropy" without having to specify "bits" as a unit.
Total entropy (in an arbitrarily chosen log base, which is not the best type of "entropy") for a file is S=N*H where N is the length of the file. Many times in [http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20Theory%20of%20Communication.pdf Shannon's book] he says H is in units of "bits/symbol", "entropy/symbol", and "information/symbol". Some people don't believe Shannon, so [https://schneider.ncifcrf.gov/ here's a modern respected researcher's home page] that tries to clear up the confusion by stating the units out in the open.
Shannon called H "entropy" when he should have said "specific entropy" which is analogous to physics' S0 that is on a per kg or per mole basis instead of S. On page 13 of [http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20Theory%20of%20Communication.pdf Shannon's book], you easily can see Shannon's horrendous error that has resulted in so much confusion. On that page he says H, his "entropy", is in units of "entropy per symbol". This is like saying some function "s" is called "meters" and its results are in "meters/second". He named H after Boltzmann's H-theorem where H is a specific entropy on a per molecule basis. Boltzmann's entropy S = k*N*H = k*ln(states).
There are 4 types of entropy of a file N symbols long with n unique types of symbols (plus two related physical entropies, items 5 and 6):
1) Shannon (specific) entropy '''H = sum(count_i / N * log(N / count_i))'''
where count_i is the number of times symbol i occurred in N.
Units are bits/symbol if log is base 2, nats/symbol if natural log.
2) Normalized specific entropy: '''Hn = H / log(n).'''
The division converts the logarithm base of H to n. Units are entropy/symbol. Ranges from 0 to 1. When it is 1 it means each symbol occurred equally often, n/N times. Near 0 means all symbols except 1 occurred only once, and the rest of a very long file was the other symbol. "Log" is in same base as H.
3) Total entropy '''S' = N * H.'''
Units are bits if log is base 2, nats if natural log.
4) Normalized total entropy '''Sn' = N * H / log(n).''' See "gotcha" below in choosing n.
Unit is "entropy". It varies from 0 to N.
5) Physical entropy S of a binary file when the data is stored perfectly efficiently (using Landauer's limit): '''S = S' * kB / log(e)'''
6) Macroscopic information entropy of an ideal gas of N identical molecules in its most likely random state (n=1 and N is known a priori): '''S' = S / kB''' (in nats), where S = kB*[ln(states^N/N!)] = kB*N*[ln(states/N)+1].
*Gotcha: a data generator may have the option of using say 256 symbols but only use 200 of those symbols for a set of data. So it becomes a matter of semantics if you chose n=256 or n=200, and neither may work (giving the same entropy when expressed in a different symbol set) because an implicit compression has been applied.
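The four file entropies above can be computed in a few lines. A sketch in Python (log base 2, using the example string "1223334444" from the entropy task):

```python
import math
from collections import Counter

def entropies(s):
    """Return (H, Hn, S, Sn) -- items 1) to 4) above, log base 2."""
    N = len(s)
    counts = Counter(s)
    n = len(counts)                                             # unique symbol types
    H = sum(c / N * math.log2(N / c) for c in counts.values())  # 1) bits/symbol
    Hn = H / math.log2(n) if n > 1 else 0.0                     # 2) ranges 0..1
    S = N * H                                                   # 3) total bits
    Sn = S / math.log2(n) if n > 1 else 0.0                     # 4) ranges 0..N
    return H, Hn, S, Sn

H, Hn, S, Sn = entropies("1223334444")
print(round(H, 5), round(Hn, 3), round(S, 4), round(Sn, 2))
# → 1.84644 0.923 18.4644 9.23
```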
=====
rosetta stone, on the main entropy page:
Calculate the Shannon entropy H of a given input string.
Given the discrete random variable X that is a string of N "symbols" (total characters) consisting of n different characters (n=2 for binary), the Shannon entropy of X in '''bits/symbol''' is:
: H(X) = -sum_{i=1..n} (count_i/N) * log2(count_i/N)
where count_i is the count of character i.
For this task, use X="1223334444" as an example. The result should be 1.84644... bits/symbol. This assumes X was a random variable, which may not be the case, or it may depend on the observer.
This coding problem calculates the "specific" or "[[wp:Intensive_and_extensive_properties|intensive]]" entropy that finds its parallel in physics with "specific entropy" S0, which is entropy per kg or per mole, not like physical entropy S, and therefore not the "information" content of a file. It comes from Boltzmann's H-theorem, where S = kB*N*H with N = number of molecules. Boltzmann's H is the same equation as Shannon's H, and it gives the specific entropy H on a "per molecule" basis.
The "total", "absolute", or "[[wp:Intensive_and_extensive_properties|extensive]]" information entropy is
: S = N * H bits
This is not the entropy being coded here, but it is the closest to physical entropy and a measure of the information content of a string. But it does not look for any patterns that might be available for compression, so it is a very restricted, basic, and certain measure of "information". Every binary file with an equal number of 1's and 0's will have S = N bits. All hex files with equal symbol frequencies will have S = 4N bits of entropy. The total entropy in bits of the example above is S = 10*1.84644 = 18.4644 bits.
The H function does not look for any patterns in data or check if X was a random variable. For example, X=000000111111 gives the same calculated entropy in all senses as Y=010011100101. For most purposes it is usually more relevant to divide the gzip length by the length of the original data to get an informal measure of how much "order" was in the data.
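A sketch of that gzip heuristic (the compressed/original ratio is only an informal order measure, and zlib's header overhead distorts it for short strings, so longer inputs are used here):

```python
import random
import zlib

def order_ratio(data: bytes) -> float:
    """Compressed length / original length: smaller means more 'order'."""
    return len(zlib.compress(data, 9)) / len(data)

# Both inputs have identical symbol counts, so H is identical for both...
ordered = b"0" * 500 + b"1" * 500
random.seed(1)
shuffled = bytes(random.sample(ordered, len(ordered)))

# ...but gzip sees the pattern that H misses.
print(order_ratio(ordered) < order_ratio(shuffled))  # True
```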
Two other "entropies" are useful:
Normalized specific entropy:
: Hn = H / log2(n)
which varies from 0 to 1 and has units of "entropy/symbol" or just 1/symbol. For this example, Hn = 0.923.
Normalized total (extensive) entropy:
: Sn = N * H / log2(n)
which varies from 0 to N and has no units. It is simply the "entropy", but it needs to be called "total normalized extensive entropy" so that it is not confused with Shannon's (specific) entropy or physical entropy. For this example, Sn = 9.23.
Shannon himself is the reason his "entropy/symbol" H function is very confusingly called "entropy". That's like calling a function that returns a speed a "meter". See section 1.7 of his classic [http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20Theory%20of%20Communication.pdf A Mathematical Theory of Communication] and search on "per symbol" and "units" to see he always stated his entropy H has units of "bits/symbol" or "entropy/symbol" or "information/symbol". So it is legitimate to say entropy NH is "information".
In keeping with Landauer's limit, the physical entropy generated from erasing N bits is S = kB*ln(2)*N if the bit storage device is perfectly efficient. This can be solved for N to (arguably) get the number of bits of information that a physical entropy represents.
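Plugging numbers into that Landauer relation (the bit count and temperature here are hypothetical; kB is the CODATA value):

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
N = 1e9             # bits erased (hypothetical)

S = kB * math.log(2) * N   # physical entropy generated, J/K
Q = S * 300                # minimum heat dissipated at T = 300 K, joules

print(S)   # ~9.57e-15 J/K
print(Q)   # ~2.87e-12 J

# Solving back, N = S / (kB * ln 2) recovers the bit count.
print(S / (kB * math.log(2)))  # ~1e9
```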
=========
== From Shannon's H to ideal gas S ==
This is a way I can go from information theory to the Sackur-Tetrode equation by simply using the sum of the surprisals or in a more complicated way by using Shannon's H. It gives the same result and has 0.16% difference from ST for neon at standard conditions.
Assume each of the N atoms can either be still or be moving, with the total energy divided among "i" of them. The message the set of atoms sends is "the total kinetic energy in this volume is E". The length of each possible message is the number of moving atoms "i". The number of different symbols they can use is N different energy levels, as the number of moving atoms ranges from 1 to N. When fewer atoms are carrying the same total kinetic energy E, they will each have a larger momentum, which increases the number of possible states they can have inside the volume in accordance with the uncertainty principle. A complication is that momentum increases as the square root of energy and can go in 3 different directions (it's a vector, not just a magnitude), so there is a 3/2 power involved. The information theory entropy S in bits is the sum of the surprisals. The log gives the information each message sends, summed over N messages.
S = sum over i=1..N of log2(Oi/i), where Oi = O*(N/i)^(3/2) and O = (xp/h')^3 is the state count when all N atoms move, with p the per-atom momentum and h' the adjusted Planck constant described earlier (this is the sum the Perl code below computes as SS).
To convert this to physical entropy, multiply by kB and change the base of the logarithm to ln(): S = kB*sum(ln(Oi/i)).
You can make the following substitutions to get the Sackur-Tetrode equation:
The probability of encountering an atom in a certain momentum state depends (through the total energy constraint) on the states of the other atoms. So the states of the individual atoms are not independent random variables with regard to the other atoms, and I can't write H as a function of the state of each atom (I can't use S=N*H directly). But by looking at the math, it seems valid to re-interpret an S=ΩH as an S=N*H. The H inside the parentheses is entropy per state. The 1/i makes it entropy per moving atom, then the sum over N gives total entropy. The sum for i over N was for the H, but then the 1/i re-interpreted it to be per atom. So it is an odd type of S=NH. Notice my sum for j does not actually use j; it just adds the same pi up for all states.
Here's how I used Shannon's H in a more direct but complicated way but got the same result:
There will be 3 symbol types to count in order to measure the Shannon entropy: empty states (there are Ω-N of them), states with a moving atom (i of them), and states with a still atom (N-i). Total energy determines what messages the physical system can send, not the length of the message. It can send many different messages. This is why it's hard to connect information entropy to physical entropy: physical entropy has more freedom than a normal message source. So in this approximation, N messages will have their Shannon H calculated and averaged. Total energy is evenly split among "i" moving atoms, where "i" will vary from 1 to N, giving N messages. The number of phase states (the length of each message) increases as "i" (moving atoms) decreases, because each atom has to have more momentum to carry the total energy. The Ωi states (message length) for a given "i" of moving atoms is a 3/2 power of energy because the uncertainty principle determining the number of states is 1D while volume is 3D, and momentum is a square root of energy.
Again multiply by kB and change the logarithm to ln() to convert to physical entropy. Shannon's entropy Hi for the 3 symbols is the sum over the probability of encountering an empty state, a moving-atom state, and a "still" atom state. I am employing a cheat beyond the above reasoning by counting only 1/2 the entropy of the empty states. Maybe that's a QM effect.
Notice H*Ω simplifies to the count equation programmers use. E=empty state, M=state with moving atom, S=state with still atom.
By some miracle the above simplifies to the previous equation. I couldn't show it algebraically, so I wrote a Perl program to calculate it directly and compare it to the ST equation for neon gas at ambient conditions in a 0.2 micron cube (that small to keep the number of loops under about 1 million). The official standard molar entropy for neon is 146.22 entropy/mole, so the comparison value is 146.22 / 6.022E23 * N. It was within 0.20%. I changed P, T, or N by 1/100 and 100x and the difference ranged from 0.24% to 0.12%.
#!/usr/bin/perl
# Neon gas entropy by Sackur-Tetrode (ST), sum of surprisals (SS), and Shannon's H (SH).
$T=298; $V=8E-21; $kB=1.381E-23; $m=20.8*1.66E-27; $h=6.6262E-34; $P=101325; # neon, 1 atm, 0.2 micron sides cube
$N = int($P*$V/$kB/$T+0.5);   # number of atoms from the ideal gas law
$U = $N*3/2*$kB*$T;           # total kinetic energy
$ST = $kB*$N*(log($V/$N*(4*3.142/3*$m*$U/$N/$h**2)**1.5)+5/2)/log(2.718);
$x = $V**0.33333;                            # side of the cube
$p = (2*$m*$U/$N)**0.5;                      # per-atom momentum when all N move
$O = ($x*$p/($h/(4*3.142*0.341**2)))**3;     # states per atom via adjusted h'
for ($i=1;$i<$N;$i++) {
    $Oi = $O*($N/$i)**1.5;   # state count when only i atoms carry E
    $SH += 0.5*($Oi-$N)*log($Oi/($Oi-$N)) + $i*log($Oi/$i) + ($N-$i)*log($Oi/($N-$i));
    $SS += log($Oi/$i);
}
$SH += 0.5*($O-$N)*log($O/($O-$N)) + $N*log($O/$N);   # the $i=$N case
$SH = $kB*$SH/log(2.718)/$N;   # average over the N messages, in J/K
$SS = $kB*$SS/log(2.718);
print "SH=$SH, SS=$SS, ST=$ST, SH/ST=".$SH/$ST.", N=".int($N).", Omega=".int($O).", O/N=".int($O/$N);
exit;
[[User:Ywaz|Ywaz]] ([[User talk:Ywaz|talk]]) 19:54, 27 January 2016 (UTC)
Monday, November 23, 2015
FDA, pharmaceuticals, chemo, heart surgery, and supplements
The FDA will not allow "snake oil" to be advertised as a cure for a disease. It's a great idea. However, meeting the FDA's burden of proof makes demonstrating that a treatment helps a disease a very expensive process. The same burden of proof is not required for any surgery, because there is no way to perform the surgery that many times, especially across multiple centers trying different techniques, or to have a placebo group. This is how the cardiology field gets away with literally selling snake oil and killing people who could have lived. Honest groups of cardiologists have even published the vast failure of heart surgeries that should have been stent placements instead (basically, if you are not in the wake of a recent heart attack, you should be skeptical of the need for open heart surgery over a stent, according to the medical field's own research). The oncology field commits the same crimes under the pretense of investigating new drugs (for the past 4 decades, with no improvements) that sell false hope. It's not distinguishable from snake oil. The details are usually arguable, but in many specific cases it is clear. You can't get doctors to testify against other doctors in most cases, or they can get in deep systematic trouble with their peers, especially with the heads of the medical boards that issue the licenses needed to testify. Researchers face the same problem: be careful which truths or opinions you tell, or your funding dries up.
My point in the comparison is that heart surgeries and chemotherapies known to be harmful and to provide no benefit are being prescribed, and yet people who sell very safe, inexpensive compounds are not allowed to tell you (by FDA regulations and severe penalties) that those compounds stop Parkinson's in the test tube and in animals, and prevent PD in epidemiological studies. Instead, supplements require the same level of proof as usually-toxic and expensive pharmaceuticals. Worse, the majority of private and public funding goes to the pharmaceuticals there is every reason to believe will not help (at least in cancer), as opposed to the safe and cheap supplements that already have supporting evidence. Where's the logic in this, if it is not some sort of accidental or intentional conspiracy in our economics?
Then there is advertising: GM1 phase 3 trials were complete over 5 years ago. But almost no one here has heard about GM1. See
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3532888/
It reverted and stopped the disease from progressing, which is not being said of nilotinib. It was 77 patients instead of 12. The GM1 data is published; the nilotinib study is not. GM1 was a phase 3 trial; nilotinib is only in a phase 1 trial that is supposed to be looking only at safety. I wonder if a lot of professionals are aghast that they are saying these things through the media before publishing, especially since the trial was supposed to look only at safety. On the other hand, only 12 people and the researchers doing such an outlandish thing as going public could mean it really is that good. I just hope it is not cold fusion all over again, if you remember that fiasco where they went public without the science to back it up. Getting back to the point of advertising: I came across GM1 accidentally because I was looking for the trial on nilotinib.
Two years after treatment stopped, they had progressed to where "standard care" patients had been 2 years earlier. In other words, the benefits seemed to permanently reverse the condition by 2 years even after treatment was stopped. That's my reading of figure 2 in the link above.
It's not a conspiracy theory, but I think it's just how economics works. We all seek profit. Doctors, politicians, pharmaceuticals, and researchers should not be expected to act any different than we do. I assume everyone acts about like mechanics, plumbers, painters, and AC repairmen. My experience with them does not usually fall under the heading of "honest" and "fair". If the consumer is not knowledgeable, then he should expect to be taken to the cleaners.
I do not know of something better than GM1, but that does not mean there are not 5 other compounds out there with similar proof. It's just one I accidentally saw. Gallic acid might be just as good and it's super cheap, but it probably hasn't been tested in people, except for the benefits noted from black tea and grape seed.
Monday, November 16, 2015
Parkinson's: distinguishing tremors, Charcot's May 22, 1888 lecture, mercury
Intention Tremor (associated with mercury poisoning and multiple sclerosis)
Intention tremor results when the antagonist activation that normally stops a goal-directed movement as the goal is approached, is inappropriately sized or timed. It often indicates a lesion in the dentate nucleus or its outflow tract through the superior cerebellar peduncle. Underlying causes include multiple sclerosis, spinocerebellar ataxias, and other degenerative, metabolic, or neoplastic disorders affecting these cerebellar structures. Treatment is often not satisfactory, but low doses of benzodiazepines can improve the situation, and promising results of DBS treatment have been reported.[82]
Essential Tremor
Essential tremor is the most common form of tremor, and probably the most common movement disorder in general. Unfortunately, there is no uniformly accepted definition of essential tremor. Widely used definitions are those developed by the Movement Disorder Society's Tremor Investigation Group and those used in the Washington Heights-Inwood genetic study, but several others exist.[1,24–27] However, the percentage of individuals fulfilling different commonly used diagnostic criteria for essential tremor has shown considerable variation.[26] The Movement Disorder Society's Tremor Investigation Group defines essential tremor as a bilateral, largely symmetric postural or kinetic tremor involving hands and forearms that is visible and persistent, and in which there is no other explanation for the tremor.[1] Additional or isolated head tremor is compatible with essential tremor as long as there is no abnormal head posturing.[1] In view of the difficulties in applying essential tremor diagnostic criteria, it may be reasonable to consider treating individuals even if they do not strictly fulfill these criteria. Probably, essential tremor reflects a clinical syndrome rather than a single disease entity.
Only about half of essential tremor patients report a positive family history, which means that the term "familial tremor" is not congruent with essential tremor.[26] Essential tremor may involve the voice, but not in isolation, and only rarely affects the legs. Although the diagnostic criteria require "largely symmetric" tremor, 50% of 487 consecutive individuals diagnosed with essential tremor at Mayo Clinic had asymmetrical disease, most of them with greater tremor severity on their dominant side.[26] A tremor strictly confined to an ipsilateral arm and leg (hemibody tremor) is less likely essential tremor and more likely secondary to a structural lesion.[28] Essential tremor usually has a frequency of 5 to 10 Hz and no latency to onset.
Symptom severity often increases over time, but the progression can be very slow.[26,29] The majority of patients do not show accompanying neurologic signs or symptoms, but occasionally instability or more distinct cerebellar signs may be found during examination, especially in long-standing tremor.[30]
Several lines of evidence suggest that cerebellar function is disturbed in essential tremor. This is consistent with clinical observations of cerebellar signs, such as slight dysmetria, an ataxic gait, or a component of intention tremor, within a subgroup of patients with essential tremor.[30,31] Until a few years ago, essential tremor was considered a non-degenerative disorder resulting from abnormal excitability alone. More recently, relatively slight but distinct pathologic changes in essential tremor have been studied systematically.[4] Different patterns of pathologic appearance were distinguishable. In one group of essential tremor patients, there were pathologic changes within the cerebellar cortex. These included the loss of Purkinje cells, rounded swellings of their axons (visible microscopically as "torpedoes") and dendrites, a disturbed microarchitecture of the cerebellar cortex with heterotopic Purkinje cells displaced into the molecular layer, and an unusually dense and tangled basket cell plexus ("hairy baskets").[4] A pathologically distinct second group of essential tremor patients had Lewy bodies in the locus ceruleus (but not in other structures, as in PD).[4] The noradrenergic cells of the locus ceruleus terminate in the branches of the widely ramified Purkinje cell dendrites. Purkinje cells are GABAergic (γ-aminobutyrate-releasing) cells that exert an inhibitory effect on the neurons of the dentate nucleus. Cell loss in the locus ceruleus leads to decreased noradrenergic stimulation of Purkinje cells, which reduces their inhibitory effect on the dentate nucleus and the other components of the triangle of Guillain and Mollaret. This mechanism is analogous to the severe action tremor characteristic of spinocerebellar ataxia type 2 (SCA 2), whose pathologic correlate is the preferential degeneration of Purkinje cells.[32] Efferent fibers of the cerebellar dentate nucleus also project to the ventrointermediate nucleus of the thalamus (VIM).
These more recent pathologic findings, taken together with the higher incidence of essential tremor observed in relatives of individuals with other neurodegenerative disorders such as PD, and possibly a common genetic background,[33–35] have led to suggestions that essential tremor is in fact also a neurodegenerative disorder.
Treatment of Essential Tremor
A variety of treatment options for essential tremor are available today, which makes it possible, but also necessary, to select the most appropriate solution for the individual patient. The patient's subjective experience of the tremor's severity, and the degree of impairment and disability it causes in the patient's life, are more important than the objective assessment during the clinic visit. Such assessment may be difficult. Studies have shown that, on average, physical and mental quality-of-life measures are lower in essential tremor patients than in healthy individuals.[36,37] Nevertheless, a considerable number of essential tremor patients have a low degree of impairment or disability and little emotional suffering from their disorder. The nonpharmacologic treatment options outlined above are considered for all patients with essential tremor.
Pharmacologic treatment may be used either intermittently or daily and is most effective at reducing limb tremor. In the absence of contraindications, propranolol and primidone are both recommended as first-line choices.[15,38–40] Propranolol is typically effective in doses of 40 to 240 mg/day. It may be prudent to obtain an electrocardiogram before starting propranolol to assess for significant bradycardia, and to be cognizant of a β-blocker's potential to induce orthostatic hypotension, especially in older patients. Primidone is not approved for the treatment of essential tremor in many countries (including the United States) but is widely considered effective. It should be initiated gingerly, perhaps at 12.5 mg daily, then titrated upward slowly to the lowest effective dose, which is usually between 50 and 750 mg daily (divided into twice-daily or thrice-daily dosing). If propranolol or primidone does not provide satisfactory tremor relief, guidelines unanimously recommend the combination of propranolol plus primidone. Gabapentin, topiramate, and lorazepam are considered second- and third-line drugs.[15,39,40] Clozapine or botulinum toxin injections may provide relief to patients not responding to the options above, but both have disadvantages.[39,41] Clozapine confers a risk of agranulocytosis and necessitates regular blood cell counts. Botulinum toxin is expensive, needs to be administered repeatedly, and carries a risk of weakness in the treated body parts. Overall, the efficacy of pharmacologic treatment of essential tremor is unfortunately low. A reduction of tremor severity by 75% is considered a good response, and only 40 to 50% of patients will benefit from pharmacologic treatment.[3,38] The tremor will rarely disappear completely or in all situations, and thus physician and patient need to be aware that the goal of treatment is a noticeable reduction in tremor severity, not complete freedom from symptoms.
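The titration logic described above (start very low, increase slowly, stop at the lowest effective dose within the usual range) can be sketched in code. Only the 12.5 mg starting dose and the 50 to 750 mg/day range come from the text; the weekly interval and the dose-doubling steps are illustrative assumptions, not a dosing recommendation.

```python
# Sketch of the slow upward titration described above for primidone.
# Only the 12.5 mg start and the 50-750 mg/day ceiling come from the
# text; the weekly interval and dose-doubling steps are assumptions
# chosen purely for illustration.

def titration_schedule(start_mg=12.5, max_mg=750.0):
    """Return (week, daily_dose_mg) pairs, doubling weekly, capped at max_mg."""
    schedule = []
    week, dose = 1, start_mg
    while dose <= max_mg:
        schedule.append((week, dose))
        week += 1
        dose *= 2
    return schedule

for week, dose in titration_schedule():
    print(f"week {week}: {dose:g} mg/day")
```

In practice the titration would of course stop at whatever dose controls the tremor, not run to the ceiling; the sketch only shows the shape of the schedule.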
A recent study that included 528 essential tremor patients found that almost one-third of patients discontinued treatment within the first year.[4] This fraction was similar for those with mild or more severe tremor, and the result was largely ascribed to the inadequacy of medical treatment options.[4]
In one study, alcohol ingestion was more efficacious at alleviating the tremor of essential tremor than propranolol or primidone, but some patients experience a rebound worsening of tremor when the alcohol's effect wanes.[42] There are concerns about alcohol dependence and abuse, but studies addressing this issue have led to conflicting results.[15] Alcohol may also not be acceptable to a patient for personal, cultural, or religious reasons. Surgical treatment with deep brain stimulation (DBS) of a target within or near the VIM (Fig. 1) can improve essential tremor in patients who do not respond satisfactorily to other treatment modalities, and has good short-term and long-term effects.[43] Maximal improvement of motor symptoms with minimal side effects was achieved by targeting DBS at the cerebellothalamic tracts in the subthalamic area rather than the thalamus itself.[44]
Tremor in Parkinson's Disease
Tremor is one of the cardinal features of PD and was described in ancient Indian accounts of the "kampavata" illness, which probably corresponds to the modern definition of PD.[45] Tremor is often the presenting feature of PD, but it is not a necessary feature for this diagnosis, and ~25% of patients with PD in fact never develop tremor (the akinetic-rigid form). Furthermore, tremor may diminish in later stages of the disease, when bradykinesia becomes more prominent.[46] The typical and rather complex movements of parkinsonian rest tremor (Fig. 2) indicate PD with high specificity. They include agonist and antagonist activation alternating in a precisely tuned manner, often leading to a stereotypical series of movements, such as the typical pill-rolling tremor. There are descriptions of patients with only a rest tremor who do not subsequently develop PD, and the term monosymptomatic rest tremor has been suggested when this situation has persisted for at least 2 years.[1] However, reduced putaminal fluorodopa uptake has been found in some patients with monosymptomatic rest tremor, suggesting they may have subclinical parkinsonian syndromes.[47]
Diagnostic difficulties can arise when a patient only has tremor and no other signs and symptoms are found, when no tremor is visible during the office visit, or when other forms of tremor coexist.[1] Rarely, PD patients may only have a kinetic tremor. Thus, a diagnosis of PD should never be based solely on tremor, but requires the presence of the other cardinal symptoms of PD, notably, bradykinesia.
Tremor in PD is often more difficult to alleviate than the hypokinetic PD manifestations (bradykinesia and rigidity).[48] The tremor is not as responsive to dopaminergic therapy as the hypokinetic features, or may not improve with medical treatment at all.[48] An analysis of what is known about the pathogenesis of PD tremor may help explain this discrepancy. The pathologic hallmark of PD is the loss of dopamine-producing neurons in the substantia nigra pars compacta (SNc), especially its ventrolateral portion, which projects to the putamen.[48,49] This induces a dopaminergic deficit in the striatum, where these neurons form synapses on neurons belonging to two distinct classical corticostriatothalamocortical circuits, known as the indirect and direct pathways (Fig. 1). Dopaminergic neurons project to striatal cells that form part of the indirect pathway; these are equipped with inhibitory D2 receptors. Thus, dopamine exerts a net inhibitory effect on the indirect pathway loop, and the dopaminergic deficit of PD reduces this inhibition. Dopamine also acts on the excitatory D1 receptors found on inhibitory striatopallidal cells of the direct pathway. More recent findings also show that the anatomic connections between the brainstem nuclei are more complex than previously appreciated. Cortical neurons that activate the subthalamic nucleus (STN) without any relay in the basal ganglia have been identified (the hyperdirect pathway).[50,51] Furthermore, the "striatofugal" neurons from the striatum to the internal globus pallidus (GPi), which form part of the classical direct pathway, also send collaterals to the external globus pallidus (GPe), at least in nonhuman primates.[52,53] This means that the classical direct and indirect pathways are closely interwoven. Furthermore, feedback neurons from the GPe to the striatum, as well as from the GPe to the GPi, have been discovered in different mammals.[54]
Several intriguing findings argue against the striatonigral dopaminergic deficit directly causing PD tremor. The extent of dopamine deficiency and the degree of disease progression correlate well with the severity of rigidity and bradykinesia, but not with tremor.[54–56] In statistical analyses of PD patients' symptoms, tremor occurred independently of the other cardinal features.[57] Through DBS electrodes, high-frequency oscillations were recorded from the STN in PD patients with tremor, and likewise these oscillations correlated with akinesia and rigidity, but not with tremor.[56,58] Rigidity and bradykinesia improved after the injection of the GABA agonist muscimol into the pallidum, but rest tremor simultaneously deteriorated.[59] In view of these findings, it has been postulated that tremor in PD results from a compensatory mechanism downstream of the disturbed basal ganglia activity.[54] Another possibility is that tremor, like many other signs and symptoms of PD,[60] is a further consequence of the neurodegenerative changes underlying PD, independent of the direct cause of bradykinesia or rigidity.
Treatment of Tremor in Parkinson's Disease
Available treatment options include dopaminergic agents, anticholinergics, β-blockers, and DBS. Levodopa and dopamine agonists alleviate parkinsonian symptoms, including tremor in some patients, but tremor control is often not satisfactory. Although frequently discussed, there are no convincing data showing that dopamine agonists lead to greater improvement of tremor than levodopa. The clinical trials performed with this question in mind either did not directly compare a dopamine agonist with levodopa;[61–65] did not assess tremor as the primary outcome but in post hoc analyses;[62,65] or recorded effect sizes that, even though statistically significant, were small.[63,64] It also remains uncertain whether the addition of a dopamine agonist to levodopa leads to small improvements in tremor. In general, levodopa remains the antiparkinsonian medication producing maximal motor benefit in PD patients, with the fewest side effects. In patients younger than 60 years of age, dopamine agonists may be considered, as there is some evidence for a possibly lower risk of dyskinesias in later stages of the disease compared with treatment initiated with levodopa.[66] More recently, 10-year follow-up data from a multicenter cohort found no such difference.[67]
β-blockers also have a documented effect in parkinsonian tremor, but they may aggravate the orthostatic hypotension that often develops in PD, which can have serious consequences. Moreover, a 2003 Cochrane review of four studies could not determine whether β-blocker therapy is effective and safe for the treatment of tremor in PD, and warned about bradykinesia as a side effect.[68]
Anticholinergic drugs were formerly used for the treatment of PD. The rationale behind their use is that the dopaminergic deficit in PD leads to a relative excess of acetylcholine in the striatum, and that anticholinergic drugs can restore a balance on a lower level of both transmitters. In fact, experience shows that anticholinergics can improve tremor in PD. However, there are no modern studies on their use, and side effects can be dramatic. Nevertheless, some authorities recommend anticholinergics as one of several treatment options for younger patients with tremor-dominant PD who do not respond to other medications.[69] DBS targeting the subthalamic nucleus or VIM, stations in the indirect pathway, has been shown to be effective.[70,71]
================
from Wikipedia concerning inorganic mercury vapors:
"In the late 1800s, hat makers, or hatters, used mercury nitrate when working with beaver fur to make felt. Over time, the hatters started exhibiting apparent changes in personality and also experienced tremors or shaking. Mercury poisoning attacks the nervous system, causing drooling, hair loss, uncontrollable muscle twitching, a lurching gait, and difficulties in talking and thinking clearly. Stumbling about in a confused state with slurred speech and trembling hands, affected hatters were sometimes mistaken for drunks. The ailment became known as 'The Danbury Shakes' in the community of Danbury, where hat making was a major industry. In very severe cases, they experienced hallucinations. The term 'mad as a hatter' may be a product of mercury toxicity. The practice did not completely stop until 1943."
"Occupational exposure has resulted in erethism[inorganic mercury poisoning], with irritability, excitability, excessive shyness, and insomnia as the principal features of a broad-ranging functional disturbance. With continuing exposure, a fine tremor develops, initially involving the hands and later spreading to the eyelids, lips, and tongue, causing violent muscular spasms in the most severe cases. The tremor is reflected in the handwriting which has a characteristic appearance. In milder cases, erethism and tremor regress slowly over a period of years following removal from exposure. Decreased nerve conduction velocity in mercury-exposed workers has been demonstrated. Long-term, low-level exposure has been found to be associated with less pronounced symptoms of erethism, characterized by fatigue, irritability, loss of memory, vivid dreams, and depression."
"The man affected is easily upset and embarrassed, loses all joy in life and lives in constant fear of being dismissed from his job. He has a sense of timidity and may lose self control before visitors. Thus, if one stops to watch such a man in a factory, he will sometimes throw down his tools and turn in anger on the intruder, saying he cannot work if watched. Occasionally a man is obliged to give up work because he can no longer take orders without losing his temper or, if he is a foreman, because he has no patience with men under him. Drowsiness, depression, loss of memory and insomnia may occur, but hallucinations, delusions and mania are rare.
The most characteristic symptom, though it is seldom the first to appear, is mercurial tremor. It is neither as fine nor as regular as that of hyperthyroidism. It may be interrupted every few minutes by coarse jerky movements. It usually begins in the fingers, but the eyelids, lips and tongue are affected early. As it progresses it passes to the arms and legs, so that it becomes very difficult for a man to walk about the workshop, and he may have to be guided to his bench. At this stage the condition is so obvious that it is known to the layman as 'hatter's shakes.'"
Buckell et al, Chronic Mercury Poisoning (1946)[7]
"Effects of chronic occupational exposure to mercury, such as that commonly experienced by affected hatters, include mental confusion, emotional disturbances, and muscular weakness.[14] Severe neurological damage and kidney damage can also occur.[4] Neurological effects include Korsakoff's dementia and erethism (the set of neurological symptoms characteristically associated with mercury poisoning). Signs and symptoms can include red fingers, red toes, red cheeks, sweating, loss of hearing, bleeding from the ears and mouth, loss of appendages such as teeth, hair, and nails, lack of coordination, poor memory, shyness, insomnia, nervousness, tremors, and dizziness.[4] A survey of exposed U.S. hatters revealed predominantly neurological symptomatology, including intention tremor.[7] After chronic exposure to the mercury vapours, hatters tended to develop characteristic psychological traits, such as pathological shyness and marked irritability.[2] Such manifestations among hatters prompted several popular names for erethism, including "mad hatter disease",[14] "mad hatter syndrome",[15][16] "hatter's shakes" and "Danbury shakes"."
erethism (neurological problems from inorganic mercury vapors)
It is commonly characterized by behavioral changes such as irritability, low self-confidence, depression, apathy, shyness[3][4] and timidity, and in some extreme cases with prolonged exposure to mercury vapors, delirium, personality changes and memory loss occur as a result. People with erethism find it difficult to interact socially with others, with behaviors similar to those of a social phobia. Although most of the effects of erethism are neurological, some physical problems arise as well, including a decrease in physical strength, "headaches, general pain, and tremors after exposure to metallic mercury"[5] as well as irregular heartbeat. It has been documented that "the tremor in the hands can be so severe that the victim is unable to hold a glass of water without spilling its contents."
Acute mercury exposure has given rise to psychotic reactions such as delirium, hallucinations, and suicidal tendency (WHO, 1976).
================
Tuesday Lessons at the Salpêtrière, Professor Charcot: policlinic 1887-1888 (Leçons du mardi à la Salpêtrière)
===============
Tuesday Policlinic, May 22, 1888.
NINETEENTH LESSON
SUBJECT:
1° Three patients with mercurial tremor: nos. 1 and 2, gilders on metals; no. 3, a hatter.
2° Unilateral paralysis agitans (4th patient).
M. CHARCOT: Here are three patients who form a homogeneous group and who allow us to study a form of tremor whose name is well known, but which has perhaps not yet been studied in the detail it deserves. I will tell you shortly what it is about; let us first examine one of these patients, whom I shall call no. 1 (Louis Am., aged 80).
He is affected by a tremor presenting the typical features of the condition I wish to discuss with you, in a very pronounced, very regular form. As tremors go, however, it is not the strongest that can be observed; it is a well-defined case, but of moderate intensity.
Here is what I would like you to consider first of all. With this patient sitting quietly, his hands resting on his knees, the hands, as you see, are agitated by rhythmic oscillations that appear to us rapid and numerous in a given time. There are, however, tremors more rapid than this one, as I shall show you shortly. You know, moreover, that the rapidity of the rhythmic oscillations constituting a tremor must not be judged from the information supplied by the eye alone.
To shed light on this point, it is not too much to call upon the procedures of the graphic method. We are well aware that the data furnished by these measurements are of considerable importance in the clinic.
One can say, indeed, at least in a very general way, that each nosographically determined species of tremor tends to be distinguished from the other species by the number of oscillations recorded on the registering instruments in a given time, during one second, for example. From this point of view, three groups of tremors are to be recognized, namely: 1° tremor with slow oscillations; 2° tremor with oscillations of moderate rapidity; and finally 3° tremor with rapid oscillations, otherwise called vibratory tremor. This is, of course, not an entirely natural classification, but it certainly offers, as it will be easy to demonstrate to you, a real practical interest, which is already something.
TREMOR, OR RHYTHMIC OSCILLATIONS

A. INTENTIONAL: multiple sclerosis; Friedreich's disease.
B. 1° Slow oscillations, 4 to 5 per second: paralysis agitans; senile tremor.
2° Intermediate type (3 1/2 to 6): hysterical tremor.
3° Rapid oscillations (vibratory tremor), 8 or 9 per second: (a) no individual tremor of the fingers: Basedow's disease; (b) individual tremor of the fingers: alcoholic; (c) general paralysis.
Mercurial tremor (a separate group): 1° during rest, above all under emotion, 5 to 6 per second; 3° intentional: considerable exaggeration of the oscillations.
If you will glance at the table I have had placed before you, you will have an overview of this classification of tremors according to the rapidity of the rhythm. I shall leave aside group A, to which I shall return shortly, and in which the rapidity of the oscillations is not and cannot be expressed in figures. Here now is a summary indication of the other groups.
I call slow tremor that in which only 4 or 5 oscillations per second are produced on average. In this group we find paralysis agitans, and senile tremor, which must be carefully distinguished from the former despite contrary opinions recently expressed.
Another group forms, as it were, the intermediate between the preceding one and the one that follows. Here belongs the tremor of hysterics which, from the point of view of rhythm, presents (this is perhaps one of its characteristics) very great variations; the rate of oscillations per second varies between 3 1/2 and 6.
To the last group belong the tremors with rapid oscillations, or vibratory tremors; here we find gathered together the alcoholic tremor, that of progressive general paralysis, the tremor of Basedow's disease, etc.; the number of oscillations per second is 8 to 9. You understand that, by taking account of this feature furnished by the graphic study of the tremor, one can already lay down the first lines of a differential diagnosis.
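Charcot's rate-based grouping can be summarized as a small lookup function. The cutoffs below are a simplification taken from his table (in the original, the hysterical range overlaps the slow one, and he stresses that qualitative signs, not rate alone, separate mercurial tremor from paralysis agitans); the function name and boundaries are illustrative assumptions.

```python
# A minimal sketch of Charcot's grouping of tremors by oscillations
# per second, following his table. The cutoffs are a simplification:
# the hysterical (intermediate) range overlaps the slow one in the
# original, and qualitative signs are needed to separate the groups.

def charcot_group(osc_per_sec):
    """Map an oscillation rate to Charcot's rate-based tremor groups."""
    if osc_per_sec <= 5:
        return "slow (4-5/s): paralysis agitans, senile tremor, mercurial"
    if osc_per_sec < 8:
        return "intermediate (3 1/2-6/s): hysterical tremor"
    return "vibratory (8-9/s): alcoholic, general paralysis, Basedow's disease"

print(charcot_group(4.5))
print(charcot_group(9))
```

As the lecture goes on to show, rate alone cannot give the diagnosis: mercurial tremor and paralysis agitans share the slow band and are told apart by their behavior at rest and on intention.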
I propose to establish a separate group for the mercurial tremor, which is precisely the one we are going to be able to study in the men before you. I shall tell you, and I am going to show you, that this sort of tremor approaches, in the number of its oscillations, that of paralysis agitans; that is to say, there are 4 to 6 jerks per second. It is therefore a slow tremor, but here particular features appear which allow one to distinguish at first glance the mercurial tremor from that of paralysis agitans, for example. This feature I am going to bring out in our patient no. 1 who, as you have already noticed, trembles in the hands even when they are in the attitude of rest. Our man practices the trade of gilder on metals, and it is to his trade that he owes being under the influence of mercurial intoxication. The hospital of La Charité was formerly the meeting place of lead-poisoning cases. As for mercurialism, it is above all at the Hôpital St. Antoine that you will meet its victims, or again at the Hôpital Tenon, which is simply because the establishments in which these intoxications are contracted lie in the districts where these two hospitals are situated; and our patient was in fact sent to us, to serve for study and demonstration, by one of our colleagues of the Hôpital St. Antoine.
You know what role mercury plays in gilding on metals, and you are aware that in the workshops where this kind of work is carried on, the atmosphere, if ventilation is imperfect, is always more or less laden with mercurial vapors. For some years, however, the fitting-out of these workshops has, it seems, been singularly improved from the point of view of hygiene; ventilation is regulated under good conditions and, in addition, particular care is taken in the various operations, which prevents the action of the mercurial vapors from being as harmful as formerly. But there still exist workshops where these improvements have not been introduced, or where at least the prescribed hygienic precautions are little or poorly observed. Our patient, in the conversations we have had with him these last few days, has told us that for long years, for more than 29 years, he worked in certain workshops where great account is taken of the hygienic regulations, in particular at Barbedienne's, without ever having been affected by the slightest tremor; whereas it was enough for him to work a few months in other, badly ventilated workshops for the tremor we observe today to appear.
(To the patient): Kindly give us some details of your career as a gilder on metals.
The patient: I worked for a long time at Barbedienne's. Had I stayed there 20 years, I am sure I should never have trembled.
M. CHARCOT: So, in Barbedienne's workshop the workmen are rarely affected by mercurial tremor, and yet the gilding there is done perfectly; whereas there are other workshops where one is rapidly intoxicated. Moreover, this other patient (no. 2) (Gabriel Men..., 42 years old) is a fresh example of the same kind; he too is a gilder on metals. He has practiced this trade since the age of 11, and he is today 42. He remained 30 years without knowing the tremor. He has been struck by it since, twice in quick succession, in the year since he changed workshops. The second patient: Yes, I was taken twice this year.
M. CHARCOT: But let me return to the tremor. I want to point out to you the feature that allows it to be distinguished at first glance from the other species of slow tremor. I shall remark, in the first place, that the mercurial tremor sometimes, indeed fairly habitually, occupies the head. That is not the case in this man, but I shall show you presently a case in which the head trembles, and I shall point out to you the way in which it trembles. In the second place, it sometimes, indeed fairly often, occupies the lips and the tongue. That, as you see, is very marked in our patient no. 1 as soon as he speaks. You recognize, while he speaks, that the articulation of the words is as it were interrupted by a sort of trepidation. This trepidation, due to the trembling of the lips, is most interesting for the clinician, because it closely recalls, to the point where one might be misled, the impediment of speech in certain cases of general paralysis. I shall add that when the patient puts out his tongue, it is soon seized by a trepidation resembling that so often observed in these same general paralytics.
Here, then, is a first remark on which we must insist. When you see a patient whose tongue trembles when he puts it out, whose hands tremble when he stretches them out, and who stumbles over his words when he speaks, do not conclude at once, without further information, that it is a case of general paralysis. It might be an alcoholic; it might also be a case of mercurial tremor. With our man, I wish to confess it publicly, when he first presented himself before us our impression had been, chiefly on account of the impediment of speech, that this was diffuse meningo-encephalitis. It is true that almost immediately the question changed its aspect when we learned that he practiced the trade of gilder on metals.
This is the place to recall, moreover, that the mercurial tremor, according to the repeated observations we have made these last few days on our three patients, counts only 4 to 6 oscillations per second, the hands resting on the knees, whereas in general paralysis one counts 8 or 9. Perhaps an element of diagnosis might be found there in a difficult case.
You are doubtless aware that there is yet another affection in which the impediment of speech, and also the trembling of the extremities, recall up to a certain point what is seen in general paralysis and in mercurial tremor. I mean multiple sclerosis (sclérose en plaques).
To speak only of the symptomatic analogies that may exist between the mercurial tremor and multiple sclerosis, I shall remind you that the slow, scanning speech observed in the latter affection is really not very difficult to distinguish from the trepidant articulation of the mercury cases; but on the other hand, as you will see, with regard to the trembling of the extremities the analogy is such that the diagnosis may, in certain cases, prove bristling with serious difficulties. In fact, between the mercurial tremor and that of multiple sclerosis there is a common feature which brings them closely together while at the same time setting them both apart from all the other forms of tremor. This is naturally the place to recall the characteristics of the tremor in multiple sclerosis. This tremor is said to be intentional (see the table of tremors, A) (the Intentionszittern of the German authors)(1), and by that the following is meant:
When the patient's hands are in the attitude of rest, placed quietly on the knees, they do not tremble. But if he wishes to perform any act, to take a glass, for example, any object whatever, then begins a period during which you see the hand become agitated, and the oscillations become more and more rapid and all the more extensive as the hand comes closer to the goal to be reached. If, for example, he has to seize a spoon to carry it to his mouth, it is quite possible that, owing to the growing extent of the oscillations, he cannot do so. Likewise with a glass full of water: the water will be thrown in all directions before reaching the lips. This series of events is easy to grasp in the following diagram. A B corresponds to the period during which the hand rests quietly on the knees; there is no trace of tremor during this period. At B the patient begins an intentional, voluntary act, such as carrying a glass to his mouth. The tremor appears at once, and the oscillations composing it increase progressively in amplitude, as may be seen, as the hand approaches the goal C to be reached. (Fig. 43)
By contrast, let us recall what happens in paralysis agitans. Here again the hand rests on the patient's knees during the period A B (Fig. 44). But, contrary to what takes place in multiple sclerosis, even during this so-called rest period the hand is agitated by oscillations; it trembles. On the other hand, when during the period B C it performs a voluntary movement, the rhythmic oscillations do not increase in amplitude, and sometimes, as I shall show you an example of presently, the tremor may even cease completely during the execution of the willed movement (Diagram No. 3). I am now in a position to show you plainly in what respect the mercurial tremor resembles that of multiple sclerosis and in what respect it differs from it (Diagram No. 4). Suppose the period A B, during which the hands, if you will, rest on the patient's knees. During this period, in the mercury patient, as you observe once again in our patient No. 1, the hand trembles. In this respect, then, there would be the greatest analogy with what occurs in paralysis agitans (Diagram No. 2), where the patient trembles "without rest or respite." But here a difference must be pointed out: in mercurial tremor, at moments during this rest period, the tremor momentarily ceases to exist. It is true that it reappears as soon as the patient's attention is aroused, as soon as he is spoken to. You saw what happened only a moment ago in our subject. Left to himself for an instant, thinking of nothing, his hands were for a moment mobile and free of tremor; the tremor showed itself anew as soon as the patient's attention was fixed by a question. Any emotion whatever can, during the rest period (A B), bring back the momentarily effaced tremor. Accordingly, one may say that in the period A B, the so-called rest period, the tremor is absent in multiple sclerosis; that in this same period it is permanent in paralysis agitans (ceasing only during sleep); and that, finally, in mercurial tremor it is remittent, ceasing at moments only to reappear at others, for example under the influence of the slightest emotion.

Fig. 43. Diagram No. 1.
Fig. 44. Diagram No. 2.
Fig. 45. Diagram No. 3.
Fig. 46. Diagram No. 4.

(1) "Intentional," taken in this sense, does not belong to correct French; it is, if you like, French back from Germany; but the application of the term is convenient enough and seems likely to pass into practice.
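The differential just summarized, drawn from the rest period (A B) and the voluntary-movement period (B C), amounts to a small decision procedure. The sketch below encodes it; the function name and the string labels are illustrative assumptions, not anything from the lecture itself.

```python
def classify_tremor(at_rest: str, on_voluntary_act: str) -> str:
    """Classify a tremor from the two observations Charcot uses.

    at_rest: behavior during the rest period A B
        ("absent", "permanent", or "remittent")
    on_voluntary_act: behavior during the intentional period B C
        ("amplitude_increases" or "unchanged_or_ceases")
    """
    if at_rest == "absent" and on_voluntary_act == "amplitude_increases":
        return "multiple sclerosis (intention tremor)"
    if at_rest == "permanent" and on_voluntary_act == "unchanged_or_ceases":
        return "paralysis agitans (Parkinson's disease)"
    if at_rest == "remittent" and on_voluntary_act == "amplitude_increases":
        return "mercurial tremor"
    return "indeterminate on these two signs alone"
```

On this scheme, patient No. 1 (hands trembling remittently at rest, oscillations widening as the goal is approached) falls under the mercurial heading.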
Here now is the character which, as we foreshadowed, brings mercurial tremor close to that of multiple sclerosis while at the same time removing it from paralysis agitans (see Diagram No. 4). We direct our patient No. 1 to seize a glass full of water placed on a neighboring table and to carry it to his mouth.

You notice that from the moment the prescribed voluntary act begins to be carried out, the tremor is considerably exaggerated, the oscillations becoming progressively wider and wider as the goal is approached. This is exactly what happens, you will recognize, in multiple sclerosis. A week ago, in this act, the water was constantly thrown far and wide before the glass had reached our man's lips; today already, under the influence of treatment, his condition has improved and the patient can drink.

You see by this example to what degree, during the performance of voluntary acts, the mercurial tremor and that of multiple sclerosis are, so to speak, identical; but you have not forgotten how capital the difference is in the rest period (A B), since in the latter affection the tremor is wanting during this period, whereas in the former it is present at least at moments.
It is, moreover, a long time since I first drew this parallel between the tremor of multiple sclerosis, which I made known, and the mercurial tremor. I will point out in passing, since the occasion presents itself, that in general the mercurial tremor is modified and attenuated with great rapidity when the subject is placed under favorable conditions. Our gilder of metals, in the 18 days he has been in hospital, has been withdrawn from the influence of mercurial vapors and subjected to a treatment which is nothing other than disguised expectation. Each day he takes 3 or 4 tablespoonfuls of a solution of sodium chloride. This solution is intended solely to mask the expectant treatment, which patients do not like. "Populus vult decipi." Under the influence of this method alone, all the symptoms have very rapidly abated. Today, I repeat, he can eat by himself, not without difficulty, whereas only a week ago he was incapable of carrying his food to his mouth. There is a little lesson in negative therapeutics from which you must know how to profit on occasion.
Yet another analogy to bring out between the mercurial tremor and that of multiple sclerosis is this one, and I could have had you observe it in this patient before his condition improved: during walking, that is to say during the performance of voluntary acts involving the body as a whole, the head and the legs tremble.

In consequence of the intervention of rhythmic oscillations, and particularly of the tremors of which the lower limbs become the seat during progression, the gait is staggering. This is exactly what one sees in typical multiple sclerosis, and I much regret not having a patient of that kind at hand to refresh your memory of the fact. All this still existed a few days ago in our mercury patient, but it is no longer to be seen today, so rapid has the improvement been. (To the patient, seated): Stretch out your legs; hold your feet raised above the floor.

(To the audience): You see that the tremors still persist in the lower limbs during the voluntary act I have prescribed. The feet, lifted from the floor, are manifestly agitated by very marked oscillations.
Let us now take up what the study of our patient has so far allowed us to note: tremor of the lips making the articulation of words difficult; tremor of the lower limbs in the state of rest when the patient is moved; tremor greatly exaggerated when it is a question of taking an object and displacing it. This last trait recalls what is seen in typical multiple sclerosis. Finally: tremor of the head, which furnishes yet another point of comparison between the two affections. On the other hand, this constitutes to a certain extent a distinctive character as against paralysis agitans, in which ordinarily the head does not tremble "of itself." I am well aware that it will be objected that my assertions in this regard were formerly too absolute, since very authentic examples of Parkinson's disease have been published in which the head "trembles of itself." But in reality I cannot help declaring that this is truly rare, whereas the thing is habitual, classical in other words, in multiple sclerosis as well as in mercurial tremor.

I shall bring out, by one more trait, the characteristic of the mercurial tremor. Writing is for these patients the most difficult of things; it is almost impossible in public, at the least emotion. Thus the patient we are considering (No. 1), who has improved so notably these last days, who can today carry his food and drink to his mouth, well, this very patient finds it absolutely impossible to write his name before you. When he is quite tranquil in the ward, away from onlookers, he does it, though very imperfectly; but before you, pencil in hand, he can only scrawl an illegible scribble. Here, this is all that can be obtained from him of the kind at this moment.
One last point remains for us to consider. I mean the modifications of dynamometric strength in subjects affected with mercurial tremor. Quite recently one of our distinguished hospital colleagues, M. Letulle, published a very interesting study, I will not say on mercurial tremor, which is touched on there only incidentally, but on what he calls, I believe, the mercurial neuroses. This work is at once experimental and clinical; it appears in the Archives de physiologie (April 1887). The author concerned himself above all with mercurial paralysis, and he is led to affirm that in mercury poisoning the dynamometric weakening is almost always very pronounced, and that paralysis, or at the least paresis, would always precede the tremor. That weakening exists, in patients affected with mercurial tremor, almost always to a certain degree, is incontestable, I think; but from what I have seen in the 3 patients of today, I believe I may affirm in my turn that the akinesia is by no means always as pronounced as M. Letulle says. It is perhaps not without interest to recall here the principal results of M. Letulle's observations and experiments. The experiments were made on guinea pigs in Vulpian's laboratory.

CHARCOT. Leçons du Mardi, t. I, 2e édit.
These little animals were slowly intoxicated by the ingestion of mercurial peptones. It is often very difficult, in a laboratory, to produce in the animal slow diseases comparable to the chronic diseases we meet clinically in man. One succeeds, however, with much care and patience. What patience and care did my collaborator Gombault and I not have to bring to bear in order to produce, in these same guinea pigs, all the accidents of chronic lead poisoning!

M. Letulle succeeded perfectly in inducing mercury poisoning in them, and he is to be praised for it. However that may be, what deserves above all to be noted in his experiments is the existence, in these animals thus intoxicated, of a lesion of the peripheral nerves consisting in destruction of the myelin sheath, without proliferation of the nuclei of the sheath of Schwann and without destruction of the axis cylinder. It is these two last traits that would distinguish the mercurial nerve lesion from all the other forms of peripheral neuritis.

Fig. 47. Facsimile of the signature of No. 1 (AMAND).

For M. Letulle, the persistence of the axis cylinder when the myelin has disappeared would serve to explain the "intentional" character of the mercurial tremor, just as it serves to explain, in the theory I proposed at the time, and to which I hold no more than is reasonable, the existence of this same character in multiple sclerosis, where, as you know, the axis cylinder often persists within the area of the sclerotic plaques. It would thus be to this denudation of the preserved axis cylinder, still able to transmit the orders of the will after a fashion, that these oscillations which disturb the performance of voluntary movements would be due. But no matter the theory for the moment. Let us retain the fact that this nerve lesion, cause of the intention tremor, would also be the cause of the paresis which, according to the author, necessarily precedes the tremor.
Well, to return to this paresis, all I wish to say is that in the 3 subjects we have before our eyes, who incontestably represent fine examples of mercurial tremor, it has shown itself constantly much less pronounced than M. Letulle indicates.

Thus M. Letulle gives figures which, in the patients he observed, vary from 10 to 44; whereas in our 3 patients the figures have been the following: No. 1, 40 on the left, 80 on the right; No. 2, 70 on the right and on the left; No. 3, 88 on the right, 88 on the left. I shall now complete the observation of the patient by adding a few words on his past history.
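The contrast between Letulle's figures and those of the three patients can be checked quickly. The snippet below (variable names are illustrative) counts the hands whose dynamometer reading exceeds Letulle's maximum of 44:

```python
letulle_max = 44  # upper end of Letulle's observed range (10 to 44)

# (left, right) dynamometer readings for the three patients
patients = {"No. 1": (40, 80), "No. 2": (70, 70), "No. 3": (88, 88)}

hands_above = sum(
    reading > letulle_max
    for pair in patients.values()
    for reading in pair
)
print(hands_above)  # 5 of the 6 hands exceed Letulle's maximum
```

Only patient No. 1's left hand (40) falls within Letulle's range, which is the substance of the objection that the akinesia is far less pronounced than Letulle reports.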
He has no known hereditary antecedents; he has practiced the trade of gilder of metals for 30 years. It was in the month of July 1887 that he worked for the first time in a workshop other than the one where he was formerly employed; the ventilation there is bad, and it was in the month of September, that is, only two months after entering that establishment, that his tremor took hold of him. He noticed first that he could no longer write, but he kept working nonetheless until the end of April, and it was only after 8 months that he came here.

What is now going to happen to him? We shall give him potassium iodide and subject him to hydrotherapeutic treatment; but even before any treatment we have seen his condition improve under the regime of veiled expectation to which he is subjected. Will he ever recover completely? In all the patients I have seen, a certain degree of tremor, however slight, has always persisted after what is called the cure. It seems that one never rids oneself completely of this tremor once it has fairly shown itself. This is a fact you will be able to observe in the two other patients whom I am now going to present to you.
SECOND PATIENT (Gabriel Men..., aged 42).

M. CHARCOT: He too is a gilder of metals; he is 42 years old; he began the trade at the age of eleven in Paris; he has never practiced any other profession; no hereditary antecedents are known in him; he is not truly alcoholic. (To the patient): Do you drink? The patient: Not too much.

M. CHARCOT: You see, he is sincere.... For a period of 31 years, that is, until 1883, he never felt anything, and he attributes the tremor from which he now suffers to having worked of late in badly ventilated, badly fitted-out workshops. Since 1883 he has had 3 or 4 attacks of tremor. The patient: Three only.

M. CHARCOT: How long did each of your attacks last? The patient: About six weeks.

M. CHARCOT: And at the end of those 6 weeks, did you find yourself cured? The patient: No, sir, incompletely; at least, I have always trembled a little in the intervals. When one is better, if one returns to the workshop where one fell ill, one relapses within two months. Besides, once you have been taken, a slight tremor always remains with you.

M. CHARCOT: The patient also has, you see, a trepidation of the tongue when it is put out of the mouth, as in the preceding case; but in the one now before us there is no trepidation of the lips during the articulation of speech.

(To the patient): Let your hands rest on your knees; leave them quite relaxed. You see that his hands tremble a little when he ought to be completely at rest. Stretch out your legs; lift your feet above the floor for a moment; you see that in this voluntary act his feet tremble manifestly, as in our first patient. Come and take this spoon which is there before you. Can you eat by yourself?

The patient: Yes, sir, but with difficulty.

M. CHARCOT: Indeed, you observe that in the act of carrying a spoon to his mouth he trembles about as much as our No. 1 would.

(To an intern): Put a little water in this glass, if you please; fill it to the brim.... It is well, in such a case, when making this test which is to contribute to the diagnosis, to fill the glass. The feeling the patient has, at the sight of the full glass, of the near impossibility of carrying it to his mouth without spilling everything, moves him in advance and renders the tremor more intense. The tremor is always less pronounced, other things being equal, when the glass is half full or quite empty; this I have observed many times. Things are more accentuated still when the full glass is placed on a tray carried on one hand, for then taking the glass without upsetting it is more difficult still than when it has first been set on a table. These are little artifices one must know, because in a difficult case they allow one to bring the symptom of intention tremor into full relief; when it is clearly pronounced, this can sometimes decide the diagnosis.
(The patient carries the glass full of water to his mouth.)

M. CHARCOT: You observe the oscillations that occur and that make the act difficult to accomplish. At last the glass is near the mouth; this is the solemn moment, if I may so speak... the oscillations increase in amplitude, part of the water is thrown out of the glass, and you hear the jerky clatter the glass makes as it strikes the teeth at each oscillation; it is absolutely the same picture you would have before your eyes, in the same circumstances, were it a case of multiple sclerosis.

(To the patient): Will you take this pencil and try to write your name?

You are going to see that between the tremor with which this man is affected and that of paralysis agitans there is a truly remarkable contrast.

(The patient writes his name with difficulty.)

M. CHARCOT: It is almost legible; but, as you have seen, it was not without trouble that he arrived at this result. I will remind you that the dynamometric strength is, for the 2 hands, about 70 on both sides. That is almost the normal state.

There is in this patient no tremor of the head, nor staggering during walking. It is curious to observe in our first two subjects the rapidity with which the tremor has abated under the sole influence of expectation. These patients have been here barely twenty days, and already their tremor has enormously diminished. Unfortunately, circumstances beyond my control forced me to interrupt my lessons for a while, and I regret not having been able to show you these subjects when they were quite at their height. But it is fortunate still that, despite the attenuation it has already undergone, the tremor persists in a sufficiently characteristic form.

Fig. 48. Facsimile of the signature of No. 2 (MENDLER).
THIRD PATIENT.

We now pass to the examination of our 3rd patient. He is 82 years old; his name is Schumacher. I may name him in full, it is all the same to him; he will not, I am sure, bring an action against me for it. If I name him, it is because he occupies a distinguished rank in the clinical history of mercurial tremor in particular and of mercurial diseases in general. There is not a thesis, not a work published in Paris on this subject of mercurial tremor in the last 7 or 8 years, in which his biography does not figure. It is worth grasping that, under the heading Schum... in these various works, it is always he who is in question. Now this name of Schum... appears not to have always been referred to one and the same personage, although in reality it is always a question of one and the same subject, whose identity, moreover, is easy to recognize: he has six toes on each foot, and this distinctive trait is noted in some of the observations collected on the same patient by various persons.

He was kind enough to recount to us in detail his whole pathological history. It is a veritable Iliad, an "Ilias malorum," as Torti says. He informed us that, during the last 8 years that he has been under the sway of mercurial tremor, he has frequented successively the hôpital Tenon, the hôpital Saint-Antoine, and the hôpital Lariboisière, and in each of these hospitals he was the object of attentive study; the fact is that for 8 years he has constantly presented the characters of mercurial intoxication in a very accentuated, typical form, and consequently one perfectly suited to clinical study.

Today still, the subject may be offered as representing the typical form of intense mercurial tremor.

You will remark first the tremor of the head, very accentuated, above all when the patient stands or walks. This tremor, which consists chiefly in antero-posterior oscillations, existed from his first attacks onward. He tells us that, being at the hôpital Lariboisière in the same ward as another mercury patient like himself, his own head oscillated from front to back while that of his comrade oscillated laterally, from left to right and from right to left, so that the one said "yes" while the other said "no," which made a very strange spectacle and never failed to excite the hilarity of the ward staff. This tremor of the head, I insist on saying it again, recalls absolutely what is seen in multiple sclerosis; I really see no appreciable difference between the two cases. You know, however, that in one of these affections, multiple sclerosis, there exist accentuated organic lesions, whereas in mercurial intoxication such lesions, at least as concerns the brain and the spinal cord, are absolutely wanting.

The only appreciable material lesion that has been observed in mercurialism is the alteration of the peripheral nerves described by M. Letulle, and even that, if I am not mistaken, has so far been met with only in guinea pigs.

Autopsies in man are lacking, I believe; this is doubtless because, "quoad vitam," mercurial tremor is a benign affection, in the sense that it leads only very indirectly to a fatal termination. (To the patient): Put out your tongue.

His tongue trembles a little, but perhaps less than in the one we called No. 1. He also has less impediment of speech than the latter had. The hands tremble manifestly in the state of rest, but much less strongly than 18 days ago. Certainly, had I waited another 18 days, I could today have placed before your eyes only an effaced, rudimentary case. (To the patient): Can you eat by yourself?
The patient: Yes, sir.

M. CHARCOT: That must be no small affair. Still, try to carry this spoon to your mouth. (The patient endeavors to do so.) You see that he succeeds very poorly; as the tip of the spoon approaches the mouth, you hear the noise it makes striking the teeth in cadence. (To the patient): Begin again. You hear once more the noise he makes.

His meals must be singularly interrupted; yet it is quite possible that, in solitude, voluntary acts are much easier to accomplish than when they must be performed in public. At the moment the goal is about to be reached, the oscillations are seen to grow more and more accentuated in the head and in the hand. In short, as is seen in multiple sclerosis, the goal is most often missed. It is a veritable torment of Tantalus.

The tremor, considered in general in mercurialization, is habitually quite symmetrical; both hands are affected to the same degree. I do not believe that the mercurial tremor can fail to be symmetrical. Still, I would not dare affirm that it may never occasionally be so, but that must be very rare. If I insist on this point, it is because in Parkinson's disease, on the contrary, it is not rare to see the tremor remain unilateral for a very long time; I shall show you an example of the kind presently.

We shall now pass to the exercise of writing. You see that our patient acquits himself even worse, if that is possible, than the first two.

Fix well in your memory the spectacle you have before your eyes, for if the mercurial tremor has often been described, it has not always been described with the precision of detail that must be brought to it today. And in this connection, if you happen to read, in the third volume of my lessons (t. III, p. 213), some remarks I give there on tremor considered in general, you will see that the mercurial tremor is not put in its proper place. There, indeed, for what reason I do not know, I placed the mercurial tremor among the rapid, vibratory tremors, whereas in reality, as you know from the study of the 3 patients present, it is a question here of a tremor with slow oscillations (fewer than 8), at least during the rest period. This, I repeat, is an error which slipped, I know not how, into my exposition (for I make it an absolute rule always to describe from nature), an error which it was fitting to rectify.

Fig. 49. Facsimile of the signature of No. 3 (SCHUMACHER).
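The correction just made sorts tremors by oscillation rate, with 8 as the dividing line. A one-line sketch of that cutoff follows; it assumes, as the classification implies, that the count is per second, and the category names are illustrative:

```python
def tremor_class_by_rate(oscillations_per_second: float) -> str:
    # Mercurial tremor at rest is slow (fewer than 8 oscillations),
    # not among the rapid, vibratory tremors where it had earlier
    # been misplaced (Leçons, t. III, p. 213).
    return "slow" if oscillations_per_second < 8 else "rapid (vibratory)"
```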
When our patient entered the Salpêtrière, about ten days ago, it was almost impossible for him to make progress in walking; not only was he obliged to lean heavily on a stick, but his gait, besides, was most singular, because of the trepidation with which his lower limbs were affected under the influence of the voluntary act of walking. On the one hand he oscillated, staggered, was threatened with falling at every instant; and further, by reason of the contradictory movements of which his limbs were the seat, hardly had he taken a step forward than immediately afterward he took a step back. At moments, too, his legs gave way beneath him. It goes without saying that in these attempts at progression the head joined in and oscillated all the more. Today he can already walk without a cane; yet you see that he oscillates and staggers quite manifestly. But only 8 days ago it was quite another matter. You know that there are patients of this kind who can no longer walk at all because of the intensity of the tremor of the lower limbs and who are necessarily confined to bed. That was nearly the case with our 3rd patient. One may say of him that he presents in an exaggerated state everything the other two have presented to us in a relatively rudimentary state.

His history is rather particular: he was born at Forbach and is 82 years old. He first worked in the mines; he is now a hatter and works in the carroting of skins. This is an operation in which, as you know, mercury nitrate is employed. The mercurial vapors given off in the various operations of carroting are the reason hatters are exposed to contracting mercurial tremor. There are many other trades besides in which this can happen. I shall confine myself to mentioning mirror-makers, thermometer-makers, the cinnabar miners (Almadén, in Spain), etc., etc.; and it should be added that, apart from the trades named, this tremor can arise accidentally, as in the classic case of the ship the "Triumph," or again following a mercurial treatment too prolonged and badly conducted. You will know, I think, after what precedes, how to recognize the mercurial tremor for what it is, in all the degrees and under all the forms in which it may present itself. I will only add to the picture a few traits to complete it. Our patients do not have those disturbances of sensibility, very slight in any case, which are sometimes found associated with the tremor; no sensory disturbances. All have frightful teeth, black and loose; several, the 3rd above all, have had salivation. They are not particularly cachectic; in short, no very important modifications of the general state.

(To patient No. 3): Do you sleep? Have you ever been prevented from sleeping by your tremor?

The patient: Yes, for 3 or 4 days, at the beginning of each of my attacks.

M. CHARCOT: When they fall asleep, the tremor ceases. I think it of interest to complete with a few details the history of the patient before your eyes (Schum...). As I have said, he began as a miner; he then came to Paris, in 1882, and there worked as a laborer in a wholesale coal house. In 1869 he entered a felt factory as a handyman; he did not work in the workshops and consequently was not exposed, during this period, to mercurial accidents. Finally, he took part in the operations of the carroting of skins in 1880. It appears that the workshop he entered was not a first-class one, for immediately, in 1880, he suffered a first attack of tremor, already at that time with impediment of speech, tremors of the extremities, etc., etc. These same phenomena have recurred at each new attack (he counts 8 today) without ever ceasing completely in the intervals. In several of these attacks, the tremor of the lower limbs on voluntary movement was pronounced enough to render walking, for some days, absolutely impossible.
I come now to a most interesting episode in the pathological history of Schum... This episode is recounted in M. Letulle's work and more particularly in the thesis of M. Maréchal (Des troubles nerveux dans l'intoxication mercurielle lente, thèse de Paris, 1888). Allusion is also made to it in most of the theses on mercurial intoxication that appeared in Paris around the same time (1).

This is what it concerns: it was, I believe, shortly after the onset of the mercurial tremor. He happened one day to fall to the ground deprived of consciousness; he got up hemiplegic on the left side. This hemiplegia was accompanied, as is noted in particular in the observation collected by M. Maréchal, by a sensory and sensorial hemianesthesia of that same left side. There was, on that side, narrowing of the visual field. Was it then a case of hysterical hemiplegia, or of hemiplegia from a lesion of the posterior part of the internal capsule? That is the question.

Well, one particularity of this hemiplegia which seems at first to plead in favor of the existence of a capsular lesion is this: the tongue, at the moment of the hemiplegia, was deviated toward the left, that is, toward the side of the paralysis, as happens in cases of organic lesion. Only, mark this well, and this time the phenomenon seems to me to reveal hysteria instead: the tongue was so strongly deviated (this follows from the maneuvers to which the patient was subjected, and it is recorded in M. Maréchal's observation) that it could not be put out of the mouth. (To the patient): Was the mouth deviated to one side or the other?

The patient: I do not think so.

M. CHARCOT: Could you speak?

The patient: Hardly, because of my tongue. I stammered.
M. CHARCOT: I return here to things I have said many times. You know that I have yet to see, in a hysterical hemiplegia, a true and legitimate paralysis of the lower facial. I do not wish to deny that this may be seen, for quite recently there was communicated to me a work by an Italian physician, M. Lombroso, who, knowing the opinion I have expressed on this point, assures us he has seen in hysteria facial paralyses in every way comparable to those seen in capsular hemiplegias. But I believe I may affirm that this is, at the very least, extremely rare. (To the patient): Do you remember how your tongue was in those days? The patient: Perfectly; here it is.

(1) On mercurial intoxication and tremor, consult, among recent works, besides the memoir of M. Letulle: Hallopeau, thèse d'agrégation; Maréchal, thèse de Paris, 1885; Schull, Des tremblements mercuriels, thèse de Paris, 1881; Hischman, Intoxication et hystérie, thèse de Paris, 1888.
The patient then reproduces from memory (consistent, moreover, with what one reads in the observations of M. Letulle and M. Maréchal) the position his tongue had at the moment of his hemiplegia. He twists his tongue toward the left, so as to make it form a hook. He acts as though unable to get it out of his mouth, and fetches it with the fingers of one hand to draw it outside.
The patient: That is how my tongue was, and that is what I did when I was told to stick it out of my mouth.
From these indications it must be recognized that what was involved was the glosso-labial spasm of hysterics, and not the deviation of the tongue commonly seen in capsular hemiplegias. The patient has therefore been hysterical, and the symptoms relating to the episode I have just described were therefore hysterical symptoms. I have no doubt of it, Gentlemen. Do you not know that various intoxications can evoke hysteria in man, alcoholism and saturnism (lead poisoning) in particular? Yes, there are alcoholic hysterias and saturnine hysterias; that is classic today. Not that these hysterias differ from the others in anything but the etiological element, for hysteria is one and indivisible. But the occasioning cause evidently always deserves to be recalled; that is why, alongside alcoholic and saturnine hysteria, there is reason to include a mercurial hysteria, of which Schum... has presented us a fine example, with this peculiarity: in him, the hysterical symptoms became intermingled with the phenomena intimately bound up with the intoxication, namely the mercurial tremor.
You will find a few observations of this kind, that is, observations attributable to mercurial hysteria, in the thesis of M. Hischmann (Paris thesis, 1888); among the 3 cases of the group reported by the author figures, beyond any doubt, the observation of Schumacher (borrowed from the thesis of M. Maréchal).
I shall not dwell any longer, in connection with this case, on the glosso-labial spasm of hysterics; it is a subject I have already discussed with you many times, and to which I shall certainly have occasion to return. I wish only to underline this important fact: hysterical symptoms may, in certain cases, come to mingle with those that stem more directly from mercurial intoxication.
The attacks of hysteria recurred in our man on two occasions. Today nothing remains of all that. Hysteria was, moreover, I repeat, only an episode. I would point out to you in particular the present absence of any disturbance of cutaneous or sensorial sensibility.
4th PATIENT.
(Patients 1 and 2 withdraw; a 4th is brought in and placed beside No. 3.)
M. CHARCOT: This is the moment to employ the method of contrasts, and, so that you may have the characteristics of the mercurial tremor better engraved in your minds, I am going to put before your eyes a subject who has just presented himself at the consultation and who offers, it appears, a rather fine example of paralysis agitans, or Parkinson's disease.
This patient's name is Olivier Louis; he is 82 years old. You notice at once that the tremor of the extremities present in him is unilateral, limited exclusively to the right side.
You know that, with recording instruments, the tremor of Parkinson's disease gives about 4 to 8 oscillations per second; the figure is roughly the same for the mercurial tremor. There is therefore no difference in this respect. But whereas during the period of rest (A B on the diagrams) the tremor of paralysis agitans is constant and permanent, except during sleep, that of mercurial intoxication may stop from time to time, only to reappear the moment the patient becomes attentive or emotional.
But between these two kinds of tremor there are many other, more important differences to note. First, you know that while the mercurial tremor fades temporarily during the period of rest, it is, on the contrary, always considerably exaggerated during the performance of intentional acts (B C on the diagram). Let us see whether we find these characteristics in this patient with paralysis agitans.
(To an interne): Will you give him the spoon?
(To the patient): Take this spoon. Bring it to your mouth.
Well! As you see, a very remarkable thing happens here: his tremor fades almost completely during the voluntary act, contrary to what you know occurs in the mercurial tremor; there is thus a most striking contrast. (See the diagrams on the following page.)
It must not be thought that this so pronounced arrest of the tremor during the performance of intentional acts is the rule in Parkinson's disease. But it is often seen when the disease is not very advanced. In ordinary cases, the tremor of the rest period continues without important modification during the period of intentional acts, or else it is exaggerated a little, but never to a very high degree. That is why you see patients with Parkinson's disease continue to use their hands, carrying their food and drink to their mouths until a very advanced stage, whereas, from the outset, provided the case is of any intensity, this becomes impossible for patients with hydrargyric (mercurial) tremor.
(To the patient): Take this pencil and write.
You notice that the tremor diminishes and ceases at the moment when, pencil in hand, he approaches the paper on which he is to write.
You see, he writes almost without trembling. He writes slowly, and this slowness is a characteristic of all the voluntary movements in this affection. But he writes, as you see, very legibly, though he is certainly no clerk; the letters are well formed; only you will notice, especially on looking a little closely or with a magnifying glass, that the downstrokes and upstrokes are slightly trembled.
Fig. 50. Multiple sclerosis (sclérose en plaques).
Fig. 51. Mercurial tremor.
Fig. 52. Paralysis agitans: 1st variety, in which the tremor ceases during voluntary acts.
Fig. 53. Paralysis agitans: 2nd variety, in which the tremor continues unchanged during the voluntary act.
Fig. 54. Paralysis agitans: 3rd variety, in which the tremor increases somewhat in amplitude during the voluntary act.
AB, in all the diagrams, indicates the period of rest; BC indicates the period during which a voluntary act is performed (writing, carrying a glass to the mouth, etc.).
This time the contrast is perhaps even more marked. You see how many delicate nuances allow one to recognize that under this generic name of tremor there are many things fundamentally different from one another.
Let us now have him carry a glass full of water to his mouth.
He performs this act, as you see, almost without trembling; the fine oscillations of the rest period (A B) do not, in any case, increase in amplitude. The water in the glass is brought to the mouth without a single drop being spilled.
Yet another distinctive characteristic. In paralysis agitans there is usually a particular deformity of the hands affected by the tremor, a deformity always roughly the same, and one I have described. It is due to the state of rigidity of certain muscles. There are several types of these deformities (see the Leçons sur les maladies du système nerveux, t. I). In our patient, the deformity recalls that of a hand holding a pen for writing; nothing of the kind is seen in mercurial tremors.
Apart from the tremor, there are in our patient a certain number of facts to note. First, I would point out the contrast he presents with his neighbor, the mercurial patient (Schum...), whom I have had kept near him: his head does not tremble. In paralysis agitans, patients may often tremble in the jaw and the tongue, and have embarrassed, mumbling speech, somewhat like the mercurial patients and certain general paralytics; but as a rule, although there are assuredly very rare exceptions, the head does not tremble, or rather it does not tremble of itself; the tremor seen there is a transmitted tremor. No doubt, if a small plume were placed on this man's head, we would see it slightly agitated at each jolt of the body; but, I repeat, in the immense majority of cases at least, the head does not tremble, as I said just now, "of itself."
Fig. 55. Facsimile of the handwriting of No. 4.
There is another characteristic I do not want to fail to bring out, and one that has certainly struck you: the immobility of the features of the face. Since he has been here, our man has not blinked once, while his neighbor the mercurial patient blinks at every instant. He has not detached his eyes from me for a single moment: fixity of gaze, immobility of the features, an expression of impassivity, of astonishment, of stupor. He has not once turned his head, either to the right or to the left. But these are traits particular to Parkinson's disease which, by reason of their clinical importance, well deserve to be studied in detail; I reserve the right to return to them on an early occasion. (To the patient): Stand up a little; walk.
This immobility, this fixity, this sort of general welding, so remarkable and so characteristic when the patient is seated, persists to a high degree when he stands and when he walks. This facies had not at first struck observers. It does not figure in Parkinson's description. I believe I was the first to point out these characteristics, which are so striking that they truly suffice for the diagnosis to be made without difficulty.
Our patient does not appear to have any hereditary antecedents. He is a coachman; an accident befell him on May 28, 1887. (To the patient): What happened to you? The patient: I fell from a carriage a year ago; my horse bolted and, striking the curb, made me totter and then fall. M. CHARCOT: And you fell on the left side? The patient: Yes, sir. M. CHARCOT: You were not tipsy? The patient: No, sir.
M. CHARCOT: His speech is slow and, as he talks, a trepidation shows itself from time to time in the articulation of his words.
The patient: I felt a keen fright, because my horse had bolted and schoolboys were coming out of school at that very moment...; accidents could have resulted.
M. CHARCOT: Paralysis agitans is often seen to develop under the influence of a strong emotion. I can cite a quite remarkable case, that of an individual who, during the events of the Commune, was seized and put against a wall to be shot. I do not know how it came about that he was not, but when he was told to go away he could barely walk; he was already seized with rigidity of the lower limbs, and a few days later he presented that fixity of gaze and of head of paralytics agitans. In our patient, a certain time elapsed between the development of the first symptoms of the disease and the accident that appears to have been its occasioning cause. It was only 4 to 8 months afterward that he noticed stiffness in one of his arms. That was the first symptom to appear; but perhaps, since he is a coachman and works chiefly with his left hand (he is left-handed), he had not noticed that the right hand was already seized with stiffness and perhaps with tremor. One may ask whether the fright he experienced is indeed the true cause of his disease. It is quite certain that terror brings on the appearance of many nervous diseases, and the diseases thus produced are of the most diverse kinds. A bomb falls in the midst of a group; those who compose it feel a great emotion. Each has his own tendencies: one will become hysterical, another will become a paralytic agitans. There are many examples of diseases contracted in this way. I shall recall that of the Dutch sailor observed by Professor Pel (of Amsterdam) who, at Batavia, having gone down to the shore to wash his linen, saw a shark rush at him. Fortunately for him, the shark contented itself with snapping up the linen; it missed the man. The sailor tried to climb back on board; he felt his legs extremely weak, and a little later he was well and truly affected with paraplegia. He presented all the characteristics of hysterical paraplegia.
The outcome of a railway accident, of a collision involving a great number of travelers, may be, for some (I could cite examples of the kind), paralysis agitans; for others, hysteria; for still others, traumatic neurasthenia. Each is thus affected according to his way of reacting and the character of his personal predispositions.
Not only did the patient before you take fright at the moment the accident occurred, but he fell on the left side; and note this well: his hemiplegia is on the right side, so that it does not confirm what I have seen several times and spoken of in the first volume of my Leçons sur les maladies du système nerveux (1).
You know that in hysterias of traumatic origin, the paralysis comes to settle on the limbs that were the seat of the local shock, the contusion. An individual falls on his shoulder; he is paralyzed in the limb on which the blow fell. I pointed out long ago that something similar often happens in paralysis agitans. Thus, I have reported the history of a man who, like this one, falling from a carriage, badly bruised his left thigh and who, shortly afterward, saw the tremor of paralysis agitans begin in the foot of the same side. I have likewise cited the case of a woman who dislocated her lower jaw in falling, and in whom the tremor began in the jaw, etc. It thus seems that for paralysis agitans too, insofar as it appears to result from a trauma, the site of the local shock determines the site of the first symptoms of tremor.
If I recall these facts, it is because Olivier invokes, as the starting point of his disease, the fall he took on the left side. But I must point out to you that, contrary to what happened in the cases I cited just now, it is not on that side but on the opposite side that the tremor first appeared.
Enough on the action of occasioning causes in the development of paralysis agitans; it is a subject that still presents much obscurity and calls for further study. For the present I shall confine myself to noting in our patient other symptoms that belong to regular Parkinson's disease. He has at night that sensation of heat to which I have drawn attention, and which often obliges him to uncover himself. As his tremor is unilateral, I asked him whether the sensation of heat in question was more pronounced on the side of the tremor than on the other side. He answered me that the sensation was general, as pronounced on one side as on the other.
(1) Vol. III, 4th edition, p. 446. Traumatisme et paralysie agitante. Appendix.
I shall show you, moreover, that this sensation of heat does not depend on the tremor, by presenting to you a patient with Parkinson's disease without tremor who nevertheless experiences the sensation in question in a very marked way.
Enough on paralysis agitans for today; it is a subject on which I shall have some new developments to present to you at a forthcoming session.