Excerpt from chat with Sahu about entropy pre 11-7-13

2015-01-13

++ Excerpt from chat with Sahu about entropy pre 11-7-13

Okay let's say I have two distributions
each distribution has 10 numbers
and I want to see how much entropy is in each distribution
so for the first distribution
11:41 PM let's say the sequence is
3,3,3,5,5,5,8,8,8,10
calculating the entropy of this distribution we get
=1.31
Now for the next distribution
we have
11:42 PM 1,2,3,4,5,6,7,8,9,10
anshuman: yes
me: calculating the entropy we get
anshuman: this is correct
me: -(1/10*log(1/10))*10
=2.30
which is higher than the entropy for the distribution with grouped numbers
I think this is all making sense now
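The two calculations above can be checked with a short Python sketch. It uses the natural log (which matches the 2.30 = ln(10) result in the chat); the `entropy` helper name is mine, not from the conversation:

```python
import math
from collections import Counter

def entropy(values):
    # Shannon entropy in nats, from the empirical frequencies of the values
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

grouped = [3, 3, 3, 5, 5, 5, 8, 8, 8, 10]
uniform = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

print(round(entropy(grouped), 2))  # 1.31
print(round(entropy(uniform), 2))  # 2.3
```

The grouped sequence has three values with probability 3/10 and one with 1/10, giving about 1.31 nats; the all-distinct sequence is the uniform case, giving ln(10) ≈ 2.30 nats, the maximum for 10 outcomes.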


Alright, I was also thinking about what kinds of distributions lead to the best entropy for life. Maximal entropy is not good for life, and complete order is not good for life either. Given our example of 10 numbers, each with a value 0-10:
lowest entropy distribution would be: 1,1,1,1,1,1,1,1,1,1
Highest entropy distribution would be: 0,1,2,3,4,5,6,7,8,9
Some intermediate entropy would be: 3,3,3,5,5,5,8,8,8,10
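Those three cases can be computed with the same kind of sketch (natural log; the `entropy` helper is my naming, not from the note):

```python
import math
from collections import Counter

def entropy(values):
    # Shannon entropy in nats, from the empirical frequencies of the values
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

lowest = entropy([1] * 10)                          # 0 nats: complete order
highest = entropy([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])   # ln(10), about 2.30 nats
middle = entropy([3, 3, 3, 5, 5, 5, 8, 8, 8, 10])   # about 1.31 nats
print(lowest, highest, middle)
```

The all-identical sequence gives 0 nats, the all-distinct sequence gives the maximum ln(10) ≈ 2.30 nats, and the grouped sequence lands in between at ≈ 1.31 nats.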