Human connectivity through entropic processes refers to the existence of a conscious state. Recent studies, performed with more capable equipment, can detect entropic processes at the brain level. Still, much remains to be discovered before we completely understand these complex cognitive processes.

Why is it important to understand whether the brain is capable of entropic processes?

Entropy is a powerful explanatory tool for cognitive neuroscience. As a quantitative index of a dynamic system's randomness or disorder, it tells us about the system's state at any given point in time. As in Lean, where we want to follow every process step by step, in the context of the brain this lets us translate between mechanistic and qualitative properties. In a learning system, Shannon's entropy measures the average amount of information conveyed by events that occur with probabilities p_i. Entropy can also be seen as the amount of uncertainty of a random variable X: the more uncertain the events of X, the larger the information content, with a maximum for equiprobable events. Since human clustering seems to be the most acceptable balance between local and global organization, the algorithms developed are a reflection of these findings.
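To make the definition concrete, here is a minimal sketch of Shannon's entropy, H(X) = -Σ p_i log2(p_i), computed over a probability distribution. The function name and example distributions are illustrative, not taken from the studies above.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p_i * log2(p_i)).

    Terms with p_i = 0 contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equiprobable events maximize entropy: a fair coin carries 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The fair-coin case illustrates the maximum-for-equiprobable-events property: any deviation from uniform probabilities lowers the entropy.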

What does it mean?

It implies that such systems tend toward states requiring the least energy transfer: systems working with the smallest effort while achieving the largest outcomes. In economic data, it would imply that only certain clustered activities are allowable in order to prevent losses.

http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0089948

http://en.wikipedia.org/wiki/Entropy_%28information_theory%29