
General relativity as originally developed by Einstein (1) is based on the union of geometry and gravity (2). Half a century later the union of general relativity and thermodynamics was found to yield surprising results such as Bekenstein–Hawking black hole entropy (3–6), particle emission from a black hole (5–9), and acceleration radiation (10–17). More recently the connection between black hole (BH) physics and optics, e.g., ultraslow light (18), fiber-optical analog of the event horizon (19), and quantum entanglement (20), has led to fascinating physics.

Intrigue and Entropy
The core idea of information theory is that the "informational value" of a communicated message depends on the degree to which the content of the message is surprising. If a highly likely event occurs, the message carries very little information. On the other hand, if a highly unlikely event occurs, the message is much more informative. For instance, the knowledge that some particular number will not be the winning number of a lottery provides very little information, because any particular chosen number will almost certainly not win. However, knowledge that a particular number will win a lottery has high informational value because it communicates the outcome of a very low probability event. The information content, also called the surprisal or self-information, of an event E is I(E) = −log₂ p(E), a quantity that increases as the probability p(E) of the event decreases.

Consider tossing a coin with known, not necessarily fair, probabilities of coming up heads or tails; this can be modelled as a Bernoulli process. The entropy of the unknown result of the next toss of the coin is maximized if the coin is fair (that is, if heads and tails both have equal probability 1/2). This is the situation of maximum uncertainty, as it is most difficult to predict the outcome of the next toss; the result of each toss of a fair coin delivers one full bit of information.
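As a minimal sketch of these ideas (the function names below are illustrative, not from the text), one can compute the surprisal of individual events and the entropy of a biased coin, and check that the entropy peaks at exactly one bit for a fair coin:

```python
import math

def surprisal(p: float) -> float:
    """Information content (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

def bernoulli_entropy(p: float) -> float:
    """Shannon entropy of a coin that lands heads with probability p, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A very likely event is unsurprising; a very unlikely one is highly informative.
print(surprisal(0.999))   # ~0.0014 bits
print(surprisal(1e-6))    # ~19.9 bits

# The entropy of the next toss is maximized for a fair coin (p = 1/2): exactly 1 bit.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {bernoulli_entropy(p):.4f} bits")
```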

Two bits of entropy: In the case of two fair coin tosses, the information entropy in bits is the base-2 logarithm of the number of possible outcomes; with two coins there are four possible outcomes, and two bits of entropy.
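To make this arithmetic concrete (a small illustration, not from the original article): two fair coins have 2² = 4 equally likely outcomes, and for N equally likely outcomes the entropy is log₂ N, here log₂ 4 = 2 bits.

```python
import math
from itertools import product

# Enumerate the outcomes of two fair coin tosses.
outcomes = list(product("HT", repeat=2))   # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

# For N equally likely outcomes the Shannon entropy is log2(N).
entropy_bits = math.log2(len(outcomes))
print(len(outcomes), entropy_bits)  # 4 outcomes, 2.0 bits
```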
