Answer:
Cover and Thomas's book Elements of Information Theory is an excellent source on entropy and its applications.
If you are interested in the mathematical statistics related to entropy, see this tutorial:
http://www.renyi.hu/~csiszar/Publications/Information_Theory_and_Statistics:_A_Tutorial.pdf
It is freely available!
Entropy is a single concept: the amount of information needed to describe a system. There are many generalizations of it. Sample entropy is just an entropy-like descriptor used in heart-rate analysis.
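Since sample entropy comes up here, a minimal sketch may help. It follows the usual Richman–Moorman $-\log(A/B)$ form; the parameter defaults ($m=2$, $r=0.2\,\sigma$) and the naive $O(N^2)$ template matching are my illustrative choices, not anything prescribed above.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series: -log(A/B), where B counts pairs of
    length-m templates within tolerance r (Chebyshev distance, excluding
    self-matches) and A counts the same for length m+1."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()  # tolerance as a fraction of the series' std dev

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= tol:
                    count += 1
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=300)))  # irregular series give larger values
```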
Jaynes shows how to derive Shannon's entropy from basic principles in his book.
One idea is that if you approximate $n!$ by $(n/e)^n$, entropy is a rewriting of the following quantity:
$$\frac{1}{n}\log\binom{n}{np_1,\ldots,np_d}$$
The quantity inside the log is the number of different length-$n$ observation sequences over $d$ outcomes that are matched by the distribution $p$, so it's a kind of measure of the explanatory power of the distribution.
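Filling in that step (a standard Stirling calculation, with $d$ denoting the number of outcomes as above): using $n! \approx (n/e)^n$ and $\sum_{i} np_i = n$,

$$\frac{1}{n}\log\binom{n}{np_1,\ldots,np_d} \approx \frac{1}{n}\log\frac{(n/e)^n}{\prod_{i=1}^{d}\left(np_i/e\right)^{np_i}} = \frac{1}{n}\log\prod_{i=1}^{d} p_i^{-np_i} = -\sum_{i=1}^{d} p_i\log p_i,$$

which is exactly Shannon's entropy $H(p)$.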
Grünwald and Dawid's paper "Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory" discusses generalisations of the traditional notion of entropy. Given a loss, its associated entropy function is the map from a distribution to the minimal achievable expected loss under that distribution. The usual entropy function is the generalised entropy associated with the log loss; other choices of loss yield different entropies, such as the Rényi entropy.
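To make the "entropy as minimal expected loss" view concrete, here is a small numerical sketch (the function name and the specific losses are my own illustration, not code from the paper): minimising expected log loss over distributions recovers Shannon entropy, while a different loss, here the Brier/quadratic loss, yields a different entropy function.

```python
import numpy as np
from scipy.optimize import minimize

def generalized_entropy(p, loss):
    """H_L(p) = min_q E_{x~p}[loss(x, q)], minimised over distributions q
    (parametrised via softmax to stay on the probability simplex)."""
    k = len(p)
    def expected_loss(z):
        q = np.exp(z) / np.exp(z).sum()
        return sum(p[x] * loss(x, q) for x in range(k))
    return minimize(expected_loss, np.zeros(k), method="Nelder-Mead").fun

log_loss = lambda x, q: -np.log(q[x])                         # -> Shannon entropy
brier_loss = lambda x, q: np.sum((q - np.eye(len(q))[x])**2)  # -> 1 - sum(p^2)

p = np.array([0.7, 0.2, 0.1])
print(generalized_entropy(p, log_loss), -(p * np.log(p)).sum())  # should agree
print(generalized_entropy(p, brier_loss), 1 - (p**2).sum())      # should agree
```

In both cases the minimiser is $q = p$, which is what makes these losses (negatives of) proper scoring rules.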