Statistics

Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy

Authors: Amelia Carolina Sparavigna

The mutual information of two random variables is easily obtained from their Shannon entropies. However, when nonadditive entropies are involved, computing the mutual information is more complex. Here we first review the basics of information as given by the Shannon entropy, and then analyse the case of the generalized nonadditive Tsallis entropy.
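The Shannon-entropy route to mutual information mentioned in the abstract is the identity I(X;Y) = H(X) + H(Y) - H(X,Y), while the Tsallis entropy S_q is nonadditive: for independent variables, S_q(X,Y) = S_q(X) + S_q(Y) + (1-q) S_q(X) S_q(Y). A minimal sketch of both facts, using an illustrative joint distribution of my own choosing (not taken from the paper):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p ln p, in nats."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p^q)/(q - 1); tends to Shannon as q -> 1."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Illustrative joint distribution of two correlated binary variables
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px = pxy.sum(axis=1)  # marginal of X
py = pxy.sum(axis=0)  # marginal of Y

# Shannon mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy)

# Tsallis nonadditivity: for an *independent* joint distribution,
# S_q(X,Y) = S_q(X) + S_q(Y) + (1 - q) * S_q(X) * S_q(Y)
q = 2.0
p_ind = np.outer(px, py)  # independent joint built from the marginals
lhs = tsallis_entropy(p_ind, q)
rhs = (tsallis_entropy(px, q) + tsallis_entropy(py, q)
       + (1 - q) * tsallis_entropy(px, q) * tsallis_entropy(py, q))
```

For the correlated `pxy` above, `mi` is strictly positive, and `lhs` equals `rhs`, which is exactly the pseudo-additivity that makes the Tsallis mutual information harder to define than the Shannon one.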

Comments: 4 Pages. Published in International Journal of Sciences, 2015, 4(10):1-4. DOI:10.18483/ijSci.845


Submission history

[v1] 2015-12-12 02:35:48

