Artificial Intelligence

   

Refutation of the Definition of Mutual Information Copyright © 2018 by Colin James III All Rights Reserved.

Authors: Colin James III

The mutual information between two random variables is defined as the amount of information learned about one variable from knowing the other. Since the definition is symmetric, the conjecture equally represents the amount of information learned about the second variable from the first. The conjecture is found not tautologous and hence refuted.
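For reference, the standard information-theoretic definition under discussion can be sketched as follows. This is a minimal illustration of the textbook formula I(X;Y) = Σ p(x,y) log₂(p(x,y) / (p(x)p(y))) and of the symmetry the abstract mentions; it is not the author's Meth8 apparatus, and the example joint distribution is purely hypothetical.

```python
import math

def mutual_information(joint):
    """joint[i][j] = P(X=i, Y=j); returns I(X;Y) in bits."""
    px = [sum(row) for row in joint]              # marginal P(X)
    py = [sum(col) for col in zip(*joint)]        # marginal P(Y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:                           # 0 * log 0 taken as 0
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Hypothetical joint distribution of two correlated binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
# Swapping the roles of X and Y means transposing the joint table.
transposed = [list(col) for col in zip(*joint)]

i_xy = mutual_information(joint)
i_yx = mutual_information(transposed)
assert abs(i_xy - i_yx) < 1e-12  # symmetry: I(X;Y) == I(Y;X)
```

The final assertion checks the symmetry property: computing the quantity with the variables exchanged gives the same value, which is what the abstract refers to when it says the definition is symmetric.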

Comments: 1 Page.


Submission history

[v1] 2018-07-15 14:41:01


