Dempster-Shafer evidence theory (D-S theory) is increasingly applied to information fusion because of its advantages in handling uncertain information. However, counterintuitive results are often obtained when conflicting pieces of evidence are combined with Dempster's combination rule, and how to measure the divergence between different pieces of evidence remains an open issue. In this paper, a new relative entropy, named Deng relative entropy, is proposed to measure the divergence between different basic probability assignments (BPAs). The Deng relative entropy generalizes the Kullback-Leibler divergence: when the BPA degenerates to a probability distribution, the Deng relative entropy reduces to the Kullback-Leibler divergence. Numerical examples are used to illustrate the effectiveness of the proposed Deng relative entropy.
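The abstract does not reproduce the formula for the Deng relative entropy, so only the degenerate case it claims to generalize can be illustrated here. The sketch below, in Python, shows the classical Kullback-Leibler divergence and a BPA represented as masses on subsets of the frame of discernment; the names kl_divergence, bpa1, and bpa2 are illustrative assumptions, not from the paper.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions given as dicts {outcome: probability}."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

# A basic probability assignment (BPA) over the frame {a, b, c} assigns
# mass to subsets (focal elements), represented here as frozensets.
bpa1 = {frozenset({'a'}): 0.6, frozenset({'b'}): 0.3, frozenset({'c'}): 0.1}
bpa2 = {frozenset({'a'}): 0.4, frozenset({'b'}): 0.4, frozenset({'c'}): 0.2}

# When every focal element is a singleton, the BPA is an ordinary
# probability distribution, and the divergence between the two bodies
# of evidence reduces to the classical KL divergence, which is the
# special case the proposed Deng relative entropy is said to recover.
p = {next(iter(k)): v for k, v in bpa1.items()}
q = {next(iter(k)): v for k, v in bpa2.items()}
print(kl_divergence(p, q))
```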
Comments: 15 Pages.
[v1] 2015-11-17 06:09:55