Artificial Intelligence

Divergence Measure of Belief Function

Authors: Yutong Song, Yong Deng

It is important to measure the degree of divergence or conflict among pieces of information during information preprocessing, because combining conflicting bodies of evidence with Dempster's combination rule can produce unreliable results. However, how to measure the divergence between different pieces of evidence is still an open issue. In this paper, a new divergence measure of belief functions based on Deng entropy is proposed in order to measure the divergence between different belief functions. The proposed measure generalizes the Kullback-Leibler divergence for probability distributions: when the basic probability assignment (BPA) degenerates to a probability distribution, the divergence measure is equal to the Kullback-Leibler divergence. Numerical examples are used to illustrate the effectiveness of the proposed divergence measure.
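
The two named ingredients can be made concrete. The paper's own divergence formula is not given in this abstract, so the following is only a minimal sketch of the standard definitions it builds on, the Kullback-Leibler divergence and Deng entropy; the function names kl_divergence and deng_entropy and the example mass values are illustrative assumptions, not taken from the paper.

    # A minimal sketch, assuming the standard definitions; this is not the
    # paper's code, and the abstract does not state its divergence formula.
    import math

    def kl_divergence(p, q):
        # Kullback-Leibler divergence D(P || Q) for two discrete probability
        # distributions given as dicts mapping outcomes to probabilities.
        return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

    def deng_entropy(m):
        # Deng entropy of a BPA m, given as a dict mapping focal elements
        # (frozensets) to masses: E_d(m) = -sum_A m(A) log2(m(A) / (2^|A| - 1)).
        return -sum(mass * math.log2(mass / (2 ** len(A) - 1))
                    for A, mass in m.items() if mass > 0)

    # A BPA on the frame {a, b} with mass on a composite focal element.
    m = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
    print(deng_entropy(m))

    # When every focal element is a singleton, the BPA degenerates to a
    # probability distribution and Deng entropy reduces to Shannon entropy;
    # the abstract claims the proposed divergence then coincides with KL.
    p = {"a": 0.6, "b": 0.4}
    q = {"a": 0.5, "b": 0.5}
    print(kl_divergence(p, q))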

Comments: 3 Pages.

Submission history

[v1] 2019-02-17 03:32:56
