Page 145 - DCAP310_INTRODUCTION_TO_ARTIFICIAL_INTELLIGENCE_AND_EXPERT_SYSTEMS
Unit 7: Probabilistic Reasoning
Specifically, the combination (called the joint mass) is calculated from the two sets of masses m_1 and m_2 in the following manner:

\[
m_{1,2}(\emptyset) = 0
\]
\[
m_{1,2}(A) = (m_1 \oplus m_2)(A) = \frac{1}{1 - K} \sum_{B \cap C = A \neq \emptyset} m_1(B)\, m_2(C)
\]

where

\[
K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C).
\]
K is a measure of the amount of conflict between the two mass sets.
Effects of Conflict
The normalization factor above, 1 – K, has the effect of completely ignoring conflict and
attributing any mass associated with conflict to the null set. This combination rule for evidence
can therefore produce counterintuitive results, as we show next.
Example: Producing Correct Results in Case of High Conflict
The following example shows how Dempster’s rule produces intuitive results when applied in
a preference fusion situation, even when there is high conflict.
Suppose that two friends, Alice and Bob, want to see a film at the cinema one evening, and that
there are only three films showing: X, Y and Z. Alice expresses her preference for film X with
probability 0.99, and her preference for film Y with a probability of only 0.01. Bob expresses his
preference for film Z with probability 0.99, and his preference for film Y with a probability of
only 0.01. When combining the preferences with Dempster’s rule of combination, it turns out
that their combined preference assigns probability 1.0 to film Y, because it is the only film
that they both agree to see. Interpreted in this way, Dempster’s rule of combination produces
intuitive results even in the case of totally conflicting beliefs. Assume that Alice prefers film X
with probability 1.0, and that Bob prefers film Z with probability 1.0. When trying to combine
their preferences with Dempster’s rule, it turns out that the combination is undefined in this case,
which means that there is no solution. This would mean that they cannot agree on seeing any film
together, so they do not go to the cinema together that evening.
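The combination rule above can be sketched in a few lines of Python. Representing a mass function as a dict mapping frozensets of hypotheses to masses, and the name `dempster_combine`, are illustrative choices, not part of the original text:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of hypotheses -> mass)
    using Dempster's rule of combination."""
    combined = {}
    K = 0.0  # total mass assigned to conflicting (empty-intersection) pairs
    for B, mB in m1.items():
        for C, mC in m2.items():
            inter = B & C
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mB * mC
            else:
                K += mB * mC
    if 1.0 - K <= 0.0:
        # Total conflict: the rule is undefined, as in the second
        # Alice/Bob scenario above.
        raise ValueError("total conflict (K = 1): combination undefined")
    # Normalize the non-conflicting mass by 1 - K
    return {A: v / (1.0 - K) for A, v in combined.items()}

# Alice's and Bob's film preferences from the example above
alice = {frozenset({'X'}): 0.99, frozenset({'Y'}): 0.01}
bob   = {frozenset({'Z'}): 0.99, frozenset({'Y'}): 0.01}

print(dempster_combine(alice, bob))  # all mass ends up on film Y
```

Running this reproduces the result in the text: every pairing except Y-with-Y is conflicting, so K = 0.9999 and the tiny agreement on Y is normalized up to m({Y}) = 1.0.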
Example: Producing Counter-intuitive Results in Case of High Conflict
An example with exactly the same numerical values was introduced by Zadeh in 1979 to point
out the counter-intuitive results generated by Dempster’s rule when there is a high degree of conflict.
The example goes as follows:
Suppose that there are two equally reliable doctors. One doctor believes the patient has either a
brain tumor—with a probability (i.e. a basic belief assignment, or mass of belief) of
0.99—or meningitis—with a probability of only 0.01. A second doctor believes the patient has either a
concussion—with a probability of 0.99—or meningitis—with
a probability of only 0.01. Applying Dempster’s rule to combine these two sets of masses of
belief, one finally gets m(meningitis) = 1 (meningitis is diagnosed with 100 percent
confidence).
Such a result goes against common sense, since both doctors agree that there is only a small chance
that the patient has meningitis. This example has been the starting point of much research
work aimed at finding a solid justification for Dempster’s rule and for the foundations of
Dempster–Shafer theory, or at demonstrating the inconsistencies of the theory.
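As a check on the arithmetic, Zadeh's example can be worked through by hand; the hypothesis labels T (tumor), M (meningitis), and C (concussion) below are assumed shorthand, not from the original text:

```python
# Zadeh's example: combine the two doctors' masses of belief directly.
m1 = {'T': 0.99, 'M': 0.01}   # doctor 1: tumor or meningitis
m2 = {'C': 0.99, 'M': 0.01}   # doctor 2: concussion or meningitis

# The only non-conflicting pairing is meningitis with meningitis;
# every other pair of singletons has an empty intersection.
agreement = m1['M'] * m2['M']        # 0.01 * 0.01 = 0.0001
K = 1.0 - agreement                  # conflict mass K = 0.9999

# Dempster's rule normalizes the surviving mass by 1 - K
m_meningitis = agreement / (1.0 - K)
print(K, m_meningitis)               # m(meningitis) comes out as 1 (up to rounding)
```

The normalization by 1 − K inflates the doctors' tiny shared belief in meningitis to full certainty, which is exactly the counter-intuitive behaviour the text describes.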
LOVELY PROFESSIONAL UNIVERSITY 139