
Dempster–Shafer theory

The theory of belief functions, also referred to as evidence theory or Dempster–Shafer theory (DST), is a general framework for reasoning with uncertainty, with understood connections to other frameworks such as probability, possibility and imprecise probability theories. First introduced by Arthur P. Dempster[1] in the context of statistical inference, the theory was later developed by Glenn Shafer into a general framework for modeling epistemic uncertainty—a mathematical theory of evidence.[2][3] The theory allows one to combine evidence from different sources and arrive at a degree of belief (represented by a mathematical object called belief function) that takes into account all the available evidence.

Arthur P. Dempster at the Workshop on Theory of Belief Functions (Brest, 1 April 2010).

In a narrow sense, the term Dempster–Shafer theory refers to the original conception of the theory by Dempster and Shafer. However, it is more common to use the term in the wider sense of the same general approach, as adapted to specific kinds of situations. In particular, many authors have proposed different rules for combining evidence, often with a view to handling conflicts in evidence better.[4] The early contributions have also been the starting points of many important developments, including the transferable belief model and the theory of hints.[5]

Overview

Dempster–Shafer theory is a generalization of the Bayesian theory of subjective probability. Belief functions base degrees of belief (or confidence, or trust) for one question on the subjective probabilities for a related question. The degrees of belief themselves may or may not have the mathematical properties of probabilities; how much they differ depends on how closely the two questions are related.[6] Put another way, it is a way of representing epistemic plausibilities, but it can yield answers that contradict those arrived at using probability theory.

Often used as a method of sensor fusion, Dempster–Shafer theory is based on two ideas: obtaining degrees of belief for one question from subjective probabilities for a related question, and Dempster's rule[7] for combining such degrees of belief when they are based on independent items of evidence. In essence, the degree of belief in a proposition depends primarily upon the number of answers (to the related questions) containing the proposition, and the subjective probability of each answer. Also contributing are the rules of combination that reflect general assumptions about the data.

In this formalism a degree of belief (also referred to as a mass) is represented as a belief function rather than a Bayesian probability distribution. Probability values are assigned to sets of possibilities rather than single events: their appeal rests on the fact that they naturally encode evidence in favor of propositions.

Dempster–Shafer theory assigns its masses to all of the subsets of the set of states of a system—in set-theoretic terms, the power set of the states. For instance, assume a situation where there are two possible states of a system. For this system, a belief function assigns mass to each of the four subsets: the first state alone, the second alone, both together, and neither (the empty set).

Belief and plausibility

Shafer's formalism starts from a set of possibilities under consideration, for instance numerical values of a variable, or pairs of linguistic variables like "date and place of origin of a relic" (asking whether it is antique or a recent fake). A hypothesis is represented by a subset of this frame of discernment, like "(Ming dynasty, China)", or "(19th century, Germany)".[2]: p.35f. 

Shafer's framework allows for belief about such propositions to be represented as intervals, bounded by two values, belief (or support) and plausibility:

$$\text{belief} \leq \text{plausibility}.$$

In a first step, subjective probabilities (masses) are assigned to all subsets of the frame; usually, only a restricted number of sets will have non-zero mass (focal elements).[2]: 39f.  Belief in a hypothesis is constituted by the sum of the masses of all subsets of the hypothesis-set. It is the amount of belief that directly supports either the given hypothesis or a more specific one, thus forming a lower bound on its probability. Belief (usually denoted Bel) measures the strength of the evidence in favor of a proposition p. It ranges from 0 (indicating no evidence) to 1 (denoting certainty). Plausibility is 1 minus the sum of the masses of all sets whose intersection with the hypothesis is empty. Equivalently, it can be obtained as the sum of the masses of all sets whose intersection with the hypothesis is not empty. It is an upper bound on the possibility that the hypothesis could be true, because there is only so much evidence that contradicts that hypothesis. Plausibility (denoted by Pl) is thus related to Bel by Pl(p) = 1 − Bel(~p). It also ranges from 0 to 1 and measures the extent to which evidence in favor of ~p leaves room for belief in p.

For example, suppose we have a belief of 0.5 for a proposition, say "the cat in the box is dead." This means that we have evidence that allows us to state strongly that the proposition is true with a confidence of 0.5. However, the evidence contrary to that hypothesis (i.e. "the cat is alive") only has a confidence of 0.2. The remaining mass of 0.3 (the gap between the 0.5 supporting evidence on the one hand, and the 0.2 contrary evidence on the other) is "indeterminate," meaning that the cat could either be dead or alive. This interval represents the level of uncertainty based on the evidence in the system.

Hypothesis Mass Belief Plausibility
Neither (alive nor dead) 0 0 0
Alive 0.2 0.2 0.5
Dead 0.5 0.5 0.8
Either (alive or dead) 0.3 1.0 1.0

The "neither" hypothesis is set to zero by definition (it corresponds to "no solution"). The orthogonal hypotheses "Alive" and "Dead" have probabilities of 0.2 and 0.5, respectively. This could correspond to "Live/Dead Cat Detector" signals, which have respective reliabilities of 0.2 and 0.5. Finally, the all-encompassing "Either" hypothesis (which simply acknowledges there is a cat in the box) picks up the slack so that the sum of the masses is 1. The belief for the "Alive" and "Dead" hypotheses matches their corresponding masses because they have no subsets; belief for "Either" consists of the sum of all three masses (Either, Alive, and Dead) because "Alive" and "Dead" are each subsets of "Either". The "Alive" plausibility is 1 − m(Dead) = 0.5, and the "Dead" plausibility is 1 − m(Alive) = 0.8. Put another way, the "Alive" plausibility is m(Alive) + m(Either) and the "Dead" plausibility is m(Dead) + m(Either). Finally, the "Either" plausibility is the sum m(Alive) + m(Dead) + m(Either). The universal hypothesis ("Either") will always have 100% belief and plausibility—it acts as a checksum of sorts.
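
To make the table concrete, here is a minimal Python sketch (not from the article; the frozenset representation and variable names are my own) that computes belief and plausibility from a mass assignment and reproduces the cat example:

```python
# Masses over subsets of the frame {alive, dead}, represented as frozensets.
m = {
    frozenset(): 0.0,                   # "Neither": zero by definition
    frozenset({"alive"}): 0.2,
    frozenset({"dead"}): 0.5,
    frozenset({"alive", "dead"}): 0.3,  # "Either": uncommitted mass
}

def bel(hypothesis, masses):
    """Belief: total mass of all subsets of the hypothesis."""
    return sum(v for s, v in masses.items() if s <= hypothesis)

def pl(hypothesis, masses):
    """Plausibility: total mass of all sets that intersect the hypothesis."""
    return sum(v for s, v in masses.items() if s & hypothesis)

dead = frozenset({"dead"})
print(bel(dead, m), round(pl(dead, m), 2))  # 0.5 0.8, as in the table
```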

Here is a somewhat more elaborate example where the behavior of belief and plausibility begins to emerge. We're looking through a variety of detector systems at a single faraway signal light, which can only be coloured in one of three colours (red, yellow, or green):

Hypothesis Mass Belief Plausibility
None 0 0 0
Red 0.35 0.35 0.56
Yellow 0.25 0.25 0.45
Green 0.15 0.15 0.34
Red or Yellow 0.06 0.66 0.85
Red or Green 0.05 0.55 0.75
Yellow or Green 0.04 0.44 0.65
Any 0.1 1.0 1.0

Events of this kind would not be modeled as distinct entities in probability space as they are here in mass assignment space. Rather the event "Red or Yellow" would be considered as the union of the events "Red" and "Yellow", and (see probability axioms) P(Red or Yellow) ≥ P(Yellow), and P(Any) = 1, where Any refers to Red or Yellow or Green. In DST the mass assigned to Any refers to the proportion of evidence that cannot be assigned to any of the other states, which here means evidence that says there is a light but does not say anything about what color it is. In this example, the proportion of evidence that shows the light is either Red or Green is given a mass of 0.05. Such evidence might, for example, be obtained from an R/G color-blind person. DST lets us extract the value of this sensor's evidence. Also, in DST the empty set is considered to have zero mass, meaning here that the signal light system exists and we are examining its possible states, not speculating as to whether it exists at all.
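
The same bel and pl functions from the cat sketch above reproduce this table as well; the mass dictionary below is copied directly from the table rows:

```python
light = {
    frozenset({"red"}): 0.35, frozenset({"yellow"}): 0.25,
    frozenset({"green"}): 0.15, frozenset({"red", "yellow"}): 0.06,
    frozenset({"red", "green"}): 0.05, frozenset({"yellow", "green"}): 0.04,
    frozenset({"red", "yellow", "green"}): 0.10,
}
ry = frozenset({"red", "yellow"})
print(round(bel(ry, light), 2), round(pl(ry, light), 2))  # 0.66 0.85
```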

Combining beliefs

Beliefs from different sources can be combined with various fusion operators to model specific situations of belief fusion, e.g. with Dempster's rule of combination, which combines belief constraints[8] that are dictated by independent belief sources, such as in the case of combining hints[5] or combining preferences.[9] Note that the probability masses from propositions that contradict each other can be used to obtain a measure of conflict between the independent belief sources. Other situations can be modeled with different fusion operators, such as cumulative fusion of beliefs from independent sources, which can be modeled with the cumulative fusion operator.[10]

Dempster's rule of combination is sometimes interpreted as an approximate generalisation of Bayes' rule. In this interpretation the priors and conditionals need not be specified, unlike traditional Bayesian methods, which often use a symmetry (minimax error) argument to assign prior probabilities to random variables (e.g. assigning 0.5 to binary values for which no information is available about which is more likely). However, any information contained in the missing priors and conditionals is not used in Dempster's rule of combination unless it can be obtained indirectly—and arguably is then available for calculation using Bayes equations.

Dempster–Shafer theory allows one to specify a degree of ignorance in this situation instead of being forced to supply prior probabilities that add to unity. This sort of situation, and whether there is a real distinction between risk and ignorance, has been extensively discussed by statisticians and economists. See, for example, the contrasting views of Daniel Ellsberg, Howard Raiffa, Kenneth Arrow and Frank Knight.[citation needed]

Formal definition

Let X be the universe: the set representing all possible states of a system under consideration. The power set

$$2^X$$

is the set of all subsets of X, including the empty set $\emptyset$. For example, if:

$$X = \{a, b\}$$

then

$$2^X = \{\emptyset, \{a\}, \{b\}, X\}.$$

The elements of the power set can be taken to represent propositions concerning the actual state of the system, by containing all and only the states in which the proposition is true.

The theory of evidence assigns a belief mass to each element of the power set. Formally, a function

$$m: 2^X \rightarrow [0, 1]$$

is called a basic belief assignment (BBA), when it has two properties. First, the mass of the empty set is zero:

$$m(\emptyset) = 0.$$

Second, the masses of all the members of the power set add up to a total of 1:

$$\sum_{A \in 2^X} m(A) = 1.$$

The mass m(A) of A, a given member of the power set, expresses the proportion of all relevant and available evidence that supports the claim that the actual state belongs to A but to no particular subset of A. The value of m(A) pertains only to the set A and makes no additional claims about any subsets of A, each of which have, by definition, their own mass.

From the mass assignments, the upper and lower bounds of a probability interval can be defined. This interval contains the precise probability of a set of interest (in the classical sense), and is bounded by two non-additive continuous measures called belief (or support) and plausibility:

$$\operatorname{bel}(A) \leq P(A) \leq \operatorname{pl}(A).$$

The belief bel(A) for a set A is defined as the sum of all the masses of subsets of the set of interest:

$$\operatorname{bel}(A) = \sum_{B \,\mid\, B \subseteq A} m(B).$$

The plausibility pl(A) is the sum of all the masses of the sets B that intersect the set of interest A:

$$\operatorname{pl}(A) = \sum_{B \,\mid\, B \cap A \neq \emptyset} m(B).$$

The two measures are related to each other as follows:

$$\operatorname{pl}(A) = 1 - \operatorname{bel}(\overline{A}).$$

And conversely, for finite A, given the belief measure bel(B) for all subsets B of A, we can find the masses m(A) with the following inverse function:

$$m(A) = \sum_{B \,\mid\, B \subseteq A} (-1)^{|A - B|} \operatorname{bel}(B),$$

where |A − B| is the difference of the cardinalities of the two sets.[4]

It follows from the last two equations that, for a finite set X, one needs to know only one of the three (mass, belief, or plausibility) to deduce the other two; though one may need to know the values for many sets in order to calculate one of the other values for a particular set. In the case of an infinite X, there can be well-defined belief and plausibility functions but no well-defined mass function.[11]
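
The definitions above translate directly into code. The following sketch is my own illustration over a two-element frame (the powerset helper and the dict representation are assumptions, not standard tooling): it computes belief from a mass function and recovers the masses again via the inverse (Möbius) formula:

```python
from itertools import chain, combinations

def powerset(frame):
    """All subsets of a finite frame, as frozensets."""
    items = list(frame)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def bel_from_m(m, frame):
    """bel(A) = sum of m(B) over all B that are subsets of A."""
    return {A: sum(v for B, v in m.items() if B <= A) for A in powerset(frame)}

def m_from_bel(bel, frame):
    """Moebius inverse: m(A) = sum over B subset of A of (-1)^|A-B| * bel(B)."""
    return {A: sum((-1) ** len(A - B) * bel[B]
                   for B in powerset(frame) if B <= A)
            for A in powerset(frame)}

frame = {"a", "b"}
m = {frozenset(): 0.0, frozenset({"a"}): 0.4,
     frozenset({"b"}): 0.1, frozenset({"a", "b"}): 0.5}
beliefs = bel_from_m(m, frame)
# Round-trip: recovering the masses from the beliefs gives back m.
assert all(abs(m_from_bel(beliefs, frame)[A] - m[A]) < 1e-9 for A in m)
```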

Dempster's rule of combination

The problem we now face is how to combine two independent sets of probability mass assignments in specific situations. When different sources express their beliefs over the frame in terms of belief constraints, such as when giving hints or expressing preferences, Dempster's rule of combination is the appropriate fusion operator. This rule derives common shared belief between multiple sources and ignores all the conflicting (non-shared) belief through a normalization factor. Use of that rule in situations other than that of combining belief constraints has come under serious criticism, such as in the case of fusing separate belief estimates from multiple sources that are to be integrated in a cumulative manner, and not as constraints. Cumulative fusion means that all probability masses from the different sources are reflected in the derived belief, so no probability mass is ignored.

Specifically, the combination (called the joint mass) is calculated from the two sets of masses m1 and m2 in the following manner:

$$m_{1,2}(\emptyset) = 0$$

$$m_{1,2}(A) = (m_1 \oplus m_2)(A) = \frac{1}{1 - K} \sum_{B \cap C = A \neq \emptyset} m_1(B)\, m_2(C)$$

where

$$K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C).$$

K is a measure of the amount of conflict between the two mass sets.
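
As a sketch, the rule can be implemented directly from the formula above, using the dict-of-frozensets representation from the earlier examples (this is my own transcription, not a reference implementation):

```python
def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule, normalizing by 1 - K."""
    joint, K = {}, 0.0
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            if A:                       # non-empty intersection: shared belief
                joint[A] = joint.get(A, 0.0) + v1 * v2
            else:                       # empty intersection: conflicting mass
                K += v1 * v2
    if 1.0 - K <= 1e-12:
        raise ValueError("total conflict (K = 1): combination undefined")
    return {A: v / (1.0 - K) for A, v in joint.items()}
```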

Effects of conflict

The normalization factor above, 1 − K, has the effect of completely ignoring conflict and attributing any mass associated with conflict to the empty set. This combination rule for evidence can therefore produce counterintuitive results, as we show next.

Example producing correct results in case of high conflict

The following example shows how Dempster's rule produces intuitive results when applied in a preference fusion situation, even when there is high conflict.

Suppose that two friends, Alice and Bob, want to see a film at the cinema one evening, and that there are only three films showing: X, Y and Z. Alice expresses her preference for film X with probability 0.99, and her preference for film Y with a probability of only 0.01. Bob expresses his preference for film Z with probability 0.99, and his preference for film Y with a probability of only 0.01. When combining the preferences with Dempster's rule of combination it turns out that their combined preference results in probability 1.0 for film Y, because it is the only film that they both agree to see.

Dempster's rule of combination produces intuitive results even in case of totally conflicting beliefs when interpreted in this way. Assume that Alice prefers film X with probability 1.0, and that Bob prefers film Z with probability 1.0. When trying to combine their preferences with Dempster's rule it turns out that it is undefined in this case, which means that there is no solution. This would mean that they cannot agree on seeing any film together, so they do not go to the cinema together that evening. However, the semantics of interpreting preference as a probability is vague: if it is referring to the probability of seeing film X tonight, then we face the fallacy of the excluded middle: the event that actually occurs, seeing none of the films tonight, has a probability mass of 0.
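
Running the film example through the dempster_combine sketch above shows both behaviors (film names as one-element frozensets are my own encoding):

```python
alice = {frozenset({"X"}): 0.99, frozenset({"Y"}): 0.01}
bob   = {frozenset({"Z"}): 0.99, frozenset({"Y"}): 0.01}
print(dempster_combine(alice, bob))  # {frozenset({'Y'}): 1.0} (up to rounding)

# Totally conflicting preferences: K = 1, so the rule is undefined.
# dempster_combine({frozenset({"X"}): 1.0}, {frozenset({"Z"}): 1.0})
# -> raises ValueError
```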

Example producing counter-intuitive results in case of high conflict

An example with exactly the same numerical values was introduced by Lotfi Zadeh in 1979,[12][13][14] to point out counter-intuitive results generated by Dempster's rule when there is a high degree of conflict. The example goes as follows:

Suppose that one has two equally reliable doctors, and one doctor believes a patient has either a brain tumor, with a probability (i.e. a basic belief assignment, bba, or mass of belief) of 0.99, or meningitis, with a probability of only 0.01. A second doctor believes the patient has a concussion, with a probability of 0.99, and believes the patient suffers from meningitis, with a probability of only 0.01. Applying Dempster's rule to combine these two sets of masses of belief, one finally gets m(meningitis) = 1 (meningitis is diagnosed with 100 percent confidence).
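
Writing T for brain tumor, M for meningitis and C for concussion, the arithmetic works out as follows:

$$K = m_1(T)\,m_2(C) + m_1(T)\,m_2(M) + m_1(M)\,m_2(C) = 0.9801 + 0.0099 + 0.0099 = 0.9999,$$

$$m(M) = \frac{m_1(M)\,m_2(M)}{1 - K} = \frac{0.0001}{0.0001} = 1.$$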

Such a result goes against common sense, since both doctors agree that there is little chance that the patient has meningitis. This example has been the starting point of many research works trying to find a solid justification for Dempster's rule and for the foundations of Dempster–Shafer theory,[15][16] or to show the inconsistencies of the theory.[17][18][19]

Example producing counter-intuitive results in case of low conflict

The following example shows where Dempster's rule produces a counter-intuitive result, even when there is low conflict.

Suppose that one doctor believes a patient has either a brain tumor, with a probability of 0.99, or meningitis, with a probability of only 0.01. A second doctor also believes the patient has a brain tumor, with a probability of 0.99, and believes the patient suffers from concussion, with a probability of only 0.01. If we calculate m(brain tumor) with Dempster's rule, we obtain

$$m(\text{brain tumor}) = \operatorname{Bel}(\text{brain tumor}) = 1.$$

This result implies complete support for the diagnosis of a brain tumor, which both doctors believed very likely. The agreement arises from the low degree of conflict between the two bodies of evidence constituted by the two doctors' opinions.
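
Spelled out with the same notation as before (T = brain tumor, M = meningitis, C = concussion):

$$K = m_1(T)\,m_2(C) + m_1(M)\,m_2(T) + m_1(M)\,m_2(C) = 0.0099 + 0.0099 + 0.0001 = 0.0199,$$

$$m(T) = \frac{m_1(T)\,m_2(T)}{1 - K} = \frac{0.9801}{0.9801} = 1.$$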

In either case, it would be reasonable to expect that:

$$m(\text{brain tumor}) < 1 \quad \text{and} \quad \operatorname{Bel}(\text{brain tumor}) < 1,$$

since the existence of non-zero belief probabilities for other diagnoses implies less than complete support for the brain tumor diagnosis.

Dempster–Shafer as a generalisation of Bayesian theory

As in Dempster–Shafer theory, a Bayesian belief function $\operatorname{bel}: 2^X \rightarrow [0, 1]$ has the properties $\operatorname{bel}(\emptyset) = 0$ and $\operatorname{bel}(X) = 1$. The third condition, however, is subsumed by, but relaxed in DS theory:[2]: p. 19 

$$\text{If } A \cap B = \emptyset \text{, then } \operatorname{bel}(A \cup B) = \operatorname{bel}(A) + \operatorname{bel}(B).$$

Either of the following conditions implies the Bayesian special case of the DS theory:[2]: p. 37, 45 

  • $\operatorname{bel}(A) + \operatorname{bel}(\overline{A}) = 1$ for all $A \subseteq X$
  • For finite X, all focal elements of the belief function are singletons.

As an example of how the two approaches differ, a Bayesian could model the color of a car as a probability distribution over (red, green, blue), assigning one number to each color. Dempster–Shafer would assign numbers to each of (red, green, blue, (red or green), (red or blue), (green or blue), (red or green or blue)). These numbers do not have to be coherent; for example, Bel(red)+Bel(green) does not have to equal Bel(red or green).
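
This non-additivity can be checked numerically; with the bel function and the light masses from the earlier sketches:

```python
red, green = frozenset({"red"}), frozenset({"green"})
print(round(bel(red, light) + bel(green, light), 2))  # 0.5
print(round(bel(red | green, light), 2))              # 0.55, includes m(Red or Green)
```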

Thus, Bayes' conditional probability can be considered as a special case of Dempster's rule of combination.[2]: p. 19f.  However, it lacks many (if not most) of the properties that make Bayes' rule intuitively desirable, leading some to argue that it cannot be considered a generalization in any meaningful sense.[20] For example, DS theory violates the requirements for Cox's theorem, which implies that it cannot be considered a coherent (contradiction-free) generalization of classical logic—specifically, DS theory violates the requirement that a statement be either true or false (but not both). As a result, DS theory is subject to the Dutch Book argument, implying that any agent using DS theory would agree to a series of bets that result in a guaranteed loss.

Bayesian approximation

The Bayesian approximation[21][22] reduces a given bpa $m$ to a (discrete) probability distribution, i.e. only singleton subsets of the frame of discernment are allowed to be focal elements of the approximated version $\underline{m}$ of $m$:

$$\underline{m}(A) = \begin{cases} \dfrac{\sum_{B \,\mid\, A \subseteq B} m(B)}{\sum_{C} m(C) \cdot |C|} & \text{if } |A| = 1 \\ 0 & \text{otherwise} \end{cases}$$

It is useful for those who are interested only in single-state hypotheses.

We can apply this approximation to the 'light' example above:

Hypothesis $m_1$ $m_2$ $m_{1 \oplus 2}$ $\underline{m}_1$ $\underline{m}_2$ $\underline{m}_{1 \oplus 2}$
None 0 0 0 0 0 0
Red 0.35 0.11 0.32 0.41 0.30 0.37
Yellow 0.25 0.21 0.33 0.33 0.38 0.38
Green 0.15 0.33 0.24 0.25 0.32 0.25
Red or Yellow 0.06 0.21 0.07 0 0 0
Red or Green 0.05 0.01 0.01 0 0 0
Yellow or Green 0.04 0.03 0.01 0 0 0
Any 0.1 0.1 0.02 0 0 0
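
A sketch of the approximation (my own code, following the formula above): applied to the light masses from the earlier sketch, it reproduces the $\underline{m}_1$ column of the table.

```python
def bayesian_approximation(m):
    """Collapse a bba to singletons: for each element x, sum m(B) over all
    B containing x, then divide by the normalizer sum over C of m(C) * |C|."""
    norm = sum(v * len(C) for C, v in m.items())
    elements = {x for B in m for x in B}
    return {x: sum(v for B, v in m.items() if x in B) / norm for x in elements}

approx = bayesian_approximation(light)
print({x: round(p, 2) for x, p in approx.items()})
# {'red': 0.41, 'yellow': 0.33, 'green': 0.25}, matching the table
```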

Criticism

Judea Pearl (1988a, chapter 9;[23] 1988b[24] and 1990)[25] has argued that it is misleading to interpret belief functions as representing either "probabilities of an event," or "the confidence one has in the probabilities assigned to various outcomes," or "degrees of belief (or confidence, or trust) in a proposition," or "degree of ignorance in a situation." Instead, belief functions represent the probability that a given proposition is provable from a set of other propositions, to which probabilities are assigned. Confusing probabilities of truth with probabilities of provability may lead to counterintuitive results in reasoning tasks such as (1) representing incomplete knowledge, (2) belief-updating and (3) evidence pooling. He further demonstrated that, if partial knowledge is encoded and updated by belief function methods, the resulting beliefs cannot serve as a basis for rational decisions.

Kłopotek and Wierzchoń[26] proposed to interpret the Dempster–Shafer theory in terms of statistics of decision tables (of the rough set theory), whereby the operator of combining evidence should be seen as relational joining of decision tables. In another interpretation M. A. Kłopotek and S. T. Wierzchoń[27] propose to view this theory as describing destructive material processing (under loss of properties), e.g. like in some semiconductor production processes. Under both interpretations reasoning in DST gives correct results, contrary to the earlier probabilistic interpretations, criticized by Pearl in the cited papers and by other researchers.

Jøsang proved that Dempster's rule of combination is actually a method for fusing belief constraints.[8] It only represents an approximate fusion operator in other situations, such as cumulative fusion of beliefs, and generally produces incorrect results in such situations. The confusion around the validity of Dempster's rule therefore originates in the failure to correctly interpret the nature of the situations to be modeled. Dempster's rule of combination always produces correct and intuitive results when fusing belief constraints from different sources.

Relational measures

In considering preferences one might use the partial order of a lattice instead of the total order of the real line as found in Dempster–Shafer theory. Indeed, Gunther Schmidt has proposed this modification and outlined the method.[28]

Given a set of criteria C and a bounded lattice L with ordering ≤, Schmidt defines a relational measure to be a function μ from the power set of C into L that respects the order ⊆ on $\mathbb{P}(C)$:

$$A \subseteq B \implies \mu(A) \leq \mu(B),$$

and such that μ takes the empty subset of $\mathbb{P}(C)$ to the least element of L, and takes C to the greatest element of L.

Schmidt compares μ with the belief function of Shafer, and he also considers a method of combining measures generalizing the approach of Dempster (when new evidence is combined with previously held evidence). He also introduces a relational integral and compares it to the Choquet integral and Sugeno integral. Any relation m between C and L may be introduced as a "direct valuation", then processed with the calculus of relations to obtain a possibility measure μ.
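
As a toy illustration of the monotonicity condition (entirely my own construction: C is a hypothetical criteria set, and the lattice L is the chain 0 ≤ 1 ≤ 2 ≤ 3 with μ(A) = |A|):

```python
from itertools import chain, combinations

C = {"cost", "speed", "safety"}

def subsets(s):
    items = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

mu = {A: len(A) for A in subsets(C)}  # a relational measure into the chain L

# Monotone: A subseteq B implies mu(A) <= mu(B); bottom and top map correctly.
assert all(mu[A] <= mu[B] for A in subsets(C) for B in subsets(C) if A <= B)
assert mu[frozenset()] == 0 and mu[frozenset(C)] == len(C)
```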

See also

  • Imprecise probability
  • Upper and lower probabilities
  • Possibility theory
  • Probabilistic logic
  • Bayes' theorem
  • Bayesian network
  • G. L. S. Shackle
  • Transferable belief model
  • Info-gap decision theory
  • Subjective logic
  • Doxastic logic
  • Linear belief function

References

  1. ^ Dempster, A. P. (1967). "Upper and lower probabilities induced by a multivalued mapping". The Annals of Mathematical Statistics. 38 (2): 325–339. doi:10.1214/aoms/1177698950.
  2. ^ a b c d e f Shafer, Glenn; A Mathematical Theory of Evidence, Princeton University Press, 1976, ISBN 0-608-02508-9
  3. ^ Fine, Terrence L. (1977). "Review: Glenn Shafer, A mathematical theory of evidence". Bull. Amer. Math. Soc. 83 (4): 667–672. doi:10.1090/s0002-9904-1977-14338-3.
  4. ^ a b Kari Sentz and Scott Ferson (2002); Combination of Evidence in Dempster–Shafer Theory, Sandia National Laboratories SAND 2002-0835
  5. ^ a b Kohlas, J., and Monney, P.A., 1995. A Mathematical Theory of Hints. An Approach to the Dempster–Shafer Theory of Evidence. Vol. 425 in Lecture Notes in Economics and Mathematical Systems. Springer Verlag.
  6. ^ Shafer, Glenn; Dempster–Shafer theory, 2002
  7. ^ Dempster, Arthur P.; "A generalization of Bayesian inference", Journal of the Royal Statistical Society, Series B, Vol. 30, pp. 205–247, 1968
  8. ^ a b Jøsang, A.; Simon, P. (2012). "Dempster's Rule as Seen by Little Colored Balls". Computational Intelligence. 28 (4): 453–474. doi:10.1111/j.1467-8640.2012.00421.x. S2CID 5143692.
  9. ^ Jøsang, A., and Hankin, R., 2012. Interpretation and Fusion of Hyper Opinions in Subjective Logic. 15th International Conference on Information Fusion (FUSION) 2012. E-ISBN 978-0-9824438-4-2, IEEE. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6289948
  10. ^ Jøsang, A.; Diaz, J. & Rifqi, M. (2010). "Cumulative and averaging fusion of beliefs". Information Fusion. 11 (2): 192–200. CiteSeerX 10.1.1.615.2200. doi:10.1016/j.inffus.2009.05.005. S2CID 205432025.
  11. ^ J.Y. Halpern (2017) Reasoning about Uncertainty MIT Press
  12. ^ L. Zadeh, On the validity of Dempster's rule of combination, Memo M79/24, Univ. of California, Berkeley, USA, 1979
  13. ^ L. Zadeh, Book review: A mathematical theory of evidence, The Al Magazine, Vol. 5, No. 3, pp. 81–83, 1984
  14. ^ L. Zadeh, A simple view of the Dempster–Shafer Theory of Evidence and its implication for the rule of combination, The Al Magazine, Vol. 7, No. 2, pp. 85–90, Summer 1986.
  15. ^ E. Ruspini, "The logical foundations of evidential reasoning", SRI Technical Note 408, December 20, 1986 (revised April 27, 1987)
  16. ^ N. Wilson, "The assumptions behind Dempster's rule", in Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, pages 527–534, Morgan Kaufmann Publishers, San Mateo, CA, USA, 1993
  17. ^ F. Voorbraak, "On the justification of Dempster's rule of combination", Artificial Intelligence, Vol. 48, pp. 171–197, 1991
  18. ^ Pei Wang, "A Defect in Dempster–Shafer Theory", in Proceedings of the 10th Conference on Uncertainty in Artificial Intelligence, pages 560–566, Morgan Kaufmann Publishers, San Mateo, CA, USA, 1994
  19. ^ P. Walley, "Statistical Reasoning with Imprecise Probabilities", Chapman and Hall, London, pp. 278–281, 1991
  20. ^ Dezert J., Tchamova A., Han D., Tacnet J.-M., Why Dempster's fusion rule is not a generalization of Bayes fusion rule, Proc. Of Fusion 2013 Int. Conference on Information Fusion, Istanbul, Turkey, July 9–12, 2013
  21. ^ Bauer; Mathias (1996). Proceedings of the Twelfth international conference on Uncertainty in artificial intelligence. pp. 73–80.
  22. ^ Voorbraak, Frans (1989-05-01). "A computationally efficient approximation of Dempster-Shafer theory". International Journal of Man-Machine Studies. 30 (5): 525–536. doi:10.1016/S0020-7373(89)80032-X. hdl:1874/26317. ISSN 0020-7373.
  23. ^ Pearl, J. (1988a), Probabilistic Reasoning in Intelligent Systems, (Revised Second Printing) San Mateo, CA: Morgan Kaufmann.
  24. ^ Pearl, J. (1988b). "On Probability Intervals". International Journal of Approximate Reasoning. 2 (3): 211–216. doi:10.1016/0888-613X(88)90117-X.
  25. ^ Pearl, J. (1990). "Reasoning with Belief Functions: An Analysis of Compatibility". The International Journal of Approximate Reasoning. 4 (5/6): 363–389. doi:10.1016/0888-613X(90)90013-R.
  26. ^ M. A. Kłopotek, S. T. Wierzchoń': "A New Qualitative Rough-Set Approach to Modeling Belief Functions." [in:] L. Polkowski, A, Skowron eds: Rough Sets And Current Trends In Computing. Proc. 1st International Conference RSCTC'98, Warsaw, June 22–26, 1998, Lecture Notes in Artificial Intelligence 1424, Springer-Verlag, pp. 346–353.
  27. ^ M. A. Kłopotek and S. T. Wierzchoń, "Empirical Models for the Dempster–Shafer Theory". in: Srivastava, R. P., Mock, T. J., (Eds.). Belief Functions in Business Decisions. Series: Studies in Fuzziness and Soft Computing. Vol. 88 Springer-Verlag. March 2002. ISBN 3-7908-1451-2, pp. 62–112
  28. ^ Gunther Schmidt (2006) Relational measures and integration, Lecture Notes in Computer Science # 4136, pages 343−57, Springer books

Further reading

  • Yang, J. B. and Xu, D. L. Evidential Reasoning Rule for Evidence Combination, Artificial Intelligence, Vol.205, pp. 1–29, 2013.
  • Yager, R. R., & Liu, L. (2008). Classic works of the Dempster–Shafer theory of belief functions. Studies in fuzziness and soft computing, v. 219. Berlin: Springer. ISBN 978-3-540-25381-5.
  • Joseph C. Giarratano and Gary D. Riley (2005); Expert Systems: principles and programming, ed. Thomson Course Tech., ISBN 0-534-38447-1
  • Beynon, M., Curry, B. and Morgan, P. The Dempster–Shafer theory of evidence: an alternative approach to multicriteria decision modelling[dead link], Omega, Vol.28, pp. 37–50, 2000.

External links

  • BFAS: Belief Functions and Applications Society
