LAWRENCE — How should people make decisions when the outcomes of their choices are uncertain and the uncertainty is described by probability theory?
That’s the question facing Prakash Shenoy, Ronald G. Harper Emeritus Professor of Artificial Intelligence at the University of Kansas School of Business.
His answer can be found in the article “An Interval-Valued Utility Theory for Decision Making with Dempster-Shafer Belief Functions”, which appears in the September issue of the International Journal of Approximate Reasoning.
“People assume you can always attach probabilities to uncertain events,” Shenoy said.
“But in real life, you never know what the odds are. You don’t know if it’s 50% or 60%. That’s the essence of the belief-functions theory that Arthur Dempster and Glenn Shafer formulated in the 1970s.”
His paper (co-authored with Thierry Denoeux) generalizes decision-making theory from probability to belief functions.
“Probability decision theory is used to make any type of high-stakes choice, like whether to accept a new job or a marriage proposal. You wouldn’t need it to decide where to go for lunch,” he said.
“But in general, you never know what will happen. You accept a job, but you may have a bad boss. There is a lot of uncertainty. You may have two job offers and have to decide which one to accept. Then you weigh the pros and cons and attach probabilities to them. The odds work out when you have a lot of repetitions. But if it’s a one-time thing, you can’t ‘average your earnings.’”
One of the first answers to this question was provided by John von Neumann and Oskar Morgenstern in their 1947 book “Theory of Games and Economic Behavior,” Shenoy said. In 1961, Daniel Ellsberg showed via experiments that von Neumann and Morgenstern’s decision theory was not descriptive of human behavior, especially when there was ambiguity in the theory’s representation of uncertainty by probabilities.
In the late 1960s and mid-1970s, Arthur Dempster and Glenn Shafer (a former KU faculty member in mathematics and business) formulated an uncertainty calculus called belief functions, a generalization of probability theory that is better able to account for ambiguity. However, there was no corresponding theory for making decisions when uncertainty was described in this way.
Shenoy’s paper provides the first formulation of a theory of decision making, analogous to the von Neumann-Morgenstern theory, for the case where uncertainty is described by Dempster-Shafer belief functions. Shenoy said this theory is better able to explain Ellsberg’s experimental findings on choices under ambiguity.
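The ambiguity Shenoy describes can be made concrete with Ellsberg’s classic urn: 30 balls are known to be red, and the other 60 are black or yellow in unknown proportion. A Dempster-Shafer mass function assigns evidence to sets of outcomes rather than to single outcomes, which yields interval-valued lower and upper bounds (belief and plausibility) instead of a single probability. The sketch below is an illustration of that basic idea only, not code from the paper:

```python
# Ellsberg's urn: 30 red balls, plus 60 balls that are black or
# yellow in an unknown proportion. The mass function puts weight
# on *sets* of outcomes, reflecting exactly what the evidence says.
mass = {
    frozenset({"red"}): 1 / 3,             # 30 of 90 balls are red
    frozenset({"black", "yellow"}): 2 / 3  # the rest: black or yellow
}

def belief(event):
    """Lower bound: total mass committed entirely to the event."""
    e = frozenset(event)
    return sum(m for s, m in mass.items() if s <= e)

def plausibility(event):
    """Upper bound: total mass not contradicting the event."""
    e = frozenset(event)
    return sum(m for s, m in mass.items() if s & e)

# "Red" is unambiguous: belief and plausibility coincide at 1/3.
print(belief({"red"}), plausibility({"red"}))
# "Black" is ambiguous: its chance is only bounded, between 0 and 2/3.
print(belief({"black"}), plausibility({"black"}))
```

When every mass is attached to a single outcome, belief and plausibility collapse to one number and the calculus reduces to ordinary probability; the gap between the two bounds is what encodes ambiguity.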
The professor first approached Denoeux about it three years ago when they were both talking to PhD students.
“(Denoeux) went through all the theories of decision-making with belief functions. Then I told him, ‘Not everything you said is satisfactory.’ And he agreed with me! I said I would like to come and work with him on this. So he sent me an invitation.”
Shenoy applied for a sabbatical, then traveled to France in the spring of 2019, where he spent five months working with Denoeux at the Université de Technologie de Compiègne.
“It was culturally very enriching and professionally rewarding,” he said.
Now in his 43rd year at KU, Shenoy remains an expert in uncertain reasoning and its applications to artificial intelligence. He is the inventor of Valuation-Based Systems (VBS), a mathematical architecture for knowledge representation and inference that encompasses many uncertainty calculi. His VBS architecture is currently used for multi-sensor fusion in ballistic missile defense for the U.S. Department of Defense.
He hopes his latest research can benefit those who rely on belief functions.
“That includes a lot of people in the military, for example,” Shenoy said. “They like belief functions because of their flexibility, and they want to know how you make decisions. And if you’re going to reduce everything to probabilities in the end, why not use probabilities to start with?”