Friday, March 26, 2010

Probability Uncertainty

decision theory = probability theory + utility theory (utility encodes my preferences, e.g. how much time I'm willing to spend waiting for a flight)
- an agent is rational when it chooses the action with the maximum expected utility, taken over all possible results of its actions
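A minimal sketch of the maximum-expected-utility rule in Python; the actions, outcome probabilities, and utilities for the flight example are all made up for illustration:

# Pick the action with the maximum expected utility.
# All numbers below are hypothetical.
actions = {
    "leave_90_min_early":  [(0.95, 10), (0.05, -100)],   # short wait, small risk of missing flight
    "leave_180_min_early": [(0.999, 5), (0.001, -100)],  # long wait, almost no risk
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # the rational choice under these numbers: leave_180_min_early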

atomic event - a complete specification of the world the agent is uncertain about, i.e. an assignment of values to all random variables; one row of the truth table
- mutually exclusive: at most one atomic event is true
- exhaustive: at least one is true

disjunction rule (inclusion-exclusion):
P(a V b) = P(a) + P(b) - P(a ^ b)
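A quick counting check of the rule on a hypothetical single die roll, with a = "roll is even" and b = "roll > 4":

# Count outcomes of one fair die; exact integer arithmetic.
rolls = range(1, 7)
count = lambda event: sum(1 for r in rolls if event(r))
a = lambda r: r % 2 == 0   # even: {2, 4, 6}
b = lambda r: r > 4        # {5, 6}
# |a v b| = |a| + |b| - |a ^ b|  ->  4 = 3 + 2 - 1
assert count(lambda r: a(r) or b(r)) == count(a) + count(b) - count(lambda r: a(r) and b(r))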

Sum_a P(A = a) = 1 (summing over all values a of A)
P(True) = 1
P(False) = 0

event - an assignment of values to random variables

unconditional probability P(A), also called the prior or marginal: the probability that A holds in the absence of any other information

conditional probability P(effect|cause), also called the posterior: new information (the cause) can change the probability of the effect
eg P(a|b) = P(a ^ b) / P(b); the reciprocal of the denominator P(b) acts as the normalization constant alpha

product rule: P(a ^ b) = P(a|b)P(b) = P(b|a)P(a)
also P(a^b^c) = P(a^b|c)P(c) = P(a|b^c)P(b^c)
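A small numeric check of both the definition of conditional probability and the product rule, on a hypothetical joint over two booleans:

# Hypothetical joint distribution over (a, b); the four entries sum to 1.
P = {(True, True): 0.3, (True, False): 0.2,
     (False, True): 0.1, (False, False): 0.4}

P_b         = P[(True, True)] + P[(False, True)]   # P(b)   = 0.4
P_a_and_b   = P[(True, True)]                      # P(a^b) = 0.3
P_a_given_b = P_a_and_b / P_b                      # P(a|b) = 0.75
assert abs(P_a_given_b * P_b - P_a_and_b) < 1e-12  # product rule holds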

independence:
a and b are independent iff P(a^b) = P(a)P(b)
equivalently P(a|b) = P(a) (and P(b|a) = P(b))
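For instance, two independent fair coin flips (a hypothetical example):

# Two fair coins: by independence the joint is the product of the marginals.
p_a, p_b = 0.5, 0.5          # P(first is heads), P(second is heads)
p_a_and_b = p_a * p_b        # P(a^b) = 0.25
# conditioning on b tells us nothing about a: P(a|b) = P(a^b)/P(b) = P(a)
assert p_a_and_b / p_b == p_a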

objective probability - objectively observable/measurable, e.g. as a known frequency
subjective probability - based on one's own degree of belief
eg. the odds one would accept in a betting game

joint probability distribution - a table giving a probability for every atomic event
eg. P(pits in all spots, breeze in 3 places) in the wumpus world

General idea: compute the distribution on the query variable by fixing the evidence
variables and summing over the hidden variables (H):
P(Y | E=e) = alpha * Sum_h P(Y, E=e, H=h), where H = X - Y - E
eg. P(~cavity|toothache)
X = everything (Toothache, Cavity, Catch)
Y = query (Cavity)
E = evidence (Toothache)
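A runnable sketch of inference by enumeration, using the standard dentist joint table over (Toothache, Catch, Cavity); the entries are the usual textbook numbers and should be treated as illustrative:

# Full joint distribution, keyed by (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def enumerate_query(joint, query_index, evidence):
    # evidence maps a variable index to its observed value; everything
    # that is neither query nor evidence is hidden and gets summed out.
    dist = {True: 0.0, False: 0.0}
    for world, p in joint.items():
        if all(world[i] == v for i, v in evidence.items()):
            dist[world[query_index]] += p
    alpha = 1 / sum(dist.values())   # normalization constant
    return {v: alpha * p for v, p in dist.items()}

# P(Cavity | toothache): query index 2, evidence Toothache=True
print(enumerate_query(joint, query_index=2, evidence={0: True}))
# -> {True: 0.6, False: 0.4}, i.e. P(~cavity|toothache) = 0.4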

problems: time and space complexity exponential in the number of variables, and how to fill in the numbers for all the entries

conditionally independent:
A is conditionally independent of B given C when:
P(A|B,C) = P(A|C) # B can be dropped from the evidence
P(B|A,C) = P(B|C) # A can be dropped from the evidence
P(A,B|C) = P(A|C) * P(B|C) # A and B can't be dropped here because they are in the query, not the evidence
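In the dentist example, Toothache and Catch are conditionally independent given Cavity; a quick check with the same textbook joint table (repeated so the snippet runs on its own):

# Joint distribution over (toothache, catch, cavity), as above.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(pred):   # total probability of the worlds satisfying pred
    return sum(p for w, p in joint.items() if pred(*w))

# P(catch | toothache, cavity) should equal P(catch | cavity); both are 0.9
lhs = prob(lambda t, c, cav: c and t and cav) / prob(lambda t, c, cav: t and cav)
rhs = prob(lambda t, c, cav: c and cav) / prob(lambda t, c, cav: cav)
assert abs(lhs - rhs) < 1e-9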

Bayes' rule: used for diagnosis; derived from the product rule P(a,b) = P(a|b)P(b) = P(b|a)P(a)
So P(a|b) = P(b|a)P(a)/P(b) = alpha * P(b|a)P(a)
P(cause|effect) = P(effect|cause)P(cause)/P(effect)
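A worked diagnostic example (the numbers are hypothetical, in the style of the classic meningitis/stiff-neck illustration):

# P(cause|effect) = P(effect|cause) P(cause) / P(effect)
p_effect_given_cause = 0.7     # symptom is common given the disease
p_cause = 1 / 50000            # but the disease itself is rare (the prior)
p_effect = 0.01                # symptom's overall frequency

p_cause_given_effect = p_effect_given_cause * p_cause / p_effect
print(p_cause_given_effect)    # 0.0014: still small despite the strong link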

naive Bayes - Bayes' rule plus conditional independence of the effects given the cause:
P(Cause, Effect_1, ..., Effect_n) = P(Cause) * Prod_i P(Effect_i|Cause)
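A minimal naive Bayes sketch built on this factorization; the priors and conditional probabilities below are illustrative (they happen to match the dentist table used earlier):

# P(Cause | effects) is proportional to P(Cause) * prod_i P(effect_i | Cause).
p_cause = {"cavity": 0.2, "no_cavity": 0.8}   # prior P(Cause)
p_effect_given_cause = {                      # P(effect_i = True | Cause)
    "toothache": {"cavity": 0.6, "no_cavity": 0.1},
    "catch":     {"cavity": 0.9, "no_cavity": 0.2},
}

def naive_bayes(observed):   # observed: effect name -> True/False
    scores = {}
    for cause, prior in p_cause.items():
        score = prior
        for effect, value in observed.items():
            p_true = p_effect_given_cause[effect][cause]
            score *= p_true if value else (1 - p_true)
        scores[cause] = score
    alpha = 1 / sum(scores.values())          # normalize
    return {c: alpha * s for c, s in scores.items()}

print(naive_bayes({"toothache": True, "catch": True}))
# -> cavity is about 0.87 once both effects are observed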

Probability is a rigorous formalism for uncertain knowledge
• The joint probability distribution specifies the probability of every atomic event
• Queries can be answered by summing over atomic events
• For nontrivial domains, we must find a way to reduce the size of the joint
• Independence and conditional independence provide the tools


inference by enumeration using conditional independence
