[Figure: assigning the prior probability of Sam having lung cancer]
A probability like P(LungCancer = present) is called a prior probability because, in a particular model, it is the probability of some event prior to updating the probability of that event, within the framework of that model, using new information. Do not mistakenly think it means a probability prior to any information. A probability like P(LungCancer = present|Test = positive) is called a posterior probability because it is the probability of an event after its prior probability has been updated, within the framework of some model, based on new information. The following example illustrates how prior probabilities can change depending on the situation we are modeling.
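To make the prior-to-posterior update concrete, here is a minimal sketch in Python of Bayes' theorem applied to the lung cancer test; the prior, sensitivity, and false-positive rate are illustrative assumptions rather than values from the text.

```python
# Bayes' theorem update of a prior probability to a posterior probability.
# The prior, sensitivity, and false-positive rate below are illustrative
# assumptions, not values taken from the text.
prior = 0.001          # P(LungCancer = present), before seeing the test result
sensitivity = 0.9      # P(Test = positive | LungCancer = present), assumed
false_positive = 0.05  # P(Test = positive | LungCancer = absent), assumed

# P(Test = positive), by the law of total probability
p_positive = sensitivity * prior + false_positive * (1 - prior)

# P(LungCancer = present | Test = positive): the prior updated by the evidence
posterior = sensitivity * prior / p_positive
print(posterior)  # roughly 0.018, up from the prior of 0.001
```

The posterior is still small, but it is many times larger than the prior, which is exactly the sense in which the new information updates the probability within the model.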
1.3.1 The Difficulties Inherent in Large Instances
Recall the situation, discussed at the beginning of this chapter, where several features (variables) are related through inference chains. We introduced the following example of this situation: Whether or not an individual has a history of smoking has a direct influence both on whether or not that individual has bronchitis and on whether or not that individual has lung cancer. In turn, the presence or absence of each of these features has a direct influence on whether or not the individual experiences fatigue, and the presence or absence of lung cancer has a direct influence on whether or not a chest X-ray is positive. The following table shows random variables representing these features and their values:
| Variable | Value | When the Variable Takes this Value |
|---|---|---|
| H | h1 | There is a history of smoking |
| H | h2 | There is no history of smoking |
| B | b1 | Bronchitis is present |
| B | b2 | Bronchitis is absent |
| L | l1 | Lung cancer is present |
| L | l2 | Lung cancer is absent |
| F | f1 | Fatigue is present |
| F | f2 | Fatigue is absent |
| C | c1 | Chest X-ray is positive |
| C | c2 | Chest X-ray is negative |
Note that we presented this same table at the beginning of this chapter, but we called the random variables ‘features’. We had not yet defined random variables at that point.
an exponential number of terms in the sums in Equality 1.5. That is, there are 2^2 terms in the sum in the denominator, and, if there were 100 variables in the application, there would be 2^97 terms in that sum. So, in the case of a large instance, even if we had some means for eliciting the values in the
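To illustrate the blow-up described above, the following sketch computes a conditional probability by brute-force summation over a full joint distribution of n binary variables; the distribution is randomly generated and purely illustrative, not the one defined by Equality 1.5.

```python
from itertools import product
import random

random.seed(0)

n = 5  # number of binary variables, as in the running example

# A made-up joint distribution P(X1, ..., Xn) stored explicitly: 2^n entries.
weights = {x: random.random() for x in product((0, 1), repeat=n)}
total = sum(weights.values())
joint = {x: w / total for x, w in weights.items()}

# P(X1 = 1 | X2 = 1) computed directly from the joint distribution:
# both sums range over all assignments to the remaining variables.
numerator = sum(p for x, p in joint.items() if x[0] == 1 and x[1] == 1)  # 2^(n-2) terms
denominator = sum(p for x, p in joint.items() if x[1] == 1)              # 2^(n-1) terms
print(numerator / denominator)

# Storage and computation both scale exponentially:
# 2^5 = 32 joint values here, but 2^100 if the application had 100 variables.
print(len(joint))
```

Even in this toy case the joint distribution must enumerate every combination of values, which is why eliciting, storing, and summing over it becomes infeasible for large instances.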