Revision as of 03:57, 9 October 2018
Dry Lab
Model
Acknowledgement: CPU China. This part was made by Team CPU China; we thank them for their collaboration!
We use Bayesian statistics to predict which type of mutation is most likely to produce MHC strong-binding peptides, using the sum of binding affinities over each mutation site and each allele type. Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability, where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after a large number of trials.
Bayes' theorem is a fundamental theorem in Bayesian statistics, as it is used by Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events A and B, the conditional probability of A given that B is true is expressed as follows:
<p style="text-align:center"><img src="https://static.igem.org/mediawiki/2018/8/82/T--Tongji_China--picture-drylab-model-1.png" width="30%" height="30%"></p>
Although Bayes' theorem is a fundamental result of probability theory, it has a specific interpretation in Bayesian statistics. In the above equation, A usually represents a proposition (such as the statement that a coin lands on heads fifty percent of the time) and B represents the evidence, or new data that is to be considered (such as the result of a series of coin flips). P(A) is the prior probability of A, which expresses one's beliefs about A before the evidence is considered. The prior probability may also quantify prior knowledge or information about A. P(B|A) is the likelihood function, which can be interpreted as the probability of the evidence B given that A is true. The likelihood quantifies the extent to which the evidence B supports the proposition A. P(A|B) is the posterior probability, the probability of the proposition A after taking the evidence B into account. Essentially, Bayes' theorem updates one's prior beliefs P(A) after considering the new evidence B.
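The coin-flip example above can be sketched in a few lines of Python. This is an illustrative toy, not the project's actual model: the two candidate propositions (a fair coin and a hypothetical 0.8-biased coin), the priors, and the flip sequence are all assumptions chosen for the example.

```python
# Toy Bayesian update for the coin-flip example.
# Propositions: A1 = "coin is fair (P(heads) = 0.5)",
#               A2 = "coin is biased (P(heads) = 0.8)"  (hypothetical values).
# Evidence B: an observed sequence of flips.

def likelihood(p_heads, flips):
    """P(B|A): probability of the observed flip sequence given a heads rate."""
    prob = 1.0
    for f in flips:
        prob *= p_heads if f == "H" else (1.0 - p_heads)
    return prob

def posterior(priors, rates, flips):
    """Bayes' theorem: P(Ai|B) = P(B|Ai) * P(Ai) / P(B)."""
    joint = [likelihood(r, flips) * p for r, p in zip(rates, priors)]
    p_b = sum(joint)  # P(B) via the law of total probability
    return [j / p_b for j in joint]

priors = [0.5, 0.5]                 # equal prior belief in each proposition
rates = [0.5, 0.8]                  # heads rate under each proposition
flips = ["H", "H", "H", "T", "H"]   # the new evidence B
post = posterior(priors, rates, flips)
```

Seeing four heads in five flips shifts belief toward the biased-coin proposition, even though both started with equal priors.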
The probability of the evidence P(B) can be calculated using the law of total probability. If {A1, A2, …, An} is a partition of the sample space, which is the set of all outcomes of an experiment, then P(B) = P(B|A1)P(A1) + P(B|A2)P(A2) + … + P(B|An)P(An).
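A quick numeric check of the law of total probability, with made-up priors and likelihoods for a three-element partition (the numbers are illustrative only):

```python
# Hypothetical partition {A1, A2, A3} with assumed priors and likelihoods.
priors = [0.2, 0.5, 0.3]       # P(A1), P(A2), P(A3); must sum to 1
likelihoods = [0.9, 0.4, 0.1]  # P(B|A1), P(B|A2), P(B|A3)

# Law of total probability: P(B) = sum over i of P(B|Ai) * P(Ai)
p_b = sum(l * p for l, p in zip(likelihoods, priors))
```

Here P(B) = 0.9·0.2 + 0.4·0.5 + 0.1·0.3 = 0.41, the denominator Bayes' theorem would then divide by.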