Beta-binomial distribution



[Infobox: Beta-binomial distribution — plots of the probability mass function and of the cumulative distribution function; the summary table of parameters, mass function, distribution function, expectation, variance, skewness, standardized kurtosis (see description), moment-generating function and characteristic function is not reproduced here.]

In probability theory, the beta-binomial distribution is a discrete probability distribution with finite support, corresponding to a process of Bernoulli draws whose probability of success is random and follows a beta distribution. It is frequently used in Bayesian inference.

The Bernoulli distribution is the special case obtained for n = 1. For α = β = 1, it reduces to the discrete uniform distribution on {0, ..., n}. It approaches the binomial distribution when the parameters α and β become arbitrarily large. The beta-binomial distribution is the one-dimensional version of the multivariate Pólya distribution, just as the binomial and beta distributions are special cases of the multinomial and Dirichlet distributions, respectively.

The beta-binomial distribution as a compound distribution

The beta distribution is the conjugate distribution of the binomial distribution. The beta-binomial distribution results from an analytic computation of a compound distribution in which the parameter p of the binomial distribution is random and given by a beta distribution. More precisely, if

L(k \mid p) = \binom{n}{k} p^{k} (1-p)^{n-k}

is the binomial distribution, where

p

is a random variable with a beta distribution,

\pi(p \mid \alpha, \beta) = \frac{p^{\alpha-1}(1-p)^{\beta-1}}{\mathrm{B}(\alpha, \beta)},

then the compound distribution is given by

f(k \mid n, \alpha, \beta) = \int_{0}^{1} \binom{n}{k} p^{k}(1-p)^{n-k}\, \frac{p^{\alpha-1}(1-p)^{\beta-1}}{\mathrm{B}(\alpha, \beta)}\, \mathrm{d}p.

Using the properties of the beta function, this can be written as

f(k \mid n, \alpha, \beta) = \binom{n}{k}\, \frac{\mathrm{B}(k+\alpha,\, n-k+\beta)}{\mathrm{B}(\alpha, \beta)}.

In this context, the beta-binomial distribution often appears in Bayesian inference: it is the predictive distribution of a binomial random variable whose probability of success is given by a beta distribution.
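As a sanity check, the identity above can be verified numerically. The following Python sketch (not part of the original article; the parameter values n = 10, a = 2, b = 3 are arbitrary) integrates the binomial mass function against the beta density and compares the result with the closed form implemented in scipy.stats.betabinom:

import numpy as np
from scipy import stats
from scipy.integrate import quad

n, a, b = 10, 2.0, 3.0

for k in range(n + 1):
    # Compound form: average the binomial pmf over the Beta(a, b) distribution of p.
    integrand = lambda p, k=k: stats.binom.pmf(k, n, p) * stats.beta.pdf(p, a, b)
    compound, _ = quad(integrand, 0.0, 1.0)
    # Closed form via the beta function, as implemented by scipy.stats.betabinom.
    closed_form = stats.betabinom.pmf(k, n, a, b)
    assert np.isclose(compound, closed_form)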

The beta-binomial distribution in an urn model

The beta-binomial distribution can also be represented by an urn model, for positive integer parameters α and β. More precisely, consider an urn containing α red balls and β black balls, from which random draws are made. If a red ball is drawn, it is returned to the urn together with an additional red ball. Likewise, if a black ball is drawn, it is returned together with an additional black ball. If this operation is repeated n times, then the probability of having drawn k red balls follows a beta-binomial distribution with parameters n, α and β.

Note that if only the drawn ball itself is returned after each draw, the distribution is binomial, and if the draws are made without replacement, the distribution is hypergeometric.
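The urn scheme can be simulated directly. The following Python sketch (illustrative only; the parameter values and number of runs are arbitrary) compares the empirical frequencies of red draws with the beta-binomial probabilities:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, alpha, beta = 8, 3, 2          # n draws; alpha red and beta black balls initially
n_runs = 100_000

counts = np.zeros(n + 1)
for _ in range(n_runs):
    red, black = alpha, beta
    k = 0
    for _ in range(n):
        if rng.random() < red / (red + black):
            red += 1              # drawn ball returned plus one extra red ball
            k += 1
        else:
            black += 1            # drawn ball returned plus one extra black ball
    counts[k] += 1

empirical = counts / n_runs
theoretical = stats.betabinom.pmf(np.arange(n + 1), n, alpha, beta)
print(np.round(empirical, 3))
print(np.round(theoretical, 3))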

Moments and properties

The first three (raw) moments are

\mu_1 = \frac{n\alpha}{\alpha+\beta},

\mu_2 = \frac{n\alpha\left[n(1+\alpha)+\beta\right]}{(\alpha+\beta)(1+\alpha+\beta)},

\mu_3 = \frac{n\alpha\left[n^{2}(1+\alpha)(2+\alpha) + 3n(1+\alpha)\beta + \beta(\beta-\alpha)\right]}{(\alpha+\beta)(1+\alpha+\beta)(2+\alpha+\beta)},

and the kurtosis is

\beta_2 = \frac{(\alpha+\beta)^{2}(1+\alpha+\beta)}{n\alpha\beta\,(\alpha+\beta+2)(\alpha+\beta+3)(\alpha+\beta+n)} \left[(\alpha+\beta)(\alpha+\beta-1+6n) + 3\alpha\beta(n-2) + 6n^{2} - \frac{3\alpha\beta n(6-n)}{\alpha+\beta} - \frac{18\alpha\beta n^{2}}{(\alpha+\beta)^{2}}\right].

Setting

\pi = \frac{\alpha}{\alpha+\beta},

the mean can be written in the form

\mu = \frac{n\alpha}{\alpha+\beta} = n\pi

and the variance as

\sigma^{2} = n\pi(1-\pi)\left[1 + (n-1)\rho\right],

where

\rho = \frac{1}{\alpha+\beta+1}

is the pairwise correlation between the n Bernoulli draws; it is called the over-dispersion parameter.
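The mean and the over-dispersion form of the variance can be checked numerically. The sketch below (illustrative only, with arbitrary parameter values) compares the formulas with the moments returned by scipy.stats.betabinom:

from scipy import stats

n, alpha, beta = 12, 34.0, 31.0   # arbitrary illustrative values
pi = alpha / (alpha + beta)
rho = 1.0 / (alpha + beta + 1.0)

mean_formula = n * pi                                  # mu = n * pi
var_formula = n * pi * (1 - pi) * (1 + (n - 1) * rho)  # over-dispersed binomial variance

mean_scipy, var_scipy = stats.betabinom.stats(n, alpha, beta, moments="mv")
print(mean_formula, float(mean_scipy))
print(var_formula, float(var_scipy))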

Method of moments

An estimate by the method of moments can be obtained from the first and second moments of the beta-binomial distribution, that is,

\mu_1 = \frac{n\alpha}{\alpha+\beta}, \qquad \mu_2 = \frac{n\alpha\left[n(1+\alpha)+\beta\right]}{(\alpha+\beta)(1+\alpha+\beta)},

by setting them equal to the empirical moments

\mu_1 = m_1, \qquad \mu_2 = m_2.

Solving for α and β gives

\hat{\alpha} = \frac{n m_1 - m_2}{n\left(\frac{m_2}{m_1} - m_1 - 1\right) + m_1}, \qquad \hat{\beta} = \frac{(n - m_1)\left(n - \frac{m_2}{m_1}\right)}{n\left(\frac{m_2}{m_1} - m_1 - 1\right) + m_1}.
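Written as code, the estimators above take the following form (an illustrative Python sketch; the function name beta_binomial_mom is chosen here for the example and is not a standard library routine):

def beta_binomial_mom(m1, m2, n):
    """Method-of-moments estimates (alpha_hat, beta_hat) from the first two
    empirical moments m1 and m2 of a beta-binomial sample with known n."""
    denom = n * (m2 / m1 - m1 - 1) + m1
    alpha_hat = (n * m1 - m2) / denom
    beta_hat = (n - m1) * (n - m2 / m1) / denom
    return alpha_hat, beta_hat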

Maximum likelihood

Although the maximum likelihood estimates have no closed form, given that the probability mass function is built from standard functions (the gamma and/or beta functions), they can easily be computed by direct numerical optimization. Maximum likelihood estimates from empirical data can also be obtained using the general methods for fitting multivariate Pólya distributions described in (Minka 2003).
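A minimal sketch of such a direct numerical optimization is given below (illustrative only; `data` is a hypothetical array of counts between 0 and n, and the optimizer, starting point and log-parameterization are arbitrary choices here, not the method of Minka 2003):

import numpy as np
from scipy import stats
from scipy.optimize import minimize

def fit_beta_binomial_mle(data, n):
    """Return ((alpha_hat, beta_hat), max_log_likelihood) by direct numerical
    optimization; `data` is an array of observed counts between 0 and n."""
    def neg_log_likelihood(log_params):
        alpha, beta = np.exp(log_params)      # exponentiate to keep parameters positive
        return -np.sum(stats.betabinom.logpmf(data, n, alpha, beta))
    res = minimize(neg_log_likelihood, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
    alpha_hat, beta_hat = np.exp(res.x)
    return (alpha_hat, beta_hat), -res.fun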

Example

The following data give the number of boys among the first 12 children in families of 13 children, for 6,115 families taken from hospital records in 19th-century Saxony (Sokal and Rohlf, p. 59 of Lindsey). The 13th child is ignored in order to allow for the non-random fact that families stop when they obtain the desired sex.

Boys        0     1     2     3     4     5     6     7     8     9    10    11    12
Families    3    24   104   286   670  1033  1343  1112   829   478   181    45     7

The first two empirical moments are

m_1 \approx 6.23, \qquad m_2 \approx 42.31, \qquad n = 12,

and the estimates obtained by the method of moments are therefore

\hat{\alpha} \approx 34.1350, \qquad \hat{\beta} \approx 31.6085.

The maximum likelihood estimates, and the corresponding maximized log-likelihood, can be found numerically as described in the previous section.
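For illustration, the empirical moments and the method-of-moments estimates above can be reproduced from the table with a short Python sketch (not part of the original article):

import numpy as np

boys = np.arange(13)              # 0 to 12 boys among the first 12 children
families = np.array([3, 24, 104, 286, 670, 1033, 1343, 1112, 829, 478, 181, 45, 7])
n = 12

N = families.sum()                            # 6115 families
m1 = np.sum(boys * families) / N              # first empirical moment, about 6.23
m2 = np.sum(boys ** 2 * families) / N         # second empirical moment, about 42.31

denom = n * (m2 / m1 - m1 - 1) + m1
alpha_hat = (n * m1 - m2) / denom             # about 34.135
beta_hat = (n - m1) * (n - m2 / m1) / denom   # about 31.608
print(m1, m2, alpha_hat, beta_hat)

# The maximum likelihood estimates can then be obtained by numerically maximizing
# the log-likelihood, e.g. with a routine like fit_beta_binomial_mle sketched above.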

* Minka, Thomas P. (2003). Estimating a Dirichlet distribution. Microsoft Technical Report.
