- 2020-08-17 22:52
- mathematics
- statistical analysis

Recently I have been brushing up on some probability and statistics, and I took some notes along the way.

Basic probability model

Here are three concepts: classical probability, the frequentist school, and the Bayesian school.

* Classical probability

In this model, a random experiment has finitely many possible outcomes, and each elementary outcome is equally likely.

For example: toss a fair coin. There are only two outcomes (assuming the coin never lands on its edge), heads and tails, so the probability of heads is 0.5. This is computed under the classical probability model.
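The classical model reduces to counting favourable outcomes over all equally likely outcomes. A minimal sketch (the coin example, using exact fractions):

```python
from fractions import Fraction

# Classical model: finitely many equally likely outcomes, so
# P(event) = (favourable outcomes) / (all outcomes).
outcomes = ["heads", "tails"]
favourable = ["heads"]
p_heads = Fraction(len(favourable), len(outcomes))
print(p_heads)  # 1/2
```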

* Frequentist school

The parameter to be estimated is regarded as an unknown constant. By running many trials and counting the ratio of the number of occurrences of the event to the total number of trials, we obtain an estimate of the parameter.

For example: estimate the probability of heads when tossing a fair coin. We run 1000 trials and observe 498 heads, so the estimated probability of heads is 0.498.
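The frequentist estimate above can be simulated directly. A small sketch (the seed and trial count are arbitrary choices for reproducibility):

```python
import random

random.seed(42)  # arbitrary seed so the run is reproducible

# Frequentist estimation: toss a fair coin many times and use the
# relative frequency of heads as the estimate of P(heads).
n_trials = 1000
heads = sum(random.random() < 0.5 for _ in range(n_trials))
estimate = heads / n_trials
print(estimate)  # close to 0.5
```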

* Bayesian school

The parameter to be estimated is not a fixed constant but a random variable (following some distribution). Based on common sense or other objective facts, we can form a prior estimate of its distribution (a belief), then adjust the distribution according to the experiment, and finally obtain the posterior distribution of the random variable.

This idea fixes the bias problem the frequentist approach suffers from when there are too few trials. For example, toss a fair coin 5 times and suppose all 5 come up heads. By the frequentist view, the probability of heads would be

P(heads) = 5/5 = 1,

which clearly defies common sense.

Now define the events A = (a single coin toss comes up heads) and B = (5 tosses, all 5 heads). In the Bayesian framework, common sense suggests the probability of heads is around 0.5, so we can assume a prior following the distribution Beta(10,10). Then, by Bayes' theorem

P(A|B) = P(A)P(B|A) / P(B),

the distribution of the heads probability given that B occurred is Beta(15,10), whose expected value is 0.6. Through the Bayesian framework, the computed probability of heads stays close to 0.5, more in line with common sense. (The detailed calculation of the Beta distribution and the posterior probability will be covered in later chapters.)
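The Beta posterior update in this example can be done with plain arithmetic, using the Beta-Binomial conjugacy stated above (prior Beta(a,b) plus h heads and t tails gives posterior Beta(a+h, b+t)):

```python
# Conjugate update: Beta(10,10) prior, data = 5 heads and 0 tails
# in 5 tosses, so the posterior is Beta(10+5, 10+0) = Beta(15, 10).
prior_a, prior_b = 10, 10
heads, tails = 5, 0

post_a = prior_a + heads
post_b = prior_b + tails

# Mean of Beta(a, b) is a / (a + b).
posterior_mean = post_a / (post_a + post_b)
print(post_a, post_b, posterior_mean)  # 15 10 0.6
```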

(Figure: the prior distribution Beta(10,10), in blue, and the posterior distribution Beta(15,10), in green.)

Conditional probability and mutual independence

Conditional probability: if P(B) > 0, then P(A|B) = P(AB) / P(B), read as the probability that A occurs given that B has occurred.

If P(A|B) = P(A), then A and B are mutually independent, and P(A∩B) = P(A)P(B).
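Both definitions can be checked on a small finite sample space. A sketch with two hypothetical events on a fair six-sided die (the events A and B are chosen for illustration):

```python
from fractions import Fraction

# Sample space of a fair die; A = "roll is even", B = "roll <= 4".
outcomes = set(range(1, 7))
A = {x for x in outcomes if x % 2 == 0}   # {2, 4, 6}
B = {x for x in outcomes if x <= 4}       # {1, 2, 3, 4}

def p(event):
    return Fraction(len(event), len(outcomes))

p_A, p_B, p_AB = p(A), p(B), p(A & B)

# Conditional probability: P(A|B) = P(AB) / P(B)
p_A_given_B = p_AB / p_B

# Independence: P(A|B) = P(A), equivalently P(A∩B) = P(A)P(B)
print(p_A_given_B == p_A, p_AB == p_A * p_B)  # True True
```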

Bayes theorem

Discrete form

P(Bi|A) = P(Bi)P(A|Bi) / Σ(j=1 to n) P(Bj)P(A|Bj) = P(A∩Bi) / P(A)
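The discrete formula is a direct computation once the priors and likelihoods are listed. A sketch with made-up numbers (two hypothetical urns B1, B2 and A = "drew a red ball"; all values are assumptions for illustration):

```python
# Discrete Bayes: P(Bi|A) = P(Bi)P(A|Bi) / sum_j P(Bj)P(A|Bj)
prior = [0.5, 0.5]        # P(B1), P(B2): urn chosen by fair coin
likelihood = [0.8, 0.4]   # P(A|B1), P(A|B2): red-ball fractions

# Denominator (total probability of A) and posterior in one pass.
evidence = sum(p * l for p, l in zip(prior, likelihood))
posterior = [p * l / evidence for p, l in zip(prior, likelihood)]
print(posterior)  # roughly [2/3, 1/3]
```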

Continuous form

f(θ|y) = f(y|θ)f(θ) / f(y) = f(y|θ)f(θ) / ∫ f(y|θ)f(θ) dθ = (likelihood × prior) / normalizing constant ∝ likelihood × prior

Common distributions of univariate random variables

* Bernoulli distribution (0-1 distribution)

The probability distribution is

P(X=0) = 1−p,  P(X=1) = p

Expectation E(X) = p

Variance Var(X) = p(1−p)
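The expectation and variance follow from the two-point distribution directly. A quick numeric check (p = 0.3 is an arbitrary choice):

```python
# Bernoulli(p): P(X=0) = 1-p, P(X=1) = p.
p = 0.3

# E(X) = sum of x * P(X=x) over x in {0, 1}
mean = 0 * (1 - p) + 1 * p

# Var(X) = sum of (x - E(X))^2 * P(X=x)
var = (0 - mean) ** 2 * (1 - p) + (1 - mean) ** 2 * p
print(mean, var)  # E(X) = p, Var(X) = p(1-p) = 0.21
```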

* Binomial distribution

Perform n independent Bernoulli trials. The probability that the event occurs k times in the n trials is

P(X=k) = C(n,k) p^k (1−p)^(n−k)

Expectation E(X) = np

Variance Var(X) = np(1−p)
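The pmf can be written straight from the formula and checked against E(X) = np (the values n = 10, p = 0.5 are arbitrary test inputs):

```python
from math import comb

# Binomial pmf: P(X=k) = C(n,k) * p^k * (1-p)^(n-k)
def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 10, 0.5
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

# The pmf sums to 1, and its mean matches E(X) = n*p.
mean = sum(k * q for k, q in zip(range(n + 1), pmf))
print(sum(pmf), mean)  # 1.0 and n*p = 5.0
```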

* Uniform distribution

The probability density function of the uniform distribution on the interval (a, b) is

f(x) = { 1/(b−a), a < x < b;  0, otherwise }

Expectation E(X) = (a+b)/2

Variance Var(X) = (b−a)^2 / 12
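These two moments can be sanity-checked by Monte Carlo sampling (the interval (2, 5), seed, and sample size are arbitrary choices):

```python
import random

random.seed(0)  # arbitrary seed for reproducibility
a, b = 2.0, 5.0
n = 100_000

# Sample Uniform(a, b) and compare empirical mean/variance with
# E(X) = (a+b)/2 = 3.5 and Var(X) = (b-a)^2/12 = 0.75.
xs = [random.uniform(a, b) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)  # close to 3.5 and 0.75
```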

* Exponential distribution

The probability density of the exponential distribution with parameter λ is

f(x) = { λe^(−λx), x > 0;  0, x ≤ 0 }

Expectation E(X) = 1/λ

Variance Var(X) = 1/λ^2
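The exponential distribution is easy to sample by inverse-transform: if U ~ Uniform(0,1), then X = −ln(U)/λ is Exponential(λ). A sketch checking E(X) = 1/λ (λ = 2 and the seed are arbitrary):

```python
import math
import random

random.seed(1)  # arbitrary seed for reproducibility
lam = 2.0
n = 100_000

# Inverse-transform sampling: 1 - random() lies in (0, 1], so the
# logarithm is always defined; X = -ln(U)/lam ~ Exponential(lam).
xs = [-math.log(1 - random.random()) / lam for _ in range(n)]
mean = sum(xs) / n
print(mean)  # close to 1/lam = 0.5
```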

* Normal distribution

The probability density of the normal distribution with mean μ and standard deviation σ is

f(x) = (1 / (√(2π) σ)) e^(−(x−μ)^2 / (2σ^2))

Expectation E(X) = μ

Variance Var(X) = σ^2
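The density formula can be coded directly and checked numerically: a valid pdf must integrate to 1. A sketch using a simple Riemann sum over a wide range (the standard normal μ = 0, σ = 1 and the grid are arbitrary choices):

```python
import math

# Normal pdf: f(x) = 1/(sqrt(2*pi)*sigma) * exp(-(x-mu)^2/(2*sigma^2))
def normal_pdf(x, mu, sigma):
    coef = 1.0 / (math.sqrt(2 * math.pi) * sigma)
    return coef * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# Riemann sum of the standard normal density over [-8, 8]; the
# result should be very close to 1, since the tails are negligible.
mu, sigma, step = 0.0, 1.0, 0.001
total = sum(normal_pdf(i * step, mu, sigma) * step
            for i in range(-8000, 8000))
print(total)  # close to 1.0
```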
