- 2020-08-18 07:03
- machine learning
- Python Data analysis

Bayes' formula: start from the prior probability of an event of interest; when new information arrives, update the probability of the event accordingly. Consider a classic example:

A hit-and-run accident happens in a city where cars come in only two colors: 15% are blue and 85% are green. A witness at the scene says the car was blue. However, expert analysis of the conditions at the time shows that a witness would identify the color correctly only 80% of the time. So, what is the probability that the car that caused the accident was actually blue?

Let B be the event that the car is blue, G the event that it is green, and E the event that the witness observes the car to be blue. From the given conditions, P(B)=0.15 and P(G)=P(~B)=0.85; we will come to P(E) shortly.

First, suppose no witness saw the car at all. Then we can only guess blindly: the probability that the driver's car is blue is simply the proportion of blue cars in the whole city, i.e. the prior probability P(B)=0.15. With no other evidence to draw on, a rough estimate is all we have.

Next, a witness appears. The witness says he saw the car, and that it was blue. Note carefully: there are two cases (this bears repeating):

The evidence E (the witness reporting "blue") can always arise in two ways. First, the car really is blue and the witness correctly identifies it as blue, with probability P(E,B) = P(B) × P(E|B) = 0.15 × 0.8 = 0.12. Second, the car is green but merely looks blue to the witness, with probability P(E,~B) = P(~B) × P(E|~B) = P(~B) × (1 − P(~E|~B)) = 0.85 × (1 − 0.8) = 0.17. Therefore P(E) = P(E,B) + P(E,~B) = 0.12 + 0.17 = 0.29.

Then, what we want is the probability that the car is blue given the witness's report, that is, P(B|E) = P(E,B) / P(E) = 0.12 / 0.29 ≈ 0.41.
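The calculation above is short enough to check directly. The following sketch recomputes the posterior from the numbers given in the text (variable names are my own):

```python
# Bayes update for the blue-car problem, using the values from the text.
p_blue = 0.15     # prior P(B): fraction of blue cars in the city
p_correct = 0.8   # probability the witness identifies the color correctly

# Law of total probability: P(E) = P(B)P(E|B) + P(~B)P(E|~B)
p_e = p_blue * p_correct + (1 - p_blue) * (1 - p_correct)  # 0.12 + 0.17 = 0.29

# Bayes' rule: P(B|E) = P(B)P(E|B) / P(E)
p_blue_given_e = p_blue * p_correct / p_e
print(round(p_e, 2), round(p_blue_given_e, 2))  # 0.29 0.41
```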

You see, P(B|E) is an enhanced version of P(B): the conditional probability describes the same underlying event as the prior, just refined by evidence. This leads to another conclusion: when new evidence appears, P(B|E) takes over the role of the original P(B). In other words, suppose the police now find a second witness who also says the car was blue. In this new round of Bayesian calculation the base probability is P(B)=0.41, not the original 0.15. Working it out, the new posterior is P(B|E) ≈ 0.74. In other words, with two people reporting that the car was blue rather than one, the probability that the car really is blue increases substantially.
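This "posterior becomes the new prior" pattern generalizes to any number of witnesses. A minimal sketch, assuming each witness independently reports "blue" with the same 80% accuracy (the function name is my own):

```python
def bayes_update(prior, p_correct):
    """Posterior P(blue | witness says blue), given the current prior
    and the witness's accuracy, via Bayes' rule."""
    evidence = prior * p_correct + (1 - prior) * (1 - p_correct)
    return prior * p_correct / evidence

# Chain two updates: each posterior becomes the prior for the next witness.
p = 0.15
for witness in (1, 2):
    p = bayes_update(p, 0.8)
    print(f"after witness {witness}: {round(p, 2)}")  # 0.41, then 0.74
```

Note that chaining the exact (unrounded) posterior gives ≈0.74 after the second witness; the small discrepancy with a hand calculation comes from rounding 0.4138 to 0.41 midway.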
