*
Inductive learning with a quantitative dependent variable is called regression, or continuous variable prediction.

*
Inductive learning with a qualitative dependent variable is called classification, or discrete variable prediction.
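The distinction is visible directly in scikit-learn: the same inputs can feed a regressor or a classifier depending on whether the target is continuous or discrete. A minimal illustrative sketch (the data here is made up):

```python
from sklearn.linear_model import LinearRegression
from sklearn.naive_bayes import GaussianNB

X = [[1], [2], [3], [4]]

# quantitative (continuous) target -> regression
reg = LinearRegression().fit(X, [1.1, 1.9, 3.2, 3.9])

# qualitative (discrete) target -> classification
clf = GaussianNB().fit(X, [0, 0, 1, 1])

print(reg.predict([[5]]))  # a continuous value
print(clf.predict([[5]]))  # a discrete class label
```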
      P(A|B) = P(B|A)P(A) / P(B)
  P(A) is called the prior probability of event A: in general terms, the probability that A occurs.
  P(B|A) is called the likelihood: the probability that B occurs assuming A holds.
  P(A|B) is called the posterior probability: the probability of A given that B has occurred, i.e., the quantity to be computed.
  P(B) is called the normalizing constant: analogous to the prior, it is, in general terms, the probability that B occurs.
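A small numeric check of the formula, using made-up illustrative numbers (say, 1% of emails are spam, 90% of spam contains a trigger word, and 10% of non-spam does too):

```python
p_a = 0.01           # prior P(A): the email is spam
p_b_given_a = 0.9    # likelihood P(B|A): trigger word appears given spam
p_b_given_not_a = 0.1

# normalizing constant P(B) via the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# posterior P(A|B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.0833
```

Even with a 90% likelihood, the low prior keeps the posterior small; this is exactly the prior-times-likelihood-over-evidence trade-off the formula encodes.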
  
  (1) Gaussian naive Bayes (Gaussian Naive Bayes);
  (2) Multinomial naive Bayes (Multinomial Naive Bayes);
  (3) Bernoulli naive Bayes (Bernoulli Naive Bayes).

   Among them, Gaussian naive Bayes fits and classifies using the Gaussian probability density formula. Multinomial naive Bayes is often used for high-dimensional vector classification; the most common scenario is document classification. Bernoulli naive Bayes classifies vectors of boolean feature values.
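The Gaussian density mentioned above can be written out directly. A minimal sketch (this is the textbook formula, not sklearn's internal implementation) of the per-feature, per-class density that Gaussian naive Bayes evaluates:

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Gaussian probability density N(x; mean, var)."""
    return np.exp(-((x - mean) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

# a feature value near the class mean gets a higher density,
# hence a higher per-feature likelihood for that class
print(gaussian_pdf(0.0, 0.0, 1.0))  # ~0.3989, the standard normal peak
print(gaussian_pdf(3.0, 0.0, 1.0))  # much smaller
```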
from sklearn.naive_bayes import GaussianNB  # Gaussian Bayes classification

# weather codes -- 0: sunny, 1: overcast, 2: precipitation, 3: cloudy
data_table = [["date", "weather"], [1, 0], [2, 1], [3, 2], [4, 1], [5, 2],
              [6, 0], [7, 0], [8, 3], [9, 1], [10, 1]]

# the weather on each day (days 1-9)
X = [[0], [1], [2], [1], [2], [0], [0], [3], [1]]
# the weather on the following day (days 2-10)
y = [1, 2, 1, 2, 0, 0, 3, 1, 1]

# put the training data and the corresponding labels into the classifier;
# alternatives: BernoulliNB() Bernoulli, MultinomialNB() multinomial, GaussianNB() Gaussian
clf = GaussianNB().fit(X, y)

p = [[1]]
print(clf.predict(p))

The result is [2].
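Beyond the hard prediction, `predict_proba` exposes the posterior the classifier computed for each class, which makes the Bayesian reasoning above inspectable. A sketch reusing the same weather data:

```python
from sklearn.naive_bayes import GaussianNB

X = [[0], [1], [2], [1], [2], [0], [0], [3], [1]]
y = [1, 2, 1, 2, 0, 0, 3, 1, 1]
clf = GaussianNB().fit(X, y)

# posterior probability per class, not just the argmax
proba = clf.predict_proba([[1]])
print(clf.classes_)      # class labels, one per column of proba
print(proba.round(3))
```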

<> 1. Gaussian naive Bayes
>>> import numpy as np
>>> X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
>>> Y = np.array([1, 1, 1, 2, 2, 2])
>>> from sklearn.naive_bayes import GaussianNB
>>> clf = GaussianNB()
>>> clf.fit(X, Y)
GaussianNB(priors=None, var_smoothing=1e-09)
>>> print(clf.predict([[-0.8, -1]]))
[1]
>>> clf_pf = GaussianNB()
>>> clf_pf.partial_fit(X, Y, np.unique(Y))
GaussianNB(priors=None, var_smoothing=1e-09)
>>> print(clf_pf.predict([[-0.8, -1]]))
[1]
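The point of `partial_fit` in the transcript above is incremental learning: the model can be updated batch by batch instead of seeing all the data at once, as long as the full class list is supplied on the first call. A sketch splitting the same data into two batches:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
Y = np.array([1, 1, 1, 2, 2, 2])

clf = GaussianNB()
# first batch: classes= must list every label the model will ever see
clf.partial_fit(X[:3], Y[:3], classes=np.unique(Y))
# later batch: no classes argument needed
clf.partial_fit(X[3:], Y[3:])

print(clf.predict([[-0.8, -1]]))  # [1]
```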
<> 2. Multinomial naive Bayes
   Note: the example below actually uses ComplementNB, a variant of multinomial naive Bayes designed to cope better with imbalanced classes.
>>> import numpy as np
>>> X = np.random.randint(5, size=(6, 100))
>>> y = np.array([1, 2, 3, 4, 5, 6])
>>> from sklearn.naive_bayes import ComplementNB
>>> clf = ComplementNB()
>>> clf.fit(X, y)
ComplementNB(alpha=1.0, class_prior=None, fit_prior=True, norm=False)
>>> print(clf.predict(X[2:3]))
[3]
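For the plain multinomial variant named in the section heading, `MultinomialNB` takes the same kind of non-negative count input (e.g. word counts per document). A sketch mirroring the transcript above, with a fixed seed for reproducibility:

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

rng = np.random.RandomState(1)
X = rng.randint(5, size=(6, 100))  # e.g. word counts for 6 documents
y = np.array([1, 2, 3, 4, 5, 6])   # one class per document

clf = MultinomialNB().fit(X, y)
print(clf.predict(X[2:3]))
```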
<> 3. Bernoulli naive Bayes
>>> import numpy as np
>>> X = np.random.randint(2, size=(6, 100))
>>> Y = np.array([1, 2, 3, 4, 4, 5])
>>> from sklearn.naive_bayes import BernoulliNB
>>> clf = BernoulliNB()
>>> clf.fit(X, Y)
BernoulliNB(alpha=1.0, binarize=0.0, class_prior=None, fit_prior=True)
>>> print(clf.predict(X[2:3]))
[3]
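The `binarize` parameter shown in the output above is what lets BernoulliNB accept non-boolean input: features are thresholded to 0/1 before fitting. A sketch with a made-up threshold of 0.5 on illustrative data:

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# continuous features; binarize=0.5 maps values > 0.5 to 1, others to 0
X = np.array([[0.1, 0.9], [0.8, 0.2], [0.9, 0.1], [0.2, 0.8]])
y = np.array([0, 1, 1, 0])

clf = BernoulliNB(binarize=0.5).fit(X, y)

# [0.7, 0.3] binarizes to [1, 0], the pattern of class 1
print(clf.predict([[0.7, 0.3]]))  # [1]
```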
