@nrailgun · 2015-10-12

Boosting Method: AdaBoost

Machine Learning


AdaBoost

Input: Training set $T = \{(x_1, y_1), (x_2, y_2), \dots, (x_N, y_N)\}$, where $x_i \in \mathcal{X} \subseteq \mathbb{R}^n$ and $y_i \in \mathcal{Y} = \{-1, +1\}$; a weak learning algorithm.

Output: Classifier $G(x)$.

  1. Initialize weight distribution

    $D_1 = (w_{11}, \dots, w_{1N}), \quad w_{1i} = \frac{1}{N}, \quad i = 1, 2, \dots, N$

  2. For $m = 1, 2, \dots, M$:

    1. Train a classifier $G_m(x): \mathcal{X} \to \{-1, +1\}$ with weight distribution $D_m$;
    2. Compute the weighted error rate:
      $e_m = \sum_{i=1}^{N} w_{mi} \, I(G_m(x_i) \neq y_i)$
    3. Compute the coefficient $\alpha_m$ of $G_m(x)$:
      $\alpha_m = \frac{1}{2} \log \frac{1 - e_m}{e_m}$
    4. Update the weight distribution $D_{m+1} = (w_{m+1,1}, w_{m+1,2}, \dots, w_{m+1,N})$ of the training set:
      $w_{m+1,i} = \frac{w_{mi}}{Z_m} \exp(-\alpha_m y_i G_m(x_i))$

      where $Z_m = \sum_{i=1}^{N} w_{mi} \exp(-\alpha_m y_i G_m(x_i))$.
  3. Construct the linear combination of basic classifiers:
    $G(x) = \mathrm{sign}(f(x)) = \mathrm{sign}\left(\sum_{m=1}^{M} \alpha_m G_m(x)\right)$
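
A minimal NumPy sketch of the algorithm above, using decision stumps (single-feature threshold classifiers) as the weak learner; the stump choice and all function names are illustrative, not prescribed by AdaBoost itself.

```python
import numpy as np

def train_stump(X, y, w):
    """Weak learner: choose the (feature, threshold, polarity) stump
    with the smallest weighted error under the distribution w."""
    best = (0, 0.0, 1, np.inf)                    # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best[3]:
                    best = (j, thr, polarity, err)
    return best

def stump_predict(X, stump):
    j, thr, polarity, _ = stump
    return np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_fit(X, y, M=20):
    """Steps 1-2 above; labels y must be in {-1, +1}."""
    N = len(y)
    w = np.full(N, 1.0 / N)                       # step 1: D_1 = (1/N, ..., 1/N)
    stumps, alphas = [], []
    for m in range(M):                            # step 2
        stump = train_stump(X, y, w)              # 2.1 train G_m under D_m
        pred = stump_predict(X, stump)
        e = np.clip(np.sum(w * (pred != y)), 1e-12, 1 - 1e-12)   # 2.2 error e_m
        alpha = 0.5 * np.log((1 - e) / e)         # 2.3 coefficient alpha_m
        w = w * np.exp(-alpha * y * pred)         # 2.4 reweight ...
        w /= w.sum()                              #     ... and divide by Z_m
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Step 3: G(x) = sign(sum_m alpha_m G_m(x))."""
    f = sum(a * stump_predict(X, s) for s, a in zip(stumps, alphas))
    return np.sign(f)
```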

Forward Stagewise Additive Modeling

AdaBoost is a special case of forward stagewise additive modeling, obtained by taking an additive model of basic classifiers together with the exponential loss function $L(y, f(x)) = \exp(-y f(x))$.
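
To see the connection, fix the basic classifier $G(x)$ and minimize the stagewise objective over its coefficient $\beta$ under the exponential loss; writing $\bar{w}_{mi} = \exp(-y_i f_{m-1}(x_i))$, the key step of the standard derivation is

    $\sum_{i=1}^{N} \exp\bigl(-y_i (f_{m-1}(x_i) + \beta G(x_i))\bigr) = \sum_{i=1}^{N} \bar{w}_{mi} \exp(-\beta y_i G(x_i)) = e^{-\beta} \sum_{y_i = G(x_i)} \bar{w}_{mi} + e^{\beta} \sum_{y_i \neq G(x_i)} \bar{w}_{mi}$

Setting the derivative with respect to $\beta$ to zero gives $\beta_m = \frac{1}{2} \log \frac{1 - e_m}{e_m}$, where $e_m$ is the weighted error rate as above; this coincides with $\alpha_m$, and the weights $\bar{w}_{m+1,i}$ reproduce the AdaBoost weight update up to normalization.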

Input: Training set $T = \{(x_1, y_1), (x_2, y_2), \dots, (x_N, y_N)\}$, $x_i \in \mathcal{X} \subseteq \mathbb{R}^n$; loss function $L(y, f(x))$; set of base functions $\{b(x; \gamma)\}$.

Output: Additive model $f(x)$.

  1. Initialize $f_0(x) = 0$;
  2. For $m = 1, 2, \dots, M$:
    1. Minimize the loss function
      $(\beta_m, \gamma_m) = \arg\min_{\beta, \gamma} \sum_{i=1}^{N} L(y_i, f_{m-1}(x_i) + \beta b(x_i; \gamma))$
    2. Update $f_m(x) = f_{m-1}(x) + \beta_m b(x; \gamma_m)$;
  3. Obtain the additive model:
    $f(x) = f_M(x) = \sum_{m=1}^{M} \beta_m b(x; \gamma_m)$
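
A minimal sketch of the loop above, assuming squared-error loss and one-dimensional step functions $b(x; \gamma)$ (equal to $+1$ if $x > \gamma$, else $-1$) as the base functions; both choices and all names are illustrative. For squared-error loss the inner minimization over $\beta$ has a closed form at fixed $\gamma$, so the search reduces to scanning candidate thresholds.

```python
import numpy as np

def squared_loss(y, f):
    return np.sum((y - f) ** 2)

def fit_stagewise(x, y, M=10):
    """Forward stagewise additive modeling with step-function base
    learners b(x; gamma) and squared-error loss.
    Returns the list of (beta_m, gamma_m) pairs."""
    f = np.zeros_like(y, dtype=float)           # step 1: f_0(x) = 0
    model = []
    for m in range(M):                          # step 2
        best = (0.0, 0.0, np.inf)               # (beta, gamma, loss)
        for gamma in np.unique(x):              # 2.1 search over gamma ...
            b = np.where(x > gamma, 1.0, -1.0)
            # ... while beta is a least-squares fit of b to the current residual
            beta = np.dot(b, y - f) / np.dot(b, b)
            loss = squared_loss(y, f + beta * b)
            if loss < best[2]:
                best = (beta, gamma, loss)
        beta_m, gamma_m, _ = best
        f = f + beta_m * np.where(x > gamma_m, 1.0, -1.0)   # 2.2 update f_m
        model.append((beta_m, gamma_m))
    return model                                # step 3: f(x) = sum_m beta_m b(x; gamma_m)

def predict(x, model):
    return sum(beta * np.where(x > gamma, 1.0, -1.0) for beta, gamma in model)
```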

Boosting Tree

A boosting tree is a boosting method that uses a regression tree or a classification tree as the basic classifier. The boosting tree model can be represented as an additive model of decision trees:

$f_M(x) = \sum_{m=1}^{M} T(x; \Theta_m)$

Input: Training set $T = \{(x_1, y_1), (x_2, y_2), \dots, (x_N, y_N)\}$, $x_i \in \mathcal{X} \subseteq \mathbb{R}^n$, $y_i \in \mathcal{Y} \subseteq \mathbb{R}$.

Output: Boosting tree $f_M(x)$.

  1. Initialize $f_0(x) = 0$;
  2. For $m = 1, 2, \dots, M$:
    1. Compute the residuals
      $r_{mi} = y_i - f_{m-1}(x_i), \quad i = 1, 2, \dots, N$
    2. Fit a regression tree to the residuals $r_m$ to obtain $T(x; \Theta_m)$;
    3. Update $f_m(x) = f_{m-1}(x) + T(x; \Theta_m)$;
  3. Obtain the regression boosting tree:
    $f_M(x) = \sum_{m=1}^{M} T(x; \Theta_m)$
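
A minimal sketch of the boosting-tree loop above, assuming one-dimensional inputs and depth-one regression trees (a single split with a constant prediction on each side) as $T(x; \Theta_m)$; the helper names are illustrative.

```python
import numpy as np

def fit_tree_stump(x, r):
    """Fit a depth-one regression tree to the residuals r: choose the
    split s minimizing squared error, with constant predictions
    c1 (left) and c2 (right)."""
    best, best_err = None, np.inf
    for s in np.unique(x):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        c1, c2 = left.mean(), right.mean()
        err = np.sum((left - c1) ** 2) + np.sum((right - c2) ** 2)
        if err < best_err:
            best, best_err = (s, c1, c2), err
    return best

def tree_predict(x, theta):
    s, c1, c2 = theta
    return np.where(x <= s, c1, c2)

def boosting_tree_fit(x, y, M=10):
    """Steps 1-3 above: repeatedly fit a tree to the current residuals."""
    f = np.zeros_like(y, dtype=float)     # step 1: f_0(x) = 0
    trees = []
    for m in range(M):                    # step 2
        r = y - f                         # 2.1 residuals r_mi = y_i - f_{m-1}(x_i)
        theta = fit_tree_stump(x, r)      # 2.2 fit T(x; Theta_m) to r_m
        if theta is None:                 # no valid split left
            break
        f = f + tree_predict(x, theta)    # 2.3 update f_m
        trees.append(theta)
    return trees                          # step 3: f_M(x) = sum_m T(x; Theta_m)

def boosting_tree_predict(x, trees):
    return sum(tree_predict(x, t) for t in trees)
```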