Classification - Ensembles

Machine Learning
Published

July 27, 2025

voting

  • Combines different algorithms into one ensemble; for classification the final label is decided by voting (hard voting takes the majority vote, soft voting averages the predicted class probabilities).

Example

import pandas as pd

from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import warnings

warnings.filterwarnings('ignore')

cancer = load_breast_cancer()

df = pd.DataFrame(cancer.data, columns=cancer.feature_names)
df
(wide DataFrame preview omitted: the 30 feature columns run from mean radius, mean texture, ... through worst fractal dimension)

569 rows × 30 columns

lr_clf = LogisticRegression(solver='liblinear')
knn_clf = KNeighborsClassifier(n_neighbors=8)

vo_clf = VotingClassifier(estimators=[('LR', lr_clf), ('KNN', knn_clf)],
                          voting='soft')  # soft voting: average the two models' predicted probabilities
X_train, X_test, y_train, y_test = train_test_split(cancer.data, cancer.target, test_size=0.2)
vo_clf.fit(X_train, y_train)
pred = vo_clf.predict(X_test)
accuracy = accuracy_score(y_test, pred)
accuracy
0.9210526315789473
for classifier in [lr_clf, knn_clf]:
    classifier.fit(X_train, y_train)
    pred = classifier.predict(X_test)
    class_name = classifier.__class__.__name__
    print(f'{class_name} accuracy: {accuracy_score(y_test, pred):.4f}')
LogisticRegression accuracy: 0.9298
KNeighborsClassifier accuracy: 0.9211
  • Voting does not necessarily beat the best single model.

bagging

  • Classifiers of the same algorithm type each train on a bootstrap sample of the data and their predictions are aggregated; random forest is the representative example. For classification the result is again decided by voting. A plain-bagging sketch follows below.
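
For reference, a minimal sketch of plain bagging with scikit-learn's BaggingClassifier (my addition, not in the original notes; it reuses the X_train/X_test split and accuracy_score from the voting example, and the estimator keyword assumes scikit-learn >= 1.2):

from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# 100 decision trees, each fit on its own bootstrap sample drawn with replacement
bag_clf = BaggingClassifier(estimator=DecisionTreeClassifier(),
                            n_estimators=100, bootstrap=True)
bag_clf.fit(X_train, y_train)
print(f'BaggingClassifier accuracy: {accuracy_score(y_test, bag_clf.predict(X_test)):.4f}')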

RandomForest

from sklearn.ensemble import RandomForestClassifier

def get_new_feature_name_df(old):
    # features.txt contains duplicated feature names; append _1, _2, ...
    # to the duplicates so the pandas columns stay unique.
    df = pd.DataFrame(data=old.groupby('column_name').cumcount(), columns=['dup_cnt'])
    df = df.reset_index()
    new_df = pd.merge(old.reset_index(), df, how='outer')
    new_df['column_name'] = new_df[['column_name', 'dup_cnt']].apply(lambda x: x[0] + '_' + str(x[1]) if x[1] > 0 else x[0], axis=1)
    new_df = new_df.drop(['index'], axis=1)
    return new_df

def get_human_dataset():
    # UCI Human Activity Recognition dataset; all files are whitespace-separated
    feature_name_df = pd.read_csv('_data/human_activity/features.txt', sep=r'\s+', header=None, names=['column_index', 'column_name'])
    new_feature_name_df = get_new_feature_name_df(feature_name_df)
    feature_name = new_feature_name_df.iloc[:, 1].values.tolist()

    X_train = pd.read_csv('_data/human_activity/train/X_train.txt', sep=r'\s+', names=feature_name)
    X_test = pd.read_csv('_data/human_activity/test/X_test.txt', sep=r'\s+', names=feature_name)

    y_train = pd.read_csv('_data/human_activity/train/y_train.txt', sep=r'\s+', header=None, names=['action'])
    y_test = pd.read_csv('_data/human_activity/test/y_test.txt', sep=r'\s+', header=None, names=['action'])

    return X_train, X_test, y_train, y_test

X_train, X_test, y_train, y_test = get_human_dataset()
rf_clf = RandomForestClassifier(max_depth=8)
rf_clf.fit(X_train, y_train)
pred = rf_clf.predict(X_test)
accuracy = accuracy_score(y_test, pred)
accuracy
0.9178825924669155
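
Not in the original notes: the trained forest also exposes per-feature importances through scikit-learn's standard feature_importances_ attribute; a minimal sketch (the top-10 view is illustrative):

ftr_importances = pd.Series(rf_clf.feature_importances_, index=X_train.columns)
print(ftr_importances.sort_values(ascending=False)[:10])  # 10 most influential HAR features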

boosting

GBM

# from sklearn.ensemble import GradientBoostingClassifier
# import time
# 
# X_train, X_test, y_train, y_test = get_human_dataset()
# start_time = time.time()
# 
# gb_clf = GradientBoostingClassifier()
# gb_clf.fit(X_train, y_train)
# gb_pred = gb_clf.predict(X_test)
# gb_accuracy = accuracy_score(y_test, gb_pred)
#
# end_time = time.time()
#
# print(f'{gb_accuracy:.3f}, {end_time - start_time} seconds')

0.939, 701.6343066692352 seconds

  • Very slow: the run above took about 700 seconds, since boosting builds its trees sequentially and scikit-learn's GBM cannot parallelize tree building.

XGBoost

  • Handles missing values natively (see the sketch after this list).

  • Supports early stopping.

  • Provides built-in cross validation, performance evaluation, and feature-importance plotting.

  • python xgboost (native API)
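
A minimal sketch of the native missing-value handling (the toy arrays are hypothetical; XGBoost treats np.nan entries as missing by default and learns a default branch direction for them):

import numpy as np
import xgboost as xgb

X_toy = np.array([[1.0, np.nan], [2.0, 3.0], [np.nan, 1.0], [4.0, 2.0]])
y_toy = np.array([0, 1, 0, 1])
dtoy = xgb.DMatrix(data=X_toy, label=y_toy)  # NaN is accepted without imputation
booster = xgb.train({'objective': 'binary:logistic'}, dtoy, num_boost_round=5)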

import xgboost as xgb
from xgboost import plot_importance
import numpy as np

dataset = load_breast_cancer()

X_train, X_test, y_train, y_test = train_test_split(dataset.data, dataset.target, test_size=0.2)
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.1)
dtr = xgb.DMatrix(data=X_tr, label=y_tr)
dval = xgb.DMatrix(data=X_val, label=y_val)
dtest = xgb.DMatrix(data=X_test, label=y_test)
params = {
    'max_depth': 3,
    'eta': 0.05,
    'objective': 'binary:logistic',
    'eval_metric': 'logloss'
}
num_rounds = 400
eval_list = [(dtr, 'train'), (dval, 'eval')]

xgb_model = xgb.train(params=params, dtrain=dtr, num_boost_round=num_rounds, early_stopping_rounds=50, evals=eval_list)
[0] train-logloss:0.61277   eval-logloss:0.58601
[1] train-logloss:0.57664   eval-logloss:0.55582
[2] train-logloss:0.54304   eval-logloss:0.52806
...
[230]   train-logloss:0.01145   eval-logloss:0.05493
[231]   train-logloss:0.01140   eval-logloss:0.05507
[232]   train-logloss:0.01136   eval-logloss:0.05469
...
[280]   train-logloss:0.00969   eval-logloss:0.05562
[281]   train-logloss:0.00966   eval-logloss:0.05528
(log condensed: eval-logloss reaches its best value 0.05469 at round [232]; early stopping then halts training at round [281])
pred_probs = xgb_model.predict(dtest)  # native API returns positive-class probabilities
preds = [1 if x > 0.5 else 0 for x in pred_probs]  # threshold at 0.5 for hard labels
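
The plot_importance imported above is never used; here is a minimal sketch of the built-in cross validation and feature-importance plotting mentioned in the bullets (reuses params, dtr, and xgb_model from above):

import matplotlib.pyplot as plt

# xgb.cv returns a DataFrame of per-round train/test metrics averaged over folds
cv_results = xgb.cv(params=params, dtrain=dtr, num_boost_round=400,
                    nfold=5, early_stopping_rounds=50)
print(cv_results.tail())

fig, ax = plt.subplots(figsize=(10, 12))
plot_importance(xgb_model, ax=ax)  # importance = split count (F score) per feature
plt.show()
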
  • sklearn xgboost (scikit-learn wrapper API)
from xgboost import XGBClassifier

evals = [(X_tr, y_tr), (X_val, y_val)]
# named xgb_clf so it does not shadow the xgb module imported above
xgb_clf = XGBClassifier(n_estimators=400,
                        learning_rate=0.05,
                        max_depth=3,
                        early_stopping_rounds=50,
                        eval_metric=['logloss'])
xgb_clf.fit(X_tr, y_tr, eval_set=evals)
preds = xgb_clf.predict(X_test)
pred_probs = xgb_clf.predict_proba(X_test)[:, 1]
[0] validation_0-logloss:0.61277    validation_1-logloss:0.58601
[1] validation_0-logloss:0.57664    validation_1-logloss:0.55582
...
[280]   validation_0-logloss:0.00969    validation_1-logloss:0.05562
[281]   validation_0-logloss:0.00966    validation_1-logloss:0.05528
(log condensed: the losses match the native API run round for round; best validation_1-logloss 0.05469 at round [232], early stopping at round [281])
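
Neither block above actually scores the test set; a one-line check with the objects already defined (accuracy_score was imported in the voting example):

print(f'XGBoost accuracy: {accuracy_score(y_test, preds):.4f}')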

LightGBM

  • Performance differs little from XGBoost's.

  • Likely to overfit on datasets of roughly 10,000 rows or fewer.

  • No one-hot encoding needed for categorical features (a sketch follows the example below).

  • python lightgbm (scikit-learn wrapper)

from lightgbm import LGBMClassifier, early_stopping, plot_importance
import matplotlib.pyplot as plt

lgbm = LGBMClassifier(n_estimators=400, learning_rate=0.05)
evals = [(X_tr, y_tr), (X_val, y_val)]
lgbm.fit(X_tr, y_tr,
         callbacks=[early_stopping(stopping_rounds=50)],
         eval_metric='logloss',
         eval_set=evals)
preds = lgbm.predict(X_test)
pred_proba = lgbm.predict_proba(X_test)[:, 1]

plot_importance(lgbm)
plt.show()
[LightGBM] [Info] Number of positive: 262, number of negative: 147
[LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000223 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 4092
[LightGBM] [Info] Number of data points in the train set: 409, number of used features: 30
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.640587 -> initscore=0.577912
[LightGBM] [Info] Start training from score 0.577912
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
Training until validation scores don't improve for 50 rounds
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
... (same warning repeated; output truncated)
Early stopping, best iteration is:
[159]   training's binary_logloss: 0.00195741   valid_1's binary_logloss: 0.0442418
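
These warnings mean LightGBM could not find any split with positive gain in a node, which is common when the dataset is small relative to num_leaves / min_child_samples; they are harmless here. If unwanted, they can be silenced by lowering the verbosity, as in this hedged sketch (the estimator name is hypothetical):

from lightgbm import LGBMClassifier

# verbosity below 0 keeps only fatal messages, suppressing the warning spam
quiet_lgbm = LGBMClassifier(n_estimators=400, verbose=-1)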

stacking

  • Predictions from several base learners are stacked together as features, and a final meta-model is trained on them to produce the final prediction

import numpy as np

from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

X_train, X_test, y_train, y_test = train_test_split(cancer.data, cancer.target, test_size=0.2)

knn_clf = KNeighborsClassifier(n_neighbors=4)
rf_clf = RandomForestClassifier(n_estimators=100)
dt_clf = DecisionTreeClassifier()
ada_clf = AdaBoostClassifier(n_estimators=100)

lr_final = LogisticRegression()
knn_clf.fit(X_train, y_train)
rf_clf.fit(X_train, y_train)
dt_clf.fit(X_train, y_train)
ada_clf.fit(X_train, y_train)

knn_pred = knn_clf.predict(X_test)
rf_pred = rf_clf.predict(X_test)
dt_pred = dt_clf.predict(X_test)
ada_pred = ada_clf.predict(X_test)

# stack the base learners' test-set predictions column-wise as meta-features
pred = np.array([knn_pred, rf_pred, dt_pred, ada_pred])
pred = np.transpose(pred)

# caveat: the meta-model is fit on test-set predictions and test labels here
lr_final.fit(pred, y_test)
final = lr_final.predict(pred)
print(f'{accuracy_score(y_test, final):.3f}')
0.991
  • The problem is that the meta-model is trained with the test set → it should instead be trained on CV-based (out-of-fold) predictions

Stacking based on CV sets

from sklearn.model_selection import KFold

def get_stacking_base_datasets(model, X_train_n, y_train_n, X_test_n, n_folds):
    kf = KFold(n_splits=n_folds, shuffle=False)
    # out-of-fold predictions on the train set become the meta-model's features
    train_fold_pred = np.zeros((X_train_n.shape[0], 1))
    # each fold's model predicts the whole test set; averaged afterwards
    test_pred = np.zeros((X_test_n.shape[0], n_folds))
    for fold_counter, (train_index, valid_index) in enumerate(kf.split(X_train_n)):
        X_tr = X_train_n[train_index]
        y_tr = y_train_n[train_index]
        X_te = X_train_n[valid_index]

        model.fit(X_tr, y_tr)
        # predict the held-out fold, which this fit has never seen
        train_fold_pred[valid_index, :] = model.predict(X_te).reshape(-1, 1)
        test_pred[:, fold_counter] = model.predict(X_test_n)

    test_pred_mean = np.mean(test_pred, axis=1).reshape(-1, 1)

    return train_fold_pred, test_pred_mean

knn_train, knn_test = get_stacking_base_datasets(knn_clf, X_train, y_train, X_test, 7)
rf_train, rf_test = get_stacking_base_datasets(rf_clf, X_train, y_train, X_test, 7)
dt_train, dt_test = get_stacking_base_datasets(dt_clf, X_train, y_train, X_test, 7)
ada_train, ada_test = get_stacking_base_datasets(ada_clf, X_train, y_train, X_test, 7)
Stack_final_X_train = np.concatenate((knn_train, rf_train, dt_train, ada_train), axis=1)
Stack_final_X_test = np.concatenate((knn_test, rf_test, dt_test, ada_test), axis=1)

lr_final.fit(Stack_final_X_train, y_train)
stack_final = lr_final.predict(Stack_final_X_test)

print(f'{accuracy_score(y_test, stack_final):.3f}')
0.982
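
For comparison, scikit-learn's built-in StackingClassifier implements this CV-based scheme internally; below is a minimal sketch reusing the base learners above (note that by default it feeds predict_proba outputs, not predicted labels, to the meta-model, so results will differ slightly from the manual version):

from sklearn.ensemble import StackingClassifier

stack_clf = StackingClassifier(
    estimators=[('KNN', knn_clf), ('RF', rf_clf), ('DT', dt_clf), ('ADA', ada_clf)],
    final_estimator=LogisticRegression(),
    cv=7)  # out-of-fold predictions for the meta-model are generated internally
stack_clf.fit(X_train, y_train)
print(f'{accuracy_score(y_test, stack_clf.predict(X_test)):.3f}')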

Bayesian Optimization

  • Useful when a grid search would take too long

  • Objective function: maps n hyperparameter inputs to a single model-performance output

  • Surrogate model: a probabilistic approximation of the objective function; starting from a prior distribution, it is refined toward the optimum as observations accumulate

  • Acquisition function: decides the next point to observe, e.g. the point with the greatest uncertainty (see the sketch below)
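
Together, these pieces form a loop: fit the surrogate to the observations so far, let the acquisition function pick the next point, evaluate the true objective there, and repeat. Below is a minimal illustrative sketch of that loop, not hyperopt's internals; the toy objective, the iteration count, and the 1.96 exploration coefficient are all arbitrary assumptions. It uses a Gaussian-process surrogate and a lower-confidence-bound acquisition:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def true_objective(x):  # hypothetical toy 1-D objective to minimize
    return (x - 2.0) ** 2

rng = np.random.default_rng(0)
candidates = np.linspace(-10, 10, 500).reshape(-1, 1)

# a few random initial observations
X_obs = rng.uniform(-10, 10, size=(3, 1))
y_obs = true_objective(X_obs).ravel()

for _ in range(15):
    # surrogate: approximate the objective from the observations so far
    gp = GaussianProcessRegressor(normalize_y=True).fit(X_obs, y_obs)
    mu, sigma = gp.predict(candidates, return_std=True)

    # acquisition (lower confidence bound): low predicted value + high uncertainty
    lcb = mu - 1.96 * sigma
    x_next = candidates[np.argmin(lcb)].reshape(1, 1)

    # evaluate the true objective at the chosen point and record it
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, true_objective(x_next).ravel())

print(X_obs[np.argmin(y_obs)][0], y_obs.min())  # best observed point approaches x = 2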

from hyperopt import hp, fmin, tpe, Trials, STATUS_OK

# quniform: quantized uniform, sampling in steps of 1 over each range
search_space = {'x': hp.quniform('x', -10, 10, 1),
                'y': hp.quniform('y', -15, 15, 1)}

def objective_func(search_space):
    x = search_space['x']
    y = search_space['y']

    # minimized at x = 0, y = 15 within the search space (value -300)
    return x ** 2 - 20 * y

trial_val = Trials()
best = fmin(fn=objective_func,
            space=search_space,
            algo=tpe.suggest,
            max_evals=20,
            trials=trial_val)
best
100%|██████████| 20/20 [00:00<00:00, 1705.00trial/s, best loss: -284.0]
{'x': 4.0, 'y': 15.0}
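
The Trials object records every evaluation; a quick way to inspect it (trial_val.vals holds the sampled parameter values, trial_val.results the returned losses):

losses = [result['loss'] for result in trial_val.results]
result_df = pd.DataFrame({'x': trial_val.vals['x'],
                          'y': trial_val.vals['y'],
                          'loss': losses})
result_df.sort_values('loss').head()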

XGBoost hyperparameter optimization

dataset = load_breast_cancer()

X_train, X_test, y_train, y_test = train_test_split(dataset.data, dataset.target, test_size=0.2)
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.1)

xgb_search_space = {
    'max_depth': hp.quniform('max_depth', 5, 20, 1),
    'min_child_weight': hp.quniform('min_child_weight', 1, 2, 1),
    'learning_rate': hp.uniform('learning_rate', 0.01, 0.2),
    'colsample_bytree': hp.uniform('colsample_bytree', 0.5, 1)
}
# categorical parameters are also possible, e.g. hp.choice('tree_criterion', ['gini', 'entropy'])
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

def objective_func(search_space):
    # hyperopt samples floats, so integer hyperparameters must be cast back
    xgb_clf = XGBClassifier(n_estimators=100,
                            max_depth=int(search_space['max_depth']),
                            min_child_weight=int(search_space['min_child_weight']),
                            learning_rate=search_space['learning_rate'],
                            colsample_bytree=search_space['colsample_bytree'],
                            eval_metric='logloss')
    accuracy = cross_val_score(xgb_clf, X_train, y_train, scoring='accuracy', cv=3)
    # fmin minimizes, so return the negated mean CV accuracy as the loss
    return {'loss': -1 * np.mean(accuracy), 'status': STATUS_OK}

trial_val = Trials()
best = fmin(fn=objective_func,
            space=xgb_search_space,
            algo=tpe.suggest,
            max_evals=50,
            trials=trial_val)
best
100%|██████████| 50/50 [00:08<00:00,  6.22trial/s, best loss: -0.9736261182758219]
{'colsample_bytree': 0.8065561529248224,
 'learning_rate': 0.1128018538935688,
 'max_depth': 18.0,
 'min_child_weight': 2.0}
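
A natural last step is to retrain with the best parameters and evaluate on the held-out test set. A sketch with assumed values: n_estimators=400 and early_stopping_rounds=30 are arbitrary choices, and passing early_stopping_rounds to the constructor assumes xgboost >= 1.6. This also puts the X_tr/X_val split above to use:

xgb_best = XGBClassifier(n_estimators=400,
                         max_depth=int(best['max_depth']),
                         min_child_weight=int(best['min_child_weight']),
                         learning_rate=round(best['learning_rate'], 5),
                         colsample_bytree=round(best['colsample_bytree'], 5),
                         early_stopping_rounds=30,
                         eval_metric='logloss')
xgb_best.fit(X_tr, y_tr, eval_set=[(X_tr, y_tr), (X_val, y_val)], verbose=False)
pred = xgb_best.predict(X_test)
print(f'{accuracy_score(y_test, pred):.4f}')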

Footnotes

  1. Split into hard voting (a simple majority vote) and soft voting (classifying by the weighted average of the predicted class probabilities). Soft voting is generally used.↩︎

  2. Split into hard voting (a simple majority vote) and soft voting (classifying by the weighted average of the predicted class probabilities). Soft voting is generally used.↩︎