2017-06-07 9 views
1

I am training a multi-class neural network using Keras (TensorFlow backend). My configuration and some code are given at the end. What could be the reason that the loss stops decreasing when training a multi-class NN with Keras?

Description: I run 10-fold cross-validation. The training loss and validation loss go down during the first 10-15 epochs, but after about 15 epochs they cannot decrease any further (loss: 1.0606 - acc: 0.6301 - val_loss: 1.1577 - val_acc: 0.5774).

I made several changes to my setup: for example, adding hidden layers, adding normalization.BatchNormalization(), switching the optimizer from adam to sgd or rmsprop, and changing the loss function from categorical_crossentropy to others. None of it had any effect.
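For illustration, the BatchNormalization + sgd variant looked roughly like this (a simplified sketch of one of the attempts, not my exact code; the baseline model itself is shown further below):

from keras.models import Sequential
from keras.layers import Dense, BatchNormalization
from keras.optimizers import SGD

# sketch of one variant: BatchNormalization after the first hidden layer, SGD instead of adam
model = Sequential()
model.add(Dense(500, activation='relu', input_dim=507))
model.add(BatchNormalization())
model.add(Dense(100, activation='relu'))
model.add(Dense(7, activation='softmax'))
model.compile(optimizer=SGD(lr=0.01), loss='categorical_crossentropy', metrics=['accuracy'])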

I would like to discuss why something like this might happen. I would be very happy if someone could point me to a summary document or presentation on the topic.

My data has 10000 rows, and each row has 507 binary (0/1) attributes as features. It is a multi-class problem with 7 classes. I selected the 10000 samples from a larger dataset, so the balance between classes is roughly OK.
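For reference, the labels are one-hot encoded so that Y matches the softmax / categorical_crossentropy setup. A minimal sketch, with placeholder arrays standing in for my real data:

import numpy as np
from keras.utils import to_categorical

# placeholder arrays standing in for the real data:
# X has 10000 rows of 507 binary attributes, y holds integer class ids 0..6
X = np.random.randint(0, 2, size=(10000, 507)).astype('float32')
y = np.random.randint(0, 7, size=(10000,))
Y = to_categorical(y, num_classes=7)   # shape (10000, 7), one row per sample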

My model is:

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers

model = Sequential()
model.add(Dense(500, activation='relu', input_dim=self.feature_dim,
                kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(100, activation='relu'))
model.add(Dense(self.label_dim, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

My cross-validation code is as follows:

from sklearn.model_selection import KFold

skf = KFold(n_splits=cross_validation, shuffle=True)
for train_index, test_index in skf.split(X):
    X_train, X_test = X[train_index], X[test_index]
    y_train, y_test = Y[train_index], Y[test_index]
    # rebuild the model from scratch for every fold
    model = None
    model = self.__create_model()
    model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size, validation_data=(X_test, y_test))

X and Y are two matrices with shapes (10000, 507) and (10000, 7). Some of the training logs are as follows:

Running Fold 1/10 
Train on 9534 samples, validate on 1060 samples 
Epoch 1/100 
1000/9534 [==>...........................] - ETA: 7s - loss: 6.9644 - acc: 0.1150 
2000/9534 [=====>........................] - ETA: 3s - loss: 6.8357 - acc: 0.1715 
3000/9534 [========>.....................] - ETA: 2s - loss: 6.7147 - acc: 0.2243 
4000/9534 [===========>..................] - ETA: 1s - loss: 6.5922 - acc: 0.2683 
5000/9534 [==============>...............] - ETA: 1s - loss: 6.4779 - acc: 0.2908 
6000/9534 [=================>............] - ETA: 0s - loss: 6.3618 - acc: 0.3097 
7000/9534 [=====================>........] - ETA: 0s - loss: 6.2513 - acc: 0.3244 
8000/9534 [========================>.....] - ETA: 0s - loss: 6.1465 - acc: 0.3340 
9000/9534 [===========================>..] - ETA: 0s - loss: 6.0439 - acc: 0.3411 
9534/9534 [==============================] - 1s - loss: 5.9900 - acc: 0.3442 - val_loss: 4.8716 - val_acc: 0.4377 
Epoch 2/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 4.8370 - acc: 0.4340 
2000/9534 [=====>........................] - ETA: 0s - loss: 4.7593 - acc: 0.4415 
3000/9534 [========>.....................] - ETA: 0s - loss: 4.6923 - acc: 0.4423 
4000/9534 [===========>..................] - ETA: 0s - loss: 4.6176 - acc: 0.4557 
5000/9534 [==============>...............] - ETA: 0s - loss: 4.5517 - acc: 0.4642 
6000/9534 [=================>............] - ETA: 0s - loss: 4.4809 - acc: 0.4703 
7000/9534 [=====================>........] - ETA: 0s - loss: 4.4036 - acc: 0.4804 
8000/9534 [========================>.....] - ETA: 0s - loss: 4.3364 - acc: 0.4821 
9000/9534 [===========================>..] - ETA: 0s - loss: 4.2652 - acc: 0.4901 
9534/9534 [==============================] - 1s - loss: 4.2316 - acc: 0.4928 - val_loss: 3.5151 - val_acc: 0.5179 
Epoch 3/100 
1000/9534 [==>...........................] - ETA: 1s - loss: 3.4892 - acc: 0.5370 
2000/9534 [=====>........................] - ETA: 1s - loss: 3.4573 - acc: 0.5395 
3000/9534 [========>.....................] - ETA: 0s - loss: 3.4006 - acc: 0.5450 
4000/9534 [===========>..................] - ETA: 0s - loss: 3.3430 - acc: 0.5435 
5000/9534 [==============>...............] - ETA: 0s - loss: 3.2929 - acc: 0.5448 
6000/9534 [=================>............] - ETA: 0s - loss: 3.2414 - acc: 0.5448 
7000/9534 [=====================>........] - ETA: 0s - loss: 3.1959 - acc: 0.5446 
8000/9534 [========================>.....] - ETA: 0s - loss: 3.1489 - acc: 0.5485 
9000/9534 [===========================>..] - ETA: 0s - loss: 3.1021 - acc: 0.5501 
9534/9534 [==============================] - 1s - loss: 3.0832 - acc: 0.5481 - val_loss: 2.6184 - val_acc: 0.5349 
Epoch 4/100 
1000/9534 [==>...........................] - ETA: 1s - loss: 2.5950 - acc: 0.5640 
2000/9534 [=====>........................] - ETA: 1s - loss: 2.5570 - acc: 0.5705 
3000/9534 [========>.....................] - ETA: 0s - loss: 2.5197 - acc: 0.5743 
4000/9534 [===========>..................] - ETA: 0s - loss: 2.4929 - acc: 0.5650 
5000/9534 [==============>...............] - ETA: 0s - loss: 2.4703 - acc: 0.5646 
6000/9534 [=================>............] - ETA: 0s - loss: 2.4388 - acc: 0.5648 
7000/9534 [=====================>........] - ETA: 0s - loss: 2.4054 - acc: 0.5680 
8000/9534 [========================>.....] - ETA: 0s - loss: 2.3798 - acc: 0.5649 
9000/9534 [===========================>..] - ETA: 0s - loss: 2.3522 - acc: 0.5662 
9534/9534 [==============================] - 1s - loss: 2.3342 - acc: 0.5685 - val_loss: 2.0442 - val_acc: 0.5491 
Epoch 5/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 2.0090 - acc: 0.5830 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.9990 - acc: 0.5865 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.9812 - acc: 0.5833 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.9558 - acc: 0.5835 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.9377 - acc: 0.5832 
6000/9534 [=================>............] - ETA: 0s - loss: 1.9173 - acc: 0.5832 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.8968 - acc: 0.5850 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.8759 - acc: 0.5851 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.8582 - acc: 0.5846 
9534/9534 [==============================] - 1s - loss: 1.8501 - acc: 0.5834 - val_loss: 1.6868 - val_acc: 0.5500 
Epoch 6/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.6716 - acc: 0.5790 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.6387 - acc: 0.5910 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.6163 - acc: 0.5910 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.6130 - acc: 0.5882 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.5982 - acc: 0.5890 
6000/9534 [=================>............] - ETA: 0s - loss: 1.5861 - acc: 0.5892 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.5724 - acc: 0.5914 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.5578 - acc: 0.5922 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.5492 - acc: 0.5904 
9534/9534 [==============================] - 0s - loss: 1.5468 - acc: 0.5893 - val_loss: 1.4677 - val_acc: 0.5585 
Epoch 7/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.4380 - acc: 0.5790 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.4332 - acc: 0.5900 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.4208 - acc: 0.5957 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.4073 - acc: 0.5985 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.4027 - acc: 0.5960 
6000/9534 [=================>............] - ETA: 0s - loss: 1.3922 - acc: 0.5950 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.3842 - acc: 0.5951 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.3729 - acc: 0.5988 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.3611 - acc: 0.6012 
9534/9534 [==============================] - 1s - loss: 1.3588 - acc: 0.6015 - val_loss: 1.3387 - val_acc: 0.5717 
Epoch 8/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.3429 - acc: 0.5750 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.3071 - acc: 0.5980 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.2915 - acc: 0.6007 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.2834 - acc: 0.5977 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.2791 - acc: 0.6008 
6000/9534 [=================>............] - ETA: 0s - loss: 1.2636 - acc: 0.6043 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.2521 - acc: 0.6049 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.2495 - acc: 0.6041 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.2506 - acc: 0.6031 
9534/9534 [==============================] - 1s - loss: 1.2491 - acc: 0.6022 - val_loss: 1.2617 - val_acc: 0.5698 
Epoch 9/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.1627 - acc: 0.6240 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.1709 - acc: 0.6235 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.2001 - acc: 0.6127 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.2000 - acc: 0.6098 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.2002 - acc: 0.6096 
6000/9534 [=================>............] - ETA: 0s - loss: 1.1969 - acc: 0.6085 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.1894 - acc: 0.6117 
9534/9534 [==============================] - 1s - loss: 1.1793 - acc: 0.6094 - val_loss: 1.2151 - val_acc: 0.5679 
Epoch 10/100 
1000/9534 [==>...........................] - ETA: 1s - loss: 1.1436 - acc: 0.6190 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.1369 - acc: 0.6260 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.1366 - acc: 0.6207 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.1293 - acc: 0.6210 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.1276 - acc: 0.6232 
6000/9534 [=================>............] - ETA: 0s - loss: 1.1289 - acc: 0.6217 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.1321 - acc: 0.6180 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.1352 - acc: 0.6150 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.1341 - acc: 0.6141 
9534/9534 [==============================] - 0s - loss: 1.1349 - acc: 0.6129 - val_loss: 1.1946 - val_acc: 0.5632 
Epoch 11/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.1684 - acc: 0.5930 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.1338 - acc: 0.6075 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.1177 - acc: 0.6140 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.1293 - acc: 0.6075 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.1235 - acc: 0.6154 
6000/9534 [=================>............] - ETA: 0s - loss: 1.1188 - acc: 0.6173 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.1147 - acc: 0.6179 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.1068 - acc: 0.6196 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.1090 - acc: 0.6190 
9534/9534 [==============================] - 0s - loss: 1.1092 - acc: 0.6177 - val_loss: 1.1788 - val_acc: 0.5689 
Epoch 12/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0702 - acc: 0.6280 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0742 - acc: 0.6280 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0821 - acc: 0.6237 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0868 - acc: 0.6233 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0807 - acc: 0.6258 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0884 - acc: 0.6208 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0905 - acc: 0.6187 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0895 - acc: 0.6205 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0899 - acc: 0.6200 
9534/9534 [==============================] - 1s - loss: 1.0900 - acc: 0.6205 - val_loss: 1.1598 - val_acc: 0.5830 
Epoch 13/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0730 - acc: 0.6340 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0649 - acc: 0.6445 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0600 - acc: 0.6430 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0718 - acc: 0.6350 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0821 - acc: 0.6280 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0779 - acc: 0.6295 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0713 - acc: 0.6316 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0737 - acc: 0.6289 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0767 - acc: 0.6261 
9534/9534 [==============================] - 1s - loss: 1.0752 - acc: 0.6259 - val_loss: 1.1589 - val_acc: 0.5642 
Epoch 14/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0148 - acc: 0.6520 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0395 - acc: 0.6430 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0503 - acc: 0.6377 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0521 - acc: 0.6382 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0529 - acc: 0.6388 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0519 - acc: 0.6392 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0561 - acc: 0.6359 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0547 - acc: 0.6332 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0591 - acc: 0.6313 
9534/9534 [==============================] - 0s - loss: 1.0606 - acc: 0.6301 - val_loss: 1.1577 - val_acc: 0.5774 
Epoch 15/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0513 - acc: 0.6410 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0635 - acc: 0.6245 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0500 - acc: 0.6280 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0530 - acc: 0.6257 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0585 - acc: 0.6232 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0562 - acc: 0.6233 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0507 - acc: 0.6267 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0540 - acc: 0.6267 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0513 - acc: 0.6286 
9534/9534 [==============================] - 0s - loss: 1.0492 - acc: 0.6290 - val_loss: 1.1608 - val_acc: 0.5802 
Epoch 16/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0553 - acc: 0.6300 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0582 - acc: 0.6305 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0341 - acc: 0.6407 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0312 - acc: 0.6398 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0454 - acc: 0.6324 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0438 - acc: 0.6332 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0445 - acc: 0.6323 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0426 - acc: 0.6331 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0439 - acc: 0.6323 
9534/9534 [==============================] - 0s - loss: 1.0427 - acc: 0.6323 - val_loss: 1.1544 - val_acc: 0.5764 
Epoch 17/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0633 - acc: 0.6190 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0407 - acc: 0.6300 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0417 - acc: 0.6343 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0322 - acc: 0.6402 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0283 - acc: 0.6426 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0355 - acc: 0.6400 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0361 - acc: 0.6413 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0336 - acc: 0.6392 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0309 - acc: 0.6394 
9534/9534 [==============================] - 0s - loss: 1.0342 - acc: 0.6382 - val_loss: 1.1575 - val_acc: 0.5755 
Epoch 18/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0289 - acc: 0.6510 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0233 - acc: 0.6505 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0176 - acc: 0.6507 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0194 - acc: 0.6500 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0242 - acc: 0.6442 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0239 - acc: 0.6423 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0249 - acc: 0.6413 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0264 - acc: 0.6404 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0277 - acc: 0.6406 
9534/9534 [==============================] - 0s - loss: 1.0299 - acc: 0.6389 - val_loss: 1.1597 - val_acc: 0.5708 
Epoch 19/100 
1000/9534 [==>...........................] - ETA: 0s - loss: 1.0271 - acc: 0.6420 
2000/9534 [=====>........................] - ETA: 0s - loss: 1.0114 - acc: 0.6445 
3000/9534 [========>.....................] - ETA: 0s - loss: 1.0046 - acc: 0.6510 
4000/9534 [===========>..................] - ETA: 0s - loss: 1.0137 - acc: 0.6453 
5000/9534 [==============>...............] - ETA: 0s - loss: 1.0074 - acc: 0.6492 
6000/9534 [=================>............] - ETA: 0s - loss: 1.0112 - acc: 0.6490 
7000/9534 [=====================>........] - ETA: 0s - loss: 1.0072 - acc: 0.6504 
8000/9534 [========================>.....] - ETA: 0s - loss: 1.0093 - acc: 0.6496 
9000/9534 [===========================>..] - ETA: 0s - loss: 1.0137 - acc: 0.6452 
9534/9534 [==============================] - 0s - loss: 1.0159 - acc: 0.6451 - val_loss: 1.1603 - val_acc: 0.5651 
+0

Can you post a sample of the dataset? Or a graphical representation. – michetonu

+0

Thank you for your reply, michetonu. In machine learning I am always confused about how to find the upper-bound performance achievable on a training dataset. By the way, how can I pass you the data? – iloveml

+0

First of all, what does one row look like? Is it 507 binary elements? There is no easy way to find the maximum achievable performance for a given dataset. If the data are all binary, there is not much to do in terms of preprocessing. Are you shuffling the data before feeding it to the net? If so, how? Is there any chance you are mixing up the labels? – michetonu

Answers

0

How much accuracy any model can achieve depends strongly on the dataset and on how accurate the labeling is.

Make predictions with your trained model and build a confusion matrix. Look at concrete examples of the false-positive and false-negative predictions. Are these predictions really wrong, or is the model actually predicting more accurately than the labels? This has happened to me several times in projects.
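For example, a confusion matrix could be built roughly like this (a sketch assuming one-hot encoded y_test, as in the question's setup):

import numpy as np
from sklearn.metrics import confusion_matrix

# predict class probabilities, then take the argmax to get class ids
y_pred = np.argmax(model.predict(X_test), axis=1)
y_true = np.argmax(y_test, axis=1)   # undo the one-hot encoding
print(confusion_matrix(y_true, y_pred))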

I recommend training your model until it overfits. From what I can see, your model is either still learning or just approaching overfitting. Keep adding epochs until the validation loss and accuracy actually get worse. Then apply regularization as needed and experiment with the results, e.g. dropout, starting from 0.1 or 0.2 and going up to at most 0.5.
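A rough sketch of what adding dropout between the hidden layers could look like for the model in the question (the 0.2 rate is just a starting point, as suggested above):

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(500, activation='relu', input_dim=507))
model.add(Dropout(0.2))   # start small (0.1-0.2), raise towards 0.5 only if still overfitting
model.add(Dense(100, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(7, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])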

Set up TensorBoard so you can track the gap between training accuracy and validation accuracy.
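Setting up the TensorBoard callback could look roughly like this (the log directory name is arbitrary; the fit call reuses the variables from the cross-validation loop in the question):

from keras.callbacks import TensorBoard

tensorboard = TensorBoard(log_dir='./logs')   # then run: tensorboard --logdir=./logs
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=(X_test, y_test), callbacks=[tensorboard])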

You have many categorical variables. Have you avoided the dummy variable trap? You need to make sure the number of dummy variables is one less than the number of categories: http://www.algosome.com/articles/dummy-variable-trap-regression.html
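With pandas, for instance, the trap can be avoided by dropping one dummy column per categorical variable (a sketch; the DataFrame df and the column name 'some_category' are made up for illustration):

import pandas as pd

# drop_first=True keeps k-1 dummy columns for a variable with k categories
dummies = pd.get_dummies(df['some_category'], prefix='cat', drop_first=True)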

0

Thank you for your reply, petezurich. I couldn't agree with you more. You are very kind.

I am now trying another, multi-label classification project.

The data consists of 503 binary features and 64 binary labels. I use sigmoid in the output layer, binary_crossentropy as the loss function, and l2 regularization on the first hidden layer. The hidden layer structure is 500*100, so the whole network is roughly 503*500*100*64. I posted my data here: https://ufile.io/f4tvf
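A sketch of the multi-label model described above (503 inputs, hidden layers of 500 and 100 units, 64 sigmoid outputs; exact hyperparameters other than those stated are assumptions):

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers

model = Sequential()
model.add(Dense(500, activation='relu', input_dim=503,
                kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(100, activation='relu'))
# one independent sigmoid per label for multi-label classification
model.add(Dense(64, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])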

When I ran 10-fold cross-validation, I got the following ranking-based performance scores:

coverage error: 10.485887, 
ranking average precision: 0.766574, 
ranking loss: 0.045134. 

If I set a threshold of 0.5 for each label, I get the following label-based metrics:

zero one loss: 0.848790, 
hamming loss: 0.037109, 
macro precision: 0.426696, 
micro precision: 0.672705, 
macro recall: 0.371033, 
micro recall: 0.636571, 
macro f1: 0.383845, 
micro f1: 0.654140. 
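For reference, metrics like these can be computed with scikit-learn roughly as follows (a sketch assuming the multi-label model above and a held-out X_test / y_test with binary label indicators):

import numpy as np
from sklearn.metrics import (coverage_error, label_ranking_average_precision_score,
                             label_ranking_loss, zero_one_loss, hamming_loss,
                             precision_score, recall_score, f1_score)

y_score = model.predict(X_test)           # per-label probabilities
y_pred = (y_score >= 0.5).astype(int)     # threshold each label at 0.5

print(coverage_error(y_test, y_score))
print(label_ranking_average_precision_score(y_test, y_score))
print(label_ranking_loss(y_test, y_score))
print(zero_one_loss(y_test, y_pred))
print(hamming_loss(y_test, y_pred))
print(precision_score(y_test, y_pred, average='macro'), precision_score(y_test, y_pred, average='micro'))
print(recall_score(y_test, y_pred, average='macro'), recall_score(y_test, y_pred, average='micro'))
print(f1_score(y_test, y_pred, average='macro'), f1_score(y_test, y_pred, average='micro'))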

Are these results good? Is it worth continuing to train the model?
