I want to compute the SVM loss without loops, but I can't get it right and could use some enlightenment. How do I vectorize the SVM loss? Here is my attempt:
import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    loss = 0.0
    scores = np.dot(X, W)
    correct_scores = scores[y]
    deltas = np.ones(scores.shape)
    margins = scores - correct_scores + deltas
    margins[margins < 0] = 0  # max -> Boolean array indexing
    margins[np.arange(scores.shape[0]), y] = 0  # Don't count j = y_i
    loss = np.sum(margins)
    # Average
    num_train = X.shape[0]
    loss /= num_train
    # Regularization
    loss += 0.5 * reg * np.sum(W * W)
    return loss
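For context, here is a minimal sketch of the two NumPy indexing patterns involved (toy values, purely for illustration): with y holding class indices, scores[y] selects whole rows of scores, while scores[np.arange(N), y] selects one correct-class score per example.

import numpy as np

scores = np.arange(12).reshape(3, 4)     # toy scores: 3 examples, 4 classes
y = np.array([2, 0, 3])                  # toy labels

print(scores[y])                         # shape (3, 4): rows 2, 0, 3 of scores
print(scores[np.arange(3), y])           # shape (3,): one true-class score per example
print(scores[np.arange(3), y][:, None])  # shape (3, 1): column vector that broadcasts against scores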
It should output the same loss as the following function:
def svm_loss_naive(W, X, y, reg):
    num_classes = W.shape[1]
    num_train = X.shape[0]
    loss = 0.0
    for i in range(num_train):
        scores = X[i].dot(W)
        correct_class_score = scores[y[i]]
        for j in range(num_classes):
            if j == y[i]:
                continue
            margin = scores[j] - correct_class_score + 1  # note delta = 1
            if margin > 0:
                loss += margin
    loss /= num_train  # mean
    loss += 0.5 * reg * np.sum(W * W)  # l2 regularization
    return loss
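Spelled out, the loss both functions are meant to compute, for N training examples with delta = 1 and the 0.5 * reg L2 term used in the code, is:

L = \frac{1}{N}\sum_{i=1}^{N}\sum_{j \neq y_i}\max\left(0,\; s_{ij} - s_{i,y_i} + 1\right) + \frac{\mathrm{reg}}{2}\sum_{k,l} W_{kl}^{2}, \qquad s_i = x_i W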
What are the shapes of the inputs? – Divakar
W.shape = (3073, 10), X.shape = (500, 3073), y.shape = (500,) – WeiJay
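Given those shapes, here is a sketch of a fully vectorized version along the same lines, assuming X is (N, 3073), W is (3073, 10) and y is (N,) as above; the key change is pulling out one correct-class score per row and reshaping it so it broadcasts against scores (worth verifying against svm_loss_naive on a small random batch):

import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    num_train = X.shape[0]
    scores = X.dot(W)                                  # (N, C) class scores
    correct_scores = scores[np.arange(num_train), y]   # (N,) true-class score per example
    margins = scores - correct_scores[:, None] + 1.0   # broadcast: s_j - s_{y_i} + delta
    margins[margins < 0] = 0                           # hinge: max(0, .)
    margins[np.arange(num_train), y] = 0               # don't count j == y_i
    loss = np.sum(margins) / num_train                 # mean over training examples
    loss += 0.5 * reg * np.sum(W * W)                  # L2 regularization, matching the naive version
    return loss

On random inputs with the shapes above, this should agree with svm_loss_naive up to floating-point rounding.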