
Error when computing second derivatives in TensorFlow

I am training a model that requires computing second derivatives (i.e., the gradient of a gradient). It seems that TensorFlow does not currently support second derivatives of softmax cross entropy, and I get the following error:

Traceback (most recent call last): 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 455, in gradients 
    grad_fn = ops.get_gradient_function(op) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1682, in get_gradient_function 
    return _gradient_registry.lookup(op_type) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/registry.py", line 93, in lookup 
    "%s registry has no entry for: %s" % (self._name, name)) 
LookupError: gradient registry has no entry for: PreventGradient 

During handling of the above exception, another exception occurred: 

Traceback (most recent call last): 
    File "tools/train_adda.py", line 215, in <module> 
    main() 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 722, in __call__ 
    return self.main(*args, **kwargs) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 697, in main 
    rv = self.invoke(ctx) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 895, in invoke 
    return ctx.invoke(self.callback, **ctx.params) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 535, in invoke 
    return callback(*args, **kwargs) 
    File "tools/train_adda.py", line 137, in main 
    Jgrads_target = tf.gradients(reg, list(target_vars.values())) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 459, in gradients 
    (op.name, op.type)) 
LookupError: No gradient defined for operation 'gradients_1/sparse_softmax_cross_entropy_loss_1/xentropy/xentropy_grad/PreventGradient' (op type: PreventGradient) 
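
For context, here is a minimal sketch that reproduces this class of error (my own illustration, assuming a TensorFlow 1.x build from around this time; it is not code from the post):

import tensorflow as tf

labels = tf.constant([0, 1])
logits = tf.Variable(tf.zeros([2, 2]))
loss = tf.losses.sparse_softmax_cross_entropy(labels, logits)

# The first derivative is fine.
grad = tf.gradients(loss, [logits])[0]

# The second derivative raises the LookupError: the backward pass hits
# the PreventGradient op inserted by the fused cross-entropy gradient.
hessian_sum = tf.gradients(tf.reduce_sum(grad), [logits])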

Here is a short excerpt of the code that produces the error:

mapping_loss = tf.losses.sparse_softmax_cross_entropy(
    1 - adversary_label, adversary_logits)
adversary_loss = tf.losses.sparse_softmax_cross_entropy(
    adversary_label, adversary_logits)

''' # The same error occurs with tf.nn.softmax_cross_entropy_with_logits:

mapping_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(1 - adversary_label, 2), logits=adversary_logits, name='loss1'))
adversary_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(adversary_label, 2), logits=adversary_logits, name='loss2'))
'''

# First derivatives of each loss w.r.t. its variable set.
grads_target = tf.gradients(mapping_loss, list(target_vars.values()))
grads_adv = tf.gradients(adversary_loss, list(adversary_vars.values()))

grads_all = grads_target + grads_adv

# Gradient-penalty regularizer: differentiating reg requires second
# derivatives of the losses above, which is where the error is raised.
reg = 0.5 * sum(tf.reduce_sum(tf.square(g)) for g in grads_all)
Jgrads_target = tf.gradients(reg, list(target_vars.values()))
Jgrads_adv = tf.gradients(reg, list(adversary_vars.values()))

Answer

The second derivative is blocked deliberately: the gradient that TensorFlow registers for the fused sparse softmax cross entropy op wraps its output in a PreventGradient op, which is exactly the op named in the LookupError above. You can see this here: https://github.com/tensorflow/tensorflow/blob/c2ce4f68c744e6d328746b144ff1fcf98ac99e6c/tensorflow/python/ops/nn_grad.py#L449
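
A common workaround (my sketch, not taken from the original answer) is to build the loss from primitive ops such as tf.nn.log_softmax, which all have registered gradients, so tf.gradients can be applied twice. This assumes, as in the excerpt above, that adversary_label holds class indices in {0, 1} and adversary_logits has shape [batch, 2]:

# Unfused cross entropy: every op here is twice differentiable.
log_probs = tf.nn.log_softmax(adversary_logits)

mapping_loss = -tf.reduce_mean(tf.reduce_sum(
    tf.one_hot(1 - adversary_label, 2) * log_probs, axis=1))
adversary_loss = -tf.reduce_mean(tf.reduce_sum(
    tf.one_hot(adversary_label, 2) * log_probs, axis=1))

With these definitions, tf.gradients(reg, list(target_vars.values())) no longer encounters a PreventGradient op.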
