How to optimize F1 score or AUC in TensorFlow

global_step = tf.Variable(0, name='global_step', trainable=False)
lr_decayed = tf.train.cosine_decay_restarts(learning_rate, global_step, first_decay_steps, m_mul=0.9)

#cost = tf.reduce_max(tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=model.fc3),axis=0)
auc, aucop = tf.metrics.auc(labels=y, predictions=model.probs)
cost = tf.add(tf.multiply(auc, -1), 1)
optimizer = tf.train.GradientDescentOptimizer(lr_decayed).minimize(cost)

But when it runs to the last line (the optimizer), it crashes. It works fine when I use the commented-out cost instead. Here is the error that is raised:

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients
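For context: `tf.metrics.auc` computes the AUC from streaming confusion-matrix counts that are updated with assignment ops, so its value tensor has no gradient path back to the model weights, which is why the optimizer finds no gradients. A common differentiable stand-in (not used in the snippet above) is a pairwise ranking loss that smoothly approximates 1 − AUC. A minimal NumPy sketch of the idea, assuming binary 0/1 labels:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_auc_loss(scores, labels):
    """Pairwise logistic surrogate for 1 - AUC.

    For every (positive, negative) pair, penalize the negative
    example being scored above the positive one. The sigmoid makes
    the pairwise comparison smooth (hence differentiable), unlike
    the thresholded counts inside tf.metrics.auc.
    """
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # All pairwise margins: positive score minus negative score.
    margins = pos[:, None] - neg[None, :]
    return float(np.mean(sigmoid(-margins)))

# A well-ranked batch (positives scored above negatives) gives a low loss;
# flipping the scores ranks every pair wrongly and gives a high loss.
scores = np.array([2.0, 1.5, -1.0, -2.0])
labels = np.array([1, 1, 0, 0])
good = soft_auc_loss(scores, labels)
bad = soft_auc_loss(-scores, labels)
```

The same pairwise construction can be written with TensorFlow ops (e.g. `tf.sigmoid` on the margin matrix), giving a cost the optimizer can actually differentiate.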

1 Answer

Try changing your `GradientDescentOptimizer()` parameter to something static like 0.05; this should work.

Bye.
