Fix documentation on softmax_cross_entropy_with_logits_v2 by mbrio · Pull Request #21517 · tensorflow/tensorflow
import tensorflow as tf
import numpy as np

logits = np.array([
    [5.0, 2.3, 0.0, -3.2, -0.8],
    [2.3, 0.1, 1.8, 1.1, 1.2],
    [-1.2, 1.1, 0.1, 2.5, 0.7]
])
labels = np.array([
    [1., 1., 0., 0., 0.],
    [0., 0., 1., 0., 1.],
    [0., 0., 0., 1., 0.]
])

x_entropy = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(tf.shape(labels)))     # [3, 5]
    print(sess.run(tf.shape(x_entropy)))  # [3]
In this example, the shape of x_entropy is [3], i.e. [batch_size]; it does not equal the shape of labels as the original documentation stated, because the cross entropy reduces over the class dimension and leaves one scalar loss per example.
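For reference, the same reduction can be sketched in plain NumPy (no TensorFlow session required) to show why the class dimension disappears. This is a minimal re-derivation of the op's math, not TensorFlow's actual implementation:

```python
import numpy as np

logits = np.array([
    [5.0, 2.3, 0.0, -3.2, -0.8],
    [2.3, 0.1, 1.8, 1.1, 1.2],
    [-1.2, 1.1, 0.1, 2.5, 0.7],
])
labels = np.array([
    [1., 1., 0., 0., 0.],
    [0., 0., 1., 0., 1.],
    [0., 0., 0., 1., 0.],
])

# Numerically stable log-softmax along the class axis.
shifted = logits - logits.max(axis=-1, keepdims=True)
log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))

# The cross entropy sums -labels * log_softmax over the class
# dimension, leaving one scalar per example: shape (3,), i.e. [batch_size].
x_entropy = -(labels * log_softmax).sum(axis=-1)
print(x_entropy.shape)  # (3,)
```

The `sum(axis=-1)` is the step that collapses the [batch_size, num_classes] input down to a [batch_size] result, which is what the corrected documentation should describe.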