Commit 57b8a6a

Update losses.py
Replace softmax_cross_entropy_with_logits with binary_crossentropy
1 parent 4733603 commit 57b8a6a

1 file changed

Lines changed: 2 additions & 2 deletions

File tree

  • TensorFlow2/Segmentation/UNet_Medical/utils

TensorFlow2/Segmentation/UNet_Medical/utils/losses.py

@@ -32,8 +32,8 @@ def partial_losses(predict, target):
     flat_labels = tf.reshape(target,
                              [tf.shape(input=predict)[0], -1, n_classes])
 
-    crossentropy_loss = tf.reduce_mean(input_tensor=tf.nn.softmax_cross_entropy_with_logits(logits=flat_logits,
-                                                                                            labels=flat_labels),
+    crossentropy_loss = tf.reduce_mean(input_tensor=tf.keras.backend.binary_crossentropy(output=flat_logits,
+                                                                                         target=flat_labels),
                                        name='cross_loss_ref')
     dice_loss = tf.reduce_mean(input_tensor=1 - dice_coef(flat_logits, flat_labels), name='dice_loss_ref')
     return crossentropy_loss, dice_loss
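For reference, `tf.keras.backend.binary_crossentropy` defaults to `from_logits=False`, i.e. it treats `output` as probabilities rather than raw logits. A minimal NumPy sketch of that per-element computation (the helper name below is illustrative, not part of the repo):

```python
import numpy as np

def binary_crossentropy(target, output, eps=1e-7):
    # Elementwise BCE on probabilities, mirroring the from_logits=False
    # default of tf.keras.backend.binary_crossentropy. Probabilities are
    # clipped away from 0 and 1 to keep the logs finite.
    output = np.clip(output, eps, 1.0 - eps)
    return -(target * np.log(output) + (1.0 - target) * np.log(1.0 - output))

target = np.array([1.0, 0.0, 1.0])
output = np.array([0.9, 0.1, 0.8])
loss = binary_crossentropy(target, output).mean()  # scalar mean BCE
```

Note that `tf.nn.softmax_cross_entropy_with_logits` expects raw logits, while `binary_crossentropy` with its default expects probabilities, so whether `flat_logits` should pass through a sigmoid first depends on what the model's final layer emits.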

0 commit comments

Comments
 (0)