Jing Zhang jizhang02
@wassname
wassname / jaccard_coef_loss.py
Last active January 30, 2024 15:45
jaccard_coef_loss for keras. This loss is useful when you have unbalanced classes within a sample, such as segmenting each pixel of an image. For example, you are trying to predict if each pixel is cat, dog, or background. You may have 80% background, 10% dog, and 10% cat. Should a model that predicts 100% background be 80% right, or 30%? Categor…
from keras import backend as K

def jaccard_distance_loss(y_true, y_pred, smooth=100):
    """
    Jaccard = (|X & Y|)/ (|X|+ |Y| - |X & Y|)
            = sum(|A*B|)/(sum(|A|)+sum(|B|)-sum(|A*B|))
    The Jaccard distance loss is useful for unbalanced datasets. This has been
    shifted so it converges on 0 and is smoothed to avoid exploding or disappearing
    gradients.
    """
    # Smoothed Jaccard index, implementing the formula from the docstring.
    intersection = K.sum(K.abs(y_true * y_pred), axis=-1)
    sum_ = K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1)
    jac = (intersection + smooth) / (sum_ - intersection + smooth)
    # Shift and rescale so a perfect prediction gives a loss of 0.
    return (1 - jac) * smooth
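A quick sanity check, not part of the original gist and using made-up toy values: a near-perfect prediction should give a loss close to 0, and the function can be passed straight to model.compile like any other Keras loss.

import numpy as np
from keras import backend as K

# Toy ground truth / prediction for a single sample with three "pixels".
y_true = K.constant(np.array([[1., 0., 1.]]))
y_pred = K.constant(np.array([[0.9, 0.1, 0.8]]))
print(K.eval(jaccard_distance_loss(y_true, y_pred)))  # small value, close to 0

# Hypothetical training usage, assuming `model` is an already-built Keras segmentation model:
# model.compile(optimizer='adam', loss=jaccard_distance_loss, metrics=['accuracy'])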
@ravnoor
ravnoor / Keras.ipynb
Created April 24, 2017 02:38 — forked from prhbrt/Keras.ipynb
V-Net in Keras and tensorflow
@wassname
wassname / dice_loss_for_keras.py
Created September 26, 2016 08:32
dice_loss_for_keras
"""
Here is a dice loss for keras which is smoothed to approximate a linear (L1) loss.
It ranges from 1 to 0 (no error), and returns results similar to binary crossentropy.
"""
# define custom loss and metric functions
from keras import backend as K
def dice_coef(y_true, y_pred, smooth=1):
    # Smoothed Dice coefficient: 2*|X & Y| / (|X| + |Y|), with `smooth` added to
    # numerator and denominator to avoid division by zero on empty masks.
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)
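A minimal sketch of how the coefficient is typically wrapped as a loss; the dice_coef_loss name and body below are an assumption, chosen so that a perfect overlap gives a loss of 0, matching the "ranges from 1 to 0 (no error)" note above.

def dice_coef_loss(y_true, y_pred):
    # Assumed wrapper: 1 - Dice coefficient, so a perfect prediction scores 0.
    return 1 - dice_coef(y_true, y_pred)

# Hypothetical usage, assuming `model` is an already-built Keras model:
# model.compile(optimizer='adam', loss=dice_coef_loss, metrics=[dice_coef])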