Cross Entropy

This document defines two PyTorch loss modules: CrossEntropyLabelSmooth and SoftEntropy. CrossEntropyLabelSmooth implements cross entropy loss with label smoothing, where the one-hot target labels are replaced by a weighted average of the true label and a uniform distribution over all classes. SoftEntropy computes cross entropy against the softmax probabilities of the target logits rather than one-hot labels. Both take the negative log-likelihood under the model's softmax output and return the loss averaged over the batch.


import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossEntropyLabelSmooth(nn.Module):
    """Cross entropy loss with a label smoothing regularizer.

    Reference:
    Szegedy et al. Rethinking the Inception Architecture for Computer Vision. CVPR 2016.

    Equation: y = (1 - epsilon) * y + epsilon / K.

    Args:
        num_classes (int): number of classes.
        epsilon (float): smoothing weight.
    """

    def __init__(self, num_classes, epsilon=0.1):
        super(CrossEntropyLabelSmooth, self).__init__()
        self.num_classes = num_classes
        self.epsilon = epsilon
        # Log-softmax over the class dimension.
        self.logsoftmax = nn.LogSoftmax(dim=1).cuda()

    def forward(self, inputs, targets):
        """
        Args:
            inputs: prediction matrix (before softmax) with shape
                (batch_size, num_classes).
            targets: ground truth labels with shape (batch_size,).
        """
        log_probs = self.logsoftmax(inputs)
        # One-hot encode the ground-truth labels.
        targets = torch.zeros_like(log_probs).scatter_(1, targets.unsqueeze(1), 1)
        # Smooth the one-hot targets toward the uniform distribution.
        targets = (1 - self.epsilon) * targets + self.epsilon / self.num_classes
        loss = (- targets * log_probs).mean(0).sum()
        return loss

class SoftEntropy(nn.Module):
    """Cross entropy loss against soft targets given as logits (e.g. a teacher's outputs)."""

    def __init__(self):
        super(SoftEntropy, self).__init__()
        self.logsoftmax = nn.LogSoftmax(dim=1).cuda()

    def forward(self, inputs, targets):
        log_probs = self.logsoftmax(inputs)
        # Convert the target logits to probabilities and block gradients through them.
        loss = (- F.softmax(targets, dim=1).detach() * log_probs).mean(0).sum()
        return loss
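
Below is a minimal usage sketch. The batch size, class count, and the teacher_logits name are illustrative assumptions rather than part of the original code; since nn.LogSoftmax has no parameters, the .cuda() calls above do not prevent running this on CPU tensors.

if __name__ == '__main__':
    num_classes = 4
    criterion_ce = CrossEntropyLabelSmooth(num_classes=num_classes, epsilon=0.1)
    criterion_soft = SoftEntropy()

    logits = torch.randn(8, num_classes, requires_grad=True)  # model predictions (before softmax)
    labels = torch.randint(0, num_classes, (8,))               # hard ground-truth labels
    teacher_logits = torch.randn(8, num_classes)               # illustrative soft targets, e.g. from a teacher model

    loss = criterion_ce(logits, labels) + criterion_soft(logits, teacher_logits)
    loss.backward()
    print(loss.item())

Note that both losses use .mean(0).sum(): the elementwise product has shape (batch_size, num_classes), so averaging over the batch dimension and then summing over classes equals the per-sample cross entropy averaged over the batch.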
