
L1 loss or SmoothL1Loss? #60

Closed
@jonathan016

Description

Hi, I've been reading through the code and noticed that plain L1 loss is used for the localization loss instead of Smooth L1 loss. This differs from the paper's procedure: as far as I know, SSD uses Smooth L1 loss.

https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Object-Detection/blob/master/model.py#L549

self.smooth_l1 = nn.L1Loss()

https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Object-Detection/blob/master/model.py#L612

loc_loss = self.smooth_l1(predicted_locs[positive_priors], true_locs[positive_priors]) # (), scalar
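For reference, the two losses only differ near zero: Smooth L1 is quadratic for small residuals and linear beyond a threshold (`beta`, default 1.0 in PyTorch), while plain L1 is linear everywhere. A minimal sketch of the difference on toy tensors (not the repo's actual training code):

```python
import torch
import torch.nn as nn

# Toy predictions and targets, purely illustrative.
pred = torch.tensor([0.5, 2.0, -1.5])
target = torch.zeros(3)

l1 = nn.L1Loss()(pred, target)            # mean(|x - y|)
smooth = nn.SmoothL1Loss()(pred, target)  # 0.5*(x-y)^2 if |x-y| < 1, else |x-y| - 0.5

print(l1.item())      # mean of [0.5, 2.0, 1.5]  -> ~1.3333
print(smooth.item())  # mean of [0.125, 1.5, 1.0] -> 0.875
```

Swapping the loss in the tutorial would just mean changing `nn.L1Loss()` to `nn.SmoothL1Loss()` on the line quoted above; whether that reproduces the paper's numbers is exactly the open question here.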


My questions are:

  1. Has anyone tried replacing the loss function with SmoothL1Loss as currently implemented in PyTorch?
  2. If so, are the results similar to what SSD achieves in the paper?

Thank you in advance.
