Commit b5eb33b

nikhilaravi authored and facebook-github-bot committed

bugfix
Summary: It seemed that even though the chamfer diff was rebased on top of the knn autograd diff, some of the final updates did not get applied. I'm really surprised that the sandcastle tests did not fail and prevent the diff from landing.

Reviewed By: gkioxari

Differential Revision: D21066156

fbshipit-source-id: 5216efe95180c1b6082d0bac404fa1920cfb7b02
1 parent b530b0a commit b5eb33b

File tree

1 file changed: +6 −6 lines


pytorch3d/loss/chamfer.py

Lines changed: 6 additions & 6 deletions
```diff
@@ -145,11 +145,11 @@ def chamfer_distance(
     cham_norm_x = x.new_zeros(())
     cham_norm_y = x.new_zeros(())
 
-    x_dists, x_idx = knn_points(x, y, lengths1=x_lengths, lengths2=y_lengths, K=1)
-    y_dists, y_idx = knn_points(y, x, lengths1=y_lengths, lengths2=x_lengths, K=1)
+    x_nn = knn_points(x, y, lengths1=x_lengths, lengths2=y_lengths, K=1)
+    y_nn = knn_points(y, x, lengths1=y_lengths, lengths2=x_lengths, K=1)
 
-    cham_x = x_dists[..., 0]  # (N, P1)
-    cham_y = y_dists[..., 0]  # (N, P2)
+    cham_x = x_nn.dists[..., 0]  # (N, P1)
+    cham_y = y_nn.dists[..., 0]  # (N, P2)
 
     if is_x_heterogeneous:
         cham_x[x_mask] = 0.0
@@ -162,8 +162,8 @@ def chamfer_distance(
 
     if return_normals:
         # Gather the normals using the indices and keep only value for k=0
-        x_normals_near = knn_gather(y_normals, x_idx, y_lengths)[..., 0, :]
-        y_normals_near = knn_gather(x_normals, y_idx, x_lengths)[..., 0, :]
+        x_normals_near = knn_gather(y_normals, x_nn.idx, y_lengths)[..., 0, :]
+        y_normals_near = knn_gather(x_normals, y_nn.idx, x_lengths)[..., 0, :]
 
         cham_norm_x = 1 - torch.abs(
             F.cosine_similarity(x_normals, x_normals_near, dim=2, eps=1e-6)
```
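The change above switches from tuple unpacking to attribute access because, after the knn autograd diff, `knn_points` returns a named-tuple-style result rather than a plain tuple. The shape of that API change can be sketched with a minimal stand-in (no PyTorch3D required); `KNNResult` and `fake_knn_points` here are hypothetical illustrations, not the library's actual classes:

```python
from collections import namedtuple

# Hypothetical stand-in for the result object returned by knn_points:
# fields are accessed as attributes (.dists, .idx) instead of unpacked.
KNNResult = namedtuple("KNNResult", ["dists", "idx"])

def fake_knn_points(p1, p2):
    """Toy K=1 nearest-neighbour search over 1-D point lists (illustration only)."""
    dists, idx = [], []
    for a in p1:
        # Index of the nearest point in p2 by squared distance.
        best_j = min(range(len(p2)), key=lambda j: (a - p2[j]) ** 2)
        idx.append(best_j)
        dists.append((a - p2[best_j]) ** 2)
    return KNNResult(dists=dists, idx=idx)

x_nn = fake_knn_points([0.0, 2.0], [1.9, 0.1])
print(x_nn.idx)    # nearest-neighbour indices, e.g. [1, 0]
print(x_nn.dists)  # squared distances to those neighbours
```

The bug fixed here is that two call sites were left using the old tuple-unpacking form (`x_dists, x_idx = knn_points(...)` and the `x_idx`/`y_idx` names in `knn_gather`) after the rebase, while the rest of the function had moved to the attribute form.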
