Description
Hi, I'm working on Ivy (https://github.com/unifyai/ivy), where we wrap various ML frameworks and try to align them with the Array API standard. The first problem I'm having is slight numerical discrepancies. For example, when testing the TensorFlow backend I got this error:
x = ivy.array([[2.5418657e+12, 2.5418657e+12, 2.5418657e+12],
[2.5418657e+12, 2.5418657e+12, 2.5418657e+12],
[2.5418657e+12, 2.5418657e+12, 2.5418657e+12]], dtype=float32)
y = ivy.array([[2.541866e+12, 2.541866e+12, 2.541866e+12],
[2.541866e+12, 2.541866e+12, 2.541866e+12],
[2.541866e+12, 2.541866e+12, 2.541866e+12]], dtype=float32)
> assert all(exactly_equal(x, y)), "The input arrays have different values"
E AssertionError: The input arrays have different values
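For reference, those two scalars really are distinct float32 values, but they differ by only an ulp or two at that magnitude, so this looks like ordinary rounding rather than a wrong result. A small sketch of the comparison (using NumPy in place of ivy arrays):

```python
import numpy as np

# The two values from the failing TensorFlow case.
a = np.float32(2.5418657e12)
b = np.float32(2.541866e12)

# They are genuinely different float32 values, so a bit-exact check fails...
print(a == b)  # False

# ...but the gap is within a couple of float32 ulps at this magnitude...
print(abs(float(a) - float(b)) <= 2 * float(np.spacing(a)))  # True

# ...so a tolerance-based check treats them as equal.
print(np.isclose(a, b))  # True
```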
With the PyTorch backend, I got this error:
x = ivy.array([[0., 0.],
[0., 0.]], dtype=float32)
y = ivy.array([[ 0., 0.],
[ 0., -0.]], dtype=float32)
> assert all(exactly_equal(x, y)), "The input arrays have different values"
E AssertionError: The input arrays have different values
These failures don't appear every time I run the tests, only every few runs. I wonder whether something like numpy.allclose would be better for value checking, as matrix_power seems like a function susceptible to these very small floating-point errors.
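As a sketch of what I mean, numpy.allclose with its default tolerances would accept the TensorFlow result above (again substituting NumPy arrays for ivy arrays):

```python
import numpy as np

# The two matrices from the failing TensorFlow case.
x = np.full((3, 3), 2.5418657e12, dtype=np.float32)
y = np.full((3, 3), 2.541866e12, dtype=np.float32)

# Exact comparison rejects the pair...
print(np.array_equal(x, y))  # False

# ...while allclose with its default tolerances (rtol=1e-5) accepts it.
print(np.allclose(x, y))  # True
```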
Also, when testing against the JAX backend I get a failure caused by the following falsifying example:
Falsifying example: test_matrix_power(
x=ivy.array([[[1., 1.],
[1., 1.]]], dtype=float32), n=-1,
)
which is strange, given that here x isn't invertible and n is negative, a combination that test_matrix_power says shouldn't occur.
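For comparison, NumPy refuses this input outright: matrix_power with a negative exponent inverts the matrix first, and inverting the singular all-ones matrix raises LinAlgError. My understanding (an assumption worth checking) is that JAX's linear-algebra routines don't raise on singular inputs and instead produce inf/nan, which may be why this example slips through:

```python
import numpy as np

x = np.ones((2, 2), dtype=np.float32)  # singular: rank 1

# A negative exponent requires inverting x, which fails for a
# singular matrix in NumPy.
try:
    np.linalg.matrix_power(x, -1)
except np.linalg.LinAlgError as e:
    print("NumPy raises:", e)
```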