CI: Troubleshoot CI #44822
Conversation
Great, lmk when ready.
Looks like npdev changed ndarray.mean s.t.
Wow, yet another weird case :(. Our
But, happy coincidence: apparently the old datetime resolution is broken and just ignores the dtype request... Although I am still surprised that it doesn't just fail in the new code. Well, I guess,
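For context, the reduction under discussion is a `timedelta64` mean. A minimal sketch of that operation, written against current numpy behavior (the dtype-resolution corner cases debated above are not reproduced here):

```python
import numpy as np

# A timedelta64 array; .mean() performs the reduction in the timedelta
# domain and returns a timedelta64 scalar (unlike datetime64, which has
# no meaningful mean).
td = np.array([1, 2, 3], dtype="m8[ns]")
result = td.mean()
```

The thread's question is what happens when a `dtype=` request is also passed to this reduction; older numpy apparently just ignored it.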
@jbrockmendel if you can fix this up
Waiting for a fix in npdev.
Should be fixed; not sure if the nightly is built yet, though.
OK, if it's not fixed, then can you disable the tests?
Will re-push and find out.
Looks like the td64.mean may have gotten fixed, but now we're seeing a bunch of new warnings (guessing division-by-zero related). Will dig into this in a bit.
:(, sorry about this, it's been a confusing can of worms. One thing that could still be changed is that if you provide But I guess this is either something gone awfully wrong, or just a different error?:
Where apparently the type resolution is somehow failing (which confuses me a lot, because I am not sure why this would have been any better with the previous version).
Yeah, in these cases we're currently testing that we get a TypeError, so getting a numpy.core._exceptions._UFuncBinaryResolutionError instead isn't the worst thing in the world; if it can't/won't be changed upstream, then we can try to catch and re-raise.
Ah OK, phew, I was seriously worried it was not an error before... That error should be inheriting from
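For reference, numpy's ufunc type-resolution errors (including the binary-resolution error named above) do inherit from `TypeError`, so an existing `except TypeError` keeps working. A small sketch demonstrating this, using two datetime64 arrays as an operation with no valid type resolution:

```python
import numpy as np

# Adding two datetime64 arrays has no valid loop; numpy raises a ufunc
# type-resolution error, which is a TypeError subclass, so a plain
# `except TypeError` catches it.
a = np.array(["2021-01-01"], dtype="M8[ns]")
caught_typeerror = False
try:
    np.add(a, a)
except TypeError:
    caught_typeerror = True
```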
I think the exception is fine. The failing tests look like a new warning, which I'm assuming is division by zero; that is also pretty reasonable, so we can adjust to handle it on our end.
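If the new warnings really are floating-point division-by-zero warnings, the usual way to handle them on the library side is to silence them locally with `np.errstate`. A sketch, assuming that diagnosis is correct:

```python
import numpy as np

# Float division by zero normally emits a RuntimeWarning; np.errstate
# suppresses it for just this block, without touching global state.
with np.errstate(divide="ignore", invalid="ignore"):
    result = np.array([1.0, 0.0]) / np.array([0.0, 0.0])
# result[0] is inf (1/0); result[1] is nan (0/0)
```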
Cool, thanks. Let me know if something is looking shady there, so I can have a closer look. |
@jbrockmendel if you have a chance to fix up here (I don't care if you xfail these tests), but we cannot merge anything else until we are back to green.
It's gonna take a couple more rounds on the CI because I can't get npdev working locally.
It looks very tentatively like a lot of |
Maybe worthwhile to simply fix the numpy dev wheel to a working one while this is fixed?
First we should confirm with @seberg whether the guess about
Sorry, what I mean is: pin the wheel to a working one from two days ago.
I guess. No idea how to do that.
@lithomas1 can you pinch-hit on this one? |
Sure, #44834 is my attempt. We should still keep this PR open to revert the pin in there and come up with a long-term fix. |
Sounds good. I've confirmed locally that the td64 object is not respecting
@jbrockmendel sorry, only catching up now/tomorrow. I cannot think of immediate changes related to
(I guess one thing I am confused about is: if
Ah, I had looked at the wrong CI run yesterday, I think. But I am still confused about what the issue is. Do you have an example? Btw., assuming you are on Linux, one quick hack that may work to get numpy dev is:
to drop in a shell with numpy dev version injected into the PYTHONPATH. |
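The actual command was not captured above. As a hypothetical reconstruction of the mechanism only (installing into a scratch directory and injecting it ahead of site-packages via `PYTHONPATH`; the paths and the stub package below are made up, standing in for a real `pip install --pre --target=DIR numpy` from a nightly-wheel index):

```shell
# Stub package standing in for a numpy dev install into a scratch dir;
# in real use you would pip-install the nightly wheel into this target.
mkdir -p /tmp/fake-site/mypkg
echo "__version__ = 'dev'" > /tmp/fake-site/mypkg/__init__.py

# Run python with the scratch directory injected ahead of site-packages.
PYTHONPATH=/tmp/fake-site python3 -c "import mypkg; print(mypkg.__version__)"
```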
I got npdev working locally by installing from source instead of using the wheel. So I can confirm that a) the |
Found a likely candidate for a bug in our |
Numpy pin reverted + green |
Great, going to merge, but is there anything we should backport to 1.3.5?
If we want 1.3.5 to work with npdev, then the one line in ops_dispatch.pyx would need to be backported, I think.
OK, let's try to backport this and then
@meeseeksdev backport 1.3.x |
Something went wrong ... Please have a look at my logs. |
### What changes were proposed in this pull request?

This PR fixes the wrong aliases in `__array_ufunc__`.

### Why are the changes needed?

When running tests with numpy 1.23.0 (the current latest), we hit a bug: `NotImplementedError: pandas-on-Spark objects currently do not support <ufunc 'divide'>.`

In `__array_ufunc__` we first call `maybe_dispatch_ufunc_to_dunder_op` to try the dunder methods, and then we try the pyspark API. `maybe_dispatch_ufunc_to_dunder_op` comes from pandas code; pandas fixed a bug in pandas-dev/pandas#44822 (comment) pandas-dev/pandas@206b249, so when upgrading to numpy 1.23.0 we need to sync this as well.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

- Current CI passed.
- The existing UT `test_series_datetime` already covers this; I also tested it in my local env with 1.23.0:

```shell
pip install "numpy==1.23.0"
python/run-tests --testnames 'pyspark.pandas.tests.test_series_datetime SeriesDateTimeTest.test_arithmetic_op_exceptions'
```

Closes #37078 from Yikun/SPARK-39611.

Authored-by: Yikun Jiang <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
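The alias issue comes down to numpy 1.23 reporting `np.divide.__name__` as `"divide"`, where an older dispatch table keyed on `"true_divide"` no longer matches. A minimal, hypothetical sketch of the `__array_ufunc__` dunder-dispatch pattern (not the actual pandas or pyspark code; the table and class names are made up), listing both aliases so it works across numpy versions:

```python
import numpy as np

# Hypothetical ufunc-name -> dunder-method table; listing both "divide"
# and "true_divide" keeps it robust across numpy versions.
UFUNC_TO_DUNDER = {
    "add": "__add__",
    "divide": "__truediv__",
    "true_divide": "__truediv__",
}

class Wrapped:
    """Toy container that routes ufunc calls to its own dunder methods."""

    def __init__(self, value):
        self.value = value

    def __truediv__(self, other):
        return Wrapped(self.value / other)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        # Dispatch a plain binary ufunc call to the matching dunder method.
        if method == "__call__" and ufunc.__name__ in UFUNC_TO_DUNDER:
            dunder = UFUNC_TO_DUNDER[ufunc.__name__]
            return getattr(self, dunder)(inputs[1])
        return NotImplemented

w = Wrapped(10.0)
out = np.divide(w, 2.0)  # routed through __array_ufunc__ -> __truediv__
```

With only `"true_divide"` in the table, `np.divide(...)` under numpy 1.23 would fall through to `NotImplemented`, matching the `NotImplementedError` symptom described above.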
Tacked some CI-troubleshooting stuff onto the just-started FIXMES21 branch.