BUG: Concat with inner join and empty DataFrame #15397
Changes from 1 commit
```diff
@@ -1825,6 +1825,15 @@ def test_concat_bug_3602(self):
         result = concat([df1, df2], axis=1)
         assert_frame_equal(result, expected)
 
+    def test_concat_bug_15328(self):
+        df_empty = pd.DataFrame()
```
> Review comment: put a comment here with the issue number (rather than in the test name), and try to be descriptive with the name if possible.
```diff
+        df_a = pd.DataFrame({'a': [1, 2]}, index=[0, 1])
+        result = pd.concat([df_empty, df_a], axis=1, join='inner')
+        self.assertTrue(result.empty)
```
> Review comment: construct the expected result frame and use assert_frame_equal.
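For illustration, a minimal sketch of that suggestion; the expected frame's shape (df_a's columns with no rows) is an assumption about the fixed behavior, not something stated in the diff:

```python
import pandas as pd
from pandas.util.testing import assert_frame_equal

# Sketch only (assumed fixed behavior): the inner-join concat of an
# empty frame with df_a keeps df_a's columns but has no rows.
df_empty = pd.DataFrame()
df_a = pd.DataFrame({'a': [1, 2]}, index=[0, 1])

result = pd.concat([df_empty, df_a], axis=1, join='inner')
expected = df_a.iloc[:0]  # empty frame with df_a's columns and dtypes
assert_frame_equal(result, expected)
```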
```diff
+
+        result = pd.concat([df_a, df_empty], axis=1, join='inner')
+        self.assertTrue(result.empty)
```
> Review comment: same as above (construct the expected frame and use assert_frame_equal).
```diff
+
     def test_concat_series_axis1_same_names_ignore_index(self):
```
> Review comment: can you systematically test the how=* options here (for the empty cases)?

> Author: sorry, I'm not clear on what you mean here. Could you clarify or point to something similar?

> Review comment: what I mean is something like:
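The reviewer's snippet does not appear in this view. A minimal sketch of such a systematic test, with the caveats that concat spells the option join= rather than how=, and that the function name and expected frames are illustrative assumptions:

```python
import pandas as pd
import pandas.util.testing as tm


def test_concat_empty_all_joins():
    # Illustrative sketch, not the reviewer's snippet: exercise every
    # join option over the empty-frame cases in one loop.
    df_empty = pd.DataFrame()
    df_a = pd.DataFrame({'a': [1, 2]}, index=[0, 1])

    for join in ['inner', 'outer']:
        for frames in ([df_empty, df_a], [df_a, df_empty]):
            result = pd.concat(frames, axis=1, join=join)
            if join == 'inner':
                # assumption: the empty frame's index intersects to
                # nothing, leaving df_a's columns with no rows
                expected = df_a.iloc[:0]
            else:
                # assumption: the outer join keeps df_a unchanged
                expected = df_a
            tm.assert_frame_equal(result, expected)
```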
> Author: oh haha, got it. Thanks!
```diff
         dates = date_range('01-Jan-2013', '01-Jan-2014', freq='MS')[0:-1]
         s1 = Series(randn(len(dates)), index=dates, name='value')
```
```diff
@@ -54,6 +54,12 @@ def setUp(self):
         self.right = DataFrame({'v2': np.random.randn(4)},
                                index=['d', 'b', 'c', 'a'])
 
+    def test_merge_bug_15328(self):
+        df_empty = pd.DataFrame()
+        df_a = pd.DataFrame({'a': [1, 2]}, index=[0, 1])
+        result = pd.merge(df_empty, df_a, left_index=True, right_index=True)
+        self.assertTrue(result.empty)
```
> Review comment: same as above.
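Again for illustration, a sketch of the merge variant under the same assumption about the expected result:

```python
import pandas as pd
from pandas.util.testing import assert_frame_equal

# Sketch only (assumed fixed behavior): merging an empty frame with
# df_a on the index yields df_a's columns with no rows.
df_empty = pd.DataFrame()
df_a = pd.DataFrame({'a': [1, 2]}, index=[0, 1])

result = pd.merge(df_empty, df_a, left_index=True, right_index=True)
expected = df_a.iloc[:0]
assert_frame_equal(result, expected)
```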
```diff
+
     def test_merge_common(self):
         joined = merge(self.df, self.df2)
         exp = merge(self.df, self.df2, on=['key1', 'key2'])
```
> Review comment: just an FYI, if you put whatsnew changes in the empty space that is left, rather than at the end of the file, you won't get merge conflicts.
> Author: ah, I see. Makes sense. Thanks!