BUG: pd.concat with identical key leads to multi-indexing error #46546
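For context, here is a minimal reproduction of the reported bug, adapted from the test added in this PR (a sketch; the exact exception raised before the fix is not shown on this page):

import pandas as pd

df1 = pd.DataFrame({"name": [1]})
df2 = pd.DataFrame({"name": [2]})
df3 = pd.DataFrame({"name": [3]})

# The key "x" is passed twice, so before this fix it also appeared twice in
# the resulting MultiIndex level, even though levels are expected to be unique.
out = pd.concat([df1, df2, df3], keys=["x", "y", "x"])

# Prior to the fix this label-based selection failed; with the fix it returns
# both "x" blocks (emitting a PerformanceWarning, since the index is unsorted).
out.loc[("x", 0), :]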
From the concat implementation (_make_concat_multiindex):

@@ -705,7 +705,7 @@ def _make_concat_multiindex(indexes, keys, levels=None, names=None) -> MultiIndex:
             names = [None]
 
         if levels is None:
-            levels = [ensure_index(keys)]
+            levels = [ensure_index(keys).unique()]
Review thread on this change:

Comment: Hmm, shouldn't this be the case for specified levels as well?

Comment: Can we check whether the level is unique beforehand and raise a ValueError if it is not? The docs say it should be unique.

Comment: I find we actually do not have a check for duplicated levels in the concat function; something like the following will not raise. Since this is an isolated problem, I will open another PR for it to avoid confusion.

    df1 = pd.DataFrame({"A": [1]}, index=["x"])
    df2 = pd.DataFrame({"A": [1]}, index=["y"])
    pd.concat([df1, df2], levels=[["x", "y", "y"]])  # should raise

Comment: @GYHHAHA - Agreed, what you are pointing out is a separate issue, but here is an example of what @jreback was referring to: […] This also raises the same error that is being addressed here.

Comment: Why do you loc with the strings '1' and '3' instead of numeric values? @rhshadrach

Comment: Yeah, now I get the error. I will look into this.

Comment: Sounds good! I believe you just need to apply your change to the […]

Comment: But I believe this is caused by the duplicated levels input; if levels is [["x", "y"]], it works fine. It may be more suitable to handle this in another PR about the uniqueness of the levels keyword. @rhshadrach

Comment: Ah - I see your point; the user should never specify a level with duplicate values, so we can raise here instead. That makes sense to separate off into a different PR; can you see if there is an issue for this already and open one if there isn't?

Comment: It seems that no such issue exists yet. I'll open one and link a PR to it after refining the performance-warning check in the current PR. And also, since we will raise for a duplicated level, […]
         else:
             levels = [ensure_index(x) for x in levels]
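As a follow-up to the thread above, a hypothetical sketch of what a uniqueness check on user-specified levels could look like (the helper name and exact message are assumptions; the planned follow-up PR may implement this differently):

from pandas.core.indexes.api import ensure_index

def _ensure_unique_levels(levels):
    # Hypothetical validation: the concat docs say each specified level
    # should be unique, so reject duplicated values up front instead of
    # building a broken MultiIndex.
    validated = []
    for level in levels:
        idx = ensure_index(level)
        if not idx.is_unique:
            raise ValueError(f"Level values not unique: {list(idx)}")
        validated.append(idx)
    return validated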
From the accompanying test file:
@@ -9,6 +9,7 @@
     Series,
     concat,
 )
+from pandas.errors import PerformanceWarning
 import pandas._testing as tm
@@ -323,3 +324,27 @@ def test_concat_multiindex_(self):
             {"col": ["a", "b", "c"]}, index=MultiIndex.from_product(iterables)
         )
         tm.assert_frame_equal(result_df, expected_df)
+
+    @pytest.mark.parametrize(
+        "keys",
+        [["x", "y", "x"]],
+    )
+    def test_concat_with_key_not_unique(
+        self,
+        keys: list,
+    ):
+        # GitHub #46519
+        df1 = DataFrame({"name": [1]})
+        df2 = DataFrame({"name": [2]})
+        df3 = DataFrame({"name": [3]})
+        df_a = concat([df1, df2, df3], keys=keys)
+        with tm.assert_produces_warning(PerformanceWarning):
Review thread on this assertion:

Comment: What is showing the performance warning?

Comment: […]

Comment: Can you specify […]

Comment: Sure, no problem.
+            out_a = df_a.loc[("x", 0), :]
+
+        df_b = DataFrame(
+            {"name": [1, 2, 3]}, index=Index([("x", 0), ("y", 0), ("x", 0)])
+        )
+        with tm.assert_produces_warning(PerformanceWarning):
+            out_b = df_b.loc[("x", 0)]
+
+        tm.assert_frame_equal(out_a, out_b)
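As background on the PerformanceWarning these assertions expect (a sketch of pandas' documented behavior for unsorted MultiIndexes; the warning text varies across versions): indexing past the lexsort depth of a MultiIndex that is not sorted emits a PerformanceWarning, and both frames here have duplicate, unsorted key tuples.

import warnings

import pandas as pd
from pandas.errors import PerformanceWarning

df = pd.DataFrame(
    {"name": [1, 2, 3]},
    index=pd.MultiIndex.from_tuples([("x", 0), ("y", 0), ("x", 0)]),
)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    df.loc[("x", 0)]  # duplicate, unsorted labels -> lexsort-depth warning
assert any(issubclass(w.category, PerformanceWarning) for w in caught)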